3v4l.org

run code in 300+ PHP versions simultaneously
<?php

// Pattern to match anchor tags: captures the (optional) quote character,
// the href value, any remaining attributes, and the link text.
$hrefPattern = '/<a\\s+[^>]*href=(["\']??)([^"\'>]*?)\\1([^>]*)>(.*)<\/a>/siU';

$html = <<<HTML
<p>If you find any cases where this code falls down, let us know using the Feedback link below.</p>
<p>Before using this or similar scripts to fetch pages from other websites, we suggest you read through the related article on <a href="/php/parse-robots/">setting a user agent and parsing robots.txt</a>.</p>
<h2>First checking robots.txt</h2>
<p>As mentioned above, before using a script to download files you should always <a href="/php/parse-robots/">check the robots.txt file</a>. Here we're making use of the <tt>robots_allowed</tt> function from the article linked above to determine whether we're allowed to access files:</p>
HTML;

// Find every link in the sample markup and dump the captured groups.
preg_match_all($hrefPattern, $html, $matches);

var_dump($matches);
Output for git.master, git.master_jit, rfc.property-hooks
array(5) {
  [0]=>
  array(2) {
    [0]=>
    string(76) "<a href="/php/parse-robots/">setting a user agent and parsing robots.txt</a>"
    [1]=>
    string(59) "<a href="/php/parse-robots/">check the robots.txt file</a>"
  }
  [1]=>
  array(2) {
    [0]=>
    string(1) """
    [1]=>
    string(1) """
  }
  [2]=>
  array(2) {
    [0]=>
    string(18) "/php/parse-robots/"
    [1]=>
    string(18) "/php/parse-robots/"
  }
  [3]=>
  array(2) {
    [0]=>
    string(0) ""
    [1]=>
    string(0) ""
  }
  [4]=>
  array(2) {
    [0]=>
    string(43) "setting a user agent and parsing robots.txt"
    [1]=>
    string(26) "check the robots.txt file"
  }
}
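As a minimal usage sketch (not part of the original paste), the captured groups could be consumed by appending something like the following to the snippet above: group 2 holds each href value and group 4 the corresponding link text, so pairing them by index yields a simple list of links.

// Usage sketch (assumption, continues from the snippet above): pair each
// captured href (group 2) with its link text (group 4).
$links = [];
foreach ($matches[2] as $i => $href) {
    $links[] = ['href' => $href, 'text' => $matches[4][$i]];
}

print_r($links);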

This tab shows results from various feature branches currently under review by the PHP developers. Contact me to have additional branches featured.

Active branches

Archived branches

Once feature branches are merged or declined, they are no longer available here. Their functionality (when merged) can be viewed on the main output page.

