get-headers

PHP - Errors with get_headers and SSL

Submitted by 夏天 on 2021-02-07 12:39:44
Question: This is my code:

    $url = 'http://www.wikipedia.com'; // URL WITH HTTP
    $hurl = str_replace("http", "https", $url); // URL WITH HTTPS
    $urlheads = get_headers($url, 1);
    $surlheads = get_headers($hurl, 1);
    $urlx = false;
    $surlx = false;
    foreach ($urlheads as $name => $value) {
        if ($name === 'Location') {
            $urlx = $value;
        }
    }
    print_r($urlx);

And this is the error I'm getting:

    Warning: get_headers(): Peer certificate CN=`*.wikipedia.org' did not match expected CN=`www.wikipedia.com' in....
    Warning:
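The CN mismatch happens because the str_replace builds https://www.wikipedia.com, while the certificate served on that host is issued for *.wikipedia.org. A minimal sketch of one workaround, assuming PHP 7.1+ for the context argument to get_headers (disabling peer verification trades away security, so it is only suitable for diagnostics):

    $context = stream_context_create([
        'ssl' => [
            'verify_peer'      => false, // skip certificate chain checks
            'verify_peer_name' => false, // skip the CN/SAN hostname check
        ],
    ]);
    $surlheads = get_headers($hurl, 1, $context); // context arg requires PHP 7.1+
    // On older PHP, stream_context_set_default() with the same options works instead.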

PHP - Differences between `get_headers` and `stream_get_meta_data`?

Submitted by 南楼画角 on 2020-01-04 05:14:10
Question: Intro / disclaimer: decent chunks of this are outputs that can largely be ignored. It is still a bit of a read, but I'm trying to be thorough in my analysis and questioning. If you are familiar with stream_get_meta_data, you can skip to the "Questions" at the end. Other than in the docs, I am having trouble finding out much about PHP's stream_get_meta_data. The overall functionality is not vastly different from that of PHP's get_headers, but I cannot for the life of me find any
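For a rough comparison, here is a minimal sketch (the URL is arbitrary): get_headers performs its own request and returns only the headers, while stream_get_meta_data describes a stream you already opened, with the http wrapper exposing the raw response header lines under its wrapper_data key.

    $url = 'http://example.com/';

    // One-shot: performs its own HTTP request, returns only headers
    print_r(get_headers($url, 1));

    // Stream-based: headers arrive alongside a body you can also read
    $fp = fopen($url, 'r');
    $meta = stream_get_meta_data($fp);
    print_r($meta['wrapper_data']); // raw response header lines
    fclose($fp);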

Is PHP's get_headers a good way to tell if a site is up?

Submitted by 房东的猫 on 2020-01-01 19:29:26
Question: I'm still fairly new to PHP. Can you comment on whether the code below is any good for telling if a site is up or down, and if it is not suitable, the reasons why and better alternatives? Thanks in advance.

    $siteHeader = @get_headers($url, 1);
    // Note: an array compares greater than an int in PHP, so this only
    // tests that get_headers() returned anything at all, not that it was a 200.
    if ($siteHeader > 1) {
        $siteUp = true;
    } else {
        $siteUp = false;
    }

Answer 1: I use curl, but that's just me:

    function check($url, $ignore = '') {
        $agent = "Mozilla/4.0 (B*U*S)";
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT
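A minimal sketch of a get_headers-based check that actually inspects the status line rather than just testing for a non-false return (the function name is mine):

    function siteIsUp($url) {
        $headers = @get_headers($url);
        if ($headers === false) {
            return false; // DNS failure, timeout, connection refused, ...
        }
        // $headers[0] is the first status line, e.g. "HTTP/1.1 200 OK"
        // (with redirects, later status lines also appear in the array)
        return (bool) preg_match('#^HTTP/\S+\s+[23]\d\d#', $headers[0]);
    }

    var_dump(siteIsUp('http://example.com/'));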

PHP - get_headers() returns the wrong result when non-UTF symbols are in the URL

Submitted by 孤人 on 2019-12-24 05:21:29
Question: I stumbled upon a wrong result from the get_headers() method. URL for testing: http://www.zakon.hr/z/199/Zakon-o-elektroni%C4%8Dkoj-trgovini. Here's a simple curl request to that URL (screenshot omitted): as you can see, it gets a successful response with a 200 OK code. But if I use get_headers() for the same URL, I get a different result:

    var_dump(get_headers('http://www.zakon.hr/z/199/Zakon-o-elektroničkoj-trgovini'));

    array(4) {
      [0]=> string(24) "HTTP/1.0 400 Bad request"
      [1]=> string(23) "Cache-Control:
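A likely cause is the raw č in the path: curl percent-encodes it on the wire, while the http stream wrapper sends the byte sequence as-is and the server rejects the request. A minimal sketch of percent-encoding the path segments first (assuming the URL has no query string):

    $url = 'http://www.zakon.hr/z/199/Zakon-o-elektroničkoj-trgovini';
    $p = parse_url($url);
    // encode each path segment, keeping the "/" separators intact
    $path = implode('/', array_map('rawurlencode', explode('/', $p['path'])));
    var_dump(get_headers($p['scheme'] . '://' . $p['host'] . $path, 1));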

PHP - `get_headers` returns “400 Bad Request” and “403 Forbidden” for valid URLs?

Submitted by 孤街浪徒 on 2019-12-22 06:49:39
Question: Working solution at the bottom of the description! I am running PHP 5.4 and trying to get the headers of a list of URLs. For the most part everything works fine, but three URLs are causing issues (and likely more, with more extensive testing):

    'http://www.alealimay.com'
    'http://www.thelovelist.net'
    'http://www.bleedingcool.com'

All three sites work fine in a browser and produce the following header responses (screenshots from Safari omitted). Note that all three header responses are Code = 200. But
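A common culprit for 400/403 responses on URLs that work in a browser is a missing User-Agent header, which some servers reject. A minimal sketch of one fix, assuming that is the cause here (stream_context_set_default works on PHP 5.4, where get_headers has no context parameter):

    stream_context_set_default([
        'http' => [
            'header' => "User-Agent: Mozilla/5.0\r\n", // any non-empty UA
        ],
    ]);
    $headers = get_headers('http://www.thelovelist.net', 1);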

Relative URL not working for get_headers

Submitted by 眉间皱痕 on 2019-12-13 04:02:20
Question: Here is a script which works just fine when the img src attribute holds an absolute URL, but if the img src holds a relative URL it fails with this error: Warning: get_headers() [function.get-headers]: This function may only be used against URLs in ... In the first link the img src is a relative URL, while in the second link the img src is absolute and gives a perfect result. I want my script to work in both cases. Any idea?

    <?php
    $websitelink = 'http://img172.imagevenue.com/img.php?image=90465_Gwen3_122_17lo.jpg';
    //
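get_headers() only accepts full URLs, so a relative src has to be resolved against the page URL first. A minimal sketch (the helper name is mine; for brevity it ignores ../ segments and protocol-relative // URLs):

    function absolutize($pageUrl, $src) {
        if (parse_url($src, PHP_URL_SCHEME) !== null) {
            return $src; // already absolute
        }
        $p = parse_url($pageUrl);
        $origin = $p['scheme'] . '://' . $p['host'];
        if ($src[0] === '/') {
            return $origin . $src; // root-relative, e.g. "/img/a.jpg"
        }
        $dir = rtrim(dirname(isset($p['path']) ? $p['path'] : '/'), '/');
        return $origin . $dir . '/' . $src; // document-relative
    }

    $headers = get_headers(absolutize($websitelink, 'img/photo.jpg'), 1);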

PHP: check if a file exists on an external domain (accessing from a subdomain)

Submitted by 纵然是瞬间 on 2019-12-12 17:24:58
Question: I have a website at http://www.reelfilmlocations.co.uk. The above site has an admin area where images are uploaded and different-size copies are created in subfolders of an uploads/images directory. I am creating a site for mobile devices, which will operate on a subdomain, http://2012.reelfilmlocations.co.uk, but use the database and images from the main domain. I want to be able to access the images that are on the parent domain, which I can do by linking to the image with the full domain, i.e.
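A minimal sketch of verifying that an image exists on the parent domain before using it in the mobile site's markup (the function name and image path are made-up examples):

    function remoteFileExists($url) {
        $headers = @get_headers($url);
        // $headers[0] is the status line, e.g. "HTTP/1.1 200 OK"
        return $headers !== false && strpos($headers[0], ' 200') !== false;
    }

    $img = 'http://www.reelfilmlocations.co.uk/uploads/images/thumbs/example.jpg';
    $exists = remoteFileExists($img); // true if the parent domain serves it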

Getting raw output in local WAMP; the same code works fine on the web server

Submitted by 旧巷老猫 on 2019-12-12 02:55:31
Question: I have this script which is working just fine on the server, but in local WAMP it gives an error (allow_url_fopen is on): Warning: get_headers(): This function may only be used against URLs in C:\wamp\www\url\test5.php on line 8

    <?php
    $websitelink = 'http://www.brobible.com/girls/article/miley-cyrus-21st-birthday-party';
    $html = file_get_contents($websitelink);
    $doc = new DOMDocument();
    @$doc->loadHTML($html);
    $tags = $doc->getElementsByTagName('img');
    foreach ($tags as $tag) {
        $data = get_headers($tag-
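The warning is the same relative-URL problem as in the previous section: some img src values on the scraped page are not full URLs, and get_headers() refuses them. A minimal sketch of guarding the loop (resolving relative values against $websitelink works as shown earlier):

    foreach ($tags as $tag) {
        $src = $tag->getAttribute('src');
        if (parse_url($src, PHP_URL_SCHEME) === null) {
            continue; // relative src; resolve it against $websitelink first
        }
        $data = get_headers($src, 1);
        // ... inspect $data ...
    }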

How to set the user agent for the get_headers PHP function

Submitted by 六月ゝ 毕业季﹏ on 2019-12-10 13:26:23
Question: I know it's easy to set the user agent for curl, but my code is based on get_headers, and by default the get_headers user agent is empty. Thanks for any help.

Answer 1: Maybe this?

    ini_set('user_agent', 'Mozilla/5.0');

Answer 2: get_headers only reports the data sent by the server to the client (in this case, PHP); it doesn't specify request headers. If you're trying to find the user agent the get_headers request was made with, you'll have to use:

    ini_get('user_agent');

For more documentation see the links below:
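Besides the user_agent ini setting, a stream context can carry the header. A minimal sketch (the context argument to get_headers needs PHP 7.1+; on older versions stream_context_set_default achieves the same effect):

    $context = stream_context_create([
        'http' => ['header' => "User-Agent: Mozilla/5.0\r\n"],
    ]);
    $headers = get_headers('http://example.com/', 1, $context);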

How to use request.getHeader("Referer")

Submitted by 北慕城南 on 2019-12-10 13:12:57
Question: In my current project, I have a shopping cart integrated with the main site. Now I have to create some mini sites to display data retrieved from the main site. When the user clicks the Buy Now button on a mini site, it should redirect to the main shopping cart. But when the user clicks the Continue Shopping button, they should be sent back to the mini-site page they were browsing. The two sites will be on two different domains. Can I send the user back to the page they were browsing? request
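This last question concerns the Java Servlet API rather than PHP. A minimal sketch of the Referer approach, assuming a servlet handles the Continue Shopping click (the class name and fallback URL are hypothetical; browsers may omit or strip the Referer header across domains, so the null check matters):

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ContinueShoppingServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws IOException {
            String referer = request.getHeader("Referer"); // may be null
            if (referer != null) {
                response.sendRedirect(referer); // back to the mini-site page the user came from
            } else {
                response.sendRedirect("http://minisite.example.com/"); // hypothetical fallback
            }
        }
    }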