I am a beginner when it comes to PHP, but a simple web page scraper is very straightforward here.
First, we define the HTTP GET/POST helper function:
function http_post_content($url, $data) {
    $data = http_build_query($data);
    $aContext = array(
        'http' => array(
            'proxy' => 'tcp://proxyserver:8080',
            'request_fulluri' => true,
            'method' => 'POST',
            // Headers must be separated with \r\n
            'header' => "Content-Type: application/x-www-form-urlencoded\r\n"
                      . "Content-Length: " . strlen($data) . "\r\n",
            'content' => $data,
        ),
    );
    $cxContext = stream_context_create($aContext);
    // Second argument is use_include_path (bool), not FILE_TEXT
    $content = file_get_contents($url, false, $cxContext);
    return $content;

    /* Alternative with explicit error handling:
    $fp = @fopen($url, 'rb', false, $cxContext);
    if (!$fp) {
        throw new Exception("Problem with $url, $php_errormsg");
    }
    $content = @stream_get_contents($fp);
    if ($content === false) {
        throw new Exception("Problem reading data from $url, $php_errormsg");
    }
    return $content;
    */
}
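A quick usage sketch — note that the URL and the form field names here are just placeholders, not a real endpoint:

```php
// Hypothetical example: POST a search form and print the returned HTML.
// 'www.example.com' and the field names are placeholders.
$html = http_post_content('http://www.example.com/search.php', array(
    'query' => 'php scraper',
    'page'  => 1,
));
echo $html;
```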
In the 'http' array we can define the proxy server settings, the method (POST or GET), the headers, and the request body. We use file_get_contents() to retrieve the web page instead of cURL, since it is simpler and powerful enough for a basic scraper.
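For reference, http_build_query() is what turns the $data array into the URL-encoded body the function sends:

```php
// http_build_query() URL-encodes an associative array into a query string
// (spaces become '+' with the default RFC 1738 encoding).
$data = array('query' => 'php scraper', 'page' => 2);
$body = http_build_query($data);
echo $body; // query=php+scraper&page=2
```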
Give it a try and test it — let me know what you think. Cheers!