Saving images from any live URL is very easy with PHP.
Here are three ways to do it:
1. Using cURL
2. Using file functions
3. Using GD library functions
Prerequisite: "allow_url_fopen" must be On in your server's php.ini. You can check your PHP settings with the phpinfo() function.
Create a folder named "images_saved" to store your images, and make sure PHP can write to it.
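Both prerequisites can also be verified from PHP itself. A minimal sketch (the folder name comes from this article; adjust if yours differs):

```php
<?php
// Check that allow_url_fopen is enabled in php.ini
if (!ini_get('allow_url_fopen')) {
    die('allow_url_fopen is Off; enable it in php.ini first.');
}

// Create the target folder if it does not exist yet
$dir = 'images_saved';
if (!is_dir($dir) && !mkdir($dir, 0755, true)) {
    die("Could not create folder: $dir");
}

echo "Folder $dir is writable: " . (is_writable($dir) ? 'yes' : 'no');
?>
```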
Using cURL
<?php
// Example image URLs -- replace these placeholders with your own
$images = array(
    'http://example.com/image1.jpg',
    'http://example.com/image2.png',
);

foreach ($images as $i) {
    image_save_from_url($i, 'images_saved');
    if (file_exists('images_saved/' . basename($i))) {
        echo 'Image ' . basename($i) . ' Downloaded Successfully<br/>';
    } else {
        echo 'Image ' . basename($i) . ' Download Failed<br/>';
    }
}

function image_save_from_url($my_img, $fullpath) {
    if ($fullpath != "" && $fullpath) {
        $fullpath = $fullpath . "/" . basename($my_img);
    }
    $ch = curl_init($my_img);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    $rawdata = curl_exec($ch);
    curl_close($ch);
    if (file_exists($fullpath)) {
        unlink($fullpath);
    }
    // 'x' mode creates a new file and fails if one already exists
    $fp = fopen($fullpath, 'x');
    fwrite($fp, $rawdata);
    fclose($fp);
}
?>
Using File functions
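With allow_url_fopen On, plain file functions can do the same job in a few lines. A minimal sketch using file_get_contents() and file_put_contents() (the example URL and folder name are placeholders):

```php
<?php
// Download an image with plain file functions (requires allow_url_fopen = On)
function image_save_with_file_functions($my_img, $fullpath) {
    if ($fullpath != "") {
        $fullpath = $fullpath . "/" . basename($my_img);
    }
    // file_get_contents() fetches the raw bytes; returns false on failure
    $rawdata = file_get_contents($my_img);
    if ($rawdata === false) {
        return false;
    }
    // file_put_contents() writes the bytes; returns false on failure
    return file_put_contents($fullpath, $rawdata) !== false;
}

// Example usage (placeholder URL)
if (image_save_with_file_functions('http://example.com/photo.jpg', 'images_saved')) {
    echo 'Image photo.jpg Downloaded Successfully';
} else {
    echo 'Image photo.jpg Download Failed';
}
?>
```

This is the simplest of the three approaches, but cURL gives more control over timeouts, redirects, and headers.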
Using GD library functions.
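The GD approach decodes the image and re-encodes it on save, so it also validates that the file really is an image. A sketch using imagecreatefromjpeg() and imagejpeg() (JPEG assumed here; use imagecreatefrompng()/imagepng() for PNGs):

```php
<?php
// Download a JPEG via the GD extension (note: GD re-encodes the image)
function image_save_with_gd($my_img, $fullpath) {
    if ($fullpath != "") {
        $fullpath = $fullpath . "/" . basename($my_img);
    }
    // imagecreatefromjpeg() can read directly from a URL when allow_url_fopen is On;
    // it returns false if the data is not a valid JPEG
    $img = @imagecreatefromjpeg($my_img);
    if ($img === false) {
        return false;
    }
    // imagejpeg() writes the image to disk; 90 is the quality (0-100)
    $ok = imagejpeg($img, $fullpath, 90);
    imagedestroy($img); // free the image from memory
    return $ok;
}
?>
```

Because GD re-encodes, the saved file is not byte-identical to the original; use cURL or the file functions when you need an exact copy.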
Thank you!!
That script is resource-demanding, so you may need to change settings in php.ini. See the comment below.
Find the following section in the php.ini file.
max_execution_time = 30
max_input_time = 60
memory_limit = 128M
Try increasing the memory_limit value to 256M.
If memory_limit is already at 256M, you can increase it to 512M.
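If you cannot edit php.ini (on shared hosting, for example), some of these limits can often be raised at runtime instead. A sketch (hosts may disallow overriding these):

```php
<?php
// Raise limits for this script only
ini_set('memory_limit', '256M');
ini_set('max_execution_time', '300'); // seconds
set_time_limit(300); // resets the execution-time counter for this request
?>
```

Note that max_input_time cannot be changed this way; it takes effect before the script runs.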
Hello, this script is working, but sometimes it returns a warning when storing an image if I store more than one image at the same time. What can I do?
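The warning most likely comes from fopen($fullpath, 'x') in the cURL example, which emits a warning if the file already exists or the folder is not writable. A hedged tweak that checks the handle before writing so the caller gets a clean true/false instead of a warning (function name is illustrative):

```php
<?php
// Safer replacement for the file-writing tail of image_save_from_url()
function write_image_data($fullpath, $rawdata) {
    if (file_exists($fullpath)) {
        unlink($fullpath);
    }
    $fp = @fopen($fullpath, 'x'); // 'x' still fails if the file reappears
    if ($fp === false) {
        return false; // caller can report "Download Failed" instead of a warning
    }
    fwrite($fp, $rawdata);
    fclose($fp);
    return true;
}
?>
```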
OM,
Will this work to automatically extract images from 500 URLs placed in a TXT file?
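In principle, yes. A sketch that reads one URL per line from a text file and reuses image_save_from_url() from the cURL example above (urls.txt is a placeholder name):

```php
<?php
// Read one URL per line and download each with image_save_from_url()
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($urls === false) {
    die('Could not read urls.txt');
}
foreach ($urls as $i) {
    image_save_from_url(trim($i), 'images_saved');
    echo 'Processed ' . basename($i) . "\n";
}
// With hundreds of URLs, raise max_execution_time or run the script from the CLI
?>
```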
Thanks for your work.
Cheers
Thank you for the scripts…