Monday, January 5, 2015

Save a large number of images from remote servers to a local computer **as fast as possible**

When I open a web page containing thousands of images in img tags, the browser (the latest version of Firefox) downloads and renders them fairly quickly. I need to download and save these images just as fast. If I use the "Save as..." dialog and save the page to disk the standard way, it takes far too long. Ideally I would save those images directly into an SQLite database at the same speed the browser takes to render them all.


I have tried methods from related articles on the Web, but they do not address this specific requirement:


- How to store or save and retrieve multiple remote images locally
- Copy Image from Remote Server Over HTTP
- Save image through curl from url to local folder
- http://ift.tt/1yuIcAw
- http://ift.tt/1BCw4L6


Are there any existing solutions to such a common problem?


Firefox addons such as http://ift.tt/1yuIeZh or DownThemAll, as well as wget/curl, are no faster than the "Save as..." dialog, and sometimes slower. I suspect the images need to be appended to a single file, on disk or in RAM, for dumping...


Maybe use PHP and SQLite? For example, I have a .csv file like this:



10;http://ift.tt/1yuIffw; title1
200;http://ift.tt/1yuIcQU; title2
3000;http://ift.tt/1BCw4Lc; title3


The idea is to create a single SQLite database file with one table containing three columns: "name" (to store the first field: 10, 200, 3000, and so on), "imageblob" (to store the images retrieved from the corresponding URLs), and "title" (to store the related title). Is this possible? If so, what maximum speed can I expect? A rough sketch of what I have in mind follows.
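For clarity, here is a minimal sketch of the kind of script I have in mind, using PHP's curl_multi functions to download the images in parallel and the SQLite3 class to store them as BLOBs. The file names, table name, and column names are only placeholders, and I have not measured how fast this actually runs:

```php
<?php
// Sketch: download images listed in a semicolon-separated CSV in parallel
// with curl_multi, then store them as BLOBs in a SQLite table.
// "images.csv" and "images.db" are placeholder names.

$db = new SQLite3('images.db');
$db->exec('CREATE TABLE IF NOT EXISTS images (name TEXT, imageblob BLOB, title TEXT)');

// Read the CSV: each line is "name;url;title"
$rows = [];
foreach (file('images.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($name, $url, $title) = array_map('trim', explode(';', $line, 3));
    $rows[] = ['name' => $name, 'url' => $url, 'title' => $title];
}

// Create one curl handle per URL and register it with a multi handle
$mh = curl_multi_init();
$handles = [];
foreach ($rows as $i => $row) {
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Drive all transfers until they complete
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh);
    }
} while ($active && $status === CURLM_OK);

// Insert every downloaded image as a BLOB inside a single transaction
$db->exec('BEGIN');
$stmt = $db->prepare('INSERT INTO images (name, imageblob, title) VALUES (:name, :blob, :title)');
foreach ($rows as $i => $row) {
    $data = curl_multi_getcontent($handles[$i]);
    curl_multi_remove_handle($mh, $handles[$i]);
    curl_close($handles[$i]);
    if ($data === false || $data === '') {
        continue; // skip failed downloads
    }
    $stmt->bindValue(':name', $row['name'], SQLITE3_TEXT);
    $stmt->bindValue(':blob', $data, SQLITE3_BLOB);
    $stmt->bindValue(':title', $row['title'], SQLITE3_TEXT);
    $stmt->execute();
    $stmt->reset();
}
$db->exec('COMMIT');
curl_multi_close($mh);
```

Wrapping all the inserts in one transaction should keep the SQLite side fast; my concern is whether the parallel downloading can get anywhere near the speed of the browser.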


What surprises me is that I can't find any way of solving this problem other than the standard Firefox "Save as...". I don't want to go into all the details, but no, that is not the way.

