cURL or file_get_contents to update a list of feeds?

"YOU AND THE ART OF ONLINE DATING" is the only product on the market that will take you step-by-step through the process of online dating, provide you with the resources to help ensure success. Get it now!

Fetching google.com using file_get_contents took (in seconds):
2.31319094 2.30374217 2.21512604 3.30553889 2.30124092

cURL took:
0.68719101 0.64675593 0.64326 0.81983113 0.63956594

This was using the benchmark class from davidwalsh.name/php-timer-benchmark.
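For reference, here is a minimal way to reproduce this kind of timing yourself. This is a bare microtime() loop, not the davidwalsh.name benchmark class; the URL and the five iterations are arbitrary choices.

    <?php
    // Minimal timing sketch: compare file_get_contents() and cURL on the same URL.
    $url = 'http://www.google.com/';

    for ($i = 0; $i < 5; $i++) {
        $start = microtime(true);
        file_get_contents($url);
        printf("file_get_contents: %.8f\n", microtime(true) - $start);
    }

    for ($i = 0; $i < 5; $i++) {
        $start = microtime(true);
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
        curl_exec($ch);
        curl_close($ch);
        printf("cURL:              %.8f\n", microtime(true) - $start);
    }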

file_get_contents is faster. It does a simple HTTP GET without any extra instantiation. – Kaoukkos Sep 21 at 17:11

cURL will get the headers for the HTTP request as well. You can add your own request headers, POST variables and a proxy server in cURL. However, for a simple GET request, cURL is unnecessary.
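To illustrate that comment: each of those extras is a single curl_setopt() call. A sketch, with placeholder URL, header values and proxy address:

    <?php
    $ch = curl_init('http://example.com/api');       // placeholder URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    // Your own request headers
    curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        'Accept: application/rss+xml',
        'X-Custom-Header: some-value',               // placeholder header
    ));

    // POST variables (turns the request into a POST)
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('key' => 'value'));

    // Route the request through a proxy server (address is made up)
    curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1:8080');

    $body = curl_exec($ch);
    $info = curl_getinfo($ch);                       // status code, timings, etc.
    curl_close($ch);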

I find cURL takes 10 ms longer on my WAMP server (Apache 2.2.17, PHP 5.3.5) for google.com – Pranav Hosangadi Sep 21 at 17:19

@mtopia It doesn't really make a difference, because file_get_contents is synchronous, i.e. the next line is executed only after this line has been completely executed. – Pranav Hosangadi Oct 3 at 6:34

Actually, I think cURL is faster than file_get_contents. Googling a bit, I found some benchmarks here on SO: file_get_contents VS CURL, what has better performance?

I would recommend considering cURL. While it might look like development overhead at first sight, it is much more powerful than file_get_contents. Especially if you want to fetch multiple feeds, cURL multi requests might be worth looking at: php.net/manual/en/function.curl-multi-in....

If you want flexibility for the future (e.g. authentication, cookies, proxies, etc.), then use cURL. The speed is about the same as file_get_contents(), judging from benchmarks (some say it's even faster). If you want a quick and easy solution, then by all means use file_get_contents(). However, it wasn't built for the purpose of requesting external URLs.
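For the quick-and-easy route, a minimal sketch; the stream context is optional and only sets a timeout here (the 5-second value and the feed URL are arbitrary):

    <?php
    // Plain GET with file_get_contents(); returns false on failure.
    $context = stream_context_create(array(
        'http' => array(
            'method'  => 'GET',
            'timeout' => 5,                // seconds; arbitrary value
        ),
    ));

    $feed = file_get_contents('http://example.com/feed.xml', false, $context);
    if ($feed === false) {
        echo "Request failed\n";
    }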

Most people swear by cURL for doing any work with external URLs, even simple GET requests. The only additional work with cURL is a few extra lines of code; wrap it in a function and you're good to go (a sketch follows below).
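For instance, a wrapper along these lines; the function name http_get and its defaults are just examples:

    <?php
    // Hypothetical helper: a plain GET via cURL that returns the body,
    // or false on failure -- roughly a drop-in for file_get_contents().
    function http_get($url, $timeout = 10)
    {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        $body = curl_exec($ch);
        curl_close($ch);
        return $body;
    }

    $html = http_get('http://www.google.com/');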

Because you will be updating 50 feeds at once, I would strongly suggest using cURL, for two reasons. First, you can use the curl_multi_*() family of functions, which allow you to send all 50 requests at once, while file_get_contents() will only go one by one. The documentation for these functions is a bit sparse, so I would suggest using a lightweight library; it's much easier to work with. I personally use https://github.com/petewarden/ParallelCurl, but you will find many around. (A raw curl_multi sketch appears at the end of this answer.)

Second, as you are pinging the services, you do not really need the response body, I guess (as long as it's HTTP 200). So you could use the cURL option CURLOPT_NOBODY to turn it into a HEAD request; in response you would get only the headers. This should speed up the process even more.

To put it another way: file_get_contents might be faster for simple requests, but your situation is not simple. Firing 50 requests without really needing the whole documents back is not a standard request.
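Here is a minimal sketch of both ideas combined, using the raw curl_multi_* functions rather than a library; the feed URLs are placeholders:

    <?php
    // Ping many feeds in parallel with curl_multi, as HEAD requests
    // (CURLOPT_NOBODY) so only the headers come back.
    $urls = array(
        'http://example.com/feed1.xml',   // placeholder URLs
        'http://example.com/feed2.xml',
        // ... up to your 50 feeds
    );

    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Run all requests at once.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);                         // wait for network activity
    } while ($running > 0);

    foreach ($handles as $url => $ch) {
        $status = curl_getinfo($ch, CURLINFO_HTTP_CODE); // expect 200
        echo "$url => $status\n";
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);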
