Get number of concurrent requests by browser?

There are a lot of things to consider here. In most situations, I would choose only one cookieless domain/subdomain to host your images, such as static.mywebsite.com. And ideally static files should be hosted by a CDN, but that's another story.
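
To illustrate the "cookieless" part: a cookie scoped to the parent domain rides along on every request to its subdomains, so each image fetch carries that header weight. A quick Node sketch of the difference (the host names and cookie here are made up):

```typescript
import { createServer } from "node:http";

// Scoped to "www.mywebsite.com", this cookie stays off the static
// subdomain. Had it been scoped to ".mywebsite.com", every image GET
// against static.mywebsite.com would carry it too. All names here
// are placeholders.
createServer((_req, res) => {
  res.setHeader("Set-Cookie", "session=abc123; Domain=www.mywebsite.com; Path=/");
  res.end("ok");
}).listen(8080);
```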

First of all, IE7 allowed only two concurrent connections per host. But most browsers today allow more than that: IE8 allows 6 concurrent connections, Chrome allows 6, and Firefox allows 8. So if your web page only has 6 images, for example, it would be pointless to spread them across multiple subdomains.
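
If you want to see that cap for yourself, here is a rough browser-console sketch (the host and image paths are placeholders, and it assumes HTTP/1.1; HTTP/2 multiplexes everything over one connection):

```typescript
// Fire 12 image requests at one host and log completion times; over
// HTTP/1.1 they tend to finish in batches of ~6, the per-host cap.
async function timeParallelFetches(count: number): Promise<void> {
  const start = performance.now();
  await Promise.all(
    Array.from({ length: count }, (_, i) =>
      fetch(`https://static.mywebsite.com/img/photo${i}.jpg?bust=${Date.now()}`)
        .then(() => console.log(`image ${i}: ${(performance.now() - start).toFixed(0)} ms`))
    )
  );
}

timeParallelFetches(12);
```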

So let's say you have 24 images on a page. Well, few things in life are free, and there's such a thing as death by parallelization. If you host your images on 4 different subdomains, every single image could theoretically be downloaded in parallel (4 hosts × 6 connections = 24 concurrent downloads).
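
If you do shard, the usual trick is to hash each image path to a fixed subdomain so the same image always resolves to the same host (otherwise you defeat the browser cache). A sketch, with static0 through static3.mywebsite.com as invented shard names:

```typescript
// Map each image path deterministically to one of N shard subdomains,
// so a given image is always requested from the same host.
function shardedImageUrl(path: string, shards = 4): string {
  let hash = 0;
  for (const ch of path) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple rolling hash
  }
  return `https://static${hash % shards}.mywebsite.com/${path}`;
}

console.log(shardedImageUrl("img/logo.png"));   // same shard every time
console.log(shardedImageUrl("img/banner.jpg")); // possibly a different shard
```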

However, it also means three additional DNS lookups, and a DNS lookup can take 100 ms, 150 ms, or sometimes longer. This added delay could easily offset any benefit of parallel downloads.
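
You can measure what those lookups actually cost on your own page with the Resource Timing API; a small sketch:

```typescript
// Log per-resource DNS time for everything the page has loaded so far.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
for (const entry of entries) {
  const dnsMs = entry.domainLookupEnd - entry.domainLookupStart;
  if (dnsMs > 0) {
    console.log(`${entry.name}: ${dnsMs.toFixed(1)} ms of DNS lookup`);
  }
}
```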

You can see real-world examples of this by testing sites with webpagetest.org. Of course, the best solution is to use CSS sprites when possible to cut down on the number of requests. I talk about that and the inherent overhead of every request in this article.
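
For completeness, the sprite idea in miniature: one sheet image holds many icons, and each element shows its own slice by offsetting the shared background. The sheet name, 32 px grid, and element ids below are invented for the example:

```typescript
// One HTTP request fetches icons.png (a hypothetical sprite sheet of
// 32x32 icons laid out in a row); each element then displays its own
// slice by shifting background-position.
function showIcon(el: HTMLElement, index: number): void {
  el.style.width = "32px";
  el.style.height = "32px";
  el.style.backgroundImage = "url('https://static.mywebsite.com/icons.png')";
  el.style.backgroundPosition = `-${index * 32}px 0`; // slide sheet left to the nth icon
}

showIcon(document.getElementById("home-icon")!, 0);   // first icon on the sheet
showIcon(document.getElementById("search-icon")!, 1); // second icon
```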
