It's great that you can run this benchmark, but ask yourself what it means. Think about the real amount of traffic you'll be receiving. You really need a question to benchmark against:

- I expect that my site will have 3,000 users.
- I expect that during peak usage, 500 of them will be hitting the page.
- Typical usage is 3 requests over a minute: 3 * 500 / 60 = ~25 req/sec.
- Can my site handle 25 req/sec and stay responsive?

Unless you're in the top few percent of the web, your page won't see 100 concurrent requests in real life. It doesn't make sense to tune your site for that level of traffic. To hit those numbers, you need to make design compromises at the architecture level (database usage, caching methods, etc.); hence the failures you see when the database is on.
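The arithmetic above can be sketched as a tiny capacity calculation; the numbers are the assumptions stated above, so swap in your own:

```python
# Rough capacity estimate from assumed traffic figures:
# 500 concurrent users at peak, each making ~3 requests per minute.
peak_users = 500
requests_per_user_per_minute = 3

required_rps = peak_users * requests_per_user_per_minute / 60
print(f"Target capacity: ~{required_rps:.0f} req/sec")  # → Target capacity: ~25 req/sec
```

That target, not an arbitrary concurrency level, is the number your benchmark should be answering questions about.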
If you're only trying to profile your script, use Xdebug to find where your code is spending its time.
I think this looks like a great job. Way off? I'd say it's way above the norm.
One question is what the load on the server will be in production. Your script is firing requests at this server, but if you're the only user on a development instance, you aren't seeing what happens when you hit the production server while it's processing the typical production load. If that's the case, you need TWO sources of requests: one that represents your new app, and another representing the production processes it'll compete with for resources.
Can you set the # of simultaneous users in the benchmark software? Does this test just send off 1000 requests, one after the other? Having multiple users banging on the server at the same time might be more realistic.
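If the tool in question is ApacheBench (ab), concurrency is built in; a sketch (the URL is a placeholder, and 25 matches the req/sec estimate above, though -c counts in-flight requests rather than users):

```shell
# -n: total number of requests to send
# -c: number of requests to keep in flight concurrently
ab -n 1000 -c 25 http://localhost/page.php
```

Sending 1000 requests back-to-back from one client mostly measures single-connection latency; the -c flag is what exercises contention for connections, locks, and worker processes.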
Can you make the sending interval random? That might be a better representation of your real situation. Can you vary the data that the script uses?
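One way to randomize the interval is to draw exponential inter-arrival times, which models many independent users and gives a Poisson-like arrival process at a chosen rate. A minimal sketch; the parameter names and the data-variation helper are illustrative, not a real HTTP client:

```python
import random

rate = 25.0  # target requests per second (from the estimate above)

def next_interval():
    # Exponential gaps => Poisson-like arrivals at `rate` req/sec on average.
    return random.expovariate(rate)

def vary_params():
    # Vary the data each request uses so caches don't flatter the results.
    return {"user_id": random.randint(1, 3000),
            "page": random.choice(["a", "b", "c"])}

# Sanity check: the mean gap should be close to 1/rate.
intervals = [next_interval() for _ in range(10_000)]
print(sum(intervals) / len(intervals))  # ≈ 0.04 s
```

A loop that sleeps for `next_interval()` and fires a request built from `vary_params()` is a closer stand-in for real users than a fixed-interval hammer.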
Does it represent the conditions under which it'll actually be used? Other than that, all I can offer is my congratulations. Looks like you're being very thorough to me.
Try using xdebug to profile your code. Xdebug will also give you better on-screen errors & stack traces. Then use webgrind to view the profile in a nice format.
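Enabling the profiler is a php.ini change; a minimal sketch for Xdebug 3 (Xdebug 2 used `xdebug.profiler_enable` and `xdebug.profiler_output_dir` instead, and the output directory here is a placeholder):

```ini
zend_extension=xdebug
xdebug.mode=profile
xdebug.output_dir=/tmp/profiles
; Profile only requests that carry an XDEBUG_TRIGGER cookie/param,
; so the overhead isn't paid on every request.
xdebug.start_with_request=trigger
```

The resulting cachegrind files in the output directory are what webgrind (or KCachegrind) reads.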
200ms per request is a common threshold below which a page feels 'fast' to the majority of users.
You shouldn't be getting any failed requests - you need to check your error log to see why they're failing. It's most likely to be MySQL running out of connections, in which case you can simply tune your server to allow more concurrent connections (if you expect that amount of traffic).
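If the error log does point at MySQL connection exhaustion, the knob is the `max_connections` system variable; a my.cnf sketch (the value is illustrative, and raising it costs memory per connection, so confirm the cause first):

```ini
[mysqld]
# Default is 151 on recent MySQL versions; raise only as far as
# your expected concurrency actually requires.
max_connections = 200
```

You can check the current value at runtime with `SHOW VARIABLES LIKE 'max_connections';` before touching the config.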