Okay, so I'm sure plenty of you have built crazy database-intensive pages...
I'm building a page that I'd like to pull all sorts of different / unrelated database information from. Here are a few sample queries for that one page:
-articles and article info
-if the article's author is a registered user, their info
-UPDATE the article's view counter
-retrieve comments on the article
-retrieve info on the authors of those comments
-if the reader of this article is signed in, query for info on them
and so forth...
I understand these are all going to be pretty lightning quick... and that I can combine some... but I wanted to make sure this isn't abnormal?
How many fairly normal and non-heavy queries would you limit yourself to on a page?
As many as needed, but not more.
Really: don't worry about optimization (right now). Build it first, measure performance second, and IFF there is a performance problem somewhere, then start with optimization.
Otherwise, you risk spending a lot of time optimizing something that doesn't need it.
I have had pages with 50 queries on them without a problem. A fast query to a non-large (i.e., fits in main memory) table can happen in 1 millisecond or less, so you can do quite a few of those.
If your page loads in under 200 ms, you will have a snappy site. A big chunk of that is eaten up by the latency between the server and the browser, so I like to aim for < 100 ms of time spent on the server. Do as many queries as you want in that time.
The big bottleneck is probably the amount of time you have to spend on the project, so optimize for that first :) Optimize the code later, if you have to. That being said, if you're going to write any code related to this problem, write something that makes it obvious how long your queries take. That way you can at least find out when you have a problem.
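One lightweight way to do that is to funnel every query through a helper that logs any that run long. A minimal sketch in Python with the stdlib sqlite3 module (the table, data, and 10 ms threshold are all made up for illustration):

```python
import sqlite3
import time

def timed_query(conn, sql, params=(), slow_ms=10.0):
    """Run a query and print a warning when it exceeds slow_ms milliseconds."""
    start = time.perf_counter()
    rows = conn.execute(sql, params).fetchall()
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms > slow_ms:
        print(f"SLOW ({elapsed_ms:.1f} ms): {sql}")
    return rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO articles (title) VALUES ('hello')")

# Every page query goes through the helper, so slow ones announce themselves.
rows = timed_query(conn, "SELECT id, title FROM articles WHERE id = ?", (1,))
print(rows)
```

If all queries on a page go through one chokepoint like this, finding the slow one later is trivial.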
Every query requires a round-trip to your database server, so the cost of many queries grows with the latency to it.
Even if it runs on the same host, there will still be a small speed penalty, not only because a socket sits between the application and the server, but also because the server has to parse your query, build the response, check access, and whatever other overhead you get with SQL servers.
So in general it's better to have fewer queries.
You should try to do as much as possible in SQL, though: don't fetch a pile of rows as input for some algorithm in your client language when the same algorithm could easily be implemented in SQL itself. This not only reduces the number of queries, it also helps a great deal in selecting only the rows you need.
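For instance, counting an article's comments in the database ships one row back instead of all of them. A sketch with sqlite3 (the comments schema is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (id INTEGER PRIMARY KEY, article_id INTEGER)")
conn.executemany("INSERT INTO comments (article_id) VALUES (?)",
                 [(1,), (1,), (2,)])

# Client-side approach: fetch every row, then count in Python.
all_rows = conn.execute("SELECT article_id FROM comments").fetchall()
count_in_python = sum(1 for (aid,) in all_rows if aid == 1)

# SQL approach: the server does the work and returns only the answer.
(count_in_sql,) = conn.execute(
    "SELECT COUNT(*) FROM comments WHERE article_id = ?", (1,)).fetchone()

print(count_in_python, count_in_sql)  # same answer, very different transfer cost
```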
Piskvor's answer still applies regardless, though.
I don't think there's any one correct answer to this. I'd say as long as the queries are fast, and the page follows a logical flow, there shouldn't be any arbitrary cap enforced on it. I've seen pages fly with a dozen queries, and I've seen them crawl with one.
WordPress, for example, can pull up to 30 queries a page. There are several things you can use to stop MySQL from pulling your site down - one of them being memcache - but for now, and as you say, if it's going to be a straightforward page, just make sure all the data you pull is properly indexed in MySQL and don't worry about the number of queries.
If you're using a framework (CodeIgniter, for instance) you can generally pull up the page creation times and check what is dragging your site down.
As others have said, there's no single number. Whenever you can, use SQL for what it was designed for and retrieve sets of data together.
Generally, a sign that you might be doing something wrong is when you have a SQL query inside a loop.
Whenever possible, use joins to retrieve data that belongs together instead of sending multiple statements.
Always try to make sure your statements retrieve exactly what you need, with no extra fields or rows.
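The classic case is the "query in a loop" pattern from the question: one query per comment to fetch its author (1 + N round-trips) versus a single join. A sketch with sqlite3 (the users/comments schema is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE comments (id INTEGER PRIMARY KEY, user_id INTEGER, body TEXT);
    INSERT INTO users VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO comments VALUES (1, 1, 'first'), (2, 2, 'second');
""")

# Anti-pattern: a query inside a loop, one round-trip per comment.
comments = conn.execute("SELECT id, user_id, body FROM comments").fetchall()
for _cid, user_id, _body in comments:
    conn.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()

# Better: one join returns each comment together with its author.
joined = conn.execute("""
    SELECT c.body, u.name
    FROM comments AS c
    JOIN users AS u ON u.id = c.user_id
    ORDER BY c.id
""").fetchall()

print(joined)  # [('first', 'alice'), ('second', 'bob')]
```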