I'm working on a CakePHP application right now and I'm writing a few validation rules for some of the models.
I have a feeling I'm making a lot of queries against the database and would like to get a second opinion on this.
For instance, I'm saving to a model which has a lot of associated models, so I'm saving 5 different foreign IDs per save operation. To make sure that all of the IDs are valid, I have a validation rule for each one which checks that the ID actually appears in the right table in the database.
There are also a few other validation rules, but without going into too much detail, I believe I'm hitting the database around 10 times for a single save operation just to make sure that all of the information is valid.
Is there any kind of limit I should be staying under, or is the overhead fairly minimal?
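For concreteness, one such existence check might look like the sketch below. This assumes CakePHP 2.x conventions; `Order`, `Customer`, and `Product` are made-up model names standing in for my real ones:

```php
<?php
// app/Model/Order.php -- hypothetical model; names are illustrative only.
class Order extends AppModel {
    public $belongsTo = array('Customer', 'Product');

    public $validate = array(
        'customer_id' => array(
            'rule'    => array('recordExists', 'Customer'),
            'message' => 'Customer does not exist.',
        ),
        'product_id' => array(
            'rule'    => array('recordExists', 'Product'),
            'message' => 'Product does not exist.',
        ),
    );

    // Custom rule: one COUNT query per foreign key to confirm the row exists.
    public function recordExists($check, $modelName) {
        $values = array_values($check);
        $id = $values[0];
        return $this->{$modelName}->find('count', array(
            'conditions' => array($modelName . '.id' => $id),
        )) > 0;
    }
}
```

With five associations like this, every save fires five of these COUNT queries before the record is even written.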
If the connections you establish are for a good reason (and validations rank pretty high up there), then it should be okay to connect as often as you must. However, isn't it possible to put these validations together? Are you using stored procedures? If so, it might be possible to put the whole batch of validations into a single stored procedure. It's like saying: run as many validation rules as possible together in one go, then expect multiple result sets as needed. That's somewhat better than simply returning false when the first validation fails, though it depends on the dependencies between those validations.
To shorten my post: would it be possible to put these validation queries together in a single stored procedure?
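Even without a stored procedure, the same idea works as a single round trip that returns one flag per check. A rough sketch using plain PDO for clarity (the table and column names are hypothetical, and a stored procedure wrapping this query would return the same flags):

```php
<?php
// Sketch: collapse several existence checks into one round trip.
// Table/column names and the DSN are made up; adapt to your schema.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$sql = "SELECT
            EXISTS(SELECT 1 FROM customers WHERE id = :customer_id) AS customer_ok,
            EXISTS(SELECT 1 FROM products  WHERE id = :product_id)  AS product_ok,
            EXISTS(SELECT 1 FROM couriers  WHERE id = :courier_id)  AS courier_ok";

$stmt = $pdo->prepare($sql);
$stmt->execute(array(
    ':customer_id' => 12,
    ':product_id'  => 34,
    ':courier_id'  => 56,
));

// Each flag is 1 (row exists) or 0; fail validation on any 0.
$flags   = $stmt->fetch(PDO::FETCH_ASSOC);
$invalid = array_keys(array_filter($flags, function ($ok) { return !$ok; }));
```

One query instead of five, and `$invalid` still tells you exactly which checks failed, so you don't lose per-field error messages.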
Make sure that your database is properly designed. With the limited information you provided it's hard to judge whether you really need to go through all of those tables, but typically you wouldn't have to.
Also, hitting the database unnecessarily isn't a great idea, though it all depends on your application. If you don't actually need very high performance you may be fine. The overhead for reads won't be too bad, because the database will do some memory caching. Writes will be more costly, especially if you have foreign key constraints, since the DB will check whether the write is valid.
The easiest way to find out is to write a test which tries to write "x" records to the database through your logic, and time it. That will give you an indication of whether this is a problem at all.
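Something along these lines, run from a CakePHP shell or test case (the `Order` model and its fields are placeholders for your own):

```php
<?php
// Rough benchmark sketch: time "x" saves through the model layer,
// validations included. Numbers and model names are illustrative.
$records = 1000;
$start   = microtime(true);

for ($i = 0; $i < $records; $i++) {
    $this->Order->create();
    $this->Order->save(array('Order' => array(
        'customer_id' => 12,
        'product_id'  => 34,
    )));
}

$elapsed = microtime(true) - $start;
printf("%d saves in %.2fs (%.1f ms/save)\n",
       $records, $elapsed, 1000 * $elapsed / $records);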
Premature optimization is the root of all evil.
Get the application rolled out first. Once real people are using it for real reasons, you can run benchmarks to see where you actually need to optimize.