Given a small group of entities (say, 10 or fewer) to insert, delete, or update within an application, what's the easiest way to carry out the necessary database operations? Should multiple queries be issued, one for each entity to be affected? Or should some kind of XML construct that can be parsed by the database engine be used, so that only one command needs to be issued? Or some other method that I'm unaware of?
I ask this because a common pattern in my current shop seems to be to build an XML document containing all the new changes, then send that string to the database to be processed by the database engine's XML functionality. However, using XML in this way seems rather cumbersome, especially given the simple nature of the task to be performed.
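For a handful of entities, the plainest alternative to the XML approach is a batch of individual parameterized statements wrapped in one transaction. A minimal sketch, using Python's `sqlite3` purely as a stand-in since the question doesn't name a database (table and data are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE widgets (id INTEGER PRIMARY KEY, name TEXT)")

# A small, mixed batch of changes, one tuple per entity.
changes = [
    ("insert", (1, "alpha")),
    ("insert", (2, "beta")),
    ("update", (2, "beta-2")),
    ("delete", (1,)),
]

with conn:  # one transaction: all changes commit (or roll back) together
    for op, args in changes:
        if op == "insert":
            conn.execute("INSERT INTO widgets (id, name) VALUES (?, ?)", args)
        elif op == "update":
            conn.execute("UPDATE widgets SET name = ? WHERE id = ?",
                         (args[1], args[0]))
        elif op == "delete":
            conn.execute("DELETE FROM widgets WHERE id = ?", args)

print(conn.execute("SELECT id, name FROM widgets").fetchall())
# [(2, 'beta-2')]
```

The transaction gives you the same all-or-nothing behaviour the single XML command would, without the serialization and parsing steps.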
You didn't mention which database you're using, but in SQL Server 2008 you can use table-valued parameters to pass complex data like this to a stored procedure. Parse it there and perform your operations. For more information, see Scott Allen's article on OdeToCode.
It depends on how many you need to do, and how fast the operations need to run. If it's only a few, then doing them individually with whatever mechanism you have for single operations works fine.
If you need to do thousands or more, and it needs to run quickly, you should reuse the connection and command, changing only the arguments for the query's parameters on each iteration. This minimizes resource usage. You don't want to re-create the connection and command for every operation.
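The reuse pattern above can be sketched with `sqlite3`'s `executemany`, which keeps one connection and one statement and swaps only the bound parameters per row (the table and row count here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")

rows = [(i, i * 0.5) for i in range(5000)]

# One connection, one statement; only the parameter values change on
# each iteration, so the driver can reuse the prepared statement
# instead of re-preparing it for every row.
with conn:
    conn.executemany("INSERT INTO readings (id, value) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 5000
```

Most client libraries (ADO.NET, JDBC, etc.) have an equivalent: prepare once, bind and execute in a loop.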
Most databases support bulk UPDATE or bulk DELETE operations.
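The exact bulk syntax varies by engine, but the common idea is one set-based statement instead of a statement per row. A small illustration (hypothetical `logs` table, `sqlite3` again standing in for the real database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (id INTEGER PRIMARY KEY, level TEXT)")
conn.executemany(
    "INSERT INTO logs (id, level) VALUES (?, ?)",
    [(i, "DEBUG" if i % 2 else "ERROR") for i in range(10)],
)

# A single set-based DELETE replaces ten per-row deletes; the engine
# removes every matching row in one pass.
with conn:
    deleted = conn.execute("DELETE FROM logs WHERE level = ?", ("DEBUG",)).rowcount

print(deleted)  # 5
```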
From a "business" design perspective, if you're doing different operations on each of a set of entities, you should have each entity handle its own persistence.
If there are common batch activities (like "delete everything older than x date", for example), I'd write a static method on the collection class that executes the batch update or delete. I generally let entities handle their own inserts atomically.
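A minimal sketch of that split, assuming a hypothetical `orders` table and `OrderCollection` class: per-entity inserts stay with the entity, while the set-based "delete older than x" lives as a static method on the collection:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, placed TEXT)")
conn.executemany(
    "INSERT INTO orders (id, placed) VALUES (?, ?)",
    [(1, "2009-01-10"), (2, "2009-03-05"), (3, "2009-06-20")],
)

class OrderCollection:
    """Hypothetical collection class that owns batch operations."""

    @staticmethod
    def delete_older_than(conn, cutoff):
        # One set-based statement, rather than loading each Order
        # entity and asking it to delete itself.
        with conn:
            return conn.execute(
                "DELETE FROM orders WHERE placed < ?", (cutoff.isoformat(),)
            ).rowcount

removed = OrderCollection.delete_older_than(conn, date(2009, 4, 1))
print(removed)  # 2
```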
The answer depends on the volume of data you're talking about. If you have a fairly small set of records in memory that you need to synchronise to disk, then multiple queries are probably appropriate. If it's a larger set of data, you need to look at other options.
I recently had to implement a mechanism where an external data feed provided ~17,000 rows of data that I needed to synchronise with a local table. The solution I chose was to load the external data into a staging table and call a stored proc that did the synchronisation entirely within the database.
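The staging-table approach above can be sketched as follows. This is a simplified, hypothetical version (a three-row feed instead of 17,000, `sqlite3` instead of the answerer's actual database, and inline SQL where they used a stored proc), but the shape is the same: bulk-load into staging, then synchronise set-wise inside the database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE items   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE staging (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO items VALUES (1, 'old'), (2, 'keep');
""")

feed = [(2, "keep"), (3, "new")]  # stand-in for the external data feed

with conn:
    conn.execute("DELETE FROM staging")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", feed)
    # Synchronise entirely inside the database: drop local rows missing
    # from the feed, then insert or replace rows present in it.
    conn.execute("DELETE FROM items WHERE id NOT IN (SELECT id FROM staging)")
    conn.execute("INSERT OR REPLACE INTO items SELECT id, name FROM staging")

print(conn.execute("SELECT id, name FROM items ORDER BY id").fetchall())
# [(2, 'keep'), (3, 'new')]
```

Keeping the synchronisation in SQL means the 17,000-row comparison never round-trips row-by-row between the application and the database.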