I love ORM tools, but I have often thought that for big updates (thousands of rows), it seems inefficient to load, update and save when something like

UPDATE [table] set [column] = [value] WHERE [predicate]

gives far better performance.

However, assuming one went down this route for performance reasons, how does one then make sure that any objects cached in memory are updated properly?

Say you're using LINQ to SQL and you've been working with a DataContext: how can you make sure that the high-performance UPDATE is reflected in the DataContext's object graph?

This may be a "you don't" or "use triggers on the DB to call .NET code that drops the cache" and so forth, but I'm interested to hear common approaches to this kind of problem.
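For concreteness, the LINQ to SQL pattern being asked about might look something like the sketch below. `MyDataContext` and `Customer` are hypothetical types; `ExecuteCommand` and `Refresh` are the real DataContext methods for raw SQL and for re-reading an entity.

```csharp
// Sketch: set-based UPDATE via raw SQL, then re-sync a cached entity.
// MyDataContext and Customer are hypothetical LINQ to SQL types.
using (var db = new MyDataContext())
{
    var cached = db.Customers.First(c => c.Region == "EU");

    // High-performance set-based update that bypasses the object graph:
    db.ExecuteCommand("UPDATE Customers SET Discount = {0} WHERE Region = {1}",
                      0.1m, "EU");

    // The cached entity is now stale; pull the current values back from the DB:
    db.Refresh(RefreshMode.OverwriteCurrentValues, cached);
}
```

`Refresh` only fixes objects you explicitly pass it, which is exactly the crux of the question: anything else already materialized in the DataContext stays stale.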

You're right, in cases like this using an ORM to load, change and then persist records isn't efficient. My process goes something like this:

1) Early in implementation, use the ORM (in my case NHibernate) exclusively

2) As development matures, identify performance problems, which will include large updates

3) Refactor those to a SQL or stored-procedure approach

4) Use the Refresh(object) command to update cached objects
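Steps 3 and 4 together might look like this with NHibernate (a sketch; `Order`, `orderId` and the `sessionFactory` setup are assumed):

```csharp
// Sketch (NHibernate): bulk update via raw SQL, then refresh a cached object.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var order = session.Get<Order>(orderId);   // now held in the session cache

    // Set-based update the session's first-level cache knows nothing about:
    session.CreateSQLQuery("UPDATE Orders SET Status = :s WHERE Status = :old")
           .SetString("s", "Shipped")
           .SetString("old", "Packed")
           .ExecuteUpdate();

    // Re-read the row so the in-memory object matches the database again:
    session.Refresh(order);

    tx.Commit();
}
```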

My big problem has been telling other clients that the update has happened. Most of the time we have accepted that some clients will be stale, which is the case with standard ORM usage anyway, and then check a timestamp on update/insert.

Most ORMs also provide facilities for performing large or "bulk" updates efficiently. The Stateless Session is one such mechanism, available in Hibernate for Java and apparently coming in NHibernate 2.x:

http://ayende.com/Blog/archive/2007/11/13/What-is-going-on-with-NHibernate-2..aspx
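Used with NHibernate 2.x, the stateless-session API would look roughly like this (a sketch; `Customer` and the `sessionFactory` are assumptions). Because a stateless session keeps no first-level cache, there is no object graph left behind to go stale:

```csharp
// Sketch (NHibernate 2.x): a stateless session performs no caching,
// so bulk work done through it leaves no stale in-memory objects.
using (IStatelessSession session = sessionFactory.OpenStatelessSession())
using (ITransaction tx = session.BeginTransaction())
{
    var customers = session.CreateQuery("from Customer c where c.Region = :r")
                           .SetString("r", "EU")
                           .List<Customer>();

    foreach (var c in customers)
    {
        c.Discount = 0.1m;
        session.Update(c);   // issued straight to the DB; nothing is cached
    }

    tx.Commit();
}
```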

ORMs are great for rapid development, but you're right: they're not efficient. They're great in that you don't have to think about the underlying mechanisms that convert your objects in memory to rows in tables and back again. However, many times the ORM doesn't pick the best process to do that. If you care about the performance of your application, it's best to work with a DBA who can help you design the database and tune your queries properly (or at least understand the fundamental concepts of SQL yourself).

Bulk updates are a questionable design. They sometimes seem necessary; in many cases, however, a better application design can remove the need for bulk updates.

Often, various parts of the application have already touched each object individually; the "bulk" update should have been done in another part of the application.

In other cases, the update is really a prelude to processing done elsewhere. In that situation, the update should be part of the later processing.

My general design strategy is to refactor applications to remove bulk updates.

ORMs just won't be as efficient as hand-crafted SQL. Period. Just as hand-crafted assembler will be faster than C#. Whether that performance difference matters depends on lots of things. In some cases the higher level of abstraction ORMs give you might be worth more than the potentially greater performance; in other cases not.

Traversing relationships with object code can be very nice, but as you rightly point out, there are potential problems.

That being said, I personally view ORMs as largely a false economy. I won't repeat myself here but will simply point to Using an ORM or plain SQL?