I want to understand how to log data modifications.

I've come across two methods for tracking data changes (DML):

  1. Using Triggers
  2. Keeping extra columns in the same table: Added Date, Added By, Modified Date, Modified By.

Using approach (1), I can write triggers for Insert/Delete/Update on each table to log changes, and I can still use foreign key relationships along with other constraints, such as unique key constraints, on all of the tables as required.
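For concreteness, here is a minimal sketch of approach (1) using Python's stdlib `sqlite3` for illustration only; my real tables are in SQL Server, where trigger syntax differs (T-SQL triggers read the `inserted`/`deleted` pseudo-tables rather than `OLD`/`NEW` row references), and the `employee` table and column names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employee (
    id     INTEGER PRIMARY KEY,
    name   TEXT NOT NULL,
    salary INTEGER NOT NULL
);

-- Separate audit table: one row per change, keyed back to the source row.
CREATE TABLE employee_audit (
    audit_id   INTEGER PRIMARY KEY,
    emp_id     INTEGER NOT NULL REFERENCES employee(id),
    action     TEXT NOT NULL,          -- 'INSERT' / 'UPDATE' / 'DELETE'
    old_salary INTEGER,
    new_salary INTEGER,
    changed_at TEXT DEFAULT CURRENT_TIMESTAMP
);

-- The trigger fires per changed row and logs the change automatically,
-- so no application code has to remember to write the audit record.
CREATE TRIGGER employee_update_audit
AFTER UPDATE ON employee
BEGIN
    INSERT INTO employee_audit (emp_id, action, old_salary, new_salary)
    VALUES (OLD.id, 'UPDATE', OLD.salary, NEW.salary);
END;
""")

conn.execute("INSERT INTO employee (id, name, salary) VALUES (1, 'Alice', 50000)")
conn.execute("UPDATE employee SET salary = 55000 WHERE id = 1")
rows = conn.execute(
    "SELECT emp_id, action, old_salary, new_salary FROM employee_audit"
).fetchall()
print(rows)  # [(1, 'UPDATE', 50000, 55000)]
```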

However, I don't know how to apply the various constraints with approach (2), since I would have to create a composite unique key and account for many more columns.
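To show what I mean by approach (2), here is a sketch (again using SQLite via Python purely for illustration; the `product` table and its columns are invented). As far as I can tell, the business uniqueness constraint can stay on the business column alone, with the audit columns simply added alongside, but I'm not sure this is the recommended design:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Audit metadata lives in extra columns on the same row; the uniqueness
-- constraint still applies only to the business column (sku), so the
-- audit columns do not have to participate in any composite key.
CREATE TABLE product (
    id            INTEGER PRIMARY KEY,
    sku           TEXT NOT NULL UNIQUE,
    price         INTEGER NOT NULL,
    added_date    TEXT NOT NULL,
    added_by      TEXT NOT NULL,
    modified_date TEXT,
    modified_by   TEXT
);
""")

now = datetime.now(timezone.utc).isoformat()
conn.execute(
    "INSERT INTO product (sku, price, added_date, added_by) VALUES (?, ?, ?, ?)",
    ("ABC-1", 999, now, "app_user"),
)
# An update overwrites the modified_* columns in place: only the latest
# change survives, which is the main limitation of this approach.
conn.execute(
    "UPDATE product SET price = ?, modified_date = ?, modified_by = ? WHERE sku = ?",
    (1099, now, "app_user", "ABC-1"),
)
row = conn.execute("SELECT sku, price, modified_by FROM product").fetchone()
print(row)  # ('ABC-1', 1099, 'app_user')
```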

Are there any design issues with the database tables? What is the recommended way to log data with approach (2)?

Which approach is better?

Also, I have heard from some of my colleagues that triggers don't fire on bulk insert queries. Is that true?

The answer I like best for this is to require (using security measures, if possible) all application data access, or at least all CRUD operations, to go through stored procedures, and to have the stored procedures manage change tracking and auditing.
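Roughly, the idea can be sketched like this. I'm using Python over stdlib `sqlite3` here only to keep the example self-contained; in SQL Server the `update_salary_audited` function would be a T-SQL stored procedure, and all names are invented. The point is that the data change and its audit record happen together, set-based, inside one transaction:

```python
import sqlite3

def update_salary_audited(conn, emp_id, new_salary, actor):
    """Sketch of what an auditing stored procedure would do: write the
    audit row and apply the change together, in a single transaction."""
    with conn:  # sqlite3 connection as context manager = one transaction
        # Capture the old value set-based, before overwriting it.
        conn.execute(
            """INSERT INTO employee_audit (emp_id, old_salary, new_salary, changed_by)
               SELECT id, salary, ?, ? FROM employee WHERE id = ?""",
            (new_salary, actor, emp_id),
        )
        conn.execute(
            "UPDATE employee SET salary = ? WHERE id = ?", (new_salary, emp_id)
        )

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employee (id INTEGER PRIMARY KEY, salary INTEGER NOT NULL);
CREATE TABLE employee_audit (
    audit_id   INTEGER PRIMARY KEY,
    emp_id     INTEGER NOT NULL,
    old_salary INTEGER,
    new_salary INTEGER,
    changed_by TEXT
);
INSERT INTO employee (id, salary) VALUES (1, 50000);
""")

update_salary_audited(conn, 1, 60000, "payroll_sp")
audit = conn.execute(
    "SELECT old_salary, new_salary, changed_by FROM employee_audit"
).fetchall()
print(audit)  # [(50000, 60000, 'payroll_sp')]
```

Because the procedure is the only sanctioned write path, nothing can modify the table without leaving an audit record, and there is no per-row trigger overhead.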

The problem with triggers is that they don't scale well: they can impose significant performance hits by turning what should be batched inserts or updates into many single-row operations.

As for whether to keep the changes in-row or in a separate table, my preference is a separate table. That way, I can also keep a history of the actual changes if needed, rather than just who-and-when. Keeping the source table narrow also helps with query/read performance.

You can also apply partitioning to the audit tables, which makes them easy to age off / drop when the time comes.
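To illustrate the age-off idea only (not real partitioning), the sketch below simulates date-based partitions with one SQLite table per month; in SQL Server proper table partitioning keeps these inside a single logical table, and dropping or switching out an old partition is similarly a metadata operation rather than a row-by-row DELETE. All table names here are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Simulate monthly audit "partitions" as separate tables.
for month in ("2024_01", "2024_02"):
    conn.execute(f"CREATE TABLE audit_{month} (emp_id INTEGER, change TEXT)")
conn.execute("INSERT INTO audit_2024_01 VALUES (1, 'old change')")
conn.execute("INSERT INTO audit_2024_02 VALUES (1, 'recent change')")

# Aging off the oldest month is a cheap drop, not a logged row-by-row DELETE.
conn.execute("DROP TABLE audit_2024_01")

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['audit_2024_02']
```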

You are missing a big one: Change Data Capture. SQL Server Enterprise Edition has this feature built in already.