Within an online ticketing system I have built, I need to add real-time analytical reporting on purchases for my client.
The key order information is spread across multiple tables (clients, orders, line_products, package_types, tickets). Each table also contains additional data that is irrelevant to the reports my client may require.
I am considering recording each order as a separate line item in a denormalized report table, and I am trying to figure out whether this is sensible or not.
Generally, the queries I run for these reports only join across 2 or 3 of the tables at any given time. Each table has the appropriate indexes in place.
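For context, a typical report query looks roughly like this (the join keys and column names here are illustrative, not my exact schema):

```sql
-- Illustrative report query joining three of the tables;
-- column and key names are assumptions, not the real schema.
SELECT o.id,
       c.name                            AS client_name,
       SUM(li.quantity * li.unit_price)  AS order_total
FROM orders o
JOIN clients c        ON c.id = o.client_id
JOIN line_products li ON li.order_id = o.id
GROUP BY o.id, c.name;
```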
Does it make sense to compile all of the order data into one table that contains just the columns necessary for the reporting?
The application is built on Ruby on Rails 3 and the database is PostgreSQL.
EDIT: The goal here is to render the information in the browser as quickly as possible for the user.
It depends on what your goal is. If you want the report results to display faster, then that will certainly work. The downside is that the data has to be maintained through batch updates. You could write a trigger that updates the report table whenever a new record comes into the base tables, but that could add a lot of overhead.
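A minimal sketch of that trigger approach, assuming a flat `order_report` table and hypothetical column names on `orders` and `clients` (adjust to your real schema):

```sql
-- Hypothetical denormalized table plus a trigger that keeps it in sync.
-- Table and column names are assumptions for illustration.
CREATE TABLE order_report (
  order_id    integer PRIMARY KEY,
  client_name text,
  order_total numeric,
  ordered_at  timestamp
);

CREATE OR REPLACE FUNCTION sync_order_report() RETURNS trigger AS $$
BEGIN
  INSERT INTO order_report (order_id, client_name, order_total, ordered_at)
  SELECT NEW.id, c.name, NEW.total, NEW.created_at
  FROM clients c
  WHERE c.id = NEW.client_id;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER order_report_sync
AFTER INSERT ON orders
FOR EACH ROW EXECUTE PROCEDURE sync_order_report();
```

Note that this only covers inserts on `orders`; you would need similar triggers for updates, deletes, and for the other base tables, which is where the maintenance overhead comes from.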
Perhaps a view, rather than a new table, is a better solution in this situation?
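For comparison, the same report as a plain view (again with assumed names): it is always current and needs no sync code, but every read pays the full join cost. If your PostgreSQL version supports materialized views, refreshing one on a schedule sits between the two approaches.

```sql
-- A view over the base tables: no trigger maintenance, but each
-- read re-runs the joins. Names are illustrative only.
CREATE VIEW order_report AS
SELECT o.id                              AS order_id,
       c.name                            AS client_name,
       SUM(li.quantity * li.unit_price)  AS order_total
FROM orders o
JOIN clients c        ON c.id = o.client_id
JOIN line_products li ON li.order_id = o.id
GROUP BY o.id, c.name;
```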