I've got a web application with a MySQL database containing a device_status table that looks something like this:

deviceid | ... various status cols ... | created 
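
For concreteness, here's a rough sketch of the table; every column except deviceid and created is a placeholder for the "various status cols":

    -- Rough sketch only; status_code and detail are made-up placeholders.
    CREATE TABLE device_status (
        deviceid    INT UNSIGNED NOT NULL,
        status_code TINYINT      NOT NULL,   -- placeholder status column
        detail      VARCHAR(255) NULL,       -- placeholder status column
        created     DATETIME     NOT NULL,
        KEY idx_device_created (deviceid, created)  -- per-device, time-range lookups
    ) ENGINE=InnoDB;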

This table gets many inserts each day (2,000+ per device per day, and we expect to have 100+ devices by the end of the year).

Essentially, this table gets a record whenever almost anything happens on the device.

My question is: how should I deal with a table that's going to grow very large, very quickly?

  1. Should I just relax and hope the database will be fine in a few months, when this table has over 10 million rows, and then a year from now, when it has 100 million rows? This is the easiest option, but a table that large seems like it would have terrible performance.

  2. Should I archive older data after some time period (a month, a week) and have the web application query the live table for recent reports, and both the live and archive tables for reports covering a larger time span? (A rough sketch of this appears after this list.)

  3. Should I have an hourly and/or daily aggregate table that covers the various statuses for a device? If I do that, what's the best way to trigger the aggregation? Cron? A DB trigger? I'd probably still need to archive, too. (Also sketched below.)
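
To make options 2 and 3 concrete, this is roughly what I have in mind, run nightly from cron. The table names device_status_archive and device_status_daily are invented here, and I'm assuming the archive table has the same layout as the live one:

    -- Option 2: move rows older than 30 days into an archive table.
    -- A fixed cutoff keeps the INSERT and DELETE consistent with each other.
    SET @cutoff = NOW() - INTERVAL 30 DAY;

    INSERT INTO device_status_archive
        SELECT * FROM device_status WHERE created < @cutoff;
    DELETE FROM device_status WHERE created < @cutoff;

    -- Option 3: roll yesterday's rows up into one row per device per day.
    INSERT INTO device_status_daily (deviceid, day, event_count)
    SELECT deviceid, DATE(created), COUNT(*)
    FROM device_status
    WHERE created >= CURDATE() - INTERVAL 1 DAY
      AND created <  CURDATE()
    GROUP BY deviceid, DATE(created);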

There has to be a more elegant solution for handling this kind of data.

I had a similar problem tracking the number of views for advertisers on my site. Initially I was inserting a new row for each view and, as you predict here, that quickly led to the table growing unreasonably large (to the point that it was indeed causing performance issues, which ultimately led to my web host shutting the site down for a few hours until I had addressed the problem).

The solution I went with is similar to your option #3. Instead of inserting a new record whenever a view occurs, I update the existing record for the timeframe in question. In my case, I went with daily records for each ad; the right timeframe for your application depends entirely on the specifics of your data and your needs.

Unless you specifically need to track each occurrence over the last hour, you may be overdoing it by storing every event and aggregating later. Instead of bothering with a cron job to perform regular aggregation, you can simply check for an existing row with matching attributes. If you find one, update a count field on that row instead of inserting a new one.
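
In MySQL you can even collapse that check-then-update into a single statement with INSERT ... ON DUPLICATE KEY UPDATE, provided you put a unique key on the columns that identify the timeframe. A sketch with made-up names (an ad_views table keyed on ad for ad_id 42):

    -- Requires: UNIQUE KEY (ad_id, view_date) on ad_views.
    -- Inserts a fresh counter for today, or bumps the existing one.
    INSERT INTO ad_views (ad_id, view_date, view_count)
    VALUES (42, CURDATE(), 1)
    ON DUPLICATE KEY UPDATE view_count = view_count + 1;

This keeps the table at one row per ad per day without any scheduled job at all.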