Is there a reason why most logs seem to be kept in plain text, rather than being put into MySQL or some other kind of database?
In my experience, putting them straight into a database would make analysis much, much easier… but would that come at the cost of speed or something else?
(I'm not too worried about portability, and obviously you'd still have text logs for the database connection itself.)
I can think of two big reasons:
First, databases are slower than text files when it comes to simply appending information to a file. With a database, you have to open a connection, transmit the data over the network, store it in an indexed structure, and so on. With a file, you only need to write the message to local disk.
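A minimal sketch of the two code paths, using Python's stdlib `sqlite3` as a stand-in for MySQL (a real MySQL server would add connection setup and network round-trips on top of what is shown here; the file and table names are made up for illustration):

```python
import sqlite3

message = "2024-01-01 12:00:00 ERROR payment gateway timeout"

# File logging: open, append one line, done -- a single local disk write.
with open("app.log", "a") as f:
    f.write(message + "\n")

# Database logging: connect, maintain an indexed structure, and parse and
# execute SQL for every record.
conn = sqlite3.connect("app_log.db")
conn.execute("CREATE TABLE IF NOT EXISTS log (ts TEXT, level TEXT, msg TEXT)")
conn.execute(
    "INSERT INTO log VALUES (?, ?, ?)",
    ("2024-01-01 12:00:00", "ERROR", "payment gateway timeout"),
)
conn.commit()
conn.close()
```

Per record, the file path is one buffered write; the database path involves SQL parsing, index maintenance, and (for a networked server) round-trip latency.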
Second, sometimes the very thing you want to log is that the database is down. If the local disk is down, you have bigger problems than trying to write log files; but you can log database outages as long as everything else is still working.
That said, there are plenty of situations where the information I want to log is relevant only while the application is functioning correctly, and where I already have a database connection. In those cases, I do log straight to MySQL.
Although you aren't worried about portability, I believe it is a large part of the reason. File I/O is nearly universal, with a very consistent API. Other advantages include:
- Fewer moving parts
- Nothing to install
- Low skill barrier
- Mature tools for writing, parsing, and managing logs
- No dependency on the network (assuming local disk, not NFS or other remote storage)
- Can still load the logs into a DB later for reporting/analysis
- grep is your friend
- No file equivalent of SQL injection to worry about (at least for storing the data; not necessarily for reporting on it or a later DB import)
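The "load into a DB later" point can be sketched like this: grep-able plain-text lines parsed after the fact into a table for SQL-style reporting. The log format and regex here are assumptions for illustration, and an in-memory SQLite database stands in for whatever reporting DB you'd use:

```python
import re
import sqlite3

# Plain-text log lines as they would appear in a file (format assumed).
log_lines = [
    "2024-01-01 12:00:00 INFO user login",
    "2024-01-01 12:00:05 ERROR disk full",
    "2024-01-01 12:00:09 INFO user logout",
]

pattern = re.compile(r"^(\S+ \S+) (\w+) (.*)$")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (ts TEXT, level TEXT, msg TEXT)")
for line in log_lines:
    m = pattern.match(line)
    if m:
        conn.execute("INSERT INTO log VALUES (?, ?, ?)", m.groups())
conn.commit()

# SQL-style reporting becomes available without having logged to a DB.
errors = conn.execute(
    "SELECT COUNT(*) FROM log WHERE level = 'ERROR'"
).fetchone()[0]
print(errors)  # -> 1
```

So you get the appending speed and robustness of files, and pay the database cost only when (and if) you analyze.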
Nevertheless, there is nothing wrong with logging to a DB if the nature of the application lends itself to that, and I have seen many applications that do.
Databases carry significant overhead in terms of memory, storage space, and efficiency. Adding new records to a database, or modifying existing ones, is far slower than appending to a file. (Additionally, many people don't know SQL and/or the details of setting up a database.)
If, however, you need analysis or metrics capabilities that are hard to get from a simple text file, then there is certainly nothing wrong with it. It is very much a case-by-case situation.
(Others have already listed numerous advantages of file-based logging.)
I think DB logging becomes more useful when logs are collected on a remote machine (for instance, via syslog/rsyslog on Linux), as a backup: this can be helpful if the original machine is compromised and its logs tampered with. Collecting logs in a database (perhaps specifically on that remote machine) is useful in this situation, since it can help with sorting those logs. You can also view the logs more conveniently via tools such as phpLogCon, or browse them through custom web pages (often simpler than logging onto the machine if you are just doing a bit of casual monitoring).
That being said, remote logging, logging to a DB, and having a nice tool to view the logs are fairly independent choices (I believe phpLogCon can work with file logs too). When I store logs in a DB, I also store them in a file at the same time, if only to be able to read them when the connection to the DB is down.