A few years ago I was evaluating a cool log analysis package called Splunk for a project at work. I had a few instances running on a development machine at the office and on a server at home, and I found I could drill down to very specific events to debug what was happening, correlating problems among various devices and software packages. When I upgraded my home server a year ago I didn't take the time to reinstall Splunk; I was busy moving into a new house and having children, so it went on the back burner.
Recently Splunk came up in a conversation about system monitoring architecture, so I decided to take a look and see what a few years of maturity have done. First of all, the basic software is now free for individual use. While the free version drops some enterprise features and has no password/account authentication, the core functionality is all there. There is a 500MB daily limit on the amount of data you can process, but if you have half a gigabyte of syslogs and logfiles to parse every day, you can afford to buy a full license. If the lack of authentication makes you paranoid, it would be very easy to bind the management interface to localhost only, so you would have to use an SSH tunnel into the box to view any of the data. I know that is pretty hokey, but it does work: only someone with a shell account on the box gets to see your data. Beyond that, you could always run Splunk inside a virtual machine.
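As a sketch of that SSH-tunnel approach: assuming the Splunk web interface is bound to the server's loopback address on its default port of 8000 (the hostname, username, and ports below are placeholders for your own setup):

```shell
# Forward local port 8000 to the Splunk web interface on the server,
# which is listening only on the server's loopback address.
# -N means "no remote command" -- just hold the tunnel open.
ssh -N -L 8000:localhost:8000 user@myserver

# Then browse to http://localhost:8000 on your workstation;
# traffic reaches Splunk only through the encrypted tunnel.
```

Anyone without an SSH account on the box sees nothing, since the port is never exposed on a public interface.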
Beyond the cool factor of being able to drill down into your data, Splunk runs well on pretty anaemic hardware. The server I installed it on is cobbled together from the remnants of several dead computers, each at least six years old, yet the response time from a database holding around half a million events is surprisingly fast.