Re: [tkined] SQL Databases...

From: John Stumbles (
Date: Tue Feb 29 2000 - 14:31:25 MET

On Mon, 28 Feb 2000, Greg A. Woods wrote:

> I'm not such a big fan of SQL.
> However I am beginning to be a big fan of RRDtool. It's designed
> explicitly for the purpose of storing relatively(*) large amounts of
> SNMP data in ways that make it both quick to store and easy to retrieve
> (it of course stores in the "smart" way):

        excuse my laziness here (I could plough through the docs on
RRDTool for myself :-) but it seems to me that a mixture of an SQL DB and
something like RRDtool might best fit the bill (my bill, anyway :-).

        I currently log some devices' interface stats to flat files, one
per day; a) because it's relatively easy to do (and I've been doing it
since before I had a DB installed) and b) because it seems to fit the data
pretty well. The only data compaction I have is gzipping yesterday's files
and archiving all data > 3 days old to tape ;-) but I've been meaning to
(and think it would be relatively easy to) reduce the 10-second samples I
currently collect to means and peaks over longer periods (5 minutes, 1
hour etc.) as MRTG does. (I've also been meaning to resurrect my
MRTG-like graphing tool so I can actually display all this data! Another
job awaiting A Round Tuit :-)
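        For what it's worth, the reduction step described above — collapsing
10-second samples into a mean and a peak per longer interval, MRTG-style —
could be sketched roughly like this (in Python rather than Perl, and
assuming a hypothetical "epoch-seconds value" line format, not my actual
file layout):

```python
# Sketch of MRTG-style data reduction: collapse raw 10-second samples
# into per-bucket (mean, peak) pairs. The (epoch, value) input format
# is an assumption for illustration, not the real flat-file layout.

def consolidate(samples, bucket_seconds=300):
    """samples: iterable of (epoch_seconds, value) pairs, in time order.
    Returns {bucket_start_epoch: (mean, peak)}."""
    buckets = {}
    for ts, value in samples:
        start = ts - ts % bucket_seconds   # align to bucket boundary
        buckets.setdefault(start, []).append(value)
    return {start: (sum(vals) / len(vals), max(vals))
            for start, vals in buckets.items()}

# Ten minutes of fake 10-second samples, reduced to 5-minute buckets.
raw = [(1000 + 10 * i, float(i)) for i in range(60)]
reduced = consolidate(raw, bucket_seconds=300)
```

The same pass could be run again over the 5-minute data to get 1-hour
buckets, which is essentially what MRTG's round-robin consolidation does.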

        The file-per-day model would make it relatively easy to produce
graphs of, for example, this week compared to last week: I'm not sure how
easy it is to delve into the data store to extract specific time intervals
with an RRDTool/MRTG type storage.
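        With file-per-day storage, extracting an arbitrary interval is just
a matter of picking the right day files and filtering by timestamp — a
rough sketch (file names and line format are invented for illustration):

```python
# Rough sketch: pull one time window out of file-per-day flat files.
# Assumes hypothetical files named YYYY-MM-DD.log containing
# "epoch-seconds value" lines; neither is the real layout.
import datetime
import os

def window(datadir, start, end):
    """Yield (epoch, value) pairs with start <= epoch < end."""
    day = datetime.date.fromtimestamp(start)
    last = datetime.date.fromtimestamp(end)
    while day <= last:
        path = os.path.join(datadir, day.isoformat() + ".log")
        if os.path.exists(path):
            with open(path) as f:
                for line in f:
                    ts, value = line.split()
                    ts = int(ts)
                    if start <= ts < end:
                        yield ts, float(value)
        day += datetime.timedelta(days=1)
```

Comparing this week with last week is then two calls to the same function
with the second window shifted back by seven days' worth of seconds.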

        On the other hand I use my SQL (mySQL :-) database to store all
other network info: this is where the relational nature is useful. For
example, I have a table of IP addresses against MAC addresses (from
routers' arp caches) and a table of Ethernet hardware vendors' names and
their OUI codes (from and I can join the tables on the vendor-code
substring of the MAC addresses to show what sort of devices we have on
our network. This can be done (and with mSQL I had to do it) with
scripts(*) retrieving data from the separate tables, taking the substring
of each MAC address and matching it to a vendor code; and as long as I
have some script code gluing together the SQL DB, the www/cgi interface
and some other tasks, it's not essential to be able to do this in an SQL
table join.
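        The join in question — matching the OUI prefix of each MAC address
against a vendor table — looks roughly like this in SQL; a sketch using
SQLite in place of mySQL, with made-up table and column names:

```python
# Sketch of the OUI-prefix join described above, using SQLite rather
# than MySQL. Table and column names are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE arp (ip TEXT, mac TEXT);         -- from routers' arp caches
    CREATE TABLE oui (prefix TEXT, vendor TEXT);  -- vendor OUI codes
""")
db.executemany("INSERT INTO arp VALUES (?, ?)",
               [("10.0.0.1", "00:00:0c:12:34:56"),
                ("10.0.0.2", "08:00:20:ab:cd:ef")])
db.executemany("INSERT INTO oui VALUES (?, ?)",
               [("00:00:0c", "Cisco"), ("08:00:20", "Sun")])

# Join on the first three octets (the vendor code) of each MAC address:
# "xx:xx:xx" is 8 characters, hence substr(mac, 1, 8).
rows = db.execute("""
    SELECT arp.ip, oui.vendor
    FROM arp JOIN oui ON substr(arp.mac, 1, 8) = oui.prefix
    ORDER BY arp.ip
""").fetchall()
```

MySQL spells the same function SUBSTRING(), but the join is otherwise
identical, and it replaces the per-row substring-and-match loop a script
would otherwise have to do.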

        What is essential about the DB approach (whether or not it uses
SQL) is that separate processes can read from and update the database at
the same time, and that the indexing and data retrieval are offloaded
from the script doing the query: having to read in and search through
flat files for each table is horrible!

(* Perl: shhh! :-)

John Stumbles                            
I.T. Services Centre,   University of Reading 
          Press any key to continue or any other key to quit.

--
!! This message is brought to you via the `tkined & scotty' mailing list.
!! Please do not reply to this message to unsubscribe. To subscribe or
!! unsubscribe, send a mail message to <>.
!! See for more information.

This archive was generated by hypermail 2b29 : Mon Jan 08 2001 - 15:27:38 MET