

Build Your Own Circular Log with MySQL - jekor
http://dt.deviantart.com/blog/36639591/

======
Groxx
If you don't mind having a more static table size, you can also do:

    
    
      create table test_table( id integer identity(1,1), val varchar(max) )
      -----
      create trigger auto_updater on test_table after insert as
      begin
       -- assumes single-row inserts (the inserted pseudo-table holds one row)
       declare @id integer = (select id from inserted)
       declare @val varchar(max) = (select val from inserted)
       
       -- overwrite the slot this entry maps to (slots 1..10)
       update test_table 
       set val=@val
       where id=(@id-1)%10+1
       
       -- once the 10 slots exist, drop the freshly inserted row
       if @id > 10
          delete from test_table where id=@id
      end
    

Then to insert, you just insert. It'll automatically overwrite the correct entry.

    
    
      insert into test_table(val) values ('hooray')
    

Insta-looping for free (syntactically). Though I did write this in T-SQL, I'd
assume one can pull the same trick in MySQL (identity => autoincrement). You
can also have it add a timestamp while it updates.

------
kanwisher
This helped me find another really cool way of handling this. INSERT ... ON
DUPLICATE KEY UPDATE
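A minimal sketch of that upsert-based circular log (my illustration, not from the thread), using Python's stdlib sqlite3 so it's self-contained; SQLite's `INSERT OR REPLACE` stands in here for MySQL's `INSERT ... ON DUPLICATE KEY UPDATE`, and all table and column names are made up:

```python
import sqlite3

LOG_SIZE = 10  # keep only the last 10 entries

conn = sqlite3.connect(":memory:")
conn.execute(
    "create table circular_log (slot integer primary key, seq integer, val text)"
)

seq = 0

def log(val):
    """Insert one entry; the primary-key conflict overwrites the oldest slot."""
    global seq
    seq += 1
    # slot cycles through 0..LOG_SIZE-1, so the table never grows past LOG_SIZE rows
    conn.execute(
        "insert or replace into circular_log (slot, seq, val) values (?, ?, ?)",
        (seq % LOG_SIZE, seq, val),
    )

for i in range(25):
    log("message %d" % i)

rows = conn.execute(
    "select count(*), min(seq), max(seq) from circular_log"
).fetchone()
print(rows)  # -> (10, 16, 25): ten rows, holding only the ten newest entries
```

In MySQL the same shape would be a unique key on `slot` plus `ON DUPLICATE KEY UPDATE seq = VALUES(seq), val = VALUES(val)`.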

------
btn
It's a nice solution, but doesn't seem to solve the problem they present at
the start of the article. "What we wanted was a way to keep at least 24 hours
worth of entries at all times"; but their solution only stores the last
MAX_CIRCULAR_LOG_ROWS entries.

~~~
jekor
Ah yes, good point. Luckily we have a regular rate of messages and we knew
that 2 million would be enough to always have roughly 24 hours worth.

------
binaryfinery
Wouldn't it be faster to keep the log_id in a variable so you don't have to do
a select and an update? In fact, wouldn't it be faster to just store 48 hours
and delete the oldest 24? Is it going to kill you to have 47:59 hours of logs?

~~~
spicyj
That's exactly my thought. You wouldn't even need a database then and you
could use a flat file for each day.
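A rough sketch of that flat-file-per-day idea (hypothetical helper and directory names, mine rather than the thread's): appending to a file named for the current date and pruning everything older than yesterday guarantees at least 24 hours of history is always on disk.

```python
import datetime
import os
import tempfile

# scratch directory for this sketch; a real deployment would use a fixed path
LOG_DIR = tempfile.mkdtemp()

def append_log(msg, now=None):
    """Append one entry to today's file, then prune files older than yesterday."""
    now = now or datetime.datetime.now()
    path = os.path.join(LOG_DIR, now.strftime("%Y-%m-%d") + ".log")
    with open(path, "a") as f:
        f.write(now.isoformat() + " " + msg + "\n")
    # keeping today's and yesterday's files retains at least a full 24 hours
    keep = {
        (now - datetime.timedelta(days=d)).strftime("%Y-%m-%d") + ".log"
        for d in (0, 1)
    }
    for name in os.listdir(LOG_DIR):
        if name not in keep:
            os.remove(os.path.join(LOG_DIR, name))
```

Like binaryfinery's 48-hour variant, this trades a little extra storage for never doing per-entry selects or updates.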

~~~
lurchpop
They probably have this in SQL because of how easy it makes running stats on it,
though.

