Mysql row count limit

Haden

Senior member
Nov 21, 2001
578
0
0
Is it possible to get more than 2^32 rows in a MySQL table on x86 hardware?
{maybe postgresql?}

I would rather avoid having multiple tables and doing multiple selects to overcome this problem.
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
According to my book, MySQL has an 8-byte int (BIGINT). Not sure if you can have that many rows or not, but if you've got > 4 billion rows in one table you might want to reconsider some things anyway.
 

Haden

Senior member
Nov 21, 2001
578
0
0
Armitage - setting MAX_ROWS when creating the table still gets capped at 2^32; not sure what can be reconsidered about the table, though.
As for the table - it's very primitive, for logging new connections, but it fills up in about two weeks, and considering network growth the situation will only get worse:

CREATE TABLE ulog (
local_time INT UNSIGNED NOT NULL,
oob_prefix CHAR(6) NOT NULL,
ip_saddr INT UNSIGNED NOT NULL,
ip_daddr INT UNSIGNED NOT NULL,
ip_protocol TINYINT UNSIGNED NOT NULL,
ip_ttl TINYINT UNSIGNED NOT NULL,
tcp_sport SMALLINT UNSIGNED,
tcp_dport SMALLINT UNSIGNED,
INDEX(local_time)
) MAX_ROWS=500000000000;
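
(Side note: ip_saddr and ip_daddr hold the addresses as packed 32-bit integers. A quick Python sketch of the conversion - the helper names here are just illustrative:)

```python
import socket
import struct

def ip_to_uint(addr: str) -> int:
    """Pack a dotted-quad IPv4 address into the 32-bit unsigned
    integer form that the ip_saddr/ip_daddr columns store."""
    return struct.unpack("!I", socket.inet_aton(addr))[0]

def uint_to_ip(n: int) -> str:
    """Reverse conversion, for reading rows back out."""
    return socket.inet_ntoa(struct.pack("!I", n))

print(ip_to_uint("192.168.0.1"))   # 3232235521
print(uint_to_ip(3232235521))      # 192.168.0.1
```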
 

MrChad

Lifer
Aug 22, 2001
13,507
3
81
Originally posted by: Haden
Armitage - setting MAX_ROWS when creating the table still gets capped at 2^32; not sure what can be reconsidered about the table, though.
As for the table - it's very primitive, for logging new connections, but it fills up in about two weeks, and considering network growth the situation will only get worse:

CREATE TABLE ulog (
local_time INT UNSIGNED NOT NULL,
oob_prefix CHAR(6) NOT NULL,
ip_saddr INT UNSIGNED NOT NULL,
ip_daddr INT UNSIGNED NOT NULL,
ip_protocol TINYINT UNSIGNED NOT NULL,
ip_ttl TINYINT UNSIGNED NOT NULL,
tcp_sport SMALLINT UNSIGNED,
tcp_dport SMALLINT UNSIGNED,
INDEX(local_time)
) MAX_ROWS=500000000000;

A table like that should be archived on a regular basis. Querying a table that large will cripple your database system, and it really should not be necessary to keep connection logs in the same table indefinitely.
 

Haden

Senior member
Nov 21, 2001
578
0
0
MrChad, even when approaching the limit I see no problems with performance (all queries select by time).
I don't want to keep logs indefinitely - but one month is the minimum, and I would need maybe 2^34 rows for that.
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
You get 3550 new connections per second 24/7?? And MySQL keeps up with that? Because that's what it would take to fill that table in two weeks.
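
That estimate is just the 2^32 row cap divided by two weeks of seconds:

```python
ROW_CAP = 2 ** 32             # where the table tops out
TWO_WEEKS = 14 * 24 * 3600    # seconds in two weeks

rate = ROW_CAP / TWO_WEEKS
print(round(rate))            # ~3551 inserts/second, around the clock
```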

On a fairly respectable system, I set up a table with just a single int column and wrote a Python script to push 1,000,000 values into it ... I got about 7352 inserts/second (6369/sec with an index on that int). So I suppose it's possible ... but wow.
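
For anyone who wants to reproduce the shape of that test: a minimal sketch, using an in-memory sqlite3 database as a stand-in for MySQL so it runs anywhere (the real run used MySQL and a table on disk, so the absolute numbers will differ):

```python
import sqlite3
import time

N = 100_000  # scaled down from the 1,000,000-row run

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bench (val INTEGER)")

start = time.time()
conn.executemany("INSERT INTO bench (val) VALUES (?)",
                 ((i,) for i in range(N)))
conn.commit()
elapsed = time.time() - start

count = conn.execute("SELECT COUNT(*) FROM bench").fetchone()[0]
print(f"{count} rows, {count / elapsed:.0f} inserts/second")
```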
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
Originally posted by: Haden
MrChad, even when approaching the limit I see no problems with performance (all queries select by time).
I don't want to keep logs indefinitely - but one month is the minimum, and I would need maybe 2^34 rows for that.

You can keep the logs indefinitely ... but if they're older than 2 weeks, you have to go to the archive. Which could still be a table in the database, just not the live table that the logging currently goes to.

Could just be a simple script like this:

CREATE TABLE ulog_timestamp SELECT * FROM ulog WHERE local_time < timestamp;
DELETE FROM ulog WHERE local_time < timestamp;
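
Wrapped in a script, that rotation looks something like this - sketched with sqlite3 so it's self-contained, the schema trimmed to the one column that matters, and "timestamp" spelled out as a two-week cutoff:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ulog (local_time INTEGER NOT NULL)")

now = int(time.time())
cutoff = now - 14 * 24 * 3600  # rows older than this get archived

# Fake a few rows on both sides of the cutoff.
conn.executemany("INSERT INTO ulog VALUES (?)",
                 [(cutoff - 1000,), (cutoff - 1,), (now - 10,), (now,)])

# Copy the old rows into an archive table, then trim the live table.
# cutoff is a plain int, so inlining it into the DDL is safe here.
conn.execute(f"CREATE TABLE ulog_archive AS "
             f"SELECT * FROM ulog WHERE local_time < {cutoff}")
conn.execute("DELETE FROM ulog WHERE local_time < ?", (cutoff,))
conn.commit()

live = conn.execute("SELECT COUNT(*) FROM ulog").fetchone()[0]
old = conn.execute("SELECT COUNT(*) FROM ulog_archive").fetchone()[0]
print(live, old)  # 2 2
```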
 

Modeps

Lifer
Oct 24, 2000
17,254
44
91
I'd suggest the logging be done in a flat text file instead of a database.
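
For what it's worth, the flat-file version is only a few lines; a sketch (the record layout here just mirrors the ulog columns, nothing standard):

```python
import time

def log_connection(path, saddr, daddr, proto, sport, dport):
    # One record per line, timestamp first, so the file stays
    # time-ordered and plays nicely with logrotate.
    with open(path, "a") as f:
        f.write(f"{int(time.time())} {saddr} {daddr} {proto} {sport} {dport}\n")

log_connection("ulog.txt", "10.0.0.1", "10.0.0.2", 6, 51234, 80)
```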
 

Haden

Senior member
Nov 21, 2001
578
0
0
Armitage, more like 2400 - to be exact, it takes around 19-21 days to fill it.

Anyway, thanks for your time. I've already put together a solution with several tables; it just seems like a hack to me.
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
Originally posted by: Haden
Armitage, more like 2400 - to be exact, it takes around 19-21 days to fill it.

Anyway, thanks for your time. I've already put together a solution with several tables; it just seems like a hack to me.

I'm still curious what generates 2400 logins/second! Seems like you'd have bigger guns than MySQL available for something with that kind of traffic.
 

Haden

Senior member
Nov 21, 2001
578
0
0
It's a dedicated box for logging connections on a fairly large network. That data should actually be queried pretty rarely, based on some reported incident time.
As for using MySQL - well, I'm in the testing phase right now, and it worked fine at first.
Then again, what started as "holds 2 weeks" is actually more like "holds more than two weeks, and I've done lots of portscanning to test things out"...
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
What table type are you using? Probably MyISAM. You might try InnoDB ... I have no idea if it matters, but it might be worth trying.
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
I just came across something that might help you out. Look up MySQL MyISAM MERGE tables. They let you define a table consisting of several sub-tables.
It might not help with the row count limit, but it could make the split-up log tables easier to deal with.
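
To illustrate what a MERGE table buys you (sqlite3 standing in for MySQL here - the real MySQL syntax is roughly CREATE TABLE ulog (...) TYPE=MERGE UNION=(ulog_jan, ulog_feb), and the table names below are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Two sub-tables, as in the split-table workaround.
for name in ("ulog_jan", "ulog_feb"):
    conn.execute(f"CREATE TABLE {name} (local_time INTEGER NOT NULL)")
conn.executemany("INSERT INTO ulog_jan VALUES (?)", [(1,), (2,)])
conn.executemany("INSERT INTO ulog_feb VALUES (?)", [(3,)])

# A UNION ALL view plays the MERGE table's role: one name to
# query, rows drawn from every sub-table underneath.
conn.execute("CREATE VIEW ulog AS "
             "SELECT * FROM ulog_jan "
             "UNION ALL SELECT * FROM ulog_feb")

total = conn.execute("SELECT COUNT(*) FROM ulog").fetchone()[0]
print(total)  # 3
```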