The development Zabbix monitoring server that I've been building has filled the partition containing the PostgreSQL database. When I originally installed the system, I chose not to use partitioning within the PostgreSQL database tables. I am regretting that now.
What I'm going to attempt here is to retain some of the history collected thus far. I will dump the entire zabbix database, drop it, "re-install" the database via the Zabbix installer process, then re-import the old data from that backup.
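A rough sketch of that sequence, assuming the database and its owner role are both named zabbix and that the Zabbix server is stopped first; treat the drop/create/restore commands as an outline to adapt, not an exact procedure:

# 1. Dump everything (same pg_dump command as shown further down)
pg_dump -U zabbix zabbix | gzip -1 > ./zabdb.zabbix.Pg.gz

# 2. Drop the old database and create an empty one owned by the zabbix role
dropdb -U postgres zabbix
createdb -U postgres -O zabbix zabbix

# 3. Re-run the Zabbix installer / schema scripts against the new database,
#    then re-import the old data. Restoring the full dump over the
#    installer-created schema will hit "already exists" errors, so a
#    data-only or per-table restore (see further down) may be cleaner.
gunzip -c ./zabdb.zabbix.Pg.gz | psql -U zabbix zabbix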
If you have a large amount of historical data that isn't relevant, you can delete some of it by issuing these SQL commands:
DELETE FROM history_uint WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
DELETE FROM history_text WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
DELETE FROM history_str WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
DELETE FROM history WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
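Before deciding how far back to trim, it can help to confirm which tables are actually using the space. A query along these lines (pg_total_relation_size counts the table plus its indexes and TOAST data) should do it:

SELECT relname, pg_size_pretty(pg_total_relation_size(relid)) AS total_size
FROM pg_statio_user_tables
ORDER BY pg_total_relation_size(relid) DESC
LIMIT 10;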
Additional tables that can grow fairly large and may also have old rows worth deleting:
DELETE FROM events WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
DELETE FROM alerts WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
DELETE FROM trends WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
DELETE FROM trends_uint WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
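Keep in mind that DELETE only marks rows as reusable; it does not hand disk space back to the operating system. To actually shrink the files you would need a VACUUM FULL on the trimmed tables, which rewrites each table under an exclusive lock and needs enough free space for a temporary copy, something an already-full partition may not have:

VACUUM FULL history;
VACUUM FULL history_uint;
VACUUM FULL history_str;
VACUUM FULL history_text;
VACUUM FULL trends;
VACUUM FULL trends_uint;
VACUUM FULL events;
VACUUM FULL alerts;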
Use this to dump the entire Zabbix database + schema structure.
pg_dump -U zabbix zabbix | gzip -1 > ./zabdb.zabbix.Pg.gz
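Restoring that dump later is roughly the reverse, assuming an empty zabbix database to load it into:

gunzip -c ./zabdb.zabbix.Pg.gz | psql -U zabbix zabbix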
Testing…
The history tables are by far the largest; here, history_uint is dumped on its own:
pg_dump -U zabbix -t history_uint zabbix | gzip -1 > ./zabdb.zabbix.history_uint.Pg.gz
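Since the plan is to let the Zabbix installer re-create the schema, a data-only dump of that table (pg_dump's -a/--data-only flag) may be handier for the re-import, so the restore doesn't trip over the table definition the installer already created. This is a variation on the command above, and the filename is just illustrative:

pg_dump -U zabbix -a -t history_uint zabbix | gzip -1 > ./zabdb.zabbix.history_uint.data.Pg.gz
gunzip -c ./zabdb.zabbix.history_uint.data.Pg.gz | psql -U zabbix zabbix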