======Adding DB table partitioning to Zabbix DB======
  
===Note from the Author===
  
First off, the servers that I'm using in my setup are OpenBSD.  Therefore the CLI commands shown below will most likely be different from what you need to run the same commands on your system.  For example, I'll be using 'doas' in place of 'sudo' and 'rcctl' in place of 'service' or 'systemctl'.  Be sure to pay attention to this when issuing these commands.
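
For reference, on a systemd-based Linux distribution the rough equivalents of the service commands used below would look something like this (the service name on your system may be zabbix-server, zabbix-server-pgsql, or similar, so treat these as examples rather than exact commands):

  sudo systemctl stop zabbix-server
  sudo systemctl start zabbix-server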

===Summary===

The current Zabbix monitoring server I'm working on is a development server that I have been building for a client.  The issue that I have run into is that the partition containing the PostgreSQL database is nearly full, and there is not enough space to properly VACUUM the larger tables to reclaim the DB space.  When I originally installed the system, I did not choose to use partitioning within the PostgreSQL DB tables.  I am regretting that now.
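
If you want to confirm which tables are actually eating the space before doing anything destructive, a query along these lines lists the largest tables (this is a generic PostgreSQL check, not part of the original procedure):

  psql -U zabbix -c "SELECT relname, pg_size_pretty(pg_total_relation_size(relid)) FROM pg_catalog.pg_statio_user_tables ORDER BY pg_total_relation_size(relid) DESC LIMIT 10;" zabbix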

What I'm going to attempt here is to retain the history collected thus far.  I plan to DELETE some of the oldest data from the largest tables.

I will dump the entire zabbix database, drop the zabbix database, "re-install" the database via the Zabbix installer process, then re-import the old data from the previously made backup.
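
For the "drop and re-install" part of that plan, the database side would look roughly like the following.  The exact createdb options and how the fresh schema gets loaded depend on your Zabbix version and installer, so this is only a sketch of the idea:

  psql -U postgres -c 'DROP DATABASE zabbix;'
  createdb -U postgres -O zabbix -E Unicode zabbix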

=====Getting things ready to perform the work=====

====Shut down the Zabbix server data collection processes====

I happen to be working with two different servers.  One is the Zabbix process server and the second is the PostgreSQL database server.

Issue the following command to shut down your Zabbix server data collection processes.

  doas rcctl stop zabbix_server
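
Before touching the database it's worth making sure the daemon really stopped; on OpenBSD rcctl can report the service state, and ps works anywhere (again, adjust for your own init system):

  doas rcctl check zabbix_server
  ps ax | grep '[z]abbix_server'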

====Delete older data [optional]====

If you have a large amount of historical data that isn't relevant, you can delete some of it by issuing these SQL commands:
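
The largest tables are usually the history tables; for example (the table names and the 40-day interval here are only illustrative values, adjust them to match what you actually need to keep):

  DELETE FROM history WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
  DELETE FROM history_uint WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';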
Additional tables that can get fairly large and could also have some of their older data deleted:
  
  DELETE FROM alerts WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
  DELETE FROM events WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
  DELETE FROM trends_uint WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
  DELETE FROM trends WHERE to_timestamp(clock) < NOW() - INTERVAL '40 days';
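
Keep in mind that DELETE only marks rows as dead; the disk space is not actually given back without a VACUUM FULL or a dump and reload, which is why the dump/re-import below is part of the plan.  If you want to sanity-check what is left after the deletes, something like this works (adjust the table name as needed):

  psql -U zabbix -c "SELECT to_timestamp(MIN(clock)) AS oldest, count(*) FROM events;" zabbix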
  
====Dump the entire Zabbix DB====

Dump the entire Zabbix database:

  pg_dump -U zabbix zabbix | gzip -1 > ./zabdb.zabbix.Pg.gz
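
Before dropping anything it's worth confirming the dump file is at least readable; a quick integrity test of the compressed file looks like this:

  gzip -t ./zabdb.zabbix.Pg.gz && echo "dump looks OK"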
  
  pg_dump -U postgres postgres | psql -U zabbix zabbix

====Dump individual Zabbix database tables for testing====

The following use of pg_dump and pg_restore performs a data-only backup using the custom-format output option and using INSERTs instead of COPY statements:
  pg_dump -U zabbix -a --inserts -t history_str -Fc zabbix | gzip -1 > /var/postgresql/zabdb.zabbix.history_str.Pg.gz

Once the backup is done, drop all data from the table:
  psql -U zabbix -c 'truncate history_str;' zabbix

Now restore the data to the table:
  gunzip -c zabdb.zabbix.history_str.Pg.gz | pg_restore -U zabbix -d zabbix
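
A quick way to confirm the restore worked is to count the rows afterwards and compare against the pre-truncate count (this check is just a suggestion, not part of the original steps):

  psql -U zabbix -c 'SELECT count(*) FROM history_str;' zabbix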