Migrating Zabbix from MySQL to PostgreSQL

Three evil giants
of the south
are constantly on the attack
With lies and fire
from their mouths
but we always send them back
(Amon Amarth – Guardians of Asgaard)

Sooner or later, someone will come across a similar problem: migrating a MySQL-based application to PostgreSQL. The reasons vary. I decided to migrate Zabbix to another backend because it was the only app using MySQL while all the others were using PostgreSQL, so there was really no point in keeping up with another database, setting up another replication, backups, etc. Someone else will probably have different reasons.

So, how did I do it? First of all, it's important to understand the database schema and data types. Since Zabbix supports both backends, it's obvious the migration shouldn't be that hard. The first thing I did was upgrade to the latest version of Zabbix (from 2.0 to 2.2.1). I also upgraded the database to Percona 5.6. The PostgreSQL version of choice was the latest from the 9.3 series. So this is my setup:

  • MySQL Percona 5.6
  • Zabbix 2.2.1
  • PostgreSQL 9.3

After upgrading to the latest versions of Zabbix and MySQL, it's good to make sure the Zabbix housekeeper has done its work and the database is as compacted as it should be. To be sure of this, take a look at the zabbix-server logs and search for the term 'housekeep'.
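A quick way to check this (assuming the usual log location; the path may differ on your system):

$ grep -i housekeep /var/log/zabbix/zabbix_server.log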

As for the migration itself, I decided to both dump the tables as SQL and save them in CSV format. This is a simple script that does it:

# the server itself writes the -T files, so mysql must own the directory
mkdir -p /var/tmp/zabbix/CSV; chown -R mysql: /var/tmp/zabbix
# with innodb_file_per_table, stripping the extensions from the per-table
# files in the datadir yields the list of tables
for i in `find /var/lib/mysql/zabbix/ -type f | \
 sed 's/\....$//' | sort | uniq | sed 's#/var/lib/mysql/zabbix/##'`; do
   # data-only SQL dump (-t) with PostgreSQL-compatible quoting
   mysqldump --compact -c -e -n -t --compatible=postgresql zabbix $i > $i.sql ;
   # CSV-style dump: the server writes /var/tmp/zabbix/CSV/$i.txt
   mysqldump -t -T/var/tmp/zabbix/CSV zabbix $i --fields-terminated-by=';' ;
done

So, for every table in the zabbix database in MySQL, I create both an SQL dump (in PostgreSQL-compatible mode) and a CSV dump with a semicolon as the column delimiter. Why did I choose the semicolon? Because Zabbix tends to have a lot of commas in its data, so this eases the insertion of data and lowers the number of escape characters needed, which can pose problems.
Note: as you can see, I use InnoDB's 'file-per-table' mode, which is why I can get the list of tables just by searching through /var/lib/mysql. If you don't use that setting, you should modify the command to use a "SHOW TABLES" query instead, as shown below.
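For that case, the loop header could look something like this (a sketch; it assumes the mysql client can authenticate non-interactively):

for i in `mysql -N -B -e 'SHOW TABLES' zabbix`; do
   mysqldump --compact -c -e -n -t --compatible=postgresql zabbix $i > $i.sql ;
   mysqldump -t -T/var/tmp/zabbix/CSV zabbix $i --fields-terminated-by=';' ;
done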

The next step is to modify schema.sql from the Zabbix server installation and remove some of the constraints. I removed all "ALTER TABLE", "CREATE INDEX" and "CREATE UNIQUE INDEX" statements from the schema file, and imported that modified schema into the zabbix database in PostgreSQL.
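Producing the stripped copy can be done by hand, or with a filter along these lines (a sketch; it assumes each of those statements sits on a single line in schema.sql, so double-check the result):

$ grep -vE '^(ALTER TABLE|CREATE (UNIQUE )?INDEX)' schema.sql > schema_modified.sql

With the stripped schema in place, the import itself is just: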

$ psql zabbix < schema_modified.sql

Those statements will be needed again later, after all the data has been successfully imported.

Now that the zabbix database has its base structure, we can start importing the data. I used the "COPY" command, which imports from CSV. The import is atomic, so if an error occurs, the target table stays empty.

$ psql zabbix
zabbix# COPY alerts FROM '/var/tmp/zabbix/CSV/alerts.txt' DELIMITER ';' CSV;
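Rather than typing a COPY statement for each table, a small loop can generate them (a sketch; it relies on the file names matching the table names, which is exactly what the dump script above produces, and on running as a role allowed to use server-side COPY):

for f in /var/tmp/zabbix/CSV/*.txt; do
    t=$(basename "$f" .txt)
    echo "COPY ${t} FROM '${f}' DELIMITER ';' CSV;" | psql zabbix
done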

If COPY exits with the number of inserted rows, that table was imported successfully. If, on the other hand, COPY spits out some kind of error or warning, the table probably wasn't imported correctly. So, I recorded all the failed tables and decided to import them via the standard SQL dumps instead. This is the list of tables that failed to import from the CSV format:

items
acknowledges
auditlog
auditlog_details
alerts
actions
httptest
opmessage
triggers

A typical error I saw was:

ERROR:  missing data for column "agent"

There were other errors, which I did not even want to try to fix. So, my next move was to use the SQL dumps for those tables. But I had to modify them a little bit, by adding the following header, and of course a footer with "COMMIT" 😉:

-- these three settings make PostgreSQL accept MySQL-style
-- backslash escaping in string literals, without warnings
SET standard_conforming_strings = 'off';
SET backslash_quote = 'on';
SET escape_string_warning = 'off';
-- run the whole dump as one transaction
BEGIN;

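To give an idea of how this is applied: assuming the header is saved as header.sql (my file name, not something from the original setup), a failed table like items can be replayed with:

$ (cat header.sql items.sql; echo 'COMMIT;') | psql zabbix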
These settings are compatibility flags used for importing MySQL dumps. After all the data was imported, all I needed to do was add back the indexes and table alterations from schema.sql that were omitted when I created the database (a sketch for that follows the loop below), and fix the table ownership:

# list every table in the public schema and hand it over to the zabbix user
for i in `echo '\dt' | psql zabbix | grep 'public |' | \
awk -F\| '{ print $2 }' `; do
    echo "ALTER TABLE public.${i} OWNER TO zabbix;" | psql zabbix;
done
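As for the omitted statements, the filter used earlier to strip them out of schema.sql can simply be inverted to replay them (same one-statement-per-line assumption as before):

$ grep -E '^(ALTER TABLE|CREATE (UNIQUE )?INDEX)' schema.sql | psql zabbix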

And that’s it, bye bye MySQL 🙂
