Fri, Feb 10, 2012
We recently needed to move the back-end of one of our Django apps from MySQL to PostgreSQL. We wanted to try some of PostgreSQL's features in that particular context, and so far we are quite pleased! Unfortunately, the migration process was not as painless as we would have liked. At least, not until we found Philip Southam's cool py-mysql2pgsql tool.
August 2018: Please note that this post was written for an older version of Django. Changes in the code might be necessary to adapt it to the latest versions and best practices.
Being an improved port of Max Lapshin's Ruby-written mysql2pgsql, py-mysql2pgsql lets you connect to a MySQL database and dump its contents into a PostgreSQL-compatible dump file, or pipe it directly into an already running PostgreSQL server. It handles large data sets (millions of rows) more gracefully than the Ruby tool that inspired it.
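For context, this is roughly what the tool's setup looks like. The mysql2pgsql.yml below is a minimal sketch based on the project's README; hostnames, credentials and database names are placeholders, and the exact keys may differ in your version:

```yaml
# mysql2pgsql.yml -- minimal sketch; adapt credentials and names to your setup
mysql:
  hostname: localhost
  port: 3306
  username: mysql_user
  password: secret
  database: my_django_db

destination:
  # leave "file" empty to pipe straight into a running PostgreSQL server,
  # or set it to a path to write a PostgreSQL-compatible dump file instead
  file:
  postgres:
    hostname: localhost
    port: 5432
    username: postgres_user
    password: secret
    database: my_django_db

# then run it with:  py-mysql2pgsql -v -f mysql2pgsql.yml
```

If memory serves, running py-mysql2pgsql without any arguments generates a template mysql2pgsql.yml for you to fill in.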
So we went ahead and tried the tool but, as so often happens, the process wasn't smooth, and when migrating a table's constraints we got this error:
Ouch! Apparently py-mysql2pgsql was having trouble with some of our foreign keys, making them refer to tables that hadn't been imported yet. We investigated a bit and found that someone had already coded a workaround to this here, by letting you choose the order in which the tables will be imported. So this is the process we followed:
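Configuration-wise, the gist is to list the tables explicitly in dependency order, so that every referenced table is imported before the tables whose foreign keys point at it. Something along these lines; keep in mind that the only_tables key and the table names are assumptions for illustration (based on the tool's README), not the exact setup from our project:

```yaml
# mysql2pgsql.yml (excerpt) -- sketch only; table names are hypothetical
# With the patched tool, tables are imported in the order they are listed here,
# so parents (referenced tables) come before children (tables holding the FKs).
only_tables:
- auth_user          # referenced by most of the other tables
- app_category
- app_article        # FKs to auth_user and app_category
- app_comment        # FKs to auth_user and app_article
```

The idea is simply a topological order: any table referenced by a foreign key has to appear before the tables that reference it.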
And that should be it. Hopefully you now have a fully functional PostgreSQL database (or a very compatible and cool dump file) with all your data.