Category “Pgloader” — 26 articles

This release of pgloader follows the tradition of simplifying things for users or, if you allow me to quote Alan Kay, the belief that simple things should be simple and complex things should be possible.
Back then, I showed that using pgloader made it easier to import the data, but it also showed quite poor performance characteristics, because the timings were taken in debug mode. Let’s update that article with current pgloader wonders!
As presented at the PostgreSQL Conference Europe, the new version of pgloader is now able to fully migrate a MySQL database, including discovering the schema, casting data types, and transforming data and default values. Sakila is the traditional MySQL example database; in this article we’re going to fully migrate it over to PostgreSQL.
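For the record, such a migration boils down to a single pgloader command. Here’s a minimal sketch along the lines of the documented MySQL support, with connection strings you would of course adapt to your own setup:

    load database
         from mysql://root@localhost/sakila
         into postgresql://localhost/sakila

     with include drop, create tables, create indexes, reset sequences;

From that one command pgloader connects to both databases, discovers the MySQL schema, and applies sensible default casting rules, which you can override with a CAST clause when needed.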
In our previous article about Loading Geolocation Data, we loaded some data into PostgreSQL and saw the quite noticeable impact of a user transformation. As it happens, the function that did the integer-to-IP conversion was naive enough to scratch the micro-optimisation itch of some Common Lisp hackers: thanks a lot guys, in particular stassats, who came up with the solution we’re seeing now.
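To make the discussion concrete, here is a naive sketch (in Common Lisp, pgloader’s implementation language) of the kind of integer-to-dotted-quad transformation at stake; this is an illustration of the idea, not the optimised code stassats contributed:

    ;; Naive sketch: turn a 32-bit integer into its dotted-quad IP string.
    ;; Illustrative only, not the optimised version that went into pgloader.
    (defun ip-integer-to-string (ip)
      (format nil "~a.~a.~a.~a"
              (ldb (byte 8 24) ip)
              (ldb (byte 8 16) ip)
              (ldb (byte 8  8) ip)
              (ldb (byte 8  0) ip)))

    ;; (ip-integer-to-string 3232235777) => "192.168.1.1"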
About the only time I will agree to work with MySQL is when you need help migrating away from it because you decided to move to PostgreSQL instead. And it’s already been too much of a pain, really, so after all this time I began consolidating what I know about the topic and am writing software to help me here. Consider it the MySQL Migration Toolkit.
While making progress with both Event Triggers and Extension Templates, I needed to take a little break. My current stay-sane mental exercise seems mainly to involve using Common Lisp, a programming language that ships with just about all the building blocks you need. Yes, that old language brings that much to the table.
When using Common Lisp, you have an awesome interactive development environment where you can redefine functions and objects while testing them.
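As an illustration, here is a made-up REPL session showing that kind of live redefinition:

    CL-USER> (defun greet (name) (format nil "Hello, ~a!" name))
    GREET
    CL-USER> (greet "pgloader")
    "Hello, pgloader!"
    CL-USER> (defun greet (name) (format nil "Bonjour, ~a !" name))
    GREET
    CL-USER> (greet "pgloader")
    "Bonjour, pgloader !"

Any code already calling greet picks up the new definition immediately, with the system still running, which is a big part of what makes this style of development so pleasant.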
pgloader is a tool to help loading data into PostgreSQL, adding some error management to the COPY command. COPY is the fast way of loading data into PostgreSQL and is transaction safe. That means that if a single error appears within your bulk of data, you will have loaded none of it. pgloader will submit the data again in smaller chunks until it’s able to isolate the bad from the good, and then the good is loaded in.
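The isolation step is essentially divide and conquer over a batch of rows. Here is a hedged Common Lisp sketch of the idea, where copy-batch and log-bad-row are hypothetical placeholders rather than pgloader’s actual internals:

    ;; Sketch of the split-and-retry idea only: copy-batch and log-bad-row
    ;; are hypothetical placeholders, not pgloader's real API.
    (defun load-with-retry (rows)
      (handler-case
          (copy-batch rows)               ; try the whole batch in one COPY
        (error ()
          (if (null (rest rows))
              (log-bad-row (first rows))  ; a single row is at fault: reject it
              (let ((half (floor (length rows) 2)))
                ;; split the batch in two and retry each half separately
                (load-with-retry (subseq rows 0 half))
                (load-with-retry (subseq rows half)))))))

Each failed COPY aborts its own transaction only, so the good halves commit while the bad rows end up isolated and rejected one by one.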