December 22, 2018 · rails · sqlite3 · postgresql
Many years ago I developed two Rails applications, and I have used them ever since. I started with nearly no know-how, so I took all the defaults. At the time I was glad that I did not have to manage a full-blown database server to run my small Rails apps.

Times have changed: nowadays I have Docker running locally under Windows 10, and it takes only two minutes of work to have a current PostgreSQL server up and running. So I decided to give it a try again and to see whether it would be possible to run the applications inside a Docker container, connecting to PostgreSQL. This blog post is mostly about the migration task. I will follow up later with more stories about containerizing the apps and (hopefully, in the end) running them on the internet so they are available all the time.
So I took the following steps.

First, run PostgreSQL as a Docker container:

```sh
docker run --name rails_db -p 5439:5432 -e POSTGRES_DB=rails_db -d postgres:9.6.8-alpine
```
What does this do?

- It pulls and runs the image `postgres:9.6.8-alpine`. (The reason I use that version is that we use exactly that version at work, so feel free to use any other current version.)
- It names the container `rails_db`, so it is easier later to stop and start that container by giving the name only: `docker stop rails_db` and `docker start rails_db`.
- `-p 5439:5432` maps the container's PostgreSQL port 5432 to port 5439 on my machine (5432 was already taken there, see the caveat at the end).
- `-e POSTGRES_DB=rails_db` creates a database of that name on first start, and `-d` runs the container in the background.
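A quick way to check that the server is really up (standard Docker commands, not part of the original recipe):

```sh
# Is the container running?
docker ps --filter name=rails_db
# Did PostgreSQL start cleanly?
docker logs rails_db
```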
Next, create a system user and a database. I used pgAdmin to do it; your steps may be different here.

You now have a new database server named rails_db, but without any database, users, … for your application. To use the database, you need a system user first: in pgAdmin, that is done via Create > Login/Group Role... on the Login/Group Roles node. Then create a database and make that new role its owner. You should now be able to log in as that user, go to the database, and create something.
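If you prefer the command line over pgAdmin, the same can be done with psql inside the container. A minimal sketch, where the role name, database name and password are made-up placeholders:

```sh
# Create a login role and a database owned by it
# (myapp / myapp_production / 'secret' are placeholders, adjust to your app)
docker exec -it rails_db psql -U postgres -c "CREATE ROLE myapp WITH LOGIN PASSWORD 'secret';"
docker exec -it rails_db psql -U postgres -c "CREATE DATABASE myapp_production OWNER myapp;"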
I found the following recipe to set up a dockerized Rails app. Here are the relevant steps to just create the application locally:

1. `rails new myapp --database=postgresql` generates a new app with postgresql set as the database.
2. Adapt `config/database.yml` and ensure that database, username and password are set.
3. Run `rake db:setup` and see if that runs through. If yes, your database is created, but empty.
4. Run `rails g scaffold Post title:string body:text`, then `rake db:migrate` to create the `posts` table.
5. Start `rails server` and visit it under localhost:3000/posts. Entries you create there end up in the table `posts`.

So you have seen that the database is working, and a new Rails app is able to use that database. (The condensed command sequence is shown below.)
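For reference, the whole check as one shell session (`myapp` is a throwaway name; edit the configuration between the steps as described above):

```sh
rails new myapp --database=postgresql
cd myapp
# ... edit config/database.yml: database, username, password, port 5439 ...
rake db:setup                                  # creates the (empty) database
rails g scaffold Post title:string body:text   # a throwaway model
rake db:migrate                                # creates the posts table
rails server                                   # then browse to localhost:3000/posts
```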
Repeat the steps to create the real database for your application. Necessary are a system user for the application and a database owned by that user.
Now comes the hard part: migrating the data. I had read a little bit about a tool named pgloader, which sounded reasonable, and I saw that this tool is available as a Docker container. So I took that road. Here are the steps that were necessary for me:
First, start the pgloader container with my local directory mounted into it:

```sh
docker run --rm -i -t -v c:/Users/mliebelt/dev/priv:/usr/src/apps dimitri/pgloader bash
```

The local directory (which contains the SQLite file) is then available inside the container under `/usr/src/apps`. From there, run the migration:

```sh
pgloader prod.db postgresql://<username>:<pwd>@<ip-address>:5439/<database>
```

- `<username>` is the system user that has access to the database.
- `<pwd>` is the password of that system user.
- `<ip-address>` is the local IP address of my computer, found on Windows with `ipconfig`. (Inside the container, localhost would be the container itself, so the host's address is needed.)
- `<database>` is the name of the database.

When executing the migration, I got the following output.
```
WARNING:
Couldn't re-execute SBCL with proper personality flags (/proc isn't mounted? setuid?)
Trying to continue anyway.
2018-12-22T14:29:47.069000Z LOG Migrating from #<SQLITE-CONNECTION sqlite:///usr/src/apps/minds/db/prod.db {1005F72BE3}>
2018-12-22T14:29:47.071000Z LOG Migrating into #<PGSQL-CONNECTION pgsql://<username>@<ip-address>:5439/<database> {10061BB263}>
2018-12-22T14:29:47.447000Z LOG report summary reset
             table name     errors       rows      bytes      total time
-----------------------  ---------  ---------  ---------  --------------
                  fetch          0          0                     0.000s
        fetch meta data          0         21                     0.033s
         Create Schemas          0          0                     0.000s
       Create SQL Types          0          0                     0.005s
          Create tables          0         20                     0.056s
         Set Table OIDs          0         10                     0.006s
-----------------------  ---------  ---------  ---------  --------------
                notices          0        500   205.5 kB          0.040s
                 people          0        925   162.9 kB          0.025s
      schema_migrations          0         22     0.2 kB          0.024s
                  pages          0         58    49.6 kB          0.039s
              book_rows          0        129     8.6 kB          0.065s
               sessions          0        170    58.6 kB          0.093s
      people_ressources          0       1129    59.2 kB          0.025s
              citations          0          5     2.4 kB          0.009s
          books_authors          0          0                     0.024s
             ressources          0        982   363.6 kB          0.108s
-----------------------  ---------  ---------  ---------  --------------
COPY Threads Completion          0          4                     0.159s
         Create Indexes          0         11                     0.069s
 Index Build Completion          0         11                     0.012s
        Reset Sequences          0          7                     0.013s
           Primary Keys          0          8                     0.010s
    Create Foreign Keys          0          0                     0.000s
        Create Triggers          0          0                     0.000s
       Install Comments          0          0                     0.000s
-----------------------  ---------  ---------  ---------  --------------
      Total import time          ?       3920   910.4 kB          0.263s
```
I was really astonished at how fast the migration ran. I could then check the result by going directly to the database and seeing that everything was there.
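As an aside: if you expect to rerun the migration, pgloader can also be driven by a command file instead of the one-liner above. A sketch of what that could look like; the `WITH` options are my own choice (not from the steps above), and `include drop` recreates the target tables on each run:

```sh
# Hypothetical command file for repeatable runs
cat > migration.load <<'EOF'
LOAD DATABASE
     FROM sqlite:///usr/src/apps/minds/db/prod.db
     INTO postgresql://<username>:<pwd>@<ip-address>:5439/<database>
 WITH include drop, create tables, create indexes, reset sequences;
EOF
pgloader migration.load
```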
Everything is now ready to run. Only the file `config/database.yml` of the application has to be adapted. I made the following changes:
```yaml
default: &default
  adapter: postgresql
  encoding: unicode
  pool: 5
  timeout: 5000

development:
  <<: *default
  database: <database>
  username: <username>
  password: <pwd>
  port: 5439

# Not touched
test:
  adapter: sqlite3
  pool: 5
  timeout: 5000
  database: db/test.sqlite3

production:
  <<: *default
  database: <database>
  username: <username>
  password: <pwd>
  port: 5439
```
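A quick way to verify the new settings without starting the full app is to let Rails open a connection via the standard Rake task:

```sh
# Opens a connection and prints the current schema version
rake db:version
RAILS_ENV=production rake db:version
```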
Caveat: I had to change the port of the database because the default one (5432) was already taken on my machine. The application works like a charm; now I have to check how to run it in the cloud (as a Docker container). But that will be another story …
Written by Markus Liebelt, who lives and works near Stuttgart in Germany. You should follow him on Twitter, or see his stats on Stackoverflow.