SQLite DB Migrations with PRAGMA user_version

This blog runs on a simple homegrown blogging engine that I wrote, backed by a SQLite database. I have a function in the Flask app that performs database migrations. My current approach has been to keep a folder full of migrations and run them sequentially whenever the app starts.

This works well for adding and removing tables, since SQLite has the handy IF NOT EXISTS option. However, when you are altering an existing table, this entire model falls apart since IF NOT EXISTS no longer applies.

Practically, this means that outside of a fresh install my database migrations are useless.

I am still being stubborn and not using a well-written solution like Alembic (which I would highly recommend for a “serious” project) for this blog. Instead, I discovered that SQLite comes with a built-in mechanism to keep track of the schema version. This is the PRAGMA statement, and specifically user_version.

Using PRAGMA user_version for DB Migrations

My migrations folder structure looks like this:

.
├── blog.db
├── blog.py
├── __init__.py
├── migrations
│   ├── 0001_initial_schema.sql
│   ├── 0002_add_unique_index_to_posts_tags.sql
│   ├── 0003_add_fts.sql
│   ├── 0004_add_column_to_post.sql
│   ├── 0005_add_comments_table.sql
│   └── 0006_add_admin_flag_to_comments.sql

As you can see, the naming convention is 000N_migration_description.sql. Each migration file contains the following statement:

PRAGMA user_version=N; (where N is the numeric prefix of the file name)

This bumps user_version to match the version defined by the file name.

The code to do stuff with the database is shown below:

def connect_db():
    """Connects to Database."""
    rv = sqlite3.connect(
        app.config['DATABASE'],
        detect_types=sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES)
    rv.row_factory = sqlite3.Row
    return rv


def get_db():
    """Opens new db connection if there is not an
    existing one for the current app ctx.
    """
    if not hasattr(g, 'sqlite_db'):
        g.sqlite_db = connect_db()
    return g.sqlite_db


def migrate_db():
    """Run database migrations."""

    def get_script_version(path):
        # extract N from a file name like 000N_migration_description.sql
        return int(os.path.basename(path).split('_')[0])

    db = get_db()
    current_version = db.cursor().execute('PRAGMA user_version').fetchone()[0]

    directory = os.path.dirname(__file__)
    migrations_path = os.path.join(directory, 'migrations')
    migration_files = os.listdir(migrations_path)

    # each script sets PRAGMA user_version itself, so applying them in
    # sorted order keeps the version check below valid
    for migration in sorted(migration_files):
        path = os.path.join('migrations', migration)
        migration_version = get_script_version(path)

        if migration_version > current_version:
            print("applying migration {0}".format(migration_version))
            with app.open_resource(path, mode='r') as f:
                db.cursor().executescript(f.read())
                print("database now at version {0}".format(migration_version))
        else:
            print("migration {0} already applied".format(migration_version))

The part relevant to this blog post is the migrate_db() function. Three things are happening:

  1. The get_script_version() helper function extracts the integer from the migration name.
  2. current_version gets the current value of user_version of your database.
  3. We iterate over each migration file in the migrations folder and perform a simple check. If the migration version is larger than the current_version we run the migration, otherwise it gets skipped.
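To see the version bookkeeping in action, here is a minimal sketch against an in-memory database (the posts table here is made up for illustration):

```python
import sqlite3

db = sqlite3.connect(':memory:')

# a fresh database starts at user_version 0
version = db.execute('PRAGMA user_version').fetchone()[0]
print(version)  # 0

# a migration script bumps the version as its last statement
db.executescript("""
CREATE TABLE IF NOT EXISTS posts (id INTEGER PRIMARY KEY, title TEXT);
PRAGMA user_version=1;
""")

# on the next run, the version check would skip this migration
version = db.execute('PRAGMA user_version').fetchone()[0]
print(version)  # 1
```

Running the same script again is harmless, which is exactly the property the sorted loop above relies on.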

This covers most cases and allows for a smooth upgrade path if anyone ever decides to start using this blogging engine for themselves. I am still pretty happy with this approach because it is essentially a fully functional migration system in just a handful of lines of Python.

A Robot With a Soul

OPUS: The Day We Found Earth was released on Nintendo Switch this week. I picked it up and played through the main story in a few hours. There are few other games at the $5 price point that are worth playing in the Nintendo eShop. This simple game tells a very compelling story. Like most great short stories, it quickly establishes an emotional connection with the main characters and draws you in.

Lately, I’ve been thinking about video games as a medium for telling compelling stories. No one does this better than indie developers and the team at SIGONO delivers with this emotional adventure.

In OPUS, you play as a tiny robot whose mission is to find the planet Earth in order to save the human race. You do this by exploring a vast galaxy from a spaceship that is equipped with a powerful telescope. As you progress through the game you uncover additional parts of the spaceship and begin to understand the curious circumstances in which the robot finds himself.

The game is short, the graphics are not revolutionary, and the game mechanics are very simple. However, where OPUS really shines is in the story that is told. The robot loves the woman who programmed him, he exhibits emotions, and you are quickly drawn in to feel sympathy and concern for his wellbeing. Coupled with the calming soundtrack by Triodust, you are immersed in the game and race against time to fulfill the seemingly futile task of finding Earth.

I really loved this game. I can’t wait to see what comes next from Sigono and I would love to see more games like this in the Nintendo eShop.

Dockerized PostgreSQL and Django for Local Development

Docker and docker-compose make it dead simple to avoid dependency hell and have a consistent environment for your whole team while doing local development. This post walks through setting up a new Django project from scratch to use Docker and docker-compose. It is modeled after a previous post that I wrote about doing a similar thing with Laravel and MySQL.

Dockerfile

Nothing too interesting happening here. Installing python and pip.

FROM ubuntu:16.04

# system update
RUN apt-get update && apt-get upgrade -y

# python deps
RUN apt-get install -y python3-dev python3-pip

docker-compose.yml

version: '2'
services:
  app:
    build: .
    ports:
      - "8000:8000"
    volumes:
      - .:/app
    working_dir: /app
    command: bash -c "pip3 install -r requirements.txt && python3 manage.py migrate && python3 manage.py runserver 0:8000"
    depends_on:
      - db
  db:
    image: postgres:9.6.5-alpine
    environment:
      - POSTGRES_USER=feedread
      - POSTGRES_PASSWORD=feedread
    volumes:
      - ./data:/var/lib/postgresql/data
    ports:
      - "5432:5432"

With this in place you can start your Django app with docker-compose up. Each time the app starts, it will install the latest dependencies, run migrations, and start serving the app on localhost:8000.

Notes

  1. In order to do stuff with the database locally you should add the following record to your local /etc/hosts file
    # /etc/hosts
    
    127.0.0.1 db
    
  2. Since we define .:/app as a volume, all of your local changes are immediately visible in the dockerized app.
  3. If you need to access the running app or db container you can do so with docker-compose exec app bash or docker-compose exec db bash.
  4. This docker-compose file is not really suitable for production, since you would probably not want to build the container each time the app starts or automatically run migrations.
  5. You can add additional services like memcached, a mail server, an app server, a queue, etc., using the same method that we are using above with our database.
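Not shown above is the Django side of the connection. A minimal sketch of the relevant settings.py fragment, assuming the credentials from the compose file and a database named feedread (the database name is an assumption; the official postgres image creates a database matching POSTGRES_USER by default):

```python
# settings.py (fragment) -- values mirror docker-compose.yml;
# the database name 'feedread' is an assumption
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'feedread',
        'USER': 'feedread',
        'PASSWORD': 'feedread',
        'HOST': 'db',    # the docker-compose service name
        'PORT': '5432',
    }
}
```

Because HOST points at the service name, the same settings also work from your host machine once you add the db record to /etc/hosts as described in note 1.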

I Want to Become a Core Python Developer

I’ve been tinkering with python for almost five years now. I am absolutely in love with the language. My new goal is to make enough contributions to the project to join the core team.

This post is my attempt to keep a list of all that I’ve done in this endeavor. I will keep this up to date on a monthly basis.

Short Term Goals

  • Ship some actual code. Focus on improving test coverage.
  • Attend the next available local meetup.
  • Get this PR merged.
  • Work on some other low hanging fruit from bedevere.

November 2017

Code

  • Reported an issue with Vagrant and Ansible on the pythondotorg repo, and assisted with testing the resolution. (Note for any future newbies: reporting issues, writing docs, and testing PRs are all super valuable things that you can do to get more familiar with a project's code base.)
  • Substantial refactoring of the dev guide merged.

Community

  • Reached out to the core workflow team to see if we could introduce CircleCI into the Python organization. This addresses the PoC shown in this PR.

October 2017

Code

Community

  • Became a PSF Member.
  • Hung out in various IRC channels, notably #python on freenode, and helped out where I could.
  • Joined the PSF Volunteers mailing list and volunteered for opportunities as they came in.
  • Signed up for all the dev-related mailing lists.
  • Joined the BAyPIGgies local python meetup group.

Python Mocks Test Helpers

I’ve been writing a python wrapper for the CircleCI API over the last week. I wanted to do this “the right way” with test driven development.

I have a couple integration tests that actually hit the CircleCI API, but most of the unit tests so far are using MagicMock to ensure that the basic functions are working as expected.

This generally involves the tedious process of dumping out JSON, saving it to a file, and then reloading that file later on to actually test it.

I wrote two helper functions that make this process slightly less tedious.

Load Mock

The first is a function that loads a file and overrides every request to return that file (typically as JSON).

    def loadMock(self, filename):
        """helper function to open mock responses"""
        filename = 'tests/mocks/{0}'.format(filename)

        with open(filename, 'r') as f:
            self.c._request = MagicMock(return_value=f.read())

Test Helper

The second is a function that runs a real request for the first time and dumps the output to a file.

    def test_helper(self):
        resp = self.c.add_circle_key()
        print(resp)
        with open('tests/mocks/mock_add_circle_key_response', 'w') as f:
            json.dump(resp, f)

Naming it test_helper allows it to be picked up and run when you run your test suite, since by default unittest will collect any method that starts with test.

Usage

An actual example is shown below.

    def test_clear_cache(self):
        self.loadMock('mock_clear_cache_response')
        resp = json.loads(self.c.clear_cache('levlaz', 'circleci-sandbox'))

        self.assertEqual('build dependency caches deleted', resp['status'])

Writing the tests is easy: we just copy and paste the name of the file that was created with test_helper and verify that the contents are what we expect them to be.
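Stripped of the file handling, the core of the pattern is just MagicMock's return_value. A self-contained sketch (the Client class and the JSON payload here are invented for illustration):

```python
import json
from unittest.mock import MagicMock

class Client:
    """Stand-in for the real API client; only _request matters here."""
    def _request(self, url):
        raise RuntimeError('would hit the network')

    def clear_cache(self, username, project):
        return self._request(
            '/project/{0}/{1}/build-cache'.format(username, project))

c = Client()

# override _request so every call returns a canned response,
# just like loadMock() does with a file's contents
canned = '{"status": "build dependency caches deleted"}'
c._request = MagicMock(return_value=canned)

resp = json.loads(c.clear_cache('levlaz', 'circleci-sandbox'))
print(resp['status'])  # build dependency caches deleted
```

The real loadMock() does the same thing, except the canned string comes from a file on disk.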

This approach has been working very well for me so far. One thing to keep in mind with writing these types of tests is that you should also include some general integration tests against the API that you are working with. This way you can catch any regressions with your library in the event that the API changes in any way. However, as a basic sanity check mocking these requests is a good practice and less prone to flakiness.

Backing up and Restoring MySQL with mysqldump

I backup and restore databases across servers every few months, but each time I have to resort to reading this very verbose documentation. The steps below are a no fuss way to do this each time.

Backup Server

SSH into the server with the database that you wish to back up and run the following command.

mysqldump -u root -p $DB_NAME > $DB_NAME.sql

Copy the File to Destination Server

Using scp, we can securely transfer the backup.

scp $DB_NAME.sql $USER@$SERVER:

Restore on Destination Server

SSH into the server with the database that you wish to restore. From the previous step, the backup file should now be located in the home directory of $USER.

  1. Create new database
    mysql -u root -p -e "CREATE DATABASE $DB_NAME"
    
  2. Restore your backup
    mysql -u root -p $DB_NAME < $DB_NAME.sql
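The same steps can be scripted. A sketch that assembles the commands from above (the helper names are my own, and mysqldump will still prompt for the password interactively):

```python
import subprocess

def dump_cmd(db_name):
    # equivalent to: mysqldump -u root -p $DB_NAME
    return ['mysqldump', '-u', 'root', '-p', db_name]

def restore_cmd(db_name):
    # equivalent to: mysql -u root -p $DB_NAME
    return ['mysql', '-u', 'root', '-p', db_name]

def backup_to_file(db_name):
    """Run mysqldump and save its output to <db_name>.sql."""
    with open('{0}.sql'.format(db_name), 'w') as f:
        subprocess.run(dump_cmd(db_name), stdout=f, check=True)
```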

Bicycle

In 2010, when I was living in Maryland, I purchased the new B.o.B album that had just come out on iTunes and listened to it while I walked two miles down Cherry Hill Road to Target. I don’t remember exactly why I walked. Perhaps my car was in the repair shop. At Target, I bought a new bicycle for the first time in many years. Walking down meant I had to ride it up. This didn’t last very long, and I had an embarrassing walk of shame since I was not able to make it up the hill.

Back then, and even today, I know nothing about bikes. I am pretty sure I bought the worst possible bike for the occasion. It was slow, clunky, and felt like it would fall apart at any moment.

It’s worth noting, that at the time I was working at the National Naval Medical Center which was under 10 miles from my apartment. 10 miles in beltway traffic can quickly turn into a 90 minute commute. The only other option was taking the train, which was in an inconvenient U shape. The train station was a few miles away and the bus to take you there was slow, also in some inconvenient letter shape, and overall the commute time was not much better.

Riding a bike to the train station was a viable option, and I ended up doing just that a number of times. The best part about this is that there was a bike trail directly next to my apartment that took you up to the University of Maryland campus and the metro station. This trail was beautiful, and there was even a creepy swamp straight out of a horror movie that would be filled with Silent Hill esque fog in the mornings.

The best memory that I have of this bike was the time that my Ford Focus broke down for good. I loved that car. It was the first thing I bought with my first military paycheck. I got it with around 7 miles on it brand new in 2007. I drove all over the east coast and the midwest in that egg-shaped hot red car. I blew the speakers out listening to house music that Gerald introduced me to. I popped it into third gear one time and chased a woman down Wisconsin Ave in a fit of road rage when she cut me off one day. One of my friends joked to me that the moment I reached 45,000 miles the car would break down.

Damn that person. Literally the day I reached 45,000 miles, my clutch went out on the beltway. It was one of the most frustrating experiences of my life. I somehow made it back to my apartment. This led to one of my most memorable moments with that old bike: I rode it, in the middle of winter, over ice, to a Honda dealership.

Let’s make one thing clear. When you show up to a dealership on a bicycle in the middle of the winter, you just made the day of whoever is lucky enough to come talk to you first because there is no way that you are leaving there without a car. I got a Honda Civic. Also brand new. No clutch this time. That car, named Chester, is still around. My dad drives it these days.

In 2011, when I was preparing to leave Maryland and move back to Ohio I sold the bike to a University of Maryland engineering student for a fraction of what I paid for it. I remember watching him ride away into the sunset. That was the last time I rode a bicycle.

Maryland, Ohio, and New Jersey, where I spent most of the last decade, are not really big bicycle towns. San Francisco on the other hand is full of bike lanes and bike shares, and every morning you can see hundreds of cyclists commuting to work like a herd of gazelles down Market Street.

I remember one of the doctors that I worked with biked to work every day. Unlike the folks around here who do it unpretentiously, it was an entire event for him. He would wear the whole tight clothes getup, take a shower before he started to work, and then change into his uniform. Must be nice, who has time for that?

Ever since I moved here, I have been wanting to get a bicycle. A few weeks ago I asked my twitter followers to recommend a bike shop. My good friend, and co-worker, Tad made me an offer I couldn’t refuse. Rather than recommending a bike shop or a bike model he gave me an old bike instead. Tonight, I finally got a chance to go pick it up and take it for a spin.

It was amazing.

By far, this is the best bicycle that I have ever ridden. It has huge wheels. It takes very little effort to pick up speed. It’s fast. Most of all, it’s fun. I felt like a kid again riding on that thing.

We rode to Golden Gate Park to watch the awesome photosynthesis light show at the Conservatory of Flowers. I learned about The Wiggle and rode home from the Haight to SoMa. There was something truly amazing and freeing about biking home tonight. I saw the city in a whole different light.

Besides living in constant fear of my front wheel or seat being stolen, I cannot wait to take this for a spin all over the city. The first thing I want to do is finally make my way all around Golden Gate Park. That place is huge, and walking around it would take an entire day. I am too lazy for that. Naturally, I am going to join the flock of tourists and take a ride over the Golden Gate Bridge one of these days as well.

My last bike offered me so many great memories that I have not really thought about until now. I can’t wait to see what adventures this new bike will have in store for me.

I want to give a public, heartfelt, humongous THANK YOU to Tad. He really made my day.

Spring Security, Webjars, and MIME type error

I volunteered to be JLO (Java Language Owner) at CircleCI and I am currently working on getting a sample Spring Framework project running on CircleCI 2.0. I made a simple app bootstrapped with the Spring Initializr. I included Spring Security for the first time, and I decided to try out WebJars for static JavaScript libraries such as Bootstrap. I am using Thymeleaf for templating. The app does not actually do anything yet, but I ran into a pretty strange issue today that I wanted to write up here. My home page is pretty straightforward.

<!DOCTYPE html>
<html xmlns:th="http://www.thymeleaf.org">
<head>
    <title>CircleCI Spring Demo</title>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />

    <link rel="stylesheet" th:href="@{/webjars/bootstrap/3.3.7/css/bootstrap.min.css}" />

    <link rel="stylesheet" th:href="@{/css/style.css}" href="../static/css/style.css" />
</head>
<body>

    <nav class="navbar">
        <div class="container">
            <div class="navbar-header">
                <a class="navbar-brand" href="#">CircleCI Demo Spring</a>
            </div>
            <div id="navbar" class="collapse navbar-collapse">
                <ul class="nav navbar-nav">
                    <li class="active"><a href="#">Home</a></li>
                    <li><a href="#">About</a></li>
                </ul>
            </div>
        </div>
    </nav>

    <div class="container">
        <h1> CircleCI Spring Demo </h1>
    </div>

    <script th:src="@{/webjars/bootstrap/3.3.7/js/bootstrap.min.js}"></script>
</body>
</html>

However, when I tried to load the app with mvn spring-boot:run, none of the styles showed up and the console showed the following error message:

Resource interpreted as Stylesheet but transferred with MIME type text/html

It turns out that a default spring-security config will block basically any request unless you whitelist it. The MIME type is a red herring: what is actually happening is that my spring-security config is redirecting all unauthenticated users to my login page (login.html) instead of serving the stylesheet from the /webjars directory. The solution is to update my security configuration to whitelist anything that comes from /webjars.

package com.circleci.demojavaspring;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.authentication.builders.AuthenticationManagerBuilder;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;

@Configuration
@EnableWebSecurity
public class WebSecurityConfig extends WebSecurityConfigurerAdapter {
    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http
            .authorizeRequests()
                .antMatchers("/", "/home", "/webjars/**").permitAll()
                .anyRequest().authenticated()
                .and()
            .formLogin()
                .loginPage("/login")
                .permitAll()
                .and()
            .logout()
                .permitAll();
    }
}

Now, the styles load as expected.

Install Netbeans on Debian Stable

Netbeans is a great open source Java IDE. For some reason it is missing from the current stable repository on Debian. In order to get it installed as a regular desktop application on Debian Jessie (using GNOME) you should do the following:

  1. JDK 8 is required in order to use Netbeans. The default-jdk package on Jessie installs JDK 7. First you must enable Debian backports, and then you can install JDK 8 with sudo apt install -t jessie-backports openjdk-8-jdk
  2. Download the latest version from the releases page. There are a couple different flavors. I usually choose the one that contains everything. This will download a bash installer script.
  3. Open up a terminal and navigate to wherever you downloaded the script from Step 2. Execute the script with sh netbeans*.sh
  4. This will run some pre-flight checks and then fire up an installation wizard that will guide you through the rest of the process.
  5. Once Netbeans has been installed you can launch it by clicking on the icon that should now be on your desktop.

Using gtk-doc with Anjuta on Debian Stable

gtk-doc is a tool that helps extract code documentation. When you create a new project with Anjuta, it asks if you wish to include gtk-doc. Unfortunately, on Debian stable there seems to be a bug because the autoconf configuration is looking for the wrong version of gtk-doc.

/home/levlaz/git/librefocus/configure: line 13072: syntax error near unexpected token `1.0'
/home/levlaz/git/librefocus/configure: line 13072: `GTK_DOC_CHECK(1.0)'

On Debian stable, the version of gtk-doc that comes with the gtk-doc-tools package is 1.21. In order to resolve this error you need to update configure.ac to use the newer version of gtk-doc as shown below:

GTK_DOC_CHECK([1.21])

Then you need to regenerate the entire project and everything should work as expected.