Dockerized PostgreSQL and Django for Local Development

Docker and docker-compose make it dead simple to avoid dependency hell and give your whole team a consistent environment for local development. This post walks through setting up a new Django project from scratch with Docker and docker-compose. It is modeled after a previous post that I wrote about doing a similar thing with Laravel and MySQL.

Dockerfile

Nothing too interesting is happening here; we are just installing Python 3 and pip.

FROM ubuntu:16.04

# system update
RUN apt update
RUN apt upgrade -y

# python deps
RUN apt install -y python3-dev python3-pip

docker-compose.yml

version: '2'
services:
  app:
    build: .
    ports:
      - "8000:8000"
    volumes:
      - .:/app
    working_dir: /app
    command: bash -c "pip3 install -r requirements.txt && python3 manage.py migrate && python3 manage.py runserver 0:8000"
    depends_on:
      - db
  db:
    image: postgres:9.6.5-alpine
    environment:
      - POSTGRES_USER=feedread
      - POSTGRES_PASSWORD=feedread
    volumes:
      - ./data:/var/lib/postgresql/data
    ports:
      - "5432:5432"

With this in place you can start your Django app with docker-compose up. Each time the app starts, it will install the latest dependencies, run migrations, and start serving the app on localhost:8000.
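
For reference, the database block in settings.py for this setup could look roughly like the following. This is just a sketch: it assumes psycopg2 is listed in requirements.txt, and the credentials and the db host name all come from the docker-compose.yml above.

# settings.py (sketch)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'feedread',        # POSTGRES_DB defaults to POSTGRES_USER
        'USER': 'feedread',
        'PASSWORD': 'feedread',
        'HOST': 'db',              # the service name from docker-compose.yml
        'PORT': 5432,
    }
}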

Notes

  1. In order to work with the database directly from your host machine, you should add the following record to your local /etc/hosts file (a quick connection check is sketched after this list).
    # /etc/hosts
    
    127.0.0.1 db
    
  2. Since we mount .:/app as a volume, all of your local changes are immediately visible in the dockerized app.
  3. If you need to access the running app or db container you can do so with docker-compose exec app bash or docker-compose exec db bash.
  4. This docker-compose file is not really suitable for production, since you probably do not want to reinstall dependencies and automatically run migrations every time the app starts.
  5. You can add additional services like memcached, a mail server, an app server, a queue, etc., using the same method that we are using above with our database.
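
As a quick check for note 1, something like the following should connect once the hosts entry is in place. This is a rough sketch: it assumes psycopg2 is installed on your host machine, and that the database name defaults to the POSTGRES_USER value since no POSTGRES_DB is set.

# check_db.py -- rough sketch, run from the host machine
import psycopg2

conn = psycopg2.connect(
    host='db',            # resolves to 127.0.0.1 via the /etc/hosts entry
    port=5432,
    dbname='feedread',    # POSTGRES_DB defaults to POSTGRES_USER
    user='feedread',
    password='feedread',
)
print('connected:', conn.dsn)
conn.close()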

I Want to Become a Core Python Developer

I’ve been tinkering with Python for almost five years now. I am absolutely in love with the language. My new goal is to make enough contributions to the project to join the core team.

This post is my attempt to keep a list of all that I’ve done in this endeavor. I will keep this up to date on a monthly basis.

Short Term Goals

  • Ship some actual code. Focus on improving test coverage.
  • Attend the next available local meetup.
  • Get this PR merged.
  • Work on some other low-hanging fruit from bedevere.

November 2017

Code

  • Reported an issue with Vagrant and Ansible on the pythondotorg repo, and assisted with testing the resolution. (Note for any future newbies: reporting issues, writing docs, and testing PRs are all super valuable things you can do to get more familiar with a project's code base.)
  • Substantial refactoring of the dev guide merged.

Community

  • Reached out to the core workflow team to see if we could introduce CircleCI into the Python organization. This addresses the PoC shown in this PR.

October 2017

Code

Community

  • Became a PSF Member.
  • Started hanging out in various IRC channels, notably #python on freenode, and helping out where I can.
  • Joined the PSF Volunteers mailing list to volunteer for opportunities as they come in.
  • Signed up for all of the dev-related mailing lists.
  • Joined the BAyPIGgies local Python meetup group.

Python Mocks Test Helpers

I’ve been writing a Python wrapper for the CircleCI API over the last week. I wanted to do this “the right way” with test-driven development.

I have a couple of integration tests that actually hit the CircleCI API, but most of the unit tests so far use MagicMock to ensure that the basic functions are working as expected.

This generally involves the tedious process of dumping out JSON, saving it to a file, and then reloading that file later on to actually test it.

I wrote two helper functions that make this process slightly less tedious.

Load Mock

The first is a helper that loads a mock response from a file and patches the request method so that every request returns that file's contents (typically JSON).

    def loadMock(self, filename):
        """helper function to open mock responses"""
        filename = 'tests/mocks/{0}'.format(filename)

        with open(filename, 'r') as f:
            self.c._request = MagicMock(return_value=f.read())
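
Both helpers assume the client under test lives on self.c. For context, the fixture looks something like the sketch below; the Circle class name and its constructor are illustrative stand-ins rather than the wrapper's actual API, and the usual import unittest / from unittest.mock import MagicMock lines are assumed at the top of the module.

    class TestCircleCIApi(unittest.TestCase):

        def setUp(self):
            # client under test; loadMock() replaces its _request method
            self.c = Circle(api_token='fake-token')  # illustrative name and signature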

Test Helper

The second is a helper that makes a real request once and dumps the response to a file.

    def test_helper(self):
        resp = self.c.add_circle_key()
        print(resp)
        with open('tests/mocks/mock_add_circle_key_response', 'w') as f:
            json.dump(resp, f)

Naming it test_helper allows it to be picked up and run whenever you run your test suite, since by default unittest will collect any methods that start with test.
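
Since anything prefixed with test gets collected, one option is to guard the helper behind an environment variable so it only hits the real API when you explicitly want to re-record the mock responses. This is just a sketch of how that could look, assuming import os and import unittest at the top of the module.

    @unittest.skipUnless(os.getenv('RECORD_MOCKS'), 'set RECORD_MOCKS=1 to re-record mocks')
    def test_helper(self):
        # hits the real API and refreshes the saved mock response
        resp = self.c.add_circle_key()
        with open('tests/mocks/mock_add_circle_key_response', 'w') as f:
            json.dump(resp, f)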

Usage

An actual example is shown below.

    def test_clear_cache(self):
        self.loadMock('mock_clear_cache_response')
        resp = json.loads(self.c.clear_cache('levlaz', 'circleci-sandbox'))

        self.assertEqual('build dependency caches deleted', resp['status'])

Writing the tests is easy: we just copy and paste the name of the file that was created with test_helper and verify that its contents are what we expect them to be.

This approach has been working very well for me so far. One thing to keep in mind when writing these types of tests is that you should also include some general integration tests against the API that you are working with. This way you can catch any regressions in your library in the event that the API changes. However, as a basic sanity check, mocking these requests is a good practice and less prone to flakiness.
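
For the integration side, a guarded test along these lines works well. It is only a sketch: the Circle constructor, the CIRCLE_TOKEN environment variable, and the skip guard are illustrative rather than part of the wrapper itself.

    @unittest.skipUnless(os.getenv('CIRCLE_TOKEN'), 'integration tests need a real API token')
    def test_clear_cache_integration(self):
        # talks to the real CircleCI API instead of a mocked _request
        c = Circle(api_token=os.getenv('CIRCLE_TOKEN'))  # illustrative constructor
        resp = json.loads(c.clear_cache('levlaz', 'circleci-sandbox'))
        self.assertEqual('build dependency caches deleted', resp['status'])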

Backing up and Restoring MySQL with mysqldump

I back up and restore databases across servers every few months, but each time I have to resort to reading this very verbose documentation. The steps below are a no-fuss way to do it each time.

Backup Server

SSH into the server with the database that you wish to back up and run the following command.

mysqldump -u root -p $DB_NAME > $DB_NAME.sql

Copy the File to Destination Server

Using scp, we can securely transfer the backup to the destination server.

scp $DB_NAME.sql $USER@$SERVER:

Restore on Destination Server

SSH into the server where you wish to restore the database. From the previous step, the backup file should now be located in your user's home directory.

  1. Create a new database
    mysql -u root -p -e "CREATE DATABASE $DB_NAME"
    
  2. Restore your backup
    mysql -u root -p $DB_NAME < $DB_NAME.sql