Python Mocks Test Helpers

| python | programming | ci |

I’ve been writing a Python wrapper for the CircleCI API over the last week. I wanted to do this “the right way” with test-driven development.

I have a couple of integration tests that actually hit the CircleCI API, but most of the unit tests so far use MagicMock to ensure that the basic functions work as expected.

This generally involves the tedious process of dumping out JSON, saving it to a file, and then reloading that file later on to actually test it.

I wrote two helper functions that make this process slightly less tedious.

Load Mock

The first is a function that loads a file and overrides every request to return that file (typically as JSON).
    def loadMock(self, filename):
        """Helper function to open mock responses."""
        filename = 'tests/mocks/{0}'.format(filename)
        with open(filename, 'r') as f:
            self.c._request = MagicMock(return_value=f.read())
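Here is a minimal, self-contained sketch of how this pattern fits into a unittest.TestCase. The FakeClient class and its method bodies are assumptions standing in for the real wrapper, and a temp directory replaces tests/mocks/ so the example runs anywhere:

```python
import json
import os
import tempfile
import unittest
from unittest.mock import MagicMock


class FakeClient:
    """Stand-in for the real CircleCI client; only _request matters here."""

    def _request(self, method, url):
        # In the real wrapper this would perform an HTTP call.
        raise RuntimeError("network calls are disabled in unit tests")

    def clear_cache(self, user, repo):
        return self._request(
            'DELETE', '/project/{0}/{1}/build-cache'.format(user, repo))


class TestWithMocks(unittest.TestCase):
    def setUp(self):
        self.c = FakeClient()
        # Write a canned response to a temp dir instead of tests/mocks/.
        self.mock_dir = tempfile.mkdtemp()
        path = os.path.join(self.mock_dir, 'mock_clear_cache_response')
        with open(path, 'w') as f:
            json.dump({'status': 'build dependency caches deleted'}, f)

    def loadMock(self, filename):
        """Override every request to return the saved mock response."""
        filename = os.path.join(self.mock_dir, filename)
        with open(filename, 'r') as f:
            self.c._request = MagicMock(return_value=f.read())

    def test_clear_cache(self):
        self.loadMock('mock_clear_cache_response')
        resp = json.loads(self.c.clear_cache('levlaz', 'circleci-sandbox'))
        self.assertEqual('build dependency caches deleted', resp['status'])
```

Because loadMock replaces `_request` on the instance, every call the wrapper makes goes through the MagicMock and returns the file contents instead of touching the network.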

Test Helper

The second is a function that runs a real request for the first time and dumps the output to a file.
    def test_helper(self):
        resp = self.c.add_circle_key()
        print(resp)
        with open('tests/mocks/mock_add_circle_key_response', 'w') as f:
            json.dump(resp, f)
Naming it test_helper allows it to be picked up and run when you run your test suite, since by default unittest collects any methods that start with test.
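One tweak worth considering (not part of the original helper): since test_helper hits the real API, you can guard it behind an environment variable so it only runs when you explicitly want to refresh the mock files. The RECORD_MOCKS variable name here is an assumption:

```python
import os
import unittest


class TestRecord(unittest.TestCase):
    # Only re-record mock responses when RECORD_MOCKS=1 is set;
    # otherwise unittest reports this test as skipped.
    @unittest.skipUnless(os.getenv('RECORD_MOCKS') == '1',
                         'set RECORD_MOCKS=1 to refresh mock responses')
    def test_helper(self):
        pass  # the real request and json.dump would go here
```

This keeps day-to-day test runs fast and offline while still making it a one-liner to regenerate the fixtures.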

Usage

An actual example is shown below.
    def test_clear_cache(self):
        self.loadMock('mock_clear_cache_response')
        resp = json.loads(self.c.clear_cache('levlaz', 'circleci-sandbox'))
        self.assertEqual('build dependency caches deleted', resp['status'])

Writing the tests is easy: we just copy and paste the name of the file that was created with test_helper and verify that the contents are what we expect them to be.

This approach has been working very well for me so far. One thing to keep in mind when writing these types of tests is that you should also include some general integration tests against the API you are working with. That way you can catch regressions in your library if the API ever changes. As a basic sanity check, however, mocking these requests is good practice and less prone to flakiness.

Thank you for reading! Share your thoughts with me on bluesky, mastodon, or via email.
