Testing microservices versus testing monoliths
Microservices offer numerous advantages over a monolithic application structure.
Unlike monoliths, however, microservices lack established development patterns.
Many problems remain unsolved, and we have yet to witness the emergence of de facto standards for "the microservices way" of development.
Testing is no exception. For monoliths there is unit testing, component testing, and integration testing; the boundaries are clear, and so is the way to write tests.
What about microservices?
Say you use REST over HTTP(S) between microservices as your communication layer.
In a typical application, a (micro)service has a set of dependencies, often other (micro)services.
Like in unit testing, the first idea that comes to mind is mocking.
But what’s a good way to mock microservices?
Or should you always run the real instance of the dependency with the test data (or fixtures) to support testing?
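To make the baseline mocking idea concrete, here is a minimal in-process HTTP stub that can stand in for a dependency service during a test. This is a generic sketch, not code from our test suite; the `/health` endpoint and its payload are invented for illustration.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Throwaway in-process stub standing in for a dependency service.
# The /health endpoint and its payload are hypothetical.
class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({'status': 'ok'}).encode()
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging in test output

# Bind to port 0 so the OS picks a free port, then serve in the background.
server = HTTPServer(('127.0.0.1', 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = 'http://127.0.0.1:%d/health' % server.server_address[1]
health = json.loads(urlopen(url).read().decode())
server.shutdown()
```

A stub like this answers the service under test without any real dependency running, but keeping such stubs faithful to the real services is exactly the hard part.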
We thought of another way.
For the microservices reference application we defined multiple levels of tests:

1. Unit testing. The familiar unit testing for the application; not much to say here, as it depends on the implementation language.
2. Service testing. Test the service without its external dependencies, using data fixtures.
3. Container testing. Test the service as a container. This includes controlled injection of (mocked) dependencies and testing the behaviour of the service under different circumstances. Test the exposed API.
API specification and testing endpoints
If you are serious about continuous integration of your microservice zoo, consider writing a specification for the API.
A specification establishes a contract between the producer and the consumers of an API.
This is an essential step towards maintainability and continuous integration.
We chose OpenAPI (Swagger) to describe our microservices.
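For readers unfamiliar with the format, a trimmed OpenAPI 2.0 (Swagger) fragment for an accounts-style service might look like the following. The path and fields are illustrative only, not taken from our actual specification.

```yaml
swagger: "2.0"
info:
  title: Accounts
  version: "1.0"
paths:
  /accounts/{id}:
    get:
      parameters:
        - name: id
          in: path
          required: true
          type: string
      responses:
        "200":
          description: The account with the given id
```

The value of the spec is that both the response shapes and the status codes become machine-checkable.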
Now that we have the spec, the first logical step is to integrate automated API testing into our testing workflow.
For this, we chose Dredd, an outstanding tool.
Testing APIs with Dredd
Dredd is simple and effective.
It takes your Swagger (or API Blueprint) specification and an endpoint serving an API that claims to comply with it.
It then runs tests against that endpoint and makes sure it behaves exactly as the specification describes.
This is crucial: we now have a way to automatically validate our APIs.
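Dredd reports its results as a textual summary with pass/fail counts, so a CI step can gate on its output by scanning for a clean summary. A small sketch of that check follows; the summary strings are made-up examples in the style of Dredd's reporter, not captured output.

```python
def dredd_passed(output):
    # Treat the run as green only if Dredd reports no failures
    # and no errors in its summary output.
    return '0 failing' in output and '0 errors' in output

# Made-up summary lines in the style of Dredd's reporter:
green = dredd_passed('complete: 5 passing, 0 failing, 0 errors, 0 skipped')
red = dredd_passed('complete: 4 passing, 1 failing, 0 errors, 0 skipped')
```

This string check is crude but effective, and it is exactly the pattern our container-level test uses.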
Integration into testing workflow
We use containers for running our microservices, and also for running our test suite.
Each level of testing is a directory containing the tests for that level.
Let’s take a look at the container-level API test:

```python
command = ['docker', 'run',
           'weaveworksdemos/accounts:' + self.TAG]
out = Dredd().test_against_endpoint(
    "accounts/accounts.json",
    AccountsContainerTest.container_name,
    "http://accounts/",
    "mongodb://accounts-db:27017/data",
    self.mongo_container_name)
self.assertGreater(out.find("0 failing"), -1)
self.assertGreater(out.find("0 errors"), -1)
```
Here we are running Dredd against the API endpoint.
```python
image = 'weaveworksdemos/openapi'

# start the testing container and run it against the endpoint
def test_against_endpoint(self, json_spec, container_name, api_endpoint):
    # The argument list is truncated in the original listing; roughly,
    # it points the Dredd image at the spec and the running API.
    # The docker flags below are a sketch, not the real ones.
    command = ['docker', 'run',
               '--link', container_name,
               self.image, json_spec, api_endpoint]
    out = Docker().execute(command)
    return out
```
The routine starts the Dredd container and gives it the location of the spec and the endpoint where the API is running.
Dredd is supplied with a hooks.js file that seeds the database with fixtures for the service.
You can learn more about the Dredd Docker image we created in our repository, and read more about Dredd hooks in the documentation.
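To give a feel for the hooks mechanism, here is a skeletal hooks.js using Dredd's documented `hooks` module. The fixture-seeding body is deliberately left as a comment, since the real seeding logic is service-specific.

```javascript
// hooks.js -- loaded by Dredd via its --hookfiles option
var hooks = require('hooks');

hooks.beforeAll(function (transactions, done) {
  // Seed the database with fixtures before any transaction runs.
  // The seeding itself (e.g. inserting documents into the service's
  // MongoDB instance) is service-specific and omitted here.
  done();
});
```

Hooks can also run before or after individual transactions, which is useful when a test needs per-request setup rather than a one-off seed.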
More work to do
With this workflow we have defined the microservices testing levels and integrated testing of API endpoints against the specification into our continuous integration pipeline.
There’s a lot more work to do.
For example, it would be nice to introduce versioning for the API.
Also, we currently have to write and update the specification manually, which gets tedious fast. It is a necessary evil, though, since we use different technologies across our microservices and have not yet achieved full automation.
But it’s a good start and gives us more confidence as we keep deploying our services.