Disclaimer: This summary has been generated by AI. It is experimental, and feedback is welcomed. Please reach out to info@qconlondon.com with any comments or concerns.
The presentation titled "Scaling API Independence: Mocking, Contract Testing & Observability in Large Microservices Environments" by Tom Akehurst focuses on addressing challenges in microservices environments through mocking, contract testing, and observability.
Key Topics Discussed:
- Microservices Challenges: Despite promises of independence, microservices teams often face issues like broken environments and dependencies on other teams' APIs. These challenges hinder productivity and development speed.
- Mocking and API Simulation: Mocking helps decouple systems, but maintaining realistic mocks across thousands of APIs is a challenge in itself. Mock APIs can drift from their real counterparts, so continuous validation is essential.
- Contract Testing: Contracts provide a syntactic description of APIs but lack behavioral context. Combining contracts with observed traffic and simulations helps validate behavior and ensure API reliability.
- API Observability: Observability involves actively and passively capturing API interactions to understand system behavior. Technologies such as eBPF and service meshes enable this capture without breaking encryption.
- Productivity through AI: AI and Large Language Models (LLMs) can enhance API simulations by generating and refining mock data, serving as productivity levers when paired with existing contract testing infrastructure.
Conclusion: The integration of these techniques—mocking, contract testing, and observability—enhances productivity and reliability within large microservices environments. They help in managing API complexities and improving the independence of service teams.
This is the end of the AI-generated content.
Microservices promise faster deployments and team autonomy. In reality, engineers are often blocked waiting for APIs, dealing with broken sandboxes, or wrangling test environments.
Mocking helps decouple dependencies and teams - but at the scale of 1,000+ internal APIs, maintaining realistic, reliable mocks is a challenge of its own. How do you ensure contract alignment? How do you keep mocks up-to-date without excessive maintenance?
In this talk, we'll explore new ways to combine mocking, contract testing, and traffic observation to support fast-flowing development and testing.
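For a sense of what API simulation looks like in practice, here is a minimal sketch using WireMock's Java DSL (the speaker's open source tool). The payments endpoint, port, payload, and recording target URL are illustrative assumptions, not details from the talk; the commented-out record-and-replay calls hint at one way to keep stubs from drifting away from the real API.

```java
import static com.github.tomakehurst.wiremock.client.WireMock.*;

import com.github.tomakehurst.wiremock.WireMockServer;

public class PaymentsApiSimulation {

    public static void main(String[] args) {
        // Run a standalone mock server on a fixed port (port choice is arbitrary here)
        WireMockServer mockApi = new WireMockServer(8089);
        mockApi.start();

        // Hand-written stub: decouples tests from the real (hypothetical) payments service
        mockApi.stubFor(get(urlPathEqualTo("/payments/123"))
                .willReturn(aResponse()
                        .withStatus(200)
                        .withHeader("Content-Type", "application/json")
                        .withBody("{\"id\":\"123\",\"status\":\"SETTLED\"}")));

        // To reduce drift, stubs can also be captured from real traffic:
        // proxy to the actual service, drive requests through the mock, then snapshot the stubs.
        // mockApi.startRecording("https://payments.internal.example.com");
        // ... exercise the API via http://localhost:8089 ...
        // mockApi.stopRecording();

        // Tests would now point their payments-service base URL at http://localhost:8089
        mockApi.stop();
    }
}
```

Keeping such stubs trustworthy at the scale of 1,000+ APIs is where the contract alignment and traffic observation themes of the talk come in.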
Interview:
What is the focus of your work?
Building tools to help organisations that depend heavily on APIs develop and test more productively.
What’s the motivation for your talk?
I encounter many engineering orgs for whom the promise of microservices - decoupled teams shipping independently - isn't being realized. They're stuck firefighting flaky dependencies, debugging spurious test failures in integrated environments, or waiting entire weekends for test runs to complete. In my view, this is largely due to some limiting assumptions and beliefs, particularly about mocking/simulation of APIs and how it can be done effectively at scale.
Who is your talk for?
Senior engineers/tech leads, engineering managers, senior QAs, QA managers
What do you want someone to walk away with from your presentation?
With increased confidence that API simulation can be a core pillar of their dev and test strategy, and some new ideas about how to achieve this in complex engineering organizations.
What do you think is the next big disruption in software?
It's hard to bet against AI coding tools at the moment.
What was one interesting thing that you learned from a previous QCon?
In 2013, both of Damian Conway's presentations were unforgettable - the showmanship and craft involved in building a Latin code interpreter just for a talk were incredibly impressive, and his second talk, on presentation technique, is still the one I refer back to the most.
Speaker

Tom Akehurst
CTO and Co-Founder @WireMock, 20+ Years Building Enterprise Systems
Tom is a career software developer who’s spent over 20 years building enterprise systems, primarily as a backend Java/JVM developer but with dabblings in infrastructure/DevOps, web development and performance engineering. He’s spent more than half of that time thinking about how to develop and test networked services more productively and is the creator of the WireMock open source API mocking tool. More recently, he became CTO and co-founder of WireMock, Inc.