Contract-First API Development with AsyncAPI and OpenAPI: Best Practices for Event-Driven Architectures
Most teams build APIs the old way. They write code first, then figure out the documentation later. This works for small projects, but it breaks down fast when you’re building microservices or event-driven systems.
Contract-first development flips this around. You define what your API should look like before writing any code. It sounds like extra work, but it actually saves time and prevents bugs.
Here’s why this matters: when you have multiple teams working on different services, they need to know exactly how to talk to each other. Without clear contracts, you get integration problems, broken tests, and frustrated developers.
Why Contract-First Matters
Think about building a house. You wouldn’t start hammering nails without blueprints, right? API contracts are like blueprints for your services.
With contract-first development, you get:
- Clear expectations: Everyone knows what the API should do before coding starts
- Better testing: You can test against the contract, not just the implementation
- Faster development: Teams can work in parallel once contracts are defined
- Fewer bugs: Integration issues get caught early
This approach works especially well for microservices. When you have 10 or 20 services talking to each other, you need clear rules about how they communicate.
OpenAPI for REST APIs
OpenAPI is the standard for documenting REST APIs. Most developers know it, but many still use it wrong. They write code first, then generate OpenAPI docs from their code. This misses the point.
Defining Schemas First
Let’s look at a simple example. Say you’re building a user service. Here’s how you’d define the contract first:
openapi: 3.0.3
info:
  title: User Service API
  version: 1.0.0
  description: Manages user accounts and profiles
paths:
  /users:
    get:
      summary: List all users
      parameters:
        - name: limit
          in: query
          schema:
            type: integer
            minimum: 1
            maximum: 100
            default: 20
        - name: offset
          in: query
          schema:
            type: integer
            minimum: 0
            default: 0
      responses:
        '200':
          description: List of users
          content:
            application/json:
              schema:
                type: object
                properties:
                  users:
                    type: array
                    items:
                      $ref: '#/components/schemas/User'
                  total:
                    type: integer
                  hasMore:
                    type: boolean
    post:
      summary: Create a new user
      requestBody:
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/CreateUserRequest'
      responses:
        '201':
          description: User created successfully
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/User'
        '400':
          description: Invalid input
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Error'
components:
  schemas:
    User:
      type: object
      required:
        - id
        - email
        - createdAt
      properties:
        id:
          type: string
          format: uuid
        email:
          type: string
          format: email
        firstName:
          type: string
        lastName:
          type: string
        createdAt:
          type: string
          format: date-time
        updatedAt:
          type: string
          format: date-time
    CreateUserRequest:
      type: object
      required:
        - email
        - firstName
        - lastName
      properties:
        email:
          type: string
          format: email
        firstName:
          type: string
          minLength: 1
          maxLength: 50
        lastName:
          type: string
          minLength: 1
          maxLength: 50
    Error:
      type: object
      required:
        - code
        - message
      properties:
        code:
          type: string
        message:
          type: string
        details:
          type: object
This contract tells you exactly what the API should do. It defines the data types, validation rules, and response formats. Now you can generate server stubs and client libraries from this specification.
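To see what those validation rules mean in practice, here is a rough hand-written sketch of enforcing the CreateUserRequest constraints at runtime. The function name and error format are illustrative only; real generated code from a toolchain will look different.

```typescript
// Hypothetical runtime check mirroring the CreateUserRequest schema above.
// Illustration only -- not the output of any particular generator.
interface CreateUserRequest {
  email: string;
  firstName: string;
  lastName: string;
}

function validateCreateUser(body: CreateUserRequest): string[] {
  const errors: string[] = [];
  // format: email -- a simple shape check, not full RFC validation
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(body.email)) {
    errors.push("email must be a valid email address");
  }
  // minLength: 1, maxLength: 50 for both name fields
  for (const [field, value] of [
    ["firstName", body.firstName],
    ["lastName", body.lastName],
  ] as const) {
    if (value.length < 1 || value.length > 50) {
      errors.push(`${field} must be between 1 and 50 characters`);
    }
  }
  return errors;
}
```

The point is that every rule lives in the contract first; the code only restates what the spec already promised.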
Generating Server Stubs and Clients
Once you have your OpenAPI spec, you can generate code. For a Node.js service, you might use the OpenAPI Generator:
# Generate server stub
npx @openapitools/openapi-generator-cli generate \
  -i user-service.yaml \
  -g nodejs-express-server \
  -o ./server

# Generate TypeScript client
npx @openapitools/openapi-generator-cli generate \
  -i user-service.yaml \
  -g typescript-axios \
  -o ./client
The generated server gives you a starting point with all the endpoints defined. You just need to implement the business logic. The client library gives other teams a type-safe way to call your API.
CI/CD Pipeline Integration
Here’s where contract-first really shines. You can validate your API contracts in your CI/CD pipeline:
name: API Contract Validation
on:
  pull_request:
    paths:
      - 'contracts/**'
      - 'src/**'
jobs:
  validate-contracts:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
      - name: Install dependencies
        run: npm install
      - name: Validate OpenAPI specs
        run: |
          npx @apidevtools/swagger-cli validate contracts/user-service.yaml
          npx @apidevtools/swagger-cli validate contracts/order-service.yaml
      - name: Generate and test client
        run: |
          npx @openapitools/openapi-generator-cli generate \
            -i contracts/user-service.yaml \
            -g typescript-axios \
            -o ./generated-client
          npm run test:integration
This pipeline catches contract issues before they reach production. If someone changes an API in a breaking way, the tests will fail.
AsyncAPI for Event-Driven Systems
REST APIs are great for request-response patterns. But modern applications also need to handle events. That’s where AsyncAPI comes in.
AsyncAPI is like OpenAPI, but for event-driven architectures. It documents message queues, event streams, and real-time communication.
Documenting Event Streams
Let’s say you’re building an e-commerce system. You need to handle events like “order created,” “payment processed,” and “inventory updated.” Here’s how you’d document this with AsyncAPI:
asyncapi: 3.0.0
info:
  title: E-commerce Event Stream
  version: 1.0.0
  description: Events for order processing and inventory management
servers:
  production:
    host: kafka.production.com:9092
    protocol: kafka
    description: Production Kafka cluster
  staging:
    host: kafka.staging.com:9092
    protocol: kafka
    description: Staging Kafka cluster
channels:
  orderCreated:
    address: order.created
    messages:
      orderCreated:
        $ref: '#/components/messages/OrderCreated'
  paymentProcessed:
    address: payment.processed
    messages:
      paymentProcessed:
        $ref: '#/components/messages/PaymentProcessed'
  inventoryUpdated:
    address: inventory.updated
    messages:
      inventoryUpdated:
        $ref: '#/components/messages/InventoryUpdated'
operations:
  publishOrderCreated:
    action: send
    channel:
      $ref: '#/channels/orderCreated'
    summary: Publish order created event
    traits:
      - $ref: '#/components/operationTraits/kafka'
  subscribeToPaymentProcessed:
    action: receive
    channel:
      $ref: '#/channels/paymentProcessed'
    summary: Listen for payment processed events
    traits:
      - $ref: '#/components/operationTraits/kafka'
components:
  messages:
    OrderCreated:
      name: OrderCreated
      title: Order Created Event
      summary: Published when a new order is created
      contentType: application/json
      payload:
        $ref: '#/components/schemas/OrderCreatedPayload'
    PaymentProcessed:
      name: PaymentProcessed
      title: Payment Processed Event
      summary: Published when payment is successfully processed
      contentType: application/json
      payload:
        $ref: '#/components/schemas/PaymentProcessedPayload'
    InventoryUpdated:
      name: InventoryUpdated
      title: Inventory Updated Event
      summary: Published when inventory levels change
      contentType: application/json
      payload:
        $ref: '#/components/schemas/InventoryUpdatedPayload'
  schemas:
    OrderCreatedPayload:
      type: object
      required:
        - orderId
        - customerId
        - items
        - totalAmount
        - timestamp
      properties:
        orderId:
          type: string
          format: uuid
        customerId:
          type: string
          format: uuid
        items:
          type: array
          items:
            $ref: '#/components/schemas/OrderItem'
        totalAmount:
          type: number
        currency:
          type: string
          default: USD
        timestamp:
          type: string
          format: date-time
    OrderItem:
      type: object
      required:
        - productId
        - quantity
        - price
      properties:
        productId:
          type: string
        quantity:
          type: integer
          minimum: 1
        price:
          type: number
    PaymentProcessedPayload:
      type: object
      required:
        - paymentId
        - orderId
        - amount
        - status
        - timestamp
      properties:
        paymentId:
          type: string
          format: uuid
        orderId:
          type: string
          format: uuid
        amount:
          type: number
        currency:
          type: string
          default: USD
        status:
          type: string
          enum: [completed, failed, pending]
        timestamp:
          type: string
          format: date-time
    InventoryUpdatedPayload:
      type: object
      required:
        - productId
        - newQuantity
        - previousQuantity
        - timestamp
      properties:
        productId:
          type: string
        newQuantity:
          type: integer
          minimum: 0
        previousQuantity:
          type: integer
          minimum: 0
        reason:
          type: string
          enum: [order, restock, adjustment, return]
        timestamp:
          type: string
          format: date-time
  operationTraits:
    kafka:
      bindings:
        kafka:
          groupId: ecommerce-service
          clientId: ecommerce-service-client
This AsyncAPI spec tells you exactly what events are available, what data they contain, and how to connect to them. Other teams can use this to build event consumers without guessing about the data structure.
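Consumers can also defend against malformed messages by checking the contract's required fields before trusting a payload. A minimal hand-written sketch (the type guard and interface names here are ours, not generator output):

```typescript
// Shape of the OrderCreatedPayload schema from the AsyncAPI spec above.
interface OrderCreatedPayload {
  orderId: string;
  customerId: string;
  items: { productId: string; quantity: number; price: number }[];
  totalAmount: number;
  currency?: string; // optional in the schema, defaults to USD
  timestamp: string;
}

// Runtime type guard for the schema's required fields. A consumer would
// run this on each deserialized message before handling it.
function isOrderCreatedPayload(value: unknown): value is OrderCreatedPayload {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.orderId === "string" &&
    typeof v.customerId === "string" &&
    Array.isArray(v.items) &&
    typeof v.totalAmount === "number" &&
    typeof v.timestamp === "string"
  );
}
```

Because the guard is derived directly from the spec, a schema change forces a visible code change in every consumer that uses it.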
How AsyncAPI Enables Discoverability
With AsyncAPI, you can generate documentation, client libraries, and even mock servers. This makes event-driven systems much easier to work with.
You can generate TypeScript models for the event payloads with the AsyncAPI CLI:

npx @asyncapi/cli generate models typescript \
  ecommerce-events.yaml \
  -o ./generated-client

The generated models give you type-safe event handling:
import { OrderCreatedPayload, PaymentProcessedPayload } from './generated-client';

// Type-safe event handling
async function handleOrderCreated(event: OrderCreatedPayload) {
  console.log(`Order ${event.orderId} created for customer ${event.customerId}`);
  // Your business logic here
}

async function handlePaymentProcessed(event: PaymentProcessedPayload) {
  if (event.status === 'completed') {
    console.log(`Payment ${event.paymentId} completed for order ${event.orderId}`);
    // Update order status, send confirmation email, etc.
  }
}
Unifying REST and Event APIs
Most real applications need both REST APIs and event streams. The challenge is managing both types of contracts in a consistent way.
Managing Both in One Repository
Here’s a practical approach for organizing your contracts:
contracts/
├── rest/
│   ├── user-service/
│   │   ├── v1/
│   │   │   └── user-service.yaml
│   │   └── v2/
│   │       └── user-service.yaml
│   ├── order-service/
│   │   └── v1/
│   │       └── order-service.yaml
│   └── payment-service/
│       └── v1/
│           └── payment-service.yaml
├── events/
│   ├── ecommerce-events.yaml
│   ├── user-events.yaml
│   └── notification-events.yaml
└── shared/
    ├── common-schemas.yaml
    └── error-schemas.yaml
This structure keeps related contracts together while maintaining clear separation between REST and event APIs.
Governance and Schema Validation
You need rules for how contracts can change. Here’s a simple governance model:
- Breaking changes require a new version
- All changes must be reviewed
- Contracts must pass validation
- Integration tests must pass
You can enforce this with tooling:
name: Contract Governance
on:
  pull_request:
    paths:
      - 'contracts/**'
jobs:
  validate-changes:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Check for breaking changes
        run: |
          # Compare the PR version against the copy on main;
          # openapi-diff exits non-zero when it finds breaking changes
          git show origin/main:contracts/rest/user-service/v1/user-service.yaml \
            > /tmp/base-user-service.yaml
          npx openapi-diff /tmp/base-user-service.yaml \
            contracts/rest/user-service/v1/user-service.yaml
      - name: Validate all contracts
        run: |
          find contracts/rest -name "*.yaml" -exec npx @apidevtools/swagger-cli validate {} \;
          find contracts/events -name "*.yaml" -exec npx @asyncapi/cli validate {} \;
      - name: Run integration tests
        run: |
          npm run test:contracts
Best Practices
Here are the key things to remember when doing contract-first development:
Versioning and Backward Compatibility
Version your contracts carefully. For REST APIs, use URL versioning:
/api/v1/users
/api/v2/users
For events, include version information in the message payload:
{
  "version": "1.0",
  "eventType": "order.created",
  "data": {
    "orderId": "123",
    "customerId": "456"
  }
}
Always maintain backward compatibility when possible. Add new fields as optional, never remove required fields without a major version bump.
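One way to make that rule concrete is an envelope type that carries the version alongside the payload, with an explicit upgrade path from older versions. The type and field names below are illustrative, assuming a hypothetical v1.1 that adds one optional field:

```typescript
// Hypothetical versioned envelopes for the order.created event.
// v1.1 adds an optional field; the required v1.0 fields are never removed.
interface OrderCreatedV1 {
  version: "1.0";
  eventType: "order.created";
  data: { orderId: string; customerId: string };
}

interface OrderCreatedV1_1 {
  version: "1.1";
  eventType: "order.created";
  data: { orderId: string; customerId: string; channel?: string };
}

type OrderCreatedEvent = OrderCreatedV1 | OrderCreatedV1_1;

// Upcast older events so consumers only ever handle the latest shape.
function upgrade(event: OrderCreatedEvent): OrderCreatedV1_1 {
  switch (event.version) {
    case "1.0":
      // Backward compatible: the new field is optional, old fields untouched.
      return { ...event, version: "1.1", data: { ...event.data } };
    case "1.1":
      return event;
  }
}
```

Keeping the upcast in one place means only one module in each consumer has to know about old versions.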
Validating Contracts in CI/CD
Don’t just validate syntax. Test that your contracts actually work:
name: Contract Testing
jobs:
  test-contracts:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Generate mock server
        run: |
          npx @openapitools/openapi-generator-cli generate \
            -i contracts/rest/user-service/v1/user-service.yaml \
            -g nodejs-express-server \
            -o ./mock-server
      - name: Start mock server
        run: |
          cd mock-server
          npm install
          npm start &
          sleep 10
      - name: Test API endpoints
        run: |
          curl -f http://localhost:3000/users || exit 1
      - name: Validate event schemas
        run: |
          npx @asyncapi/cli validate contracts/events/ecommerce-events.yaml
Using Code Generation Wisely
Code generation is powerful, but don’t rely on it blindly. Use it for:
- Boilerplate code: Request/response models, validation
- Client libraries: Type-safe API clients
- Documentation: Auto-generated API docs
Don’t use it for:
- Business logic: Keep this in your application code
- Complex validation: Custom rules should be in your code
- Everything: Sometimes hand-written code is clearer
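Cross-field business rules are a good example of the dividing line. A schema can say "totalAmount is a number," but a rule like "the total must equal the sum of the line items" belongs in hand-written, unit-testable code. A small sketch (the function name is ours; OrderItem mirrors the schema from the AsyncAPI example):

```typescript
// Generated models handle shape; hand-written code handles business rules.
interface OrderItem {
  productId: string;
  quantity: number;
  price: number;
}

// Business rule: the stated total must match the line items (to the cent).
// Tolerance absorbs floating-point rounding in the sum.
function totalMatchesItems(items: OrderItem[], totalAmount: number): boolean {
  const computed = items.reduce((sum, item) => sum + item.quantity * item.price, 0);
  return Math.abs(computed - totalAmount) < 0.005;
}
```

Rules like this change for business reasons, not contract reasons, so regenerating code should never overwrite them.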
Industry Trends and Future
AsyncAPI is gaining traction fast. Companies like Netflix, Uber, and Spotify use it for their event-driven architectures. The AsyncAPI community is growing, and tooling is improving.
The future of API development is moving toward:
- Unified tooling: Tools that handle both REST and event APIs
- Better testing: Contract testing becoming standard practice
- API governance: Automated enforcement of API standards
- Real-time collaboration: Better tooling for teams working on contracts
Getting Started
If you want to try contract-first development, start small:
- Pick one API: Choose a simple REST endpoint
- Write the contract first: Define the OpenAPI spec before coding
- Generate the stub: Use code generation to create a starting point
- Test it: Make sure the contract works as expected
- Iterate: Refine the contract based on what you learn
Once you’re comfortable with REST APIs, try AsyncAPI for event-driven systems. The principles are the same, but the tooling is different.
The key is to start thinking about APIs as contracts, not just code. When you do this, you’ll build better systems that are easier to test, maintain, and integrate.
Contract-first development isn’t just a technical practice. It’s a way of thinking about how systems should communicate. When you get this right, everything else becomes easier.