Building microservices with Django and gRPC

Saheed Adedeji
12 min read · Mar 26, 2023


gRPC, as described in its official documentation, is a modern, open source, high performance Remote Procedure Call (RPC) framework that can run in any environment. See here for more on gRPC.

Microservices, on the other hand, is an architectural style that structures an application as a collection of services that are independently deployable, loosely coupled and organized around business capabilities. See here for more on microservices.

The goal of this tutorial is to build a project that consists of 2 applications sharing data using gRPC.
The applications live inside a single folder and are run from a docker compose file on different ports. The first application, dashboard, includes APIs to create, update and delete a product; the same actions are then mirrored on the product table of the product application.
The second application, product, includes an API to like a product; when it is called, the number of likes of that particular product increases by 1 on the dashboard application.
The main requirements used are:
- Django and django_rest_framework (the framework used to build the applications)
- https://github.com/profmcdan/django-grpc-framework (built on the django_grpc_framework library, with updates for the latest version of Django)
- grpcio (the Python library for gRPC)
- grpcio-tools (the tool for generating gRPC code)

The complete source code for the application can be found here.

version: "3.9"
services:
product: &product
build:
context: .
dockerfile: product/Dockerfile
command: python manage.py runserver 0.0.0.0:10007
volumes:
- ./product:/app
ports:
- "10007:10007"
env_file:
- ./.env
restart: always

dashboard: &dashboard
build:
context: .
dockerfile: dashboard/Dockerfile
command: python manage.py runserver 0.0.0.0:10008
volumes:
- ./dashboard:/app
ports:
- "10008:10008"
env_file:
- ./.env
restart: always

dashboard_grpc: &dashboard_grpc
build:
context: .
dockerfile: dashboard/Dockerfile
command: python manage.py grpcrunserver 0.0.0.0:10009
volumes:
- ./dashboard:/app
ports:
- "10009:10009"
env_file:
- ./.env
restart: always

product_grpc: &product_grpc
build:
context: .
dockerfile: dashboard/Dockerfile
command: python manage.py grpcrunserver 0.0.0.0:10006
volumes:
- ./product:/app
ports:
- "10006:10006"
env_file:
- ./.env
restart: always

As shown above in the docker-compose.yml file, there are 4 services: the first 2 are the applications (product and dashboard), while the last 2 start the gRPC server for the respective applications.

THE DASHBOARD APPLICATION

For this demo I created an app (which also contains the handlers.py and serializers.py), and outside the app folder I created the protos folder.

The proto files describe the services that can be called on an application, along with their input and output types. You can also think of services as functions that will be called from external applications.
The proto files are compiled with grpcio-tools to generate the gRPC code, which contains implementations that are later used in the code. It should be noted that the proto files are saved with the .proto extension.

There are 2 proto files in the protos folder.

syntax = "proto3";
package product;

service ProductService {
rpc LikeProduct(LikeProductRequest) returns (LikeProductResponse) {}
}

message Product {
string id = 1;
string title = 2;
string image = 3;
int64 likes = 4;
}

message LikeProductRequest {
string id = 1;
}

message LikeProductResponse {
Product product = 1;
}

As seen above in the product.proto file, there's a LikeProduct service that takes in a LikeProductRequest (which is only an id of type string) and returns a LikeProductResponse (which wraps a custom message type called Product).

syntax = "proto3";
package user;

service UserService {
rpc GetUser(GetUserRequest) returns (GetUserResponse) {}
}

message GetUserResponse {
string id = 1;
}

message GetUserRequest {
}

The user.proto file describes a GetUser service that takes in a GetUserRequest (which has no fields) and returns a GetUserResponse (which is an id of type string).

The proto files are then compiled using the command below:

python -m grpc_tools.protoc -I./protos --python_out=. --pyi_out=. --grpc_python_out=. ./protos/user.proto

The command generates 3 files named after the proto file: for user.proto they are user_pb2.py, user_pb2.pyi and user_pb2_grpc.py.
- -I./protos specifies the folder where the proto files are located
- --python_out=. specifies where the Python message file (user_pb2.py) should be created
- --pyi_out=. specifies where the type-stub file (user_pb2.pyi) should be created
- --grpc_python_out=. specifies where the gRPC service file (user_pb2_grpc.py) should be created
- ./protos/user.proto specifies the location of the proto file
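The same command is run for each proto file; for product.proto, for example, it would be:

python -m grpc_tools.protoc -I./protos --python_out=. --pyi_out=. --grpc_python_out=. ./protos/product.proto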

It should be noted that it doesn't matter where these 3 files are created; the main goal is that you must be able to import them in all your code. One way to achieve this is to copy and paste them across every application that needs to access a service from the dashboard application.
For this tutorial, all the generated gRPC code is stored inside a single folder, the folder is bundled as a Python package named micorservice_demo_with_grpc_shared_utils and then installed across all the applications (see the installation line in the entrypoint.sh file). The source code for this can be found here.
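The packaging details live in that repository, but as a rough sketch of the idea (the setup.py below is an assumption for illustration, not a copy of the actual file), the shared folder just needs a package definition that every service can pip install:

# setup.py of the shared package (hypothetical sketch)
from setuptools import setup, find_packages

setup(
    name='micorservice_demo_with_grpc_shared_utils',
    version='0.1.0',
    packages=find_packages(),
    install_requires=['grpcio', 'protobuf'],
)

Each application's entrypoint.sh then runs a pip install of this package (from a local path or a git URL) before starting its server.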

# from product.producer import publish
import grpc
from rest_framework import viewsets, status
from rest_framework.response import Response
from rest_framework.views import APIView
from django.conf import settings

from .utils import get_random_user
from .serializers import ProductSerializer
from .models import Product, User
# import product_receiver_pb2
# import product_receiver_pb2_grpc
from micorservice_demo_with_grpc_shared_utils import product_receiver_pb2, product_receiver_pb2_grpc

PRODUCT_GRPC_SERVER = settings.PRODUCT_GRPC_SERVER


class ProductViewSet(viewsets.ModelViewSet):
    queryset = Product.objects.all()
    serializer_class = ProductSerializer

    def list(self, request):
        products = Product.objects.all()
        serializer = ProductSerializer(products, many=True)
        return Response(serializer.data)

    def create(self, request):
        serializer = ProductSerializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        product = serializer.save()
        # publish('product_created', serializer.data)
        with grpc.insecure_channel(PRODUCT_GRPC_SERVER) as channel:
            stub = product_receiver_pb2_grpc.ProductReceiverServiceStub(channel)
            product_ser = product_receiver_pb2.ProductReceiver(product_id=str(product.id), title=product.title, image=product.image)
            res = stub.CreateProductFromReceiver(product_receiver_pb2.CreateProductFromReceiverRequest(product_receiver=product_ser))
            print(res, end='')
        return Response(serializer.data, status=status.HTTP_201_CREATED)

    def retrieve(self, request, pk):
        product = self.get_object()
        serializer = ProductSerializer(product)
        return Response(serializer.data)

    def update(self, request, pk):
        product = self.get_object()
        serializer = ProductSerializer(instance=product, data=request.data, partial=True)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        # publish('product_updated', serializer.data)
        with grpc.insecure_channel(PRODUCT_GRPC_SERVER) as channel:
            stub = product_receiver_pb2_grpc.ProductReceiverServiceStub(channel)
            product_data = {
                'product_id': str(product.id),
                'title': product.title,
                'image': product.image
            }
            product_ser = product_receiver_pb2.ProductReceiver(**product_data)
            res = stub.Update(product_receiver_pb2.UpdateByIdRequest(product_receiver=product_ser))
            print(res, end='')
        return Response(serializer.data, status=status.HTTP_202_ACCEPTED)

    def destroy(self, request, pk):
        product = self.get_object()
        # publish('product_deleted', pk)
        with grpc.insecure_channel(PRODUCT_GRPC_SERVER) as channel:
            stub = product_receiver_pb2_grpc.ProductReceiverServiceStub(channel)
            res = stub.Delete(product_receiver_pb2.DeleteByIdRequest(product_id=str(product.id)))
            print(res, end='')
        product.delete()
        return Response(status=status.HTTP_204_NO_CONTENT)


class UserAPIView(APIView):
    def get(self, _):
        user = get_random_user()
        return Response({
            'id': user.id
        }, status=status.HTTP_200_OK)

    def post(self, _):
        User.objects.create()
        return Response(status=status.HTTP_201_CREATED)

As shown in the app views.py above, the create, update and destroy actions each open a connection to the other application:

with grpc.insecure_channel(PRODUCT_GRPC_SERVER) as channel:
    stub = product_receiver_pb2_grpc.ProductReceiverServiceStub(channel)

The first 2 lines of these connections are the same: they open a channel to PRODUCT_GRPC_SERVER, which is set to 172.17.0.1:10006 in the .env file; 172.17.0.1 is the host's Docker bridge address and 10006 is the port specified for the product_grpc service in the docker compose file.
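Based on that, a minimal .env for this project would look something like the following (the DASHBOARD_GRPC_SERVER value is an assumption derived from the dashboard_grpc port in the docker compose file; it is used later by the product application):

PRODUCT_GRPC_SERVER=172.17.0.1:10006
DASHBOARD_GRPC_SERVER=172.17.0.1:10009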

product_ser = product_receiver_pb2.ProductReceiver(product_id=str(product.id), title=product.title, image=product.image)
res = stub.CreateProductFromReceiver(product_receiver_pb2.CreateProductFromReceiverRequest(product_receiver=product_ser))
print(res, end='')

For the create method, we call CreateProductFromReceiver on the product application, which accepts a request of type CreateProductFromReceiverRequest. CreateProductFromReceiverRequest wraps an object of type ProductReceiver, so the first line uses the newly created product's data to build the ProductReceiver object, which is then wrapped in a CreateProductFromReceiverRequest and passed to the service.

product_data = {
    'product_id': str(product.id),
    'title': product.title,
    'image': product.image
}
product_ser = product_receiver_pb2.ProductReceiver(**product_data)
res = stub.Update(product_receiver_pb2.UpdateByIdRequest(product_receiver=product_ser))
print(res, end='')

For the update method, we call the Update method on the product application; it accepts an UpdateByIdRequest, which wraps an object of type ProductReceiver built from the updated data of the product object.

res = stub.Delete(product_receiver_pb2.DeleteByIdRequest(product_id=str(product.id)))
print(res, end='')

For the delete method, we call the Delete method on the product application; it accepts a DeleteByIdRequest, which takes the product id as a string.

These implementations will become clearer when we see the proto file of the product application.

from django.db import models
from core.models import AuditableModel


class Product(AuditableModel):
    title = models.CharField(max_length=200)
    image = models.CharField(max_length=200, null=True)
    likes = models.PositiveBigIntegerField(default=0)


class User(AuditableModel):
    pass

The app models.py just includes a Product table that has title, image and likes fields, and a User that will only have id, created and updated_at fields (these are inherited from the AuditableModel).
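The AuditableModel lives in the core app and isn't shown in this article; a minimal sketch of what it is assumed to provide (a UUID primary key plus timestamps) is:

# core/models.py (hypothetical sketch of AuditableModel)
import uuid
from django.db import models


class AuditableModel(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True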

from .models import Product
from rest_framework import serializers
from django_grpc_framework import proto_serializers
# import product_pb2
from micorservice_demo_with_grpc_shared_utils import product_pb2


class ProductProtoSerializer(proto_serializers.ModelProtoSerializer):
    class Meta:
        model = Product
        proto_class = product_pb2.Product
        fields = ['id', 'title', 'image', 'likes']


class ProductSerializer(serializers.ModelSerializer):
    class Meta:
        model = Product
        fields = '__all__'

Above is the app serializers.py, which has 2 serializer classes: ProductProtoSerializer is for the gRPC service in the app services.py (described later) and ProductSerializer is for the ProductViewSet in the app views.py.

from django_grpc_framework import generics
from django.db.models import F
from .models import Product
from .utils import get_random_user
from .serializers import ProductProtoSerializer
# import user_pb2
# import product_pb2
from micorservice_demo_with_grpc_shared_utils import user_pb2, product_pb2


class ProductService(generics.ModelService):
    """
    gRPC services for products
    """
    queryset = Product.objects.all()
    serializer_class = ProductProtoSerializer
    look_up_field = 'id'

    def LikeProduct(self, request, context):
        product = self.get_object()
        # product.likes = F('likes') + 1  # This runs at db level and the object would have to be refreshed to see the update
        product.likes = product.likes + 1
        product.save()
        serialized = self.serializer_class(product).message
        return product_pb2.LikeProductResponse(product=serialized)


class UserService(generics.GenericService):

    def GetUser(self, request, context):
        user = get_random_user()
        return user_pb2.GetUserResponse(id=str(user.id))

As shown above, the app services.py houses the service implementations. If you remember, in the product.proto file we declared a service called LikeProduct, and in the user.proto file we declared another service called GetUser.
Another thing to note here is the similarity between rest_framework and django_grpc_framework: generic services can be seen as the django_grpc_framework equivalent of viewsets. Just as we have ModelViewSet housing the CRUD actions in rest_framework, ModelService handles the CRUD actions here, and similarly we can use mixins to pick only the actions we need. Just as ModelViewSet requires a queryset and serializer_class, the same applies to ModelService, and the nice part is that methods like get_queryset, get_serializer etc. are also valid for ModelService.
The main difference, and you may have noticed this already, is that in django_grpc_framework the method names start with an uppercase letter.
You can check the complete services documentation here.

# import product_pb2_grpc
# import user_pb2_grpc
from .services import ProductService, UserService
from micorservice_demo_with_grpc_shared_utils import product_pb2_grpc, user_pb2_grpc


def grpc_handlers(server):
    product_pb2_grpc.add_ProductServiceServicer_to_server(ProductService.as_servicer(), server)
    user_pb2_grpc.add_UserServiceServicer_to_server(UserService.as_servicer(), server)

As shown above in handlers.py, the services are registered on the server so that other applications can access them.
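Once the dashboard_grpc service is running (port 10009 in the docker compose file), any application that has the generated code installed can call these services. A quick manual check, assuming the server is reachable on localhost, could look like this:

# standalone sketch: calling the dashboard's GetUser service
import grpc
from micorservice_demo_with_grpc_shared_utils import user_pb2, user_pb2_grpc

with grpc.insecure_channel('localhost:10009') as channel:
    stub = user_pb2_grpc.UserServiceStub(channel)
    response = stub.GetUser(user_pb2.GetUserRequest())
    print(response.id)  # id of a random user from the dashboard database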

from django.contrib import admin
from django.urls import path, include
from drf_spectacular.views import SpectacularAPIView, SpectacularRedocView, SpectacularSwaggerView
from app.handlers import grpc_handlers as product_grpc_handlers

urlpatterns = [
    path('api/schema/', SpectacularAPIView.as_view(), name='schema'),
    path('api/v1/doc/', SpectacularSwaggerView.as_view(url_name='schema'), name='swagger-ui'),
    path('api/v1/redoc/', SpectacularRedocView.as_view(url_name='schema'), name='redoc'),
    path('admin/', admin.site.urls),
    path('api/', include('app.urls')),
]


def grpc_handlers(server):
    product_grpc_handlers(server)

The handlers are then imported into the project urls.py file and added as shown above.

Also in the project settings.py file, add django_grpc_framework to INSTALLED_APPS and add the following at the bottom of the file:

PRODUCT_GRPC_SERVER = os.getenv('PRODUCT_GRPC_SERVER')

GRPC_FRAMEWORK = {
    'ROOT_HANDLERS_HOOK': 'project.urls.grpc_handlers',
}

project.urls.grpc_handlers is the path to the grpc_handlers function added in the project urls.py, and PRODUCT_GRPC_SERVER simply holds the value of PRODUCT_GRPC_SERVER from the env file so it can be imported from settings rather than read from .env wherever it's needed.
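For completeness, the top of settings.py also needs an os import for the getenv call, and django_grpc_framework must be listed in INSTALLED_APPS; roughly (only the relevant entries are shown):

# project/settings.py (relevant additions only)
import os

INSTALLED_APPS = [
    # ... django, rest_framework and project apps ...
    'django_grpc_framework',
]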

This sums it up for the dashboard application.

THE PRODUCT APPLICATION

The product application maintains the same structure as the dashboard application, so I won't explain it as extensively here.

We only have one proto file in the protos folder here, called product_receiver.proto.

syntax = "proto3";
package product_receiver;
import "google/protobuf/empty.proto";

service ProductReceiverService {
rpc CreateProductFromReceiver(CreateProductFromReceiverRequest) returns (CreateProductFromReceiverResponse) {}
rpc Update(UpdateByIdRequest) returns (UpdateByIdResponse) {}
rpc Delete(DeleteByIdRequest) returns (google.protobuf.Empty) {}
}

message ProductReceiver {
string product_id = 1;
string title = 2;
string image = 3;
}

message CreateProductFromReceiverRequest {
ProductReceiver product_receiver = 1;
}

message CreateProductFromReceiverResponse {
ProductReceiver product_receiver = 1;
}

message UpdateByIdRequest {
ProductReceiver product_receiver = 1;
}

message UpdateByIdResponse {
ProductReceiver product_receiver = 1;
}

message DeleteByIdRequest {
string product_id = 1;
}

If you go back to the app views.py of the dashboard application, you will notice that in the create method we called CreateProductFromReceiver and passed in a ProductReceiver object; similarly, in the update method we called Update and passed a ProductReceiver object, while in the destroy method we called Delete and passed the product id as a string. The declaration of the CreateProductFromReceiver, Update and Delete services is shown above, along with their request and response structures.

from django.db import models
from core.models import AuditableModel


# Create your models here.
class Product(AuditableModel):
    product_id = models.UUIDField(null=True)
    title = models.CharField(max_length=200)
    image = models.CharField(max_length=200, null=True, blank=True)


class ProductUser(AuditableModel):
    user_id = models.UUIDField()
    product_id = models.UUIDField()

    class Meta:
        unique_together = ['user_id', 'product_id']

In the app models.py, we have 2 models. The first one stores the product after its creation on the dashboard application; notice the product_id field, which holds the value of the product's id on the dashboard application. The other table, ProductUser, stores the user id and product id when a user likes a product, and the unique_together constraint ensures that a user can only like a product once.

from rest_framework import serializers
from django_grpc_framework import proto_serializers
from .models import Product
# import product_receiver_pb2
from micorservice_demo_with_grpc_shared_utils import product_receiver_pb2


class ProductReceiverProtoSerializer(proto_serializers.ModelProtoSerializer):
    class Meta:
        model = Product
        proto_class = product_receiver_pb2.ProductReceiver
        fields = ['product_id', 'title', 'image']


class ProductSerializer(serializers.ModelSerializer):
    class Meta:
        model = Product
        fields = '__all__'

In the app serializers.py, we have ProductReceiverProtoSerializer, which is used in the generic service declared in the app services.py, and ProductSerializer, which is used in the app views.py viewset.

import grpc
from django.db.utils import IntegrityError
from rest_framework import viewsets, status
from rest_framework.views import APIView
from rest_framework.response import Response
# import requests
from .models import Product, ProductUser
# from .producer import publish
from .serializers import ProductSerializer
from django.conf import settings
# import user_pb2_grpc
# import user_pb2
# import product_pb2
# import product_pb2_grpc
from micorservice_demo_with_grpc_shared_utils import user_pb2_grpc, user_pb2, product_pb2, product_pb2_grpc

DASHBOARD_GRPC_SERVER = settings.DASHBOARD_GRPC_SERVER


class ProductViewSet(viewsets.ViewSet):
    def list(self, request):
        products = Product.objects.all()
        serializer = ProductSerializer(products, many=True)
        return Response(serializer.data)


class LikeView(APIView):
    def get(self, _, pk):
        # req = requests.get('http://127.0.0.1:8000/api/user')
        # response = req.json()
        with grpc.insecure_channel(DASHBOARD_GRPC_SERVER) as channel:
            stub = user_pb2_grpc.UserServiceStub(channel)
            response = stub.GetUser(user_pb2.GetUserRequest())
            print(response, end='')
        try:
            product = Product.objects.get(pk=pk)
            product_user = ProductUser.objects.create(user_id=response.id, product_id=pk)
            # publish('product_liked', product.product_id)
            with grpc.insecure_channel(DASHBOARD_GRPC_SERVER) as channel:
                stub = product_pb2_grpc.ProductServiceStub(channel)
                response = stub.LikeProduct(product_pb2.LikeProductRequest(id=str(product.product_id)))
                print(response, end='')
            return Response({
                'message': 'Success'
            })
        except IntegrityError:
            return Response({
                'message': 'You already liked this product'
            }, status=status.HTTP_400_BAD_REQUEST)
        # return Response(req.json())

The app views.py file has a view to list products and another to like a product.

from django.urls import path
from .views import ProductViewSet, LikeView

urlpatterns = [
    path('products', ProductViewSet.as_view({
        'get': 'list',
        # 'post': 'create'
    })),
    path('products/<uuid:pk>/like', LikeView.as_view())
]

And the app urls.py is set up as expected.
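Assuming the product application's project urls.py mounts these routes under an api/ prefix like the dashboard does, liking a product from outside is then a plain HTTP call against the product service's port, for example:

curl http://localhost:10007/api/products/<product-uuid>/like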

from django_grpc_framework import generics, mixins
from .models import Product
from .serializers import ProductReceiverProtoSerializer
# import product_receiver_pb2
from micorservice_demo_with_grpc_shared_utils import product_receiver_pb2


# class ProductService(generics.ModelService):
class ProductService(mixins.CreateModelMixin,
                     mixins.PartialUpdateModelMixin,
                     mixins.DestroyModelMixin,
                     generics.GenericService):
    """
    gRPC services for products
    """
    queryset = Product.objects.all()
    serializer_class = ProductReceiverProtoSerializer

    def get_object(self):
        "Using this because the lookup field can only be accessed via request.product_receiver.product_id for the Update method"
        request = self.request
        if hasattr(request, 'product_receiver'):
            product_id = request.product_receiver.product_id
        else:
            product_id = request.product_id
        return Product.objects.get(product_id=product_id)

    def CreateProductFromReceiver(self, request, context):
        product = self.Create(request.product_receiver, context)
        # serializer = product_receiver_pb2.ProductReceiver(product_id=str(product.product_id), title=product.title, image=product.image)
        serializer = self.serializer_class(product).message
        return product_receiver_pb2.CreateProductFromReceiverResponse(product_receiver=serializer)

    def Update(self, request, context):
        product = self.PartialUpdate(request.product_receiver, context)
        serializer = self.serializer_class(product).message
        return product_receiver_pb2.UpdateByIdResponse(product_receiver=serializer)

    def Delete(self, request, context):
        return self.Destroy(request, context)

In the app services.py here, we write the functionality tied to each service. As you can see, I'm using the CreateModelMixin, PartialUpdateModelMixin and DestroyModelMixin from the django_grpc_framework mixins because I'm calling the Create, PartialUpdate and Destroy methods in the respective services.
Another thing to notice is that I had to override the get_object method, and this is because of the Update service: get_object uses the lookup_field (which is by default the pk) to fetch the object, but since the object is not originally created here, we would have to set the lookup_field to product_id, which is the id of the product from the dashboard application's product table. The Update method, however, receives a ProductReceiver object rather than an id (which is what DeleteByIdRequest uses), hence I had to override get_object to pick the id from the ProductReceiver object.
Another thing to note is that I could have done without declaring the Delete method if I had just rewritten this line in the product_receiver.proto file:

rpc Delete(DeleteByIdRequest) returns (google.protobuf.Empty) {}

as

rpc Destroy(DeleteByIdRequest) returns (google.protobuf.Empty) {}

because the Destroy method is already provided by the DestroyModelMixin.

This sums it up for the product application, and once the project is up, you can see all the servers running.
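To try it yourself, the whole stack comes up from the project root with a single compose command:

docker compose up --build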

In conclusion, at first glance using django_grpc_framework may seem a bit more complex than using RabbitMQ for building microservices, but after some time you will notice it simply follows a pattern. First, you create your proto file stating the service name and its request and response fields; then you compile this proto file to generate the gRPC code, which you share across all the applications that need to call this service. Finally, in your services.py file, you create the service implementation based on the request and response fields you specified in the proto.
There is a slight difference in how the services are connected in a RabbitMQ implementation compared to django_grpc_framework. With RabbitMQ you publish your message to the other application through the broker (RabbitMQ), and you generally don't care whether there's a function to handle the message, since you're not expecting a response; with gRPC, the services must already be declared on the application before you can use them from external applications, and they can return a response, so you can call a service synchronously or asynchronously.
I hope this helps simplify building microservices with Django and gRPC to some extent. And again, you can find the full source code for the project here and the one for the shared package that houses the gRPC code here.
