ShareChat’s Philosophy of Utilising Cloud-Based Systems

The Indian government’s ban on TikTok in 2020 may have upset millions of users, but it was good news for the app’s rivals. Homegrown short-video platforms have rushed to capture that user base, and ShareChat has built its own cluster of social platforms in the process.

Analytics India Magazine caught up with Gaurav Bhatia, Senior Vice President of Engineering at ShareChat and Moj, to learn more about how ShareChat built its cloud technology stack.

AIM: How did ShareChat build its cloud-first technology stack?


Gaurav: ShareChat (Mohalla Tech Pvt Ltd) is India’s largest domestic social media company, with over 400 million monthly active users across two platforms (ShareChat and Moj). We receive over 280 billion views per day, and over 165 million pieces of content are uploaded daily across the two platforms combined.

To manage and scale our services for hundreds of millions of users every month, we built our technology stack in a cloud-first manner. We run on Google Cloud Platform (GCP) and follow a microservices architecture, in which business functions are broken down into smaller components that are developed, deployed, and scaled independently. We build and run these microservices on Kubernetes (k8s) clusters hosted on Google Cloud. Relying on a platform-as-a-service model lets us own our core functionality while providing best-in-class infrastructure for the teams that develop product features.
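As an illustration, a minimal sketch of the shape one such service might take: a small Go HTTP server with a health endpoint that Kubernetes can probe, packaged into a container and scaled independently. The endpoints, port handling, and response payload here are assumptions made for the example, not ShareChat’s actual code.

```go
// Minimal sketch of a single microservice: one business endpoint plus a
// health endpoint for Kubernetes liveness/readiness probes.
package main

import (
	"log"
	"net/http"
	"os"
)

func main() {
	mux := http.NewServeMux()

	// Probe endpoint used by Kubernetes to decide whether the pod is healthy.
	mux.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	// The service's single business responsibility, e.g. serving a feed page
	// (hypothetical endpoint for this sketch).
	mux.HandleFunc("/v1/feed", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		w.Write([]byte(`{"videos": []}`))
	})

	// Port is injected by the deployment environment; default for local runs.
	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}
	log.Printf("listening on :%s", port)
	log.Fatal(http.ListenAndServe(":"+port, mux))
}
```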

AIM: Can you provide more details on how you followed the microservices architecture with Google Cloud?


Gaurav: All the services that support our applications can be broken down into small components. For example, when you open ShareChat or Moj, our Android app contacts our video feed service, which determines the most personalized list of videos to show the user. The video feed service in turn relies on a number of internal services to provide user details as well as a list of candidate videos and their metadata.
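A hedged sketch of that fan-out pattern: the feed service calls its internal dependencies concurrently and then assembles the result. The service names, types, and stub implementations below are illustrative assumptions, not ShareChat’s real APIs.

```go
// Sketch of a feed service fanning out to internal services in parallel.
package main

import (
	"context"
	"fmt"

	"golang.org/x/sync/errgroup"
)

type UserProfile struct{ ID, Language string }
type Video struct{ ID string }

// Stubs standing in for calls to internal services (hypothetical).
func fetchUserProfile(ctx context.Context, userID string) (UserProfile, error) {
	return UserProfile{ID: userID, Language: "hi"}, nil
}

func fetchCandidateVideos(ctx context.Context, userID string) ([]Video, error) {
	return []Video{{ID: "v1"}, {ID: "v2"}}, nil
}

func buildFeed(ctx context.Context, userID string) ([]Video, error) {
	var profile UserProfile
	var candidates []Video

	// Call the user service and the candidate/metadata service concurrently.
	g, ctx := errgroup.WithContext(ctx)
	g.Go(func() error {
		p, err := fetchUserProfile(ctx, userID)
		profile = p
		return err
	})
	g.Go(func() error {
		c, err := fetchCandidateVideos(ctx, userID)
		candidates = c
		return err
	})
	if err := g.Wait(); err != nil {
		return nil, err
	}

	// Personalised ranking would use the profile here; omitted in this sketch.
	_ = profile
	return candidates, nil
}

func main() {
	videos, err := buildFeed(context.Background(), "user-123")
	fmt.Println(videos, err)
}
```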

When a user interacts with content, whether a successful watch, a like, or a skip, the event is recorded and sent to the event service, which can then use these signals to better personalize the user’s video feed. Likewise, when a user uploads a video, it goes through several small services, including the upload service, content moderation, and the encoding pipeline, among others. Each runs in its own Docker container on k8s and can scale up or down based on need and capacity.
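As a sketch of how such an engagement event might be recorded, the example below publishes a JSON payload to Google Cloud Pub/Sub, which the interview names later as part of the stack. The project ID, topic name, and event schema are assumptions made for the example.

```go
// Sketch of publishing an engagement event to a Pub/Sub topic.
package main

import (
	"context"
	"encoding/json"
	"log"

	"cloud.google.com/go/pubsub"
)

// WatchEvent is a hypothetical event schema for this sketch.
type WatchEvent struct {
	UserID  string `json:"user_id"`
	VideoID string `json:"video_id"`
	Action  string `json:"action"` // e.g. "watch", "like", "skip"
}

func publishEvent(ctx context.Context, topic *pubsub.Topic, ev WatchEvent) error {
	payload, err := json.Marshal(ev)
	if err != nil {
		return err
	}
	// Publish is asynchronous; Get blocks until the server acknowledges it.
	res := topic.Publish(ctx, &pubsub.Message{Data: payload})
	_, err = res.Get(ctx)
	return err
}

func main() {
	ctx := context.Background()
	client, err := pubsub.NewClient(ctx, "example-project") // hypothetical project ID
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	topic := client.Topic("user-engagement-events") // hypothetical topic name
	defer topic.Stop()

	ev := WatchEvent{UserID: "u1", VideoID: "v1", Action: "like"}
	if err := publishEvent(ctx, topic, ev); err != nil {
		log.Fatal(err)
	}
}
```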

As new services are created or existing services are upgraded, they can be deployed to a small percentage of users first (as a canary deployment) so that unexpected bugs or scenarios can be caught and quickly fixed with minimal impact. Services running on Kubernetes (k8s) clusters depend on Google PaaS services such as Pub/Sub, Dataflow, Spanner, and Bigtable. We use BigQuery for data analysis.
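On the analysis side, a query like the hedged sketch below could be run against BigQuery from Go; the dataset, table, and column names are hypothetical, chosen only to make the example concrete.

```go
// Sketch of a daily engagement breakdown queried from BigQuery.
package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "example-project") // hypothetical project ID
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Hypothetical dataset/table holding the events emitted by the event service.
	q := client.Query(`
		SELECT action, COUNT(*) AS events
		FROM analytics.user_engagement_events
		WHERE DATE(event_time) = CURRENT_DATE()
		GROUP BY action
		ORDER BY events DESC`)

	it, err := q.Read(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for {
		var row struct {
			Action string
			Events int64
		}
		err := it.Next(&row)
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s: %d\n", row.Action, row.Events)
	}
}
```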

AIM: How does ShareChat provide a best-in-class infrastructure for teams developing product features?


Gaurav: At ShareChat, we have an internal platform engineering team that does a variety of things, including creating abstractions on top of Google services, such as our database, queue, and cache drivers. Services are built and delivered in a CI/CD model, and we have also developed internal tools like Atlas that allow developers to sign up and leverage the power of Google Cloud with a few clicks, while making sure we have enough guardrails in place to avoid unintentional mistakes. We have invested heavily in test automation and have a device lab with many hosted physical devices that can be used remotely to run automated tests for our Android and iOS applications.
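To illustrate what such a driver abstraction could look like, here is a minimal Go sketch of a cache interface with a trivial in-memory implementation behind it. The interface, names, and semantics are assumptions for the example, not ShareChat’s internal API.

```go
// Sketch of a platform-provided cache driver: product services code against
// the Cache interface, while the concrete backend can be swapped behind it.
package cacheplatform

import (
	"context"
	"errors"
	"sync"
	"time"
)

var ErrCacheMiss = errors.New("cache: key not found")

// Cache is the driver interface exposed to product teams (hypothetical).
type Cache interface {
	Get(ctx context.Context, key string) ([]byte, error)
	Set(ctx context.Context, key string, value []byte, ttl time.Duration) error
	Delete(ctx context.Context, key string) error
}

// memoryCache is a trivial in-process implementation, useful for tests; a
// production driver would wrap a managed service instead.
type memoryCache struct {
	mu   sync.RWMutex
	data map[string][]byte
}

func NewMemoryCache() Cache {
	return &memoryCache{data: make(map[string][]byte)}
}

func (c *memoryCache) Get(ctx context.Context, key string) ([]byte, error) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	v, ok := c.data[key]
	if !ok {
		return nil, ErrCacheMiss
	}
	return v, nil
}

func (c *memoryCache) Set(ctx context.Context, key string, value []byte, ttl time.Duration) error {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = value // TTL handling omitted in this sketch
	return nil
}

func (c *memoryCache) Delete(ctx context.Context, key string) error {
	c.mu.Lock()
	defer c.mu.Unlock()
	delete(c.data, key)
	return nil
}
```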

AIM: Has moving your services to GoLang been beneficial for ShareChat? What services do you want to adopt in the future?


Gaurav: Switching from Node.js to GoLang has brought us incredible savings; we have reduced our infrastructure by 90% for many services. This not only optimizes server costs but also makes monitoring and maintenance easier through rapid scaling up and down. We are currently rewriting our entire product stack in GoLang. We are always evaluating new services from Google.

We are also excited about the security, observability, and disaster recovery capabilities that Google Cloud has been adding. On Kubernetes, we are enthusiastic about eBPF and are actively exploring service meshes. We have also been impressed by the initial results we have seen after bringing in ScyllaDB, which has given us excellent low latency and high throughput.
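For reference, ScyllaDB speaks the Cassandra protocol, so a Go service could read from it with a driver such as gocql (ScyllaDB also maintains its own fork of that driver). The hosts, keyspace, and table schema in the sketch below are hypothetical.

```go
// Sketch of a low-latency point lookup against ScyllaDB via gocql.
package main

import (
	"fmt"
	"log"

	"github.com/gocql/gocql"
)

func main() {
	cluster := gocql.NewCluster("10.0.0.1", "10.0.0.2") // hypothetical node addresses
	cluster.Keyspace = "feed"                           // hypothetical keyspace
	cluster.Consistency = gocql.LocalQuorum

	session, err := cluster.CreateSession()
	if err != nil {
		log.Fatal(err)
	}
	defer session.Close()

	// Point lookup on a counters table (hypothetical schema).
	var likeCount int64
	if err := session.Query(
		`SELECT like_count FROM video_counters WHERE video_id = ?`, "v1",
	).Scan(&likeCount); err != nil {
		log.Fatal(err)
	}
	fmt.Println("likes:", likeCount)
}
```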

AIM: What technologies are used behind ShareChat’s recommendation system? How does it stand out from its competitors?


Gaurav: At ShareChat, we have many different types of content, including short videos, long videos, images, GIFs, microblog posts, and news content. We have invested heavily in building the ML infrastructure to enable rapid in-session personalization across all content surfaces. A shared feature store that can be used for quick experimentation on any content surface, each with its own requirements, is a very powerful lever for personalization. It has also helped us to build our ranker as a service that can address different contextual needs. For example, during the festive season users look for content that is more suitable for sharing, whereas on other days they may prefer devotional content for personal consumption.
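To make the “ranker as a service” idea concrete, below is a hedged Go sketch in which candidates are scored with features pulled from a shared feature store, and the scoring policy can be swapped per context (for instance, a festive-season policy that boosts shareable content). All interfaces, feature names, and weights are illustrative assumptions, not ShareChat’s actual system.

```go
// Sketch of a context-aware ranker backed by a shared feature store.
package main

import (
	"context"
	"fmt"
	"sort"
)

type Candidate struct {
	VideoID string
	Score   float64
}

// FeatureStore abstracts the shared feature store used across content surfaces.
type FeatureStore interface {
	Features(ctx context.Context, userID, videoID string) (map[string]float64, error)
}

// Ranker is the per-context scoring policy that can be swapped out.
type Ranker interface {
	Score(features map[string]float64) float64
}

// festiveRanker weights shareability higher during the festive season (illustrative).
type festiveRanker struct{}

func (festiveRanker) Score(f map[string]float64) float64 {
	return 0.6*f["p_share"] + 0.4*f["p_watch"]
}

func rank(ctx context.Context, fs FeatureStore, r Ranker, userID string, videoIDs []string) ([]Candidate, error) {
	out := make([]Candidate, 0, len(videoIDs))
	for _, id := range videoIDs {
		feats, err := fs.Features(ctx, userID, id)
		if err != nil {
			return nil, err
		}
		out = append(out, Candidate{VideoID: id, Score: r.Score(feats)})
	}
	sort.Slice(out, func(i, j int) bool { return out[i].Score > out[j].Score })
	return out, nil
}

// fakeStore is an in-memory stand-in so the sketch runs end to end.
type fakeStore struct{}

func (fakeStore) Features(ctx context.Context, userID, videoID string) (map[string]float64, error) {
	return map[string]float64{"p_share": 0.3, "p_watch": 0.7}, nil
}

func main() {
	ranked, err := rank(context.Background(), fakeStore{}, festiveRanker{}, "u1", []string{"v1", "v2"})
	fmt.Println(ranked, err)
}
```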
