High-performance, cloud-based voting platform
A high-performance, cloud-based voting platform built for Channel 4’s interactive talent show The Singer Takes It All, handling up to 300,000 concurrent users securely on AWS.
The Singer Takes It All was a groundbreaking Channel 4 talent show that combined live television with interactive audience participation via a mobile app. Contestants uploaded karaoke-style video performances, and viewers voted them “Hit” or “Miss” in real time through the app, ultimately deciding which hopeful singers got a chance to appear on the live broadcast.
Up to 300,000 concurrent users at peak times
This real-time voting platform needed to scale to an audience of up to 300,000 concurrent users at peak times, demanding careful engineering for performance and reliability. This case study explains how we built a secure, high-performance backend on AWS to power the show’s massive voting system, and the steps taken to ensure security, fairness, and scalability under intense load.
9,500 video performances uploaded, 21 million votes cast
Project Background & Requirements
The show’s premise put real stakes on the line: viewer votes determined who advanced and potentially won prizes on live TV. Consequently, the online system was mission-critical and under heavy scrutiny.
My role was to lead backend development, focusing on the Web API and cloud infrastructure that supported the mobile voting apps.
Key responsibilities:
- Secure Voting API: Develop a secure Web API for the mobile apps to submit votes for contestants (nicknamed “hopefuls”), even under ~300,000 concurrent users at peak. The system had to prevent fraudulent voting and ensure each user’s votes were counted fairly.
- Performance & Load Testing: Stress-test the API endpoints to ensure they could handle sudden traffic spikes during live broadcasts. This involved simulating heavy load and optimising code and SQL queries for speed.
- Developer Experience: Provide clear API documentation for mobile developers. With multiple teams consuming the API, comprehensive and easy-to-use documentation was essential.
- Contestant Tracking Admin: Build an internal web system for production staff to track contestants’ progress. This admin interface displayed voting results, leaderboards, and content submissions in real time.
- Content Moderation Tools: Implement a moderation workflow to review user-submitted videos, filtering out inappropriate content before it appeared publicly.
These requirements meant balancing development speed with performance and scalability. We had to deliver rapidly for the TV launch, yet ensure stability under unprecedented load.
Architecture and Tech Stack
To meet the scalability demands, we built the solution on Amazon Web Services (AWS), allowing us to scale resources on demand ahead of live broadcasts and absorb sharp traffic spikes.
Key elements of the tech stack:
- Backend Framework: The backend was built with ASP.NET Web API (C#), providing a robust framework for RESTful endpoints. This allowed rapid development of secure routes and business logic for voting, user management, and content management (a minimal endpoint sketch appears below).
- Database: A MySQL relational database stored user data, votes, and contest content. The schema was tuned for read/write performance and structured for rapid vote tallying. SQL queries were carefully optimised: at this scale, even small inefficiencies could become bottlenecks.
- Cloud Hosting & Scalability: Hosted on AWS EC2 instances behind an Elastic Load Balancer (ELB), with AWS API Gateway handling routing. This setup allowed horizontal scaling, adding more web servers during peak periods.
- Content Delivery & Caching: Static data (such as leaderboards and top contestants) was pre-exported as JSON and cached on Amazon S3, reducing database load and improving response times for users.
By using AWS services and a scalable architecture, we ensured the system could dynamically handle extreme load while remaining responsive.
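To make the API shape concrete, here is a minimal sketch of what a vote endpoint on this stack might have looked like, written against classic ASP.NET Web API 2. It is illustrative only: the controller, routes, and repository interface (VotesController, IVoteRepository, VoteRequest) are assumptions, not the production code.

```csharp
using System.Threading.Tasks;
using System.Web.Http;

// Hypothetical sketch only: names are illustrative, not the production code.
[Authorize] // only authenticated users (OAuth2 bearer tokens) may vote
[RoutePrefix("api/hopefuls")]
public class VotesController : ApiController
{
    private readonly IVoteRepository _votes;

    public VotesController(IVoteRepository votes)
    {
        _votes = votes;
    }

    // POST api/hopefuls/{hopefulId}/votes   body: { "verdict": "hit" | "miss" }
    [HttpPost]
    [Route("{hopefulId:int}/votes")]
    public async Task<IHttpActionResult> PostVote(int hopefulId, VoteRequest request)
    {
        if (request == null || (request.Verdict != "hit" && request.Verdict != "miss"))
            return BadRequest("Verdict must be 'hit' or 'miss'.");

        var userId = User.Identity.Name; // resolved from the bearer token

        // The repository enforces one vote per user per Hopeful at the database level.
        var accepted = await _votes.RecordVoteAsync(userId, hopefulId, request.Verdict);

        return accepted ? (IHttpActionResult)Ok() : Conflict();
    }
}

public class VoteRequest
{
    public string Verdict { get; set; }
}

public interface IVoteRepository
{
    Task<bool> RecordVoteAsync(string userId, int hopefulId, string verdict);
}
```

Keeping the endpoint async and pushing the one-vote-per-user rule into the data layer keeps the controller itself stateless, which matters once hundreds of thousands of requests arrive in parallel.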
Ensuring Secure and Fair Voting
Security and fairness were paramount for a system influencing televised results.
Key safeguards:
- User Authentication & Anti-Fraud: Voting required authenticated users. We integrated OAuth2 login (Facebook/Twitter), issuing short-lived bearer tokens for each session. All API calls used HTTPS, and tokens could be revoked if abuse was detected. This model guarded against spam and duplicate voting and protected the integrity of the results.
- Fair Vote Distribution: To avoid bias, contestant videos were delivered to users in random order. Each contestant entry was assigned a random number, and SQL queries used it to fetch random contestants for viewing. These random values were refreshed frequently, giving every contestant an even share of exposure and votes (a sketch of this approach appears below).
These measures maintained one-vote-per-user integrity and fairness in content exposure, reinforcing credibility for both producers and the public.
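The sketch below illustrates the random-ordering approach: each approved Hopeful carries a precomputed random rank that is refreshed periodically, so the feed can be paged cheaply without running ORDER BY RAND() on every request. Table and column names (hopefuls, random_rank) are assumptions, not the production schema.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using MySql.Data.MySqlClient;

// Illustrative sketch of random-but-stable ordering for the video feed.
public static class HopefulFeed
{
    public static async Task<List<int>> GetRandomisedPageAsync(
        string connectionString, int offset, int pageSize)
    {
        const string sql = @"
            SELECT id
            FROM   hopefuls
            WHERE  status = 'approved'
            ORDER BY random_rank
            LIMIT  @offset, @pageSize;";

        var ids = new List<int>();
        using (var conn = new MySqlConnection(connectionString))
        using (var cmd = new MySqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@offset", offset);
            cmd.Parameters.AddWithValue("@pageSize", pageSize);
            await conn.OpenAsync();
            using (var reader = await cmd.ExecuteReaderAsync())
            {
                while (await reader.ReadAsync())
                    ids.Add(reader.GetInt32(0));
            }
        }
        return ids;
    }

    // Refreshing the ranks (e.g. on a timer) re-shuffles exposure across users.
    public static async Task RefreshRandomRanksAsync(string connectionString)
    {
        using (var conn = new MySqlConnection(connectionString))
        using (var cmd = new MySqlCommand(
            "UPDATE hopefuls SET random_rank = RAND();", conn))
        {
            await conn.OpenAsync();
            await cmd.ExecuteNonQueryAsync();
        }
    }
}
```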
Performance and Scalability Under Load
The defining challenge was scalability. The system had to handle huge surges in traffic during live broadcasts without degradation.
Approaches taken:
- High-Concurrency Optimisations: To handle peak usage of ~300,000 concurrent users, we fine-tuned database indices, optimised queries, and used asynchronous C# patterns to enable parallel request handling. Critical operations were handled by stored procedures to maximise MySQL performance.
- Elastic Scaling & Caching: Before each live episode, EC2 capacity was scaled up pre-emptively. Cached JSON leaderboards on S3 ensured that heavy read traffic didn’t overload the database (a sketch of the export step appears after this list).
- Stress Testing & Tuning: I developed a custom load-testing harness (a substantial project in its own right) that simulated thousands of concurrent users performing realistic actions (login, watch, vote). This revealed bottlenecks early and allowed targeted optimisations (a simplified sketch appears after this list). During live shows, the backend ran flawlessly under load, confirming our preparations.
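As an example of the export step, the sketch below pushes a serialised leaderboard to S3 using the AWS SDK for .NET, so the apps fetch a static JSON file rather than querying the database. The bucket name, object key, and class names are placeholders, not the production configuration.

```csharp
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

// Illustrative leaderboard export job: write the current standings to S3
// as a public JSON object that the mobile apps can read directly.
public class LeaderboardExporter
{
    private readonly IAmazonS3 _s3;

    public LeaderboardExporter(IAmazonS3 s3)
    {
        _s3 = s3;
    }

    public async Task ExportAsync(string leaderboardJson)
    {
        var request = new PutObjectRequest
        {
            BucketName = "example-leaderboards",   // placeholder bucket name
            Key = "leaderboard/latest.json",
            ContentBody = leaderboardJson,
            ContentType = "application/json",
            CannedACL = S3CannedACL.PublicRead     // readable by the mobile apps
        };

        await _s3.PutObjectAsync(request);
    }
}
```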
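The harness itself deserves its own write-up, but the heavily simplified sketch below conveys the core idea: many virtual users running a realistic login, browse, and vote loop in parallel against the API. Endpoints, payloads, and pacing here are illustrative; the real harness was considerably more sophisticated.

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

// Simplified load-test sketch: N virtual users, each performing a
// login -> browse -> vote loop with a short think time between actions.
public static class LoadTest
{
    private static readonly HttpClient Client = new HttpClient
    {
        BaseAddress = new Uri("https://api.example.com/") // placeholder host
    };

    public static async Task RunAsync(int virtualUsers, int votesPerUser)
    {
        var users = Enumerable.Range(0, virtualUsers)
                              .Select(i => SimulateUserAsync(i, votesPerUser));
        await Task.WhenAll(users);
    }

    private static async Task SimulateUserAsync(int userIndex, int votesPerUser)
    {
        // 1. Authenticate (token handling omitted for brevity).
        await Client.PostAsync("auth/token", new StringContent($"user{userIndex}"));

        for (var i = 0; i < votesPerUser; i++)
        {
            // 2. Fetch a page of hopefuls, as the app would when browsing the feed.
            await Client.GetAsync("api/hopefuls?page=1");

            // 3. Cast a vote, then pause briefly to mimic real viewing behaviour.
            await Client.PostAsync($"api/hopefuls/{i + 1}/votes",
                new StringContent("{\"verdict\":\"hit\"}"));
            await Task.Delay(TimeSpan.FromMilliseconds(250));
        }
    }
}
```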
Developer Experience and Collaboration
The mobile applications were built by another team, so clear API documentation and collaboration were essential.
We used Swagger (OpenAPI) to generate interactive documentation, allowing developers to explore and test endpoints directly. This reduced integration friction and supported parallel development.
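For reference, a typical Swashbuckle-style registration for classic ASP.NET Web API looks roughly like the snippet below; the exact configuration used on the project may have differed.

```csharp
using System.Web.Http;
using Swashbuckle.Application;

// Illustrative Swagger/Swashbuckle setup: publishes the API spec and an
// interactive UI that mobile developers can use to explore endpoints.
public static class SwaggerConfig
{
    public static void Register()
    {
        GlobalConfiguration.Configuration
            .EnableSwagger(c => c.SingleApiVersion("v1", "Voting API"))
            .EnableSwaggerUi();
    }
}
```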
Results & Insights
The Singer Takes It All was a major success. The platform handled massive engagement with no downtime or slowdowns. Across the series, viewers uploaded around 9,500 videos and cast over 21 million votes through the system.
The project earned industry recognition, winning “Best Multiplatform Project” and “Best App” at the Broadcast Digital Awards, and receiving a BAFTA nomination for Digital Creativity. These accolades reflected both technical excellence and innovation.
From a personal standpoint, this project was a masterclass in high-performance systems, real-time architecture, and cloud scaling. It sharpened my approach to concurrency, SQL optimisation, and load testing, proving that with the right architecture and preparation, even extreme workloads can be handled gracefully.


