## The Promise vs The Reality
Serverless computing promised to eliminate infrastructure management, reduce costs, and let developers focus purely on code. It delivered on some of those promises — but not all of them, and not for every workload.
## How Serverless Actually Works
In serverless architectures, your code runs in stateless, event-driven functions managed entirely by the cloud provider. You don't provision servers, manage scaling, or pay for idle capacity.
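As a concrete illustration, here is a minimal sketch of such a function in Python, following AWS Lambda's handler convention; the event shape assumes an API Gateway proxy integration, and the handler name is arbitrary:

```python
import json

def handler(event, context):
    """Entry point the platform invokes once per event.

    `event` carries the trigger payload (here, an API Gateway proxy
    request); `context` carries runtime metadata. There is no server
    to manage — the provider creates an execution environment on demand.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The same function-per-event shape applies across providers, though the exact handler signature and event format vary by platform.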
Major serverless platforms:
- AWS Lambda
- Google Cloud Functions
- Azure Functions
- Vercel / Netlify (for frontend)
- Cloudflare Workers (edge computing)
## Where Serverless Excels
### Event-Driven Workloads
Processing webhooks, handling file uploads, responding to database changes, and managing queue messages. These are inherently bursty — serverless handles the scaling automatically.
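As one sketch of the queue-message case — the event shape below follows the AWS SQS trigger convention (a batch of records with JSON string bodies), and `process_order` is a hypothetical stand-in for real business logic:

```python
import json

def process_order(order: dict) -> None:
    # Hypothetical stand-in for real work, e.g. a database write.
    print(f"processing order {order['id']}")

def handler(event, context):
    """Consume a batch of queue messages in a single invocation.

    The platform polls the queue and scales the number of concurrent
    invocations with queue depth — the bursty scaling is automatic.
    """
    records = event.get("Records", [])
    for record in records:
        process_order(json.loads(record["body"]))
    return {"processed": len(records)}
```

Webhook and file-upload triggers follow the same pattern with different event payloads.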
### API Backends with Variable Traffic
If your API handles 10 requests per minute most of the time but spikes to 10,000 per minute during peaks, serverless is ideal: you pay for actual usage, not provisioned capacity.
### Scheduled Tasks and Cron Jobs
Running periodic cleanup, report generation, or data synchronization. No need to keep a server running 24/7 for tasks that run once an hour.
### Prototype and MVP Development
When you need to ship fast and validate an idea before committing to an infrastructure design. Serverless removes most of the operational overhead.
## Where Serverless Struggles
### Long-Running Processes
Most serverless functions have execution time limits (15 minutes on AWS Lambda). Video processing, large data migrations, and ML model training need different solutions.
### Consistent, High-Throughput Workloads
If your workload is steady and predictable, a traditional server or container is usually cheaper. Serverless pricing advantages come from variable usage.
### Stateful Applications
Serverless functions are stateless by design. Applications that need persistent connections (WebSockets, game servers) require additional infrastructure, such as a dedicated connection service.
### Cold Start Sensitivity
Functions that haven't been invoked recently take longer to start. For latency-sensitive applications, cold starts of 100ms–3s can be unacceptable.
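The standard mitigation is to pay initialization costs once per execution environment rather than once per request. A minimal sketch, where `_load_config` is a hypothetical stand-in for loading SDK clients, ML models, or secrets:

```python
import time

def _load_config() -> dict:
    # Stand-in for expensive setup: SDK clients, model weights, secrets.
    time.sleep(0.1)
    return {"initialized_at": time.time()}

# Module-scope work runs once per execution environment — during the
# cold start — and is reused by every subsequent "warm" invocation.
CONFIG = _load_config()

def handler(event, context):
    # Warm invocations skip initialization and go straight to request logic.
    return {"cold_start_epoch": CONFIG["initialized_at"]}
```

Providers also offer paid options (e.g. provisioned/pre-warmed capacity) when this pattern alone isn't enough for latency-sensitive paths.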
## Cost Reality Check
| Scenario | Serverless Cost | Traditional Server Cost |
|---|---|---|
| Low traffic API (1,000 req/day) | ~$1/month | ~$15/month |
| Medium traffic (100,000 req/day) | ~$30/month | ~$50/month |
| High traffic (1M req/day) | ~$200/month | ~$100/month |
| Always-on background processing | Expensive | Much cheaper |
The crossover point depends on function duration, memory, and the baseline server cost; in the scenarios above it falls somewhere between a few million and a few tens of millions of requests per month. Beyond that, reserved containers or VMs are often more cost-effective.
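A back-of-the-envelope model shows where the crossover comes from. The rates below are illustrative assumptions, not any provider's actual pricing; real bills also depend on free tiers, egress, and managed-service add-ons:

```python
# Illustrative rates — assumptions for this sketch, not real pricing.
PER_MILLION_REQUESTS = 0.20   # USD per million invocations
PER_GB_SECOND = 0.0000167     # USD per GB-second of compute
MEMORY_GB = 1.0               # function memory allocation
AVG_DURATION_S = 0.3          # average invocation duration
FLAT_SERVER_COST = 50.0       # always-on VM/container, USD per month

def serverless_monthly_cost(requests_per_month: float) -> float:
    """Pay-per-use: compute charge plus per-request charge."""
    compute = requests_per_month * AVG_DURATION_S * MEMORY_GB * PER_GB_SECOND
    requests = requests_per_month / 1_000_000 * PER_MILLION_REQUESTS
    return compute + requests

def crossover_requests() -> float:
    """Volume at which pay-per-use matches the flat server cost."""
    per_request = (AVG_DURATION_S * MEMORY_GB * PER_GB_SECOND
                   + PER_MILLION_REQUESTS / 1_000_000)
    return FLAT_SERVER_COST / per_request
```

With these assumed numbers the crossover lands around 9–10 million requests per month; heavier functions (longer duration, more memory) pull it lower, lighter ones push it higher.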
## A Practical Architecture
Most modern applications use serverless selectively, not exclusively:
- API routes: Serverless functions (AWS Lambda, Vercel Functions)
- Frontend: Static hosting with CDN (Vercel, Cloudflare)
- Background jobs: Container-based workers (ECS, Cloud Run)
- Database: Managed service (RDS, PlanetScale, Supabase)
- File storage: Object storage (S3, R2)
- Real-time features: Dedicated WebSocket service
## Conclusion
Serverless isn't an all-or-nothing decision. Use it where it shines — variable workloads, event-driven processing, and rapid development — and use containers or managed services where it doesn't. The best architectures are pragmatic, not dogmatic.

