The State of Serverless Computing 2021

Serverless computing is redefining the way organizations develop, deploy, and integrate cloud-native applications. According to an industry report, the serverless computing market is expected to reach $7.72 billion by 2021. A compelling new paradigm for deploying cloud applications, serverless computing sits at the forefront of the enterprise shift toward containers and microservices.

In 2021, the serverless paradigm presents organizations with exciting opportunities: a simplified programming model for building cloud applications that abstracts away most operational concerns. Major cloud vendors Microsoft, Google, and Amazon are already in the game with their respective offerings, and there is no reason you shouldn't get on board.
The "Pay-as-you-go" Backend

Serverless computing turns the backend into a "pay-as-you-go" service, which means you are billed only when somebody accesses one of your backend services. Moving your backend from server-full to serverless computing is like switching your data plan from a flat monthly rate to per-byte billing.

The term "serverless" is a little ambiguous because, at the end of the day, backend services are still hosted on servers somewhere. In serverless computing, the vendor takes care of server storage and infrastructure so that developers never have to deal with the servers themselves.
2021 Is the Year of FaaS

All major providers of serverless computing offer several types and tiers of database and storage services to their customers. In addition, all major cloud players such as Amazon, Microsoft, and Google offer Function-as-a-Service (FaaS) platforms with their serverless offerings.

Function-as-a-Service (FaaS) is a "serverless" way to run modular bits of code on the network edge. FaaS allows developers to write and deploy self-contained pieces of code that run in response to event triggers. An event could be a user clicking Submit at the bottom of a web form. Serverless computing makes it easy to scale your application and is a cost-effective way to deploy microservices.

With FaaS, developers can build applications in a modular architecture, making the application code more scalable without having to devote resources to supporting the underlying backend.
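To make the idea concrete, below is a minimal sketch of such an event-triggered function, assuming an AWS Lambda-style Node.js runtime; the event shape and field names are illustrative.

```typescript
// A minimal FaaS-style handler, assuming an AWS Lambda Node.js runtime.
// It runs only when its trigger fires, e.g. an API request sent when a
// user clicks Submit on a web form. Field names are illustrative.
interface FormEvent {
  body: string; // JSON payload posted by the form
}

export const handler = async (event: FormEvent) => {
  const { name, email } = JSON.parse(event.body ?? "{}");

  // In a real application this is where you would validate the input and
  // hand it off to a managed database or queue.
  console.log(`New signup: ${name} <${email}>`);

  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Thanks for signing up, ${name}!` }),
  };
};
```

The vendor bills only for the time this handler actually runs; the rest of the time, no capacity is reserved for it.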
Serverless Addresses Problems With Traditional Cloud Models

Organizations going serverless with their backend services are billed by their serverless vendor based on actual compute consumption. They do not have to reserve and pay for a fixed amount of bandwidth or a fixed number of servers, because the service auto-scales with incoming demand.

In the early days of web development, developers had to own or rent physical hardware to run and test their application code. Running production applications was another nightmare, because those servers had to be kept running for the lifetime of the application. Cloud computing and virtualization brought much-needed relief: developers could rent virtual servers from a cloud vendor according to their needs. The problem was that they still had to over-provision to keep up with traffic spikes or their application would break down, and much of the server capacity they were paying for went to waste. Cloud vendors introduced auto-scaling compute models to mitigate the problem. However, auto-scaling in response to an unsolicited spike in traffic (such as a DDoS attack) could turn out to be quite expensive.

Serverless addresses most of the limitations of traditional cloud models and typical client-server implementations. No wonder it is growing faster than most other cloud models.
Serverless Computing Looks a Lot More Promising in 2021

According to RightScale's 2018 State of the Cloud report, serverless computing grew at a compound annual growth rate of 75%. With its ability to reduce costs, operational complexity, and DevOps burden, serverless computing is likely to keep growing at a steady pace in the coming years.

Among the various reasons to choose serverless computing for application deployment, "flexibility in scaling" remains number one. In Codingsans' State of Serverless Report 2020, 63% of participants cited flexibility in scaling as their reason for adopting serverless, followed by speed of development, decreased need for system administration, and cost-effectiveness at 55%, 54%, and 53% respectively. Scalability, cost-effectiveness, and speed of development are likely to remain the major drivers of serverless adoption in 2021 as well.
Graphic from Codingsans Report

Amazon, the market leader in cloud computing with AWS, continues its lead in serverless computing, capturing a whopping 87% market share in 2020, according to the report. Microsoft came a distant second, with Azure Functions capturing 16% market share. AWS Lambda is expected to keep its huge lead in 2021 and for many years to come, although Google and Microsoft are catching up fast with their respective serverless offerings.
Serverless Computing Market

When it comes to programming languages preferred for building functions in serverless applications, Node.js takes a formidable lead over the next most popular language in the serverless domain, Python. Node.js still captures the attention of 77% of developers. Python is catching up fast, but Node.js is likely to remain the preferred language for the near future.
Serverless Programming Language Preference
Serverless Is the Most Exciting Thing in Computing Right Now
No More Server Management

Although "serverless" computing still takes place on servers, developers don't have to worry about their existence when deploying an application. The serverless vendor manages the servers on your behalf, which reduces the DevOps investment, the outlays, and the time developers spend building and expanding their applications. Applications are no longer constrained by server capacity or compute power.
Pay-as-you-go Backend

As with a ‘pay-as-you-go’ data plan where you are billed only for the amount of data consumed, in serverless computing, developers are billed only when the application code is executed. Application code only executes when backend functions are run in response to an event. The code auto-scales according to the demand and provisioning remains dynamic, accurate, and instantaneous.

On the other hand, in server-full computing such as AWS EC2, developers have to pre-plan how much server capacity they are going to need and then buy that capacity, whether they end up using it or not.
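As a back-of-the-envelope illustration of that difference, the sketch below compares a pay-per-use bill with a reserved server; all prices are made-up placeholders, not any vendor's actual rates.

```typescript
// A back-of-the-envelope comparison of pay-per-use vs. reserved capacity.
// All prices below are illustrative placeholders, not real vendor rates.
const pricePerMillionRequests = 0.2;  // $ per 1M invocations (assumed)
const pricePerGbSecond = 0.0000167;   // $ per GB-second of compute (assumed)
const reservedServerPerMonth = 70;    // $ for an always-on virtual server (assumed)

function serverlessMonthlyCost(requests: number, avgDurationMs: number, memoryGb: number): number {
  const requestCost = (requests / 1_000_000) * pricePerMillionRequests;
  const computeCost = requests * (avgDurationMs / 1000) * memoryGb * pricePerGbSecond;
  return requestCost + computeCost;
}

// A spiky, low-traffic workload: 2M requests/month, 120 ms each, 512 MB memory.
const payPerUse = serverlessMonthlyCost(2_000_000, 120, 0.5);
console.log(`Serverless: ~$${payPerUse.toFixed(2)} vs reserved: $${reservedServerPerMonth}`);
// With uneven usage the pay-per-use bill stays far below the reserved server;
// with sustained heavy workloads the comparison can flip.
```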
Serverless Computing Means Scalability

Imagine if airlines could make planes appear and disappear out of thin air, growing the fleet during the holiday season and shrinking it during off-peak seasons, without ever eating the operational cost of idle aircraft. That is what serverless applications are capable of.

Serverless applications scale up and down according to current demand. Serverless computing allows an application to go from hundreds of compute instances to just one, and vice versa, in a matter of seconds to accommodate complex demand curves. Serverless vendors employ algorithms to start, run, and end those instances as needed, using containers.

Consequently, serverless applications can handle millions of concurrent requests or a single request with the same throughput.
Quicker Deployments and Updates Are Possible

Serverless infrastructure doesn't need complicated backend configuration to make an application work. Developers may upload the entire backend code at once, but going one function at a time makes a lot more sense. A serverless application is a collection of functions managed by the vendor rather than one large, unmanageable monolithic block of cumbersome code.

When it comes to releasing updates, patches, and fixes, developers can get away with altering only the affected functions. Likewise, they can add new functions to introduce a new application feature.
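As a rough sketch of what that modularity looks like in practice (function names and logic are purely illustrative), each handler below lives and deploys on its own, so a change to pricing rules touches only one of them.

```typescript
// A minimal sketch of a serverless app split into independent functions.
// Each handler can be deployed or updated on its own; names are illustrative.

// Function 1: createOrder — deployed once, rarely changed.
export const createOrder = async (event: { body: string }) => {
  const order = JSON.parse(event.body);
  return { statusCode: 201, body: JSON.stringify({ received: order }) };
};

// Function 2: applyDiscount — when the pricing rules change, only this
// function needs to be redeployed; createOrder stays untouched.
export const applyDiscount = async (event: { total: number }) => {
  const discounted = event.total * 0.9; // assumed 10% promotional discount
  return { statusCode: 200, body: JSON.stringify({ total: discounted }) };
};
```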
Edge Locations Reduce Latency

Serverless applications are not hosted on an origin server but at various edge locations across the vendor's infrastructure. When a request comes in, the closest edge location handles the event and runs the function. Edge computing reduces latency because requests no longer have to travel all the way to an origin server.
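For illustration, here is a minimal edge function written in the Service Worker style used by platforms such as Cloudflare Workers; the runtime is assumed, types are simplified, and the logic is illustrative.

```typescript
// A minimal edge function in the Service Worker style (as used by
// Cloudflare Workers). It runs at the edge location nearest the visitor,
// so no round trip to an origin server is needed for this response.
addEventListener("fetch", (event: any) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);
  return new Response(`Hello from the edge! You requested ${url.pathname}`, {
    headers: { "content-type": "text/plain" },
  });
}
```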
Serverless Computing Should Reduce the Cost for Most Use Cases

Serverless architectures are most effective at reducing costs for applications with uneven usage. If your application alternates between peak periods and stretches of little to no traffic, renting server space for a fixed period of time makes little sense. Paying for an always-on, always-available server is uneconomical when you only use it for a fraction of the rental period.

However, not every application fits this uneven usage pattern. For example, if you're a content delivery network (CDN) provider, your applications need to run extended workloads such as video transcoding, image compression, and video on demand. These batch jobs mean your applications would be running most of the time, and serverless computing won't be practical unless you don't mind inflated cloud bills.

Serverless computing is not practical in every scenario, and as an emerging cloud technology it is bounded by limitations. Nevertheless, serverless vendors are hard at work to make serverless the universal way of building and running applications.
Serverless Is Still Not the "Go-to" Method of Application Deployment
Serverless Computing Hinders Security Audits

Serverless vendors abstract most operational concerns away from users, including security. This layer of abstraction prevents internal and independent security audits of your DevSecOps practices. Some jurisdictions mandate independent security verification for applications handling personal or sensitive data such as EMRs and SSNs. If that is the case, serverless computing may not be for you yet.

In addition, serverless vendors tend to run code from multiple applications on a single server. Multitenancy means your applications and those of your competitors could be neighbors on the same piece of physical hardware. Multitenancy is not much of an issue if your serverless vendor follows security best practices.

Again, determining whether a vendor actually follows those practices is difficult, since there is no independent way to verify the claims.

Moreover, multitenancy may impact your application's performance if the multi-tenant servers are misconfigured. Misconfigured servers are also much more likely to expose your customers' data to cybercriminals.

Fortunately, most serverless vendors sandbox functions properly, run capable infrastructure to begin with, and make use of proven workarounds.
Performance May Take a Backseat

Serverless applications are triggered in response to events, which means a function may have to cold-start when a trigger arrives and no warm instance is available. A cold start degrades performance by extending the application's start-up time.

Serverless vendors mitigate these performance bottlenecks by employing a powerful infrastructure and a bunch of workarounds.
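One widely used pattern developers can apply themselves is to keep heavy initialization outside the handler so that warm invocations reuse it. The sketch below assumes an AWS Lambda-style Node.js runtime and uses an illustrative lookup table in place of a real dependency.

```typescript
// A minimal sketch of a common cold-start mitigation pattern, assuming an
// AWS Lambda-style Node.js runtime. The "expensive" setup below is illustrative.
import { createHash } from "node:crypto";

// Work done at module scope runs once per container, during the cold start.
// Warm invocations reuse the result instead of paying the cost again.
const expensiveLookupTable: Map<string, string> = buildLookupTable();

function buildLookupTable(): Map<string, string> {
  const table = new Map<string, string>();
  for (let i = 0; i < 10_000; i++) {
    table.set(String(i), createHash("sha256").update(String(i)).digest("hex"));
  }
  return table;
}

export const handler = async (event: { id?: string }) => {
  // Per-invocation work stays small; the heavy setup was amortized above.
  const key = event.id ?? "0";
  return { statusCode: 200, body: expensiveLookupTable.get(key) ?? "not found" };
};
```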
Vendor Lock-In Is the Harsh Reality of Cloud Computing And…

With your serverless vendor controlling every aspect of your backend services, you're inevitably locked in to that vendor for the lifecycle of your application. Yet you may need to switch vendors for a variety of reasons.

Serverless computing services from major vendors like Amazon Web Services (AWS) and Microsoft Azure are easier to migrate from because applications are written in platform-agnostic programming languages such as Node.js and Python and built on widely used interfaces such as the Service Worker API.
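One hedge against lock-in, sketched below with purely illustrative names, is to keep business logic in plain, vendor-neutral functions and wrap it with thin platform-specific adapters.

```typescript
// A minimal sketch of one way to reduce vendor lock-in (names are illustrative):
// keep business logic in plain, platform-agnostic functions and wrap them with
// thin, vendor-specific adapters.

// Platform-agnostic core logic: no vendor types, easy to move anywhere.
export function greet(name: string): string {
  return `Hello, ${name}!`;
}

// Thin AWS Lambda-style adapter. Switching vendors means rewriting only
// this wrapper, not the core logic.
export const lambdaHandler = async (event: { name?: string }) => ({
  statusCode: 200,
  body: greet(event.name ?? "world"),
});
```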
The Market of Serverless Computing

Beyond the offerings from Amazon Web Services and Microsoft Azure, serverless computing is an emerging yet thriving market among other cloud vendors. Although Azure Functions, Google Cloud Functions, and AWS Lambda together pretty much own the serverless market, there is still plenty of room for additional players if they can stand out.

For example, Knative, deployed on top of Kubernetes clusters, is a promising open-source project pushing toward multi-vendor serverless computing. Oracle Functions and IBM Cloud Functions are also built on top of open-source projects. Moreover, Firebase still has a niche following among developers, and Cloudflare Workers enable low-latency serverless computing at edge locations.
Serverless Vendor Differentiators
Serverless Will Eventually Replace Cloud Computing

Serverless computing is a natural evolution of traditional cloud computing, but it is still in its infancy as a replacement for its older cousin. While the pros already outnumber the cons, successful serverless use cases are still scant. Only a handful of major corporations have invested in the technology enough to drive adoption and raise confidence.

No doubt, serverless as a technology still has plenty of holes to fill. These limitations are showstoppers for widespread adoption of serverless architecture in application development.

The competition is at an all-time high and serverless vendors don’t want to leave any stone unturned.
