Disrupting the Cloud with Serverless

Serverless may indeed be the new black. It made Gartner’s 2019 list of top 10 trends in Infrastructure and Operations. An annual growth rate of 75 percent qualified serverless as the fastest-growing cloud service model in RightScale’s 2018 State of the Cloud report. And deployment of serverless computing technologies among global enterprises is expected to hit 20 percent by 2020, up from the current 5 percent.

Despite the nomenclature, computing will continue to involve servers. Developers, however, will no longer have to be involved with provisioning, deploying, and monitoring servers. Those administrative tasks will now be handled by a new breed of services such as AWS Lambda, Google Cloud Functions, Microsoft Azure Functions, and IBM OpenWhisk.

Serverless, FaaS & BaaS

Serverless broadly refers to applications that circumvent the need for an always-on server component by using third-party hosted services to manage server-side logic and state. There are currently two types of serverless deployments: Functions-as-a-Service (FaaS), provided by the new breed of services mentioned above, and Backend-as-a-Service (BaaS), offered by providers like Google Firebase, AWS DynamoDB, BaaSBox, and Backendless.

BaaS provides a complete online backend service, such as storage, social networking integration, or location services, so that developers don’t have to build a separate backend for each service that their applications use or access.

FaaS, on the other hand, is a serverless approach to application development that enables developers to write and deploy code without having to manage their own infrastructure. The developer defines events and sets triggers, and the service provider ensures that the right amount of infrastructure is delivered to execute the code. The code is executed only when backend functions are invoked by events triggered by users, shuts down once the task is completed, and the customer is charged only for the duration of the execution.
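To make the model concrete, here is a minimal sketch of a FaaS function. The `handler(event, context)` signature follows the AWS Lambda convention for Python; other providers use similar entry points. The event payload shown is illustrative, not any vendor's exact schema.

```python
import json

def handler(event, context):
    """Entry point invoked by the platform when a trigger fires.

    `event` carries the trigger payload (e.g. an HTTP request body);
    `context` carries runtime metadata supplied by the platform.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The platform, not the developer, decides when and where this code runs: it spins up a container on the trigger, executes the handler, and tears the environment down afterwards.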

Evolution of Technology

Source: https://www2.deloitte.com/content/dam/Deloitte/tr/Documents/technology-media-telecommunications/Serverless%20Computing.pdf

The serverless model represents a significant shift even from the traditional cloud service model where customers had to reserve and pay for a predetermined amount of bandwidth or server space, irrespective of usage. It is therefore easy to see the value in a model that delivers on-demand execution, dynamic scaling based on application loads and pay-per-use pricing.

As a fully-managed service, serverless ensures that all infrastructure related issues are shifted from the developer to the vendor. Let’s now take a look at some key benefits of the serverless model.

Benefits of Serverless

Efficient utilization: The serverless model offers virtually unlimited scalability and built-in high availability, and the pay-per-use feature eliminates idle time because customers pay only when functions are executed. Dynamic provisioning ensures that the infrastructure scales automatically based on application loads, making it especially attractive for applications with wildly inconsistent usage patterns.

Enhanced developer productivity: Freed from the demands of infrastructure management, developers can focus fully on the functions that they are expected to deliver without having to worry about server capacity. Every function that comprises the application can be updated independently, or all at once, without having to redeploy the entire application. This makes it much quicker and easier to update, patch, fix, or add new features to an application. Serverless also accelerates go-to-market times by drastically simplifying the process of writing, testing, and deploying code.

Reduced latency: A serverless architecture makes it possible for developers to reduce latency by running code closer to the end user. Since application functions are not tied to a centralized origin server, they can now be shifted to servers closer to the end user to decrease latency.

Improved security: It is also argued that constraining developers to code constructs that work within the serverless context can produce code that is aligned with security, governance, and best-practice protocols.

In addition to all this, there is also the idea floated by Gartner that serverless computing can improve the productivity of infrastructure and operations (I&O) administrators. According to the analyst firm, the event-driven architecture of FaaS is well suited to automating cloud infrastructure operations.
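The automation idea can be sketched in a few lines: each infrastructure event type maps to a small remediation function, the way a FaaS platform routes events to handlers. The event names, resources, and actions below are purely illustrative and do not correspond to any vendor's actual event schema.

```python
def restart_instance(event):
    # Stand-in for an automated remediation action.
    return f"restarting {event['resource']}"

def page_on_call(event):
    # Stand-in for escalating to a human operator.
    return f"paging on-call for {event['resource']}"

# Map of infrastructure event types to the functions that handle them.
HANDLERS = {
    "instance.unhealthy": restart_instance,
    "disk.full": page_on_call,
}

def dispatch(event):
    """Route an incoming infrastructure event to its handler,
    mimicking how a FaaS platform triggers functions on events."""
    action = HANDLERS.get(event["type"])
    return action(event) if action else "ignored"
```

In a real deployment, each handler would be its own deployed function and the routing would be done by the platform's trigger configuration rather than an in-process dictionary.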

EDA-based Automation Principles

Source: https://www2.deloitte.com/content/dam/Deloitte/tr/Documents/technology-media-telecommunications/Serverless%20Computing.pdf

Notwithstanding all these advantages, there are still some key challenges associated with serverless that may not make it the ideal choice for all applications and situations.

Key Serverless Challenges

Development challenges: The programming approach for serverless applications can be quite different even from that of traditional cloud services. Moreover, there is currently a lot of variation in the programming languages and frameworks supported by different serverless providers. Then there are limitations imposed by some vendors on code size, memory, etc., all of which seriously limit the type of applications that can be built for this environment.

Testing and debugging can also be challenging in this model. The primary reason for this is the dearth of testing tools for developers that can exactly emulate cloud events in a local development environment. Even debugging gets complicated because of the limited visibility into backend processes.
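One common workaround is to invoke the function locally with a hand-crafted event that mimics the provider's payload and assert on the return value. The event shape and function below are simplified stand-ins, not any vendor's exact schema.

```python
def resize_request_handler(event, context):
    """Hypothetical function that reads a storage event and returns
    the object keys it would process."""
    records = event.get("Records", [])
    keys = [r["object"]["key"] for r in records]
    return {"processed": keys}

# Local invocation with a fabricated event: no cloud round-trip needed.
fake_event = {"Records": [{"object": {"key": "uploads/cat.png"}}]}
result = resize_request_handler(fake_event, None)
```

This catches logic errors cheaply, but it only approximates production behavior: the locally fabricated event may drift from the real payload, which is exactly the emulation gap described above.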

Performance challenges: A traditional always-on server can process user queries instantly. Since serverless code is shut down after a function has been invoked and the task is completed, it has to be booted up again the next time it is triggered. Functions that have been inactive for a while will then have to be cold started, which can slow down response times and increase latency. Hence, this model may not be suitable for applications that are used rarely yet still need to respond extremely quickly to events. This ephemerality of the service, combined with the resource limitations imposed by vendors, can render serverless unsuitable for applications with larger processing requirements.
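A common pattern for softening the cold-start penalty is to do expensive setup once per container rather than once per invocation, so only the first ("cold") call pays the cost. A minimal sketch, assuming a Python runtime and an illustrative `get_client` setup function:

```python
import functools

@functools.lru_cache(maxsize=1)
def get_client():
    """Stand-in for expensive setup such as opening a database
    connection. Runs once per container; every subsequent warm
    invocation reuses the cached result."""
    return {"connected": True}

def handler(event, context):
    client = get_client()  # cached after the first call in this container
    return {"warm": client["connected"]}
```

This does not eliminate cold starts (a brand-new container still pays the full cost), but it ensures warm invocations skip the setup entirely.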

Security challenges: Several critical security risks have been identified for serverless applications, chief among them vendor security and multi-tenancy. With vendors taking complete control of the backend where the actual execution occurs, developers have little visibility into the security protocols in place. This can be of particular concern for applications that use sensitive data. Multi-tenancy raises similar concerns: serverless applications in all likelihood run on shared infrastructure, which can result in data being exposed.

Finally, there is the challenge of vendor lock-in. With every vendor offering their own combination of programming languages, workflows and features, it can become extremely cumbersome to migrate applications from one platform to another.

Serverless represents the latest in a long trend of abstractions that have transitioned enterprise IT from physical servers to virtual machines and containers. Now, infrastructure itself is being abstracted away.

Conclusion

Today, serverless is the hottest cloud computing trend, with Microsoft’s CEO declaring it the core of the future of distributed computing. But serverless is still a nascent technology, and its promises of cost transparency, agility, and productivity are counterbalanced by several significant challenges that still have to be addressed. Although adoption predictions are through the roof, serverless may not be appropriate in every case; your specific use cases may or may not benefit from it, and you should evaluate them on a case-by-case basis.