Key Workloads and Use Cases for Serverless


The cloud has completely transformed the way enterprise IT organizations drive business value. Foundational to the model's success has been pay-per-use pricing, which finally freed businesses from the conventional high-wire act of balancing underprovisioning, which risks growth and innovation, against overprovisioning, which wastes resources and skews ROI.

So, would it even be possible to improve on this basic feature that has enabled businesses of all sizes and stripes to consume technological resources more efficiently and productively?

It apparently is, with serverless computing.


Pure Pay-Per-Use With Serverless

There is no debate that cloud computing has shifted IT spending from upfront CAPEX to recurrent, distributed OPEX. However, the prepaid OPEX model offered by many cloud vendors involves customers paying in advance for resources that may not be entirely utilized. In some quarters, this does not represent pay-per-use in its purest form; in others, the need to constantly spin servers up and down is even seen as a critical limitation of the as-a-Service model. In a pure utility compute model, the argument goes, the need to constantly create and destroy servers will be replaced by the ability to simply invoke the functions that need to be executed, with clients charged only for the resources consumed during the execution of those functions.
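To make the billing difference concrete, here is a back-of-the-envelope comparison of an always-on server against per-invocation billing. All rates, durations, and invocation counts below are hypothetical assumptions for illustration, not actual vendor prices.

```python
# Illustrative comparison of always-on vs. pay-per-use billing.
# All prices are assumed for the sketch, not real vendor rates.

HOURS_PER_MONTH = 730

def always_on_cost(hourly_rate):
    """Cost of a server billed for every hour, busy or idle."""
    return hourly_rate * HOURS_PER_MONTH

def per_invocation_cost(invocations, avg_duration_ms, gb_memory,
                        price_per_gb_second, price_per_million_requests):
    """Cost when billed only for the compute actually consumed."""
    gb_seconds = invocations * (avg_duration_ms / 1000) * gb_memory
    return (gb_seconds * price_per_gb_second
            + invocations / 1_000_000 * price_per_million_requests)

# A small VM at a hypothetical $0.05/hour vs. 2M invocations of a
# 200 ms, 512 MB function at hypothetical serverless rates.
vm = always_on_cost(0.05)
fn = per_invocation_cost(2_000_000, 200, 0.5,
                         price_per_gb_second=0.0000167,
                         price_per_million_requests=0.20)
print(f"always-on: ${vm:.2f}/mo, pay-per-use: ${fn:.2f}/mo")
```

Under these assumptions the idle-heavy workload is an order of magnitude cheaper per month when billed per invocation; the gap narrows, and can invert, as utilization approaches 100%.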

And that is the newfound ability that serverless adds to the already formidable repertoire of cloud computing features.

Now, pure pay-per-use is without a doubt a significant milestone in the evolution of the cloud. The obvious cost benefits notwithstanding, the serverless model also enables a whole host of performance and productivity improvements.



Image source: O’Reilly

According to a 2019 serverless survey, organizations that have adopted serverless technologies have seen a wide range of benefits, starting with the expected decrease in operational costs and extending to lower development costs, increased developer productivity, and reduced engineering lead times. However, the survey also found a learning curve between adoption and successful implementation: sophisticated businesses, those with over three years of serverless experience, were more likely to report successful implementations.

Nevertheless, serverless is the zeitgeist, with Gartner predicting that over the next five years, half of all global enterprises will have deployed serverless technologies, up from just 20% today. It is therefore only appropriate to understand how serverless platforms can supercharge the development of some of the most disruptive technology trends of the day.


Disruptive Serverless Use Cases

Serverless architectures offer some compelling advantages over conventional cloud computing, especially for workloads that either comprise self-contained, event-driven tasks that run periodically for a short time or fluctuate substantially between peaks and lows. In the first case, serverless eliminates server idle time, as clients pay only for functions that are triggered and executed in response to specified events. In the second, serverless automatically scales the infrastructure to match demand without any need for manual intervention.

These attributes make serverless perfectly suited for a range of everyday workloads, such as cloud automation, scheduled cron jobs and task automation, hosting high-traffic websites that require high availability and scalability, multimedia processing, and event-driven data transformation.
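As a sketch of the scheduled-task pattern, a "cron-style" serverless job can be as simple as a small function that a platform timer invokes periodically. The handler below prunes expired records from a stubbed in-memory store; the retention window, record shape, and function name are assumptions for illustration.

```python
import datetime

# Hypothetical scheduled cleanup task: a platform scheduler (e.g. a
# timer/cron trigger) would invoke this periodically. The data store
# is stubbed with a plain list of dicts for the sketch.
RETENTION_DAYS = 30

def cleanup_handler(records, now=None):
    """Drop records older than the retention window."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    cutoff = now - datetime.timedelta(days=RETENTION_DAYS)
    kept = [r for r in records if r["created_at"] >= cutoff]
    return {"deleted": len(records) - len(kept), "kept": kept}

# Invoking the handler locally with sample records:
now = datetime.datetime(2024, 6, 30, tzinfo=datetime.timezone.utc)
records = [
    {"id": "a", "created_at": now - datetime.timedelta(days=3)},
    {"id": "b", "created_at": now - datetime.timedelta(days=90)},
]
print(cleanup_handler(records, now=now))  # deletes the 90-day-old record
```

Because the function runs only when the schedule fires, there is no server sitting idle between runs, which is exactly the first workload shape described above.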

Over and above these workloads, serverless architecture can also supercharge the development of emerging technological patterns such as serverless API Gateways, IoT, Big Data analytics, and AI/ML.


Serverless API Gateways

Building backend APIs for web and mobile applications emerged as the most prevalent use case in a recent serverless community survey, cited by almost half of the respondents. With the serverless framework, businesses can now develop and deploy backend applications and APIs without having to take on the technical and financial obligations of server management.



Image source: AWS

In the serverless model, API Gateways serve to bring together API definitions and serverless functions. For instance, the AWS API Gateway can be used for setting up REST API services and triggering the code/function that corresponds to each incoming REST call or event. The gateway can even be used to route different parts of the API to different functions. AWS Lambda, AWS’ event-driven serverless computing platform, automatically scales each function based on its individual demand, enabling more cost-effective and flexible API setups.

There are several advantages of using AWS API Gateway to create HTTP APIs. The most obvious of these is the elimination of dedicated API servers, as each HTTP endpoint is mapped to a specific function. The service can also be used to map WebSocket API events to a serverless function in order to streamline real-time functionality. The ability to map different serverless functions to different parts of the API also means that each function can be focused on a specific aspect of business logic. With AWS API Gateway, developers also have access to a wide range of readymade integrations, for authentication and authorization, profiling and debugging API requests, and more, that can significantly cut down time-to-market while augmenting developer productivity.
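To illustrate the endpoint-to-function mapping, here is a minimal sketch of a handler for one HTTP route, shaped like the proxy event an API gateway might pass to a Lambda-style function. The route, the stubbed data store, and the function name are illustrative assumptions, not a fixed implementation.

```python
import json

# Minimal Lambda-style handler for a single HTTP endpoint, e.g. the
# one a gateway would invoke for "GET /users/{id}". The event layout
# mirrors a proxy integration; names here are illustrative.
def get_user_handler(event, context=None):
    user_id = event["pathParameters"]["id"]
    # A real deployment would query a data store here; we stub it.
    user = {"id": user_id, "name": f"user-{user_id}"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(user),
    }

# Invoking the handler locally with a sample proxy event:
sample_event = {"pathParameters": {"id": "42"}}
response = get_user_handler(sample_event)
print(response["statusCode"], response["body"])
```

Because each route maps to its own function like this one, each handler stays focused on a single slice of business logic and scales independently of the rest of the API.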


Serverless & IoT

There are two IoT-specific challenges that serverless is uniquely qualified to address. The first is the sheer number of connected IoT devices — an estimated 24 billion of them by 2030, up from 9.5 billion last year — and the overwhelming streams of data from each one of these devices. The second is the variability of traffic across these nodes and their need for on-demand data processing, neither of which can be addressed cost-effectively or productively with a non-serverless architecture.



Image source: AWS

Serverless’ event-driven and scalable credentials are particularly suited to the data processing needs of the IoT industry. The model offers the flexible computing power required to address a range of connection and processing requirements across a diverse variety of geographically dispersed nodes with vastly different onboard capabilities. For instance, functions serving IoT field devices with sophisticated capabilities can differ vastly from those serving devices without them, and serverless functions can even be used to add an assortment of capabilities to nodes with very limited onboard features.

A serverless architecture also provides the foundation for event-driven data analytics in IoT. For instance, AWS Lambda can easily integrate with AWS data sources and services, such as Amazon DynamoDB and AWS IoT Analytics, to trigger specific functions for specific data events. Integration with backend data services enables the automation of data collection, processing, storage, and analysis at scale.
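A sketch of that event-driven pattern: a function invoked per batch of telemetry, in the style of a Lambda triggered by an IoT rule or a stream. The event layout, threshold, and field names are assumptions for illustration, not a fixed AWS schema.

```python
# Hypothetical event-driven telemetry processor. In a real deployment
# an IoT rule or stream trigger would invoke this per batch; here the
# "alert" action is just collected and returned for the sketch.
TEMP_ALERT_C = 80

def telemetry_handler(event, context=None):
    readings = event.get("readings", [])
    alerts = []
    for reading in readings:
        # Flag devices whose reported temperature exceeds a threshold;
        # a real function might write to a table or publish an alert.
        if reading.get("temperature_c", 0) > TEMP_ALERT_C:
            alerts.append({"device_id": reading["device_id"],
                           "temperature_c": reading["temperature_c"]})
    return {"processed": len(readings), "alerts": alerts}

# Invoking the handler locally with a sample telemetry batch:
event = {"readings": [
    {"device_id": "sensor-1", "temperature_c": 72},
    {"device_id": "sensor-2", "temperature_c": 91},
]}
print(telemetry_handler(event))
```

The function does nothing, and costs nothing, between batches, which is what makes the model workable across millions of intermittently chatty devices.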

And finally, serverless delivers sophisticated computing resources to the network edge with solutions such as Azure IoT Edge and AWS IoT Greengrass seamlessly extending cloud capabilities to edge devices so that they can act locally on the data they generate. This not only reduces the cost of running IoT applications but also facilitates real-time responses to local events even if the devices are offline.


Serverless & Big Data Analytics

Though the cloud did lower the entry barriers for building Big Data apps, it did not mitigate the complexities involved in creating cloud architectures for these apps. However, serverless solutions like AWS Lambda offer a variety of ‘plug-and-play’ components as a service, thereby reducing the number of functionalities that have to be built from scratch. Today, serverless platforms offer a range of services to build scalable Big Data pipelines without the hassle of infrastructure creation and management.



Image source: Oracle

Take real-time data transformation, for instance. Quite often it is necessary to manipulate raw data, whether to normalize data from multiple endpoints, add metadata, or convert and combine data. Today, serverless platforms offer powerful tools for data transformation. In AWS Lambda, for instance, applying custom logic to transform data is as simple as invoking a Lambda function. Similarly, ETL services like AWS Glue can expedite the recreation of legacy data warehouses as serverless data lakes, with the agility and flexibility to handle a range of analytical workloads and the added benefit of a 90% drop in operating costs.
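As a sketch of in-stream transformation, the function below follows the record-in/record-out shape used by streaming transform hooks (for example, a Kinesis Data Firehose transformation Lambda): decode each record, normalize the payload, and re-encode it. The payload fields and normalization rules are assumptions for illustration.

```python
import base64
import json

# Hypothetical record-transformation function. Each incoming record
# carries a base64-encoded payload; the function normalizes field
# names/types and hands the record back marked "Ok".
def transform_handler(event, context=None):
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        normalized = {
            "device": payload.get("device", "unknown").lower(),
            "value": float(payload.get("value", 0)),
        }
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(
                json.dumps(normalized).encode()).decode(),
        })
    return {"records": output}

# Invoking the handler locally with one sample record:
raw = base64.b64encode(
    json.dumps({"device": "Sensor-A", "value": "3"}).encode()).decode()
result = transform_handler({"records": [{"recordId": "1", "data": raw}]})
print(result["records"][0]["result"])
```

Scaling here is implicit: the platform fans the same stateless function out across however many shards of the stream are active, with no pipeline redesign.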

The serverless model drastically reduces the costs and complexities of developing, deploying, and managing Big Data applications. The ease and simplicity of creating and deploying functions drastically reduce time-to-market and allows developers and data scientists to focus on business logic. Big data pipelines can be scaled seamlessly without the need for radical redesign or downtimes. The always-available nature of serverless means that data pipelines can be augmented to respond in real-time to data changes and to adapt control flow and application logic based on data trends.


Serverless and AI/ML

Gartner predicts that cloud-based AI will grow by a factor of five to become one of the top cloud services by 2023, and new computing capabilities such as cognitive APIs, containers, and serverless computing will play a key role in eliminating some of the complexities of deploying AI.

Serverless holds the potential to usher in a new era of event-driven AI where continuous intelligence will enable split-second insights and real-time decision making.

To start with, though, serverless addresses some of the challenges associated with the development of AI/ML. For instance, the event-driven model does not require practitioners to worry about cluster management, scalability, or query processing, allowing them to focus instead on training the model. Machine learning models deployed as serverless functions can be invoked, updated, and deleted without any adverse impact on the system. Using API gateways to expose ML models as a service also makes it easier to decentralize the backend and isolate failures more accurately.
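A minimal sketch of exposing a model as a serverless function: the "model" here is a toy logistic scorer with hard-coded weights, standing in for a trained artifact that a real function would load once at startup and reuse across warm invocations. All names, weights, and the event shape are illustrative assumptions.

```python
import json
import math

# Toy "model" loaded at module import time. In a real function this is
# where a trained model artifact would be fetched, so it is reused
# across warm invocations instead of reloaded per request.
WEIGHTS = {"bias": -1.0, "x1": 2.0, "x2": 0.5}

def predict_handler(event, context=None):
    """Score a feature vector posted as a JSON body."""
    features = json.loads(event["body"])
    score = WEIGHTS["bias"] + sum(
        WEIGHTS[name] * features.get(name, 0.0)
        for name in ("x1", "x2"))
    probability = 1.0 / (1.0 + math.exp(-score))  # logistic link
    return {"statusCode": 200,
            "body": json.dumps({"probability": round(probability, 4)})}

# Invoking the handler locally as the gateway would:
response = predict_handler({"body": json.dumps({"x1": 1.0, "x2": 2.0})})
print(response["body"])
```

Deployed behind an API gateway, this function becomes a standalone prediction endpoint that can be updated or rolled back without touching the rest of the backend.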



Image source: Google

Every major serverless platform, including AWS Lambda, Microsoft Azure Functions, Google Cloud Functions, and IBM Cloud Functions, currently offers a fairly comprehensive portfolio of serverless machine learning offerings, from datasets to train ML models, libraries of machine learning algorithms, on-demand model training, and automated model tuning to enhance predictive capacity, to name a few. Open source serverless frameworks like Nuclio are offering new serverless features, like GPU support at scale, multi-cloud deployment, and native integration, even as they extend the benefits of serverless beyond event-driven workloads to long-lasting, parallel, and data-intensive jobs. Today, it is possible for developers to build production-ready intelligent applications using readily available services such as AWS Lambda, Amazon S3, and Amazon API Gateway.

Going forward, serverless AI pipelines are expected to drive more sophisticated automation opportunities. Rather than just triggering events in a serverless workflow, soon workflows will be able to refer back to the outcomes of every previous instance of the workflow and learn to replicate successful outcomes.


Serverless & DevOps

Nearly a third (31%) of all respondents to the serverless community survey mentioned earlier have deployed serverless to support their DevOps initiatives. Serverless can play a critical role in the transformation of DevOps, as the architecture erases the line between cloud development and operations and can help harmonize the two functions.



Image source: TechTarget

DevOps automation is one of the key benefits of serverless infrastructures. Since serverless pipelines are deployed without self-managed servers, DevOps automation becomes a relatively straightforward process using infrastructure-as-code tools and automated events. Businesses can also leverage Functions-as-a-Service to further streamline DevOps. For instance, CI/CD pipelines can be automated by mapping developers’ code check-ins to events that trigger specific functions, such as automated testing or even deployment.
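The check-in-to-pipeline mapping described above can be sketched as a small function that inspects a simplified push event and decides which stages to trigger. The event shape, branch names, and stage names are illustrative assumptions; in practice the event would arrive via a webhook or a source-control integration.

```python
# Hypothetical CI/CD trigger function: invoked once per push event,
# it maps the branch to the pipeline stages that should run. In a
# real setup each stage would itself be a function or pipeline action.
def pipeline_trigger(event):
    branch = event.get("ref", "").rsplit("/", 1)[-1]
    stages = ["run-unit-tests"]          # every check-in gets tested
    if branch == "main":
        # Only trunk check-ins proceed to build and staging deploy.
        stages += ["build-artifact", "deploy-staging"]
    return {"branch": branch, "stages": stages}

# Invoking the trigger locally with sample push events:
print(pipeline_trigger({"ref": "refs/heads/main"}))
print(pipeline_trigger({"ref": "refs/heads/feature/login"}))
```

Because the trigger only runs when a check-in happens, the pipeline's control logic itself carries no standing infrastructure cost.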

Serverless platforms also provide multiple tools for developers to monitor and streamline their CI/CD pipelines. On AWS, several services expose CI/CD metrics that can be combined with Amazon QuickSight to build custom visualizations, allowing developers to stay on top of deployments. Users can also monitor availability and performance by deploying functions that simulate user traffic to services in production. Monitoring tools such as Thundra or Amazon CloudWatch can then identify and inform users about failures or performance degradations.

Serverless can also be used to accelerate DevSecOps as its discrete functions and microservices allow for a more fine-grained approach to security.


Acknowledging the Challenges of Serverless

Serverless is still a nascent technology, even though it is evolving at a rapid pace, and several procedural and organizational challenges have yet to be addressed for successful deployment. Even organizations that have adopted a serverless model cite challenges such as staff education and integration. There are also systemic concerns: vendor lock-in, lack of adequate observability, cold start times, resource limitations in terms of code size and memory, the need for custom tooling, security risks, loss of control to third-party vendors, and even unexpected costs.

Many, if not all, of these concerns will be addressed as the technology matures, and the business value that serverless represents far outweighs even such a lengthy litany of challenges. Serverless may still be a work in progress, but it is a “logical advancement of cloud-native” with the potential to redefine application and workload performance in a cloud-first world.