Serverless Cloud Computing: Introduction, Emergence, Limitations, and Challenges — Part 01

Seralahthan
Jul 8, 2019 · 5 min read

Serverless cloud computing handles virtually all the operations related to resource management and system administration, making it easier for programmers to use the cloud. This gives developers the freedom to focus on building applications rather than the software that powers them (the serverless platform manages the backend infrastructure of the cloud).

Serverless architecture is generally stateless. It provisions resources on demand, and only incurs the cost for the resources actually used, potentially enabling organizations to scale rapidly while saving money.

So, does that mean serverless is the perfect solution for every organization? No. Serverless cloud computing is still fairly new and has its own drawbacks, including limited storage options, performance overheads, and security concerns.

Introduction to Serverless Computing: FaaS + BaaS

Serverless computing has many advantages over traditional computing platforms.

Advantages of Serverless Cloud Computing

  1. The appearance of infinite computing resources on demand.
  2. The elimination of an up-front commitment by cloud users.
  3. A pay-as-you-go model: pay for computing resources on a short-term basis, as needed (see the cost sketch after this list).
  4. Economies of scale: significantly reduced costs thanks to many very large data centers.
  5. Simplified operation and increased utilization via resource virtualization.
  6. Higher hardware utilization by multiplexing workloads from different organizations.
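To make the pay-as-you-go point (item 3) concrete, here is a minimal sketch of the billing arithmetic for a function billed purely on usage. The prices and the `monthly_cost` helper are illustrative assumptions, not actual provider rates.

```python
# Sketch of pay-as-you-go billing arithmetic.
# The prices below are hypothetical placeholders, not real provider rates.

PRICE_PER_MILLION_REQUESTS = 0.20   # hypothetical: $ per 1M invocations
PRICE_PER_GB_SECOND = 0.0000167     # hypothetical: $ per GB-second of compute

def monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Estimate monthly cost for a function billed only on actual usage."""
    request_cost = (invocations / 1_000_000) * PRICE_PER_MILLION_REQUESTS
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    compute_cost = gb_seconds * PRICE_PER_GB_SECOND
    return request_cost + compute_cost

# 3 million requests a month, 120 ms average duration, 256 MB memory:
print(f"${monthly_cost(3_000_000, 120, 256):.2f} per month")
```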

Serverless computing lifts the burden of complex operations and workloads from users through efficient multiplexing. The shortfall is that multiplexing works well for batch-style workloads such as MapReduce or high-performance computing, which can fully utilize the instances they allocate, but it works less well for stateful services, such as enterprise software like a database management system ported to the cloud.

There are two main competing approaches to virtualization in the cloud:

  • Low-level virtual machine approach
  • High-level application domain-specific platform approach

Amazon EC2 represents the low-level virtual machine approach. An EC2 instance looks much like physical hardware, and users can control nearly the entire software stack, from the kernel upward.

Google App Engine represents the high-level application domain-specific platform approach. This approach enforces an application structure with a clean separation between a stateless computation tier and a stateful storage tier. App Engine handles automatic scaling and high availability gracefully, as long as applications adhere to this separation constraint.
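The sketch below illustrates what that separation constraint looks like in code. The `KeyValueStore` client is a hypothetical stand-in for any managed datastore, not a specific App Engine API: the handler keeps no state of its own, so the platform is free to scale it out or restart it at any time.

```python
# Sketch: a stateless computation tier backed by a stateful storage tier.
# `KeyValueStore` is a hypothetical stand-in for a managed datastore client;
# the real App Engine datastore API differs.

class KeyValueStore:
    """Placeholder for a managed, replicated storage service."""
    def __init__(self):
        self._data = {}

    def get(self, key, default=0):
        return self._data.get(key, default)

    def put(self, key, value):
        self._data[key] = value

store = KeyValueStore()  # stateful tier, managed by the cloud provider

def handle_request(user_id):
    # Stateless computation tier: no instance variables, no local counters.
    # Any state the request needs lives in the storage tier.
    visits = store.get(user_id) + 1
    store.put(user_id, visits)
    return {"user": user_id, "visits": visits}

print(handle_request("alice"))  # {'user': 'alice', 'visits': 1}
```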

Common issues that must be addressed when setting up a cloud environment for users:

  1. Redundancy for availability, so that a single machine failure doesn’t take down the service.
  2. Geographic distribution of redundant copies to preserve the service in case of disaster.
  3. Load balancing and request routing to efficiently utilize resources.
  4. Autoscaling in response to changes in load to scale up or down the system.
  5. Monitoring to make sure the service is still running well.
  6. Logging to record messages needed for debugging or performance tuning.
  7. System upgrades, including security patching.
  8. Migration to new instances as they become available.

Some of these issues take many steps to address.

For example, autoscaling requires determining the need to scale; picking the type and number of servers to use; requesting the servers; waiting for them to come online; configuring them with the application; confirming that no errors occurred; instrumenting them with monitoring tools; and sending traffic to them to test them.
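To show how many moving parts hide behind "autoscaling", here is a rough sketch of the control loop a cloud user would otherwise have to run themselves. The `request_server`, `configure_and_monitor`, and `release_server` calls are entirely hypothetical stubs; real providers expose different APIs.

```python
import random
import time

# Hypothetical cloud API stubs -- real providers expose different interfaces.
def current_load(servers):
    """Average utilization across the fleet (stubbed with a random value)."""
    return random.uniform(0.2, 0.9)

def request_server():
    """Ask the provider for a new instance."""
    return {"id": random.randint(1000, 9999)}

def configure_and_monitor(server):
    """Install the application, verify it, and wire up monitoring."""
    server["ready"] = True

def release_server(server):
    """Return the instance to the provider."""
    pass

servers = [request_server()]

for _ in range(5):                       # a few iterations of the control loop
    load = current_load(servers)
    if load > 0.8:                       # decide that we need to scale up
        new = request_server()           # pick and request a server
        configure_and_monitor(new)       # configure, verify, instrument
        servers.append(new)              # start routing traffic to it
    elif load < 0.3 and len(servers) > 1:
        release_server(servers.pop())    # scale down when the load drops
    time.sleep(1)                        # wait before the next measurement
```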

Modern-day Serverless Computing = FaaS + BaaS

In the early era of cloud computing, the low-level virtual machine-based approach was dominant, as most users were trying to migrate their in-house environments to the cloud. This forced developers to engage in DevOps activities such as maintaining infrastructure and configuring auto-scaling, which hindered the development process.

Also, in the beginning, the application tier was tightly coupled with the computation and storage tiers. This undercut the very idea of "serverless computing". Serverless computing is arguably an oxymoron: you are still using servers to compute. The name presumably stuck because it suggests that the cloud user simply writes the code and leaves all the server provisioning and administration tasks to the cloud provider.

Due to the tight coupling between computation and storage, databases need to reserve instances long term. However, their workloads can be bursty, which results in low resource utilization.

Recognition of these needs led to a new option from Amazon in 2015 called the AWS Lambda service.

The AWS Lambda service offers the core of serverless cloud computing: functionality packaged as a function, following the FaaS (Function as a Service) model. Cloud platforms also provide specialized serverless frameworks that cater to specific application requirements as BaaS (Backend as a Service) offerings.
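As a concrete example of the FaaS model, the sketch below follows the general shape of an AWS Lambda Python handler, which receives an `event` payload and a `context` object from the platform. The `"name"` field in the event is an assumption for illustration, not a fixed trigger schema.

```python
import json

def lambda_handler(event, context):
    """A minimal FaaS-style function: the platform provisions, runs,
    scales, and tears down the execution environment on demand."""
    # `event` carries the trigger payload; the "name" field here is an
    # assumption for illustration, not part of any fixed schema.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local test invocation (on the real platform, event and context are supplied):
if __name__ == "__main__":
    print(lambda_handler({"name": "serverless"}, None))
```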

BaaS originated as a mobile-centric cloud framework and has grown to encompass any application-specific serverless cloud service, such as serverless databases and serverless big data processing frameworks.

Different cloud providers have different FaaS offerings:

  • Amazon Web Services — AWS Lambda
  • Google Cloud Platform — Google Cloud Functions
  • IBM Cloud — IBM Cloud Functions
  • Microsoft Azure — Azure Functions

It is worth noting that FaaS and PaaS (Platform as a Service) are two different approaches. The main difference is that an application deployed as PaaS typically runs on at least one server at all times, whereas with FaaS it may not be running at all until the function needs to be executed. The platform starts the function within a few milliseconds and shuts it down afterwards.

We will analyze the difference between FaaS and PaaS in depth in the upcoming blogs of this series.

In simple terms, serverless computing = FaaS + BaaS. For a service to be considered serverless, it must scale automatically with no need for explicit provisioning, and it must be billed based on usage.

Let's wrap it up here for now. We will cover the Emergence, Limitations, and Challenges of serverless cloud computing in the upcoming blogs.

