
Serverless computing

Serverless computing is an event-driven application design and deployment paradigm in which computing resources are provided as scalable cloud services. In traditional application deployments, the server’s computing resources represent fixed and recurring costs, regardless of the amount of computing work that is actually being performed by the server. In a serverless computing deployment, the cloud customer pays only for service usage; there is no cost associated with idle time or downtime.

Reallocation 

Serverless computing does not eliminate servers, but instead seeks to emphasize the idea that computing resource considerations can be moved into the background during the design process. Developers can drop in code, create backend applications, create event handling routines and process data – all without worrying about servers, virtual machines (VMs) or the underlying compute resources because the actual hardware and infrastructure involved are all maintained by the provider.

The term serverless computing is often associated with the NoOps movement and the concept may also be referred to as serverless cloud computing, function as a service (FaaS) or runtime as a service (RaaS).

How serverless computing works

With serverless computing, developers don’t have to deal with managing machine instances in the cloud. Instead, they can run code on cloud servers without having to configure or maintain the servers at all. Pricing is based on the actual amount of resources consumed by an application, rather than on pre-purchased units of capacity.
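
As a rough illustration of the pay-per-use model, the sketch below estimates a monthly bill from invocation count, duration and memory. The rates are placeholder numbers assumed for the example; real prices vary by provider.

```python
# Hypothetical pay-per-use cost estimate for a serverless function.
# The rates below are illustrative placeholders, not any provider's real pricing.

PRICE_PER_MILLION_REQUESTS = 0.20   # USD per 1,000,000 invocations (assumed)
PRICE_PER_GB_SECOND = 0.0000166667  # USD per GB-second of compute (assumed)

def monthly_cost(invocations: int, avg_duration_ms: float, memory_mb: int) -> float:
    """Estimate a monthly bill: you pay only for requests actually served and
    for the memory-time your code actually runs, never for idle servers."""
    request_cost = invocations / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    compute_cost = gb_seconds * PRICE_PER_GB_SECOND
    return request_cost + compute_cost

# Three million 120 ms requests at 256 MB come to only a few dollars;
# a month with zero invocations costs exactly zero.
print(f"${monthly_cost(3_000_000, 120, 256):.2f}")
print(f"${monthly_cost(0, 120, 256):.2f}")
```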

Typically, if developers host their applications on virtual servers based in the cloud, they have to set up and manage those servers, install operating systems on them, monitor them and continually update the software.

With the serverless model, a developer can write a function in his or her favourite programming language and post it to a serverless platform. The cloud service provider manages the infrastructure and the software and maps the function to an API endpoint, transparently scaling function instances on demand.
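
As a minimal sketch, assuming an AWS Lambda-style Python handler behind an HTTP API endpoint (other platforms use different event shapes and signatures), the posted function can be as small as this:

```python
import json

def handler(event, context):
    """Entry point invoked by the platform for each request.
    There is no server to configure: the provider supplies the runtime,
    routes the API call to this function, and scales instances on demand."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```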

Role of serverless computing in digital transformation

Serverless computing plays an important part in digital transformation. First, it enables developers to be more productive by helping them focus on writing code that has business value, without having to worry about the underlying infrastructure that will support the code. Regardless of vertical industry or company size, a serverless computing strategy can help increase developer productivity by eliminating management overhead.  

Features of a serverless computing software development environment include:

  • zero server management
  • auto-scaling to meet changing traffic demands
  • managed integrated security

What to look for in a serverless architecture

Organizations should look for serverless platforms that help them develop applications end-to-end, tapping services across databases, storage, messaging, data analytics, machine learning and smart assistants.

Some serverless cloud services provide scalability and cost savings but introduce additional complexities, such as constrained runtimes or vendor lock-in; these trade-offs are also important considerations when choosing a serverless architecture.

Developers often face a hard trade-off between the ease and velocity of serverless and the flexibility and portability of containers. This is why most organizations benefit from a full-stack approach, rather than limiting serverless to compute functions.

Advantages and disadvantages of serverless computing

The advantages of serverless computing include:

  • Cost-effective - Users and developers just have to pay for the time that code is running on a serverless compute platform. They don’t have to pay for virtual machines sitting idle.
  • Easy to deploy - Developers can deploy apps in hours or days rather than weeks or months.
  • Auto scaling - Cloud providers handle scaling instances up to meet demand and spinning them down when the code is not running.
  • Increased developer productivity - Developers can spend most of their time writing and developing apps, instead of dealing with servers and runtimes.

Disadvantages of serverless computing include:

  • Vendor lock-in - Switching cloud providers may be difficult since the way serverless services are delivered may differ from one vendor to another.
  • Not efficient for long-running apps - Long-running tasks can sometimes cost much more in a serverless environment than running the same workload on a virtual machine or dedicated server.
  • Latency - There is a delay the first time a serverless platform handles a function, often known as a cold start.
  • Debugging is more difficult - Because a serverless instance creates a new version of itself each time it spins up, it’s hard to amass the data needed to debug and fix a serverless function.

Serverless computing use cases

There are a number of use cases for serverless computing, including:

  • Event-triggered computing - For scenarios that involve numerous devices accessing various file types, such as mobile phones and PCs uploading videos, text files and images.
  • Internet of things (IoT) data processing - Serverless computing provides a way to combine and analyse data from a variety of devices and then trigger the desired events, offering a highly functional, less expensive way to manage IoT.
  • Backend tasks for mobile apps or websites - A serverless function can take a request from the front end of a site or application, such as a request for information from a user database, retrieve the information and hand it back to the front end (see the sketch after this list).
  • High volume background processes - Serverless can be used to transfer data to long-term storage, convert, process and analyze the data and move metrics to an analytics service.
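
As an illustration of the backend-task use case above, here is a minimal sketch of a function that looks up a user record and returns it to the front end. The table name, event shape and the use of boto3 with DynamoDB are assumptions made for the example, not a prescribed implementation.

```python
import json
import boto3

# Hypothetical user table; the name and key schema are assumptions for this sketch.
table = boto3.resource("dynamodb").Table("users")

def handler(event, context):
    """Backend task for a mobile app or website: take a request from the
    front end, fetch the user's record, and hand it back as JSON."""
    user_id = event["pathParameters"]["user_id"]
    result = table.get_item(Key={"user_id": user_id})
    item = result.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "user not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item, default=str),
    }
```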
