There are a lot of blog posts and projects related to Serverless concepts out there, and I am hesitant to add another introductory post to the mix, yet here I am writing one. As much content as there already is on Serverless, I think it's useful to explain some of the overall concepts before we move on to more specific details in the context of this blog.
Also, we have to start somewhere and here seems a logical place.
Rather than try to redefine things, I will present my understanding of what Serverless is, how I see it fitting into today's and tomorrow's solutions landscape, and how users of Oracle Cloud Infrastructure can leverage it.
Serverless is an architecture that allows developers to focus on writing code, abstracting away the system and platform concerns involved in building, deploying and scaling their solutions. Most of us have surely seen the memes declaring a simple truth: there are servers in Serverless. As much as we risk wearing that phrase out, it's worth pausing on it for a few reasons.
There are servers in Serverless, and of course, no one ever said there weren't. While we're at it, the cloud is just someone else's computer. There's no magic in the cloud, after all.
The cloud is about powerful abstractions which simplify our use of complex resources. Serverless is the latest in a long line of abstractions in computing. There's no magic, and that lets us bring things back down to earth a bit.
One of the most valuable features of a serverless architecture is that you pay only for execution time. If your code is not executing, the meter is not running. Another is that scaling is the responsibility of the serverless platform you're leveraging. These abstractions allow us as developers to forego in-depth knowledge of how the underlying components work and to use those resources far more effectively than ever before. So yes, there are servers, but they become far simpler to account for and should cost a whole lot less to use.
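To make the pay-per-execution point concrete, here's a rough back-of-the-envelope comparison. Every rate and number below is a made-up assumption for illustration, not any provider's actual pricing:

```python
# Hypothetical cost comparison: always-on VM vs. pay-per-execution.
# All rates and workload numbers below are illustrative assumptions.

# Always-on VM: billed every hour, whether or not it serves traffic.
vm_hourly_rate = 0.05            # dollars per hour (assumed)
hours_per_month = 730
vm_monthly_cost = vm_hourly_rate * hours_per_month

# Serverless: billed only while the function actually runs,
# commonly metered in GB-seconds (memory size x execution time).
invocations_per_month = 1_000_000
avg_duration_seconds = 0.2
memory_gb = 0.25
rate_per_gb_second = 0.0000166   # dollars per GB-second (assumed)

gb_seconds = invocations_per_month * avg_duration_seconds * memory_gb
serverless_monthly_cost = gb_seconds * rate_per_gb_second

print(f"Always-on VM: ${vm_monthly_cost:.2f}/month")
print(f"Serverless:   ${serverless_monthly_cost:.2f}/month")
```

With these assumed numbers, a million short invocations a month costs well under a dollar, versus tens of dollars for a VM that idles most of the time. The point isn't the specific figures; it's that the meter only runs while your code does.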
I've used the term abstraction a few times; let's briefly dig into it. In computing, abstractions hide the implementation details of a lower level. Typically, each new abstraction requires less domain-specific knowledge to make use of the underlying system. To unroll this a bit more, let's reach back to the early days of computing and bring that context forward through the age of the PC and into the cloud computing solutions of today.
Back in the very early days of modern computing, it was primarily about designing logic circuitry with vacuum tubes, and later transistors, and implementing algorithms to make use of that circuitry. The advent of integrated circuits and later microprocessors changed things considerably, making it simpler to build systems with off-the-shelf components. The resulting systems, including personal computers, were more easily designed and built, and more readily adopted and effectively used by everyday consumers.
Similarly, on the software side of things, early computers each had their own unique machine language. These gave way to assembly languages, which generate machine code for a given platform. Assembly in turn gave rise to a third generation of languages, such as C, that abstract away the assembly for a given system, allowing code to be ported across platforms. Today, there is a plethora of expressive and compact languages that we all know and love, even further removed from the underlying system.
The cloud represents another abstraction by making compute, networking and storage resources indirectly available to consumers. Cloud providers continue to abstract these components, offering more resources with less complexity. Where we once rented time on remote hosts and virtual machines, we now consume complex managed resources that are single-button deployable. Virtualization, containerization and other computing paradigms make this possible.
Serverless is a logical next step in abstraction. Deploying applications and services in VMs or containers in the cloud can be much more cost-effective than buying and maintaining racks of hardware on premises. However, VMs must be created and monitored, and their software patched and maintained. Containers help (as a packaging solution), but any meaningful application deployed in containers should live within the context of a more complex orchestration solution. As things become more streamlined, new complexities arise.
When leveraging a serverless architecture, these complexities are handled by the underlying platform. The platform could be a FaaS platform that you deploy yourself, a managed Functions product offered by your cloud provider, or something else; regardless, the platform handles the complexity. Developers implement functions which are triggered by events and are grouped into applications. The functions are deployed and executed as needed, scale when needed and otherwise run merrily along without the developer thinking much past the end of their own code.
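The shape of such a function is deliberately simple: a handler that receives an event and returns a result, with everything else left to the platform. Here is a minimal sketch in Python; the handler signature and event fields are illustrative assumptions, not any specific platform's API:

```python
import json

def handler(event: dict) -> dict:
    """A hypothetical event-triggered function: receive an event,
    do one small piece of work, return a response.

    Note what's absent: no server setup, no scaling logic, no
    deployment concerns -- the platform is assumed to handle all of that.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Simulate the platform invoking the function with an incoming event.
print(handler({"name": "serverless"}))
```

In a real deployment, the event might originate from an HTTP request, a message queue, or a storage notification; the function itself stays the same either way, which is much of the appeal.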
For some introductory material, check out the Fn Project. Fn is a leading open source, cloud-agnostic functions platform and an excellent environment both for exploring how to create serverless applications and for deploying such a platform yourself. While you're considering the overall space, it's worth taking a look at this project; walking through a few of its tutorials gives quick exposure to the concepts at play.
We will be adding more content here related to various aspects of Serverless, from building serverless applications to deploying your own functions platform with open source software, and we'll often dig into specific topics. Many have been practicing in this space for a while now, yet in some ways we're still at the start of something big that will be with us for a long while to come.