Is Function a new container? Why is there so much buzz around serverless computing and platforms like OpenFaaS?
2018 has been a year of “shipping code and moving quickly”. It has been a year in which developers did not have to think about infrastructure or worry about scaling, load balancing and so on. What developers need is to ship code, run it and move quickly, without any infrastructure concerns.
If you look back 20 years, the first wave of computing started with x86 servers inside the datacenter. That very first wave democratized computing by making it more and more accessible, and that is when the actual IT revolution started: small and medium businesses (SMBs) could afford to run servers in their own environments. The second wave of computing started with virtualization, and the credit goes to VMware, who introduced a very affordable hypervisor and made it possible to run multiple VMs on a single server. The third and more recent wave is containerization, which brought a new mechanism for packaging and deploying applications, paving the way for emerging patterns like microservices.
The most recent and very exciting wave of computing is serverless computing, a paradigm in which we never have to deal with infrastructure. The name “serverless” is something of a misnomer, since serverless computing still requires servers. It is used because server management and capacity-planning decisions are completely hidden from the developer or operator. Serverless code can be used in conjunction with code deployed in traditional styles, such as microservices. Alternatively, applications can be written to be purely serverless and use no provisioned servers at all.
The whole idea of serverless is that even though there are still servers, we don't really need to care about them anymore. Serverless means you do not have a server to manage. Nothing is running; your code just lies there. When somebody calls that particular code, it gets triggered and executed. You write functions, and those functions are executed the instant something comes in and triggers them.
What is Function-as-a-Service (FaaS)?
Under serverless, we don't take an entire application and deploy it the way we would under PaaS. Instead, we take tiny code snippets written as stateless functions and deploy them into this environment. So instead of packaging a large application, you break the application down one function at a time: you write one function, test it and upload it to the serverless computing platform. After you have uploaded a number of independent, isolated, autonomous functions, you decide how to connect them all to compose an application. Sounds like microservices? Well, that's what it is. It could rightly be called “nano-services”: even more stateless than microservices.
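To make the idea concrete, here is a minimal sketch of what such a stateless function might look like. This is illustrative only; the `handle(req)` name follows a common FaaS handler convention and is not taken from any specific platform:

```python
# A minimal sketch of a stateless FaaS-style function (illustrative only;
# the `handle`/`req` naming is a common FaaS convention, not a fixed API).
import json

def handle(req):
    """Take a request body, do one small thing, return a response string."""
    data = json.loads(req)
    # The function keeps no state between invocations: everything it needs
    # arrives in the request, and everything it produces is returned.
    return json.dumps({"greeting": "Hello, " + data.get("name", "world")})
```

The point is that each function is tiny, independent and stateless, so the platform is free to start, stop and replicate it on demand.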
Is Function a Special Type of Container?
Functions are small bits of code that do one thing well and are easy to understand and maintain. They are deployed as a single unit of code into your FaaS platform, and the platform then deals with provisioning your infrastructure, scaling, reliability, billing and security, in whatever language you care to use.
A function can rightly be called a special type of container with properties like:
- Short Running
- Single Purpose
OpenFaaS is a framework for building serverless functions with Docker and Kubernetes, with first-class support for metrics. Functions in OpenFaaS are actually Docker containers. It is a mature open source serverless platform that can run any CLI-driven binary program embedded in a Docker container, and as an open source project it has gained large-scale adoption within the community.
OpenFaaS is an independent project created by Alex Ellis which is now being built and shaped by a growing community of contributors.
What does OpenFaaS provide?
OpenFaaS provides:
- Easy install (1 minute!) – ease of use through a UI portal and one-click install
- Multiple language support – Write functions in any language for Linux or Windows and package in Docker/OCI image format
- Transparent autoscaling – Auto-scales as demand increases
- No infrastructure headaches/concerns
- Support for Docker Swarm and Kubernetes
- Console UI to enable deployment and invocation of functions
- Integrated with Prometheus Alerts
- Async support
- A marketplace of ready-made serverless functions (the Function Store)
- RESTful API
- CLI tools to build & deploy functions to the cluster.
OpenFaaS Stack – Under the Hood
The OpenFaaS stack consists of the following components:
- API Gateway
- Function Watchdog
- Prometheus
- Orchestration engine (Docker Swarm or Kubernetes)
The API Gateway and the Prometheus instance run as services, while the function watchdog runs inside each function container. The picture below depicts the API Gateway and Prometheus stack under docker-compose.yml, brought up by the OpenFaaS deploy_stack.sh script.
Let us dive into each component separately:
API Gateway
The API Gateway is where you define all of your functions. It is a RESTful microservice that provides an external route into your functions and collects Cloud Native metrics through Prometheus. The API Gateway scales functions according to demand by altering the service replica count through the Docker Swarm or Kubernetes API. A UI is baked in, allowing you to invoke functions in your browser and create new ones as needed.
Figure 2 – Swarm overlay connecting the API Gateway to function containers
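Since the gateway exposes each function over a plain REST route, invoking a function is just an HTTP request. The sketch below assumes a gateway at 127.0.0.1:8080 and the /function/&lt;name&gt; route (the same URL shape the deploy step prints later in this post); the helper names are my own:

```python
# Sketch: invoking a deployed function through the OpenFaaS API Gateway.
# Assumes a gateway on 127.0.0.1:8080 exposing functions at /function/<name>.
from urllib import request

GATEWAY = "http://127.0.0.1:8080"

def invoke_url(function_name):
    """Build the gateway route for a named function."""
    return "%s/function/%s" % (GATEWAY, function_name)

def invoke(function_name, body=b""):
    """POST a request body to the function and return the response text."""
    req = request.Request(invoke_url(function_name), data=body, method="POST")
    with request.urlopen(req) as resp:
        return resp.read().decode()

if __name__ == "__main__":
    # Requires a running OpenFaaS stack, e.g.: invoke("func_echoit", b"hi")
    print(invoke_url("func_echoit"))
```

Anything that can speak HTTP, from curl to a CI pipeline, can call functions the same way.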
Function Watchdog (fwatchdog)
The Function Watchdog is embedded in every function container and is what allows any container to become serverless. It is a tiny Golang web server that provides an unmanaged, generic interface between the outside world and your function. Its job is to marshal an HTTP request accepted on the API Gateway and to invoke your chosen application.
Each function container consists of the Function Watchdog (fwatchdog) and a function program written in any language (Python/Go/C#). The Function Watchdog is the entrypoint, allowing HTTP requests to be forwarded to the target process via STDIN. The response is sent back to the caller by writing to STDOUT from your application.
It is important to note that the Dockerfile describing a function container must have the fprocess environment variable pointing to the function program's name and arguments.
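This STDIN/STDOUT contract means an fprocess can be almost any program. Here is a sketch of what a watchdog-friendly Python fprocess might look like; the upper-casing logic and the file name implied by the comment are purely illustrative:

```python
# Sketch of a watchdog-friendly "fprocess": the watchdog runs the program
# named in the fprocess env var, pipes the HTTP request body to its STDIN,
# and returns whatever the program writes to STDOUT as the HTTP response.
import sys

def handle(req):
    # Hypothetical example logic: echo the request body back, upper-cased.
    return req.upper()

if __name__ == "__main__":
    # Read the whole request body from STDIN, write the response to STDOUT.
    sys.stdout.write(handle(sys.stdin.read()))
```

In the Dockerfile you would then point fprocess at this program, e.g. something like ENV fprocess="python3 index.py" (the file name here is an assumption for illustration).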
Prometheus
Prometheus is the open-source systems monitoring and alerting toolkit used in this stack. It underpins the complete stack, collecting the statistics used to build dashboards and drive metrics-based decisions. Whenever certain functions receive high traffic, the stack scales them out for you automatically using the Docker Swarm or Kubernetes API.
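To give a feel for metrics-driven scaling, here is a deliberately simplified toy calculation. This is not OpenFaaS's actual algorithm (which reacts to Prometheus alerts firing against invocation-rate rules); every name and number below is an illustrative assumption:

```python
# Illustrative only: a toy replica calculation in the spirit of
# metrics-driven auto-scaling. NOT OpenFaaS's real algorithm.
import math

def target_replicas(requests_per_sec, per_replica_capacity,
                    min_replicas=1, max_replicas=20):
    """Clamp the replica count needed to absorb the observed request rate."""
    needed = math.ceil(requests_per_sec / per_replica_capacity)
    return max(min_replicas, min(max_replicas, needed))
```

The real value of driving this from Prometheus is that scaling decisions are based on observed traffic rather than guesses made at deploy time.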
Orchestration Engines (Swarm & Kubernetes) & Docker Platform
All the above components run on top of a Docker Swarm or Kubernetes orchestration engine. The container runtime can be any modern version of Docker or containerd.
Want to Test Drive OpenFaaS?
In this demo, I will show how to get started with OpenFaaS in detail. We will leverage the Play with Docker (PWD) platform, which is the fastest way to set up a 5-node Swarm cluster. In case you are completely new to play-with-docker, follow the step-by-step instructions below –
- Open https://labs.play-with-docker.com/
- Click on login & start
- Click on the tool (template) icon near the settings on the left side of the PWD interface
- Choose 3 managers and 2 workers and let it bring up the 5-node cluster
Setting up Visualizer tool:
cd openfaas/visualizer/
docker-compose up -d
$ docker ps
CONTAINER ID   IMAGE                             COMMAND       CREATED          STATUS          PORTS                    NAMES
05b89b6b8aa9   dockersamples/visualizer:stable   "npm start"   56 seconds ago   Up 55 seconds   0.0.0.0:8085->8080/tcp   visualizer_visualizer_1
This sets up the visualizer tool on port 8085:
As of now, there is no service up and running on the Swarm, hence it will show up blank.
Cloning the OpenFaas Repository
git clone https://github.com/openfaas/faas
cd faas
./deploy_stack.sh
This single script brings up the complete OpenFaaS stack.
Click on port 8080, which appears at the top of the PWD screen, and it will redirect you to the OpenFaaS UI page as shown below:
As directed by the UI page, let us head over to the CLI and get faas-cli installed:
curl -sSL https://cli.openfaas.com | sudo sh
$ faas-cli list
Function         Invocations    Replicas
func_wordcount   0              1
func_hubstats    0              1
func_base64      0              1
func_echoit      0              1
func_markdown    0              1
tcpdump          8              1
func_nodeinfo    1              1
OpenFaaS by default comes with a few functions already baked in, as shown below:
All you need to do is choose a function (like func_nodeinfo) and click “Invoke” on the right-hand side; it will display the node information.
Building Your First Serverless Function
Before we wrap up this blog, let us try building a serverless function called “retweet-bot”. This function retweets all tweets containing your search term. Let's make this happen –
Clone this repository
In case you missed it earlier, clone the repository –
Writing Retweet Function
mkdir -p ~/retweet && \
cd ~/retweet
Create the “retweet” Python function using the CLI:
root@ubuntu18:~/retweet# faas-cli new --lang python retweet
Folder: retweet created.
  ___                   _____           ____
 / _ \ _ __   ___ _ __ |  ___|_ _  __ _/ ___|
| | | | '_ \ / _ \ '_ \| |_ / _` |/ _` \___ \
| |_| | |_) |  __/ | | |  _| (_| | (_| |___) |
 \___/| .__/ \___|_| |_|_|  \__,_|\__,_|____/
      |_|

Function created in folder: retweet
Stack file written: retweet.yml
root@ubuntu18:~/retweet#
This creates 3 major files: 2 of them under the retweet directory and 1 YAML stack file, as shown:
retweet# tree
.
├── retweet.py
└── requirements.txt

0 directories, 2 files
Replace handler.py and requirements.txt with the corresponding files from the repository, and add the config file in the same location.
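The repository's handler talks to the Twitter API, so it is not reproduced here, but the core decision it makes is simple: retweet a tweet only if it contains the search term and has not been retweeted already. The sketch below is a hypothetical reconstruction of that filtering step; all names are invented for illustration:

```python
# Hypothetical sketch of the retweet bot's core filtering logic (the real
# handler in the repository calls the Twitter API; these names are invented).
def should_retweet(tweet_text, search_term, already_retweeted=False):
    """Retweet only tweets that mention the search term and are new to us."""
    return (not already_retweeted) and search_term.lower() in tweet_text.lower()

def pick_retweets(tweets, search_term):
    """Filter a batch of tweet texts down to the ones worth retweeting."""
    return [t for t in tweets if should_retweet(t, search_term)]
```

Keeping this logic as a pure function makes the serverless handler trivially testable without touching Twitter at all.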
Displaying the contents of retweet.yml:
root@ubuntu18:~/retweet# cat retweet.yml
provider:
  name: faas
  gateway: http://127.0.0.1:8080
functions:
  retweet:
    lang: python
    handler: ./retweet
    image: retweet
Building the Function
cd ..
faas-cli build -f ./retweet.yml
Verifying the Image
docker images | grep retweet
ajeetraina/retweet   latest   027557a5185d   About a minute ago   83MB
Deploying the Retweet Function
faas-cli deploy -f ./retweet.yml
Deploying: retweet.
Deployed.
200 OK
URL: http://127.0.0.1:8080/function/retweet
Now open up localhost:8080/ui and look for the brand new retweet function. Click on Invoke, and there you will find the retweet bot active, displaying your hashtags.
Hurray! We have built our first retweet serverless function in just 2 minutes.
If you want to learn more about OpenFaaS, head over to https://docs.openfaas.com.