When deploying cloud native applications across an edge footprint, application providers must deal with multiple edge locations that have constrained compute and storage capacity.

At the edge there is no infinite scale-out elasticity as in public clouds, so application providers often have to design for individual edge outages as well as demand peaks (“hotspots”) that can overrun edge node capacity. Edge Mesh offers a service overlay that routes end points to nodes in the mesh based on the service requested, proximity, node load, and service policy. As a result, edge applications are deployed across a scaled-out cloud network that is resilient to outages and demand hotspots. Because it is neutral to the network protocol end points use to access the service (HTTP, SSL, MQTT, etc.), it is designed to introduce minimal latency.
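The routing decision described above can be sketched roughly as follows. This is a minimal illustration, not Edge Mesh's actual algorithm: the node attributes, the load threshold, and the proximity/load weighting are all assumptions made for the example.

```python
# Hypothetical sketch of mesh node selection based on health, load,
# proximity, and service policy. All names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class EdgeNode:
    name: str
    distance_km: float   # proximity of the end point to the node
    load: float          # 0.0 (idle) .. 1.0 (saturated)
    healthy: bool        # node is up and serving

def pick_node(nodes, service_policy_allows):
    """Route to the closest healthy, policy-compliant node that is not a hotspot."""
    candidates = [
        n for n in nodes
        if n.healthy and n.load < 0.9 and service_policy_allows(n)
    ]
    if not candidates:
        return None  # fail over to a higher tier (region / public cloud)
    # Weight proximity against load; lower is better for both.
    return min(candidates, key=lambda n: n.distance_km * (1.0 + n.load))

nodes = [
    EdgeNode("edge-a", distance_km=5, load=0.95, healthy=True),   # demand hotspot
    EdgeNode("edge-b", distance_km=20, load=0.30, healthy=True),
    EdgeNode("edge-c", distance_km=8, load=0.40, healthy=False),  # outage
]
best = pick_node(nodes, service_policy_allows=lambda n: True)  # -> edge-b
```

Note how the overloaded and failed nodes are simply excluded from the candidate set, which is what makes the mesh resilient to outages and hotspots from the end point's perspective.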

How It Works

The Service Overlay provides mobility across the same deployment tier (e.g. across multiple edge nodes in the same edge location, or across different telco central offices in the same region), as well as up and down between the edge tier and telco/public clouds.

The mobility functions are available for “serverless” functions, Kubernetes pods, or individual Linux containers. From a deployment and operations point of view, it provides the same “serverless” deployment experience as public clouds.

Application providers publish a microservice workload to the central API point together with a deployment policy and metadata; from that point on, the microservice becomes accessible to end points in the mesh. The deployment policy spells out the workload's deployment constraints within the mesh, including geo-restrictions, failover policies, and latency/proximity requirements.
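A publish request of this kind might look like the sketch below. The field names, the policy schema, and the endpoint path are assumptions for illustration; Edge Mesh's real API surface is not specified here.

```python
# Illustrative workload document with a deployment policy covering the three
# constraint types mentioned above: geo-restrictions, failover, and latency.
# The schema and endpoint are hypothetical.
import json

workload = {
    "service": "sensor-ingest",
    "image": "registry.example.com/sensor-ingest:1.4.2",
    "metadata": {"owner": "acme-iot", "protocol": "MQTT"},
    "policy": {
        "geo_restrictions": ["EU"],      # deploy only in EU edge locations
        "failover": "nearest-region",    # where traffic shifts on node outage
        "max_latency_ms": 20,            # proximity requirement for placement
    },
}

# Publishing would POST this document to the central API point, e.g.:
#   POST https://api.example.com/v1/workloads
body = json.dumps(workload)
```

Once accepted, the mesh owns placement: the provider never addresses individual edge nodes, only the policy that constrains where the workload may run.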

As end points access the microservice, the mesh makes centralized transactional reporting available to the microservice provider. Both provisioning and reporting are accessible via OpenAPI from the cloud-based API gateway.
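A consumer of that reporting feed might aggregate per-transaction records like this. The record shape is an assumption; the actual reporting schema exposed by the gateway is not documented here.

```python
# Hypothetical per-transaction records as a reporting endpoint might return
# them, aggregated into per-node counts and an error rate.
from collections import Counter

records = [
    {"service": "sensor-ingest", "node": "edge-a", "status": 200, "latency_ms": 12},
    {"service": "sensor-ingest", "node": "edge-b", "status": 200, "latency_ms": 18},
    {"service": "sensor-ingest", "node": "edge-a", "status": 503, "latency_ms": 0},
]

by_node = Counter(r["node"] for r in records)          # transactions per node
error_rate = sum(r["status"] >= 500 for r in records) / len(records)
```

Because reporting is centralized, the provider sees one consolidated view even though transactions were served by many independent edge nodes.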

Committed to Open Source

Edge Mesh services are OpenAPI-based and easily integrated into 3rd party CI/CD toolchains, high-level orchestration platforms, and automation workflows.

The platform architecture relies heavily on open source, giving users access to a vibrant cloud native ecosystem in addition to an installed set of cloud native services pre-integrated with the platform to keep complexity and time-to-deployment low.

With the service, edge cloud operators can easily deploy a variety of open source solutions from 3rd party software repositories. It also allows cloud native SaaS to be deployed across private on-premise compute footprints.