Cloud native computing represents the next paradigm shift in IT. The industry is in the midst of a shift from on-premise compute infrastructure to a larger, global network of vendor-owned cloud data centers. The critical considerations for cloud migration include cost and time to market as well as security and compliance. With cloud providers doing a great job of delivering secure and compliant platforms, development organizations can focus on creating innovative, scalable solutions and shipping them ever faster.
When it comes to deciding on the compute portion of their cloud-native apps, developers and enterprises face choices ranging from infrastructure as a service (IaaS) to container orchestrators such as Kubernetes to serverless functions. Two of the most critical decision criteria are increasing developer productivity and reducing overall costs.
Container orchestrators represent improvements in both areas. Creating, building, and testing Docker containers is quick and easy. Developers can start from existing templates and deploy containers to many available hosting platforms. On the cost side, container orchestrators improve CPU utilization of cloud-hosted VMs by running different application roles in separate containers on the same VM.
Serverless functions improve on this model in a few important ways. They increase developer productivity by breaking code into smaller pieces and by managing the dependency layers that containers still require. They also reduce cost through a pay-per-execution pricing model, in which utilization is effectively 100%. This is a significant improvement over containers and a dramatic one over IaaS.
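The pay-per-execution argument can be made concrete with a back-of-the-envelope sketch. Note that every rate below is a hypothetical placeholder chosen for illustration, not an actual vendor price:

```python
# Illustrative comparison of always-on vs. pay-per-execution billing.
# All rates are hypothetical placeholders, not actual vendor prices.

def vm_monthly_cost(hourly_rate: float, hours: float = 730.0) -> float:
    """An always-on VM bills for every hour, busy or idle."""
    return hourly_rate * hours

def per_execution_monthly_cost(invocations: int, price_per_invocation: float) -> float:
    """Pay-per-execution bills only for work actually performed."""
    return invocations * price_per_invocation

# A lightly loaded VM pays for idle time; functions pay only per call.
vm = vm_monthly_cost(hourly_rate=0.10)  # $73.00/month regardless of load
fn = per_execution_monthly_cost(invocations=1_000_000,
                                price_per_invocation=0.0000004)  # $0.40/month

print(f"Always-on VM:      ${vm:.2f}/month")
print(f"Pay-per-execution: ${fn:.2f}/month")
```

The point of the sketch is the structure, not the numbers: the VM term is independent of load, so its effective cost per request grows as utilization drops, while the per-execution term scales to zero with the workload.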
Existing serverless function solutions offer a great alternative to Kubernetes for asynchronous operations such as scheduled jobs or data transformations. For interactive and continuous workloads, however, existing solutions such as AWS Lambda suffer from high invocation latencies (both warm and cold) and become very expensive at scale.
At Binaris, we believe that functions are the best programming model for horizontally scalable cloud applications. We have also found that existing cloud-based solutions still fall short in performance, scale, and time to market.
Binaris functions are the optimal way to build cloud-native applications as they:
- Are infinitely extensible. The low internal (function-to-function) invocation latency of Binaris functions allows developers to extend the platform based on their organization's and application's needs. Unlike Kubernetes, where augmenting the platform requires a special set of components, Binaris lets developers use functions themselves to build extensions such as custom code-deployment workflows, testing, and more.
- Scale with the workload. Binaris functions offer low invocation latencies and fast scaling, making it possible to implement synchronous, low-latency workloads.
- Provide 100% utilization at a competitive price point. Designed for the single purpose of running serverless functions at scale, Binaris functions cost significantly less than other cloud-based compute solutions.
The Binaris platform is built from the ground up to provide low-latency, low-cost serverless functions. Binaris functions provide an invocation SLA of less than 5ms, never encounter “cold starts”, and are billed per execution at a much lower price point than AWS Lambda.
Currently running on AWS but with a roadmap for multi-cloud deployments, Binaris is a viable replacement for Kubernetes-based cloud-native applications. To learn more about Binaris, visit us at booth #S71 at KubeCon Seattle, Dec. 11-13.