Chief Technical Writer
Feb 8, 2024 | 3 mins read
APIs, or Application Programming Interfaces, are the gateways or entry points to the logic and functionality contained within distributed computing modules, or microservices, in a cloud native microservices-based architecture. Not only are APIs entry points to this logic, but they also act as centurions, or guardians, of the code contained within these services.
Imagine a Roman centurion standing guard at a city's gates, scrutinizing each person seeking entry to the city, determining if they have legitimate business inside. No one can pass through the city gates without having a valid reason, thereby keeping the residents of the city safe from marauders and criminals.
Fast-forward to 2024, a digital era in which distributed cloud computing plays a significant role in the global economy, and the analogy of the Roman centurion standing guard becomes even more relevant in the context of APIs in a cloud native microservices architecture.
This article discusses the role APIs play in distributed computing, or microservices-based architectures, and how they help solve the challenges that enterprises face today, including the need for hybrid, geographically distributed, highly available, massively scalable applications that return data analytics in near real time, deliver 99.99% uptime, and meet regulatory compliance requirements across all geographic jurisdictions.
Note: These requirements define an enterprise application type known as a Global Hybrid Multicloud Application, or GHMA.
But first, let’s explore the meaning of smart endpoints in relation to an API’s primary role in distributed computing.
Historically, APIs served primarily as gateways to software applications, enabling different software components to communicate with each other and facilitating operations like exchanging data between disparate systems.
For instance, imagine you own a FinTech startup developing a PSD2-compliant payment platform for online marketplaces like Etsy and Amazon. You have successfully raised $1M in a seed funding round. Now, it’s time to start attracting and signing up customers.
A significant part of this process is analyzing your data to refine your marketing plans. You have data in a CRM system and in your payment platform, and it needs to be consolidated in a central location before you can run data analysis algorithms on it. The data is extracted from its origin systems and uploaded into a centralized data warehouse using a DAG-driven data pipeline.
Each step in the DAG (Directed Acyclic Graph) is a traditional API call to the origin database, extracting the data and uploading it into the data warehouse, where it is analyzed by a data analytics application rather than by the pipeline itself. The API endpoints act as an interface, or bridge, between these systems. Neither system (database) needs to understand or even know about the other. In this traditional context, the API is merely a conduit or gateway.
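To make this concrete, here is a minimal Python sketch of such a pipeline step. The CRM, payment platform, and warehouse endpoints are hypothetical placeholders, and a production pipeline would typically run these steps under an orchestrator such as Airflow.

```python
import requests

# Hypothetical source and destination endpoints.
CRM_API = "https://crm.example.com/api/customers"
PAYMENTS_API = "https://payments.example.com/api/transactions"
WAREHOUSE_API = "https://warehouse.example.com/load"

def extract(url: str) -> list[dict]:
    """A traditional API call: the endpoint is a plain conduit to the data."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()

def load_to_warehouse(table: str, rows: list[dict]) -> None:
    """Upload extracted rows; a real pipeline would use the warehouse's SDK."""
    response = requests.post(f"{WAREHOUSE_API}/{table}", json=rows, timeout=60)
    response.raise_for_status()

def run_pipeline() -> None:
    # Each DAG step only bridges two systems; neither system needs to know
    # about the other, and the analysis happens later, outside the pipeline.
    load_to_warehouse("crm_customers", extract(CRM_API))
    load_to_warehouse("payment_transactions", extract(PAYMENTS_API))
```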
Let’s look at the same scenario but in the context of distributed computing and smart endpoints.
Imagine that your payment platform comprises multiple containerized microservices, each providing a single feature of the overarching platform. A typical workflow is the new customer onboarding workflow, which is divided into multiple microservices, the first being a customer bio details service.
When a prospective customer clicks the signup button, the UI sends an API call to the customer bio details microservice’s load balancer, requesting access to the service through the container’s sidecar proxy. The load balancer decides whether the running instance can handle the additional load or whether Kubernetes, the orchestration platform, must spin up a second copy of this microservice.
The same principle applies at every point in this workflow. However, for this discussion, let’s focus on the API’s role in this first step in the workflow—from where the user clicks the signup button to where the customer bio details screen loads—and why its endpoints are considered smart.
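As a rough illustration, here is a minimal Python sketch of that first step from the client’s side. The service URL is a hypothetical placeholder; the load balancer and sidecar proxy sit invisibly behind it, and the retry loop reflects the brief 503 responses a caller may see while Kubernetes spins up a new replica.

```python
import time
import requests

# Hypothetical public endpoint; the load balancer and sidecar proxies
# sit behind it and are invisible to the caller.
SIGNUP_URL = "https://api.example.com/customers/bio-details"

def request_signup_screen(session_id: str, retries: int = 3) -> dict:
    """Fetch the customer bio details screen's data for the signup UI."""
    for attempt in range(retries):
        response = requests.get(SIGNUP_URL, params={"session": session_id}, timeout=10)
        if response.status_code != 503:
            response.raise_for_status()
            return response.json()
        # 503 can mean Kubernetes is still spinning up a second replica,
        # so back off briefly and try again.
        time.sleep(2 ** attempt)
    raise RuntimeError("signup service unavailable after retries")
```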
In summary, the role of APIs has had to evolve in the context of distributed computing and microservices architectures: they are no longer simple gateways and conduits. Instead, business logic, security, and data processing capabilities are baked directly into the API layer.
These smart endpoints include the following characteristics:
Smart endpoints must be equipped with sophisticated security, such as authentication, authorization, and encryption, to manage access to microservices, ensuring that only valid, authorized requests are processed. This is imperative to protect sensitive data and functionalities in distributed systems.
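Here is a minimal sketch of this characteristic using FastAPI, with a placeholder in-memory token store; a real endpoint would validate signed JWTs against an identity provider and check fine-grained scopes.

```python
from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

# Placeholder token store; production code would validate signed JWTs
# against an identity provider instead.
VALID_TOKENS = {"demo-token": {"scopes": {"signup:write"}}}

def authorize(token: str, required_scope: str) -> None:
    claims = VALID_TOKENS.get(token)
    if claims is None:
        raise HTTPException(status_code=401, detail="invalid token")
    if required_scope not in claims["scopes"]:
        raise HTTPException(status_code=403, detail="missing scope")

@app.post("/customers")
def create_customer(payload: dict, authorization: str = Header(default="")):
    # Authentication and authorization happen in the API layer itself,
    # so the microservice never sees an unauthorized request.
    token = authorization.removeprefix("Bearer ").strip()
    authorize(token, "signup:write")
    return {"status": "accepted", "customer": payload.get("email")}
```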
APIs with smart endpoints have significant amounts of the application’s business logic built into them. These APIs pass data between microservices and process, interpret, and act upon this data based on complex business rules before forwarding it to another service.
For instance, let’s assume your FinTech platform has had several new customer signups from different geographical regions with location-specific data protection and security laws, one location being the European Union.
Personally Identifiable Information (PII) from European Union customers must be stored in the EU. Therefore, when EU customers sign up, their customer data must be stored on servers in EU data centers. This rule is encoded in the API layer itself and is used to spin up a microservice with a database in the EU.
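A minimal sketch of this kind of data-residency rule living in the API layer; the country list is abbreviated and the regional store URLs are hypothetical placeholders.

```python
# Abbreviated EU country list and hypothetical regional write endpoints.
EU_COUNTRIES = {"AT", "DE", "ES", "FR", "IE", "IT", "NL"}

DATA_STORES = {
    "eu": "https://db.eu-central.example.com/customers",
    "global": "https://db.us-east.example.com/customers",
}

def store_for(country_code: str) -> str:
    """PII from EU customers must land on servers in EU data centers."""
    region = "eu" if country_code.upper() in EU_COUNTRIES else "global"
    return DATA_STORES[region]

store_for("de")   # -> the EU endpoint
store_for("us")   # -> the global endpoint
```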
As described above, smart endpoints can include the functionality to dynamically route requests to services based on load, priority, or other criteria, optimizing resources and increasing system responsiveness and resilience.
Moreover, this characteristic, among others, ensures that the application is scalable, globally available, and executable on a hybrid architecture (on-premises servers and the cloud), as well as dynamic and highly available.
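A minimal sketch of load-aware routing, assuming hypothetical replica addresses; a real deployment would read live utilization metrics from the service mesh rather than counting in-flight requests in process.

```python
class LeastLoadedRouter:
    """Route each request to the replica with the fewest in-flight calls."""

    def __init__(self, replicas: list[str]):
        self.in_flight = {replica: 0 for replica in replicas}

    def acquire(self) -> str:
        replica = min(self.in_flight, key=self.in_flight.get)
        self.in_flight[replica] += 1
        return replica

    def release(self, replica: str) -> None:
        self.in_flight[replica] -= 1

# Hypothetical replica addresses inside the cluster.
router = LeastLoadedRouter(["http://signup-0.svc.local", "http://signup-1.svc.local"])
target = router.acquire()   # send the request to `target`, then release(target)
```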
APIs with smart endpoints improve service discovery in dynamic, distributed computing environments, where service availability frequently changes because of the application’s ability to scale microservices in and out as required.
For example, let’s assume you offer free financial transactions during November to the first 25 signups as part of a Black Friday/Cyber Monday weekend sale. If marketed correctly, you should get well over 25 signups that weekend, requiring Kubernetes to scale up the customer signup microservices. And when the weekend is over, it stands to reason that the load, or demand, will drop substantially.
These smart endpoints help maintain a directory of services and their endpoints, making it easier to route the increased network traffic to the correct microservices.
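A minimal sketch of such a directory, with time-to-live entries that expire when a scaled-in replica stops re-registering; the service name and addresses are hypothetical, and registration would normally happen automatically via the orchestrator rather than by hand.

```python
import time

class ServiceRegistry:
    """Directory of live service endpoints, with entries that expire."""

    TTL_SECONDS = 30  # a replica must re-register within this window

    def __init__(self):
        self._services: dict[str, dict[str, float]] = {}

    def register(self, name: str, endpoint: str) -> None:
        self._services.setdefault(name, {})[endpoint] = time.time()

    def lookup(self, name: str) -> list[str]:
        # Drop endpoints whose heartbeat has lapsed (e.g., scaled-in replicas).
        now = time.time()
        live = {ep: seen for ep, seen in self._services.get(name, {}).items()
                if now - seen < self.TTL_SECONDS}
        self._services[name] = live
        return list(live)

registry = ServiceRegistry()
registry.register("customer-signup", "http://signup-2.svc.local")  # hypothetical
targets = registry.lookup("customer-signup")
```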
Smart endpoints can often perform ELT/ETL (Extract, Load, Transform / Extract, Transform, Load) operations as part of the data analysis process. For example, instead of just acting as the conduit, or gateway, in the traditional sense, these APIs have the logic to process, or pre-process, the data before loading it into a data warehouse, as described above.
Moreover, these smart endpoints can pre-process the data when sending it between microservices in a workflow, as elucidated in the customer signup workflow. This baked-in functionality can enforce data integrity and consistency across microservices, reducing the requirement to duplicate the logic in each service.
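A minimal sketch of pre-processing at the endpoint, with hypothetical field names: the API normalizes and validates each record once, so downstream microservices never have to repeat that logic.

```python
from datetime import datetime, timezone

def preprocess(record: dict) -> dict:
    """Normalize and validate a record once, in the API layer."""
    email = record.get("email", "").strip().lower()
    if "@" not in email:
        raise ValueError(f"invalid email: {email!r}")
    return {
        "email": email,
        "country": record.get("country", "").strip().upper(),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

# Every downstream microservice now receives the same clean shape.
clean = preprocess({"email": "  Ada@Example.COM ", "country": "ie"})
```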
Integrating all the microservices belonging to an enterprise application seamlessly across different geo-locations is nothing short of challenging. The traditional solution to this challenge is to generate messages and add them to a queue for the microservices to pick up when they are ready to process the next transaction. However, this approach is limited, especially under heavy workloads.
APIs with smart endpoints solve this challenge, especially under heavy workloads, by enabling event-driven business processing and a more dynamic and scalable approach, improving the application’s operational efficiencies.
In practice, these endpoints react to events rather than relying on message queues, which increases support for horizontal scaling, enables near real-time data processing, decouples services (they act on events rather than direct calls to specific services), and improves fault tolerance. Because of the application’s distributed nature, Kubernetes spins up another microservice to take over the functionality when a single microservice returns an error, so the application stays up and available despite individual failures.
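A minimal in-process sketch of the event-driven pattern; a production system would use a real broker (for example, Kafka or NATS), but the decoupling principle is the same: publishers and subscribers only know about event types, never about each other.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """In-process stand-in for a broker: handlers subscribe to event types."""

    def __init__(self):
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        for handler in self._subscribers[event_type]:
            try:
                handler(payload)
            except Exception as exc:
                # One failing subscriber doesn't stop the others; in a real
                # system the orchestrator would replace the failed service.
                print(f"handler failed: {exc}")

bus = EventBus()
bus.subscribe("customer.signed_up", lambda event: print("provision account for", event["email"]))
bus.publish("customer.signed_up", {"email": "ada@example.com"})
```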
The transformation of APIs from traditional gateways and conduits, channeling data from source to destination, into smart endpoints marks a significant pivot in how distributed, microservices-based applications are developed.
To circle back to the beginning of this article:
By embedding intelligence and capabilities directly into the API layer, developers can create applications that meet the requirements of a GHMA, or an application that is a:
“Hybrid, geographically distributed, highly available, massively scalable application that returns data analytics in near real time, delivers 99.99% uptime, and meets regulatory compliance requirements across all geographic jurisdictions.”