An Overview of Apigee

What is Apigee?

Apigee is a platform for developing and managing APIs. By fronting services with a proxy layer, Apigee provides an abstraction or facade for your backend service APIs and provides security, rate limiting, quotas, analytics, and more. Refer to the Apigee documentation to learn more.

Flavors of Apigee

Apigee comes in the following flavors:

  • Apigee: A hosted SaaS version in which Apigee maintains the environment, allowing you to concentrate on building your services and defining the APIs to those services.
  • Apigee hybrid: A hybrid version consisting of a runtime plane installed on-premises or in a cloud provider of your choice, and a management plane running in Apigee’s cloud. In this model, API traffic and data are confined within your own enterprise-approved boundaries.

Understanding APIs and API proxies

Build your first API proxy overview

Apigee is a platform for developing and managing API proxies.
An API proxy is your interface to developers who want to use your backend services. Rather than having them consume those services directly, they access an Apigee API proxy that you create. With a proxy, you can provide value-added features such as:

  • Security
  • Rate limiting
  • Quotas
  • Caching & persistence
  • Analytics
  • Transformations
  • CORS
  • Fault handling
  • And so much more…

Let's look at some of these value-added features in detail:

Security

API security involves controlling access to your APIs, guarding against malicious message content, accessing and masking sensitive encrypted data at runtime, protecting your backend services against direct access, and other important safeguards.

The following are ways to secure a proxy:

  1. OAuth 2.0: The OAuth 2.0 authorization framework enables a third-party application to obtain limited access to an HTTP service, either on behalf of a resource owner by orchestrating an approval interaction between the resource owner and the HTTP service, or by allowing the third-party application to obtain access on its own behalf.
    You can protect any API proxied through Apigee with OAuth 2.0. Apigee includes an authorization server implementation, and as such, can generate and validate access tokens. Developers begin by registering their apps with Apigee. Registered apps can request access tokens through any of the four grant-type interactions.
    Apigee provides a multi-faceted OAuthV2 policy that implements the details of each grant type, making it relatively easy to set up OAuth on Apigee. For example, you can configure a policy that receives a request for an access token, evaluates all required credentials, and returns an access token if the credentials are valid. See the OAuth 2.0 documentation to learn more; a policy sketch follows this list.
  2. Using SAML policies: The Security Assertion Markup Language (SAML) specification defines formats and protocols that enable applications to exchange XML-formatted information for authentication and authorization.
    Apigee API Services enables you to authenticate and authorize apps that are capable of presenting SAML tokens. A SAML token is a digitally signed fragment of XML that presents a set of “assertions”. These assertions can be used to enforce authentication and authorization. Read more about SAML policies.
  3. Data masking and hiding: When you debug API calls in Apigee, the content can sometimes contain sensitive data, such as credit card numbers or personally identifiable health information (PHI), that must be masked. Apigee provides different ways of masking or hiding sensitive data in trace and debug sessions: you define mask configurations to mask specific data, and masked data is replaced with asterisks in the trace output. See data masking and hiding for more information.
  4. Last-mile security: Last-mile security protects the backend services that are proxied by API Services. The primary goal of last-mile security is to prevent so-called “end-run” attacks, where an app developer discovers the URL for a backend service and bypasses any API proxies to hit the backend URL directly.
    Following are the primary options for setting up last-mile security:

    • Client TLS/SSL
    • Outbound authentication

    Refer to Last-mile security to learn more.

  5. API keys: An API key (known in Apigee as a consumer key) is a string value passed by a client app to your API proxies. The key uniquely identifies the client app. API key validation is the simplest form of app-based security that you can configure for an API. A client app simply presents an API key with its request, and Apigee checks that the API key is in an approved state for the resource being requested. Internally, your proxies use policies to verify API key authenticity. Explore API keys to learn more; a policy sketch follows this list.
  6. Content-based security: Message content is a significant attack vector used by malicious API consumers. API Services provides a set of policy types to mitigate the potential for your backend services to be compromised by attackers or by malformed request payloads. Read more about content-based security.
  7. Key-value maps and property sets: Store data that shouldn’t be hard-coded in your API proxy logic for retrieval at runtime, such as credentials, private keys, or tokens. See more on key-value maps and property sets.
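
For illustration, here is a minimal sketch of an OAuthV2 policy that generates an access token for the client_credentials grant type. The policy name, the expiry value, and the location of the grant_type parameter are assumptions chosen for this example.

    <OAuthV2 name="OAuth-GenerateAccessToken">
      <!-- Generate an access token when the client presents valid credentials -->
      <Operation>GenerateAccessToken</Operation>
      <!-- Token lifetime in milliseconds (here, one hour) -->
      <ExpiresIn>3600000</ExpiresIn>
      <SupportedGrantTypes>
        <GrantType>client_credentials</GrantType>
      </SupportedGrantTypes>
      <!-- Where the requested grant type is read from -->
      <GrantType>request.formparam.grant_type</GrantType>
      <!-- Return the token directly to the caller -->
      <GenerateResponse enabled="true"/>
    </OAuthV2>

API key validation is even simpler. The sketch below assumes the client passes its key in an apikey query parameter; the policy name and parameter name are illustrative.

    <VerifyAPIKey name="Verify-API-Key">
      <!-- Validate the key passed as ?apikey=... against the apps registered in Apigee -->
      <APIKey ref="request.queryparam.apikey"/>
    </VerifyAPIKey>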

Rate Limiting

To maintain performance and availability across a diverse base of client apps, it’s critical to maintain app traffic within the limits of the capacity of your APIs and backend services. It’s also important to ensure that apps don’t consume more resources than permitted.
Apigee provides two policies that enable you to optimize traffic management to minimize latency for apps while maintaining the health of backend services. Each policy type addresses a distinct aspect of traffic management. In some cases, you might use both policy types in a single API proxy.

  1. SpikeArrest policy: The SpikeArrest policy protects against traffic surges. This policy limits the number of requests processed by an API proxy and sent to the backend, protecting against performance lags and downtime. Use this policy to prevent sudden traffic bursts caused by malicious attackers attempting to disrupt a service with a denial-of-service (DoS) attack, or by buggy client applications.
    See the SpikeArrest policy to learn more.
  2. Quota policy: This policy enforces consumption limits on client apps by maintaining a distributed ‘counter’ that tallies incoming requests. The counter can tally the API calls for any identifiable entity, including apps, developers, API keys, access tokens, and so on. Usually, API keys are used to identify client apps. This policy is computationally expensive, so for high-traffic APIs it should be configured for longer time intervals, such as a day or month.

Refer to the Quota policy for more information, and read more about comparing the Quota and SpikeArrest policies. Illustrative configurations for both policies are sketched below.
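
As an illustration, here are minimal sketches of both policies; the policy names, limits, and identifier variable are assumptions chosen for this example.

    <SpikeArrest name="SpikeArrest-ProtectBackend">
      <!-- Smooth traffic to roughly 30 requests per second -->
      <Rate>30ps</Rate>
      <!-- Apply the rate per client app rather than globally -->
      <Identifier ref="client_id"/>
    </SpikeArrest>

    <Quota name="Quota-EnforceAppLimit">
      <!-- Allow 10,000 calls per client app per day -->
      <Allow count="10000"/>
      <Interval>1</Interval>
      <TimeUnit>day</TimeUnit>
      <!-- Count calls per client app, typically identified by its API key -->
      <Identifier ref="client_id"/>
      <!-- Maintain one counter shared across message processors -->
      <Distributed>true</Distributed>
      <Synchronous>false</Synchronous>
    </Quota>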

Caching and Persistence

Apigee persistence features include caches, key-value maps, and property sets. If you are using Apigee hybrid, you can also use Kubernetes Secrets to persist sensitive data. See Caching and persistence for more information.

Caching

Using policies for general purpose caching, you can persist any objects your proxy requires across multiple request/response sessions. You can also cache backend response data.
You might want to use a cache to:

  • Reduce latency and traffic: Requests are satisfied in a shorter time and with reused representations
  • Persist data across transactions: You can store session data for reuse across HTTP transactions
  • Support security: Scope access to cache entries so they can be accessed only in a particular environment or by a specific API proxy

General-purpose caching

You can use policies to store data in a general-purpose cache for faster retrieval. Using the PopulateCache, LookupCache, and InvalidateCache policies, your proxy can store and retrieve cached data at runtime, as sketched below. See general-purpose caching to learn more.
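
As a sketch, a proxy might populate the cache with a value obtained from the backend and look it up on later requests. The policy names, cache key, expiry, and flow variable are assumptions chosen for this example.

    <PopulateCache name="PC-StoreBackendToken">
      <!-- Build the cache key from the calling app's client_id -->
      <CacheKey>
        <KeyFragment ref="client_id"/>
      </CacheKey>
      <Scope>Exclusive</Scope>
      <ExpirySettings>
        <TimeoutInSec>600</TimeoutInSec>
      </ExpirySettings>
      <!-- The flow variable whose value is written to the cache -->
      <Source>backend.auth.token</Source>
    </PopulateCache>

    <LookupCache name="LC-ReadBackendToken">
      <CacheKey>
        <KeyFragment ref="client_id"/>
      </CacheKey>
      <Scope>Exclusive</Scope>
      <!-- The flow variable that receives the cached value, if present -->
      <AssignTo>backend.auth.token</AssignTo>
    </LookupCache>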

Response caching

Response caching stores data from a backend resource, reducing the number of requests to the resource. When apps make requests to the same URL, you can use this policy to return cached responses instead of forwarding those requests to the backend server. The ResponseCache policy can improve your API’s performance through reduced latency and network traffic. For information about response caching, see the ResponseCache policy.
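
A minimal ResponseCache sketch follows; the policy name, cache key, and timeout are assumptions chosen for this example.

    <ResponseCache name="RC-CacheBackendResponse">
      <!-- Cache one entry per request URI -->
      <CacheKey>
        <KeyFragment ref="request.uri"/>
      </CacheKey>
      <ExpirySettings>
        <TimeoutInSec>300</TimeoutInSec>
      </ExpirySettings>
    </ResponseCache>

The same policy is typically attached in both the request flow (to serve cache hits) and the response flow (to store backend responses).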

Using key-value maps

There are times when you want to store data for retrieval at runtime—non-expiring data that shouldn’t be hard-coded in your API proxy logic. Key-value maps (KVMs) are ideal for this. A KVM is a custom collection of encrypted key/value String pairs. Read more about KVMs.
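
As a sketch, a KeyValueMapOperations policy can read a value from an environment-scoped KVM into a flow variable at runtime. The map name, key, and variable name are assumptions chosen for this example; values read from an encrypted KVM must be assigned to a variable with the private. prefix.

    <KeyValueMapOperations name="KVM-GetBackendCredential" mapIdentifier="backend-settings">
      <!-- Look up the KVM defined at environment scope -->
      <Scope>environment</Scope>
      <!-- Read the value stored under the key "backend_secret" into a private flow variable -->
      <Get assignTo="private.backend.secret">
        <Key>
          <Parameter>backend_secret</Parameter>
        </Key>
      </Get>
    </KeyValueMapOperations>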

Using property sets

A property set is a custom collection of key/value pairs that store data. API proxies can retrieve this data when they execute. Typically, you use property sets to store non-expiring data that shouldn’t be hard-coded in your API proxy logic. You access property set data in a proxy the same way you access flow variables.
A common use case for property sets is to provide values that are associated with one environment or another. For example, you can create an environment-scoped property set with configuration values that are specific to proxies running in your test environment, and another set for your production environment. For more information, read property sets.
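
As a sketch, suppose an environment-scoped property set named myconfig contains a key backend.timeout; both names are assumptions chosen for this example. A policy can then read the value through the flow variable propertyset.myconfig.backend.timeout, for instance to set a request header:

    <AssignMessage name="AM-SetTimeoutHeader">
      <Set>
        <Headers>
          <!-- Reads the key "backend.timeout" from the "myconfig" property set -->
          <Header name="X-Backend-Timeout">{propertyset.myconfig.backend.timeout}</Header>
        </Headers>
      </Set>
      <AssignTo createNew="false" transport="http" type="request"/>
    </AssignMessage>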

Kubernetes secrets

If you are already using Kubernetes for secret management, you might want to consider using Kubernetes Secrets as a vault for sensitive data. Just like with KVM data, you can access Kubernetes Secret data in API proxy flow variables. For more information, see Storing data in a Kubernetes secret.

Support for HTTP response headers

This topic describes how Apigee handles HTTP/1.1 caching headers when you’re using the ResponseCache policy. Apigee currently supports a subset of the HTTP/1.1 caching headers and directives (unsupported features are listed in that topic) received from backend target (origin) servers. For the supported headers, Apigee takes action based on their directives. In some cases, these HTTP/1.1 cache headers override whatever behavior is specified in the ResponseCache policy. See HTTP support to learn more.
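
For example, extending the ResponseCache sketch shown earlier with one element tells Apigee to honor the backend's caching headers; the surrounding values remain the illustrative ones from that sketch.

    <ResponseCache name="RC-CacheBackendResponse">
      <CacheKey>
        <KeyFragment ref="request.uri"/>
      </CacheKey>
      <!-- Consider Cache-Control and Expires directives from the target when computing expiry -->
      <UseResponseCacheHeaders>true</UseResponseCacheHeaders>
      <ExpirySettings>
        <TimeoutInSec>300</TimeoutInSec>
      </ExpirySettings>
    </ResponseCache>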

Cache internals

When you deploy an API proxy that includes a caching policy, a short-lived L1 cache is automatically created. This short-lived data is then persisted in a database, where it is available to all message processors deployed in an environment. For information about how policies use the cache, see caching policies.

Analytics Dashboard

The Apigee UI provides a set of predefined dashboards that you can use to view analytics data. These dashboards include charts for:

  • Total Traffic: The total number of API requests received by Apigee for an API environment in an organization
  • Traffic Success: The total number of requests that resulted in a successful response. Error responses are not counted
  • Traffic Errors: The total number of API requests that are unsuccessful, that is, the request does not result in a successful response. The count includes both proxy errors (on the Apigee side) and target errors (from the backend services)
  • Average TPS: The average number of API requests and resulting responses per second.

For more information on these predefined dashboards, see using the analytics dashboards.

Creating and managing custom reports

Custom reports enable you to drill down into specific API metrics and view the exact data that you want to see. You can create a custom report by using any of the metrics and dimensions built into Apigee. For more information, see managing custom reports.

Using the asynchronous custom reports API

Apigee Analytics provides a rich set of interactive dashboards, custom report generators, and related capabilities. However, these features are intended to be interactive: you submit an API or UI request, and the request blocks until the analytics server provides a response. For large result sets, such queries can time out, so Apigee also lets you run report queries asynchronously. For more information on how to make an asynchronous analytics query, see the asynchronous custom reports API.

Transformations (shaping, accessing, and converting messages)

You can use policies included with Apigee to manipulate the messages flowing through your API proxies. With policies, you can:

  • Convert messages between formats, such as from XML to JSON
  • Set variable values from message content, and create messages from variable values
  • Use procedural code, such as JavaScript, Java, and Python, to handle messages and data in more complex ways

In general, when using these policies, you must specify input and output as flow variables. At run time, Apigee retrieves the input value from a source variable and writes the output value to an output variable.

Simple handling for XML and JSON

Apigee includes policies that make it easier to convert between XML and JSON and to transform XML with XSL. For more information, refer to handling for XML and JSON.
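
For instance, here is a minimal XMLToJSON sketch that converts an XML backend response to JSON before it is returned to the client; the policy name is an assumption chosen for this example.

    <XMLToJSON name="X2J-ConvertResponse">
      <!-- Read the XML payload from the response message -->
      <Source>response</Source>
      <!-- Write the converted JSON back to the response -->
      <OutputVariable>response</OutputVariable>
      <Options/>
    </XMLToJSON>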

Handling variable data

Data handling within a proxy often involves simply working with state data as flow variable values. You can often do this by using a policy that gets or sets variable values, such as the ExtractVariables and AssignMessage policies.
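
As a sketch, an ExtractVariables policy can pull a value out of a JSON response into a flow variable, and an AssignMessage policy can write a variable back into a message. The policy names, JSON path, variable names, and header name are assumptions chosen for this example.

    <ExtractVariables name="EV-ReadOrderId">
      <!-- Read from the JSON response payload -->
      <Source>response</Source>
      <VariablePrefix>order</VariablePrefix>
      <JSONPayload>
        <!-- After execution, the value is available as the flow variable order.id -->
        <Variable name="id">
          <JSONPath>$.order.id</JSONPath>
        </Variable>
      </JSONPayload>
    </ExtractVariables>

    <AssignMessage name="AM-AddOrderHeader">
      <Set>
        <Headers>
          <!-- Write the previously extracted variable into a response header -->
          <Header name="X-Order-Id">{order.id}</Header>
        </Headers>
      </Set>
      <AssignTo createNew="false" transport="http" type="response"/>
    </AssignMessage>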

Adding CORS support to an API proxy

CORS (Cross-origin resource sharing) is a standard mechanism that allows JavaScript XMLHttpRequest (XHR) calls executed in a web page to interact with resources from non-origin domains. CORS is a commonly implemented solution to the same-origin policy that is enforced by all browsers. For example, if you make an XHR call to the Twitter API from JavaScript code executing in your browser, the call will fail. This is because the domain serving the page to your browser is not the same as the domain serving the Twitter API. CORS provides a solution to this problem by allowing servers to opt in if they wish to provide cross-origin resource sharing.
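
In Apigee, you can add CORS support by attaching the CORS policy to your API proxy. The sketch below uses permissive example values (echoing the caller's Origin header, allowing common methods and headers) that you would normally tighten for production; the policy name and values are assumptions chosen for this example.

    <CORS name="Add-CORS">
      <!-- Echo the caller's Origin header back as the allowed origin -->
      <AllowOrigins>{request.header.origin}</AllowOrigins>
      <AllowMethods>GET, POST, PUT, DELETE</AllowMethods>
      <AllowHeaders>origin, x-requested-with, accept, content-type</AllowHeaders>
      <ExposeHeaders>*</ExposeHeaders>
      <!-- Let browsers cache preflight results for 30 minutes -->
      <MaxAge>1800</MaxAge>
      <AllowCredentials>false</AllowCredentials>
      <!-- Have Apigee answer OPTIONS preflight requests without calling the backend -->
      <GeneratePreflightResponse>true</GeneratePreflightResponse>
    </CORS>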
