RSS

API Aggregation News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API aggregation conversation, and that I wanted to include in my research. I'm using all of these links to better understand how the space is testing their APIs, going beyond just monitoring, and understanding the details of each request and response.

API Icon Vocabulary

I am working on profiling 75 of the Google APIs, and one thing I struggle with at this scale is standardizing the images I use, or more specifically, the icons that represent each service as well as the value they deliver under the hood--something Google seriously needs to get more organized about, by the way. I have written before about having a set of icons for the API sector, about SDK-related icons, and also about how Amazon is getting more organized when it comes to icons for the AWS platform, as I beat this drum about the need for common imagery.

While I am glad that Amazon has started to think about iconography when it comes to working with APIs at scale, a lead that Google and Microsoft should follow, I'm hoping that API icons are something that someone will tackle at the same level as, say, Schema.org. I would like to see API provider (company) level icons, building on the work of Devicon, but I'd also like to see individual icons developed for common resources that are made available via APIs--like compute, storage, images, video, etc.

What Amazon is doing provides the best model we have so far, but I want to make sure icons are not vendor specific. I would like to see a universal icon to represent a compute API, for instance, whether it is Amazon, Google, or Microsoft. Think IFTTT or Zapier, but something that would universally represent the valuable bits and bytes we are putting to use via individual platforms, and increasingly also moving around between platforms--I want a common visual iconography we can use to communicate about what is being done with APIs.

There is a ton of work involved with establishing a project of this scale. Ideally, it is something that would also involve API providers, and not just be an external thing. I'd also like to see each icon be more than just a visual element, with semantics and vocabulary attached to each image. Imagine if we had a common set of icons to describe machine learning APIs, icons that helped us quickly understand what they do, and also helped us more consistently articulate the reality of what they do, and do not do (i.e., facial recognition, sentiment analysis, etc.).

I am going to keep talking about this until someone either does it or gives me the money to pay someone to do it. Sadly, I feel like it will end up being like the rest of common API definitions and tooling: we'll see each provider do it on their own, where all meaning and vocabulary becomes platform-driven, instead of actually finding a common language to communicate about what is possible with APIs.


API Definition: Human Services API Specification

This is an article from the current edition of the API Evangelist industry guide to API definitions. The guide is designed to be a summary of the world of API definitions, providing the reader with an up-to-date overview of the variety of specifications that are defining the technology behind almost every part of our digital world.

A lot of attention is given to APIs and the world of startups, but in 2017 this landscape is quickly shifting beyond just the heart of the tech space, with companies, organizations, institutions, and government agencies of all shapes and sizes putting APIs to work. API definitions are being applied to the fundamental building blocks of the tech sector, quantifying the compute, storage, image, video, and other essential resources powering web, mobile, and device-based applications. This success is now spreading to other sectors, defining other vital resources that are making a real impact in our communities.

One API making an impact in communities is the Human Services Data Specification (HSDS), also known as the Open Referral Ohana API. The project began as a Code for America project, providing an API, website, and administrative system for managing the organizations, locations, and human services that communities depend on. Open Referral, the governing organization for HSDS and the Ohana API, is working with API Evangelist and other partners to define the next generation of the human services data specification and API definition, as well as the next generation of API, website, admin, and developer portal implementations.

The HSDS API specification isn't about any single API; it is a suite of API-first definitions, schema, and open tooling that cities, counties, states, and federal government agencies can download or fork, and employ to help manage vital human services for their communities. It provides not just a website for finding vital services, but a complete API ecosystem that can be deployed, incentivizing developers to build important web, mobile, and other applications on top of a central human services system--better delivering on the mission of human services organizations, and meeting the demands of their constituents.

This approach to delivering APIs centers around a common data schema, extending it as an OpenAPI Spec definition that describes how that data is accessed and put to use across a variety of applications, including a central website and administrative system. While the server-side HSDS API, website, mobile, administrative, developer portal, and other implementations are important, the key to the success of this model is a central OpenAPI definition of the HSDS API. This definition connects all the implementations within an API's ecosystem, but it also provides the groundwork for a future where all human services implementations are open and interoperable with other implementations--establishing a federated network of services meeting the needs of the communities they serve.
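To give a rough sense of what that central definition looks like, here is a loose sketch of the core HSDS resources expressed as an OpenAPI (Swagger 2.0) definition--this is my own illustrative example, not the actual HSDS API definition:

    swagger: "2.0"
    info:
      title: Human Services API (illustrative)
      version: "1.0"
    basePath: /api
    paths:
      /organizations:
        get:
          summary: List the organizations that deliver human services
          responses:
            '200':
              description: A list of organizations
      /locations:
        get:
          summary: List the locations where services are delivered
          responses:
            '200':
              description: A list of locations
      /services:
        get:
          summary: List the human services available to the community
          responses:
            '200':
              description: A list of services

Each implementation--website, admin system, or mobile app--then works from this same contract, which is what makes the federated model possible.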

Right now each city is managing one or multiple human service implementations. Even though some of these implementations operate in overlapping communities, few of them provide 3rd party access, let alone integration between overlapping geographic regions. The HSDS model takes an API-first approach, focusing on the availability and access of the HSDS schema, then adding a website, administrative system, and API developer portal to support it. This model opens up human services to humans via the website, which is integrated with the central API, but then also opens up the human services for inclusion in other websites, mobile and device applications, as well as integration with other systems.

The HSDS OpenAPI spec and schema provide a reusable blueprint that can be used to standardize how we provide human services. The open source approach to delivering definitions, schema, and code reduces the cost of deployment and operation for cash-strapped public organizations and agencies. The API-first approach to delivering human services also opens up resources for inclusion in our applications and systems, potentially outsourcing the heavy lifting to trusted partners and 3rd party developers interested in helping augment and extend the mission of human service organizations and groups.

If you'd like to learn more about the HSDS API you can visit Open Referral. From there you can get involved in the discussion, and find existing open source definitions, schema, and code for putting HSDS to work. If you'd like to contribute to the project, there are numerous opportunities to join the discussion about the next generation of the schema and OpenAPI Spec, as well as develop server-side and client-side implementations.


If you have a product, service, or story you think should be in the API Evangelist industry guide to API definitions, you can email me, or you can submit a Github issue for my API definition research, and I will consider adding your suggestion to the roadmap.


Considering Standards In Our API Design Over Being A Special Snowflake

Most of the APIs I look at are special snowflakes. The definitions and designs employed are usually custom-crafted without considering other existing APIs, or standards that are already in place. There are several contributing factors to why this is, ranging from the types of developers who are designing APIs, to incentive models put in place because of investment and intellectual property constraints. So, whenever I find an API that is employing an existing standard, I feel compelled to showcase it and help plant the seed in others' minds that we should be speaking a common language instead of always being a special snowflake.

One of these APIs that I came across recently was the Google Spectrum Database API, which employs a standard defined by the IETF, the Protocol to Access White Space (PAWS). I wouldn't say it is the best-designed API, but it does follow a known standard that is already in use by an industry, which in my experience can go further than having the best-designed API. The best product doesn't always win in this game; sometimes it is just about getting adoption with the widest possible audience. I am guessing that the Google Spectrum Database API is targeting a different type of engineering audience than their more modern machine learning and other APIs are, so following standards is probably more of a consideration.

I wish more API providers would share a little bit about the thinking that went into the definition and design of their APIs, sharing their due diligence of existing APIs and standards, and other considerations that were included in the process of crafting an API. I'd like to see some leadership in this area, as well as some folks admitting that they didn't have the time, budget, expertise, or whatever the other reasons are for why their API is a special snowflake. It is a conversation we should be having, otherwise we may never fully understand why we aren't seeing the adoption we'd like to see with our APIs.


A Community Strategy For My API Definition Guide

I have published the latest edition of my API definition guide. I've rebooted my industry guides to be a more polished, summary version of my research instead of the rougher, more comprehensive version I've been publishing for the last couple of years. I'm looking for my guides to better speak to the waves of new people entering the API space, and help them as they continue on their API journey.

In addition to being a little more polished, and having more curated content, my API guides are now going to also be more of a community thing. In the past I've kept pretty tight control over the content I publish to API Evangelist, only opening up the four logos to my partners. Using my API industry guides I want to invite folks from the community to help edit the content, and provide editorial feedback--even suggesting what should be in future editions. I'm also opening up the guides to include paid content that will help pay for the ongoing publication of the guides with the following opportunities available in the next edition:

  • One Page Articles - Sponsored suggested topics, where I will craft the story and publish in the next edition of the guide--also published on API Evangelist blog after the guide is published.
  • Two Page Articles - Sponsored suggested topics, where I will craft the story and publish in the next edition of the guide--also published on the API Evangelist blog after the guide is published.
  • Sponsor Slot - On the service and tooling pages there are featured slots, some of which I will be giving to sponsors who have related products and services.
  • Private Distribution - Allow for private distribution of the industry guide, to partners, and behind lead generation forms, allowing you to use API Evangelist research to connect with customers.

Even though I will be accepting paid content within these industry guides, and posts via the blog now, they will all be labeled as sponsored posts, and I will also still be adding my voice to each and every piece--if you know me, or read the API Evangelist blog, you know what this means. I'm looking to keep the lights on, while also opening up the doors for companies in the space to join in the conversation, as well as the average reader--allowing anyone to provide feedback and suggestions via the Github issues for each area of research.

My API definition research is just the first to come off the assembly line. I will be applying this same model to my design, deployment, and management research in the coming weeks, and eventually the rest of my research as it makes sense. If there is a specific research area you'd like to see get attention, or would be willing to sponsor in one of the ways listed above, please let me know. Once I get the core set of my API industry research guides published in this way, I will be working on increasing the distribution beyond just my network of sites and the API Evangelist digital presence--publishing them to Amazon, and other prominent ecosystems.

I also wanted to take a moment and thank everyone in the community who helped me last year, and everyone who is helping make my research, and the publishing of these industry guides, a reality. Your support is important to me, and it is also important to me that my research continues, and is as widely available as it possibly can be.


What Will It Take To Evolve OpenAPI Tooling to Version 3.0

I am spending some time adding more tools to my OpenAPI Toolbox, and I'm looking to start evaluating what it will take for tooling providers to evolve their solution from version 2.0 of the OpenAPI Spec to version 3.0. I want to better understand what it will take to evolve the documentation, generators, servers, clients, editors, and other tools that I'm tracking on as part of my toolbox research.

I'm going to spend another couple of weeks populating the toolbox with OpenAPI solutions, getting them entered with all the relevant metadata. Once I feel the list is good enough, I will begin reaching out to each tool owner, asking what their OpenAPI 3.0 plans are. It will give me a good reason to reach out and see if anyone is even home. I'm assuming that a number of the projects are abandoned, or that their owners do not have the resources necessary to go from 2.0 to 3.0. Regardless, this is something I want to track on as part of this OpenAPI toolbox research.

The overall architecture of OpenAPI shifted pretty significantly from 2.0 to 3.0. Things are way more modular and reusable, something that will take some work to bring out in most of the tooling areas. Personally, I'm pretty excited about the opportunities when it comes to API documentation and API design editors with OpenAPI 3.0 as the core. I am also hoping that developers step up to make sure generators, as well as server and client code, become available in a variety of programming languages--we will need this to keep the momentum that we've established with the specification so far.

If you are looking at developing any tooling using OpenAPI 3.0 I'd love to hear from you. I'd like to hear more about what it will take to migrate your tool from version 2.0 to 3.0, or what it will take to get up and running on 3.0 from scratch. I'm going to get to work on crafting my first OpenAPI definition using version 3.0, then I'm going to begin playing around with some new approaches to API documentation, and possibly an API editor or notebook that takes advantage of the changes in the OpenAPI Specification.


API Environment Variable Autocomplete And Tooltips In Postman

The Postman team has been hard at work lately, releasing their API data editor, as well as introducing variable highlighting and tooltips. The new autocomplete menu contains a list of all the variables in the current environment, followed by global variables, making your API environment setups more accessible from the Postman interface--a pretty significant time saver once you have your environments set up properly.
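For anyone who hasn't set one up, a Postman environment is just a named set of key/value pairs. A minimal export looks roughly like this--the variable names and values here are my own made-up example:

    {
      "name": "Production",
      "values": [
        { "key": "baseUrl", "value": "https://api.example.com/v1", "enabled": true },
        { "key": "apiKey", "value": "replace-me", "enabled": true }
      ]
    }

With that environment active, a request URL can be written as {{baseUrl}}/users?api_key={{apiKey}}, and the new autocomplete and tooltips kick in as soon as you start typing the double curly braces.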

This is a pretty interesting feature, but what makes me most optimistic is the thought of this approach becoming available for parameters, headers, and some of the data management features we are seeing emerge with the new Postman data editor. It all feels like the UI equivalent of what we've seen emerge in the latest OpenAPI 3.0 release, helping us better manage and reuse the schema, data, and other bits we put to use across all of our APIs.

Imagine when you can design and mock your API in Postman, crafting your API using a common vocabulary, reusing environment variables, API path resources, parameters, headers, and other common elements already in use across operations. Imagine getting tooltips suggesting that I use Schema.org vocabulary, or possibly even RFCs for date, currency, and other common definitions. Anyways, I'm liking the features coming out of Postman, and I'm also liking that they are regularly blogging about this stuff, so I can keep up to speed on what is going on, eventually cover it here on the blog, and include it in my research.


Tracking On Licensing For The Solutions In My OpenAPI Toolbox

I wanted to provide an easy way to publish and share some of the tools that I'm tracking on in the OpenAPI ecosystem, so I launched my OpenAPI Toolbox. In addition to tracking the name, description, logo, and URL for OpenAPI tooling, I also wanted to categorize the tools, helping me better understand the different types that are emerging. As I do with all my research, I published the OpenAPI Toolbox as a Github repository, leveraging its YAML data core to store all the tools.

It will be a never-ending project for me to add, update, and archive abandoned projects, but before I got too far down the road I wanted to also begin tracking the license for each of the tools. I'm still deciding whether or not I want the toolbox to exclusively contain openly licensed tools, or to provide a more comprehensive directory of tooling that includes unknown and proprietary solutions. I think for now I will just flag any tool I cannot find a license for, and follow up with the owner--it gives me a good excuse to reach out and see if there is anyone home.

Eventually, I want to also provide a search for the toolbox that allows users to search for tools and filter by license. Most of the tools have been Apache 2.0 or MIT licensed, details that I will continue to track and report on. If you know of any tooling that employs the OpenAPI Specification that should be included, feel free to submit a Github issue for the project, or submit a pull request on the repository and add it to the YAML data file that drives the OpenAPI Toolbox.


Thinking About Schema.org's Relationship To API Discovery

I was following the discussion around adding a WebAPI class to Schema.org's core vocabulary, and it got me thinking more about the role Schema.org has to play in not just our API definitions, but also in significantly influencing API discovery. We should be using Schema.org as part of our OpenAPI definitions, providing us with a common vocabulary for communicating around our APIs, but also empowering the discovery of APIs.

When I describe the relationship between Schema.org and API discovery, I'm talking about using the pending WebAPI class, but I'm also talking about using common Schema.org vocabulary within API definitions--something that will open the definitions up to discovery because they employ a common schema. I am also talking about how we can leverage this vocabulary in our HTML pages, helping search engines like Google understand there is an API service available.
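Something like the following JSON-LD snippet, embedded in an HTML page, is what I have in mind--the values are made up, and since WebAPI is still a pending class, the exact properties may shift:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "WebAPI",
      "name": "Example Company API",
      "description": "Programmatic access to Example Company data and content.",
      "documentation": "https://developer.example.com/docs",
      "provider": {
        "@type": "Organization",
        "name": "Example Company"
      }
    }
    </script>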

I will also be exploring how I can better leverage Schema.org in my APIs.json format, using a common vocabulary to describe API operations, not just an individual API. I'm looking to expand the opportunities for discovery, not limit them. I would love all APIs to take a page from the hypermedia playbook, and have a machine-readable index for each API, with a set of links present with each response, but I also want folks to learn about APIs through Google, ensuring they are indexed in a way that search engines can comprehend.

When it comes to API discovery I am primarily invested in APIs.json (because it's my baby) for describing API operations, and OpenAPI for describing the surface area of an API, but I also want this to map to the very SEO-driven world we operate in right now. I will keep investing time in helping folks use Schema.org in their API definitions (APIs.json & OpenAPI), but I will also start investing in folks employing JSON-LD and Schema.org as part of their search engine strategies (like above), making our APIs more discoverable to humans as well as other systems.
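For reference, here is a stripped-down sketch of what an APIs.json index looks like--I'm writing this from memory, so treat the property names as approximate and check the APIs.json specification for the current details:

    {
      "name": "Example Company",
      "description": "An index of the APIs offered by Example Company.",
      "url": "https://example.com/apis.json",
      "specificationVersion": "0.14",
      "apis": [
        {
          "name": "Example Data API",
          "description": "Programmatic access to example data.",
          "humanURL": "https://developer.example.com",
          "baseURL": "https://api.example.com/v1",
          "properties": [
            { "type": "Swagger", "url": "https://developer.example.com/openapi.json" }
          ]
        }
      ]
    }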


Getting Our Schema In Order With Postman's New Data Editor

In 2017 I think that getting our act together when it comes to our data schema will prove to be just as important as getting it together when it comes to our API definitions and design. This is one reason I'm such a big fan of using OpenAPI to define our APIs because it allows us to better organize the schema of the data included as part of the API request and response structure. So I am happy to see Postman announce their new data editor, something I'm hoping will help us make sense of the schema we are using throughout our API operations.

The Postman data editor provides us with some pretty slick data management UI features including drag and drop, a wealth of useful keyboard shortcuts, bulk actions, and other timesaving features. Postman has gone a long way to inject awareness into how we are using APIs over the last couple of years, and the data editor will only continue developing this awareness when it comes to the data we are passing back and forth. Lord knows we need all the help we can get when it comes to getting our data backends in order.

The Postman data editor makes me happy, but I'm most optimistic about what it will enable, and what Postman has planned as part of their roadmap. They end their announcement with "we have a LOT of new feature releases planned to build on top of this editor, capabilities inspired by things you already do using spreadsheets". For me, this points to features that would directly map to the most ubiquitous data tool out there--the spreadsheet. With a significant portion of business in the world being done via spreadsheets, the concept of integrating them into the API toolchain is a pretty compelling thing.


A Tighter API Contract With gRPC

I was learning more about gRPC from the Google team last week, while at the Google Community Summit, as well as the API Craft SF Meetup. I'm still learning about gRPC, and how it contributes to the API conversation, so I am trying to share what I learn as I go, keeping a record for others to learn from along the way. One thing I wanted to better understand was something I kept hearing regarding gRPC delivering a tighter API contract between API provider and consumer.

In contrast to more RESTful APIs, a gRPC client has to be generated from the provider's service definition. First, you define a service in a .proto file (a protocol buffer definition), then you generate client code using the protocol buffer compiler. Where client SDKs are up for debate in the world of RESTful APIs, and client generation might even be frowned upon in some circles, when it comes to gRPC APIs, client generation is a requirement--dictating a much tighter coupling and contract between API provider and consumer.
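To make that concrete, a minimal (and entirely hypothetical) proto3 service definition looks something like this--client and server stubs are then generated from this one file using protoc and the gRPC plugin for each language you want to support:

    // greeter.proto - a hypothetical service definition using proto3
    syntax = "proto3";

    package example;

    // The contract: a single RPC with typed request and response messages.
    service Greeter {
      rpc SayHello (HelloRequest) returns (HelloReply);
    }

    message HelloRequest {
      string name = 1;
    }

    message HelloReply {
      string message = 1;
    }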

I do not have first-hand experience with this process yet, I am just learning from my discussions last week, and trying to understand how gRPC is different from the way we've been doing APIs using a RESTful approach. So far it seems like you might want to consider gRPC if you are looking for significant levels of performance from your APIs, in situations where you have a tighter relationship with your consumers, such as internal or partner scenarios. gRPC requires a tighter API contract between provider and consumer, something that might not always be possible, depending on the situation.

While I'm still getting up to speed, it seems to me that the .proto file, or protocol buffer definition, acts as the language for this API contract, similar to how OpenAPI is quickly becoming a contract for more RESTful APIs, although it is often a much looser contract. I'll keep investing time into learning about gRPC, but I wanted to make sure and process what I've learned leading up to, and while at, Google this last week. I'm not convinced yet that gRPC is the future of APIs, but I am getting more convinced that it is another important tool in our API toolbox.


API Definitions Covering Both REST and gRPC APIs

I have been learning more about the way Google designs and defines their APIs after the release of their API design guide. When I research a company's APIs I always spend time looking through their Github repositories for anything interesting, and while poking around in Google's I found a repository of "interface definitions for a small (but growing) set of Google APIs". I keep track of any Github repo I find containing API definitions, but Google's repo stood out because it contained a set of definitions covering APIs that support both REST and gRPC.

Straight from the Github repo, there are two ways of accessing these APIs: "Google APIs use Protocol Buffers version 3 (proto3) as their Interface Definition Language (IDL) to define the API interface and the structure of the payload messages. The same interface definition is used for both REST and RPC versions of the API, which can be accessed over different wire protocols."

  1. JSON over HTTP: You can access Google APIs directly using JSON over HTTP, using Google API client libraries or third-party API client libraries.

  2. Protocol Buffers over gRPC: You can access Google APIs published in this repository through gRPC, which is a high-performance binary RPC protocol over HTTP/2. It offers many useful features, including request/response multiplexing and full-duplex streaming.

This is the first example of this I've seen in the wild, and it feels like we are shifting from an HTTP to an HTTP/2 API world. I don't think regular old REST or web APIs are going anywhere, I think they'll continue to be a staple, but it looks like Google is laying the groundwork for two-speed APIs that are defined using a common API definition--you pick the speed you need. I've been hearing tales of gRPC usage for a while, and seeing more APIs defined using protocol buffers, but Google's approach signals a wider, more significant shift for me.

I'm still learning about gRPC, so I can't quite visualize the overlap between gRPC and REST yet. I'm going through their API definitions because they provide an interesting snapshot of the surface area of these hybrid APIs. As I spend my week in San Francisco for Google Next, I'm eager to learn more about their evolving approach to designing and defining APIs--something that I think will be setting the tone for API design at scale in the near future.


People Doing Interesting Things With APIs

I just wanted to take a moment and highlight some folks who are doing interesting things with APIs. I spend a lot of time focusing on the companies, products, and services from the sector, but I don't talk a lot about individual people. So I wanted to pause for a moment and just highlight a couple of people doing really interesting things with APIs right now.

If you have been paying attention to API definitions in the last year, then you have probably come across APIs.guru, the Wikipedia for APIs. They have 244 OpenAPI definitions available in their catalog, which is the most comprehensive directory of machine-readable API definitions out there. If you have an OpenAPI for your API you should be publishing it to APIs.guru. If you don't, you should be creating one, and then publishing it to APIs.guru.

Here are the hardworking, API-savvy folks behind APIs.guru:

  • Ivan Goncharov
  • Roman Hotsiy

I am in the middle of a project where I am building on the work these two have invested in with APIs.guru. I'm hoping that with the next wave of this work, I'll have some complete API definitions I can contribute to APIs.guru. I encourage you to make sure your API definitions are published to the directory so that other people like me can include your APIs in our work.

Beyond the APIs.guru directory, I recommend checking out their Github repository, as they are doing some interesting things with GraphQL. If you need help crafting an API definition for your API platform, or in need of other API advice, I recommend messaging the APIs.guru team. I'm sure they'd be happy to talk to you, and see where they can help you with your API efforts.


Getting Back To Work On My OpenAPI Toolbox

I used to have a Github repository dedicated to Swagger tooling and implementations, but I took it down after Swagger was donated to the Linux Foundation. I've rebooted it as my OpenAPI Toolbox, providing a single Github repository for managing an active list of open source tooling built on top of the OpenAPI specification.

Here is a snapshot of my toolbox of OpenAPI-driven solutions, as it stands today. This site is a Jekyll-driven website running on Github, using Github Pages. The tools in this toolbox are driven by a YAML file in the _data folder for this repository, with the HTML pages driven using Liquid.
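To give a sense of how that works, here is a hypothetical entry from the YAML data file--the field names here are my own guesses, not necessarily the exact ones used in the repository:

    - name: Swagger UI
      type: Documentation
      language: JavaScript
      license: Apache-2.0
      url: https://github.com/swagger-api/swagger-ui
      description: Interactive HTML documentation generated from an OpenAPI definition.

On the HTML side, a Liquid loop over the site's data pulls each entry into the listing pages, which is all it takes to keep the site in sync with the YAML.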

Here are the tools organized by type of implementation (something that is evolving quickly):

  • Documentation
  • Generators
  • Servers
  • Clients
  • Editors

Here they are organized by programming language, providing another dimension to look at the tooling being developed on top of OpenAPI.

This project is forkable using the Github repository, and accessible as JSON. If you have a tool you think should be added, or there is something that needs fixing, you can submit an issue on the Github repository, or submit a pull request. It is meant to be a community project: shareable, machine-readable, and open to contributions.

I've just started adding the tools I have in my database. I only have 37 so far, but will be adding more as I have time. Once I have it up to date, I will start thinking about other ways to slice and dice the tools, to better understand what is being built on the OpenAPI specification, what tools are being built on the upcoming 3.0 version, as well as working to identify where the gaps and opportunities are for developing tooling.


New York Times Manages Their OpenAPI Using Github

I keep coming across more companies managing their OpenAPI definitions in a single Github repository. One example of this is the New York Times, who has the API definitions for their platform available as their own Github repository. It demonstrates the importance of maintaining your API definitions separately from any particular implementation, such as just your documentation.

You can find individual OpenAPIs for their archive_api, article_search, books_api, community, geo_api, most_popular_api, movie_reviews, semantic_api, times_tags, timeswire, and top_stories, broken down into separate folders within the Github repository. The NYT also provides markdown documentation alongside the machine-readable OpenAPI definition in each folder, helping make sure things are human-readable.

It just makes sense to manage your API definitions this way. It's more than just documentation. When you do this, you are taking advantage of the repository and version control features of Github, but you also open things up for participation through forking and pull requests. The resulting definition and machine readable contract can then be injected anywhere into the integration and API lifecycle, internally or externally.

I personally like it when companies manage their API definitions in this way. It gives me a central truth to work with when profiling their operations, something that will be used across my research and storytelling. The more you describe your APIs in this way, the more chance I will be writing about them and including them across my work.


A Machine Readable Definition For Your AWS API Plan

I was learning about the AWS Serverless Developer Portal, and found their API plan layer to be an interesting evolution in how we define the access tiers of our APIs. There were a couple of different layers of AWS's approach to deploying APIs that caught my eye, including the AWS Marketplace integration, but I wanted to stop for a moment and focus in on their API plan approach.

Using the AWS API Gateway you can establish a variety of API plans, with the underlying mechanics of each plan configurable via the AWS API Gateway user interface or the AWS API Gateway API. In the documentation for the AWS Serverless Developer Portal, they include a JSON snippet of the plan configuration for each API being deployed.
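I won't reproduce their exact snippet here, but a usage plan configured through the API Gateway looks roughly like this--the names and values are my own placeholders:

    {
      "name": "Basic",
      "description": "Free tier access for registered developers",
      "apiStages": [
        { "apiId": "a1b2c3d4e5", "stage": "prod" }
      ],
      "throttle": { "rateLimit": 5, "burstLimit": 10 },
      "quota": { "limit": 1000, "period": "MONTH" }
    }

Rate limits, quotas, and the API stages a plan applies to all live in this one machine-readable chunk, which is exactly the kind of thing I want to aggregate across providers.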

This reminds me that I need to take another look at my API plan research, and take the plan configuration, rate limit, and other service composition API definitions I have, and aggregate their schema into a single snapshot. It has been a while since I worked on my machine-readable API plan definition, and there are now enough API management solutions with an API layer out there that I should be able to pull a wider sampling of the schema in play. I'm not in the business of defining what the definition should be, I am only looking to aggregate what others are doing.

I am happy to see more folks sharing machine-readable OpenAPI definitions describing the surface area of their APIs. As this work continues to grow we are going to have to also start sharing machine-readable definitions of the monetization, plan, and access layers of our API operations. After I identify the schema in play for some of the major API management providers I track on, I'm going to invest more work into my standard API plan definition to make the access levels of APIs more discoverable using APIs.json.


The Potential Of The OpenAPI Spec Parameters Object

I enjoy learning from the OpenAPI Specs of the API providers I track on. Just having an OpenAPI Spec present tells me a lot about an API provider, but the level of detail some providers put into their API definitions adds another level to this for me. While reviewing the OpenAPI Spec for the Oxford Dictionaries API, I noticed their robust usage of the OpenAPI Spec parameters definitions collection, which provides an interesting overview of the surface area of the API, augmenting the benefits brought to the table by the definitions collection of an API's underlying data schema.

When you are defining each path for an API you can either define the parameters within each path, or you can add them to the overall parameters definition object, allowing them to be reused across all paths. This object provided me with a centralized place to learn about the parameters used when making calls to the Oxford Dictionaries API, and I'm assuming it helped them be more organized in how they defined the surface area of their APIs.
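Here is a quick sketch of the pattern in an OpenAPI (Swagger 2.0) definition--the paths and parameter names are loosely inspired by a dictionary API, not copied from the Oxford Dictionaries definition:

    swagger: "2.0"
    info:
      title: Dictionary API (illustrative)
      version: "1.0"
    parameters:
      sourceLang:
        name: source_lang
        in: path
        description: Language of the word being looked up
        required: true
        type: string
    paths:
      /entries/{source_lang}/{word_id}:
        get:
          parameters:
            - $ref: '#/parameters/sourceLang'
            - name: word_id
              in: path
              required: true
              type: string
          responses:
            '200':
              description: A single dictionary entry

Each path just references the shared definition with $ref, so a change to the parameter only has to be made in one place.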

I can see how the process of defining each path's parameters, and centrally organizing them for reuse, can be a healthy thing. The more you lift yourself out of the individual definition of each path and consider the parameter patterns that have been used across other paths, the better your chances of having a broader view of the landscape. I am optimistic about this OpenAPI Spec object, and curious about how it can be evolved as part of other conversations around GraphQL--something I'll work to understand better in the future.


Reducing Friction For API Developers With Enums In API Definitions

I am going through the Oxford Dictionaries API, learning about this valuable resource. Their onboarding process for registration, and for learning about what the API does using interactive documentation, is very smooth. One of the things that really cuts the rough edges off learning about each API is the enums that are available for each path.

The parameters required for making calls to many of the paths, like language and country, have their enum values populated as part of the API definition. I look at numerous OpenAPI Specs in the course of my work, and they rarely have values present for enum--critical default values for developers to use that eliminate some often serious frustration.
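As an example, a language parameter with its valid values baked right into the definition might look like this in an OpenAPI (Swagger 2.0) parameter--the values here are just placeholders:

    - name: source_lang
      in: path
      description: Language of the dictionary entries being requested
      required: true
      type: string
      enum:
        - en
        - es
        - lv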

Not having the right values available when making even the simplest of API calls can be a significant point of friction when trying to get up and running using an API. While it may seem like a small thing, the work the Oxford Dictionaries API team has put into this level of detail for their API definitions will go a long way towards making their API resources more accessible and usable.


OpenAPI Spec Google Spreadsheet to Github Jekyll Hosted YAML

I have been playing around with different ways of using Google Spreadsheets to drive YAML and JSON data for Jekyll data projects hosted as Github repositories. It is an approach I started playing around with in Washington DC, while I was helping data stewards publish government services as JSON-LD. It is something I've been playing around with lately, using it to drive D3.js visualizations and even a comic book.

There are a couple of things going on here. First, you are managing machine-readable data using Google Spreadsheets, and publishing this data as two separate machine-readable formats: JSON and YAML. When these formats are combined with the data capabilities of a Jekyll website hosted on Github Pages, it opens up some pretty interesting possibilities for using data to fuel some pretty fun things. Plus...no backend needed.
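This isn't my exact setup, but as one example of the no-backend pattern, once the spreadsheet data has landed in a Jekyll _data file (say _data/openapi.yml), a single page like this will publish it as JSON using Github Pages alone:

    ---
    layout: null
    permalink: /openapi.json
    ---
    {{ site.data.openapi | jsonify }}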

To push this approach forward I wanted to apply it to managing OpenAPI Specs that can be used across the API life cycle. I pulled together a spreadsheet template for managing the details I need for an OpenAPI Spec. Then I created a Github repository, forked my previous spreadsheet-to-YAML project, and modified it to pull data from a couple of worksheets in the Google Sheet and publish it as both JSON and YAML OpenAPI Specs.

My OpenAPI Spec Google Sheet to YAML for use in a Jekyll project hosted on Github is just a prototype. The results don't always validate, and I'm playing with different ways to represent and manage the data in the Google Sheet. It is a fun start though! I am going to keep working on it, and probably start a similar project for managing an APIs.json index using Google Sheets. When done right it might provide another way for non-developers to participate in the API design process, and apply OpenAPI Specs to other stops along the API life cycle like API documentation, SDK generation, or testing and monitoring.


Please Share Your OpenAPI Specs So I Can Use Across The API Life Cycle

I was profiling the New Relic API, and while I was pleased to find OpenAPI Specs behind their explorer, I was less than pleased to have to reverse engineer their docs to get at their API definitions. It is pretty easy to open up my Google Chrome Developer Tools and grab the URLs for each OpenAPI Spec, but you know what would be easier? If you just provided me a link to them in your documentation!

Your API definitions aren't just driving the API documentation on your website. They are being used across the API life cycle. I am using them to fire up and play with your API in Postman, generate SDKs using APIMATIC, or create a development sandbox so I do not have to develop against your live environment. Please do not hide your API definitions; bring them out of the shadow of your API documentation and give me a link I can click on--one-click access to a machine-readable definition of the value your API delivers.

I'm sure my regular readers are getting sick of hearing about this, but the reality is that my readers are a diverse and busy group of folks who will most likely not read every post on this important subject. If you have read a previous post on this subject from me, and are reading this latest one, and still do not have API definitions or prominent links to them--then shame on you for not making your API more accessible and usable...because isn't that what this is all about?


The Different Reasons Behind Why We Craft API Definitions

I wrote a post about the emails I get from folks about the API definitions contained within my API stack research, something that has helped me better see why it is I do API definitions. I go through APIs and craft OpenAPI Specs for them because it helps me understand the value each company offers, while also helping me discover interesting APIs and the healthy practices behind them.

The reason I create API definitions and organize them into collections is all about discovery. While I will be putting some of the APIs to use, most of them just help me better understand the world of APIs, and the value and intent behind the companies who are doing the most interesting things in the space.

I would love it if all my API definitions were 100% certified, and included complete information about the request, response, and security models, but just having the surface area defined makes me happy. My intention is to provide as complete a definition as possible, but the primary stop along the API lifecycle I'm looking to serve is discovery, with others like design, mocking, deployment, testing, and SDKs following after that.

Maybe if we can all better understand the different reasons behind why we all craft and maintain API definitions we can better leverage Github to help make more of them complete. For now, I'll keep working on my definitions, and if you want to contribute head over to the Github repo for my work, and share any of your own definitions, or submit an issue about which APIs you'd like to see included.


APIs Can Give An Honest View Of What A Company Does

One of the reasons I enjoy profiling APIs is that they give an honest view of what a company does, absent of all the marketing fluff, and the promises that I see from each wave of startups. If designed right, APIs can provide a very functional, distilled down representation of data, content, and algorithmic resources of any company. Some APIs can be very fluffy and verbose, but the good ones are simple, concise, and straight to the point.

As I'm profiling the APIs for the companies included in my API monitoring research, what API Science, Apica, API Metrics, BMC Software, DataDog, New Relic, and Runscope offer quickly becomes pretty clear: a simple list of valuable resources you can put to use when monitoring your APIs. Crafting an OpenAPI Spec allows me to define each of these companies' APIs, and easily articulate what it is that they do--minus all the bullshit that often comes with the business side of all of this.

I feel like the detail I include for each company in an APIs.json file provides a nice view of the intent behind an API, while the details I put into the OpenAPI Spec provide insight into whether or not a company actually has any value behind this intent. It can be frustrating to wade through the amount of information some providers feel they need to publish as API documentation, but it all becomes worth it once I have the distilled down OpenAPI Spec, giving an honest view of what each company does.


SchemaHub's Usage Of Github To Launch Their API Service Is A Nice Approach

I'm looking through a new API definition focused service provider called SchemaHub today, and I found their approach to using Github as a base of operations interesting, providing a nice blueprint for other API service providers to follow. I'm continually amazed at the myriad of ways that Github can be put to use in the world of APIs, which is one of the things I love about it.

As a base for SchemaHub, they created a Github Org, and made their first repository the website for the service, hosted on Github Pages. In my opinion, this is how all API services should begin, as a repo, under an organization on Github--leveraging the social coding platform as a base for their operations.

SchemaHub is taking advantage of Github for hosting their API definition focused project--free, version controlled, static website hosting for schemahub.io. 

As I was looking through their site, learning about what they are doing, I noticed a subscription button at the bottom of the page, asking me to subscribe so they can notify me when things are ready.

Once I clicked on the button, I was taken through a Github OAuth dance, which makes SchemaHub not just a Github repo for the site, but an actual Github application that I've authenticated with using my Github account. They only have access to my profile and email, but it is the type of provider-to-developer connection I like to see in the API world.

Once I authorize and connect I am taken to a thank you page back on their website, letting me know I will be contacted shortly with any updates about the service. Oh, and I'm offered a Twitter account to follow as well, allowing me to stay in tune with what they are up to--providing a pretty complete picture of how new API services can operate.

SchemaHub's approach reflects what I'm talking about when I say that Github should offer an OAuth service, something that would enable applications running on Github to establish a Github app as part of their organization and website. I like this model because it enables connections like the one SchemaHub has established, maximizing the social powers of the Github platform.

SchemaHub wins for making a great first impression on me with their API service. A Github Org, a simple static Github Pages hosted website, connectivity with my Github profile, and a Twitter account to follow. Now I know who they are, I'm connected, and when they are ready with their API service, they have multiple channels to update me on. My only critique is that I would also like to have a blog with an Atom feed, so I can hear stories about what they are trying to accomplish, but that is something that can come later. For now, they are off to a pretty good start.


If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one-person show, and I miss quite a bit, and depend on my network to help me know what is going on.