SF’s Top 7 Best Practices for REST APIs

Posted by: Manpreet Singh

Early on in my career, I understood that a coder needs to evolve quickly and continuously or become obsolete; being in one of the most dynamic fields, I also realized that to keep up with the fast-paced developments, it was good to evangelize best practices.

And at SourceFuse, it has been an ongoing endeavor for my team and me to set processes and define best practices for much of what we do. Being partner focused, we concentrate on building market-ready products and delivering what our clients envisioned. We need to move fast, try new tech…and still come out…all the better for it! We just turned 11 this year, and with over 1,000 web and mobile products under our belt, let me say it’s been one fantastic learning curve – exponential even.

With the need for many market-ready, cloud-first products, we were early adopters of the API-first development model. Having an API-first model helped us accommodate architecture enhancements in future releases, and it has been a phenomenal journey experimenting with and implementing best practices for REST APIs!

Here are “SF’s Top 7 Best Practices for REST APIs”, which we feel are worth following. (Please note we are extensive users of NodeJS and its frameworks for exposing REST APIs, so many of these best practices reference NodeJS-specific implementations.)

  • Document API First 

A few years back there was an upsurge in ‘design methodology’, wherein proponents advocated an API-first design, recommending that REST APIs be written for all business logic so they could be consumed by anyone within or outside an organization. The idea was that even when used by a webapp, the business logic should be exposed as a REST API, and any future integrations with mobile or other third parties would consume the same REST APIs. However, APIs cannot be consumed without documentation – thus came into existence the ‘Document API First’ model.

The ‘Document API First’ model demands that all documentation be complete and ready for consumption by a human or a machine before a single line of code is written – this means the document is structured, not free-flowing. In our experience, these documents lessen ambiguity, since we have the option to get further validation from the clients.

While there are many ways to document REST APIs, we have found ‘Swagger’ to be useful (Swagger has been renamed ‘OpenAPI’, though it is still referred to as Swagger in the open source world!). A Swagger file can be used to define API endpoints, data input formats, output data models, server environments, authentication schemes, etc.; the detailed specification can be found here.

Now, the choice comes down to whether to write this documentation by hand or generate it from code. If the code has not been set up yet, it is advisable to write a Swagger file; if a code framework is already in place, we can declare the API functions (note – just declare, not code) and put function headers on them per the API documentation format. These function headers can then be consumed by tools such as swagger-jsdoc to generate the documentation.
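As a sketch of that "declare, don't code" approach, a route handler can carry its swagger-jsdoc-style JSDoc annotation from day one while the body stays a stub. The route and handler names below are illustrative, not from any particular project:

```javascript
/**
 * @openapi
 * /article:
 *   get:
 *     summary: List all articles
 *     responses:
 *       200:
 *         description: An array of articles
 */
function listArticles(req, res) {
  // Declared and documented, but not yet implemented: the contract in the
  // header above is what consumers and doc generators work against for now.
  res.statusCode = 501; // 501 Not Implemented
  res.end(JSON.stringify({ error: 'Not implemented' }));
}
```

Running swagger-jsdoc over files like this assembles the annotated headers into a single OpenAPI document, so the docs live next to the code they describe.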

A Swagger file is a structured file and can be viewed by anyone aware of its structure by simply opening it in any text editor. Some consumers of the API, however, won’t want to spend time learning that structure. There are quite a few tools, such as swagger-editor, that render Swagger files without requiring knowledge of the file format – and some even let you try calls to the API without using Postman or the likes of it.

  • Use API gateway

An ‘API gateway’ is a layer that sits between API consumers and the business logic, enforcing the API documentation, and its use is imperative; the gateway ensures:

  1. Only valid data, as defined by the API documentation, is passed to the API. This also reduces vulnerabilities in the application, since data gets validated before reaching the code.
  2. Only filtered data, as defined by the API documentation, goes out from the server. This ensures that no secret data leaks out.
  3. Documentation never goes out of sync. Since the API gateway ensures that only documented APIs are exposed to the outside world, there is no backdoor entry into the backend.
  4. Clients of the API can consume it while the backend developer is still developing the business logic – we can mock the response of an API without a single line of backend code being written. This does require an integration phase once both the backend developer and the client are done, and there will be bugs to fix in that phase; the better the documentation, the fewer the bugs in integration.
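To illustrate point 4, here is a minimal sketch of serving a documented example response before any business logic exists. The spec fragment and function names are illustrative assumptions, not a real gateway implementation:

```javascript
// A fragment mirroring (loosely) an OpenAPI document, with an example
// response attached to the documented 200 status.
const spec = {
  '/article': {
    get: {
      responses: {
        200: { example: [{ id: 1, title: 'Hello' }] },
      },
    },
  },
};

// Return the documented example for an endpoint, or 404 if it is not
// documented -- mirroring how a gateway refuses undocumented routes.
function mockResponse(spec, path, method) {
  const op = spec[path] && spec[path][method];
  if (!op) return { status: 404, body: { error: 'Not documented' } };
  return { status: 200, body: op.responses[200].example };
}
```

With something like this in front, the client team can integrate against realistic payloads while the real handlers are still being built.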

We have used ‘AWS API Gateway’ in the past, and there are quite a few others available. For instance, the ‘swagger’ npm package is quite interesting and can be used for projects that have NodeJS as the backend – we did a POC and liked it a lot!

  • Use REST specs to our advantage

REST builds on well-defined HTTP specifications and conventions, and we can use them to our advantage when building REST APIs. A few of the specifications or conventions that help us manage client expectations as well as behavior:

  1. API routes should use nouns instead of verbs to identify the resource category. Below are a few examples of good API routes per REST conventions:
    1. GET /article – to get all articles
    2. POST /article – to create an article
    3. GET /article/:articleID – to get specific article
    4. PATCH /article/:articleID – to update specified article
    5. DELETE /article/:articleID – to delete specified article
  2. Return HTTP status codes as per the specifications. This wiki page lists all the status codes concisely. Before returning any HTTP code, it is best to check the list for a more appropriate one.
  3. For any ‘GET all’ request, it is best to implement pagination logic. A pagination parameter like ‘page’ can be expected as a query parameter in the route and used to return data for that specific page; this keeps both the client and the server performant by not sending huge amounts of data.
  4. As the name suggests, a GET request should not update DB state. If there is a situation where DB state must be updated on a GET request, it needs to be done in a controlled fashion and only in special situations.
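The route and pagination conventions above can be sketched with nothing but Node’s built-in URL parsing; the in-memory data store and page size here are illustrative assumptions:

```javascript
// Illustrative data store: 25 articles, served 10 per page.
const articles = Array.from({ length: 25 }, (_, i) => ({
  id: i + 1,
  title: `Article ${i + 1}`,
}));
const PAGE_SIZE = 10;

// Noun-based route, paginated GET-all, and no state change on GET.
function handle(req, res) {
  const url = new URL(req.url, 'http://localhost');
  if (req.method === 'GET' && url.pathname === '/article') {
    const page = Number(url.searchParams.get('page') || 1);
    const start = (page - 1) * PAGE_SIZE;
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify(articles.slice(start, start + PAGE_SIZE)));
    return;
  }
  res.writeHead(404, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ error: 'Not found' }));
}
```

A request to `GET /article?page=2` returns articles 11–20 rather than the whole collection, which is exactly the performance point made in item 3.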

  • Modularize auth, ACL & data validation policies

There are a couple of ways in which authentication and authorization can be handled in an API.

  1. The least effective and most bug-prone is writing auth code separately for each API. This approach will not work, because developers will miss checks in one place or another, introducing vulnerabilities into the application.
  2. Another way is to write the auth code in an ‘aspect’, also called middleware in NodeJS applications.
  3. The most effective option is a fixed set of authentication and authorization policies, which is any day better than writing code directly in the aspect/middleware – complicating the auth code is the last thing any architect or developer would like to do! Writing auth code directly in the middleware of a complex codebase complicates it further, adding more vulnerabilities to the auth code. Instead, we recommend an auth policy factory that generates an auth policy based on the request passed. This policy is triggered in the aspect/middleware and is asked to validate whether the API user is allowed to do what it is trying to do.
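A minimal sketch of such an auth policy factory follows; the request/user shape and the policy names are illustrative assumptions:

```javascript
// Each policy is a small, testable predicate.
const policies = {
  adminOnly: (user) => Boolean(user) && user.role === 'admin',
  ownerOrAdmin: (user, resource) =>
    Boolean(user) && (user.role === 'admin' || resource.ownerId === user.id),
  anyAuthenticated: (user) => Boolean(user),
};

// The factory: pick a policy based on the request. Mapping by HTTP method
// is a simplification; a real factory might also inspect the route.
function policyFor(req) {
  if (req.method === 'DELETE') return policies.adminOnly;
  if (req.method === 'PATCH') return policies.ownerOrAdmin;
  return policies.anyAuthenticated;
}

// Middleware-style check: the aspect only dispatches, it never embeds
// auth logic of its own.
function authorize(req, resource) {
  return policyFor(req)(req.user, resource);
}
```

The middleware stays a one-liner, and new rules are added by writing a new policy rather than by touching the dispatch code.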

The same logic applies to ACL. Instead of hardcoding ACL logic inside each API, it is far better to make it a cross-cutting concern, much like the auth policies mentioned above: policies where the database access/update ‘where’ conditions are defined based on common logic, so it is just a matter of picking the right policy for the situation.

Similarly for validation policies: input data validation should be applied to avoid XSS and SQL injection attacks. Most frameworks take care of this by default, but it is important to verify this before adopting any framework. To avoid SQL injection attacks, it is important to use an ORM or parameterized queries.
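To make the SQL injection point concrete, the sketch below contrasts string concatenation with a parameterized query. The `db.query(text, values)` shape mirrors common Node drivers such as node-postgres, but treat this as a sketch rather than a specific API:

```javascript
// Vulnerable: user input is spliced straight into the SQL text, so a
// crafted name can rewrite the query.
function findUserUnsafe(db, name) {
  return db.query(`SELECT * FROM users WHERE name = '${name}'`);
}

// Safe: the value travels separately from the SQL text, and the driver
// is responsible for escaping it.
function findUserSafe(db, name) {
  return db.query('SELECT * FROM users WHERE name = $1', [name]);
}
```

With the unsafe version, an input like `x' OR '1'='1` becomes part of the query itself; with the parameterized version it is only ever data.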

  • Log API access

Logging each API access is important for analyzing quite a few factors.

Examples are:

  1. The country/city from which the APIs are being accessed; this can validate whether the application is being built or deployed for the audience of that country.
  2. The frequency of access by time, country, IP, etc. This can be used for capacity planning of the infrastructure.
  3. The percentage of successes/failures. This can identify whether the application has a bug leading to a high failure rate.
  4. …and many others.

We have used analytic tools like the ELK (Elasticsearch, Logstash, Kibana) stack for this kind of analysis; there are quite a few other open-source tools with similar capabilities. While using NodeJS, we had to put a server (like Apache) in front of it, which helped in fetching the access and error logs and moving them to Elasticsearch using Logstash. A relatively cheaper alternative, if you don’t require extensive analysis, is AWS CloudWatch. The same should also be done for database logs to capture slow queries; based on the slow queries, we can run EXPLAIN plans to identify the reason for the slowness.
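As a sketch of what a structured access-log entry might look like – one JSON line per request, easy to ship to Logstash or CloudWatch – the field names below are illustrative assumptions, not any tool’s required format:

```javascript
// Build one structured log line per request. Callers would write the
// returned string to stdout or a log file for the shipper to pick up.
function accessLogLine(req, statusCode, durationMs) {
  return JSON.stringify({
    time: new Date().toISOString(),
    method: req.method,
    path: req.path,
    ip: req.ip,               // source IP, for geo/frequency analysis
    status: statusCode,
    durationMs,               // latency, for capacity planning
    outcome: statusCode < 400 ? 'success' : 'failure',
  });
}
```

Because every line is self-describing JSON, the success/failure and per-IP analyses listed above become simple aggregations in Kibana or CloudWatch Logs Insights.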

  • Monitor the APIs and infrastructure

We can certainly analyse logs in retrospect, and nowadays near-real-time log analysis is also available – but what if the API server has a memory leak, is draining CPU, or is throwing exceptions? A developer would want to monitor the API server live. There are quite a few tools available in the market for this kind of monitoring; we have extensively used Keymetrics, from the creator of PM2 (a process manager used extensively with NodeJS).

Infrastructure monitoring is very much dependent on the infra provider being used. For AWS, we use AWS CloudWatch to monitor the infrastructure as well as raise alarms when thresholds are breached.

  • Implement throttling on the API

This helps prevent someone from using the infrastructure beyond what is defined in the pricing plan or fair-usage policy. While using AWS API Gateway we got this feature for free; AWS API Gateway also helps mitigate denial-of-service attacks. In general, every programming language or framework has libraries or packages available to achieve throttling at the application-server level.
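A minimal sketch of application-level throttling, using a fixed-window counter per client key (an API key or IP, say) – the limit, window size, and injectable clock below are illustrative choices, of the kind libraries like express-rate-limit implement more robustly:

```javascript
// Returns an allow(key) function: true while the key is under its limit
// for the current window, false once it should be throttled (HTTP 429).
function makeRateLimiter(limit, windowMs, now = Date.now) {
  const windows = new Map(); // key -> { start, count }
  return function allow(key) {
    const t = now();
    let w = windows.get(key);
    if (!w || t - w.start >= windowMs) {
      // Window expired (or first request): start a fresh one.
      w = { start: t, count: 0 };
      windows.set(key, w);
    }
    w.count += 1;
    return w.count <= limit;
  };
}
```

Injecting the clock (`now`) keeps the limiter testable; in middleware, a `false` return would translate to a `429 Too Many Requests` response.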

We will conclude this blog here and capture some bonus best practices in a follow-up post.

With over 18 years of experience, Manpreet is the Chief Technology Officer at SourceFuse, leading the technology frat pack. Manpreet loves mentoring and running POCs! He graduated from Thapar University (Patiala) and BITS (Pilani). Reach out to us for any questions at support@sourcefuse.com!