Caching HTTP Requests In Angular

Requesting “lookup” data from an API in Angular can quickly weigh your application down if you’re having to grab the same data over and over. The type of lookup data I’m talking about is, say, a fixed country list, a set of user roles, or any other set of data that is rarely going to change on a daily, weekly, or even monthly basis.

I recently tried to solve this by implementing a level of caching in the Angular app that was purely in memory. I figured that if I cache everything in memory, then a simple refresh of the page “clears” the cache, but requests to grab any lookup data on multiple pages won’t result in multiple calls to my API. I also figured it was going to be the easiest way to approach things.

Avoiding Interceptors

There’s a tendency in Angular that when you are dealing with anything HTTP request related, you create an interceptor for it. I’ve been guilty of this in the past for sure! But the problem with interceptors is that they’re a bit of an all-or-nothing approach. Opting out of interceptors is not that easy, and “configuring” when an interceptor runs and when it doesn’t is also a bit of a black box. For example, take a piece of code that looks like this:

this.http.get(`myAPI.com`);

Does this cache your request? Does it run any interceptors at all? It’s hard to say without really digging into the code.

And don’t get me wrong, I do use interceptors a lot. But generally speaking, I try to make it so that in the overwhelming majority of cases, I want the interceptor to run. In our case, we would only want a caching interceptor to run on a few select endpoints.

CachedHttpClient Service

Instead, I created a CachedHttpClient service that functions more or less like the regular HttpClient you already use in Angular.

The code:

import { Injectable } from '@angular/core';
import { Observable, of } from 'rxjs';
import { HttpClient } from '@angular/common/http';
import { map } from 'rxjs/operators';

@Injectable({
    providedIn: 'root'
})
export class CachedHttpClient
{
    // Cache keyed by the full request URL (including any query string).
    private cachedItems : { [url: string]: any } = {};

    constructor(private http : HttpClient) {

    }

    getCached<T>(url : string) : Observable<T> {
        // Serve from the cache if we've already fetched this URL.
        if(this.cachedItems[url] !== undefined)
        {
            return of(this.cachedItems[url] as T);
        }

        // Otherwise make the HTTP call and record the result on the way through.
        return this.http.get<T>(url).pipe(map((item : T) => {
            this.cachedItems[url] = item;
            return item;
        }));
    }
}

All this does is check, before running the HTTP call, whether the exact URL (query string included) is already in the cache. If it isn’t, we run the HTTP call, add the result to the cache, and return it. The next request for the same URL hits the cache and doesn’t have to make the HTTP call at all!

We can inject it into a service like so:

export class MyService {
  constructor(private http : HttpClient, private cachedHttp : CachedHttpClient) { 
  }

  getLookupData() {
    return this.cachedHttp.getCached<LookupData>(`myAPI/LookupData`);
  }
}

Notice that it’s extremely easy to see what’s cached and what’s not. Not only are we using a class called CachedHttpClient, but the actual method call is “getCached”. There’s no hidden magic going on for a developer to shoot themselves in the foot.

Limitations

It is worth pointing out a few limitations.

If a user refreshes their browser (or closes it), the cache is lost. In my case this is acceptable, because I only really want to cache values as users move through the site. I don’t mind that after a hard refresh the first few pages have to fetch those values again and re-cache them. In fact, I somewhat like that a user doesn’t have to know any “tricks” for clearing their “browser cache”; they just have to refresh.

And finally, the above code doesn’t handle race conditions all that well (e.g. if two components on the same page request the same lookup data at the same time, the call goes out twice). Again, to me this was acceptable. Introducing some sort of lock mechanism felt like overkill, and in 99% of cases you aren’t going to request the same data at exactly the same time on one page (or you shouldn’t, anyway!).
