Caching API Requests in Angular: Better, Faster and Stronger

Koye Mohan Reddy
10 min read · Sep 3, 2024


Fast and Performant

In modern web applications, frequent API requests can increase server load and slow down the user experience. By implementing an efficient caching mechanism in Angular, we can significantly reduce server requests, improve loading times, and create a more responsive application.

We have two main objectives:

  1. Cache HTTP requests to handle server requests efficiently.
  2. Architect the solution in a scalable way so it stays flexible.

This will improve user experience and reduce server load for frequently requested data.

Table of Contents:

🔍 Why and How?
🌟 Basic Implementation
⚙️ Handling Different Payloads
🔄 Cache Invalidation
🎯 Selective Caching
🚫 Cache Busting
🧠 Cache Memory
📈 End Result
📝 Conclusion
🚀 What’s next?

Why?

Some requests get triggered frequently, yet the output remains the same because the input payload is identical. This includes:

  • Same requests made by multiple components.
  • Requests repeated in short time frames, e.g. every 10 seconds, when the underlying data only refreshes once a minute.
  • and so on.

If handled efficiently, these cases provide a better user experience, faster loading times, and server load reduced by up to 100% for those calls (because we won’t be making any requests at all!).

How?

Now that we have a goal in mind, let’s look at the implementation. We have two approaches:

  • Caching at Interceptor Level.
  • Using Service Workers.

We can also build a hybrid of the two, but to keep this short we will focus on the interceptor level, which gives us a lot of flexibility and control and is easier to test.

Let’s start with a base application that fetches paginated data from the backend. For testing purposes we will use a dummy backend, i.e. a Dummy API that returns dummy JSON data.

Fetching Data without Caching
Network Snapshot

The Fetch Data button triggers this request for the above data. You can relate this to your own use case: say you have multiple disjoint forms that require the same API data and cannot share it, or you have static data that changes infrequently and can safely be cached.

Basic Implementation:

Interceptor-Based Caching:

  • Interceptor Functionality: The HTTP interceptor in Angular allows you to intercept all outgoing HTTP requests. By leveraging this, you can inject caching logic that checks for existing cache entries before making a network call. If a cached response is found, it is returned immediately, bypassing the need for an HTTP request.
  • Flexible and Testable: Interceptor-based caching is highly flexible as it centralizes the caching logic, making it easier to manage and test. Since the caching logic is separated from the core application logic, it can be adjusted or extended without significant refactoring.

import { Injectable } from "@angular/core";
import { HttpInterceptor, HttpRequest, HttpHandler, HttpResponse } from "@angular/common/http";
import { Observable, of } from "rxjs";
import { tap } from "rxjs/operators";
import { CacheService } from "./cache.service";
import { canCacheRequest } from "app/utils/cache.requests.utils";
import { APP_SERVER } from "app/config";

@Injectable()
export class CachingInterceptor implements HttpInterceptor {
  constructor(private cacheService: CacheService) {}

  intercept(req: HttpRequest<any>, next: HttpHandler): Observable<any> {
    const cacheKey = this.createCacheKey(req.urlWithParams, req.body);
    const cachedResponse = this.cacheService.getCache(cacheKey);
    if (cachedResponse) {
      return of(cachedResponse); // Return cached response if available
    }

    return next.handle(req).pipe(
      tap((event) => {
        if (event instanceof HttpResponse && canCacheRequest(req)) {
          this.cacheService.setCache(cacheKey, { data: event, maxAge: 90000 });
        }
      })
    );
  }

  /** Builds a cache key from the URL and a hash of the request body */
  private createCacheKey(url: string, body: any): string {
    const bodyHash = this.simpleHash(JSON.stringify(body)); // the hash keeps the key short
    return `${url}_${bodyHash}`;
  }

  /** Generates a hash to be appended to the key */
  private simpleHash(str: string): string {
    let hash = 0;
    if (str.length === 0) return hash.toString();
    for (let i = 0; i < str.length; i++) {
      const char = str.charCodeAt(i);
      hash = (hash << 5) - hash + char;
      hash = hash & hash; // Convert to 32-bit integer
    }
    return hash.toString();
  }
}

For functional interceptors in newer Angular versions, the core logic remains the same.
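
For reference, here is a minimal, hedged sketch of the same idea written as a functional interceptor with Angular’s HttpInterceptorFn and inject, reusing the CacheService and canCacheRequest utility from above (the inline key-building below is a simplification of the hashed key used earlier):

import { inject } from '@angular/core';
import { HttpInterceptorFn, HttpResponse } from '@angular/common/http';
import { of } from 'rxjs';
import { tap } from 'rxjs/operators';
import { CacheService } from './cache.service';
import { canCacheRequest } from 'app/utils/cache.requests.utils';

// Sketch: functional equivalent of the class-based CachingInterceptor above
export const cachingInterceptorFn: HttpInterceptorFn = (req, next) => {
  const cacheService = inject(CacheService);
  // Simplified key; in practice reuse the same URL + body-hash scheme as above
  const cacheKey = `${req.urlWithParams}_${JSON.stringify(req.body)}`;

  const cachedResponse = cacheService.getCache(cacheKey);
  if (cachedResponse) {
    return of(cachedResponse); // serve from cache, skip the network
  }

  return next(req).pipe(
    tap((event) => {
      if (event instanceof HttpResponse && canCacheRequest(req)) {
        cacheService.setCache(cacheKey, { data: event, maxAge: 90000 });
      }
    })
  );
};

It would be registered with provideHttpClient(withInterceptors([cachingInterceptorFn])).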

Let’s dive deep into what’s happening.

The basic gist is simple: we intercept requests, keep the data in a map for easier and faster access, and update its value with each new payload.

For that we can use a separate service to keep our interceptor less cluttered.

import { Injectable } from '@angular/core';

@Injectable({
  providedIn: 'root',
})
export class CacheService {
  private cache = new Map<string, any>();

  constructor() {}

  /** Store data against a key */
  setCache(key: string, data: any) {
    this.cache.set(key, data);
  }

  /** Fetch cached data by key */
  getCache(key: string): any {
    const cacheEntry = this.cache.get(key);
    if (cacheEntry) return cacheEntry.data; // unwrap the stored response
    return null;
  }

  /** Delete cached data */
  deleteCache(key: string) {
    this.cache.delete(key);
  }
}
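
With the interceptor and the cache service in place, the class-based interceptor still has to be registered with Angular’s HTTP client. A minimal sketch for a module-based app, with file paths assumed:

import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { HttpClientModule, HTTP_INTERCEPTORS } from '@angular/common/http';
import { AppComponent } from './app.component';
import { CachingInterceptor } from './interceptors/caching.interceptor';

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule, HttpClientModule],
  providers: [
    // multi: true lets this interceptor coexist with any others you register
    { provide: HTTP_INTERCEPTORS, useClass: CachingInterceptor, multi: true },
  ],
  bootstrap: [AppComponent],
})
export class AppModule {}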

Handling Different Payloads:

Unique Key Generation: When caching requests with different payloads but the same endpoint, generating a unique cache key is crucial. This can be achieved by hashing the request body and appending it to the URL. This ensures that each unique request is stored separately, preventing data conflicts.

For the map entry we can use the URL with its params as the key. However, that still leaves the scenario of requests with the same API path but different payloads. To handle this we generate a simple hash of the body and append it to the key, making the key unique per payload.

  private simpleHash(str: string): string {
    let hash = 0;
    if (str.length === 0) return hash.toString();
    for (let i = 0; i < str.length; i++) {
      const char = str.charCodeAt(i); // character code of each character
      hash = (hash << 5) - hash + char; // prevents the hash from growing too large
      hash = hash & hash; // Convert to 32-bit integer
    }
    return hash.toString();
  }

Note: the generated hash is always the same for a given input, which will later be useful when we need to delete a cache entry.
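
To illustrate, a small hypothetical usage, assuming createCacheKey and simpleHash are extracted into a shared utility so they can be called outside the interceptor:

// Identical payloads always map to the same cache key
const keyA = createCacheKey('/api/todos', { page: 1, pageSize: 10 });
const keyB = createCacheKey('/api/todos', { page: 1, pageSize: 10 });
console.log(keyA === keyB); // true -> the second request can be served from cache

// A different payload produces a different key, so it is cached separately
const keyC = createCacheKey('/api/todos', { page: 2, pageSize: 10 });
console.log(keyA === keyC); // false

One caveat: JSON.stringify is order-sensitive, so the same fields serialized in a different property order would produce a different key.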

Cache Invalidation:

Data gets updated regularly; in those cases we need to invalidate the old cache and fetch the latest data. We will add a timestamp to each entry and validate it the next time the entry is fetched.

import { Injectable } from '@angular/core';

@Injectable({
  providedIn: 'root',
})
export class CacheService {
  private cache = new Map<string, any>();

  constructor() {}

  /** Set cached data for a key */
  setCache(key: string, data: any) {
    // add the timestamp inside the service, as it is common to all entries
    this.cache.set(key, { ...data, timestamp: new Date().getTime() });
  }

  /** Fetch cached data for a key */
  getCache(key: string): any {
    const cacheEntry = this.cache.get(key);
    if (cacheEntry) {
      const { data, timestamp, maxAge } = cacheEntry;
      const currentTime = new Date().getTime();
      if (currentTime - timestamp < maxAge) {
        return data;
      } else {
        this.cache.delete(key); // Invalidate expired cache
      }
    }
    return null;
  }

  /** Delete cached data */
  deleteCache(key: string) {
    this.cache.delete(key);
  }
}

This would cache every request in our application, but some requests always need the latest data. So instead, we will cache only the requests that actually benefit from it, such as common shared services.

Selective Caching:

Why Selective Caching?

  • Use Case: Not all API requests should be cached. For example, requests that fetch real-time data, such as stock prices or user authentication tokens, should always retrieve the latest information from the server. On the other hand, static or infrequently changing data, like product catalogs or configuration settings, are ideal candidates for caching.
  • Efficiency: By selectively caching only the necessary API calls, you can significantly optimize your application’s performance without compromising data accuracy. This approach reduces server load for non-critical requests while ensuring that critical data remains fresh and up-to-date.

How to Implement Selective Caching?

  • Pattern Matching: Use regular expressions to define patterns for URLs that should or should not be cached. This allows for fine-grained control over which requests are cached.
  • Whitelisting and Blacklisting: Maintain lists of API endpoints that should always be cached (whitelisted) and those that should never be cached (blacklisted). This approach ensures that only appropriate data is stored in the cache.

For that we will write a separate function that matches the API path against the list of whitelisted APIs.

Note: we’ll use regex as it provides more flexibility for handling similar and common paths.

We can also keep a list of APIs that should never be cached, in case we decide to cache everything by default. The utility below handles both scenarios.

import { HttpRequest } from '@angular/common/http';

type CachePattern = {
  urlPattern: RegExp;
};

// Map of HTTP methods to their cacheable URL patterns (whitelist)
const CACHE_PATTERNS: { [method: string]: CachePattern[] } = {
  GET: [
    { urlPattern: /^https?:\/\/[^\/]+\/todos$/ },
    // Add more GET patterns here
  ],
  PUT: [
    // { urlPattern: /^\/api\/account\/todos$/ },
    // { urlPattern: /^\/api\/account\/todos\/\d+\/\d+$/ }, // paginated services
    // Add more PUT patterns here
  ],
  // Add more HTTP methods and their patterns as needed
};

export const DELETE_PATTERNS = {
  '/api/add/todos': true,
};

// Patterns for URLs that should never be cached (blacklist)
const NEVER_CACHE_PATTERNS: RegExp[] = [
  /^\/api\/account\/generateOTP$/,
  // Add more patterns as needed
];

export function canCacheRequest(req: HttpRequest<any>): boolean {
  // Strip the host for same-origin requests; keep the full URL otherwise
  const urlWithoutHost = req.url.split(window.origin)[1] || req.url;

  // Check if the request should never be cached
  if (NEVER_CACHE_PATTERNS.some((pattern) => pattern.test(urlWithoutHost))) {
    return false;
  }

  // Check if the request matches any of our cache patterns
  const methodPatterns = CACHE_PATTERNS[req.method];
  if (methodPatterns) {
    const matchedPattern = methodPatterns.find((pattern) =>
      pattern.urlPattern.test(urlWithoutHost)
    );
    if (matchedPattern) {
      console.log(`${matchedPattern.urlPattern.source} matched URL`);
      return true;
    }
  }

  // Not whitelisted: do not cache by default
  // (flip this to `return true` for a cache-everything-except-blacklist mode)
  return false;
}

Cache Busting:

Now let’s say we perform certain CRUD operations, after which the old cached data is invalid. We need to delete that old cache, and since our hashing produces the same key for the same request, we can rebuild the key and use it to delete the cached data.

Why is Cache Busting Necessary?

  • Data Accuracy: Cached data can become outdated if the underlying data changes. For example, if a user updates their profile information, the cached version of that data should be invalidated to ensure the application displays the most current information.
  • Maintaining Integrity: Cache busting ensures that your application doesn’t serve stale or incorrect data to users, especially after performing CRUD (Create, Read, Update, Delete) operations.

How to Implement Cache Busting?

  • Key-based Invalidation: Use a consistent hashing method to generate keys for caching. This allows you to easily identify and delete specific cached entries when the data they represent is updated.
  • Automatic Invalidation: Implement mechanisms to automatically invalidate cache entries after a certain period or when specific conditions are met, such as a successful POST, PUT, or DELETE request.

The snippet below is written as private helpers on the interceptor class; API_ENDPOINT and HOST stand in for your actual endpoint path and API host.

// Inside CachingInterceptor: invalidate stale cache entries after mutating requests
private cacheBust(req: HttpRequest<any>) {
  if (req.method === 'DELETE' || req.method === 'POST') {
    if (req.url.includes('API_ENDPOINT')) { // endpoint path without the host
      // The generated hash is always the same for the same input,
      // so we can rebuild the key and delete the stale entry.
      this.cacheService.deleteCache(this.generateDeleteURL('API_ENDPOINT'));
    }
  }
}

// URL generator: rebuilds the cache key for a given endpoint
private generateDeleteURL(key: string): string {
  const url = `${HOST}/${key}`; // HOST is the API base URL
  return this.createCacheKey(url, null);
}
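
To wire this in, the busting step can run at the start of intercept, before the cache lookup. Here is a sketch of how the earlier intercept method could look with that call added (same class and helpers as above):

intercept(req: HttpRequest<any>, next: HttpHandler): Observable<any> {
  this.cacheBust(req); // drop stale entries for mutating requests first

  const cacheKey = this.createCacheKey(req.urlWithParams, req.body);
  const cachedResponse = this.cacheService.getCache(cacheKey);
  if (cachedResponse) {
    return of(cachedResponse);
  }

  return next.handle(req).pipe(
    tap((event) => {
      if (event instanceof HttpResponse && canCacheRequest(req)) {
        this.cacheService.setCache(cacheKey, { data: event, maxAge: 90000 });
      }
    })
  );
}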

Cache Memory:

Preventing Cache Overflow: To prevent the cache from consuming too much memory, implement a cache size limit. When this limit is reached, older or less frequently used cache entries are purged to make room for new data. This approach ensures that the cache remains performant and does not degrade the application’s memory usage.

We will set a maximum cache size of 50 MB; you can adjust it to your requirements.

On each cache update we check whether adding the new entry would exceed the limit. If so, we evict the oldest entries until there is enough room, then store the new data; otherwise we continue as usual.

import { Injectable } from '@angular/core';

@Injectable({
  providedIn: 'root',
})
export class CacheService {
  private cache = new Map<string, any>();
  private maxCacheSize = 50 * 1024 * 1024; // 50 MB in bytes

  constructor() {}

  setCache(key: string, data: any) {
    const size = this.getObjectSize(data);
    this.manageCacheSize(size); // make room for the new entry if needed
    this.cache.set(key, { data, size, timestamp: new Date().getTime() });
  }

  /** Evict the oldest entries until the new entry fits within the limit */
  private manageCacheSize(newEntrySize: number) {
    let currentSize = this.getCacheSize();
    while (currentSize + newEntrySize > this.maxCacheSize && this.cache.size > 0) {
      const oldestKey = this.getOldestCacheKey();
      if (oldestKey) {
        const entrySize = this.cache.get(oldestKey)?.size || 0;
        this.cache.delete(oldestKey);
        currentSize -= entrySize;
      }
    }
  }

  /** Current total cache size in bytes */
  private getCacheSize(): number {
    let totalSize = 0;
    this.cache.forEach((entry) => {
      totalSize += entry.size;
    });
    return totalSize;
  }

  /** Key of the entry with the oldest timestamp */
  private getOldestCacheKey(): string | undefined {
    let oldestKey: string | undefined;
    let oldestTimestamp = Infinity;

    this.cache.forEach((entry, key) => {
      if (entry.timestamp < oldestTimestamp) {
        oldestTimestamp = entry.timestamp;
        oldestKey = key;
      }
    });

    return oldestKey;
  }

  /** Estimate the size of an object in bytes */
  private getObjectSize(obj: any): number {
    const jsonString = JSON.stringify(obj);
    return new Blob([jsonString]).size;
  }
}

At the end we have an application architecture that looks something like this.

End Result:

Before:

With no Caching

After:

With Caching

Since requests are now fulfilled almost instantly, we don’t even see the loading indicator from before 🎉. If that becomes a UX problem, you can force the loading state for a short time.

You can find the complete source code here.

Conclusion:

We implemented caching, handled the different scenarios we might encounter, and established a scalable architecture that is flexible and can be rolled out to end users gradually.

By incorporating these caching strategies, our Angular application will be more efficient, with reduced server load and faster response times. Selective caching and cache busting ensure that only relevant data is cached and that stale data is automatically removed. These improvements lead to a more responsive and reliable user experience.

What’s next?

We can use service workers so that cached data persists even after the user reloads the page and that in-memory instance is destroyed.

Why Consider Service Workers?

  • Offline Support: Service workers can enable offline capabilities by caching assets and API responses, allowing your application to function even when the user is disconnected from the internet. This is particularly useful for Progressive Web Apps (PWAs).
  • Persistent Data: Unlike in-memory caching, which is lost when the application is closed or the page is refreshed, service workers provide persistent caching that remains intact across sessions. This ensures that frequently accessed data is available even after a page reload, improving user experience.
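
As a starting point, Angular’s @angular/service-worker package can take over this role. A minimal, hedged sketch of enabling it in a standalone bootstrap (API caching itself is then configured declaratively via dataGroups in ngsw-config.json):

import { ApplicationConfig, isDevMode } from '@angular/core';
import { provideHttpClient } from '@angular/common/http';
import { provideServiceWorker } from '@angular/service-worker';

// Sketch: register Angular's built-in service worker for a standalone app
export const appConfig: ApplicationConfig = {
  providers: [
    provideHttpClient(),
    provideServiceWorker('ngsw-worker.js', {
      enabled: !isDevMode(),
      // wait until the app is stable (or 30s) before registering the worker
      registrationStrategy: 'registerWhenStable:30000',
    }),
  ],
};

From there, a dataGroups entry per API path controls the caching strategy, maxAge, and maxSize of cached responses.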

Thank you for your time.



Koye Mohan Reddy

I'm a Frontend Developer at Faclon Labs. Angular | Angular Material | Typescript | JavaScript | Linkedin : https://www.linkedin.com/in/kmohan-reddy/