
Optimizing API Calls in React: Debouncing, Throttling, and Caching Explained

August 21, 2025 · 6 min read · By ℵi✗✗

Learn how to reduce unnecessary API calls and improve app performance in React by using debouncing, throttling, and caching. Includes practical examples with `useEffect`, `useCallback`, and popular utility libraries.

Tags: webdev · react · javascript

ℵi✗✗ · Full-Stack Developer

Passionate about building tools and sharing knowledge with the developer community.


A search input that fires a fetch request on every keystroke is one of the most common performance problems in React applications. A five-character query produces five network requests in under a second. Across thousands of users, the backend load is significant. For the individual user, the result is often a flickering UI as responses arrive out of order and overwrite each other.

Three techniques address this problem at different points: debouncing delays the request until the user pauses typing, throttling caps how frequently requests can fire, and caching stores previous results so repeated queries skip the network entirely. This guide covers each one with working implementations and explains when combining them makes sense.

What this covers:

  • Debouncing with useCallback and lodash.debounce

  • Throttling scroll and event handlers with lodash.throttle

  • Simple in-memory caching with useRef

  • Race condition handling with AbortController

  • When to use each technique and when to combine them

  • When a library like TanStack Query handles all three automatically


The Problem: Uncontrolled API Calls

This is the pattern that creates the problem:

// Inside a component holding `query` and `results` state:
useEffect(() => {
    fetch(`/api/search?q=${query}`)
        .then(res => res.json())
        .then(data => setResults(data));
}, [query]);

Every change to query triggers a fetch. There is no delay, no rate limit, and no check for whether the same query has already been fetched. There is also a race condition: if two requests are in flight and the earlier one resolves after the later one, the UI displays stale results.
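The race can be reproduced without React. The sketch below uses a fake fetch with illustrative delays to simulate two overlapping requests where the earlier one resolves after the later one:

```typescript
// Simulates the race: whichever request resolves LAST wins, so the UI
// ends up showing results for the older query. `fakeFetch` and the
// delays are illustrative stand-ins for real network latency.
let displayed = "";

const fakeFetch = (query: string, delayMs: number) =>
    new Promise<string>(resolve =>
        setTimeout(() => resolve(`results for "${query}"`), delayMs)
    );

// User types "re", then quickly extends it to "react".
fakeFetch("re", 300).then(r => (displayed = r));    // slow, resolves second
fakeFetch("react", 100).then(r => (displayed = r)); // fast, resolves first

setTimeout(() => console.log(displayed), 400); // results for "re" (stale!)
```

The later, more specific query resolves first and is then overwritten by the stale response. Aborting the earlier request, as shown below, removes this failure mode.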

The sections below address each of these issues.


Debouncing API Calls

Debouncing delays the execution of a function until a specified amount of time has passed since it was last called. A user typing "react" triggers the function once, 500ms after the last keystroke, rather than five times in rapid succession.

Install lodash.debounce and its types:

npm install lodash.debounce
npm install -D @types/lodash.debounce

import { useState, useCallback, useEffect, useRef } from "react";
import debounce from "lodash.debounce";

function SearchBox() {
    const [query, setQuery] = useState("");
    const [results, setResults] = useState<{ name: string }[]>([]);
    const abortRef = useRef<AbortController | null>(null);

    const fetchResults = useCallback(
        debounce(async (value: string) => {
            if (abortRef.current) {
                abortRef.current.abort();
            }
            abortRef.current = new AbortController();

            try {
                const res = await fetch(`/api/search?q=${value}`, {
                    signal: abortRef.current.signal,
                });
                const data = await res.json();
                setResults(data);
            } catch (err) {
                if ((err as Error).name !== "AbortError") {
                    console.error(err);
                }
            }
        }, 500),
        []
    );

    useEffect(() => {
        return () => {
            fetchResults.cancel();
        };
    }, [fetchResults]);

    return (
        <div>
            <input
                type="text"
                value={query}
                onChange={e => {
                    setQuery(e.target.value);
                    fetchResults(e.target.value);
                }}
                placeholder="Search..."
            />
            <ul>
                {results.map((r, idx) => (
                    <li key={idx}>{r.name}</li>
                ))}
            </ul>
        </div>
    );
}

Two additions compared to the basic debounce example most guides show:

The AbortController cancels the previous in-flight request when a new one starts. Without this, debouncing reduces the number of requests but does not prevent the race condition where an earlier response arrives after a later one.

The useEffect cleanup calls fetchResults.cancel() to cancel any pending debounced call when the component unmounts. Without this, a debounced call could fire after the component is gone and attempt to update state on an unmounted component.

When to use debouncing: search inputs, form validation on typing, autocomplete fields, any input that should trigger an action only after the user has finished a burst of changes.


Throttling Event Handlers

Throttling ensures a function runs at most once per specified interval, regardless of how many times it is triggered. Unlike debouncing, which waits for a pause, throttling produces regular updates at a controlled rate.

npm install lodash.throttle
npm install -D @types/lodash.throttle

import { useEffect } from "react";
import throttle from "lodash.throttle";

function ScrollTracker() {
    useEffect(() => {
        const handleScroll = throttle(() => {
            console.log("Scroll position:", window.scrollY);
            // could trigger pagination, analytics, or visibility checks
        }, 200);

        window.addEventListener("scroll", handleScroll);

        return () => {
            handleScroll.cancel();
            window.removeEventListener("scroll", handleScroll);
        };
    }, []);

    return <div style={{ height: "200vh" }}>Scroll down to trigger the handler</div>;
}

The cleanup calls handleScroll.cancel() before removing the listener. This prevents a throttled call that was scheduled to run at the end of the interval from firing after the component unmounts.

When to use throttling: scroll position tracking, infinite scroll triggers, window resize handlers, button spam prevention, real-time dashboard updates where a rate of one update per second is sufficient.

Debouncing vs. throttling: debouncing waits for activity to stop before firing once. Throttling fires at regular intervals while activity continues. For a search input, debouncing is correct because the goal is to fire after the user stops typing. For scroll position tracking, throttling is correct because updates should happen continuously while scrolling, just not on every pixel.
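The difference is easy to see with minimal hand-rolled versions of both. These are simplified sketches, not lodash's implementations (this throttle fires on the leading edge only and has no trailing call):

```typescript
// Minimal debounce: run fn only after `wait` ms with no new calls.
function debounce<T extends (...args: any[]) => void>(fn: T, wait: number) {
    let timer: ReturnType<typeof setTimeout> | undefined;
    return (...args: Parameters<T>) => {
        clearTimeout(timer);
        timer = setTimeout(() => fn(...args), wait);
    };
}

// Minimal throttle (leading edge only): run fn at most once per `wait` ms.
function throttle<T extends (...args: any[]) => void>(fn: T, wait: number) {
    let last = 0;
    return (...args: Parameters<T>) => {
        const now = Date.now();
        if (now - last >= wait) {
            last = now;
            fn(...args);
        }
    };
}

// Fire both 10 times in a tight burst: debounce runs once after the burst
// ends; throttle runs once at the start and ignores the rest of the burst.
let debounced = 0, throttled = 0;
const d = debounce(() => debounced++, 50);
const t = throttle(() => throttled++, 50);
for (let i = 0; i < 10; i++) { d(); t(); }

setTimeout(() => console.log({ debounced, throttled }), 100); // { debounced: 1, throttled: 1 }
```

In a sustained stream of events (scrolling for several seconds), the throttled function keeps firing at its capped rate while the debounced one stays silent until the stream stops, which is exactly the distinction that makes each suited to its use cases above.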


Caching Previous Results

Caching stores responses in memory so a repeated query returns immediately without a network round-trip. A useRef holding a Map is a straightforward implementation for session-level caching:

import { useState, useRef, useCallback } from "react";

interface SearchResult {
    name: string;
    id: number;
}

function CachedSearch() {
    const [query, setQuery] = useState("");
    const [results, setResults] = useState<SearchResult[]>([]);
    const cacheRef = useRef<Map<string, SearchResult[]>>(new Map());
    const abortRef = useRef<AbortController | null>(null);

    const fetchResults = useCallback(async (value: string) => {
        if (!value.trim()) {
            setResults([]);
            return;
        }

        if (cacheRef.current.has(value)) {
            setResults(cacheRef.current.get(value)!);
            return;
        }

        if (abortRef.current) {
            abortRef.current.abort();
        }
        abortRef.current = new AbortController();

        try {
            const res = await fetch(`/api/search?q=${value}`, {
                signal: abortRef.current.signal,
            });
            const data: SearchResult[] = await res.json();
            cacheRef.current.set(value, data);
            setResults(data);
        } catch (err) {
            if ((err as Error).name !== "AbortError") {
                console.error(err);
            }
        }
    }, []);

    return (
        <div>
            <input
                value={query}
                onChange={e => {
                    setQuery(e.target.value);
                    fetchResults(e.target.value);
                }}
                placeholder="Search..."
            />
            <ul>
                {results.map(r => (
                    <li key={r.id}>{r.name}</li>
                ))}
            </ul>
        </div>
    );
}

This implementation uses the actual id field as the list key rather than the array index, which is the correct approach when items have stable identifiers.

The cache is session-scoped: it lives in memory and is cleared when the component unmounts. For persistent caching across sessions, localStorage or sessionStorage can be used, though those require serialization and a staleness strategy.

Limitations of the Map cache: it has no size limit and no expiry. For a real application, consider limiting the cache size (evicting the oldest entry when the limit is reached) or setting a TTL on cached values. Libraries like TanStack Query handle these concerns automatically.
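Both concerns can be added to the Map approach with little code. The sketch below is a hypothetical bounded cache (class and parameter names are illustrative): it evicts the oldest entry once a size limit is reached and treats entries older than a TTL as misses. It relies on the fact that a `Map` iterates keys in insertion order, so the first key is always the oldest.

```typescript
// Bounded, expiring cache sketch: oldest-entry eviction plus a TTL.
class BoundedCache<V> {
    private map = new Map<string, { value: V; storedAt: number }>();

    constructor(private maxSize = 50, private ttlMs = 60_000) {}

    get(key: string): V | undefined {
        const entry = this.map.get(key);
        if (!entry) return undefined;
        if (Date.now() - entry.storedAt > this.ttlMs) {
            this.map.delete(key); // expired: treat as a miss
            return undefined;
        }
        return entry.value;
    }

    set(key: string, value: V) {
        if (this.map.size >= this.maxSize && !this.map.has(key)) {
            // Map preserves insertion order, so the first key is the oldest.
            const oldest = this.map.keys().next().value;
            if (oldest !== undefined) this.map.delete(oldest);
        }
        this.map.set(key, { value, storedAt: Date.now() });
    }
}

const cache = new BoundedCache<string[]>(2);
cache.set("a", ["alpha"]);
cache.set("b", ["beta"]);
cache.set("c", ["gamma"]); // size limit hit: evicts "a"
console.log(cache.get("a")); // undefined
console.log(cache.get("c")); // ["gamma"]
```

Dropping this in place of the plain `Map` in `CachedSearch` (store it in the same `useRef`) keeps memory bounded and prevents a long session from serving arbitrarily old results.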

When to use caching: data that does not change frequently, repeated queries within the same session, expensive computations or requests where the same input reliably produces the same output.


Combining the Techniques

For a production search component, debouncing and caching complement each other well: debouncing reduces the number of requests that reach the network, and caching eliminates round-trips for queries that have already been fetched.

The general pattern:

const debouncedFetch = useCallback(
    debounce(async (value: string) => {
        if (cache.has(value)) {
            setResults(cache.get(value)!);
            return;
        }
        const data = await fetchFromApi(value);
        cache.set(value, data);
        setResults(data);
    }, 300),
    []
);

For dashboards or feeds that update continuously, throttling and caching work well together: throttling caps the request rate, and caching serves repeated intervals from memory when the underlying data has not changed.


When to Use a Library Instead

For applications where data fetching is a significant concern, libraries like TanStack Query (formerly React Query) and SWR handle debouncing, caching, deduplication, background revalidation, and error states in a unified API. The manual implementations above are useful for understanding the underlying mechanisms or for lightweight cases where adding a dependency is not warranted.

For any application with multiple data-fetching components, a server state library is worth evaluating before implementing these patterns manually.


Key Takeaways

  • Debouncing delays a function call until a pause in activity. Use it for search inputs and form validation to prevent firing on every keystroke.

  • Throttling limits a function to at most one call per interval. Use it for scroll handlers and continuous event streams that need regular but rate-limited updates.

  • Caching stores previous responses to skip network requests for repeated queries. A Map in a useRef is a simple session-scoped cache.

  • Always cancel in-flight requests with AbortController when a new request starts or the component unmounts. This prevents race conditions and state updates on unmounted components.

  • Cancel pending debounced and throttled calls in useEffect cleanup to prevent them from firing after unmount.

  • For complex data-fetching requirements, TanStack Query and SWR handle these concerns more robustly than manual implementations.


Conclusion

Debouncing, throttling, and caching address the same underlying problem from different angles: preventing the application from doing more work than necessary. Debouncing reduces work by waiting for activity to settle. Throttling reduces work by capping the rate. Caching eliminates work by reusing previous results.

The correct technique depends on the interaction pattern. Search inputs benefit from debouncing. Scroll handlers benefit from throttling. Both benefit from caching. Understanding which problem each technique solves makes it straightforward to choose the right one or to combine them when the use case requires it.


Implemented one of these patterns in a specific context and found an edge case worth sharing? Leave it in the comments.
