Achieving a 3x reduction in React bundle size: A case study

January 03, 2024

5 min read

In our recent efforts to improve the performance of our frontend applications, we made significant strides by optimizing how we use our component library. Here's how we achieved a more than 3x reduction in our bundle size, speeding up our build process and improving our app's runtime efficiency.

The Power of Tree Shaking

Tree shaking is a term commonly used in the JavaScript context for dead-code elimination. It relies on the static structure of ES modules, which allows unused exports to be removed during the bundling process. Our project uses Material UI extensively, a popular React UI framework known for its comprehensive component library. Initially, our bundle size was bloated due to importing entire libraries, even when only using specific components.
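
As a quick illustration (the module names here are hypothetical), a bundler can only drop unused exports when imports are statically analyzable ES modules; packages that ship side effects or CommonJS builds often defeat this, which is why we fall back to explicit path imports below.

// utils/index.ts — a hypothetical barrel file re-exporting two modules
export { renderHeavyChart } from "./heavyChart";
export { formatDate } from "./formatDate";

// app.ts — only formatDate is used. Because this is a static ES module
// import, the bundler can drop renderHeavyChart and everything it pulls in.
import { formatDate } from "./utils";

console.log(formatDate(new Date()));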

To address this, we implemented stricter import rules using ESLint to ensure that every component is imported directly from its path:

// ESLint rules for Tree Shaking
"no-restricted-imports": [
    "error",
    {
        paths: [
            {
                name: "@material-ui/icons",
                message: 'Use direct imports, e.g., import ExampleIcon from "@material-ui/icons/Example"'
            },
            {
                name: "@material-ui/core",
                message: 'Use direct imports, e.g., import Example from "@material-ui/core/Example"'
            },
            {
                name: "mdi-material-ui",
                message: 'Direct imports preferred, e.g., import ExampleIcon from "mdi-material-ui/Example"'
            }
        ],
        patterns: [
            {
                message: "Use path imports instead. See https://v4.mui.com/guides/minimizing-bundle-size/#option-1",
                group: [
                    "@material-ui/*/*/*",
                    "!@material-ui/core/test-utils/*",
                    "!@material-ui/core/styles/*",
                    "!@material-ui/core/colors/*",
                    "!@material-ui/pickers/typings/*",
                ]
            }
        ]
    }
]

Thus, if we attempt to use an import like this:

import { useMediaQuery } from '@material-ui/core';

ESLint will flag an error with a message along the lines of: '@material-ui/core' import is restricted from being used. Use direct imports, e.g., import Example from "@material-ui/core/Example". This prompts us to correct the import as shown below:

import useMediaQuery from '@material-ui/core/useMediaQuery';

This rule ensured that developers import only what they need, significantly reducing the initial load time by cutting unnecessary code from the bundle.

Embracing Lazy Loading with @loadable/component

Lazy loading is a design pattern commonly used in web development to defer initialization of objects until the point at which they are needed. It can significantly reduce the initial load time and positively impact performance. We incorporated lazy loading using the @loadable/component library, which integrates seamlessly with our server-side rendering setup. Here's an example of how we used it:

import React from "react";
import TextInput from "@material-ui/core/TextField";
import loadable from "@loadable/component";

// The SearchSuggestion chunk is only fetched the first time it renders.
const SearchSuggestion = loadable(() => import("./SearchSuggestion"));

export const SearchInput = React.forwardRef((props, ref) => {
    const { suggestions, value, onChange } = props;
    return (
        <>
            <TextInput
                ref={ref}
                value={value}
                onChange={onChange}
            />
            {suggestions.length > 0 ? (
                <SearchSuggestion
                    items={suggestions}
                    maxSuggestions={3}
                />
            ) : null}
        </>
    );
});

In this example, the SearchSuggestion chunk is lazy loaded only when there is at least one suggestion to show.

Using @loadable/component allowed us to split our codebase into smaller chunks that are only loaded when required. This approach not only reduced the load time but also decreased the time-to-interactive, enhancing the user experience by not loading heavy components upfront.
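
Where a loading indicator is useful, loadable also accepts a fallback option. A minimal sketch (the Spinner component is a hypothetical placeholder, not part of our codebase):

import React from "react";
import loadable from "@loadable/component";

// Hypothetical placeholder rendered while the chunk is being fetched.
const Spinner = () => <span>Loading...</span>;

// The SearchSuggestion chunk is requested on first render; until it
// resolves, the fallback is shown in its place.
const SearchSuggestion = loadable(() => import("./SearchSuggestion"), {
    fallback: <Spinner />,
});

export default SearchSuggestion;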

The library did, however, cause some conflicts when testing components with Jest. To address this, we created the following mock setup:

mockLoadable.tsx
import React from "react";

type LoadFn = () => Promise<{ default: React.ComponentType<any> }>;

export const mockLoadable = (load: LoadFn) => {
  let Component: React.ComponentType<any> | undefined;
  // Capture the component from the module load function
  const loadPromise = load().then((mod) => (Component = mod.default));
  // Create a React component which renders the loaded component
  const Loadable: React.FC<any> & { load?: () => Promise<unknown> } = (props) => {
    if (!Component) {
      throw new Error(
        "Bundle split module not loaded yet, ensure you beforeAll(() => MyLazyComponent.load()) in your test, import statement: " +
          load.toString()
      );
    }
    return <Component {...props} />;
  };
  // Expose the promise so tests can await the chunk before rendering.
  Loadable.load = () => loadPromise;
  return Loadable;
};

// Default export so jest.mock can substitute this module for
// @loadable/component's default export, which is how `loadable`
// is imported in our components.
export default mockLoadable;

Then, in the test file for any component that uses a loadable component, we included the following at the top:

jest.mock('@loadable/component', () =>
  jest.requireActual('../testing/testUtilities/mockLoadable')
);

This setup ensures that the asynchronous components are properly handled in our Jest tests, providing a stable environment for unit testing.
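
To make this concrete, here is a minimal sketch of a test file. It assumes React Testing Library and that SearchInput.tsx also exports its loadable-wrapped component (here called LazySearchSuggestion) so the chunk can be loaded up front; neither assumption is part of the setup above.

// SearchInput.test.tsx — illustrative only
import React from "react";
import { render, screen } from "@testing-library/react";
// Assumes SearchInput.tsx also does: export const LazySearchSuggestion = loadable(...)
import { SearchInput, LazySearchSuggestion } from "./SearchInput";

jest.mock("@loadable/component", () =>
  jest.requireActual("../testing/testUtilities/mockLoadable")
);

// Resolve the split chunk before rendering, as the mock's error message requires.
beforeAll(() => LazySearchSuggestion.load());

test("renders suggestions when they are provided", () => {
  render(
    <SearchInput
      value="rea"
      onChange={() => {}}
      suggestions={["react", "react-dom"]}
    />
  );
  // The exact assertion depends on how SearchSuggestion renders its items.
  expect(screen.getByText("react")).toBeTruthy();
});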

Advanced Code Splitting Using Vite

To further optimize our application, we embraced advanced code splitting strategies using Vite, which allowed us to push only the necessary code to production. This approach was critical in reducing our bundle size significantly.

Here's the Vite configuration we used to achieve this:

vite.config.ts
import { defineConfig, splitVendorChunkPlugin } from "vite";
import react from "@vitejs/plugin-react-swc";
import tsPaths from "vite-tsconfig-paths";
import { visualizer } from "rollup-plugin-visualizer";

export default defineConfig({
    server: {
        https: false,
        port: 5173,
    },
    base: "./",
    plugins: [react(), tsPaths(), visualizer(), splitVendorChunkPlugin()],
    optimizeDeps: {
        force: true,
    },
    define: { "process.env.NODE_ENV": '"production"' },
    build: {
        outDir: "./build",
        rollupOptions: {
            output: {
                manualChunks: {
                    "react-vendor": ["react", "react-dom", "react-router-dom"],
                    "react-spring-carousel": ["react-spring-carousel"],
                    "material-ui": ["@material-ui/core", "@material-ui/icons", "@material-ui/lab", "@material-ui/styles"],
                    "react-select": ["react-select"],
                },
            },
        },
    },
});

Vite's flexible configuration supports manual chunking, allowing us to experiment with different groupings and see how much space each group of libraries consumes.
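
Beyond the object form shown above, manualChunks can also be a function that decides a chunk per module id, which gives finer-grained control. The grouping rules below are a rough sketch, not our production configuration:

// vite.config.ts (excerpt, function form of manualChunks)
import { defineConfig } from "vite";

export default defineConfig({
    build: {
        rollupOptions: {
            output: {
                // Decide a chunk for each module id passed in by Rollup.
                manualChunks(id) {
                    // Application code keeps Vite's default chunking.
                    if (!id.includes("node_modules")) return;
                    if (id.includes("@material-ui")) return "material-ui";
                    if (id.includes("react-select")) return "react-select";
                    if (id.includes("react")) return "react-vendor";
                    // Everything else from node_modules lands in a shared vendor chunk.
                    return "vendor";
                },
            },
        },
    },
});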

We also used rollup-plugin-visualizer to provide a visual representation of how much weight each library and dependency adds to the bundle. This tool has been invaluable in helping us understand and optimize our dependency graph.
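
The visualizer also takes a few options that make the report easier to work with. A minimal sketch of how it might be configured (the output filename here is illustrative):

import { defineConfig } from "vite";
import { visualizer } from "rollup-plugin-visualizer";

export default defineConfig({
    plugins: [
        // Generates an interactive treemap of the bundle after each build.
        visualizer({
            filename: "build/stats.html", // where the report is written
            gzipSize: true,               // also report gzip-compressed sizes
            open: true,                   // open the report when the build finishes
        }),
    ],
});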

Results and Conclusion

By implementing these optimizations, we observed a more than 3x reduction in our overall bundle size. Our build times improved, and our application's performance increased dramatically, especially on mobile devices where bandwidth and processing power are more limited.

Adopting tree shaking, lazy loading, and manual code splitting with Vite requires an upfront investment in configuring and enforcing best practices, but the payoff in terms of performance can be substantial. If you're using a component library like Material UI, consider these strategies to optimize your application's bundle size and runtime efficiency.

I encourage all developers to review their import statements, consider lazy loading, and explore advanced code splitting to improve performance. Being vigilant and proactive about performance will consistently yield positive results.