From Jest to Vitest: improving Vitest performance

posted on 09/01/2026

After weeks of configuration and tweaking, my team and I finally made it: we've migrated from Jest to Vitest!

Our project is a big, old React SPA: ~307k LOC, with ~500 test files for a total of ~2662 tests (unit and component / integration tests).

We had been wanting to migrate away from Jest for some time, for several fairly common reasons: its poor ESM support, the complexity of its configuration, and most of all, the belief that we could get much better test performance with a more modern tool.

Naturally, after trying out one other candidate on a parallel migration branch (Rstest, which wasn't mature enough for our needs), our choice fell on Vitest, for equally common reasons: its great ESM support, its simple and straightforward configuration, its modern DX, its ecosystem, and obviously, its performance... which turned out to be not so great.

"What? But Vitest is one of the most performant JS test runners! It's much faster than Jest on my project!", you'll tell me. That's what we also expected after reading comparisons and several testimonials online, but it turned out that, in our case, Vitest was actually ~1.5x slower than Jest locally, and ~2x slower on CI... Yikes.

After more research, we found out that we were not the only team to notice such a slowdown compared to Jest (see for example this issue, this first, this second, or this third blog post, or even this Reddit thread), but apart from performance, all of our other expectations were met! So we were not ready to give up on Vitest!

It's now been a month since the migration, and I'm happy to say that we've finally made our Vitest setup faster than our old Jest one! Here's how we did it.

What slows down test runners

To fully understand the improvements we're about to discuss, we first need to understand how test runners run a test suite. Jest and Vitest will typically go through the following pipeline:

1. Collect the test files matching the configured patterns and distribute them across worker processes or threads.
2. For each test file, resolve the modules it imports and transform (compile) them so they can run in Node.
3. Store the transformed output in a transform cache, so that a module already compiled for one test file doesn't get retransformed for the following ones.
4. Evaluate the modules in the worker and run the tests.
5. Collect and report the results.

As stated in 3., test runners generally keep a transform cache to avoid retransforming already compiled modules for subsequent tests. But they also keep a module cache that stores the state of evaluated modules in memory. By default, test runners clear that module cache between test files to avoid side effects: if two test files running in the same worker import the same module, they each get a separate, clean instance of that module. This is called test isolation.
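
To make that concrete, here's a small made-up example: a module with module-scoped state, imported by two test files. With isolation enabled, each file gets a freshly evaluated copy of counter.ts, so the increment done in the first file is invisible to the second one.

    // counter.ts - a hypothetical module with module-scoped state
    let count = 0;
    export const increment = () => { count += 1; };
    export const getCount = () => count;

    // a.test.ts
    import { test, expect } from 'vitest';
    import { increment, getCount } from './counter';

    test('first file mutates the module state', () => {
      increment();
      expect(getCount()).toBe(1);
    });

    // b.test.ts - may run right after a.test.ts, in the same worker
    import { test, expect } from 'vitest';
    import { getCount } from './counter';

    test('second file still sees a fresh module', () => {
      // with isolation, the increment from a.test.ts is not visible here
      expect(getCount()).toBe(0);
    });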

But test isolation obviously comes with a cost! Clearing the module cache between two test files means that subsequent imports of a given module will retrigger its whole evaluation: top-level code execution, binding of its exports, and resolution of its imports. This is one of the most common reasons for a test suite slowing down: tests importing large dependency graphs, which means longer evaluation times under test isolation.
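
For reference, Vitest exposes this trade-off directly through its isolate option. It's only viable if your tests don't rely on getting a clean module state per file, but a minimal sketch looks like this:

    // vitest.config.ts - minimal sketch of the isolation trade-off
    import { defineConfig } from 'vitest/config';

    export default defineConfig({
      test: {
        // stop clearing the module cache between test files:
        // faster, but test files can now leak state into each other
        isolate: false,
      },
    });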

But that doesn't explain why Vitest was slower than Jest in our case, right? Both Jest and Vitest have the constraint of test isolation, so what is the reason for Vitest being slower?

From my understanding, Vitest's architecture is built on Vite, which was designed for dev servers and bundling, not really for the pattern test runners need: repeatedly importing thousands of modules across many isolated workers. Jest has spent years optimizing its custom module system for exactly that workload, and that system relies on Node.js's native require, which is really fast. On the other hand, while Vitest is much faster than Jest at transform speed (esbuild/Rolldown being much faster than Babel/ts-jest), its module evaluation system, built on Vite's transform pipeline, makes it less performant for large test suites with deep dependency trees, where module evaluation and resolution (rather than transformation) are the real bottleneck.
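
If you want to check where the time goes in your own suite, one option is to forward Node's built-in CPU profiler flags to Vitest's workers. A sketch, assuming your Vitest version accepts execArgv in poolOptions (the generated .cpuprofile files can then be opened in Chrome DevTools to see whether transformation or module evaluation dominates):

    // vitest.config.ts - profiling sketch
    import { defineConfig } from 'vitest/config';

    export default defineConfig({
      test: {
        pool: 'forks',
        poolOptions: {
          forks: {
            // let Node write a CPU profile for each worker
            execArgv: ['--cpu-prof', '--cpu-prof-dir=test-profiles'],
            // a single fork makes the resulting profile easier to read
            singleFork: true,
          },
        },
      },
    });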

Areas of improvement

Some tips and config changes are already well documented and covered in several articles, like this one for example. Here I'll try to cover two approaches that are a bit more demanding, but that had a net positive impact on performance for us.
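
To give an idea of the kind of tweaks those articles cover, here's a quick sketch of two of the usual suspects: picking the worker pool, and reserving the heavy jsdom environment for the files that actually render something. The gains (if any) depend entirely on your suite:

    // vitest.config.ts - commonly documented tweaks
    import { defineConfig } from 'vitest/config';

    export default defineConfig({
      test: {
        // worker threads are usually cheaper to spawn than child processes
        pool: 'threads',
        // default to the cheap 'node' environment...
        environment: 'node',
      },
    });

    // MyComponent.test.tsx - ...and opt into jsdom per file when needed
    /**
     * @vitest-environment jsdom
     */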

Splitting tests by 'projects'

Identifying and optimizing heavy imports