# Fix: FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory

## Quick Answer

How to fix the JavaScript heap out of memory error by increasing Node.js memory limits, fixing memory leaks, and optimizing builds in webpack, Vite, and Docker.

## The Error

You run a Node.js script, start a build, or launch your development server, and the process crashes with this:
```text
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0x100a7a1c4 node::OOMErrorHandler(char const*, v8::OOMDetails const&)
 2: 0x100c1e5d0 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, v8::OOMDetails const&)
 3: 0x100c1e56c v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, v8::OOMDetails const&)
 4: 0x100dfba60 v8::internal::Heap::GarbageCollectionReasonToString(v8::internal::GarbageCollectionReason)
```

Sometimes it shows up during `npm run build`, webpack compilation, or when processing large datasets. The error might also appear as:

```text
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
```

Or:

```text
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
```

Regardless of the exact variant, the root cause is the same: Node.js ran out of memory.
## Why This Happens

Node.js uses V8 as its JavaScript engine, and V8 caps the size of its heap. On 64-bit systems the default old-space limit is roughly 2 GB, though Node.js 12 and later scale the default based on available system memory (often around 4 GB on machines with plenty of RAM). When your process tries to allocate memory beyond this ceiling, V8's garbage collector cannot free enough space, and the process crashes.
Several common scenarios trigger this:

- Large builds: Webpack, Vite, Next.js, or Angular builds that process thousands of modules consume significant memory during bundling, tree-shaking, and source map generation.
- Processing large files: Reading an entire large JSON or CSV file into memory at once (for example with `JSON.parse(fs.readFileSync('huge-file.json'))`) can exceed the limit quickly.
- Memory leaks: Event listeners that are never removed, closures that hold references to large objects, global caches that grow without bound, or data accumulating in arrays during long-running processes.
- Docker or CI/CD environments: Containers and CI runners often have tight memory constraints. Even if you increase the Node.js heap limit, the container itself may kill the process when it exceeds the container's memory allocation. This is the same mechanism behind Docker's OOMKilled (exit code 137).
- Dependency bloat: Installing heavy packages or importing large libraries you only partially use adds to the baseline memory footprint.

Understanding which of these applies to your situation determines which fix you need.
## Fix 1: Increase the Memory Limit with --max-old-space-size

The fastest fix is to give Node.js more memory. Pass the `--max-old-space-size` flag with a value in megabytes:

```shell
node --max-old-space-size=4096 your-script.js
```

This sets the V8 heap limit to 4 GB. Common values:
| Value | Memory |
|---|---|
| 2048 | 2 GB |
| 4096 | 4 GB |
| 8192 | 8 GB |
| 16384 | 16 GB |
For build tools, you typically need to pass this through the tool's CLI. For example, with webpack:

```shell
node --max-old-space-size=4096 ./node_modules/.bin/webpack --config webpack.prod.js
```

Or with a Next.js build:

```shell
node --max-old-space-size=4096 ./node_modules/.bin/next build
```

Pro Tip: Don't blindly set this to the maximum your machine allows. Set it to roughly 75% of available RAM to leave room for the operating system and other processes. On a machine with 8 GB RAM, `--max-old-space-size=6144` is a reasonable ceiling. Setting it too high can cause your system to swap to disk, making everything slower.
## Fix 2: Set NODE_OPTIONS Environment Variable

If you don't want to modify every command, set the memory limit globally through the NODE_OPTIONS environment variable.

On Linux and macOS:

```shell
export NODE_OPTIONS="--max-old-space-size=4096"
```

On Windows (Command Prompt):

```shell
set NODE_OPTIONS=--max-old-space-size=4096
```

On Windows (PowerShell):

```shell
$env:NODE_OPTIONS="--max-old-space-size=4096"
```

To make this permanent in your project, add it to your package.json scripts:
```json
{
  "scripts": {
    "build": "NODE_OPTIONS='--max-old-space-size=4096' webpack --config webpack.prod.js",
    "build:win": "set NODE_OPTIONS=--max-old-space-size=4096 && webpack --config webpack.prod.js"
  }
}
```

For cross-platform compatibility, use the cross-env package:
```shell
npm install --save-dev cross-env
```

```json
{
  "scripts": {
    "build": "cross-env NODE_OPTIONS='--max-old-space-size=4096' webpack --config webpack.prod.js"
  }
}
```

Note: If NODE_OPTIONS is already set elsewhere (for example, in your CI pipeline or Docker image), your new value will override it entirely. Concatenate values if you need multiple flags: `NODE_OPTIONS="--max-old-space-size=4096 --openssl-legacy-provider"`.
## Fix 3: Fix Memory Leaks in Your Code
Increasing the memory limit is a band-aid. If your application has a memory leak, it will eventually crash regardless of how much memory you allocate. Here are the most common leak patterns in Node.js and how to fix them.
### Unbounded Caches or Arrays

```javascript
// BAD: This array grows forever in a long-running process
const cache = [];
app.get('/data', (req, res) => {
  const result = expensiveQuery(req.params.id);
  cache.push(result); // Never cleaned up
  res.json(result);
});
```

Fix it by using a bounded cache with an LRU (Least Recently Used) strategy:
```javascript
import { LRUCache } from 'lru-cache';

const cache = new LRUCache({ max: 500 }); // Maximum 500 entries
app.get('/data', (req, res) => {
  const cached = cache.get(req.params.id);
  if (cached) return res.json(cached);
  const result = expensiveQuery(req.params.id);
  cache.set(req.params.id, result);
  res.json(result);
});
```

### Event Listener Leaks
```javascript
// BAD: Adding a listener on every request without removing it
app.get('/stream', (req, res) => {
  process.on('SIGTERM', () => {
    res.end('Server shutting down');
  });
});
```

Node.js warns you about this with MaxListenersExceededWarning. Fix it by removing listeners when they're no longer needed:
```javascript
app.get('/stream', (req, res) => {
  const handler = () => {
    res.end('Server shutting down');
  };
  process.on('SIGTERM', handler);
  req.on('close', () => {
    process.removeListener('SIGTERM', handler);
  });
});
```

### Closures Holding Large References
```javascript
// BAD: The closure keeps `hugeData` in memory as long as `getItem` exists
function createLookup() {
  const hugeData = loadEntireDatabase(); // 500 MB object
  return function getItem(id) {
    return hugeData[id];
  };
}
const lookup = createLookup(); // hugeData is never garbage collected
```

Fix it by restructuring so you don't hold the entire dataset in memory, or use a database query instead.
## Fix 4: Optimize Large JSON and File Processing

Parsing large JSON files with JSON.parse() requires roughly twice the file size in memory: once for the raw string, once for the parsed object. A 500 MB JSON file needs at least 1 GB of free heap.
### Stream Large JSON Files
Instead of loading the entire file at once, use a streaming JSON parser:
```javascript
import { createReadStream } from 'fs';
import { parser } from 'stream-json';
import { streamArray } from 'stream-json/streamers/StreamArray.js';

const pipeline = createReadStream('huge-file.json')
  .pipe(parser())
  .pipe(streamArray());

pipeline.on('data', ({ value }) => {
  processItem(value); // Process one item at a time
});
pipeline.on('end', () => {
  console.log('Done processing');
});
```

### Stream Large CSV Files
The same principle applies to CSV files. Use a streaming parser like `csv-parser`:

```javascript
import { createReadStream } from 'fs';
import csv from 'csv-parser';

createReadStream('large-dataset.csv')
  .pipe(csv())
  .on('data', (row) => {
    processRow(row);
  })
  .on('end', () => {
    console.log('CSV processing complete');
  });
```

### Avoid fs.readFileSync for Large Files
Replace synchronous reads with streams for any file larger than a few dozen megabytes:

```javascript
// BAD: loads the whole file into memory at once
import fs from 'fs';
const data = JSON.parse(fs.readFileSync('large.json', 'utf-8'));

// GOOD: use a streaming approach as shown above
import { createReadStream } from 'fs';
import { pipeline } from 'stream/promises';
```

This is a different issue from the module resolution errors you might see — the file exists and can be found, but it's simply too large to fit in memory all at once.
## Fix 5: Optimize Webpack and Vite Builds
Build tools are among the most common triggers for heap out of memory errors. Here are targeted fixes for each.
### Webpack

Generate source maps more efficiently. The default `source-map` devtool is the most memory-intensive option. Switch to a lighter alternative for development:

```javascript
// webpack.config.js
module.exports = {
  devtool: process.env.NODE_ENV === 'production'
    ? 'source-map'
    : 'eval-cheap-module-source-map',
};
```

Use thread-loader to offload heavy loaders (like babel-loader or ts-loader) to worker threads:
```javascript
module.exports = {
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        use: [
          'thread-loader',
          'ts-loader',
        ],
      },
    ],
  },
};
```

Split your build into chunks to reduce peak memory usage. If you're getting OOM during the optimization phase, limit parallel operations:
```javascript
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  optimization: {
    minimizer: [
      new TerserPlugin({
        parallel: 2, // Limit parallel minification (default uses all CPUs)
      }),
    ],
  },
};
```

If your webpack build consistently fails, you may also encounter module parse failures alongside the OOM error. Fix the parse errors first, as failed parses can cause webpack to retry and consume more memory.
### Vite

Vite typically uses less memory than webpack, but large projects can still hit the limit during production builds (which use Rollup under the hood).

Increase memory for Vite builds:

```shell
node --max-old-space-size=4096 ./node_modules/.bin/vite build
```

If the issue persists, disable source maps for the build or split the build with `build.rollupOptions.output.manualChunks`:
```javascript
// vite.config.js
import { defineConfig } from 'vite';

export default defineConfig({
  build: {
    sourcemap: false,
    rollupOptions: {
      output: {
        manualChunks(id) {
          if (id.includes('node_modules')) {
            return 'vendor';
          }
        },
      },
    },
  },
});
```

### Angular, Next.js, and Other Frameworks
Most frameworks provide a way to pass Node.js flags. Check your framework's documentation, but the pattern is usually:

```shell
# Angular
node --max-old-space-size=4096 ./node_modules/.bin/ng build --configuration production

# Next.js
NODE_OPTIONS='--max-old-space-size=4096' next build

# Gatsby
NODE_OPTIONS='--max-old-space-size=4096' gatsby build
```

## Fix 6: Handle Docker Container Memory Limits
Running Node.js inside a Docker container adds another layer. Even if you set --max-old-space-size=8192, the container might only have 512 MB allocated. In that case the kernel’s OOM killer terminates the process before V8’s own limit kicks in.
Set proper memory limits in your docker run command:

```shell
docker run --memory=4g --memory-swap=4g your-app
```

Or in docker-compose.yml:

```yaml
services:
  app:
    build: .
    deploy:
      resources:
        limits:
          memory: 4G
    environment:
      - NODE_OPTIONS=--max-old-space-size=3072
```

Note: Set --max-old-space-size to roughly 75% of the container's memory limit. The remaining 25% is needed for V8 internals, native allocations, buffers, and operating system overhead inside the container. If you set both values to the same number, the OOM killer will terminate the process before V8 can gracefully handle the out-of-memory condition. This is identical to the exit code 137 OOMKilled issue.
In your Dockerfile, you can set the environment variable directly:

```dockerfile
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
ENV NODE_OPTIONS="--max-old-space-size=3072"
CMD ["node", "server.js"]
```

## Fix 7: Fix CI/CD Pipeline OOM Crashes
CI/CD environments like GitHub Actions, GitLab CI, and Jenkins often have limited memory (standard GitHub Actions runners have historically provided around 7 GB of RAM). Large builds frequently OOM in CI even when they work locally.
### GitHub Actions

Add NODE_OPTIONS to your workflow file:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    env:
      NODE_OPTIONS: --max-old-space-size=4096
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run build
```

### GitLab CI
```yaml
build:
  stage: build
  variables:
    NODE_OPTIONS: "--max-old-space-size=4096"
  script:
    - npm ci
    - npm run build
```

### Reduce Memory in CI Builds
Beyond increasing limits, reduce memory usage in CI:

- Use `npm ci` instead of `npm install` — it's faster and uses less memory.
- Avoid running tests and builds in parallel unless your runner has enough RAM.
- Disable source maps in CI if you don't need them for deployments.
- Cache `node_modules` to avoid reinstalling on every run.

If your CI build fails with an exit code 1 and the logs show the heap error, the fix is the same: increase NODE_OPTIONS or reduce memory consumption.
## Fix 8: Profile Memory Usage to Find the Root Cause
When increasing memory limits doesn’t solve the problem — or you want to understand what’s consuming memory — use Node.js’s built-in profiling tools.
### Use --inspect with Chrome DevTools

Start your application with the --inspect flag:

```shell
node --inspect --max-old-space-size=4096 your-script.js
```

Then open Chrome and navigate to chrome://inspect. Click "Open dedicated DevTools for Node". Go to the Memory tab and take a Heap Snapshot.
The heap snapshot shows you:
- Which objects consume the most memory
- How many instances of each constructor exist
- The retainer tree — what’s keeping objects from being garbage collected
To find leaks, take two snapshots at different times and use the Comparison view. Objects that grow between snapshots are likely leaking.
### Use --inspect-brk for Build Scripts

For build scripts that crash too quickly to attach a debugger, use --inspect-brk to pause execution at the first line:

```shell
node --inspect-brk --max-old-space-size=4096 ./node_modules/.bin/webpack
```

This gives you time to open DevTools and start recording before the memory spike.
### Use process.memoryUsage() for Monitoring

Add memory logging to long-running processes to identify when memory starts growing:

```javascript
setInterval(() => {
  const usage = process.memoryUsage();
  console.log({
    rss: `${Math.round(usage.rss / 1024 / 1024)} MB`,
    heapUsed: `${Math.round(usage.heapUsed / 1024 / 1024)} MB`,
    heapTotal: `${Math.round(usage.heapTotal / 1024 / 1024)} MB`,
    external: `${Math.round(usage.external / 1024 / 1024)} MB`,
  });
}, 10000); // Log every 10 seconds
```

Key metrics to watch:

- `heapUsed`: Actual memory used by JavaScript objects. If this grows continuously, you have a leak.
- `rss` (Resident Set Size): Total memory allocated to the process, including native code and buffers.
- `external`: Memory used by C++ objects bound to JavaScript objects (like Buffers).
### Use the --heap-prof Flag

Node.js 12+ supports heap profiling natively:

```shell
node --heap-prof your-script.js
```

This generates a .heapprofile file that you can load in Chrome DevTools (Memory tab → Load) to see allocation timelines.

Common Mistake: Developers often focus only on `heapUsed` when diagnosing memory issues. But native memory (Buffers, streams, C++ add-ons) is counted under `rss` and `external`, not `heapUsed`. If `rss` grows but `heapUsed` stays flat, the leak is in native code or Buffers, and increasing `--max-old-space-size` won't help — you need to find and fix the native memory leak instead.
## Fix 9: Upgrade Node.js
Newer versions of Node.js include V8 improvements that handle memory more efficiently. Specifically:
- Node.js 12+: V8 introduced concurrent marking for garbage collection, reducing GC pauses and improving memory management.
- Node.js 14+: V8’s pointer compression on 64-bit systems can reduce heap size by up to 40% for pointer-heavy workloads.
- Node.js 20+: Further GC improvements and better defaults for large heaps.
Check your current version:

```shell
node --version
```

Upgrade to the latest LTS version:

```shell
# Using nvm
nvm install --lts
nvm use --lts

# Using fnm
fnm install --lts
fnm use lts-latest
```

After upgrading, your build might work without any memory flag changes. If you're stuck on an older Node.js version and also hitting file watcher limits, upgrading resolves both issues.
## Still Not Working?
If you've tried everything above and the error persists:

- Check for `node_modules` bloat. Run `npx depcheck` to find unused dependencies. Remove packages you don't use — each one adds to the build graph and memory usage.
- Split large monorepo builds. If you're building a monorepo with many packages, build each package separately instead of all at once. Tools like Turborepo and Nx can orchestrate incremental builds.
- Move heavy processing out of Node.js. If you're processing gigabytes of data, consider using a language better suited for it (Python with generators, Go, Rust) or offload to a database query.
- Check for circular dependencies. Circular imports can cause webpack and other bundlers to re-process modules repeatedly, inflating memory usage. Use `madge --circular` to detect them.
- Try `node --expose-gc` with manual garbage collection. In extreme cases, you can trigger garbage collection manually between heavy operations:

```javascript
// Start with: node --expose-gc your-script.js
async function processBatches(items) {
  const batchSize = 1000;
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    await processBatch(batch);
    if (global.gc) {
      global.gc(); // Force garbage collection between batches
    }
  }
}
```

- Use `worker_threads` for parallel processing. Each worker gets its own V8 heap, effectively multiplying your available memory:
```javascript
import { Worker, isMainThread, parentPort, workerData } from 'worker_threads';

if (isMainThread) {
  const worker = new Worker(new URL(import.meta.url), {
    workerData: { file: 'chunk-1.json' },
    resourceLimits: {
      maxOldGenerationSizeMb: 2048,
    },
  });
  worker.on('message', (result) => console.log(result));
  worker.on('error', (err) => console.error(err));
} else {
  const data = processFile(workerData.file); // processFile is your CPU-heavy work, defined elsewhere
  parentPort.postMessage(data);
}
```

- Set `resourceLimits` on worker threads. The `resourceLimits` option lets you control memory per worker independently, giving you finer control than `--max-old-space-size`, which applies to the main thread.