Fix: PHP Fatal error: Allowed memory size of X bytes exhausted

FixDevs

Quick Answer

How to fix the PHP fatal error "Allowed memory size exhausted" caused by low memory limits, large datasets, memory leaks, deep recursion, and inefficient queries.

The Error

Your PHP application crashes with:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 65536 bytes)
in /var/www/html/app/process.php on line 42

Or variations:

PHP Fatal error: Out of memory (allocated 268435456) (tried to allocate 4096 bytes)
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 20480 bytes)
in /vendor/laravel/framework/src/Illuminate/Database/Eloquent/Builder.php on line 588
Composer: proc_open(): fork failed - Cannot allocate memory

PHP tried to use more memory than the configured limit allows. The default limit is typically 128MB (134217728 bytes). Your script needs more memory than is available.

Why This Happens

PHP has a per-script memory limit (memory_limit in php.ini) that prevents a single script from consuming all server memory. When a script exceeds this limit, PHP kills it with a fatal error.
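The memory_limit value uses shorthand notation ("128M", "1G", or -1 for unlimited). When you need the limit in bytes — for example to compare it against memory_get_usage() — a small converter helps. This is a sketch; iniBytes() is a hypothetical helper, not a built-in:

```php
// Convert a php.ini shorthand value ("128M", "1G", "-1") to bytes.
// iniBytes() is a hypothetical helper, not part of PHP itself.
function iniBytes(string $value): int
{
    if ($value === '-1') {
        return PHP_INT_MAX;  // -1 means "no limit"
    }
    $number = (int) $value;
    return match (strtoupper(substr($value, -1))) {
        'G' => $number * 1024 ** 3,
        'M' => $number * 1024 ** 2,
        'K' => $number * 1024,
        default => $number,  // Plain byte count, no suffix
    };
}
```

Typical usage: `iniBytes(ini_get('memory_limit'))`.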

Common causes:

  • Loading too much data at once. Fetching millions of database rows into memory.
  • Large file processing. Reading an entire large file into a string.
  • Memory leaks. Objects accumulate in loops and are never freed.
  • Recursive functions. Deep recursion with large data at each level.
  • Image processing. GD or Imagick operations on large images.
  • Composer installation. Composer itself needs memory for dependency resolution.
  • Low memory limit. The default 128MB is too low for some operations.
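To watch a script approach the limit without actually crashing the process, you can lower the limit, grow an array, and bail out at 80% — a throwaway CLI sketch (the 64M figure is arbitrary):

```php
// Lower the limit for this demo, then grow an array and stop before the fatal error
ini_set('memory_limit', '64M');
$limitBytes = 64 * 1024 * 1024;

$data = [];
while (memory_get_usage(true) < $limitBytes * 0.8) {
    $data[] = str_repeat('x', 65536);  // Grow by ~64 KB per iteration
}

printf("Stopped at %.0f MB of the 64 MB limit\n", memory_get_usage(true) / 1048576);
```

Remove the `< $limitBytes * 0.8` guard and the loop runs until PHP aborts with exactly the fatal error above.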

Fix 1: Increase the Memory Limit

The quickest fix. Increase memory_limit in PHP configuration:

In php.ini:

; Find your php.ini:
; php -i | grep php.ini
; Common locations: /etc/php/8.3/fpm/php.ini, /etc/php/8.3/cli/php.ini

memory_limit = 256M
; or
memory_limit = 512M
; or for unlimited (not recommended in production):
memory_limit = -1

Restart PHP-FPM after changing:

sudo systemctl restart php8.3-fpm
# or
sudo systemctl restart php-fpm

In a specific script (runtime override):

ini_set('memory_limit', '512M');

In .htaccess (Apache):

php_value memory_limit 256M

In docker-compose.yml (note: the stock php images do not read a PHP_MEMORY_LIMIT variable — mount an ini override instead):

services:
  php:
    image: php:8.3-fpm
    volumes:
      # memory.ini contains: memory_limit = 256M
      - ./php/memory.ini:/usr/local/etc/php/conf.d/zz-memory.ini

For Composer specifically:

COMPOSER_MEMORY_LIMIT=-1 composer install
# or
php -d memory_limit=-1 /usr/local/bin/composer install

Pro Tip: Increase the limit enough to solve the immediate problem, but not to -1 (unlimited) in production. Unlimited memory allows a single buggy script to crash the entire server. Set it to the maximum your scripts actually need (check with memory_get_peak_usage()).
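One way to learn what memory_get_peak_usage() actually reaches in real traffic is to log it at the end of every request. A sketch — peakMemoryLine() is a hypothetical helper, and the log destination is up to you:

```php
// Build a log line with the peak memory of the current script run,
// so memory_limit can be sized from real data rather than guesswork.
function peakMemoryLine(): string
{
    $peakMb = memory_get_peak_usage(true) / 1048576;
    return sprintf('peak_memory=%.1fMB uri=%s', $peakMb, $_SERVER['REQUEST_URI'] ?? 'cli');
}

// Runs after the script finishes, including on fatal errors in many cases
register_shutdown_function(fn () => error_log(peakMemoryLine()));
```

Grep the resulting log for the largest values and set memory_limit comfortably above them.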

Fix 2: Process Data in Chunks

Loading all data at once is the most common cause. Process in batches:

Broken — loading all rows:

// Loads ALL users into memory at once
$users = User::all();  // 1 million rows = crash!

foreach ($users as $user) {
    processUser($user);
}

Fixed — use chunking:

// Laravel: Process 1000 rows at a time
User::chunk(1000, function ($users) {
    foreach ($users as $user) {
        processUser($user);
    }
});

// Laravel: Lazy collection (even more memory-efficient)
User::lazy()->each(function ($user) {
    processUser($user);
});

Raw PDO, fetching row by row (note: the MySQL driver buffers the entire result set client-side by default, so fetching one row at a time does not save memory unless you disable buffering):

// Stream rows from the server instead of buffering them all client-side
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->prepare("SELECT * FROM users");
$stmt->execute();

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    processUser($row);
    // Only one row in PHP memory at a time
}

MySQLi unbuffered query:

$mysqli->real_query("SELECT * FROM large_table");
$result = $mysqli->use_result();  // Unbuffered — rows fetched one at a time

while ($row = $result->fetch_assoc()) {
    processRow($row);
}
$result->free();

Common Mistake: Using ->get() or ::all() on large tables. These methods load the entire result set into memory. Use chunk(), lazy(), or cursor() for large datasets.
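Outside Laravel, the same chunking idea works in plain PDO with keyset pagination. A sketch that assumes an auto-increment id primary key; processUsersInChunks() is a hypothetical helper:

```php
// Fetch $chunk rows at a time, keyed on the last id seen — avoids both
// loading everything at once and slow high-OFFSET scans.
function processUsersInChunks(PDO $pdo, callable $processRow, int $chunk = 1000): int
{
    $lastId = 0;
    $seen = 0;
    do {
        $stmt = $pdo->prepare(
            "SELECT * FROM users WHERE id > ? ORDER BY id LIMIT $chunk"
        );
        $stmt->execute([$lastId]);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

        foreach ($rows as $row) {
            $processRow($row);
            $lastId = $row['id'];
            $seen++;
        }
    } while (count($rows) === $chunk);  // A short page means we reached the end

    return $seen;
}
```

Only one page of rows is in memory at any time, regardless of table size.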

Fix 3: Fix File Processing

Reading large files entirely into memory:

Broken:

// Loads the entire file into memory
$content = file_get_contents('/path/to/huge-file.csv');  // 2GB file = crash!
$lines = explode("\n", $content);

Fixed — read line by line:

$handle = fopen('/path/to/huge-file.csv', 'r');
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        processLine($line);
    }
    fclose($handle);
}

Fixed — use SplFileObject:

$file = new SplFileObject('/path/to/huge-file.csv');
$file->setFlags(SplFileObject::READ_CSV);

foreach ($file as $row) {
    if ($row[0] !== null) {  // Skip empty lines
        processRow($row);
    }
}

Fixed — use generators:

function readCsv(string $path): Generator {
    $handle = fopen($path, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }
    fclose($handle);
}

foreach (readCsv('/path/to/huge-file.csv') as $row) {
    processRow($row);
    // Only one row in memory at a time
}

For JSON files:

// Wrong — loads entire JSON into memory
$data = json_decode(file_get_contents('huge.json'), true);

// Fixed — use a streaming JSON parser
// composer require halaxa/json-machine
use JsonMachine\Items;

$items = Items::fromFile('huge.json');
foreach ($items as $item) {
    processItem($item);
}

Fix 4: Fix Memory Leaks in Loops

Variables accumulating in loops:

Broken:

$results = [];
foreach ($largeDataSet as $item) {
    $processed = heavyProcessing($item);
    $results[] = $processed;  // Array grows until memory runs out
    logResult($processed);    // If you only need to log, don't store
}

Fixed — process and discard:

foreach ($largeDataSet as $item) {
    $processed = heavyProcessing($item);
    logResult($processed);
    // Don't accumulate results if you don't need them all
}

Fixed — write to file or database instead of storing in memory:

$outputFile = fopen('results.csv', 'w');
foreach ($largeDataSet as $item) {
    $processed = heavyProcessing($item);
    fputcsv($outputFile, $processed);
}
fclose($outputFile);

Unset large variables when done:

$bigData = loadData();
processData($bigData);
unset($bigData);      // Drop the reference so the memory can be reused
gc_collect_cycles();  // Collect cyclic references (only matters when objects point at each other)
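A quick way to convince yourself that unset() returns memory to PHP — exact numbers vary by build, but the ~10 MB string should be clearly visible:

```php
$before = memory_get_usage();

$big = str_repeat('x', 10 * 1024 * 1024);  // Allocate a ~10 MB string
$during = memory_get_usage();

unset($big);                               // Release it
$after = memory_get_usage();

printf("before=%.1fMB during=%.1fMB after=%.1fMB\n",
    $before / 1048576, $during / 1048576, $after / 1048576);
```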

Laravel Eloquent in loops — disable event listeners and relations:

// Prevent Eloquent from accumulating query log
DB::disableQueryLog();

// Process in chunks
User::chunk(500, function ($users) {
    foreach ($users as $user) {
        $user->process();
    }
});

Fix 5: Fix Image Processing

Image operations can use massive amounts of memory:

// A 5000x5000 pixel image at 32-bit color uses ~100MB of raw memory
$image = imagecreatefromjpeg('large-photo.jpg');

Estimate memory needed:

function estimateImageMemory(string $path): int {
    [$width, $height] = getimagesize($path);
    // 4 bytes per pixel (RGBA) + overhead
    return (int) ($width * $height * 4 * 1.5);  // 1.5x safety factor
}

$needed = estimateImageMemory('photo.jpg');
// Naive parse — assumes memory_limit is expressed in megabytes (e.g. "128M")
$available = intval(ini_get('memory_limit')) * 1024 * 1024;

if ($needed > $available * 0.8) {
    ini_set('memory_limit', ceil($needed / 1024 / 1024 * 2) . 'M');
}

Use ImageMagick CLI instead of GD for large images:

// Run ImageMagick in a separate process — the work happens outside PHP's memory limit
exec('convert ' . escapeshellarg('large-photo.jpg') . ' -resize 800x600 ' . escapeshellarg('thumbnail.jpg'));

Resize before processing:

// Process in tiles or reduce resolution first
$image = imagecreatefromjpeg('large-photo.jpg');
$thumb = imagescale($image, 800, 600);
imagedestroy($image);  // Free the original immediately
// Work with the smaller $thumb

Fix 6: Monitor Memory Usage

Track memory usage to find the problem:

echo "Memory: " . memory_get_usage(true) / 1024 / 1024 . " MB\n";
echo "Peak: " . memory_get_peak_usage(true) / 1024 / 1024 . " MB\n";

Profile memory usage in sections:

function memoryCheckpoint(string $label): void {
    static $last = 0;
    $current = memory_get_usage(true);
    $diff = $current - $last;
    echo sprintf("[%s] Memory: %.2f MB (Δ %.2f MB)\n",
        $label,
        $current / 1024 / 1024,
        $diff / 1024 / 1024
    );
    $last = $current;
}

memoryCheckpoint("Start");
$data = loadData();
memoryCheckpoint("After loadData");
processData($data);
memoryCheckpoint("After processData");

Use Xdebug profiler for detailed analysis:

; php.ini
xdebug.mode=profile
xdebug.output_dir=/tmp/xdebug

Fix 7: Fix Recursive Functions

Deep recursion with large data structures:

Broken:

function processTree(array $node): array {
    $result = processNode($node);
    foreach ($node['children'] ?? [] as $child) {
        $result['children'][] = processTree($child);  // Recursive, accumulates memory
    }
    return $result;
}

Fixed — use iterative approach:

function processTree(array $root): array {
    // An explicit stack of references into $root replaces the call stack
    $stack = [&$root];
    while ($stack) {
        $i = array_key_last($stack);
        $node = &$stack[$i];
        unset($stack[$i]);
        processNode($node);
        foreach ($node['children'] ?? [] as &$child) {
            $stack[] = &$child;
        }
        unset($node, $child);  // Break dangling references before the next iteration
    }
    return $root;
}
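If you only need to visit every node rather than rebuild the tree, a generator lets the caller stream nodes one at a time with no recursion at all. A sketch — walkTree() is a hypothetical helper, not part of any library:

```php
// Yield every node in the tree depth-first, using an explicit stack
function walkTree(array $root): Generator
{
    $stack = [$root];
    while ($stack) {
        $node = array_pop($stack);
        yield $node;
        foreach (array_reverse($node['children'] ?? []) as $child) {
            $stack[] = $child;  // Reversed so children come back out in document order
        }
    }
}
```

Usage: `foreach (walkTree($tree) as $node) { processNode($node); }` — PHP's copy-on-write arrays keep the stacked subtrees cheap until they are modified.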

Fix 8: Optimize PHP-FPM Settings

PHP-FPM worker processes each consume memory:

; /etc/php/8.3/fpm/pool.d/www.conf

; Calculate: available_memory / memory_per_process = max_children
; Example: 4GB server, 256MB per process → max 16 children
pm = dynamic
pm.max_children = 16
pm.start_servers = 4
pm.min_spare_servers = 2
pm.max_spare_servers = 8

; Restart workers after N requests to prevent memory leaks
pm.max_requests = 500

Check per-worker memory usage:

ps aux --sort -rss | grep php-fpm | head -10

Still Not Working?

Check for OPcache settings. OPcache uses shared memory separately from memory_limit:

opcache.memory_consumption=128
opcache.interned_strings_buffer=16

Check for session storage issues. Large session data stored in memory can accumulate.

Check for third-party library leaks. Some libraries accumulate internal state. Check their documentation for cleanup methods.

Consider using a queue for heavy processing:

// Instead of processing in a web request:
dispatch(new ProcessLargeDataJob($dataId));

// The job runs in a separate process with its own memory limit
// php artisan queue:work --memory=512

For Composer memory issues specifically, see Fix: PHP Composer memory limit. For the same class of error in Node.js, check Fix: JavaScript heap out of memory — the patterns are similar across languages.

FixDevs

Solo developer based in Japan. Every solution is cross-referenced with official documentation and tested before publishing.
