readFile will not read files larger than 2 GiB even if buffers can be larger #55864

Description

@davazp

Version

v22.11.0

Platform

Darwin LAMS0127 23.6.0 Darwin Kernel Version 23.6.0: Thu Sep 12 23:36:23 PDT 2024; root:xnu-10063.141.1.701.1~1/RELEASE_ARM64_T6031 arm64 arm Darwin

Subsystem

No response

What steps will reproduce the bug?

const fs = require("fs/promises");

const FILE = "test.bin";

async function main() {
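  // 3 GiB: larger than the 2 GiB readFile cap, but a valid Buffer size on v22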
  const buffer1 = Buffer.alloc(3 * 1024 * 1024 * 1024);
  await fs.writeFile(FILE, buffer1);

  const buffer2 = await fs.readFile(FILE);
  // does not reach here
  console.log(buffer2.length);
}

main();

How often does it reproduce? Is there a required condition?

It is deterministic.

What is the expected behavior? Why is that the expected behavior?

readFile should allow files as large as the maximum Buffer size, according to the documentation:

ERR_FS_FILE_TOO_LARGE
An attempt has been made to read a file whose size is larger than the maximum allowed size for a Buffer.

https://nodejs.org/api/errors.html#err_fs_file_too_large

In newer Node versions, the maximum Buffer size has increased, but the maximum file size readFile accepts is still capped at 2 GiB.

In older versions (v18), the maximum Buffer size on 64-bit platforms was 4 GiB, but files that large could not be read either.
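
For reference, a possible workaround (not part of the report itself) is to read the file in chunks through a FileHandle so that no single read approaches the 2 GiB cap. The readLargeFile helper below is a hypothetical sketch; it assumes the whole file still fits within buffer.constants.MAX_LENGTH on the running Node version.

const fs = require("fs/promises");
const { constants } = require("buffer");

// Hypothetical helper: read a whole file into one Buffer by issuing
// several smaller reads through a FileHandle instead of one fs.readFile call.
async function readLargeFile(path) {
  const handle = await fs.open(path, "r");
  try {
    const { size } = await handle.stat();
    if (size > constants.MAX_LENGTH) {
      throw new RangeError(`File of ${size} bytes does not fit in a single Buffer`);
    }
    const buffer = Buffer.alloc(size);
    let offset = 0;
    while (offset < size) {
      // Cap each read at 1 GiB so no individual call hits the 2 GiB limit.
      const length = Math.min(size - offset, 1024 * 1024 * 1024);
      const { bytesRead } = await handle.read(buffer, offset, length, offset);
      if (bytesRead === 0) break; // unexpected end of file
      offset += bytesRead;
    }
    return buffer.subarray(0, offset);
  } finally {
    await handle.close();
  }
}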

What do you see instead?

readFile throws the following error:

RangeError [ERR_FS_FILE_TOO_LARGE]: File size (3221225472) is greater than 2 GiB

Additional information

No response

Labels

    fs: Issues and PRs related to the fs subsystem / file system.
    good first issue: Issues that are suitable for first-time contributors.
