From 5bfdc7ee741be3f9d1bd80cefb0ebf094dc2a937 Mon Sep 17 00:00:00 2001 From: Filip Skokan Date: Tue, 26 Aug 2025 12:15:10 +0200 Subject: [PATCH 001/103] doc,crypto: cleanup unlinked and self method references webcrypto.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59608 Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: Ulises Gascón Reviewed-By: James M Snell --- doc/api/webcrypto.md | 34 +++++++++++++++++----------------- 1 file changed, 17 insertions(+), 17 deletions(-) diff --git a/doc/api/webcrypto.md b/doc/api/webcrypto.md index 4e6594bfbe4db5..b567c89785dc8d 100644 --- a/doc/api/webcrypto.md +++ b/doc/api/webcrypto.md @@ -653,7 +653,7 @@ added: v15.0.0 * Type: {boolean} When `true`, the {CryptoKey} can be extracted using either -`subtleCrypto.exportKey()` or `subtleCrypto.wrapKey()`. +[`subtle.exportKey()`][] or [`subtle.wrapKey()`][]. Read-only. @@ -845,7 +845,7 @@ changes: * Returns: {Promise} Fulfills with an {ArrayBuffer} upon success. Using the method and parameters specified in `algorithm` and the keying -material provided by `key`, `subtle.decrypt()` attempts to decipher the +material provided by `key`, this method attempts to decipher the provided `data`. If successful, the returned promise will be resolved with an {ArrayBuffer} containing the plaintext result. @@ -887,7 +887,7 @@ changes: Using the method and parameters specified in `algorithm` and the keying -material provided by `baseKey`, `subtle.deriveBits()` attempts to generate +material provided by `baseKey`, this method attempts to generate `length` bits. When `length` is not provided or `null` the maximum number of bits for a given @@ -929,12 +929,12 @@ changes: Using the method and parameters specified in `algorithm`, and the keying -material provided by `baseKey`, `subtle.deriveKey()` attempts to generate +material provided by `baseKey`, this method attempts to generate a new {CryptoKey} based on the method and parameters in `derivedKeyAlgorithm`. -Calling `subtle.deriveKey()` is equivalent to calling `subtle.deriveBits()` to +Calling this method is equivalent to calling [`subtle.deriveBits()`][] to generate raw keying material, then passing the result into the -`subtle.importKey()` method using the `deriveKeyAlgorithm`, `extractable`, and +[`subtle.importKey()`][] method using the `deriveKeyAlgorithm`, `extractable`, and `keyUsages` parameters as input. The algorithms currently supported include: @@ -962,7 +962,7 @@ changes: * `data` {ArrayBuffer|TypedArray|DataView|Buffer} * Returns: {Promise} Fulfills with an {ArrayBuffer} upon success. -Using the method identified by `algorithm`, `subtle.digest()` attempts to +Using the method identified by `algorithm`, this method attempts to generate a digest of `data`. If successful, the returned promise is resolved with an {ArrayBuffer} containing the computed digest. @@ -1039,7 +1039,7 @@ changes: * Returns: {Promise} Fulfills with an {ArrayBuffer} upon success. Using the method and parameters specified by `algorithm` and the keying -material provided by `key`, `subtle.encrypt()` attempts to encipher `data`. +material provided by `key`, this method attempts to encipher `data`. If successful, the returned promise is resolved with an {ArrayBuffer} containing the encrypted result. @@ -1229,7 +1229,7 @@ changes: * `keyUsages` {string\[]} See [Key usages][]. * Returns: {Promise} Fulfills with a {CryptoKey} upon success. 
-The `subtle.importKey()` method attempts to interpret the provided `keyData` +The [`subtle.importKey()`][] method attempts to interpret the provided `keyData` as the given `format` to create a {CryptoKey} instance using the provided `algorithm`, `extractable`, and `keyUsages` arguments. If the import is successful, the returned promise will be resolved with the created {CryptoKey}. @@ -1290,7 +1290,7 @@ changes: Using the method and parameters given by `algorithm` and the keying material -provided by `key`, `subtle.sign()` attempts to generate a cryptographic +provided by `key`, this method attempts to generate a cryptographic signature of `data`. If successful, the returned promise is resolved with an {ArrayBuffer} containing the generated signature. @@ -1336,11 +1336,11 @@ changes: * Returns: {Promise} Fulfills with a {CryptoKey} upon success. In cryptography, "wrapping a key" refers to exporting and then encrypting the -keying material. The `subtle.unwrapKey()` method attempts to decrypt a wrapped +keying material. This method attempts to decrypt a wrapped key and create a {CryptoKey} instance. It is equivalent to calling -`subtle.decrypt()` first on the encrypted key data (using the `wrappedKey`, +[`subtle.decrypt()`][] first on the encrypted key data (using the `wrappedKey`, `unwrapAlgo`, and `unwrappingKey` arguments as input) then passing the results -in to the `subtle.importKey()` method using the `unwrappedKeyAlgo`, +to the [`subtle.importKey()`][] method using the `unwrappedKeyAlgo`, `extractable`, and `keyUsages` arguments as inputs. If successful, the returned promise is resolved with a {CryptoKey} object. @@ -1405,7 +1405,7 @@ changes: Using the method and parameters given in `algorithm` and the keying material -provided by `key`, `subtle.verify()` attempts to verify that `signature` is +provided by `key`, this method attempts to verify that `signature` is a valid cryptographic signature of `data`. The returned promise is resolved with either `true` or `false`. @@ -1446,12 +1446,12 @@ changes: In cryptography, "wrapping a key" refers to exporting and then encrypting the -keying material. The `subtle.wrapKey()` method exports the keying material into +keying material. This method exports the keying material into the format identified by `format`, then encrypts it using the method and parameters specified by `wrapAlgo` and the keying material provided by -`wrappingKey`. It is the equivalent to calling `subtle.exportKey()` using +`wrappingKey`. It is the equivalent to calling [`subtle.exportKey()`][] using `format` and `key` as the arguments, then passing the result to the -`subtle.encrypt()` method using `wrappingKey` and `wrapAlgo` as inputs. If +[`subtle.encrypt()`][] method using `wrappingKey` and `wrapAlgo` as inputs. If successful, the returned promise will be resolved with an {ArrayBuffer} containing the encrypted key data. 
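The `deriveKey()` equivalence documented in the hunk above is easy to see in code. The following is an illustrative sketch only (not part of the patch): it assumes a PBKDF2 base key, and the password and salt values are placeholders.

```mjs
const { subtle } = globalThis.crypto;

// A base key for the KDF; KDF keys must be imported with extractable = false.
const baseKey = await subtle.importKey(
  'raw', new TextEncoder().encode('password'),
  'PBKDF2', false, ['deriveBits', 'deriveKey']);

const params = {
  name: 'PBKDF2',
  hash: 'SHA-256',
  salt: new Uint8Array(16),  // placeholder salt
  iterations: 100_000,
};
// Length is given explicitly so both paths below derive the same key.
const hmac = { name: 'HMAC', hash: 'SHA-256', length: 256 };

// One step:
const key = await subtle.deriveKey(params, baseKey, hmac, false, ['sign']);

// The equivalent two steps: deriveBits(), then importKey().
const bits = await subtle.deriveBits(params, baseKey, 256);
const key2 = await subtle.importKey('raw', bits, hmac, false, ['sign']);
```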
From 430691d1afb5e2eb94fe7a639d4d78cb1de5adc4 Mon Sep 17 00:00:00 2001 From: Filip Skokan Date: Tue, 26 Aug 2025 17:41:25 +0200 Subject: [PATCH 002/103] crypto: support SLH-DSA KeyObject, sign, and verify MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59537 Reviewed-By: Tobias Nießen --- deps/ncrypto/ncrypto.cc | 24 +++++ doc/api/crypto.md | 61 ++++++++---- lib/internal/crypto/keygen.js | 51 ++++++---- src/crypto/crypto_keys.cc | 36 +++++++ src/env_properties.h | 12 +++ test/fixtures/keys/Makefile | 96 +++++++++++++++++++ .../keys/slh_dsa_sha2_128f_private.pem | 4 + .../keys/slh_dsa_sha2_128f_public.pem | 4 + .../keys/slh_dsa_sha2_128s_private.pem | 4 + .../keys/slh_dsa_sha2_128s_public.pem | 4 + .../keys/slh_dsa_sha2_192f_private.pem | 5 + .../keys/slh_dsa_sha2_192f_public.pem | 4 + .../keys/slh_dsa_sha2_192s_private.pem | 5 + .../keys/slh_dsa_sha2_192s_public.pem | 4 + .../keys/slh_dsa_sha2_256f_private.pem | 6 ++ .../keys/slh_dsa_sha2_256f_public.pem | 4 + .../keys/slh_dsa_sha2_256s_private.pem | 6 ++ .../keys/slh_dsa_sha2_256s_public.pem | 4 + .../keys/slh_dsa_shake_128f_private.pem | 4 + .../keys/slh_dsa_shake_128f_public.pem | 4 + .../keys/slh_dsa_shake_128s_private.pem | 4 + .../keys/slh_dsa_shake_128s_public.pem | 4 + .../keys/slh_dsa_shake_192f_private.pem | 5 + .../keys/slh_dsa_shake_192f_public.pem | 4 + .../keys/slh_dsa_shake_192s_private.pem | 5 + .../keys/slh_dsa_shake_192s_public.pem | 4 + .../keys/slh_dsa_shake_256f_private.pem | 6 ++ .../keys/slh_dsa_shake_256f_public.pem | 4 + .../keys/slh_dsa_shake_256s_private.pem | 6 ++ .../keys/slh_dsa_shake_256s_public.pem | 4 + .../test-crypto-pqc-key-objects-slh-dsa.js | 74 ++++++++++++++ .../test-crypto-pqc-keygen-slh-dsa.js | 54 +++++++++++ .../test-crypto-pqc-sign-verify-slh-dsa.mjs | 63 ++++++++++++ 33 files changed, 542 insertions(+), 37 deletions(-) create mode 100644 test/fixtures/keys/slh_dsa_sha2_128f_private.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_128f_public.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_128s_private.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_128s_public.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_192f_private.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_192f_public.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_192s_private.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_192s_public.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_256f_private.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_256f_public.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_256s_private.pem create mode 100644 test/fixtures/keys/slh_dsa_sha2_256s_public.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_128f_private.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_128f_public.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_128s_private.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_128s_public.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_192f_private.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_192f_public.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_192s_private.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_192s_public.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_256f_private.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_256f_public.pem create mode 100644 test/fixtures/keys/slh_dsa_shake_256s_private.pem create mode 100644 
test/fixtures/keys/slh_dsa_shake_256s_public.pem create mode 100644 test/parallel/test-crypto-pqc-key-objects-slh-dsa.js create mode 100644 test/parallel/test-crypto-pqc-keygen-slh-dsa.js create mode 100644 test/pummel/test-crypto-pqc-sign-verify-slh-dsa.mjs diff --git a/deps/ncrypto/ncrypto.cc b/deps/ncrypto/ncrypto.cc index 73af3b8c37c073..d7a26edd4cfe49 100644 --- a/deps/ncrypto/ncrypto.cc +++ b/deps/ncrypto/ncrypto.cc @@ -30,6 +30,18 @@ constexpr static PQCMapping pqc_mappings[] = { {"ML-KEM-512", EVP_PKEY_ML_KEM_512}, {"ML-KEM-768", EVP_PKEY_ML_KEM_768}, {"ML-KEM-1024", EVP_PKEY_ML_KEM_1024}, + {"SLH-DSA-SHA2-128f", EVP_PKEY_SLH_DSA_SHA2_128F}, + {"SLH-DSA-SHA2-128s", EVP_PKEY_SLH_DSA_SHA2_128S}, + {"SLH-DSA-SHA2-192f", EVP_PKEY_SLH_DSA_SHA2_192F}, + {"SLH-DSA-SHA2-192s", EVP_PKEY_SLH_DSA_SHA2_192S}, + {"SLH-DSA-SHA2-256f", EVP_PKEY_SLH_DSA_SHA2_256F}, + {"SLH-DSA-SHA2-256s", EVP_PKEY_SLH_DSA_SHA2_256S}, + {"SLH-DSA-SHAKE-128f", EVP_PKEY_SLH_DSA_SHAKE_128F}, + {"SLH-DSA-SHAKE-128s", EVP_PKEY_SLH_DSA_SHAKE_128S}, + {"SLH-DSA-SHAKE-192f", EVP_PKEY_SLH_DSA_SHAKE_192F}, + {"SLH-DSA-SHAKE-192s", EVP_PKEY_SLH_DSA_SHAKE_192S}, + {"SLH-DSA-SHAKE-256f", EVP_PKEY_SLH_DSA_SHAKE_256F}, + {"SLH-DSA-SHAKE-256s", EVP_PKEY_SLH_DSA_SHAKE_256S}, }; #endif @@ -2659,6 +2671,18 @@ bool EVPKeyPointer::isOneShotVariant() const { case EVP_PKEY_ML_DSA_44: case EVP_PKEY_ML_DSA_65: case EVP_PKEY_ML_DSA_87: + case EVP_PKEY_SLH_DSA_SHA2_128F: + case EVP_PKEY_SLH_DSA_SHA2_128S: + case EVP_PKEY_SLH_DSA_SHA2_192F: + case EVP_PKEY_SLH_DSA_SHA2_192S: + case EVP_PKEY_SLH_DSA_SHA2_256F: + case EVP_PKEY_SLH_DSA_SHA2_256S: + case EVP_PKEY_SLH_DSA_SHAKE_128F: + case EVP_PKEY_SLH_DSA_SHAKE_128S: + case EVP_PKEY_SLH_DSA_SHAKE_192F: + case EVP_PKEY_SLH_DSA_SHAKE_192S: + case EVP_PKEY_SLH_DSA_SHAKE_256F: + case EVP_PKEY_SLH_DSA_SHAKE_256S: #endif return true; default: diff --git a/doc/api/crypto.md b/doc/api/crypto.md index bc0d6442872b9b..557e9df270f622 100644 --- a/doc/api/crypto.md +++ b/doc/api/crypto.md @@ -77,23 +77,35 @@ try { The following table lists the asymmetric key types recognized by the [`KeyObject`][] API: -| Key Type | Description | OID | -| --------------------------- | -------------- | ----------------------- | -| `'dh'` | Diffie-Hellman | 1.2.840.113549.1.3.1 | -| `'dsa'` | DSA | 1.2.840.10040.4.1 | -| `'ec'` | Elliptic curve | 1.2.840.10045.2.1 | -| `'ed25519'` | Ed25519 | 1.3.101.112 | -| `'ed448'` | Ed448 | 1.3.101.113 | -| `'ml-dsa-44'`[^openssl35] | ML-DSA-44 | 2.16.840.1.101.3.4.3.17 | -| `'ml-dsa-65'`[^openssl35] | ML-DSA-65 | 2.16.840.1.101.3.4.3.18 | -| `'ml-dsa-87'`[^openssl35] | ML-DSA-87 | 2.16.840.1.101.3.4.3.19 | -| `'ml-kem-1024'`[^openssl35] | ML-KEM-1024 | 2.16.840.1.101.3.4.4.3 | -| `'ml-kem-512'`[^openssl35] | ML-KEM-512 | 2.16.840.1.101.3.4.4.1 | -| `'ml-kem-768'`[^openssl35] | ML-KEM-768 | 2.16.840.1.101.3.4.4.2 | -| `'rsa-pss'` | RSA PSS | 1.2.840.113549.1.1.10 | -| `'rsa'` | RSA | 1.2.840.113549.1.1.1 | -| `'x25519'` | X25519 | 1.3.101.110 | -| `'x448'` | X448 | 1.3.101.111 | +| Key Type | Description | OID | +| ---------------------------------- | ------------------ | ----------------------- | +| `'dh'` | Diffie-Hellman | 1.2.840.113549.1.3.1 | +| `'dsa'` | DSA | 1.2.840.10040.4.1 | +| `'ec'` | Elliptic curve | 1.2.840.10045.2.1 | +| `'ed25519'` | Ed25519 | 1.3.101.112 | +| `'ed448'` | Ed448 | 1.3.101.113 | +| `'ml-dsa-44'`[^openssl35] | ML-DSA-44 | 2.16.840.1.101.3.4.3.17 | +| `'ml-dsa-65'`[^openssl35] | ML-DSA-65 | 2.16.840.1.101.3.4.3.18 | +| `'ml-dsa-87'`[^openssl35] | ML-DSA-87 
| 2.16.840.1.101.3.4.3.19 | +| `'ml-kem-1024'`[^openssl35] | ML-KEM-1024 | 2.16.840.1.101.3.4.4.3 | +| `'ml-kem-512'`[^openssl35] | ML-KEM-512 | 2.16.840.1.101.3.4.4.1 | +| `'ml-kem-768'`[^openssl35] | ML-KEM-768 | 2.16.840.1.101.3.4.4.2 | +| `'rsa-pss'` | RSA PSS | 1.2.840.113549.1.1.10 | +| `'rsa'` | RSA | 1.2.840.113549.1.1.1 | +| `'slh-dsa-sha2-128f'`[^openssl35] | SLH-DSA-SHA2-128f | 2.16.840.1.101.3.4.3.21 | +| `'slh-dsa-sha2-128s'`[^openssl35] | SLH-DSA-SHA2-128s | 2.16.840.1.101.3.4.3.22 | +| `'slh-dsa-sha2-192f'`[^openssl35] | SLH-DSA-SHA2-192f | 2.16.840.1.101.3.4.3.23 | +| `'slh-dsa-sha2-192s'`[^openssl35] | SLH-DSA-SHA2-192s | 2.16.840.1.101.3.4.3.24 | +| `'slh-dsa-sha2-256f'`[^openssl35] | SLH-DSA-SHA2-256f | 2.16.840.1.101.3.4.3.25 | +| `'slh-dsa-sha2-256s'`[^openssl35] | SLH-DSA-SHA2-256s | 2.16.840.1.101.3.4.3.26 | +| `'slh-dsa-shake-128f'`[^openssl35] | SLH-DSA-SHAKE-128f | 2.16.840.1.101.3.4.3.27 | +| `'slh-dsa-shake-128s'`[^openssl35] | SLH-DSA-SHAKE-128s | 2.16.840.1.101.3.4.3.28 | +| `'slh-dsa-shake-192f'`[^openssl35] | SLH-DSA-SHAKE-192f | 2.16.840.1.101.3.4.3.29 | +| `'slh-dsa-shake-192s'`[^openssl35] | SLH-DSA-SHAKE-192s | 2.16.840.1.101.3.4.3.30 | +| `'slh-dsa-shake-256f'`[^openssl35] | SLH-DSA-SHAKE-256f | 2.16.840.1.101.3.4.3.31 | +| `'slh-dsa-shake-256s'`[^openssl35] | SLH-DSA-SHAKE-256s | 2.16.840.1.101.3.4.3.32 | +| `'x25519'` | X25519 | 1.3.101.110 | +| `'x448'` | X448 | 1.3.101.111 | ## Class: `Certificate` @@ -2046,6 +2058,9 @@ Other key details might be exposed via this API using additional attributes. -* `algorithm` {EcdhKeyDeriveParams|HkdfParams|Pbkdf2Params} +* `algorithm` {EcdhKeyDeriveParams|HkdfParams|Pbkdf2Params|Argon2Params} * `baseKey` {CryptoKey} * `length` {number|null} **Default:** `null` * Returns: {Promise} Fulfills with an {ArrayBuffer} upon success. @@ -899,6 +917,9 @@ containing the generated data. The algorithms currently supported include: +* `'Argon2d'`[^modern-algos] +* `'Argon2i'`[^modern-algos] +* `'Argon2id'`[^modern-algos] * `'ECDH'` * `'HKDF'` * `'PBKDF2'` @@ -910,6 +931,9 @@ The algorithms currently supported include: -* `algorithm` {EcdhKeyDeriveParams|HkdfParams|Pbkdf2Params} +* `algorithm` {EcdhKeyDeriveParams|HkdfParams|Pbkdf2Params|Argon2Params} * `baseKey` {CryptoKey} * `derivedKeyAlgorithm` {string|Algorithm|HmacImportParams|AesDerivedKeyParams} * `extractable` {boolean} @@ -939,6 +963,9 @@ generate raw keying material, then passing the result into the The algorithms currently supported include: +* `'Argon2d'`[^modern-algos] +* `'Argon2i'`[^modern-algos] +* `'Argon2id'`[^modern-algos] * `'ECDH'` * `'HKDF'` * `'PBKDF2'` @@ -1234,7 +1261,7 @@ as the given `format` to create a {CryptoKey} instance using the provided `algorithm`, `extractable`, and `keyUsages` arguments. If the import is successful, the returned promise will be resolved with the created {CryptoKey}. -If importing a `'PBKDF2'` key, `extractable` must be `false`. +If importing KDF algorithm keys, `extractable` must be `false`. 
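As an illustrative aside (not part of the patch), the `extractable` requirement above looks like this in practice — shown here with `'PBKDF2'`; the Argon2 variants in the table below are assumed to follow the same pattern:

```mjs
const { subtle } = globalThis.crypto;

// KDF algorithm keys must be imported with extractable = false.
const kdfKey = await subtle.importKey(
  'raw', new TextEncoder().encode('a passphrase'),  // placeholder secret
  'PBKDF2',
  false,  // passing true here is rejected for KDF keys
  ['deriveBits', 'deriveKey']);
```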
The algorithms currently supported include: @@ -1245,6 +1272,9 @@ The algorithms currently supported include: | `'AES-GCM'` | | | ✔ | ✔ | ✔ | | | | `'AES-KW'` | | | ✔ | ✔ | ✔ | | | | `'AES-OCB'`[^modern-algos] | | | ✔ | | ✔ | | | +| `'Argon2d'`[^modern-algos] | | | | | ✔ | | | +| `'Argon2i'`[^modern-algos] | | | | | ✔ | | | +| `'Argon2id'`[^modern-algos] | | | | | ✔ | | | | `'ChaCha20-Poly1305'`[^modern-algos] | | | ✔ | | ✔ | | | | `'ECDH'` | ✔ | ✔ | ✔ | ✔ | | ✔ | | | `'ECDSA'` | ✔ | ✔ | ✔ | ✔ | | ✔ | | @@ -1665,6 +1695,90 @@ added: v15.0.0 * Type: {string} Must be one of `'AES-CBC'`, `'AES-CTR'`, `'AES-GCM'`, or `'AES-KW'` +### Class: `Argon2Params` + + + +#### `argon2Params.associatedData` + + + +* Type: {ArrayBuffer|TypedArray|DataView|Buffer} + +Represents the optional associated data. + +#### `argon2Params.memory` + + + +* Type: {number} + +Represents the memory size in kibibytes. It must be at least 8 times the degree of parallelism. + +#### `argon2Params.name` + + + +* Type: {string} Must be one of `'Argon2d'`, `'Argon2i'`, or `'Argon2id'`. + +#### `argon2Params.nonce` + + + +* Type: {ArrayBuffer|TypedArray|DataView|Buffer} + +Represents the nonce, which is a salt for password hashing applications. + +#### `argon2Params.parallelism` + + + +* Type: {number} + +Represents the degree of parallelism. + +#### `argon2Params.passes` + + + +* Type: {number} + +Represents the number of passes. + +#### `argon2Params.secretValue` + + + +* Type: {ArrayBuffer|TypedArray|DataView|Buffer} + +Represents the optional secret value. + +#### `argon2Params.version` + + + +* Type: {number} + +Represents the Argon2 version number. The default and currently only defined version is `19` (`0x13`). + ### Class: `ContextParams` + +* Returns: {undefined} + +Instantiate the module with the linked requested modules. + +This resolves the imported bindings of the module, including re-exported +binding names. When there are any bindings that cannot be resolved, +an error would be thrown synchronously. + +If the requested modules include cyclic dependencies, the +[`sourceTextModule.linkRequests(modules)`][] method must be called on all +modules in the cycle before calling this method. + +### `sourceTextModule.linkRequests(modules)` + + + +* `modules` {vm.Module\[]} Array of `vm.Module` objects that this module depends on. + The order of the modules in the array is the order of + [`sourceTextModule.moduleRequests`][]. +* Returns: {undefined} + +Link module dependencies. This method must be called before evaluation, and +can only be called once per module. + +The order of the module instances in the `modules` array should correspond to the order of +[`sourceTextModule.moduleRequests`][] being resolved. If two module requests have the same +specifier and import attributes, they must be resolved with the same module instance or an +`ERR_MODULE_LINK_MISMATCH` would be thrown. For example, when linking requests for this +module: + + + +```mjs +import foo from 'foo'; +import source Foo from 'foo'; +``` + + + +The `modules` array must contain two references to the same instance, because the two +module requests are identical but in two phases. + +If the module has no dependencies, the `modules` array can be empty. 
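An illustrative sketch of the flow this section describes (not part of the patch; it assumes the `--experimental-vm-modules` flag):

```mjs
import vm from 'node:vm';

// A dependency with no requests of its own: link an empty array.
const dep = new vm.SourceTextModule('export default 5;');
dep.linkRequests([]);

// One request for 'dep', so `modules` has a single entry, in
// moduleRequests order.
const main = new vm.SourceTextModule('import five from "dep";');
main.linkRequests([dep]);
main.instantiate();
await main.evaluate();
```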
+ +Users can use `sourceTextModule.moduleRequests` to implement the host-defined +[HostLoadImportedModule][] abstract operation in the ECMAScript specification, +and using `sourceTextModule.linkRequests()` to invoke specification defined +[FinishLoadingImportedModule][], on the module with all dependencies in a batch. + +It's up to the creator of the `SourceTextModule` to determine if the resolution +of the dependencies is synchronous or asynchronous. + +After each module in the `modules` array is linked, call +[`sourceTextModule.instantiate()`][]. + ### `sourceTextModule.moduleRequests` * `name` {string} Name of the export to set. * `value` {any} The value to set the export to. -This method is used after the module is linked to set the values of exports. If -it is called before the module is linked, an [`ERR_VM_MODULE_STATUS`][] error -will be thrown. +This method sets the module export binding slots with the given value. ```mjs import vm from 'node:vm'; @@ -1033,7 +1121,6 @@ const m = new vm.SyntheticModule(['x'], () => { m.setExport('x', 1); }); -await m.link(() => {}); await m.evaluate(); assert.strictEqual(m.namespace.x, 1); @@ -1045,7 +1132,6 @@ const vm = require('node:vm'); const m = new vm.SyntheticModule(['x'], () => { m.setExport('x', 1); }); - await m.link(() => {}); await m.evaluate(); assert.strictEqual(m.namespace.x, 1); })(); @@ -2037,7 +2123,9 @@ const { Script, SyntheticModule } = require('node:vm'); [Cyclic Module Record]: https://tc39.es/ecma262/#sec-cyclic-module-records [ECMAScript Module Loader]: esm.md#modules-ecmascript-modules [Evaluate() concrete method]: https://tc39.es/ecma262/#sec-moduleevaluation +[FinishLoadingImportedModule]: https://tc39.es/ecma262/#sec-FinishLoadingImportedModule [GetModuleNamespace]: https://tc39.es/ecma262/#sec-getmodulenamespace +[HostLoadImportedModule]: https://tc39.es/ecma262/#sec-HostLoadImportedModule [HostResolveImportedModule]: https://tc39.es/ecma262/#sec-hostresolveimportedmodule [ImportDeclaration]: https://tc39.es/ecma262/#prod-ImportDeclaration [Link() concrete method]: https://tc39.es/ecma262/#sec-moduledeclarationlinking @@ -2049,13 +2137,14 @@ const { Script, SyntheticModule } = require('node:vm'); [WithClause]: https://tc39.es/ecma262/#prod-WithClause [`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING_FLAG`]: errors.md#err_vm_dynamic_import_callback_missing_flag [`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING`]: errors.md#err_vm_dynamic_import_callback_missing -[`ERR_VM_MODULE_STATUS`]: errors.md#err_vm_module_status [`Error`]: errors.md#class-error [`URL`]: url.md#class-url [`eval()`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/eval [`optionsExpression`]: https://tc39.es/proposal-import-attributes/#sec-evaluate-import-call [`script.runInContext()`]: #scriptrunincontextcontextifiedobject-options [`script.runInThisContext()`]: #scriptruninthiscontextoptions +[`sourceTextModule.instantiate()`]: #sourcetextmoduleinstantiate +[`sourceTextModule.linkRequests(modules)`]: #sourcetextmodulelinkrequestsmodules [`sourceTextModule.moduleRequests`]: #sourcetextmodulemodulerequests [`url.origin`]: url.md#urlorigin [`vm.compileFunction()`]: #vmcompilefunctioncode-params-options diff --git a/lib/internal/bootstrap/realm.js b/lib/internal/bootstrap/realm.js index f49f0814bbc687..9055588a31fc4b 100644 --- a/lib/internal/bootstrap/realm.js +++ b/lib/internal/bootstrap/realm.js @@ -359,6 +359,7 @@ class BuiltinModule { this.setExport('default', builtin.exports); }); // Ensure immediate sync execution to capture 
exports now + this.module.link([]); this.module.instantiate(); this.module.evaluate(-1, false); return this.module; diff --git a/lib/internal/errors.js b/lib/internal/errors.js index 8a6f5b26f5391c..db4d47381b06eb 100644 --- a/lib/internal/errors.js +++ b/lib/internal/errors.js @@ -1589,6 +1589,7 @@ E('ERR_MISSING_ARGS', return `${msg} must be specified`; }, TypeError); E('ERR_MISSING_OPTION', '%s is required', TypeError); +E('ERR_MODULE_LINK_MISMATCH', '%s', TypeError); E('ERR_MODULE_NOT_FOUND', function(path, base, exactUrl) { if (exactUrl) { lazyInternalUtil().setOwnProperty(this, 'url', `${exactUrl}`); diff --git a/lib/internal/vm/module.js b/lib/internal/vm/module.js index aba5c2872daa39..5cdd1850a39705 100644 --- a/lib/internal/vm/module.js +++ b/lib/internal/vm/module.js @@ -38,6 +38,7 @@ const { ERR_VM_MODULE_DIFFERENT_CONTEXT, ERR_VM_MODULE_CANNOT_CREATE_CACHED_DATA, ERR_VM_MODULE_LINK_FAILURE, + ERR_MODULE_LINK_MISMATCH, ERR_VM_MODULE_NOT_MODULE, ERR_VM_MODULE_STATUS, } = require('internal/errors').codes; @@ -50,6 +51,7 @@ const { validateUint32, validateString, validateThisInternalField, + validateArray, } = require('internal/validators'); const binding = internalBinding('module_wrap'); @@ -369,6 +371,37 @@ class SourceTextModule extends Module { } } + linkRequests(modules) { + validateThisInternalField(this, kWrap, 'SourceTextModule'); + if (this.status !== 'unlinked') { + throw new ERR_VM_MODULE_STATUS('must be unlinked'); + } + validateArray(modules, 'modules'); + if (modules.length !== this.#moduleRequests.length) { + throw new ERR_MODULE_LINK_MISMATCH( + `Expected ${this.#moduleRequests.length} modules, got ${modules.length}`, + ); + } + const moduleWraps = ArrayPrototypeMap(modules, (module) => { + if (!isModule(module)) { + throw new ERR_VM_MODULE_NOT_MODULE(); + } + if (module.context !== this.context) { + throw new ERR_VM_MODULE_DIFFERENT_CONTEXT(); + } + return module[kWrap]; + }); + this[kWrap].link(moduleWraps); + } + + instantiate() { + validateThisInternalField(this, kWrap, 'SourceTextModule'); + if (this.status !== 'unlinked') { + throw new ERR_VM_MODULE_STATUS('must be unlinked'); + } + this[kWrap].instantiate(); + } + get dependencySpecifiers() { this.#dependencySpecifiers ??= ObjectFreeze( ArrayPrototypeMap(this.#moduleRequests, (request) => request.specifier)); @@ -435,10 +468,15 @@ class SyntheticModule extends Module { context, identifier, }); + // A synthetic module does not have dependencies. + this[kWrap].link([]); + this[kWrap].instantiate(); } - [kLink]() { - /** nothing to do for synthetic modules */ + link() { + validateThisInternalField(this, kWrap, 'SyntheticModule'); + // No-op for synthetic modules + // Do not invoke super.link() as it will throw an error. 
} setExport(name, value) { diff --git a/src/module_wrap.cc b/src/module_wrap.cc index 72d910fa4a8144..ccd3ded24f92b3 100644 --- a/src/module_wrap.cc +++ b/src/module_wrap.cc @@ -72,6 +72,25 @@ void ModuleCacheKey::MemoryInfo(MemoryTracker* tracker) const { tracker->TrackField("import_attributes", import_attributes); } +std::string ModuleCacheKey::ToString() const { + std::string result = "ModuleCacheKey(\"" + specifier + "\""; + if (!import_attributes.empty()) { + result += ", {"; + bool first = true; + for (const auto& attr : import_attributes) { + if (first) { + first = false; + } else { + result += ", "; + } + result += attr.first + ": " + attr.second; + } + result += "}"; + } + result += ")"; + return result; +} + template ModuleCacheKey ModuleCacheKey::From(Local context, Local specifier, @@ -605,6 +624,8 @@ void ModuleWrap::GetModuleRequests(const FunctionCallbackInfo& args) { // moduleWrap.link(moduleWraps) void ModuleWrap::Link(const FunctionCallbackInfo& args) { Isolate* isolate = args.GetIsolate(); + Realm* realm = Realm::GetCurrent(args); + Local context = realm->context(); ModuleWrap* dependent; ASSIGN_OR_RETURN_UNWRAP(&dependent, args.This()); @@ -616,6 +637,30 @@ void ModuleWrap::Link(const FunctionCallbackInfo& args) { Local modules = args[0].As(); CHECK_EQ(modules->Length(), static_cast(requests->Length())); + for (int i = 0; i < requests->Length(); i++) { + ModuleCacheKey module_cache_key = ModuleCacheKey::From( + context, requests->Get(context, i).As()); + DCHECK(dependent->resolve_cache_.contains(module_cache_key)); + + Local module_i; + Local module_cache_i; + uint32_t coalesced_index = dependent->resolve_cache_[module_cache_key]; + if (!modules->Get(context, i).ToLocal(&module_i) || + !modules->Get(context, coalesced_index).ToLocal(&module_cache_i) || + !module_i->StrictEquals(module_cache_i)) { + // If the module is different from the one of the same request, throw an + // error. + THROW_ERR_MODULE_LINK_MISMATCH( + realm->env(), + "Module request '%s' at index %d must be linked " + "to the same module requested at index %d", + module_cache_key.ToString(), + i, + coalesced_index); + return; + } + } + args.This()->SetInternalField(kLinkedRequestsSlot, modules); dependent->linked_ = true; } @@ -627,6 +672,12 @@ void ModuleWrap::Instantiate(const FunctionCallbackInfo& args) { ASSIGN_OR_RETURN_UNWRAP(&obj, args.This()); Local context = obj->context(); Local module = obj->module_.Get(isolate); + + if (!obj->IsLinked()) { + THROW_ERR_VM_MODULE_LINK_FAILURE(realm->env(), "module is not linked"); + return; + } + TryCatchScope try_catch(realm->env()); USE(module->InstantiateModule( context, ResolveModuleCallback, ResolveSourceCallback)); diff --git a/src/module_wrap.h b/src/module_wrap.h index 467a9af1177b0f..2d0747dcf06dd6 100644 --- a/src/module_wrap.h +++ b/src/module_wrap.h @@ -59,6 +59,9 @@ struct ModuleCacheKey : public MemoryRetainer { SET_SELF_SIZE(ModuleCacheKey) void MemoryInfo(MemoryTracker* tracker) const override; + // Returns a string representation of the ModuleCacheKey. 
+ std::string ToString() const; + template static ModuleCacheKey From(v8::Local context, v8::Local specifier, diff --git a/src/node_errors.h b/src/node_errors.h index 09fe51d6f79825..8919307dd3256c 100644 --- a/src/node_errors.h +++ b/src/node_errors.h @@ -108,6 +108,7 @@ void OOMErrorHandler(const char* location, const v8::OOMDetails& details); V(ERR_MISSING_PASSPHRASE, TypeError) \ V(ERR_MISSING_PLATFORM_FOR_WORKER, Error) \ V(ERR_MODULE_NOT_FOUND, Error) \ + V(ERR_MODULE_LINK_MISMATCH, TypeError) \ V(ERR_NON_CONTEXT_AWARE_DISABLED, Error) \ V(ERR_OPERATION_FAILED, TypeError) \ V(ERR_OPTIONS_BEFORE_BOOTSTRAPPING, Error) \ diff --git a/test/parallel/test-internal-module-wrap.js b/test/parallel/test-internal-module-wrap.js index bed940d1368fa7..8ba92413c6a871 100644 --- a/test/parallel/test-internal-module-wrap.js +++ b/test/parallel/test-internal-module-wrap.js @@ -1,4 +1,4 @@ -// Flags: --expose-internals +// Flags: --expose-internals --js-source-phase-imports 'use strict'; const common = require('../common'); const assert = require('assert'); @@ -6,25 +6,25 @@ const assert = require('assert'); const { internalBinding } = require('internal/test/binding'); const { ModuleWrap } = internalBinding('module_wrap'); -const unlinked = new ModuleWrap('unlinked', undefined, 'export * from "bar";', 0, 0); -assert.throws(() => { - unlinked.instantiate(); -}, { - code: 'ERR_VM_MODULE_LINK_FAILURE', -}); +async function testModuleWrap() { + const unlinked = new ModuleWrap('unlinked', undefined, 'export * from "bar";', 0, 0); + assert.throws(() => { + unlinked.instantiate(); + }, { + code: 'ERR_VM_MODULE_LINK_FAILURE', + }); -const dependsOnUnlinked = new ModuleWrap('dependsOnUnlinked', undefined, 'export * from "unlinked";', 0, 0); -dependsOnUnlinked.link([unlinked]); -assert.throws(() => { - dependsOnUnlinked.instantiate(); -}, { - code: 'ERR_VM_MODULE_LINK_FAILURE', -}); + const dependsOnUnlinked = new ModuleWrap('dependsOnUnlinked', undefined, 'export * from "unlinked";', 0, 0); + dependsOnUnlinked.link([unlinked]); + assert.throws(() => { + dependsOnUnlinked.instantiate(); + }, { + code: 'ERR_VM_MODULE_LINK_FAILURE', + }); -const foo = new ModuleWrap('foo', undefined, 'export * from "bar";', 0, 0); -const bar = new ModuleWrap('bar', undefined, 'export const five = 5', 0, 0); + const foo = new ModuleWrap('foo', undefined, 'export * from "bar";', 0, 0); + const bar = new ModuleWrap('bar', undefined, 'export const five = 5', 0, 0); -(async () => { const moduleRequests = foo.getModuleRequests(); assert.strictEqual(moduleRequests.length, 1); assert.strictEqual(moduleRequests[0].specifier, 'bar'); @@ -37,5 +37,30 @@ const bar = new ModuleWrap('bar', undefined, 'export const five = 5', 0, 0); // Check that the module requests are the same after linking, instantiate, and evaluation. assert.deepStrictEqual(moduleRequests, foo.getModuleRequests()); +} + +// Verify that linking two module with a same ModuleCacheKey throws an error. 
+function testLinkMismatch() { + const foo = new ModuleWrap('foo', undefined, ` + import source BarSource from 'bar'; + import bar from 'bar'; +`, 0, 0); + const bar1 = new ModuleWrap('bar', undefined, 'export const five = 5', 0, 0); + const bar2 = new ModuleWrap('bar', undefined, 'export const six = 6', 0, 0); + const moduleRequests = foo.getModuleRequests(); + assert.strictEqual(moduleRequests.length, 2); + assert.strictEqual(moduleRequests[0].specifier, moduleRequests[1].specifier); + assert.throws(() => { + foo.link([bar1, bar2]); + }, { + code: 'ERR_MODULE_LINK_MISMATCH', + // Test that ModuleCacheKey::ToString() is used in the error message. + message: `Module request 'ModuleCacheKey("bar")' at index 0 must be linked to the same module requested at index 1` + }); +} + +(async () => { + await testModuleWrap(); + testLinkMismatch(); })().then(common.mustCall()); diff --git a/test/parallel/test-vm-module-basic.js b/test/parallel/test-vm-module-basic.js index a8f5b92b7adc4b..f38bc39d70daec 100644 --- a/test/parallel/test-vm-module-basic.js +++ b/test/parallel/test-vm-module-basic.js @@ -99,7 +99,7 @@ const util = require('util'); assert.strictEqual( util.inspect(m), `SyntheticModule { - status: 'unlinked', + status: 'linked', identifier: 'vm:module(0)', context: { foo: 'bar' } }` diff --git a/test/parallel/test-vm-module-instantiate.js b/test/parallel/test-vm-module-instantiate.js new file mode 100644 index 00000000000000..e96a78a8f643d4 --- /dev/null +++ b/test/parallel/test-vm-module-instantiate.js @@ -0,0 +1,99 @@ +'use strict'; + +// Flags: --experimental-vm-modules + +require('../common'); + +const assert = require('assert'); + +const { SourceTextModule } = require('vm'); +const test = require('node:test'); + +test('simple module', () => { + const foo = new SourceTextModule(` + export const foo = 4 + export default 5; + `); + foo.linkRequests([]); + foo.instantiate(); + + assert.deepStrictEqual( + Reflect.ownKeys(foo.namespace), + ['default', 'foo', Symbol.toStringTag] + ); +}); + +test('linkRequests can not be skipped', () => { + const foo = new SourceTextModule(` + export const foo = 4 + export default 5; + `); + assert.throws(() => { + foo.instantiate(); + }, { + code: 'ERR_VM_MODULE_LINK_FAILURE', + }); +}); + +test('re-export simple name', () => { + const foo = new SourceTextModule(` + export { bar } from 'bar'; + `); + const bar = new SourceTextModule(` + export const bar = 42; + `); + foo.linkRequests([bar]); + foo.instantiate(); + + assert.deepStrictEqual( + Reflect.ownKeys(foo.namespace), + ['bar', Symbol.toStringTag] + ); +}); + +test('re-export-star', () => { + const foo = new SourceTextModule(` + export * from 'bar'; + `); + const bar = new SourceTextModule(` + export const bar = 42; + `); + foo.linkRequests([bar]); + foo.instantiate(); + + assert.deepStrictEqual( + Reflect.ownKeys(foo.namespace), + ['bar', Symbol.toStringTag] + ); +}); + +test('deep re-export-star', () => { + let stackTop = new SourceTextModule(` + export const foo = 4; + `); + stackTop.linkRequests([]); + for (let i = 0; i < 10; i++) { + const mod = new SourceTextModule(` + export * from 'stack?top'; + `); + mod.linkRequests([stackTop]); + stackTop = mod; + } + stackTop.instantiate(); + + assert.deepStrictEqual( + Reflect.ownKeys(stackTop.namespace), + ['foo', Symbol.toStringTag] + ); +}); + +test('should throw if the module is not linked', () => { + const foo = new SourceTextModule(` + import { bar } from 'bar'; + `); + assert.throws(() => { + foo.instantiate(); + }, { + code: 
'ERR_VM_MODULE_LINK_FAILURE', + }); +}); diff --git a/test/parallel/test-vm-module-linkmodulerequests-circular.js b/test/parallel/test-vm-module-linkmodulerequests-circular.js new file mode 100644 index 00000000000000..f7992ef3a1eb2c --- /dev/null +++ b/test/parallel/test-vm-module-linkmodulerequests-circular.js @@ -0,0 +1,72 @@ +'use strict'; + +// Flags: --experimental-vm-modules --js-source-phase-imports + +require('../common'); + +const assert = require('assert'); + +const { SourceTextModule } = require('vm'); +const test = require('node:test'); + +test('basic circular linking', async function circular() { + const foo = new SourceTextModule(` + import getFoo from 'bar'; + export let foo = 42; + export default getFoo(); + `); + const bar = new SourceTextModule(` + import { foo } from 'foo'; + export default function getFoo() { + return foo; + } + `); + foo.linkRequests([bar]); + bar.linkRequests([foo]); + + foo.instantiate(); + assert.strictEqual(foo.status, 'linked'); + assert.strictEqual(bar.status, 'linked'); + + await foo.evaluate(); + assert.strictEqual(foo.namespace.default, 42); +}); + +test('circular linking graph', async function circular2() { + const sourceMap = { + 'root': ` + import * as a from './a.mjs'; + import * as b from './b.mjs'; + if (!('fromA' in a)) + throw new Error(); + if (!('fromB' in a)) + throw new Error(); + if (!('fromA' in b)) + throw new Error(); + if (!('fromB' in b)) + throw new Error(); + `, + './a.mjs': ` + export * from './b.mjs'; + export let fromA; + `, + './b.mjs': ` + export * from './a.mjs'; + export let fromB; + ` + }; + const moduleMap = new Map(); + for (const [specifier, source] of Object.entries(sourceMap)) { + moduleMap.set(specifier, new SourceTextModule(source, { + identifier: new URL(specifier, 'file:///').href, + })); + } + for (const mod of moduleMap.values()) { + mod.linkRequests(mod.moduleRequests.map((request) => { + return moduleMap.get(request.specifier); + })); + } + const rootModule = moduleMap.get('root'); + rootModule.instantiate(); + await rootModule.evaluate(); +}); diff --git a/test/parallel/test-vm-module-linkmodulerequests-deep.js b/test/parallel/test-vm-module-linkmodulerequests-deep.js new file mode 100644 index 00000000000000..c73b5afddfb314 --- /dev/null +++ b/test/parallel/test-vm-module-linkmodulerequests-deep.js @@ -0,0 +1,34 @@ +'use strict'; + +// Flags: --experimental-vm-modules --js-source-phase-imports + +require('../common'); + +const assert = require('assert'); + +const { SourceTextModule } = require('vm'); +const test = require('node:test'); + +test('deep linking', async function depth() { + const foo = new SourceTextModule('export default 5'); + foo.linkRequests([]); + foo.instantiate(); + + function getProxy(parentName, parentModule) { + const mod = new SourceTextModule(` + import ${parentName} from '${parentName}'; + export default ${parentName}; + `); + mod.linkRequests([parentModule]); + mod.instantiate(); + return mod; + } + + const bar = getProxy('foo', foo); + const baz = getProxy('bar', bar); + const barz = getProxy('baz', baz); + + await barz.evaluate(); + + assert.strictEqual(barz.namespace.default, 5); +}); diff --git a/test/parallel/test-vm-module-linkmodulerequests.js b/test/parallel/test-vm-module-linkmodulerequests.js new file mode 100644 index 00000000000000..6d9a4324e5aecc --- /dev/null +++ b/test/parallel/test-vm-module-linkmodulerequests.js @@ -0,0 +1,67 @@ +'use strict'; + +// Flags: --experimental-vm-modules --js-source-phase-imports + +require('../common'); + +const assert = 
require('assert'); + +const { SourceTextModule } = require('vm'); +const test = require('node:test'); + +test('simple graph', async function simple() { + const foo = new SourceTextModule('export default 5;'); + foo.linkRequests([]); + + globalThis.fiveResult = undefined; + const bar = new SourceTextModule('import five from "foo"; fiveResult = five'); + + bar.linkRequests([foo]); + bar.instantiate(); + + await bar.evaluate(); + assert.strictEqual(globalThis.fiveResult, 5); + delete globalThis.fiveResult; +}); + +test('invalid link values', () => { + const invalidValues = [ + undefined, + null, + {}, + SourceTextModule.prototype, + ]; + + for (const value of invalidValues) { + const module = new SourceTextModule('import "foo"'); + assert.throws(() => module.linkRequests([value]), { + code: 'ERR_VM_MODULE_NOT_MODULE', + }); + } +}); + +test('mismatch linkage', () => { + const foo = new SourceTextModule('export default 5;'); + foo.linkRequests([]); + + // Link with more modules than requested. + const bar = new SourceTextModule('import foo from "foo";'); + assert.throws(() => bar.linkRequests([foo, foo]), { + code: 'ERR_MODULE_LINK_MISMATCH', + }); + + // Link with fewer modules than requested. + const baz = new SourceTextModule('import foo from "foo"; import bar from "bar";'); + assert.throws(() => baz.linkRequests([foo]), { + code: 'ERR_MODULE_LINK_MISMATCH', + }); + + // Link a same module cache key with different instances. + const qux = new SourceTextModule(` + import foo from "foo"; + import source Foo from "foo"; + `); + assert.throws(() => qux.linkRequests([foo, bar]), { + code: 'ERR_MODULE_LINK_MISMATCH', + }); +}); diff --git a/test/parallel/test-vm-module-modulerequests.js b/test/parallel/test-vm-module-modulerequests.js index 87fb34cae3823e..aaa892cbab2307 100644 --- a/test/parallel/test-vm-module-modulerequests.js +++ b/test/parallel/test-vm-module-modulerequests.js @@ -12,15 +12,21 @@ const test = require('node:test'); test('SourceTextModule.moduleRequests should return module requests', (t) => { const m = new SourceTextModule(` import { foo } from './foo.js'; + import * as FooDuplicate from './foo.js'; import { bar } from './bar.json' with { type: 'json' }; + import * as BarDuplicate from './bar.json' with { type: 'json' }; import { quz } from './quz.js' with { attr1: 'quz' }; + import * as QuzDuplicate from './quz.js' with { attr1: 'quz' }; import { quz as quz2 } from './quz.js' with { attr2: 'quark', attr3: 'baz' }; + import * as Quz2Duplicate from './quz.js' with { attr2: 'quark', attr3: 'baz' }; import source Module from './source-module'; + import source Module2 from './source-module'; + import * as SourceModule from './source-module'; export { foo, bar, quz, quz2 }; `); const requests = m.moduleRequests; - assert.strictEqual(requests.length, 5); + assert.strictEqual(requests.length, 6); assert.deepStrictEqual(requests[0], { __proto__: null, specifier: './foo.js', @@ -65,6 +71,14 @@ test('SourceTextModule.moduleRequests should return module requests', (t) => { }, phase: 'source', }); + assert.deepStrictEqual(requests[5], { + __proto__: null, + specifier: './source-module', + attributes: { + __proto__: null, + }, + phase: 'evaluation', + }); // Check the deprecated dependencySpecifiers property. // The dependencySpecifiers items are not unique. 
@@ -74,6 +88,7 @@ test('SourceTextModule.moduleRequests should return module requests', (t) => { './quz.js', './quz.js', './source-module', + './source-module', ]); }); diff --git a/test/parallel/test-vm-module-synthetic.js b/test/parallel/test-vm-module-synthetic.js index 831387ddbd2a26..c32a99ede12cfa 100644 --- a/test/parallel/test-vm-module-synthetic.js +++ b/test/parallel/test-vm-module-synthetic.js @@ -58,12 +58,10 @@ const assert = require('assert'); } { - const s = new SyntheticModule([], () => {}); - assert.throws(() => { - s.setExport('name', 'value'); - }, { - code: 'ERR_VM_MODULE_STATUS', - }); + const s = new SyntheticModule(['name'], () => {}); + // Exports of SyntheticModule can be immediately set after creation. + // No link is required. + s.setExport('name', 'value'); } for (const value of [null, {}, SyntheticModule.prototype]) { From 6ab9306370bc036e308e661f955cbe5dfd801cf2 Mon Sep 17 00:00:00 2001 From: Stefan Stojanovic Date: Fri, 29 Aug 2025 14:34:55 +0200 Subject: [PATCH 022/103] doc: update install_tools.bat free disk space MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Fixes: https://github.com/nodejs/node/issues/59326 PR-URL: https://github.com/nodejs/node/pull/59579 Reviewed-By: Michaël Zasso Reviewed-By: Chengzhong Wu Reviewed-By: Luigi Pinca --- tools/msvs/install_tools/install_tools.bat | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tools/msvs/install_tools/install_tools.bat b/tools/msvs/install_tools/install_tools.bat index 4b2bdec0f31c44..c3f7a287bfd071 100644 --- a/tools/msvs/install_tools/install_tools.bat +++ b/tools/msvs/install_tools/install_tools.bat @@ -13,7 +13,7 @@ echo This script will install Python and the Visual Studio Build Tools, necessar echo to compile Node.js native modules. Note that Chocolatey and required Windows echo updates will also be installed. echo. -echo This will require about 3 GiB of free disk space, plus any space necessary to +echo This will require about 7 GiB of free disk space, plus any space necessary to echo install Windows updates. This will take a while to run. echo. echo Please close all open programs for the duration of the installation. If the From d6d05ba397cd58ad2475ac0e7b9d7ce4d547d99e Mon Sep 17 00:00:00 2001 From: theanarkh Date: Fri, 29 Aug 2025 23:42:21 +0800 Subject: [PATCH 023/103] worker: add cpu profile APIs for worker PR-URL: https://github.com/nodejs/node/pull/59428 Reviewed-By: Anna Henningsen Reviewed-By: James M Snell Reviewed-By: Matteo Collina Reviewed-By: Stephen Belanger --- doc/api/errors.md | 30 ++++ doc/api/worker_threads.md | 30 ++++ lib/internal/worker.js | 36 +++++ src/async_wrap.h | 1 + src/env.cc | 36 +++++ src/env.h | 7 + src/env_properties.h | 1 + src/node_errors.h | 3 + src/node_worker.cc | 149 +++++++++++++++++- src/node_worker.h | 2 + src/util.h | 21 +++ test/parallel/test-worker-cpu-profile.js | 73 +++++++++ test/sequential/test-async-wrap-getasyncid.js | 1 + typings/internalBinding/worker.d.ts | 1 + 14 files changed, 390 insertions(+), 1 deletion(-) create mode 100644 test/parallel/test-worker-cpu-profile.js diff --git a/doc/api/errors.md b/doc/api/errors.md index 2b4b53449ab186..3bef4a5cdfe2b4 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -826,6 +826,36 @@ when an error occurs (and is caught) during the creation of the context, for example, when the allocation fails or the maximum call stack size is reached when the context is created. 
+ + +### `ERR_CPU_PROFILE_ALREADY_STARTED` + + + +The CPU profile with the given name is already started. + + + +### `ERR_CPU_PROFILE_NOT_STARTED` + + + +The CPU profile with the given name is not started. + + + +### `ERR_CPU_PROFILE_TOO_MANY` + + + +There are too many CPU profiles being collected. + ### `ERR_CRYPTO_ARGON2_NOT_SUPPORTED` diff --git a/doc/api/worker_threads.md b/doc/api/worker_threads.md index 80949039875db2..4820b37dc01d28 100644 --- a/doc/api/worker_threads.md +++ b/doc/api/worker_threads.md @@ -1953,6 +1953,36 @@ this matches its values. If the worker has stopped, the return value is an empty object. +### `worker.startCpuProfile(name)` + + + +* name: {string} +* Returns: {Promise} + +Starting a CPU profile with the given `name`, then return a Promise that fulfills +with an error or an object which has a `stop` method. Calling the `stop` method will +stop collecting the profile, then return a Promise that fulfills with an error or the +profile data. + +```cjs +const { Worker } = require('node:worker_threads'); + +const worker = new Worker(` + const { parentPort } = require('worker_threads'); + parentPort.on('message', () => {}); + `, { eval: true }); + +worker.on('online', async () => { + const handle = await worker.startCpuProfile('demo'); + const profile = await handle.stop(); + console.log(profile); + worker.terminate(); +}); +``` + ### `worker.stderr` -* `headers` {HTTP/2 Headers Object|Array} +* `headers` {HTTP/2 Headers Object|HTTP/2 Raw Headers} * `options` {Object} * `endStream` {boolean} `true` if the `Http2Stream` _writable_ side should @@ -1675,11 +1674,12 @@ added: v8.4.0 * `headers` {HTTP/2 Headers Object} * `flags` {number} +* `rawHeaders` {HTTP/2 Raw Headers} The `'headers'` event is emitted when an additional block of headers is received for a stream, such as when a block of `1xx` informational headers is received. -The listener callback is passed the [HTTP/2 Headers Object][] and flags -associated with the headers. +The listener callback is passed the [HTTP/2 Headers Object][], flags associated +with the headers, and the headers in raw format (see [HTTP/2 Raw Headers][]). ```js stream.on('headers', (headers, flags) => { @@ -1714,11 +1714,13 @@ added: v8.4.0 * `headers` {HTTP/2 Headers Object} * `flags` {number} +* `rawHeaders` {HTTP/2 Raw Headers} The `'response'` event is emitted when a response `HEADERS` frame has been received for this stream from the connected HTTP/2 server. The listener is -invoked with two arguments: an `Object` containing the received -[HTTP/2 Headers Object][], and flags associated with the headers. +invoked with three arguments: an `Object` containing the received +[HTTP/2 Headers Object][], flags associated with the headers, and the headers +in raw format (see [HTTP/2 Raw Headers][]). ```mjs import { connect } from 'node:http2'; @@ -1866,7 +1868,7 @@ changes: description: Allow explicitly setting date headers. --> -* `headers` {HTTP/2 Headers Object|Array} +* `headers` {HTTP/2 Headers Object|HTTP/2 Raw Headers} * `options` {Object} * `endStream` {boolean} Set to `true` to indicate that the response will not include payload data. @@ -2350,8 +2352,7 @@ added: v8.4.0 * `stream` {Http2Stream} A reference to the stream * `headers` {HTTP/2 Headers Object} An object describing the headers * `flags` {number} The associated numeric flags -* `rawHeaders` {Array} An array containing the raw header names followed by - their respective values. 
+* `rawHeaders` {HTTP/2 Raw Headers} An array containing the raw headers The `'stream'` event is emitted when a `'stream'` event has been emitted by an `Http2Session` associated with the server. @@ -2606,8 +2607,7 @@ added: v8.4.0 * `stream` {Http2Stream} A reference to the stream * `headers` {HTTP/2 Headers Object} An object describing the headers * `flags` {number} The associated numeric flags -* `rawHeaders` {Array} An array containing the raw header names followed by - their respective values. +* `rawHeaders` {HTTP/2 Raw Headers} An array containing the raw headers The `'stream'` event is emitted when a `'stream'` event has been emitted by an `Http2Session` associated with the server. @@ -3450,6 +3450,32 @@ server.on('stream', (stream, headers) => { }); ``` +#### Raw headers + +In some APIs, in addition to object format, headers can also be passed or +accessed as a raw flat array, preserving details of ordering and +duplicate keys to match the raw transmission format. + +In this format the keys and values are in the same list. It is _not_ a +list of tuples. So, the even-numbered offsets are key values, and the +odd-numbered offsets are the associated values. Duplicate headers are +not merged and so each key-value pair will appear separately. + +This can be useful for cases such as proxies, where existing headers +should be exactly forwarded as received, or as a performance +optimization when the headers are already available in raw format. + +```js +const rawHeaders = [ + ':status', + '404', + 'content-type', + 'text/plain', +]; + +stream.respond(rawHeaders); +``` + #### Sensitive headers HTTP2 headers can be marked as sensitive, which means that the HTTP/2 @@ -3476,6 +3502,10 @@ this flag is set automatically. This property is also set for received headers. It will contain the names of all headers marked as sensitive, including ones marked that way automatically. +For raw headers, this should still be set as a property on the array, like +`rawHeadersArray[http2.sensitiveHeaders] = ['cookie']`, not as a separate key +and value pair within the array itself. + ### Settings object -* Type: {string\[]} +* Type: {HTTP/2 Raw Headers} The raw request/response headers list exactly as they were received. -The keys and values are in the same list. It is _not_ a -list of tuples. So, the even-numbered offsets are key values, and the -odd-numbered offsets are the associated values. - -Header names are not lowercased, and duplicates are not merged. - ```js // Prints something like: // @@ -4762,7 +4786,7 @@ changes: * `statusCode` {number} * `statusMessage` {string} -* `headers` {Object|Array} +* `headers` {HTTP/2 Headers Object|HTTP/2 Raw Headers} * Returns: {http2.Http2ServerResponse} Sends a response header to the request. The status code is a 3-digit HTTP @@ -4906,6 +4930,7 @@ you need to implement any fall-back behavior yourself. 
[HTTP/1]: http.md [HTTP/2]: https://tools.ietf.org/html/rfc7540 [HTTP/2 Headers Object]: #headers-object +[HTTP/2 Raw Headers]: #raw-headers [HTTP/2 Settings Object]: #settings-object [HTTP/2 Unencrypted]: https://http2.github.io/faq/#does-http2-require-encryption [HTTPS]: https.md diff --git a/tools/doc/type-parser.mjs b/tools/doc/type-parser.mjs index 898903c280ff11..1f6b63c1977578 100644 --- a/tools/doc/type-parser.mjs +++ b/tools/doc/type-parser.mjs @@ -172,6 +172,7 @@ const customTypesMap = { 'ClientHttp2Session': 'http2.html#class-clienthttp2session', 'ClientHttp2Stream': 'http2.html#class-clienthttp2stream', 'HTTP/2 Headers Object': 'http2.html#headers-object', + 'HTTP/2 Raw Headers': 'http2.html#raw-headers', 'HTTP/2 Settings Object': 'http2.html#settings-object', 'http2.Http2ServerRequest': 'http2.html#class-http2http2serverrequest', 'http2.Http2ServerResponse': From 3d88aa9f2f467540f47f47ab9585fdfab8b1d3e9 Mon Sep 17 00:00:00 2001 From: theanarkh Date: Sat, 30 Aug 2025 01:20:54 +0800 Subject: [PATCH 025/103] src: remove duplicate code MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59649 Reviewed-By: Edy Silva Reviewed-By: Gerhard Stöbich Reviewed-By: Tobias Nießen --- src/api/embed_helpers.cc | 1 - 1 file changed, 1 deletion(-) diff --git a/src/api/embed_helpers.cc b/src/api/embed_helpers.cc index d05b26c73fa87a..f6ad46dae3db9a 100644 --- a/src/api/embed_helpers.cc +++ b/src/api/embed_helpers.cc @@ -116,7 +116,6 @@ CommonEnvironmentSetup::CommonEnvironmentSetup( Isolate::CreateParams params; params.array_buffer_allocator = impl_->allocator.get(); params.external_references = external_references.data(); - params.external_references = external_references.data(); params.cpp_heap = v8::CppHeap::Create(platform, v8::CppHeapCreateParams{{}}).release(); From 56ac9a2d4652044ef4f73514904c4f6feeb958c0 Mon Sep 17 00:00:00 2001 From: Chengzhong Wu Date: Fri, 29 Aug 2025 23:41:00 +0100 Subject: [PATCH 026/103] src: migrate WriteOneByte to WriteOneByteV2 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59634 Fixes: https://github.com/nodejs/node/issues/59555 Reviewed-By: Rafael Gonzaga Reviewed-By: Gerhard Stöbich Reviewed-By: Yagiz Nizipli Reviewed-By: Darshan Sen Reviewed-By: Matteo Collina --- src/node_buffer.cc | 20 +++---- src/node_http2.cc | 24 ++++---- src/node_http_common-inl.h | 17 +++--- src/string_bytes.cc | 12 ++-- test/cctest/test_string_bytes.cc | 100 +++++++++++++++++++++++++++++++ 5 files changed, 137 insertions(+), 36 deletions(-) create mode 100644 test/cctest/test_string_bytes.cc diff --git a/src/node_buffer.cc b/src/node_buffer.cc index 836f93697a2609..0e4d437c1ea501 100644 --- a/src/node_buffer.cc +++ b/src/node_buffer.cc @@ -1037,8 +1037,11 @@ void IndexOfString(const FunctionCallbackInfo& args) { if (needle_data == nullptr) { return args.GetReturnValue().Set(-1); } - needle->WriteOneByte( - isolate, needle_data, 0, needle_length, String::NO_NULL_TERMINATION); + StringBytes::Write(isolate, + reinterpret_cast(needle_data), + needle_length, + needle, + enc); result = nbytes::SearchString(reinterpret_cast(haystack), haystack_length, @@ -1302,11 +1305,7 @@ static void Btoa(const FunctionCallbackInfo& args) { simdutf::binary_to_base64(ext->data(), ext->length(), buffer.out()); } else if (input->IsOneByte()) { MaybeStackBuffer stack_buf(input->Length()); - input->WriteOneByte(env->isolate(), - 
stack_buf.out(), - 0, - input->Length(), - String::NO_NULL_TERMINATION); + input->WriteOneByteV2(env->isolate(), 0, input->Length(), stack_buf.out()); size_t expected_length = simdutf::base64_length_from_binary(input->Length()); @@ -1362,11 +1361,8 @@ static void Atob(const FunctionCallbackInfo& args) { ext->data(), ext->length(), buffer.out(), simdutf::base64_default); } else if (input->IsOneByte()) { MaybeStackBuffer stack_buf(input->Length()); - input->WriteOneByte(args.GetIsolate(), - stack_buf.out(), - 0, - input->Length(), - String::NO_NULL_TERMINATION); + input->WriteOneByteV2( + args.GetIsolate(), 0, input->Length(), stack_buf.out()); const char* data = reinterpret_cast(*stack_buf); size_t expected_length = simdutf::maximal_binary_length_from_base64(data, input->Length()); diff --git a/src/node_http2.cc b/src/node_http2.cc index 8e51129930f2cd..e53022d94f5ca1 100644 --- a/src/node_http2.cc +++ b/src/node_http2.cc @@ -483,13 +483,10 @@ Origins::Origins( CHECK_LE(origin_contents + origin_string_len, static_cast(bs_->Data()) + bs_->ByteLength()); - CHECK_EQ(origin_string->WriteOneByte( - env->isolate(), - reinterpret_cast(origin_contents), - 0, - origin_string_len, - String::NO_NULL_TERMINATION), - origin_string_len); + origin_string->WriteOneByteV2(env->isolate(), + 0, + origin_string_len, + reinterpret_cast(origin_contents)); size_t n = 0; char* p; @@ -3186,8 +3183,8 @@ void Http2Session::AltSvc(const FunctionCallbackInfo& args) { return; } - size_t origin_len = origin_str->Length(); - size_t value_len = value_str->Length(); + int origin_len = origin_str->Length(); + int value_len = value_str->Length(); CHECK_LE(origin_len + value_len, 16382); // Max permitted for ALTSVC // Verify that origin len != 0 if stream id == 0, or @@ -3196,8 +3193,13 @@ void Http2Session::AltSvc(const FunctionCallbackInfo& args) { MaybeStackBuffer origin(origin_len); MaybeStackBuffer value(value_len); - origin_str->WriteOneByte(env->isolate(), *origin); - value_str->WriteOneByte(env->isolate(), *value); + origin_str->WriteOneByteV2(env->isolate(), + 0, + origin_len, + *origin, + String::WriteFlags::kNullTerminate); + value_str->WriteOneByteV2( + env->isolate(), 0, value_len, *value, String::WriteFlags::kNullTerminate); session->AltSvc(id, *origin, origin_len, *value, value_len); } diff --git a/src/node_http_common-inl.h b/src/node_http_common-inl.h index f7f4408ecb6eaa..339e5a612312c0 100644 --- a/src/node_http_common-inl.h +++ b/src/node_http_common-inl.h @@ -2,9 +2,11 @@ #define SRC_NODE_HTTP_COMMON_INL_H_ #include "node_http_common.h" + +#include "env-inl.h" #include "node.h" #include "node_mem-inl.h" -#include "env-inl.h" +#include "string_bytes.h" #include "v8.h" #include @@ -37,13 +39,12 @@ NgHeaders::NgHeaders(Environment* env, v8::Local headers) { nv_t* const nva = reinterpret_cast(start); CHECK_LE(header_contents + header_string_len, *buf_ + buf_.length()); - CHECK_EQ(header_string.As()->WriteOneByte( - env->isolate(), - reinterpret_cast(header_contents), - 0, - header_string_len, - v8::String::NO_NULL_TERMINATION), - header_string_len); + CHECK_EQ(StringBytes::Write(env->isolate(), + header_contents, + header_string_len, + header_string.As(), + LATIN1), + static_cast(header_string_len)); size_t n = 0; char* p; diff --git a/src/string_bytes.cc b/src/string_bytes.cc index f4411c2126f859..d78bbb237325d2 100644 --- a/src/string_bytes.cc +++ b/src/string_bytes.cc @@ -254,11 +254,13 @@ size_t StringBytes::Write(Isolate* isolate, nbytes = std::min(buflen, static_cast(input_view.length())); memcpy(buf, 
input_view.data8(), nbytes);
       } else {
-        uint8_t* const dst = reinterpret_cast<uint8_t*>(buf);
-        const int flags = String::HINT_MANY_WRITES_EXPECTED |
-                          String::NO_NULL_TERMINATION |
-                          String::REPLACE_INVALID_UTF8;
-        nbytes = str->WriteOneByte(isolate, dst, 0, buflen, flags);
+        nbytes = std::min(buflen, static_cast<size_t>(input_view.length()));
+        // Do not use v8::String::WriteOneByteV2 as it asserts the string to be
+        // a one byte string. For compatibility, convert the uint16_t to uint8_t
+        // even though this may lose accuracy.
+        for (size_t i = 0; i < nbytes; i++) {
+          buf[i] = static_cast<uint8_t>(input_view.data16()[i]);
+        }
       }
       break;
 
diff --git a/test/cctest/test_string_bytes.cc b/test/cctest/test_string_bytes.cc
new file mode 100644
index 00000000000000..bc308918680bb1
--- /dev/null
+++ b/test/cctest/test_string_bytes.cc
@@ -0,0 +1,100 @@
+#include "gtest/gtest.h"
+#include "node.h"
+#include "node_test_fixture.h"
+#include "string_bytes.h"
+#include "util-inl.h"
+
+using node::MaybeStackBuffer;
+using node::StringBytes;
+using v8::HandleScope;
+using v8::Local;
+using v8::Maybe;
+using v8::String;
+
+class StringBytesTest : public EnvironmentTestFixture {};
+
+// Data "Hello, ÆÊÎÖÿ"
+static const char latin1_data[] = "Hello, \xC6\xCA\xCE\xD6\xFF";
+static const char utf8_data[] = "Hello, ÆÊÎÖÿ";
+
+TEST_F(StringBytesTest, WriteLatin1WithOneByteString) {
+  const HandleScope handle_scope(isolate_);
+  const Argv argv;
+  Env env_{handle_scope, argv};
+
+  Local<String> one_byte_str =
+      String::NewFromOneByte(isolate_,
+                             reinterpret_cast<const uint8_t*>(latin1_data))
+          .ToLocalChecked();
+
+  Maybe<size_t> size_maybe =
+      StringBytes::StorageSize(isolate_, one_byte_str, node::LATIN1);
+
+  ASSERT_TRUE(size_maybe.IsJust());
+  size_t size = size_maybe.FromJust();
+  ASSERT_EQ(size, 12u);
+
+  MaybeStackBuffer<char> buf;
+  size_t written = StringBytes::Write(
+      isolate_, buf.out(), buf.capacity(), one_byte_str, node::LATIN1);
+  ASSERT_EQ(written, 12u);
+
+  // Null-terminate the buffer and compare the contents.
+  buf.SetLength(13);
+  buf[12] = '\0';
+  ASSERT_STREQ(latin1_data, buf.out());
+}
+
+TEST_F(StringBytesTest, WriteLatin1WithUtf8String) {
+  const HandleScope handle_scope(isolate_);
+  const Argv argv;
+  Env env_{handle_scope, argv};
+
+  Local<String> utf8_str =
+      String::NewFromUtf8(isolate_, utf8_data).ToLocalChecked();
+
+  Maybe<size_t> size_maybe =
+      StringBytes::StorageSize(isolate_, utf8_str, node::LATIN1);
+
+  ASSERT_TRUE(size_maybe.IsJust());
+  size_t size = size_maybe.FromJust();
+  ASSERT_EQ(size, 12u);
+
+  MaybeStackBuffer<char> buf;
+  size_t written = StringBytes::Write(
+      isolate_, buf.out(), buf.capacity(), utf8_str, node::LATIN1);
+  ASSERT_EQ(written, 12u);
+
+  // Null-terminate the buffer and compare the contents.
+  buf.SetLength(13);
+  buf[12] = '\0';
+  ASSERT_STREQ(latin1_data, buf.out());
+}
+
+// Verify that StringBytes::Write converts two-byte characters to one-byte
+// characters, even if there is no valid one-byte representation.
+TEST_F(StringBytesTest, WriteLatin1WithInvalidChar) {
+  const HandleScope handle_scope(isolate_);
+  const Argv argv;
+  Env env_{handle_scope, argv};
+
+  Local<String> utf8_str =
+      String::NewFromUtf8(isolate_, "Hello, 世界").ToLocalChecked();
+
+  Maybe<size_t> size_maybe =
+      StringBytes::StorageSize(isolate_, utf8_str, node::LATIN1);
+
+  ASSERT_TRUE(size_maybe.IsJust());
+  size_t size = size_maybe.FromJust();
+  ASSERT_EQ(size, 9u);
+
+  MaybeStackBuffer<char> buf;
+  size_t written = StringBytes::Write(
+      isolate_, buf.out(), buf.capacity(), utf8_str, node::LATIN1);
+  ASSERT_EQ(written, 9u);
+
+  // Null-terminate the buffer and compare the contents.
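+  // '世' (U+4E16) and '界' (U+754C) have no Latin-1 representation; only the
+  // low byte of each code unit survives, which is why the expected output
+  // below ends in the bytes 0x16 and 0x4C.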
+ buf.SetLength(10); + buf[9] = '\0'; + ASSERT_STREQ("Hello, \x16\x4C", buf.out()); +} From df63d37ec412212d6462fc22606eb6be87e63f53 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Micha=C3=ABl=20Zasso?= Date: Sat, 30 Aug 2025 11:47:38 +0200 Subject: [PATCH 027/103] test: fix internet/test-dns The `nodejs.org` domain has now two TXT records. Do not verify the exact number of records returned (only their shape), and check that one of them is the SPF. PR-URL: https://github.com/nodejs/node/pull/59660 Reviewed-By: Antoine du Hamel Reviewed-By: Joyee Cheung Reviewed-By: Marco Ippolito Reviewed-By: Luigi Pinca Reviewed-By: Yagiz Nizipli --- test/internet/test-dns.js | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/test/internet/test-dns.js b/test/internet/test-dns.js index 84ee4f7c9ca9de..9b08f8f92b00d3 100644 --- a/test/internet/test-dns.js +++ b/test/internet/test-dns.js @@ -523,9 +523,9 @@ TEST(function test_resolveTlsa_failure(done) { TEST(async function test_resolveTxt(done) { function validateResult(result) { - assert.ok(Array.isArray(result[0])); - assert.strictEqual(result.length, 1); - assert(result[0][0].startsWith('v=spf1')); + assert.ok(result.length > 0); + assert.ok(result.every((elem) => Array.isArray(elem) && elem.length === 1)); + assert.ok(result.some((elem) => elem[0].startsWith('v=spf1'))); } validateResult(await dnsPromises.resolveTxt(addresses.TXT_HOST)); From 1a93df808cf8eb431d375a66e2434607257d1ae4 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Ren=C3=A9?= Date: Sat, 30 Aug 2025 15:28:10 +0100 Subject: [PATCH 028/103] lib: revert to using default derived class constructors MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59650 Reviewed-By: Matteo Collina Reviewed-By: Michaël Zasso Reviewed-By: Antoine du Hamel --- lib/inspector/promises.js | 4 +-- lib/internal/buffer.js | 9 +------ lib/internal/crypto/keys.js | 5 ---- lib/internal/fs/glob.js | 1 - lib/internal/modules/esm/module_map.js | 3 --- lib/internal/per_context/primordials.js | 33 ++++++------------------- lib/internal/readline/interface.js | 4 --- lib/internal/repl/utils.js | 4 --- lib/readline/promises.js | 4 --- 9 files changed, 9 insertions(+), 58 deletions(-) diff --git a/lib/inspector/promises.js b/lib/inspector/promises.js index 462941f1d5b597..9122ac1522092a 100644 --- a/lib/inspector/promises.js +++ b/lib/inspector/promises.js @@ -3,9 +3,7 @@ const inspector = require('inspector'); const { promisify } = require('internal/util'); -class Session extends inspector.Session { - constructor() { super(); } // eslint-disable-line no-useless-constructor -} +class Session extends inspector.Session {} Session.prototype.post = promisify(inspector.Session.prototype.post); module.exports = { diff --git a/lib/internal/buffer.js b/lib/internal/buffer.js index 2d249ccdda5ae4..e0679f5306f89e 100644 --- a/lib/internal/buffer.js +++ b/lib/internal/buffer.js @@ -960,14 +960,7 @@ function writeFloatBackwards(val, offset = 0) { return offset; } -class FastBuffer extends Uint8Array { - // Using an explicit constructor here is necessary to avoid relying on - // `Array.prototype[Symbol.iterator]`, which can be mutated by users. 
- // eslint-disable-next-line no-useless-constructor - constructor(bufferOrLength, byteOffset, length) { - super(bufferOrLength, byteOffset, length); - } -} +class FastBuffer extends Uint8Array {} function addBufferPrototypeMethods(proto) { proto.readBigUInt64LE = readBigUInt64LE; diff --git a/lib/internal/crypto/keys.js b/lib/internal/crypto/keys.js index 69609b91efc9cc..01b2d08035ec45 100644 --- a/lib/internal/crypto/keys.js +++ b/lib/internal/crypto/keys.js @@ -253,11 +253,6 @@ const { } class AsymmetricKeyObject extends KeyObject { - // eslint-disable-next-line no-useless-constructor - constructor(type, handle) { - super(type, handle); - } - get asymmetricKeyType() { return this[kAsymmetricKeyType] ||= this[kHandle].getAsymmetricKeyType(); } diff --git a/lib/internal/fs/glob.js b/lib/internal/fs/glob.js index 05252348daee7a..3e9c83356ce1f3 100644 --- a/lib/internal/fs/glob.js +++ b/lib/internal/fs/glob.js @@ -240,7 +240,6 @@ class Pattern { class ResultSet extends SafeSet { #root = '.'; #isExcluded = () => false; - constructor(i) { super(i); } // eslint-disable-line no-useless-constructor setup(root, isExcludedFn) { this.#root = root; diff --git a/lib/internal/modules/esm/module_map.js b/lib/internal/modules/esm/module_map.js index 0e411bed7f8ba1..207515573c79c5 100644 --- a/lib/internal/modules/esm/module_map.js +++ b/lib/internal/modules/esm/module_map.js @@ -23,8 +23,6 @@ const { validateString } = require('internal/validators'); * This cache is *not* used when custom loaders are registered. */ class ResolveCache extends SafeMap { - constructor(i) { super(i); } // eslint-disable-line no-useless-constructor - /** * Generates the internal serialized cache key and returns it along the actual cache object. * @@ -93,7 +91,6 @@ class ResolveCache extends SafeMap { * Cache the results of the `load` step of the module resolution and loading process. */ class LoadCache extends SafeMap { - constructor(i) { super(i); } // eslint-disable-line no-useless-constructor get(url, type = kImplicitTypeAttribute) { validateString(url, 'url'); validateString(type, 'type'); diff --git a/lib/internal/per_context/primordials.js b/lib/internal/per_context/primordials.js index ecee4cd66eba44..aee7de99086bd1 100644 --- a/lib/internal/per_context/primordials.js +++ b/lib/internal/per_context/primordials.js @@ -401,55 +401,36 @@ primordials.makeSafe = makeSafe; // Subclass the constructors because we need to use their prototype // methods later. -// Defining the `constructor` is necessary here to avoid the default -// constructor which uses the user-mutable `%ArrayIteratorPrototype%.next`. 
primordials.SafeMap = makeSafe( Map, - class SafeMap extends Map { - constructor(i) { super(i); } // eslint-disable-line no-useless-constructor - }, + class SafeMap extends Map {}, ); primordials.SafeWeakMap = makeSafe( WeakMap, - class SafeWeakMap extends WeakMap { - constructor(i) { super(i); } // eslint-disable-line no-useless-constructor - }, + class SafeWeakMap extends WeakMap {}, ); primordials.SafeSet = makeSafe( Set, - class SafeSet extends Set { - constructor(i) { super(i); } // eslint-disable-line no-useless-constructor - }, + class SafeSet extends Set {}, ); primordials.SafeWeakSet = makeSafe( WeakSet, - class SafeWeakSet extends WeakSet { - constructor(i) { super(i); } // eslint-disable-line no-useless-constructor - }, + class SafeWeakSet extends WeakSet {}, ); primordials.SafeFinalizationRegistry = makeSafe( FinalizationRegistry, - class SafeFinalizationRegistry extends FinalizationRegistry { - // eslint-disable-next-line no-useless-constructor - constructor(cleanupCallback) { super(cleanupCallback); } - }, + class SafeFinalizationRegistry extends FinalizationRegistry {}, ); primordials.SafeWeakRef = makeSafe( WeakRef, - class SafeWeakRef extends WeakRef { - // eslint-disable-next-line no-useless-constructor - constructor(target) { super(target); } - }, + class SafeWeakRef extends WeakRef {}, ); const SafePromise = makeSafe( Promise, - class SafePromise extends Promise { - // eslint-disable-next-line no-useless-constructor - constructor(executor) { super(executor); } - }, + class SafePromise extends Promise {}, ); /** diff --git a/lib/internal/readline/interface.js b/lib/internal/readline/interface.js index 5ebfa44ecba068..4fd69d43c48561 100644 --- a/lib/internal/readline/interface.js +++ b/lib/internal/readline/interface.js @@ -359,10 +359,6 @@ ObjectSetPrototypeOf(InterfaceConstructor.prototype, EventEmitter.prototype); ObjectSetPrototypeOf(InterfaceConstructor, EventEmitter); class Interface extends InterfaceConstructor { - // eslint-disable-next-line no-useless-constructor - constructor(input, output, completer, terminal) { - super(input, output, completer, terminal); - } get columns() { if (this.output?.columns) return this.output.columns; return Infinity; diff --git a/lib/internal/repl/utils.js b/lib/internal/repl/utils.js index 88919653d26508..515e99f925c118 100644 --- a/lib/internal/repl/utils.js +++ b/lib/internal/repl/utils.js @@ -97,10 +97,6 @@ function isRecoverableError(e, code) { .extend( (Parser) => { return class extends Parser { - // eslint-disable-next-line no-useless-constructor - constructor(options, input, startPos) { - super(options, input, startPos); - } nextToken() { super.nextToken(); if (this.type === tt.eof) diff --git a/lib/readline/promises.js b/lib/readline/promises.js index a4ad0adabaf228..4e9fea1f8ecb49 100644 --- a/lib/readline/promises.js +++ b/lib/readline/promises.js @@ -27,10 +27,6 @@ const { let addAbortListener; class Interface extends _Interface { - // eslint-disable-next-line no-useless-constructor - constructor(input, output, completer, terminal) { - super(input, output, completer, terminal); - } question(query, options = kEmptyObject) { return new Promise((resolve, reject) => { let cb = resolve; From 5e025c7ca702f75d834b71c59eea6bf3f3938529 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?=EB=B0=A9=EC=A7=84=ED=98=81?= Date: Sat, 30 Aug 2025 23:37:15 +0900 Subject: [PATCH 029/103] stream: replace manual function validation with validateFunction Replace repetitive manual function type checking with the existing validateFunction in multiple 
stream operator functions. PR-URL: https://github.com/nodejs/node/pull/59529 Reviewed-By: Matteo Collina Reviewed-By: Mattias Buelens Reviewed-By: Luigi Pinca Reviewed-By: James M Snell --- lib/internal/streams/operators.js | 27 ++++++--------------------- 1 file changed, 6 insertions(+), 21 deletions(-) diff --git a/lib/internal/streams/operators.js b/lib/internal/streams/operators.js index 80a0f9f731e89a..27c22f89926021 100644 --- a/lib/internal/streams/operators.js +++ b/lib/internal/streams/operators.js @@ -18,7 +18,6 @@ const { AbortController, AbortSignal } = require('internal/abort_controller'); const { AbortError, codes: { - ERR_INVALID_ARG_TYPE, ERR_INVALID_ARG_VALUE, ERR_MISSING_ARGS, ERR_OUT_OF_RANGE, @@ -28,6 +27,7 @@ const { validateAbortSignal, validateInteger, validateObject, + validateFunction, } = require('internal/validators'); const { kWeakHandler, kResistStopPropagation } = require('internal/event_target'); const { finished } = require('internal/streams/end-of-stream'); @@ -66,10 +66,7 @@ function compose(stream, options) { } function map(fn, options) { - if (typeof fn !== 'function') { - throw new ERR_INVALID_ARG_TYPE( - 'fn', ['Function', 'AsyncFunction'], fn); - } + validateFunction(fn, 'fn'); if (options != null) { validateObject(options, 'options'); } @@ -223,10 +220,7 @@ async function some(fn, options = undefined) { } async function every(fn, options = undefined) { - if (typeof fn !== 'function') { - throw new ERR_INVALID_ARG_TYPE( - 'fn', ['Function', 'AsyncFunction'], fn); - } + validateFunction(fn, 'fn'); // https://en.wikipedia.org/wiki/De_Morgan%27s_laws return !(await some.call(this, async (...args) => { return !(await fn(...args)); @@ -241,10 +235,7 @@ async function find(fn, options) { } async function forEach(fn, options) { - if (typeof fn !== 'function') { - throw new ERR_INVALID_ARG_TYPE( - 'fn', ['Function', 'AsyncFunction'], fn); - } + validateFunction(fn, 'fn'); async function forEachFn(value, options) { await fn(value, options); return kEmpty; @@ -254,10 +245,7 @@ async function forEach(fn, options) { } function filter(fn, options) { - if (typeof fn !== 'function') { - throw new ERR_INVALID_ARG_TYPE( - 'fn', ['Function', 'AsyncFunction'], fn); - } + validateFunction(fn, 'fn'); async function filterFn(value, options) { if (await fn(value, options)) { return value; @@ -277,10 +265,7 @@ class ReduceAwareErrMissingArgs extends ERR_MISSING_ARGS { } async function reduce(reducer, initialValue, options) { - if (typeof reducer !== 'function') { - throw new ERR_INVALID_ARG_TYPE( - 'reducer', ['Function', 'AsyncFunction'], reducer); - } + validateFunction(reducer, 'reducer'); if (options != null) { validateObject(options, 'options'); } From c16163511d90aafdd983f7dd047bd8a16c6cf68d Mon Sep 17 00:00:00 2001 From: Antoine du Hamel Date: Sat, 30 Aug 2025 21:06:03 +0200 Subject: [PATCH 030/103] wasi: fix `clean` target in `test/wasi/Makefile` PR-URL: https://github.com/nodejs/node/pull/59576 Refs: https://www.gnu.org/software/make/manual/make.html#Cleanup Refs: https://www.gnu.org/software/make/manual/make.html#Parallel-Disable Reviewed-By: Chengzhong Wu Reviewed-By: Luigi Pinca --- test/wasi/Makefile | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/test/wasi/Makefile b/test/wasi/Makefile index 1cd041338d9b8e..c4dd73408aa3dc 100644 --- a/test/wasi/Makefile +++ b/test/wasi/Makefile @@ -12,5 +12,7 @@ wasm/pthread.wasm : c/pthread.c wasm/%.wasm : c/%.c $(CC) $< $(CFLAGS) --target=$(TARGET) --sysroot=$(SYSROOT) -s -o $@ -.PHONY clean: +.PHONY: 
clean +.NOTPARALLEL: clean +clean: rm -f $(OBJ) From 44d7b92271e6cf7a9438a59750226974cb016858 Mon Sep 17 00:00:00 2001 From: Rafael Gonzaga Date: Sat, 30 Aug 2025 20:33:57 -0300 Subject: [PATCH 031/103] benchmark: calibrate config array-vs-concat According to https://github.com/nodejs/performance/issues/186 this benchmark was taking 160 secs for a single run. Based on a research in a dedicated machine, the results doesn't have variation based on the configs, so we don't need to bench all variations. Signed-off-by: RafaelGSS PR-URL: https://github.com/nodejs/node/pull/59587 Reviewed-By: Luigi Pinca Reviewed-By: James M Snell --- benchmark/dgram/array-vs-concat.js | 18 +++++++++--------- 1 file changed, 9 insertions(+), 9 deletions(-) diff --git a/benchmark/dgram/array-vs-concat.js b/benchmark/dgram/array-vs-concat.js index b5662acfc4033b..c859771e711095 100644 --- a/benchmark/dgram/array-vs-concat.js +++ b/benchmark/dgram/array-vs-concat.js @@ -5,18 +5,18 @@ const common = require('../common.js'); const dgram = require('dgram'); const PORT = common.PORT; -// `num` is the number of send requests to queue up each time. +// `n` is the number of send requests to queue up each time. // Keep it reasonably high (>10) otherwise you're benchmarking the speed of // event loop cycles more than anything else. const bench = common.createBenchmark(main, { - len: [64, 256, 512, 1024], - num: [100], - chunks: [1, 2, 4, 8], + len: [64, 512, 1024], + n: [100], + chunks: [1, 4], type: ['concat', 'multi'], dur: [5], }); -function main({ dur, len, num, type, chunks }) { +function main({ dur, len, n, type, chunks }) { const chunk = []; for (let i = 0; i < chunks; i++) { chunk.push(Buffer.allocUnsafe(Math.round(len / chunks))); @@ -28,11 +28,11 @@ function main({ dur, len, num, type, chunks }) { const onsend = type === 'concat' ? onsendConcat : onsendMulti; function onsendConcat() { - if (sent++ % num === 0) { + if (sent++ % n === 0) { // The setImmediate() is necessary to have event loop progress on OSes // that only perform synchronous I/O on nonblocking UDP sockets. setImmediate(() => { - for (let i = 0; i < num; i++) { + for (let i = 0; i < n; i++) { socket.send(Buffer.concat(chunk), PORT, '127.0.0.1', onsend); } }); @@ -40,11 +40,11 @@ function main({ dur, len, num, type, chunks }) { } function onsendMulti() { - if (sent++ % num === 0) { + if (sent++ % n === 0) { // The setImmediate() is necessary to have event loop progress on OSes // that only perform synchronous I/O on nonblocking UDP sockets. setImmediate(() => { - for (let i = 0; i < num; i++) { + for (let i = 0; i < n; i++) { socket.send(chunk, PORT, '127.0.0.1', onsend); } }); From 87f829bd0cc8fdbf2ac2d28422ceeb0a83d67578 Mon Sep 17 00:00:00 2001 From: Aviv Keller Date: Sun, 31 Aug 2025 03:48:18 -0400 Subject: [PATCH 032/103] doc: mark `path.matchesGlob` as stable MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59572 Reviewed-By: Luigi Pinca Reviewed-By: Ulises Gascón Reviewed-By: James M Snell Reviewed-By: Moshe Atlow --- doc/api/path.md | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/doc/api/path.md b/doc/api/path.md index f08bd86792355b..ffc90ad25c8815 100644 --- a/doc/api/path.md +++ b/doc/api/path.md @@ -289,10 +289,12 @@ path.format({ added: - v22.5.0 - v20.17.0 +changes: + - version: REPLACEME + pr-url: https://github.com/nodejs/node/pull/59572 + description: Marking the API stable. 
--> -> Stability: 1 - Experimental - * `path` {string} The path to glob-match against. * `pattern` {string} The glob to check the path against. * Returns: {boolean} Whether or not the `path` matched the `pattern`. From bda32af587cfd3b67d9d4155ad762899df0bb2ee Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Micha=C3=ABl=20Zasso?= Date: Sun, 31 Aug 2025 16:19:11 +0200 Subject: [PATCH 033/103] build: use `windows-2025` runner GitHub is transitioning `windows-latest` to it. PR-URL: https://github.com/nodejs/node/pull/59673 Reviewed-By: Luigi Pinca Reviewed-By: Antoine du Hamel Reviewed-By: Moshe Atlow --- .github/workflows/coverage-windows.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/coverage-windows.yml b/.github/workflows/coverage-windows.yml index 52facb5778fba1..6bffbbbccc7dcb 100644 --- a/.github/workflows/coverage-windows.yml +++ b/.github/workflows/coverage-windows.yml @@ -43,7 +43,7 @@ permissions: jobs: coverage-windows: if: github.event.pull_request.draft == false - runs-on: windows-2022 + runs-on: windows-2025 steps: - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 with: From 0748160d2e35e5858ef3fe4eb84159091e5e9c52 Mon Sep 17 00:00:00 2001 From: Chengzhong Wu Date: Sun, 31 Aug 2025 18:20:02 +0100 Subject: [PATCH 034/103] lib: fix DOMException subclass support PR-URL: https://github.com/nodejs/node/pull/59680 Reviewed-By: Matthew Aitken Reviewed-By: James M Snell Reviewed-By: Jordan Harband --- lib/internal/per_context/domexception.js | 6 +-- test/parallel/test-domexception-subclass.js | 55 +++++++++++++++++++++ 2 files changed, 58 insertions(+), 3 deletions(-) create mode 100644 test/parallel/test-domexception-subclass.js diff --git a/lib/internal/per_context/domexception.js b/lib/internal/per_context/domexception.js index 1bc46616556612..d8a2e7df88a697 100644 --- a/lib/internal/per_context/domexception.js +++ b/lib/internal/per_context/domexception.js @@ -60,7 +60,6 @@ const disusedNamesSet = new SafeSet() .add('NoDataAllowedError') .add('ValidationError'); -let DOMExceptionPrototype; // The DOMException WebIDL interface defines that: // - ObjectGetPrototypeOf(DOMException) === Function. // - ObjectGetPrototypeOf(DOMException.prototype) === Error.prototype. @@ -75,7 +74,8 @@ class DOMException { // internal slot. // eslint-disable-next-line no-restricted-syntax const self = new Error(); - ObjectSetPrototypeOf(self, DOMExceptionPrototype); + // Use `new.target.prototype` to support DOMException subclasses. 
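+    // `new.target.prototype` reflects the constructor that was actually
+    // invoked with `new`, so instances of a subclass keep the subclass
+    // prototype chain instead of always receiving DOMException.prototype.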
+ ObjectSetPrototypeOf(self, new.target.prototype); self[transfer_mode_private_symbol] = kCloneable; if (options && typeof options === 'object') { @@ -158,7 +158,7 @@ class DOMException { } } -DOMExceptionPrototype = DOMException.prototype; +const DOMExceptionPrototype = DOMException.prototype; ObjectSetPrototypeOf(DOMExceptionPrototype, ErrorPrototype); ObjectDefineProperties(DOMExceptionPrototype, { [SymbolToStringTag]: { __proto__: null, configurable: true, value: 'DOMException' }, diff --git a/test/parallel/test-domexception-subclass.js b/test/parallel/test-domexception-subclass.js new file mode 100644 index 00000000000000..a9498d95a13d5c --- /dev/null +++ b/test/parallel/test-domexception-subclass.js @@ -0,0 +1,55 @@ +'use strict'; + +require('../common'); +const assert = require('assert'); + +class MyDOMException extends DOMException { + ownProp; + #reason; + + constructor() { + super('my message', 'NotFoundError'); + this.ownProp = 'bar'; + this.#reason = 'hello'; + } + + get reason() { + return this.#reason; + } +} + +const myException = new MyDOMException(); +// Verifies the prototype chain +assert(myException instanceof MyDOMException); +assert(myException instanceof DOMException); +assert(myException instanceof Error); +// Verifies [[ErrorData]] +assert(Error.isError(myException)); + +// Verifies subclass properties +assert(Object.hasOwn(myException, 'ownProp')); +assert(!Object.hasOwn(myException, 'reason')); +assert.strictEqual(myException.reason, 'hello'); + +// Verifies error properties +assert.strictEqual(myException.name, 'NotFoundError'); +assert.strictEqual(myException.code, 8); +assert.strictEqual(myException.message, 'my message'); +assert.strictEqual(typeof myException.stack, 'string'); + +// Verify structuredClone only copies known error properties. +const cloned = structuredClone(myException); +assert(!(cloned instanceof MyDOMException)); +assert(cloned instanceof DOMException); +assert(cloned instanceof Error); +assert(Error.isError(cloned)); + +// Verify custom properties +assert(!Object.hasOwn(cloned, 'ownProp')); +assert.strictEqual(cloned.reason, undefined); + +// Verify cloned error properties +assert.strictEqual(cloned.name, 'NotFoundError'); +assert.strictEqual(cloned.code, 8); +assert.strictEqual(cloned.message, 'my message'); +assert.strictEqual(cloned.stack, myException.stack); From e69be5611f27fa1c1685ec5afaac84494f23c27f Mon Sep 17 00:00:00 2001 From: Nicholas Paun Date: Sun, 31 Aug 2025 10:30:53 -0700 Subject: [PATCH 035/103] fs: fix dereference: false on cpSync MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59681 Reviewed-By: Yagiz Nizipli Reviewed-By: Juan José Arboleda Reviewed-By: Dario Piotrowicz Reviewed-By: James M Snell Reviewed-By: Luigi Pinca --- src/node_file.cc | 4 +- .../test-fs-cp-sync-dereference.js | 39 --------------- test/parallel/test-fs-cp-sync-dereference.js | 50 +++++++++++++++++++ 3 files changed, 52 insertions(+), 41 deletions(-) delete mode 100644 test/known_issues/test-fs-cp-sync-dereference.js create mode 100644 test/parallel/test-fs-cp-sync-dereference.js diff --git a/src/node_file.cc b/src/node_file.cc index 0816d088df050c..9c22c2228928ae 100644 --- a/src/node_file.cc +++ b/src/node_file.cc @@ -3251,8 +3251,8 @@ static void CpSyncCheckPaths(const FunctionCallbackInfo& args) { errorno, dereference ? "stat" : "lstat", nullptr, src.out()); } auto dest_status = - dereference ? 
std::filesystem::symlink_status(dest_path, error_code) - : std::filesystem::status(dest_path, error_code); + dereference ? std::filesystem::status(dest_path, error_code) + : std::filesystem::symlink_status(dest_path, error_code); bool dest_exists = !error_code && dest_status.type() != std::filesystem::file_type::not_found; diff --git a/test/known_issues/test-fs-cp-sync-dereference.js b/test/known_issues/test-fs-cp-sync-dereference.js deleted file mode 100644 index fbb07a8f781520..00000000000000 --- a/test/known_issues/test-fs-cp-sync-dereference.js +++ /dev/null @@ -1,39 +0,0 @@ -'use strict'; - -// Refs: https://github.com/nodejs/node/issues/58939 -// -// The cpSync function is not correctly handling the `dereference` option. -// In this test, both the cp and cpSync functions are attempting to copy -// a file over a symlinked directory. In the cp case it works fine. In the -// cpSync case it fails with an error. - -const common = require('../common'); - -const { - cp, - cpSync, - mkdirSync, - symlinkSync, - writeFileSync, -} = require('fs'); - -const { - join, -} = require('path'); - -const tmpdir = require('../common/tmpdir'); -tmpdir.refresh(); - -const pathA = join(tmpdir.path, 'a'); -const pathB = join(tmpdir.path, 'b'); -const pathC = join(tmpdir.path, 'c'); -const pathD = join(tmpdir.path, 'd'); - -writeFileSync(pathA, 'file a'); -mkdirSync(pathB); -symlinkSync(pathB, pathC, 'dir'); -symlinkSync(pathB, pathD, 'dir'); - -cp(pathA, pathD, { dereference: false }, common.mustSucceed()); - -cpSync(pathA, pathC, { dereference: false }); diff --git a/test/parallel/test-fs-cp-sync-dereference.js b/test/parallel/test-fs-cp-sync-dereference.js new file mode 100644 index 00000000000000..dffb5b171c4d1d --- /dev/null +++ b/test/parallel/test-fs-cp-sync-dereference.js @@ -0,0 +1,50 @@ +'use strict'; + +// Refs: https://github.com/nodejs/node/issues/58939 +// +// In this test, both the cp and cpSync functions are attempting to copy +// a file over a symlinked directory. 
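+// With `dereference: false`, the destination symlink itself is replaced by a
+// regular file rather than being followed into the directory it points to,
+// as the assertions below verify.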
+
+const common = require('../common');
+
+const {
+  cp,
+  cpSync,
+  mkdirSync,
+  symlinkSync,
+  writeFileSync,
+  readFileSync,
+  statSync,
+} = require('fs');
+
+const {
+  join,
+} = require('path');
+
+const assert = require('assert');
+
+const tmpdir = require('../common/tmpdir');
+tmpdir.refresh();
+
+const pathA = join(tmpdir.path, 'a'); // file
+const pathB = join(tmpdir.path, 'b'); // directory
+const pathC = join(tmpdir.path, 'c'); // c -> b
+const pathD = join(tmpdir.path, 'd'); // d -> b
+
+writeFileSync(pathA, 'file a');
+mkdirSync(pathB);
+symlinkSync(pathB, pathC, 'dir');
+symlinkSync(pathB, pathD, 'dir');
+
+cp(pathA, pathD, { dereference: false }, common.mustSucceed(() => {
+  // The path d is now a file, not a symlink
+  assert.strictEqual(readFileSync(pathA, 'utf-8'), readFileSync(pathD, 'utf-8'));
+  assert.ok(statSync(pathA).isFile());
+  assert.ok(statSync(pathD).isFile());
+}));
+
+cpSync(pathA, pathC, { dereference: false });
+
+assert.strictEqual(readFileSync(pathA, 'utf-8'), readFileSync(pathC, 'utf-8'));
+assert.ok(statSync(pathA).isFile());
+assert.ok(statSync(pathC).isFile());
From 96db47f91e49dc7fe7eee84d12bf4c2f1d8d8988 Mon Sep 17 00:00:00 2001
From: Miles Guicent
Date: Sun, 31 Aug 2025 19:39:23 +0200
Subject: [PATCH 036/103] doc: add Miles Guicent as triager

PR-URL: https://github.com/nodejs/node/pull/59562
Reviewed-By: Claudio Wunder
Reviewed-By: Xuguang Mei
---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 9b728ffd86547d..061831c2be02e5 100644
--- a/README.md
+++ b/README.md
@@ -767,6 +767,8 @@ maintaining the Node.js project.
   **Akhil Marsonya** <> (he/him)
 * [meixg](https://github.com/meixg) -
   **Xuguang Mei** <> (he/him)
+* [milesguicent](https://github.com/milesguicent) -
+  **Miles Guicent** <> (he/him)
 * [preveen-stack](https://github.com/preveen-stack) -
   **Preveen Padmanabhan** <> (he/him)
 * [RaisinTen](https://github.com/RaisinTen) -
From 88d1ca89904c00128b776eb2dd4c57fa803df706 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Micha=C3=ABl=20Zasso?=
Date: Sun, 31 Aug 2025 22:17:31 +0200
Subject: [PATCH 037/103] src: use non-deprecated Get/SetPrototype methods

Refs: https://github.com/v8/v8/commit/5e139e98d1e4d8bbcccf7b15cd8e7d08b28e4a81
PR-URL: https://github.com/nodejs/node/pull/59671
Reviewed-By: Edy Silva
Reviewed-By: Vladimir Morozov
Reviewed-By: Marco Ippolito
Reviewed-By: Anna Henningsen
Reviewed-By: Luigi Pinca
---
 src/node_sqlite.cc | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/src/node_sqlite.cc b/src/node_sqlite.cc
index c635b1a206e66d..e779cc0d782c08 100644
--- a/src/node_sqlite.cc
+++ b/src/node_sqlite.cc
@@ -2103,9 +2103,9 @@ void StatementSync::Iterate(const FunctionCallbackInfo<Value>& args) {
       StatementSyncIterator::Create(env, BaseObjectPtr<StatementSync>(stmt));
 
   if (iter->object()
-          ->GetPrototype()
+          ->GetPrototypeV2()
           .As<Object>()
-          ->SetPrototype(context, js_iterator_prototype)
+          ->SetPrototypeV2(context, js_iterator_prototype)
           .IsNothing()) {
     return;
   }
From 15cbd3966a336cd17eb0415c6e4d858ac3517813 Mon Sep 17 00:00:00 2001
From: Joyee Cheung
Date: Mon, 1 Sep 2025 16:59:04 +0200
Subject: [PATCH 038/103] src: separate module.hasAsyncGraph and module.hasTopLevelAwait

Clarify the names - hasAsyncGraph means either the module or its
dependencies contain top-level await; hasTopLevelAwait means the
module itself contains top-level await. Theoretically the former can
be inferred from iterating over the dependencies, but for the built-in
loader it's currently not fully reliable until we eliminate async
linking.
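
For illustration only (not part of this change): a minimal sketch of how the
two flags can differ, assuming a hypothetical pair of modules `a.mjs` and
`b.mjs`:

```mjs
// a.mjs: uses top-level await itself, so
//   hasTopLevelAwait === true, hasAsyncGraph === true
await Promise.resolve();

// b.mjs: no top-level await of its own, but its dependency graph has one, so
//   hasTopLevelAwait === false, hasAsyncGraph === true
import './a.mjs';
```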
Also remove the hasTopLevelAwait method - we can simply put it on the module wrap for source text modules right after compilation. PR-URL: https://github.com/nodejs/node/pull/59675 Reviewed-By: Anna Henningsen Reviewed-By: Chengzhong Wu --- lib/internal/modules/esm/loader.js | 2 +- lib/internal/modules/esm/module_job.js | 15 ++++++------ src/env_properties.h | 1 + src/module_wrap.cc | 32 +++++++------------------- src/module_wrap.h | 2 -- 5 files changed, 18 insertions(+), 34 deletions(-) diff --git a/lib/internal/modules/esm/loader.js b/lib/internal/modules/esm/loader.js index 0443838c8a4ba5..03fe91a3d75109 100644 --- a/lib/internal/modules/esm/loader.js +++ b/lib/internal/modules/esm/loader.js @@ -387,7 +387,7 @@ class ModuleLoader { if (!job.module) { assert.fail(getRaceMessage(filename, parentFilename)); } - if (job.module.async) { + if (job.module.hasAsyncGraph) { throw new ERR_REQUIRE_ASYNC_MODULE(filename, parentFilename); } const status = job.module.getStatus(); diff --git a/lib/internal/modules/esm/module_job.js b/lib/internal/modules/esm/module_job.js index a3b36add4fbe6a..3b48233d636c32 100644 --- a/lib/internal/modules/esm/module_job.js +++ b/lib/internal/modules/esm/module_job.js @@ -327,13 +327,13 @@ class ModuleJob extends ModuleJobBase { // FIXME(joyeecheung): this cannot fully handle < kInstantiated. Make the linking // fully synchronous instead. if (status === kUninstantiated) { - this.module.async = this.module.instantiateSync(); + this.module.hasAsyncGraph = this.module.instantiateSync(); status = this.module.getStatus(); } if (status === kInstantiated || status === kErrored) { const filename = urlToFilename(this.url); const parentFilename = urlToFilename(parent?.filename); - this.module.async ??= this.module.isGraphAsync(); + this.module.hasAsyncGraph ??= this.module.isGraphAsync(); if (this.module.async && !getOptionValue('--experimental-print-required-tla')) { throw new ERR_REQUIRE_ASYNC_MODULE(filename, parentFilename); @@ -370,7 +370,7 @@ class ModuleJob extends ModuleJobBase { try { await this.module.evaluate(timeout, breakOnSigint); } catch (e) { - explainCommonJSGlobalLikeNotDefinedError(e, this.module.url, this.module.hasTopLevelAwait()); + explainCommonJSGlobalLikeNotDefinedError(e, this.module.url, this.module.hasTopLevelAwait); throw e; } return { __proto__: null, module: this.module }; @@ -490,16 +490,17 @@ class ModuleJobSync extends ModuleJobBase { debug('ModuleJobSync.runSync()', this.module); assert(this.phase === kEvaluationPhase); // TODO(joyeecheung): add the error decoration logic from the async instantiate. - this.module.async = this.module.instantiateSync(); + this.module.hasAsyncGraph = this.module.instantiateSync(); // If --experimental-print-required-tla is true, proceeds to evaluation even // if it's async because we want to search for the TLA and help users locate // them. // TODO(joyeecheung): track the asynchroniticy using v8::Module::HasTopLevelAwait() // and we'll be able to throw right after compilation of the modules, using acron - // to find and print the TLA. + // to find and print the TLA. This requires the linking to be synchronous in case + // it runs into cached asynchronous modules that are not yet fetched. 
const parentFilename = urlToFilename(parent?.filename); const filename = urlToFilename(this.url); - if (this.module.async && !getOptionValue('--experimental-print-required-tla')) { + if (this.module.hasAsyncGraph && !getOptionValue('--experimental-print-required-tla')) { throw new ERR_REQUIRE_ASYNC_MODULE(filename, parentFilename); } setHasStartedUserESMExecution(); @@ -507,7 +508,7 @@ class ModuleJobSync extends ModuleJobBase { const namespace = this.module.evaluateSync(filename, parentFilename); return { __proto__: null, module: this.module, namespace }; } catch (e) { - explainCommonJSGlobalLikeNotDefinedError(e, this.module.url, this.module.hasTopLevelAwait()); + explainCommonJSGlobalLikeNotDefinedError(e, this.module.url, this.module.hasTopLevelAwait); throw e; } } diff --git a/src/env_properties.h b/src/env_properties.h index dd93b9a4009180..96e60c12d2b47d 100644 --- a/src/env_properties.h +++ b/src/env_properties.h @@ -208,6 +208,7 @@ V(gid_string, "gid") \ V(groups_string, "groups") \ V(has_regexp_groups_string, "hasRegExpGroups") \ + V(has_top_level_await_string, "hasTopLevelAwait") \ V(hash_string, "hash") \ V(h2_string, "h2") \ V(handle_string, "handle") \ diff --git a/src/module_wrap.cc b/src/module_wrap.cc index ccd3ded24f92b3..0ce4389e043962 100644 --- a/src/module_wrap.cc +++ b/src/module_wrap.cc @@ -22,6 +22,7 @@ using errors::TryCatchScope; using node::contextify::ContextifyContext; using v8::Array; using v8::ArrayBufferView; +using v8::Boolean; using v8::Context; using v8::Data; using v8::EscapableHandleScope; @@ -414,6 +415,13 @@ void ModuleWrap::New(const FunctionCallbackInfo& args) { return; } + if (that->Set(context, + realm->env()->has_top_level_await_string(), + Boolean::New(isolate, module->HasTopLevelAwait())) + .IsNothing()) { + return; + } + if (that->Set(context, realm->env()->source_url_string(), module->GetUnboundModuleScript()->GetSourceURL()) @@ -999,27 +1007,6 @@ void ModuleWrap::IsGraphAsync(const FunctionCallbackInfo& args) { args.GetReturnValue().Set(module->IsGraphAsync()); } -void ModuleWrap::HasTopLevelAwait(const FunctionCallbackInfo& args) { - Isolate* isolate = args.GetIsolate(); - ModuleWrap* obj; - ASSIGN_OR_RETURN_UNWRAP(&obj, args.This()); - - Local module = obj->module_.Get(isolate); - - // Check if module is valid - if (module.IsEmpty()) { - args.GetReturnValue().Set(false); - return; - } - - // For source text modules, check if the graph is async - // For synthetic modules, it's always false - bool has_top_level_await = - module->IsSourceTextModule() && module->IsGraphAsync(); - - args.GetReturnValue().Set(has_top_level_await); -} - void ModuleWrap::GetError(const FunctionCallbackInfo& args) { Isolate* isolate = args.GetIsolate(); ModuleWrap* obj; @@ -1443,8 +1430,6 @@ void ModuleWrap::CreatePerIsolateProperties(IsolateData* isolate_data, SetProtoMethodNoSideEffect(isolate, tpl, "getNamespace", GetNamespace); SetProtoMethodNoSideEffect(isolate, tpl, "getStatus", GetStatus); SetProtoMethodNoSideEffect(isolate, tpl, "isGraphAsync", IsGraphAsync); - SetProtoMethodNoSideEffect( - isolate, tpl, "hasTopLevelAwait", HasTopLevelAwait); SetProtoMethodNoSideEffect(isolate, tpl, "getError", GetError); SetConstructorFunction(isolate, target, "ModuleWrap", tpl); isolate_data->set_module_wrap_constructor_template(tpl); @@ -1507,7 +1492,6 @@ void ModuleWrap::RegisterExternalReferences( registry->Register(GetStatus); registry->Register(GetError); registry->Register(IsGraphAsync); - registry->Register(HasTopLevelAwait); 
registry->Register(CreateRequiredModuleFacade);
 
diff --git a/src/module_wrap.h b/src/module_wrap.h
index 2d0747dcf06dd6..a84d57d96ac57e 100644
--- a/src/module_wrap.h
+++ b/src/module_wrap.h
@@ -119,8 +119,6 @@ class ModuleWrap : public BaseObject {
       v8::Local<v8::Module> module,
       v8::Local<v8::Object> meta);
 
-  static void HasTopLevelAwait(const v8::FunctionCallbackInfo<v8::Value>& args);
-
   v8::Local<v8::Context> context() const;
 
   v8::Maybe<bool> CheckUnsettledTopLevelAwait();
 
From 2e54411cb6d4856a124a08d3ebb99e293847bcf3 Mon Sep 17 00:00:00 2001
From: theanarkh
Date: Tue, 2 Sep 2025 00:25:48 +0800
Subject: [PATCH 039/103] worker: optimize cpu profile implement
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

PR-URL: https://github.com/nodejs/node/pull/59683
Reviewed-By: Juan José Arboleda
Reviewed-By: James M Snell
Reviewed-By: Anna Henningsen
---
 doc/api/v8.md                            | 27 +++++++++++
 doc/api/worker_threads.md                | 27 ++++++---
 lib/internal/worker.js                   | 61 ++++++++++++----------
 src/env.cc                               | 20 ++++----
 src/env.h                                |  6 +--
 src/node_errors.h                        |  1 -
 src/node_worker.cc                       | 27 +++++------
 test/parallel/test-worker-cpu-profile.js | 42 +++------------
 tools/doc/type-parser.mjs                |  1 +
 typings/internalBinding/worker.d.ts      |  7 ++-
 10 files changed, 119 insertions(+), 100 deletions(-)

diff --git a/doc/api/v8.md b/doc/api/v8.md
index dae98497bda070..ff903dbda992c3 100644
--- a/doc/api/v8.md
+++ b/doc/api/v8.md
@@ -1394,6 +1394,33 @@ setTimeout(() => {
 }, 1000);
 ```
 
+## Class: `CPUProfileHandle`
+
+
+
+### `cpuProfileHandle.stop()`
+
+
+
+* Returns: {Promise}
+
+Stops collecting the profile, then returns a Promise that fulfills with the
+profile data, or rejects with an error.
+
+### `cpuProfileHandle[Symbol.asyncDispose]()`
+
+
+
+* Returns: {Promise}
+
+Stops collecting the profile; the collected profile data is discarded.
+
 ## `v8.isStringOneByteRepresentation(content)`
 
-* name: {string}
 * Returns: {Promise}
 
-Starting a CPU profile with the given `name`, then return a Promise that fulfills
-with an error or an object which has a `stop` method. Calling the `stop` method will
-stop collecting the profile, then return a Promise that fulfills with an error or the
-profile data.
+Starts a CPU profile, then returns a Promise that fulfills with a
+`CPUProfileHandle` object, or rejects with an error.
 
 This API supports `await using` syntax.
 
 ```cjs
 const { Worker } = require('node:worker_threads');
 
 const worker = new Worker(`
   const { parentPort } = require('worker_threads');
   parentPort.on('message', () => {});
   `, { eval: true });
 
 worker.on('online', async () => {
-  const handle = await worker.startCpuProfile('demo');
+  const handle = await worker.startCpuProfile();
   const profile = await handle.stop();
   console.log(profile);
   worker.terminate();
 });
 ```
 
+Example using `await using`:
+
+```cjs
+const { Worker } = require('node:worker_threads');
+
+const w = new Worker(`
+  const { parentPort } = require('worker_threads');
+  parentPort.on('message', () => {});
+  `, { eval: true });
+
+w.on('online', async () => {
+  // The profile stops automatically when `handle` is disposed at the end of
+  // this scope, and the collected data is discarded.
+  await using handle = await w.startCpuProfile();
+});
+```
+
 ### `worker.stderr`
-    [MSVSProject.Filter('a', contents=['joe\\a\\bob1.c']),
-     MSVSProject.Filter('b', contents=['joe\\b\\bob2.c'])]
-  """
+  Arguments:
+    sources: A list of source file paths split.
+    prefix: A list of source file path layers meant to apply to each of sources.
+    excluded: A set of excluded files.
+    msvs_version: A MSVSVersion object.
+
+  Returns:
+    A hierarchy of filenames and MSVSProject.Filter objects that matches the
+ For example: + _ConvertSourcesToFilterHierarchy([['a', 'bob1.c'], ['b', 'bob2.c']], + prefix=['joe']) + --> + [MSVSProject.Filter('a', contents=['joe\\a\\bob1.c']), + MSVSProject.Filter('b', contents=['joe\\b\\bob2.c'])] + """ if not prefix: prefix = [] result = [] @@ -361,7 +361,6 @@ def _ConfigWindowsTargetPlatformVersion(config_data, version): def _BuildCommandLineForRuleRaw( spec, cmd, cygwin_shell, has_input_path, quote_cmd, do_setup_env ): - if [x for x in cmd if "$(InputDir)" in x]: input_dir_preamble = ( "set INPUTDIR=$(InputDir)\n" @@ -425,8 +424,7 @@ def _BuildCommandLineForRuleRaw( # Return the path with forward slashes because the command using it might # not support backslashes. arguments = [ - i if (i[:1] in "/-" or "=" in i) else _FixPath(i, "/") - for i in cmd[1:] + i if (i[:1] in "/-" or "=" in i) else _FixPath(i, "/") for i in cmd[1:] ] arguments = [i.replace("$(InputDir)", "%INPUTDIR%") for i in arguments] arguments = [MSVSSettings.FixVCMacroSlashes(i) for i in arguments] @@ -459,17 +457,17 @@ def _BuildCommandLineForRule(spec, rule, has_input_path, do_setup_env): def _AddActionStep(actions_dict, inputs, outputs, description, command): """Merge action into an existing list of actions. - Care must be taken so that actions which have overlapping inputs either don't - get assigned to the same input, or get collapsed into one. - - Arguments: - actions_dict: dictionary keyed on input name, which maps to a list of - dicts describing the actions attached to that input file. - inputs: list of inputs - outputs: list of outputs - description: description of the action - command: command line to execute - """ + Care must be taken so that actions which have overlapping inputs either don't + get assigned to the same input, or get collapsed into one. + + Arguments: + actions_dict: dictionary keyed on input name, which maps to a list of + dicts describing the actions attached to that input file. + inputs: list of inputs + outputs: list of outputs + description: description of the action + command: command line to execute + """ # Require there to be at least one input (call sites will ensure this). assert inputs @@ -496,15 +494,15 @@ def _AddCustomBuildToolForMSVS( ): """Add a custom build tool to execute something. - Arguments: - p: the target project - spec: the target project dict - primary_input: input file to attach the build tool to - inputs: list of inputs - outputs: list of outputs - description: description of the action - cmd: command line to execute - """ + Arguments: + p: the target project + spec: the target project dict + primary_input: input file to attach the build tool to + inputs: list of inputs + outputs: list of outputs + description: description of the action + cmd: command line to execute + """ inputs = _FixPaths(inputs) outputs = _FixPaths(outputs) tool = MSVSProject.Tool( @@ -526,12 +524,12 @@ def _AddCustomBuildToolForMSVS( def _AddAccumulatedActionsToMSVS(p, spec, actions_dict): """Add actions accumulated into an actions_dict, merging as needed. - Arguments: - p: the target project - spec: the target project dict - actions_dict: dictionary keyed on input name, which maps to a list of - dicts describing the actions attached to that input file. - """ + Arguments: + p: the target project + spec: the target project dict + actions_dict: dictionary keyed on input name, which maps to a list of + dicts describing the actions attached to that input file. 
+ """ for primary_input in actions_dict: inputs = OrderedSet() outputs = OrderedSet() @@ -559,12 +557,12 @@ def _AddAccumulatedActionsToMSVS(p, spec, actions_dict): def _RuleExpandPath(path, input_file): """Given the input file to which a rule applied, string substitute a path. - Arguments: - path: a path to string expand - input_file: the file to which the rule applied. - Returns: - The string substituted path. - """ + Arguments: + path: a path to string expand + input_file: the file to which the rule applied. + Returns: + The string substituted path. + """ path = path.replace( "$(InputName)", os.path.splitext(os.path.split(input_file)[1])[0] ) @@ -580,24 +578,24 @@ def _RuleExpandPath(path, input_file): def _FindRuleTriggerFiles(rule, sources): """Find the list of files which a particular rule applies to. - Arguments: - rule: the rule in question - sources: the set of all known source files for this project - Returns: - The list of sources that trigger a particular rule. - """ + Arguments: + rule: the rule in question + sources: the set of all known source files for this project + Returns: + The list of sources that trigger a particular rule. + """ return rule.get("rule_sources", []) def _RuleInputsAndOutputs(rule, trigger_file): """Find the inputs and outputs generated by a rule. - Arguments: - rule: the rule in question. - trigger_file: the main trigger for this rule. - Returns: - The pair of (inputs, outputs) involved in this rule. - """ + Arguments: + rule: the rule in question. + trigger_file: the main trigger for this rule. + Returns: + The pair of (inputs, outputs) involved in this rule. + """ raw_inputs = _FixPaths(rule.get("inputs", [])) raw_outputs = _FixPaths(rule.get("outputs", [])) inputs = OrderedSet() @@ -613,13 +611,13 @@ def _RuleInputsAndOutputs(rule, trigger_file): def _GenerateNativeRulesForMSVS(p, rules, output_dir, spec, options): """Generate a native rules file. - Arguments: - p: the target project - rules: the set of rules to include - output_dir: the directory in which the project/gyp resides - spec: the project dict - options: global generator options - """ + Arguments: + p: the target project + rules: the set of rules to include + output_dir: the directory in which the project/gyp resides + spec: the project dict + options: global generator options + """ rules_filename = "{}{}.rules".format(spec["target_name"], options.suffix) rules_file = MSVSToolFile.Writer( os.path.join(output_dir, rules_filename), spec["target_name"] @@ -658,14 +656,14 @@ def _Cygwinify(path): def _GenerateExternalRules(rules, output_dir, spec, sources, options, actions_to_add): """Generate an external makefile to do a set of rules. - Arguments: - rules: the list of rules to include - output_dir: path containing project and gyp files - spec: project specification data - sources: set of sources known - options: global generator options - actions_to_add: The list of actions we will add to. - """ + Arguments: + rules: the list of rules to include + output_dir: path containing project and gyp files + spec: project specification data + sources: set of sources known + options: global generator options + actions_to_add: The list of actions we will add to. + """ filename = "{}_rules{}.mk".format(spec["target_name"], options.suffix) mk_file = gyp.common.WriteOnDiff(os.path.join(output_dir, filename)) # Find cygwin style versions of some paths. 
@@ -743,17 +741,17 @@ def _GenerateExternalRules(rules, output_dir, spec, sources, options, actions_to def _EscapeEnvironmentVariableExpansion(s): """Escapes % characters. - Escapes any % characters so that Windows-style environment variable - expansions will leave them alone. - See http://connect.microsoft.com/VisualStudio/feedback/details/106127/cl-d-name-text-containing-percentage-characters-doesnt-compile - to understand why we have to do this. + Escapes any % characters so that Windows-style environment variable + expansions will leave them alone. + See http://connect.microsoft.com/VisualStudio/feedback/details/106127/cl-d-name-text-containing-percentage-characters-doesnt-compile + to understand why we have to do this. - Args: - s: The string to be escaped. + Args: + s: The string to be escaped. - Returns: - The escaped string. - """ + Returns: + The escaped string. + """ s = s.replace("%", "%%") return s @@ -764,17 +762,17 @@ def _EscapeEnvironmentVariableExpansion(s): def _EscapeCommandLineArgumentForMSVS(s): """Escapes a Windows command-line argument. - So that the Win32 CommandLineToArgv function will turn the escaped result back - into the original string. - See http://msdn.microsoft.com/en-us/library/17w5ykft.aspx - ("Parsing C++ Command-Line Arguments") to understand why we have to do - this. + So that the Win32 CommandLineToArgv function will turn the escaped result back + into the original string. + See http://msdn.microsoft.com/en-us/library/17w5ykft.aspx + ("Parsing C++ Command-Line Arguments") to understand why we have to do + this. - Args: - s: the string to be escaped. - Returns: - the escaped string. - """ + Args: + s: the string to be escaped. + Returns: + the escaped string. + """ def _Replace(match): # For a literal quote, CommandLineToArgv requires an odd number of @@ -795,24 +793,24 @@ def _Replace(match): def _EscapeVCProjCommandLineArgListItem(s): """Escapes command line arguments for MSVS. - The VCProj format stores string lists in a single string using commas and - semi-colons as separators, which must be quoted if they are to be - interpreted literally. However, command-line arguments may already have - quotes, and the VCProj parser is ignorant of the backslash escaping - convention used by CommandLineToArgv, so the command-line quotes and the - VCProj quotes may not be the same quotes. So to store a general - command-line argument in a VCProj list, we need to parse the existing - quoting according to VCProj's convention and quote any delimiters that are - not already quoted by that convention. The quotes that we add will also be - seen by CommandLineToArgv, so if backslashes precede them then we also have - to escape those backslashes according to the CommandLineToArgv - convention. - - Args: - s: the string to be escaped. - Returns: - the escaped string. - """ + The VCProj format stores string lists in a single string using commas and + semi-colons as separators, which must be quoted if they are to be + interpreted literally. However, command-line arguments may already have + quotes, and the VCProj parser is ignorant of the backslash escaping + convention used by CommandLineToArgv, so the command-line quotes and the + VCProj quotes may not be the same quotes. So to store a general + command-line argument in a VCProj list, we need to parse the existing + quoting according to VCProj's convention and quote any delimiters that are + not already quoted by that convention. 
The quotes that we add will also be + seen by CommandLineToArgv, so if backslashes precede them then we also have + to escape those backslashes according to the CommandLineToArgv + convention. + + Args: + s: the string to be escaped. + Returns: + the escaped string. + """ def _Replace(match): # For a non-literal quote, CommandLineToArgv requires an even number of @@ -896,15 +894,15 @@ def _GenerateRulesForMSVS( ): """Generate all the rules for a particular project. - Arguments: - p: the project - output_dir: directory to emit rules to - options: global options passed to the generator - spec: the specification for this project - sources: the set of all known source files in this project - excluded_sources: the set of sources excluded from normal processing - actions_to_add: deferred list of actions to add in - """ + Arguments: + p: the project + output_dir: directory to emit rules to + options: global options passed to the generator + spec: the specification for this project + sources: the set of all known source files in this project + excluded_sources: the set of sources excluded from normal processing + actions_to_add: deferred list of actions to add in + """ rules = spec.get("rules", []) rules_native = [r for r in rules if not int(r.get("msvs_external_rule", 0))] rules_external = [r for r in rules if int(r.get("msvs_external_rule", 0))] @@ -946,12 +944,12 @@ def _AdjustSourcesForRules(rules, sources, excluded_sources, is_msbuild): def _FilterActionsFromExcluded(excluded_sources, actions_to_add): """Take inputs with actions attached out of the list of exclusions. - Arguments: - excluded_sources: list of source files not to be built. - actions_to_add: dict of actions keyed on source file they're attached to. - Returns: - excluded_sources with files that have actions attached removed. - """ + Arguments: + excluded_sources: list of source files not to be built. + actions_to_add: dict of actions keyed on source file they're attached to. + Returns: + excluded_sources with files that have actions attached removed. + """ must_keep = OrderedSet(_FixPaths(actions_to_add.keys())) return [s for s in excluded_sources if s not in must_keep] @@ -963,14 +961,14 @@ def _GetDefaultConfiguration(spec): def _GetGuidOfProject(proj_path, spec): """Get the guid for the project. - Arguments: - proj_path: Path of the vcproj or vcxproj file to generate. - spec: The target dictionary containing the properties of the target. - Returns: - the guid. - Raises: - ValueError: if the specified GUID is invalid. - """ + Arguments: + proj_path: Path of the vcproj or vcxproj file to generate. + spec: The target dictionary containing the properties of the target. + Returns: + the guid. + Raises: + ValueError: if the specified GUID is invalid. + """ # Pluck out the default configuration. default_config = _GetDefaultConfiguration(spec) # Decide the guid of the project. @@ -989,13 +987,13 @@ def _GetGuidOfProject(proj_path, spec): def _GetMsbuildToolsetOfProject(proj_path, spec, version): """Get the platform toolset for the project. - Arguments: - proj_path: Path of the vcproj or vcxproj file to generate. - spec: The target dictionary containing the properties of the target. - version: The MSVSVersion object. - Returns: - the platform toolset string or None. - """ + Arguments: + proj_path: Path of the vcproj or vcxproj file to generate. + spec: The target dictionary containing the properties of the target. + version: The MSVSVersion object. + Returns: + the platform toolset string or None. 
+ """ # Pluck out the default configuration. default_config = _GetDefaultConfiguration(spec) toolset = default_config.get("msbuild_toolset") @@ -1009,14 +1007,14 @@ def _GetMsbuildToolsetOfProject(proj_path, spec, version): def _GenerateProject(project, options, version, generator_flags, spec): """Generates a vcproj file. - Arguments: - project: the MSVSProject object. - options: global generator options. - version: the MSVSVersion object. - generator_flags: dict of generator-specific flags. - Returns: - A list of source files that cannot be found on disk. - """ + Arguments: + project: the MSVSProject object. + options: global generator options. + version: the MSVSVersion object. + generator_flags: dict of generator-specific flags. + Returns: + A list of source files that cannot be found on disk. + """ default_config = _GetDefaultConfiguration(project.spec) # Skip emitting anything if told to with msvs_existing_vcproj option. @@ -1032,12 +1030,12 @@ def _GenerateProject(project, options, version, generator_flags, spec): def _GenerateMSVSProject(project, options, version, generator_flags): """Generates a .vcproj file. It may create .rules and .user files too. - Arguments: - project: The project object we will generate the file for. - options: Global options passed to the generator. - version: The VisualStudioVersion object. - generator_flags: dict of generator-specific flags. - """ + Arguments: + project: The project object we will generate the file for. + options: Global options passed to the generator. + version: The VisualStudioVersion object. + generator_flags: dict of generator-specific flags. + """ spec = project.spec gyp.common.EnsureDirExists(project.path) @@ -1094,11 +1092,11 @@ def _GenerateMSVSProject(project, options, version, generator_flags): def _GetUniquePlatforms(spec): """Returns the list of unique platforms for this spec, e.g ['win32', ...]. - Arguments: - spec: The target dictionary containing the properties of the target. - Returns: - The MSVSUserFile object created. - """ + Arguments: + spec: The target dictionary containing the properties of the target. + Returns: + The MSVSUserFile object created. + """ # Gather list of unique platforms. platforms = OrderedSet() for configuration in spec["configurations"]: @@ -1110,14 +1108,14 @@ def _GetUniquePlatforms(spec): def _CreateMSVSUserFile(proj_path, version, spec): """Generates a .user file for the user running this Gyp program. - Arguments: - proj_path: The path of the project file being created. The .user file - shares the same path (with an appropriate suffix). - version: The VisualStudioVersion object. - spec: The target dictionary containing the properties of the target. - Returns: - The MSVSUserFile object created. - """ + Arguments: + proj_path: The path of the project file being created. The .user file + shares the same path (with an appropriate suffix). + version: The VisualStudioVersion object. + spec: The target dictionary containing the properties of the target. + Returns: + The MSVSUserFile object created. + """ (domain, username) = _GetDomainAndUserName() vcuser_filename = ".".join([proj_path, domain, username, "user"]) user_file = MSVSUserFile.Writer(vcuser_filename, version, spec["target_name"]) @@ -1127,14 +1125,14 @@ def _CreateMSVSUserFile(proj_path, version, spec): def _GetMSVSConfigurationType(spec, build_file): """Returns the configuration type for this project. - It's a number defined by Microsoft. May raise an exception. + It's a number defined by Microsoft. May raise an exception. 
- Args: - spec: The target dictionary containing the properties of the target. - build_file: The path of the gyp file. - Returns: - An integer, the configuration type. - """ + Args: + spec: The target dictionary containing the properties of the target. + build_file: The path of the gyp file. + Returns: + An integer, the configuration type. + """ try: config_type = { "executable": "1", # .exe @@ -1161,17 +1159,17 @@ def _GetMSVSConfigurationType(spec, build_file): def _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config): """Adds a configuration to the MSVS project. - Many settings in a vcproj file are specific to a configuration. This - function the main part of the vcproj file that's configuration specific. - - Arguments: - p: The target project being generated. - spec: The target dictionary containing the properties of the target. - config_type: The configuration type, a number as defined by Microsoft. - config_name: The name of the configuration. - config: The dictionary that defines the special processing to be done - for this configuration. - """ + Many settings in a vcproj file are specific to a configuration. This + function the main part of the vcproj file that's configuration specific. + + Arguments: + p: The target project being generated. + spec: The target dictionary containing the properties of the target. + config_type: The configuration type, a number as defined by Microsoft. + config_name: The name of the configuration. + config: The dictionary that defines the special processing to be done + for this configuration. + """ # Get the information for this configuration include_dirs, midl_include_dirs, resource_include_dirs = _GetIncludeDirs(config) libraries = _GetLibraries(spec) @@ -1251,12 +1249,12 @@ def _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config): def _GetIncludeDirs(config): """Returns the list of directories to be used for #include directives. - Arguments: - config: The dictionary that defines the special processing to be done - for this configuration. - Returns: - The list of directory paths. - """ + Arguments: + config: The dictionary that defines the special processing to be done + for this configuration. + Returns: + The list of directory paths. + """ # TODO(bradnelson): include_dirs should really be flexible enough not to # require this sort of thing. include_dirs = config.get("include_dirs", []) + config.get( @@ -1275,12 +1273,12 @@ def _GetIncludeDirs(config): def _GetLibraryDirs(config): """Returns the list of directories to be used for library search paths. - Arguments: - config: The dictionary that defines the special processing to be done - for this configuration. - Returns: - The list of directory paths. - """ + Arguments: + config: The dictionary that defines the special processing to be done + for this configuration. + Returns: + The list of directory paths. + """ library_dirs = config.get("library_dirs", []) library_dirs = _FixPaths(library_dirs) @@ -1290,11 +1288,11 @@ def _GetLibraryDirs(config): def _GetLibraries(spec): """Returns the list of libraries for this configuration. - Arguments: - spec: The target dictionary containing the properties of the target. - Returns: - The list of directory paths. - """ + Arguments: + spec: The target dictionary containing the properties of the target. + Returns: + The list of directory paths. 
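# --- illustrative sketch: gyp type -> MSVS ConfigurationType ---
# The lookup above maps gyp target types to Microsoft-defined numeric codes;
# only '1' for .exe is visible in this hunk, the other codes are assumptions,
# and the error type here is simplified (the real code raises GypError).
MSVS_CONFIG_TYPE = {
    "executable": "1",       # .exe
    "shared_library": "2",   # .dll (assumed)
    "loadable_module": "2",  # .dll (assumed)
    "static_library": "4",   # .lib (assumed)
}

def config_type_for(spec):
    try:
        return MSVS_CONFIG_TYPE[spec["type"]]
    except KeyError as e:
        raise ValueError("unrecognized target type %r" % spec["type"]) from e
# --- end sketch ---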
+ """ libraries = spec.get("libraries", []) # Strip out -l, as it is not used on windows (but is needed so we can pass # in libraries that are assumed to be in the default library path). @@ -1316,14 +1314,14 @@ def _GetLibraries(spec): def _GetOutputFilePathAndTool(spec, msbuild): """Returns the path and tool to use for this target. - Figures out the path of the file this spec will create and the name of - the VC tool that will create it. + Figures out the path of the file this spec will create and the name of + the VC tool that will create it. - Arguments: - spec: The target dictionary containing the properties of the target. - Returns: - A triple of (file path, name of the vc tool, name of the msbuild tool) - """ + Arguments: + spec: The target dictionary containing the properties of the target. + Returns: + A triple of (file path, name of the vc tool, name of the msbuild tool) + """ # Select a name for the output file. out_file = "" vc_tool = "" @@ -1355,15 +1353,15 @@ def _GetOutputFilePathAndTool(spec, msbuild): def _GetOutputTargetExt(spec): """Returns the extension for this target, including the dot - If product_extension is specified, set target_extension to this to avoid - MSB8012, returns None otherwise. Ignores any target_extension settings in - the input files. + If product_extension is specified, set target_extension to this to avoid + MSB8012, returns None otherwise. Ignores any target_extension settings in + the input files. - Arguments: - spec: The target dictionary containing the properties of the target. - Returns: - A string with the extension, or None - """ + Arguments: + spec: The target dictionary containing the properties of the target. + Returns: + A string with the extension, or None + """ if target_extension := spec.get("product_extension"): return "." + target_extension return None @@ -1372,12 +1370,12 @@ def _GetOutputTargetExt(spec): def _GetDefines(config): """Returns the list of preprocessor definitions for this configuration. - Arguments: - config: The dictionary that defines the special processing to be done - for this configuration. - Returns: - The list of preprocessor definitions. - """ + Arguments: + config: The dictionary that defines the special processing to be done + for this configuration. + Returns: + The list of preprocessor definitions. + """ defines = [] for d in config.get("defines", []): fd = "=".join([str(dpart) for dpart in d]) if isinstance(d, list) else str(d) @@ -1411,11 +1409,11 @@ def _GetModuleDefinition(spec): def _ConvertToolsToExpectedForm(tools): """Convert tools to a form expected by Visual Studio. - Arguments: - tools: A dictionary of settings; the tool name is the key. - Returns: - A list of Tool objects. - """ + Arguments: + tools: A dictionary of settings; the tool name is the key. + Returns: + A list of Tool objects. + """ tool_list = [] for tool, settings in tools.items(): # Collapse settings with lists. @@ -1438,15 +1436,15 @@ def _ConvertToolsToExpectedForm(tools): def _AddConfigurationToMSVS(p, spec, tools, config, config_type, config_name): """Add to the project file the configuration specified by config. - Arguments: - p: The target project being generated. - spec: the target project dict. - tools: A dictionary of settings; the tool name is the key. - config: The dictionary that defines the special processing to be done - for this configuration. - config_type: The configuration type, a number as defined by Microsoft. - config_name: The name of the configuration. 
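# --- illustrative sketch: flattening preprocessor defines ---
# _GetDefines above accepts either a plain string or a [NAME, VALUE] pair and
# renders both as 'NAME=VALUE' text; a standalone version of that one-liner:
def flatten_define(d):
    return "=".join(str(part) for part in d) if isinstance(d, list) else str(d)

assert flatten_define("NDEBUG") == "NDEBUG"
assert flatten_define(["VERSION", 2]) == "VERSION=2"
# --- end sketch ---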
- """ + Arguments: + p: The target project being generated. + spec: the target project dict. + tools: A dictionary of settings; the tool name is the key. + config: The dictionary that defines the special processing to be done + for this configuration. + config_type: The configuration type, a number as defined by Microsoft. + config_name: The name of the configuration. + """ attributes = _GetMSVSAttributes(spec, config, config_type) # Add in this configuration. tool_list = _ConvertToolsToExpectedForm(tools) @@ -1487,18 +1485,18 @@ def _AddNormalizedSources(sources_set, sources_array): def _PrepareListOfSources(spec, generator_flags, gyp_file): """Prepare list of sources and excluded sources. - Besides the sources specified directly in the spec, adds the gyp file so - that a change to it will cause a re-compile. Also adds appropriate sources - for actions and copies. Assumes later stage will un-exclude files which - have custom build steps attached. - - Arguments: - spec: The target dictionary containing the properties of the target. - gyp_file: The name of the gyp file. - Returns: - A pair of (list of sources, list of excluded sources). - The sources will be relative to the gyp file. - """ + Besides the sources specified directly in the spec, adds the gyp file so + that a change to it will cause a re-compile. Also adds appropriate sources + for actions and copies. Assumes later stage will un-exclude files which + have custom build steps attached. + + Arguments: + spec: The target dictionary containing the properties of the target. + gyp_file: The name of the gyp file. + Returns: + A pair of (list of sources, list of excluded sources). + The sources will be relative to the gyp file. + """ sources = OrderedSet() _AddNormalizedSources(sources, spec.get("sources", [])) excluded_sources = OrderedSet() @@ -1528,19 +1526,19 @@ def _AdjustSourcesAndConvertToFilterHierarchy( ): """Adjusts the list of sources and excluded sources. - Also converts the sets to lists. - - Arguments: - spec: The target dictionary containing the properties of the target. - options: Global generator options. - gyp_dir: The path to the gyp file being processed. - sources: A set of sources to be included for this project. - excluded_sources: A set of sources to be excluded for this project. - version: A MSVSVersion object. - Returns: - A trio of (list of sources, list of excluded sources, - path of excluded IDL file) - """ + Also converts the sets to lists. + + Arguments: + spec: The target dictionary containing the properties of the target. + options: Global generator options. + gyp_dir: The path to the gyp file being processed. + sources: A set of sources to be included for this project. + excluded_sources: A set of sources to be excluded for this project. + version: A MSVSVersion object. + Returns: + A trio of (list of sources, list of excluded sources, + path of excluded IDL file) + """ # Exclude excluded sources coming into the generator. excluded_sources.update(OrderedSet(spec.get("sources_excluded", []))) # Add excluded sources into sources for good measure. @@ -1836,8 +1834,11 @@ def _CollapseSingles(parent, node): # Recursively explorer the tree of dicts looking for projects which are # the sole item in a folder which has the same name as the project. Bring # such projects up one level. 
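# --- illustrative sketch: ordered, duplicate-free source bookkeeping ---
# As noted above, excluded sources are folded back into the main source set
# "for good measure" so they still appear in the project. A plain dict stands
# in for gyp's OrderedSet (insertion-ordered, duplicate-free):
def merge_sources(sources, excluded):
    ordered = dict.fromkeys(sources)         # keeps first-seen order
    ordered.update(dict.fromkeys(excluded))  # excluded files are still listed
    return list(ordered)

assert merge_sources(["a.cc", "b.cc"], ["b.cc", "gen.idl"]) == \
    ["a.cc", "b.cc", "gen.idl"]
# --- end sketch ---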
- if (isinstance(node, dict) and len(node) == 1 and - next(iter(node)) == parent + ".vcproj"): + if ( + isinstance(node, dict) + and len(node) == 1 + and next(iter(node)) == parent + ".vcproj" + ): return node[next(iter(node))] if not isinstance(node, dict): return node @@ -1906,14 +1907,14 @@ def _GetPlatformOverridesOfProject(spec): def _CreateProjectObjects(target_list, target_dicts, options, msvs_version): """Create a MSVSProject object for the targets found in target list. - Arguments: - target_list: the list of targets to generate project objects for. - target_dicts: the dictionary of specifications. - options: global generator options. - msvs_version: the MSVSVersion object. - Returns: - A set of created projects, keyed by target. - """ + Arguments: + target_list: the list of targets to generate project objects for. + target_dicts: the dictionary of specifications. + options: global generator options. + msvs_version: the MSVSVersion object. + Returns: + A set of created projects, keyed by target. + """ global fixpath_prefix # Generate each project. projects = {} @@ -1957,15 +1958,15 @@ def _CreateProjectObjects(target_list, target_dicts, options, msvs_version): def _InitNinjaFlavor(params, target_list, target_dicts): """Initialize targets for the ninja flavor. - This sets up the necessary variables in the targets to generate msvs projects - that use ninja as an external builder. The variables in the spec are only set - if they have not been set. This allows individual specs to override the - default values initialized here. - Arguments: - params: Params provided to the generator. - target_list: List of target pairs: 'base/base.gyp:base'. - target_dicts: Dict of target properties keyed on target pair. - """ + This sets up the necessary variables in the targets to generate msvs projects + that use ninja as an external builder. The variables in the spec are only set + if they have not been set. This allows individual specs to override the + default values initialized here. + Arguments: + params: Params provided to the generator. + target_list: List of target pairs: 'base/base.gyp:base'. + target_dicts: Dict of target properties keyed on target pair. + """ for qualified_target in target_list: spec = target_dicts[qualified_target] if spec.get("msvs_external_builder"): @@ -2076,12 +2077,12 @@ def CalculateGeneratorInputInfo(params): def GenerateOutput(target_list, target_dicts, data, params): """Generate .sln and .vcproj files. - This is the entry point for this generator. - Arguments: - target_list: List of target pairs: 'base/base.gyp:base'. - target_dicts: Dict of target properties keyed on target pair. - data: Dictionary containing per .gyp data. - """ + This is the entry point for this generator. + Arguments: + target_list: List of target pairs: 'base/base.gyp:base'. + target_dicts: Dict of target properties keyed on target pair. + data: Dictionary containing per .gyp data. + """ global fixpath_prefix options = params["options"] @@ -2175,14 +2176,14 @@ def _GenerateMSBuildFiltersFile( ): """Generate the filters file. - This file is used by Visual Studio to organize the presentation of source - files into folders. + This file is used by Visual Studio to organize the presentation of source + files into folders. - Arguments: - filters_path: The path of the file to be created. - source_files: The hierarchical structure of all the sources. - extension_to_rule_name: A dictionary mapping file extensions to rules. - """ + Arguments: + filters_path: The path of the file to be created. 
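# --- illustrative sketch: collapsing single-project folders ---
# A standalone replay of the _CollapseSingles logic above: a folder whose
# only entry is '<folder>.vcproj' is replaced by that project, one level up.
def collapse_singles(parent, node):
    if (isinstance(node, dict) and len(node) == 1
            and next(iter(node)) == parent + ".vcproj"):
        return node[next(iter(node))]
    if not isinstance(node, dict):
        return node
    return {child: collapse_singles(child, node[child]) for child in node}

tree = {"base": {"base.vcproj": "proj-obj"},
        "net": {"a.vcproj": 1, "b.vcproj": 2}}
assert collapse_singles("", tree) == {"base": "proj-obj",
                                      "net": {"a.vcproj": 1, "b.vcproj": 2}}
# --- end sketch ---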
+ source_files: The hierarchical structure of all the sources. + extension_to_rule_name: A dictionary mapping file extensions to rules. + """ filter_group = [] source_group = [] _AppendFiltersForMSBuild( @@ -2223,14 +2224,14 @@ def _AppendFiltersForMSBuild( ): """Creates the list of filters and sources to be added in the filter file. - Args: - parent_filter_name: The name of the filter under which the sources are - found. - sources: The hierarchy of filters and sources to process. - extension_to_rule_name: A dictionary mapping file extensions to rules. - filter_group: The list to which filter entries will be appended. - source_group: The list to which source entries will be appended. - """ + Args: + parent_filter_name: The name of the filter under which the sources are + found. + sources: The hierarchy of filters and sources to process. + extension_to_rule_name: A dictionary mapping file extensions to rules. + filter_group: The list to which filter entries will be appended. + source_group: The list to which source entries will be appended. + """ for source in sources: if isinstance(source, MSVSProject.Filter): # We have a sub-filter. Create the name of that sub-filter. @@ -2274,13 +2275,13 @@ def _MapFileToMsBuildSourceType( ): """Returns the group and element type of the source file. - Arguments: - source: The source file name. - extension_to_rule_name: A dictionary mapping file extensions to rules. + Arguments: + source: The source file name. + extension_to_rule_name: A dictionary mapping file extensions to rules. - Returns: - A pair of (group this file should be part of, the label of element) - """ + Returns: + A pair of (group this file should be part of, the label of element) + """ _, ext = os.path.splitext(source) ext = ext.lower() if ext in extension_to_rule_name: @@ -2368,22 +2369,22 @@ def _GenerateRulesForMSBuild( class MSBuildRule: """Used to store information used to generate an MSBuild rule. - Attributes: - rule_name: The rule name, sanitized to use in XML. - target_name: The name of the target. - after_targets: The name of the AfterTargets element. - before_targets: The name of the BeforeTargets element. - depends_on: The name of the DependsOn element. - compute_output: The name of the ComputeOutput element. - dirs_to_make: The name of the DirsToMake element. - inputs: The name of the _inputs element. - tlog: The name of the _tlog element. - extension: The extension this rule applies to. - description: The message displayed when this rule is invoked. - additional_dependencies: A string listing additional dependencies. - outputs: The outputs of this rule. - command: The command used to run the rule. - """ + Attributes: + rule_name: The rule name, sanitized to use in XML. + target_name: The name of the target. + after_targets: The name of the AfterTargets element. + before_targets: The name of the BeforeTargets element. + depends_on: The name of the DependsOn element. + compute_output: The name of the ComputeOutput element. + dirs_to_make: The name of the DirsToMake element. + inputs: The name of the _inputs element. + tlog: The name of the _tlog element. + extension: The extension this rule applies to. + description: The message displayed when this rule is invoked. + additional_dependencies: A string listing additional dependencies. + outputs: The outputs of this rule. + command: The command used to run the rule. 
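# --- illustrative sketch: recursive filter walk ---
# A simplified version of the _AppendFiltersForMSBuild recursion above;
# Filter here is a stand-in for MSVSProject.Filter (a named container with a
# .contents list), and sub-filter names are '\'-joined per MSBuild convention.
class Filter:
    def __init__(self, name, contents):
        self.name, self.contents = name, contents

def append_filters(parent, sources, filter_group, source_group):
    for source in sources:
        if isinstance(source, Filter):
            name = source.name if not parent else parent + "\\" + source.name
            filter_group.append(name)
            append_filters(name, source.contents, filter_group, source_group)
        else:
            source_group.append((parent, source))

fg, sg = [], []
append_filters("", [Filter("src", ["a.cc", Filter("win", ["b.cc"])])], fg, sg)
assert fg == ["src", "src\\win"]
assert sg == [("src", "a.cc"), ("src\\win", "b.cc")]
# --- end sketch ---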
+ """ def __init__(self, rule, spec): self.display_name = rule["rule_name"] @@ -2908,7 +2909,7 @@ def _GetConfigurationCondition(name, settings, spec): def _GetMSBuildProjectConfigurations(configurations, spec): group = ["ItemGroup", {"Label": "ProjectConfigurations"}] - for (name, settings) in sorted(configurations.items()): + for name, settings in sorted(configurations.items()): configuration, platform = _GetConfigurationAndPlatform(name, settings, spec) designation = f"{configuration}|{platform}" group.append( @@ -3002,10 +3003,11 @@ def _GetMSBuildConfigurationDetails(spec, build_file): vctools_version = msbuild_attributes.get("VCToolsVersion") config_type = msbuild_attributes.get("ConfigurationType") _AddConditionalProperty(properties, condition, "ConfigurationType", config_type) - spectre_mitigation = msbuild_attributes.get('SpectreMitigation') + spectre_mitigation = msbuild_attributes.get("SpectreMitigation") if spectre_mitigation: - _AddConditionalProperty(properties, condition, "SpectreMitigation", - spectre_mitigation) + _AddConditionalProperty( + properties, condition, "SpectreMitigation", spectre_mitigation + ) if config_type == "Driver": _AddConditionalProperty(properties, condition, "DriverType", "WDM") _AddConditionalProperty( @@ -3193,7 +3195,7 @@ def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file): new_paths = "$(ExecutablePath);" + ";".join(new_paths) properties = {} - for (name, configuration) in sorted(configurations.items()): + for name, configuration in sorted(configurations.items()): condition = _GetConfigurationCondition(name, configuration, spec) attributes = _GetMSBuildAttributes(spec, configuration, build_file) msbuild_settings = configuration["finalized_msbuild_settings"] @@ -3232,14 +3234,14 @@ def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file): def _AddConditionalProperty(properties, condition, name, value): """Adds a property / conditional value pair to a dictionary. - Arguments: - properties: The dictionary to be modified. The key is the name of the - property. The value is itself a dictionary; its key is the value and - the value a list of condition for which this value is true. - condition: The condition under which the named property has the value. - name: The name of the property. - value: The value of the property. - """ + Arguments: + properties: The dictionary to be modified. The key is the name of the + property. The value is itself a dictionary; its key is the value and + the value a list of condition for which this value is true. + condition: The condition under which the named property has the value. + name: The name of the property. + value: The value of the property. + """ if name not in properties: properties[name] = {} values = properties[name] @@ -3256,13 +3258,13 @@ def _AddConditionalProperty(properties, condition, name, value): def _GetMSBuildPropertyGroup(spec, label, properties): """Returns a PropertyGroup definition for the specified properties. - Arguments: - spec: The target project dict. - label: An optional label for the PropertyGroup. - properties: The dictionary to be converted. The key is the name of the - property. The value is itself a dictionary; its key is the value and - the value a list of condition for which this value is true. - """ + Arguments: + spec: The target project dict. + label: An optional label for the PropertyGroup. + properties: The dictionary to be converted. The key is the name of the + property. 
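# --- illustrative sketch: ProjectConfigurations ItemGroup ---
# The generators build XML as easy_xml-style nested lists:
# element = [tag, optional-attribute-dict, *children]. Only the ItemGroup
# label and the 'Configuration|Platform' designation appear in this hunk;
# the exact child layout below is an assumption of this sketch.
def project_configurations(configurations):
    group = ["ItemGroup", {"Label": "ProjectConfigurations"}]
    for name, platform in sorted(configurations):
        designation = f"{name}|{platform}"
        group.append(["ProjectConfiguration", {"Include": designation},
                      ["Configuration", name], ["Platform", platform]])
    return group

print(project_configurations([("Debug", "Win32"), ("Release", "x64")]))
# --- end sketch ---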
The value is itself a dictionary; its key is the value and + the value a list of condition for which this value is true. + """ group = ["PropertyGroup"] if label: group.append({"Label": label}) @@ -3311,7 +3313,7 @@ def GetEdges(node): def _GetMSBuildToolSettingsSections(spec, configurations): groups = [] - for (name, configuration) in sorted(configurations.items()): + for name, configuration in sorted(configurations.items()): msbuild_settings = configuration["finalized_msbuild_settings"] group = [ "ItemDefinitionGroup", @@ -3408,7 +3410,7 @@ def _FinalizeMSBuildSettings(spec, configuration): # While MSVC works with just file name eg. "v8_pch.h", ClangCL requires # the full path eg. "tools/msvs/pch/v8_pch.h" to find the file. # P.S. Only ClangCL defines msbuild_toolset, for MSVC it is None. - if configuration.get("msbuild_toolset") != 'ClangCL': + if configuration.get("msbuild_toolset") != "ClangCL": precompiled_header = os.path.split(precompiled_header)[1] _ToolAppend(msbuild_settings, "ClCompile", "PrecompiledHeader", "Use") _ToolAppend( @@ -3470,16 +3472,16 @@ def _GetValueFormattedForMSBuild(tool_name, name, value): def _VerifySourcesExist(sources, root_dir): """Verifies that all source files exist on disk. - Checks that all regular source files, i.e. not created at run time, - exist on disk. Missing files cause needless recompilation but no otherwise - visible errors. + Checks that all regular source files, i.e. not created at run time, + exist on disk. Missing files cause needless recompilation but no otherwise + visible errors. - Arguments: - sources: A recursive list of Filter/file names. - root_dir: The root directory for the relative path names. - Returns: - A list of source files that cannot be found on disk. - """ + Arguments: + sources: A recursive list of Filter/file names. + root_dir: The root directory for the relative path names. + Returns: + A list of source files that cannot be found on disk. + """ missing_sources = [] for source in sources: if isinstance(source, MSVSProject.Filter): @@ -3564,17 +3566,13 @@ def _AddSources2( detail.append(["ExcludedFromBuild", "true"]) else: for config_name, configuration in sorted(excluded_configurations): - condition = _GetConfigurationCondition( - config_name, configuration - ) + condition = _GetConfigurationCondition(config_name, configuration) detail.append( ["ExcludedFromBuild", {"Condition": condition}, "true"] ) # Add precompile if needed for config_name, configuration in spec["configurations"].items(): - precompiled_source = configuration.get( - "msvs_precompiled_source", "" - ) + precompiled_source = configuration.get("msvs_precompiled_source", "") if precompiled_source != "": precompiled_source = _FixPath(precompiled_source) if not extensions_excluded_from_precompile: @@ -3822,15 +3820,15 @@ def _GenerateMSBuildProject(project, options, version, generator_flags, spec): def _GetMSBuildExternalBuilderTargets(spec): """Return a list of MSBuild targets for external builders. - The "Build" and "Clean" targets are always generated. If the spec contains - 'msvs_external_builder_clcompile_cmd', then the "ClCompile" target will also - be generated, to support building selected C/C++ files. + The "Build" and "Clean" targets are always generated. If the spec contains + 'msvs_external_builder_clcompile_cmd', then the "ClCompile" target will also + be generated, to support building selected C/C++ files. - Arguments: - spec: The gyp target spec. - Returns: - List of MSBuild 'Target' specs. - """ + Arguments: + spec: The gyp target spec. 
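# --- illustrative sketch: conditional property accumulation ---
# A compact equivalent of _AddConditionalProperty above: properties maps
# name -> value -> [conditions], so identical values under different
# conditions share a single entry and later collapse into one element.
def add_conditional_property(properties, condition, name, value):
    properties.setdefault(name, {}).setdefault(value, []).append(condition)

props = {}
add_conditional_property(props, "'$(Configuration)'=='Debug'", "OutDir", "out\\")
add_conditional_property(props, "'$(Configuration)'=='Release'", "OutDir", "out\\")
assert props == {"OutDir": {"out\\": ["'$(Configuration)'=='Debug'",
                                      "'$(Configuration)'=='Release'"]}}
# --- end sketch ---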
+ Returns: + List of MSBuild 'Target' specs. + """ build_cmd = _BuildCommandLineForRuleRaw( spec, spec["msvs_external_builder_build_cmd"], False, False, False, False ) @@ -3878,14 +3876,14 @@ def _GetMSBuildExtensionTargets(targets_files_of_rules): def _GenerateActionsForMSBuild(spec, actions_to_add): """Add actions accumulated into an actions_to_add, merging as needed. - Arguments: - spec: the target project dict - actions_to_add: dictionary keyed on input name, which maps to a list of - dicts describing the actions attached to that input file. + Arguments: + spec: the target project dict + actions_to_add: dictionary keyed on input name, which maps to a list of + dicts describing the actions attached to that input file. - Returns: - A pair of (action specification, the sources handled by this action). - """ + Returns: + A pair of (action specification, the sources handled by this action). + """ sources_handled_by_action = OrderedSet() actions_spec = [] for primary_input, actions in actions_to_add.items(): diff --git a/tools/gyp/pylib/gyp/generator/msvs_test.py b/tools/gyp/pylib/gyp/generator/msvs_test.py index 8cea3d1479e3b0..e3c4758696c40d 100755 --- a/tools/gyp/pylib/gyp/generator/msvs_test.py +++ b/tools/gyp/pylib/gyp/generator/msvs_test.py @@ -3,7 +3,7 @@ # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. -""" Unit tests for the msvs.py file. """ +"""Unit tests for the msvs.py file.""" import unittest from io import StringIO diff --git a/tools/gyp/pylib/gyp/generator/ninja.py b/tools/gyp/pylib/gyp/generator/ninja.py index d5dfa1a1182c09..bc9ddd26545e9d 100644 --- a/tools/gyp/pylib/gyp/generator/ninja.py +++ b/tools/gyp/pylib/gyp/generator/ninja.py @@ -1303,7 +1303,7 @@ def WritePchTargets(self, ninja_file, pch_commands): ninja_file.build(gch, cmd, input, variables=[(var_name, lang_flag)]) def WriteLink(self, spec, config_name, config, link_deps, compile_deps): - """Write out a link step. Fills out target.binary. """ + """Write out a link step. Fills out target.binary.""" if self.flavor != "mac" or len(self.archs) == 1: return self.WriteLinkForArch( self.ninja, spec, config_name, config, link_deps, compile_deps @@ -1347,7 +1347,7 @@ def WriteLink(self, spec, config_name, config, link_deps, compile_deps): def WriteLinkForArch( self, ninja_file, spec, config_name, config, link_deps, compile_deps, arch=None ): - """Write out a link step. Fills out target.binary. """ + """Write out a link step. Fills out target.binary.""" command = { "executable": "link", "loadable_module": "solink_module", @@ -1755,11 +1755,9 @@ def GetPostbuildCommand(self, spec, output, output_binary, is_command_start): + " && ".join([ninja_syntax.escape(command) for command in postbuilds]) ) command_string = ( - commands - + "); G=$$?; " + commands + "); G=$$?; " # Remove the final output if any postbuild failed. 
- "((exit $$G) || rm -rf %s) " % output - + "&& exit $$G)" + "((exit $$G) || rm -rf %s) " % output + "&& exit $$G)" ) if is_command_start: return "(" + command_string + " && " @@ -1948,7 +1946,8 @@ def WriteNewNinjaRule( ) else: rspfile_content = gyp.msvs_emulation.EncodeRspFileList( - args, win_shell_flags.quote) + args, win_shell_flags.quote + ) command = ( "%s gyp-win-tool action-wrapper $arch " % sys.executable + rspfile @@ -2085,6 +2084,7 @@ def GetDefaultConcurrentLinks(): return pool_size if sys.platform in ("win32", "cygwin"): + class MEMORYSTATUSEX(ctypes.Structure): _fields_ = [ ("dwLength", ctypes.c_ulong), @@ -2104,8 +2104,8 @@ class MEMORYSTATUSEX(ctypes.Structure): # VS 2015 uses 20% more working set than VS 2013 and can consume all RAM # on a 64 GiB machine. - mem_limit = max(1, stat.ullTotalPhys // (5 * (2 ** 30))) # total / 5GiB - hard_cap = max(1, int(os.environ.get("GYP_LINK_CONCURRENCY_MAX") or 2 ** 32)) + mem_limit = max(1, stat.ullTotalPhys // (5 * (2**30))) # total / 5GiB + hard_cap = max(1, int(os.environ.get("GYP_LINK_CONCURRENCY_MAX") or 2**32)) return min(mem_limit, hard_cap) elif sys.platform.startswith("linux"): if os.path.exists("/proc/meminfo"): @@ -2116,14 +2116,14 @@ class MEMORYSTATUSEX(ctypes.Structure): if not match: continue # Allow 8Gb per link on Linux because Gold is quite memory hungry - return max(1, int(match.group(1)) // (8 * (2 ** 20))) + return max(1, int(match.group(1)) // (8 * (2**20))) return 1 elif sys.platform == "darwin": try: avail_bytes = int(subprocess.check_output(["sysctl", "-n", "hw.memsize"])) # A static library debug build of Chromium's unit_tests takes ~2.7GB, so # 4GB per ld process allows for some more bloat. - return max(1, avail_bytes // (4 * (2 ** 30))) # total / 4GB + return max(1, avail_bytes // (4 * (2**30))) # total / 4GB except subprocess.CalledProcessError: return 1 else: @@ -2411,8 +2411,7 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name "cc_s", description="CC $out", command=( - "$cc $defines $includes $cflags $cflags_c " - "$cflags_pch_c -c $in -o $out" + "$cc $defines $includes $cflags $cflags_c $cflags_pch_c -c $in -o $out" ), ) master_ninja.rule( @@ -2523,8 +2522,7 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name "solink", description="SOLINK $lib", restat=True, - command=mtime_preserving_solink_base - % {"suffix": "@$link_file_list"}, + command=mtime_preserving_solink_base % {"suffix": "@$link_file_list"}, rspfile="$link_file_list", rspfile_content=( "-Wl,--whole-archive $in $solibs -Wl,--no-whole-archive $libs" @@ -2709,7 +2707,7 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name command="$env %(python)s gyp-mac-tool compile-ios-framework-header-map " "$out $framework $in && $env %(python)s gyp-mac-tool " "copy-ios-framework-headers $framework $copy_headers" - % {'python': sys.executable}, + % {"python": sys.executable}, ) master_ninja.rule( "mac_tool", diff --git a/tools/gyp/pylib/gyp/generator/ninja_test.py b/tools/gyp/pylib/gyp/generator/ninja_test.py index 581b14595e143e..616bc7aaf015a2 100644 --- a/tools/gyp/pylib/gyp/generator/ninja_test.py +++ b/tools/gyp/pylib/gyp/generator/ninja_test.py @@ -4,7 +4,7 @@ # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. -""" Unit tests for the ninja.py file. 
""" +"""Unit tests for the ninja.py file.""" import sys import unittest diff --git a/tools/gyp/pylib/gyp/generator/xcode.py b/tools/gyp/pylib/gyp/generator/xcode.py index cdf11c3b27b1d5..8e05657961fe98 100644 --- a/tools/gyp/pylib/gyp/generator/xcode.py +++ b/tools/gyp/pylib/gyp/generator/xcode.py @@ -564,12 +564,12 @@ def AddHeaderToTarget(header, pbxp, xct, is_public): def ExpandXcodeVariables(string, expansions): """Expands Xcode-style $(VARIABLES) in string per the expansions dict. - In some rare cases, it is appropriate to expand Xcode variables when a - project file is generated. For any substring $(VAR) in string, if VAR is a - key in the expansions dict, $(VAR) will be replaced with expansions[VAR]. - Any $(VAR) substring in string for which VAR is not a key in the expansions - dict will remain in the returned string. - """ + In some rare cases, it is appropriate to expand Xcode variables when a + project file is generated. For any substring $(VAR) in string, if VAR is a + key in the expansions dict, $(VAR) will be replaced with expansions[VAR]. + Any $(VAR) substring in string for which VAR is not a key in the expansions + dict will remain in the returned string. + """ matches = _xcode_variable_re.findall(string) if matches is None: @@ -592,9 +592,9 @@ def ExpandXcodeVariables(string, expansions): def EscapeXcodeDefine(s): """We must escape the defines that we give to XCode so that it knows not to - split on spaces and to respect backslash and quote literals. However, we - must not quote the define, or Xcode will incorrectly interpret variables - especially $(inherited).""" + split on spaces and to respect backslash and quote literals. However, we + must not quote the define, or Xcode will incorrectly interpret variables + especially $(inherited).""" return re.sub(_xcode_define_re, r"\\\1", s) @@ -679,9 +679,9 @@ def GenerateOutput(target_list, target_dicts, data, params): project_attributes["BuildIndependentTargetsInParallel"] = "YES" if upgrade_check_project_version: project_attributes["LastUpgradeCheck"] = upgrade_check_project_version - project_attributes[ - "LastTestingUpgradeCheck" - ] = upgrade_check_project_version + project_attributes["LastTestingUpgradeCheck"] = ( + upgrade_check_project_version + ) project_attributes["LastSwiftUpdateCheck"] = upgrade_check_project_version pbxp.SetProperty("attributes", project_attributes) @@ -734,8 +734,7 @@ def GenerateOutput(target_list, target_dicts, data, params): "loadable_module+xcuitest": "com.apple.product-type.bundle.ui-testing", "shared_library+bundle": "com.apple.product-type.framework", "executable+extension+bundle": "com.apple.product-type.app-extension", - "executable+watch+extension+bundle": - "com.apple.product-type.watchkit-extension", + "executable+watch+extension+bundle": "com.apple.product-type.watchkit-extension", # noqa: E501 "executable+watch+bundle": "com.apple.product-type.application.watchapp", "mac_kernel_extension+bundle": "com.apple.product-type.kernel-extension", } @@ -780,8 +779,7 @@ def GenerateOutput(target_list, target_dicts, data, params): type_bundle_key += "+watch+extension+bundle" elif is_watch_app: assert is_bundle, ( - "ios_watch_app flag requires mac_bundle " - "(target %s)" % target_name + "ios_watch_app flag requires mac_bundle (target %s)" % target_name ) type_bundle_key += "+watch+bundle" elif is_bundle: @@ -1103,7 +1101,7 @@ def GenerateOutput(target_list, target_dicts, data, params): eol = " \\" makefile.write(f" {concrete_output}{eol}\n") - for (rule_source, concrete_outputs, message, 
action) in zip( + for rule_source, concrete_outputs, message, action in zip( rule["rule_sources"], concrete_outputs_by_rule_source, messages, diff --git a/tools/gyp/pylib/gyp/generator/xcode_test.py b/tools/gyp/pylib/gyp/generator/xcode_test.py index b0b51a08a6db48..bfd8c587a3175d 100644 --- a/tools/gyp/pylib/gyp/generator/xcode_test.py +++ b/tools/gyp/pylib/gyp/generator/xcode_test.py @@ -4,7 +4,7 @@ # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. -""" Unit tests for the xcode.py file. """ +"""Unit tests for the xcode.py file.""" import sys import unittest diff --git a/tools/gyp/pylib/gyp/input.py b/tools/gyp/pylib/gyp/input.py index 994bf6625fb81d..4965ff1571c73c 100644 --- a/tools/gyp/pylib/gyp/input.py +++ b/tools/gyp/pylib/gyp/input.py @@ -139,21 +139,21 @@ def IsPathSection(section): def GetIncludedBuildFiles(build_file_path, aux_data, included=None): """Return a list of all build files included into build_file_path. - The returned list will contain build_file_path as well as all other files - that it included, either directly or indirectly. Note that the list may - contain files that were included into a conditional section that evaluated - to false and was not merged into build_file_path's dict. + The returned list will contain build_file_path as well as all other files + that it included, either directly or indirectly. Note that the list may + contain files that were included into a conditional section that evaluated + to false and was not merged into build_file_path's dict. - aux_data is a dict containing a key for each build file or included build - file. Those keys provide access to dicts whose "included" keys contain - lists of all other files included by the build file. + aux_data is a dict containing a key for each build file or included build + file. Those keys provide access to dicts whose "included" keys contain + lists of all other files included by the build file. - included should be left at its default None value by external callers. It - is used for recursion. + included should be left at its default None value by external callers. It + is used for recursion. - The returned list will not contain any duplicate entries. Each build file - in the list will be relative to the current directory. - """ + The returned list will not contain any duplicate entries. Each build file + in the list will be relative to the current directory. + """ if included is None: included = [] @@ -171,10 +171,10 @@ def GetIncludedBuildFiles(build_file_path, aux_data, included=None): def CheckedEval(file_contents): """Return the eval of a gyp file. - The gyp file is restricted to dictionaries and lists only, and - repeated keys are not allowed. - Note that this is slower than eval() is. - """ + The gyp file is restricted to dictionaries and lists only, and + repeated keys are not allowed. + Note that this is slower than eval() is. + """ syntax_tree = ast.parse(file_contents) assert isinstance(syntax_tree, ast.Module) @@ -508,9 +508,9 @@ def CallLoadTargetBuildFile( ): """Wrapper around LoadTargetBuildFile for parallel processing. - This wrapper is used when LoadTargetBuildFile is executed in - a worker process. - """ + This wrapper is used when LoadTargetBuildFile is executed in + a worker process. + """ try: signal.signal(signal.SIGINT, signal.SIG_IGN) @@ -559,10 +559,10 @@ class ParallelProcessingError(Exception): class ParallelState: """Class to keep track of state when processing input files in parallel. 
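# --- illustrative sketch: restricted eval of gyp files ---
# CheckedEval above parses a gyp file with ast and admits only literal
# structures. A simplified stand-in that walks the tree and rejects anything
# but dicts, lists, strings, and ints (the real version also rejects
# duplicate dict keys):
import ast

def checked_eval(text):
    node = ast.parse(text, mode="eval").body
    def conv(n):
        if isinstance(n, ast.Dict):
            return {conv(k): conv(v) for k, v in zip(n.keys, n.values)}
        if isinstance(n, ast.List):
            return [conv(e) for e in n.elts]
        if isinstance(n, ast.Constant) and isinstance(n.value, (str, int)):
            return n.value
        raise SyntaxError("disallowed construct: %r" % n)
    return conv(node)

assert checked_eval("{'targets': [{'target_name': 'base'}]}") == \
    {"targets": [{"target_name": "base"}]}
# --- end sketch ---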
- If build files are loaded in parallel, use this to keep track of - state during farming out and processing parallel jobs. It's stored - in a global so that the callback function can have access to it. - """ + If build files are loaded in parallel, use this to keep track of + state during farming out and processing parallel jobs. It's stored + in a global so that the callback function can have access to it. + """ def __init__(self): # The multiprocessing pool. @@ -584,8 +584,7 @@ def __init__(self): self.error = False def LoadTargetBuildFileCallback(self, result): - """Handle the results of running LoadTargetBuildFile in another process. - """ + """Handle the results of running LoadTargetBuildFile in another process.""" self.condition.acquire() if not result: self.error = True @@ -692,8 +691,8 @@ def FindEnclosingBracketGroup(input_str): def IsStrCanonicalInt(string): """Returns True if |string| is in its canonical integer form. - The canonical form is such that str(int(string)) == string. - """ + The canonical form is such that str(int(string)) == string. + """ if isinstance(string, str): # This function is called a lot so for maximum performance, avoid # involving regexps which would otherwise make the code much @@ -870,8 +869,9 @@ def ExpandVariables(input, phase, variables, build_file): # This works around actions/rules which have more inputs than will # fit on the command line. if file_list: - contents_list = (contents if isinstance(contents, list) - else contents.split(" ")) + contents_list = ( + contents if isinstance(contents, list) else contents.split(" ") + ) replacement = contents_list[0] if os.path.isabs(replacement): raise GypError('| cannot handle absolute paths, got "%s"' % replacement) @@ -934,7 +934,6 @@ def ExpandVariables(input, phase, variables, build_file): os.chdir(build_file_dir) sys.path.append(os.getcwd()) try: - parsed_contents = shlex.split(contents) try: py_module = __import__(parsed_contents[0]) @@ -965,7 +964,7 @@ def ExpandVariables(input, phase, variables, build_file): stdout=subprocess.PIPE, shell=use_shell, cwd=build_file_dir, - check=False + check=False, ) except Exception as e: raise GypError( @@ -1003,9 +1002,7 @@ def ExpandVariables(input, phase, variables, build_file): # ], replacement = [] else: - raise GypError( - "Undefined variable " + contents + " in " + build_file - ) + raise GypError("Undefined variable " + contents + " in " + build_file) else: replacement = variables[contents] @@ -1114,7 +1111,7 @@ def ExpandVariables(input, phase, variables, build_file): def EvalCondition(condition, conditions_key, phase, variables, build_file): """Returns the dict that should be used or None if the result was - that nothing should be used.""" + that nothing should be used.""" if not isinstance(condition, list): raise GypError(conditions_key + " must be a list") if len(condition) < 2: @@ -1159,7 +1156,7 @@ def EvalCondition(condition, conditions_key, phase, variables, build_file): def EvalSingleCondition(cond_expr, true_dict, false_dict, phase, variables, build_file): """Returns true_dict if cond_expr evaluates to true, and false_dict - otherwise.""" + otherwise.""" # Do expansions on the condition itself. Since the condition can naturally # contain variable references without needing to resort to GYP expansion # syntax, this is of dubious value for variables, but someone might want to @@ -1289,10 +1286,10 @@ def ProcessVariablesAndConditionsInDict( ): """Handle all variable and command expansion and conditional evaluation. 
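# --- illustrative sketch: the canonical-int property ---
# IsStrCanonicalInt's contract above is str(int(string)) == string; the real
# implementation avoids regexes in the hot path, so this is the specification
# written as a directly executable check:
def is_str_canonical_int(s):
    try:
        return isinstance(s, str) and str(int(s)) == s
    except ValueError:
        return False

assert is_str_canonical_int("42") and is_str_canonical_int("-7")
assert not is_str_canonical_int("042") and not is_str_canonical_int("4.2")
# --- end sketch ---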
- This function is the public entry point for all variable expansions and - conditional evaluations. The variables_in dictionary will not be modified - by this function. - """ + This function is the public entry point for all variable expansions and + conditional evaluations. The variables_in dictionary will not be modified + by this function. + """ # Make a copy of the variables_in dict that can be modified during the # loading of automatics and the loading of the variables dict. @@ -1441,15 +1438,15 @@ def ProcessVariablesAndConditionsInList(the_list, phase, variables, build_file): def BuildTargetsDict(data): """Builds a dict mapping fully-qualified target names to their target dicts. - |data| is a dict mapping loaded build files by pathname relative to the - current directory. Values in |data| are build file contents. For each - |data| value with a "targets" key, the value of the "targets" key is taken - as a list containing target dicts. Each target's fully-qualified name is - constructed from the pathname of the build file (|data| key) and its - "target_name" property. These fully-qualified names are used as the keys - in the returned dict. These keys provide access to the target dicts, - the dicts in the "targets" lists. - """ + |data| is a dict mapping loaded build files by pathname relative to the + current directory. Values in |data| are build file contents. For each + |data| value with a "targets" key, the value of the "targets" key is taken + as a list containing target dicts. Each target's fully-qualified name is + constructed from the pathname of the build file (|data| key) and its + "target_name" property. These fully-qualified names are used as the keys + in the returned dict. These keys provide access to the target dicts, + the dicts in the "targets" lists. + """ targets = {} for build_file in data["target_build_files"]: @@ -1467,13 +1464,13 @@ def BuildTargetsDict(data): def QualifyDependencies(targets): """Make dependency links fully-qualified relative to the current directory. - |targets| is a dict mapping fully-qualified target names to their target - dicts. For each target in this dict, keys known to contain dependency - links are examined, and any dependencies referenced will be rewritten - so that they are fully-qualified and relative to the current directory. - All rewritten dependencies are suitable for use as keys to |targets| or a - similar dict. - """ + |targets| is a dict mapping fully-qualified target names to their target + dicts. For each target in this dict, keys known to contain dependency + links are examined, and any dependencies referenced will be rewritten + so that they are fully-qualified and relative to the current directory. + All rewritten dependencies are suitable for use as keys to |targets| or a + similar dict. + """ all_dependency_sections = [ dep + op for dep in dependency_sections for op in ("", "!", "/") @@ -1516,18 +1513,18 @@ def QualifyDependencies(targets): def ExpandWildcardDependencies(targets, data): """Expands dependencies specified as build_file:*. - For each target in |targets|, examines sections containing links to other - targets. If any such section contains a link of the form build_file:*, it - is taken as a wildcard link, and is expanded to list each target in - build_file. The |data| dict provides access to build file dicts. + For each target in |targets|, examines sections containing links to other + targets. 
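# --- illustrative sketch: fully-qualified target names ---
# BuildTargetsDict above keys targets by build-file path plus target name,
# e.g. 'base/base.gyp:base'. A minimal version (toolset suffixes such as
# '#host' are omitted here for brevity):
def qualified_target(build_file, target_name):
    return f"{build_file}:{target_name}"

def build_targets_dict(data):
    return {
        qualified_target(build_file, t["target_name"]): t
        for build_file, contents in data.items()
        for t in contents.get("targets", [])
    }

data_example = {"base/base.gyp": {"targets": [{"target_name": "base"}]}}
assert "base/base.gyp:base" in build_targets_dict(data_example)
# --- end sketch ---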
If any such section contains a link of the form build_file:*, it + is taken as a wildcard link, and is expanded to list each target in + build_file. The |data| dict provides access to build file dicts. - Any target that does not wish to be included by wildcard can provide an - optional "suppress_wildcard" key in its target dict. When present and - true, a wildcard dependency link will not include such targets. + Any target that does not wish to be included by wildcard can provide an + optional "suppress_wildcard" key in its target dict. When present and + true, a wildcard dependency link will not include such targets. - All dependency names, including the keys to |targets| and the values in each - dependency list, must be qualified when this function is called. - """ + All dependency names, including the keys to |targets| and the values in each + dependency list, must be qualified when this function is called. + """ for target, target_dict in targets.items(): target_build_file = gyp.common.BuildFile(target) @@ -1573,14 +1570,10 @@ def ExpandWildcardDependencies(targets, data): if int(dependency_target_dict.get("suppress_wildcard", False)): continue dependency_target_name = dependency_target_dict["target_name"] - if ( - dependency_target not in {"*", dependency_target_name} - ): + if dependency_target not in {"*", dependency_target_name}: continue dependency_target_toolset = dependency_target_dict["toolset"] - if ( - dependency_toolset not in {"*", dependency_target_toolset} - ): + if dependency_toolset not in {"*", dependency_target_toolset}: continue dependency = gyp.common.QualifiedTarget( dependency_build_file, @@ -1601,7 +1594,7 @@ def Unify(items): def RemoveDuplicateDependencies(targets): """Makes sure every dependency appears only once in all targets's dependency - lists.""" + lists.""" for target_name, target_dict in targets.items(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) @@ -1617,25 +1610,21 @@ def Filter(items, item): def RemoveSelfDependencies(targets): """Remove self dependencies from targets that have the prune_self_dependency - variable set.""" + variable set.""" for target_name, target_dict in targets.items(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) if dependencies: for t in dependencies: if t == target_name and ( - targets[t] - .get("variables", {}) - .get("prune_self_dependency", 0) + targets[t].get("variables", {}).get("prune_self_dependency", 0) ): - target_dict[dependency_key] = Filter( - dependencies, target_name - ) + target_dict[dependency_key] = Filter(dependencies, target_name) def RemoveLinkDependenciesFromNoneTargets(targets): """Remove dependencies having the 'link_dependency' attribute from the 'none' - targets.""" + targets.""" for target_name, target_dict in targets.items(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) @@ -1651,11 +1640,11 @@ def RemoveLinkDependenciesFromNoneTargets(targets): class DependencyGraphNode: """ - Attributes: - ref: A reference to an object that this DependencyGraphNode represents. - dependencies: List of DependencyGraphNodes on which this one depends. - dependents: List of DependencyGraphNodes that depend on this one. - """ + Attributes: + ref: A reference to an object that this DependencyGraphNode represents. + dependencies: List of DependencyGraphNodes on which this one depends. + dependents: List of DependencyGraphNodes that depend on this one. 
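# --- illustrative sketch: expanding a build_file:* wildcard ---
# As described above, a 'foo.gyp:*' dependency expands to every target in
# foo.gyp except those that set suppress_wildcard. targets_by_file is a
# hypothetical index of target dicts per build file, used only for this
# sketch:
def expand_wildcard(dep, targets_by_file):
    build_file, target = dep.rsplit(":", 1)
    if target != "*":
        return [dep]
    return [
        f"{build_file}:{t['target_name']}"
        for t in targets_by_file[build_file]
        if not int(t.get("suppress_wildcard", False))
    ]

idx = {"a.gyp": [{"target_name": "x"},
                 {"target_name": "y", "suppress_wildcard": 1}]}
assert expand_wildcard("a.gyp:*", idx) == ["a.gyp:x"]
# --- end sketch ---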
+ """ class CircularException(GypError): pass @@ -1721,8 +1710,8 @@ def ExtractNodeRef(node): def FindCycles(self): """ - Returns a list of cycles in the graph, where each cycle is its own list. - """ + Returns a list of cycles in the graph, where each cycle is its own list. + """ results = [] visited = set() @@ -1753,21 +1742,21 @@ def DirectDependencies(self, dependencies=None): def _AddImportedDependencies(self, targets, dependencies=None): """Given a list of direct dependencies, adds indirect dependencies that - other dependencies have declared to export their settings. - - This method does not operate on self. Rather, it operates on the list - of dependencies in the |dependencies| argument. For each dependency in - that list, if any declares that it exports the settings of one of its - own dependencies, those dependencies whose settings are "passed through" - are added to the list. As new items are added to the list, they too will - be processed, so it is possible to import settings through multiple levels - of dependencies. - - This method is not terribly useful on its own, it depends on being - "primed" with a list of direct dependencies such as one provided by - DirectDependencies. DirectAndImportedDependencies is intended to be the - public entry point. - """ + other dependencies have declared to export their settings. + + This method does not operate on self. Rather, it operates on the list + of dependencies in the |dependencies| argument. For each dependency in + that list, if any declares that it exports the settings of one of its + own dependencies, those dependencies whose settings are "passed through" + are added to the list. As new items are added to the list, they too will + be processed, so it is possible to import settings through multiple levels + of dependencies. + + This method is not terribly useful on its own, it depends on being + "primed" with a list of direct dependencies such as one provided by + DirectDependencies. DirectAndImportedDependencies is intended to be the + public entry point. + """ if dependencies is None: dependencies = [] @@ -1795,9 +1784,9 @@ def _AddImportedDependencies(self, targets, dependencies=None): def DirectAndImportedDependencies(self, targets, dependencies=None): """Returns a list of a target's direct dependencies and all indirect - dependencies that a dependency has advertised settings should be exported - through the dependency for. - """ + dependencies that a dependency has advertised settings should be exported + through the dependency for. + """ dependencies = self.DirectDependencies(dependencies) return self._AddImportedDependencies(targets, dependencies) @@ -1823,19 +1812,19 @@ def _LinkDependenciesInternal( self, targets, include_shared_libraries, dependencies=None, initial=True ): """Returns an OrderedSet of dependency targets that are linked - into this target. + into this target. - This function has a split personality, depending on the setting of - |initial|. Outside callers should always leave |initial| at its default - setting. + This function has a split personality, depending on the setting of + |initial|. Outside callers should always leave |initial| at its default + setting. - When adding a target to the list of dependencies, this function will - recurse into itself with |initial| set to False, to collect dependencies - that are linked into the linkable target for which the list is being built. 
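# --- illustrative sketch: finding dependency cycles ---
# FindCycles above reports each cycle as its own list. A depth-first variant
# over a plain dict-of-edges graph (the real version walks
# DependencyGraphNode objects and their .dependencies):
def find_cycles(edges):
    cycles, visited = [], set()
    def visit(node, path):
        if node in path:
            cycles.append(path[path.index(node):] + [node])
            return
        if node in visited:
            return
        visited.add(node)
        for dep in edges.get(node, []):
            visit(dep, path + [node])
    for node in edges:
        visit(node, [])
    return cycles

assert find_cycles({"a": ["b"], "b": ["a"]}) == [["a", "b", "a"]]
# --- end sketch ---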
+ When adding a target to the list of dependencies, this function will + recurse into itself with |initial| set to False, to collect dependencies + that are linked into the linkable target for which the list is being built. - If |include_shared_libraries| is False, the resulting dependencies will not - include shared_library targets that are linked into this target. - """ + If |include_shared_libraries| is False, the resulting dependencies will not + include shared_library targets that are linked into this target. + """ if dependencies is None: # Using a list to get ordered output and a set to do fast "is it # already added" checks. @@ -1917,9 +1906,9 @@ def _LinkDependenciesInternal( def DependenciesForLinkSettings(self, targets): """ - Returns a list of dependency targets whose link_settings should be merged - into this target. - """ + Returns a list of dependency targets whose link_settings should be merged + into this target. + """ # TODO(sbaig) Currently, chrome depends on the bug that shared libraries' # link_settings are propagated. So for now, we will allow it, unless the @@ -1932,8 +1921,8 @@ def DependenciesForLinkSettings(self, targets): def DependenciesToLinkAgainst(self, targets): """ - Returns a list of dependency targets that are linked into this target. - """ + Returns a list of dependency targets that are linked into this target. + """ return self._LinkDependenciesInternal(targets, True) @@ -2446,7 +2435,7 @@ def SetUpConfigurations(target, target_dict): merged_configurations = {} configs = target_dict["configurations"] - for (configuration, old_configuration_dict) in configs.items(): + for configuration, old_configuration_dict in configs.items(): # Skip abstract configurations (saves work only). if old_configuration_dict.get("abstract"): continue @@ -2454,7 +2443,7 @@ def SetUpConfigurations(target, target_dict): # Get the inheritance relationship right by making a copy of the target # dict. new_configuration_dict = {} - for (key, target_val) in target_dict.items(): + for key, target_val in target_dict.items(): key_ext = key[-1:] key_base = key[:-1] if key_ext in key_suffixes else key if key_base not in non_configuration_keys: @@ -2502,25 +2491,25 @@ def SetUpConfigurations(target, target_dict): def ProcessListFiltersInDict(name, the_dict): """Process regular expression and exclusion-based filters on lists. - An exclusion list is in a dict key named with a trailing "!", like - "sources!". Every item in such a list is removed from the associated - main list, which in this example, would be "sources". Removed items are - placed into a "sources_excluded" list in the dict. - - Regular expression (regex) filters are contained in dict keys named with a - trailing "/", such as "sources/" to operate on the "sources" list. Regex - filters in a dict take the form: - 'sources/': [ ['exclude', '_(linux|mac|win)\\.cc$'], - ['include', '_mac\\.cc$'] ], - The first filter says to exclude all files ending in _linux.cc, _mac.cc, and - _win.cc. The second filter then includes all files ending in _mac.cc that - are now or were once in the "sources" list. Items matching an "exclude" - filter are subject to the same processing as would occur if they were listed - by name in an exclusion list (ending in "!"). Items matching an "include" - filter are brought back into the main list if previously excluded by an - exclusion list or exclusion regex filter. Subsequent matching "exclude" - patterns can still cause items to be excluded after matching an "include". 
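# --- illustrative sketch: exclusion and regex list filters ---
# A simplified replay of the 'sources!' / 'sources/' rules documented above,
# applied to a plain list; filters run in order, so a later 'exclude' can
# re-drop an item that an earlier 'include' restored (ordering of the
# returned excluded list is simplified here):
import re

def apply_filters(items, excludes, regex_filters):
    excluded = {i for i in items if i in excludes}
    for action, pattern in regex_filters:
        for i in items:
            if re.search(pattern, i):
                (excluded.add if action == "exclude" else excluded.discard)(i)
    kept = [i for i in items if i not in excluded]
    return kept, [i for i in items if i in excluded]

kept, dropped = apply_filters(
    ["a_mac.cc", "a_win.cc", "a.cc"], [],
    [["exclude", r"_(linux|mac|win)\.cc$"], ["include", r"_mac\.cc$"]])
assert kept == ["a_mac.cc", "a.cc"] and dropped == ["a_win.cc"]
# --- end sketch ---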
- """ + An exclusion list is in a dict key named with a trailing "!", like + "sources!". Every item in such a list is removed from the associated + main list, which in this example, would be "sources". Removed items are + placed into a "sources_excluded" list in the dict. + + Regular expression (regex) filters are contained in dict keys named with a + trailing "/", such as "sources/" to operate on the "sources" list. Regex + filters in a dict take the form: + 'sources/': [ ['exclude', '_(linux|mac|win)\\.cc$'], + ['include', '_mac\\.cc$'] ], + The first filter says to exclude all files ending in _linux.cc, _mac.cc, and + _win.cc. The second filter then includes all files ending in _mac.cc that + are now or were once in the "sources" list. Items matching an "exclude" + filter are subject to the same processing as would occur if they were listed + by name in an exclusion list (ending in "!"). Items matching an "include" + filter are brought back into the main list if previously excluded by an + exclusion list or exclusion regex filter. Subsequent matching "exclude" + patterns can still cause items to be excluded after matching an "include". + """ # Look through the dictionary for any lists whose keys end in "!" or "/". # These are lists that will be treated as exclude lists and regular @@ -2682,12 +2671,12 @@ def ProcessListFiltersInList(name, the_list): def ValidateTargetType(target, target_dict): """Ensures the 'type' field on the target is one of the known types. - Arguments: - target: string, name of target. - target_dict: dict, target spec. + Arguments: + target: string, name of target. + target_dict: dict, target spec. - Raises an exception on error. - """ + Raises an exception on error. + """ VALID_TARGET_TYPES = ( "executable", "loadable_module", @@ -2715,14 +2704,14 @@ def ValidateTargetType(target, target_dict): def ValidateRulesInTarget(target, target_dict, extra_sources_for_rules): """Ensures that the rules sections in target_dict are valid and consistent, - and determines which sources they apply to. + and determines which sources they apply to. - Arguments: - target: string, name of target. - target_dict: dict, target spec containing "rules" and "sources" lists. - extra_sources_for_rules: a list of keys to scan for rule matches in - addition to 'sources'. - """ + Arguments: + target: string, name of target. + target_dict: dict, target spec containing "rules" and "sources" lists. + extra_sources_for_rules: a list of keys to scan for rule matches in + addition to 'sources'. + """ # Dicts to map between values found in rules' 'rule_name' and 'extension' # keys and the rule dicts themselves. @@ -2734,9 +2723,7 @@ def ValidateRulesInTarget(target, target_dict, extra_sources_for_rules): # Make sure that there's no conflict among rule names and extensions. rule_name = rule["rule_name"] if rule_name in rule_names: - raise GypError( - f"rule {rule_name} exists in duplicate, target {target}" - ) + raise GypError(f"rule {rule_name} exists in duplicate, target {target}") rule_names[rule_name] = rule rule_extension = rule["extension"] @@ -2835,8 +2822,7 @@ def ValidateActionsInTarget(target, target_dict, build_file): def TurnIntIntoStrInDict(the_dict): - """Given dict the_dict, recursively converts all integers into strings. - """ + """Given dict the_dict, recursively converts all integers into strings.""" # Use items instead of iteritems because there's no need to try to look at # reinserted keys and their associated values. 
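# --- illustrative sketch: recursive int -> str normalization ---
# A pure-function variant of the docstring's contract above (the original
# mutates the_dict/the_list in place rather than returning a copy):
def ints_to_strs(value):
    if isinstance(value, bool):   # bools are ints in Python; leave them alone
        return value
    if isinstance(value, int):
        return str(value)
    if isinstance(value, dict):
        return {ints_to_strs(k): ints_to_strs(v) for k, v in value.items()}
    if isinstance(value, list):
        return [ints_to_strs(v) for v in value]
    return value

assert ints_to_strs({"defines": [["V", 2]], 3: "x"}) == \
    {"defines": [["V", "2"]], "3": "x"}
# --- end sketch ---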
for k, v in the_dict.items(): @@ -2854,8 +2840,7 @@ def TurnIntIntoStrInDict(the_dict): def TurnIntIntoStrInList(the_list): - """Given list the_list, recursively converts all integers into strings. - """ + """Given list the_list, recursively converts all integers into strings.""" for index, item in enumerate(the_list): if isinstance(item, int): the_list[index] = str(item) @@ -2902,9 +2887,9 @@ def PruneUnwantedTargets(targets, flat_list, dependency_nodes, root_targets, dat def VerifyNoCollidingTargets(targets): """Verify that no two targets in the same directory share the same name. - Arguments: - targets: A list of targets in the form 'path/to/file.gyp:target_name'. - """ + Arguments: + targets: A list of targets in the form 'path/to/file.gyp:target_name'. + """ # Keep a dict going from 'subdirectory:target_name' to 'foo.gyp'. used = {} for target in targets: diff --git a/tools/gyp/pylib/gyp/mac_tool.py b/tools/gyp/pylib/gyp/mac_tool.py index 58fb9c7398acad..3710178e110ae5 100755 --- a/tools/gyp/pylib/gyp/mac_tool.py +++ b/tools/gyp/pylib/gyp/mac_tool.py @@ -8,7 +8,6 @@ These functions are executed via gyp-mac-tool when using the Makefile generator. """ - import fcntl import fnmatch import glob @@ -31,7 +30,7 @@ def main(args): class MacTool: """This class performs all the Mac tooling steps. The methods can either be - executed directly, or dispatched from an argument list.""" + executed directly, or dispatched from an argument list.""" def Dispatch(self, args): """Dispatches a string command to a method.""" @@ -47,7 +46,7 @@ def _CommandifyName(self, name_string): def ExecCopyBundleResource(self, source, dest, convert_to_binary): """Copies a resource file to the bundle/Resources directory, performing any - necessary compilation on each resource.""" + necessary compilation on each resource.""" convert_to_binary = convert_to_binary == "True" extension = os.path.splitext(source)[1].lower() if os.path.isdir(source): @@ -155,15 +154,15 @@ def _CopyStringsFile(self, source, dest): def _DetectInputEncoding(self, file_name): """Reads the first few bytes from file_name and tries to guess the text - encoding. Returns None as a guess if it can't detect it.""" + encoding. Returns None as a guess if it can't detect it.""" with open(file_name, "rb") as fp: try: header = fp.read(3) except Exception: return None - if header.startswith((b"\xFE\xFF", b"\xFF\xFE")): + if header.startswith((b"\xfe\xff", b"\xff\xfe")): return "UTF-16" - elif header.startswith(b"\xEF\xBB\xBF"): + elif header.startswith(b"\xef\xbb\xbf"): return "UTF-8" else: return None @@ -254,7 +253,7 @@ def ExecFlock(self, lockfile, *cmd_list): def ExecFilterLibtool(self, *cmd_list): """Calls libtool and filters out '/path/to/libtool: file: foo.o has no - symbols'.""" + symbols'.""" libtool_re = re.compile( r"^.*libtool: (?:for architecture: \S* )?file: .* has no symbols$" ) @@ -303,7 +302,7 @@ def ExecPackageIosFramework(self, framework): def ExecPackageFramework(self, framework, version): """Takes a path to Something.framework and the Current version of that and - sets up all the symlinks.""" + sets up all the symlinks.""" # Find the name of the binary based on the part before the ".framework". binary = os.path.basename(framework).split(".")[0] @@ -332,7 +331,7 @@ def ExecPackageFramework(self, framework, version): def _Relink(self, dest, link): """Creates a symlink to |dest| named |link|. 
If |link| already exists, - it is overwritten.""" + it is overwritten.""" if os.path.lexists(link): os.remove(link) os.symlink(dest, link) @@ -357,14 +356,14 @@ def ExecCopyIosFrameworkHeaders(self, framework, *copy_headers): def ExecCompileXcassets(self, keys, *inputs): """Compiles multiple .xcassets files into a single .car file. - This invokes 'actool' to compile all the inputs .xcassets files. The - |keys| arguments is a json-encoded dictionary of extra arguments to - pass to 'actool' when the asset catalogs contains an application icon - or a launch image. + This invokes 'actool' to compile all the inputs .xcassets files. The + |keys| arguments is a json-encoded dictionary of extra arguments to + pass to 'actool' when the asset catalogs contains an application icon + or a launch image. - Note that 'actool' does not create the Assets.car file if the asset - catalogs does not contains imageset. - """ + Note that 'actool' does not create the Assets.car file if the asset + catalogs does not contains imageset. + """ command_line = [ "xcrun", "actool", @@ -437,13 +436,13 @@ def ExecMergeInfoPlist(self, output, *inputs): def ExecCodeSignBundle(self, key, entitlements, provisioning, path, preserve): """Code sign a bundle. - This function tries to code sign an iOS bundle, following the same - algorithm as Xcode: - 1. pick the provisioning profile that best match the bundle identifier, - and copy it into the bundle as embedded.mobileprovision, - 2. copy Entitlements.plist from user or SDK next to the bundle, - 3. code sign the bundle. - """ + This function tries to code sign an iOS bundle, following the same + algorithm as Xcode: + 1. pick the provisioning profile that best match the bundle identifier, + and copy it into the bundle as embedded.mobileprovision, + 2. copy Entitlements.plist from user or SDK next to the bundle, + 3. code sign the bundle. + """ substitutions, overrides = self._InstallProvisioningProfile( provisioning, self._GetCFBundleIdentifier() ) @@ -462,16 +461,16 @@ def ExecCodeSignBundle(self, key, entitlements, provisioning, path, preserve): def _InstallProvisioningProfile(self, profile, bundle_identifier): """Installs embedded.mobileprovision into the bundle. - Args: - profile: string, optional, short name of the .mobileprovision file - to use, if empty or the file is missing, the best file installed - will be used - bundle_identifier: string, value of CFBundleIdentifier from Info.plist + Args: + profile: string, optional, short name of the .mobileprovision file + to use, if empty or the file is missing, the best file installed + will be used + bundle_identifier: string, value of CFBundleIdentifier from Info.plist - Returns: - A tuple containing two dictionary: variables substitutions and values - to overrides when generating the entitlements file. - """ + Returns: + A tuple containing two dictionary: variables substitutions and values + to overrides when generating the entitlements file. + """ source_path, provisioning_data, team_id = self._FindProvisioningProfile( profile, bundle_identifier ) @@ -487,24 +486,24 @@ def _InstallProvisioningProfile(self, profile, bundle_identifier): def _FindProvisioningProfile(self, profile, bundle_identifier): """Finds the .mobileprovision file to use for signing the bundle. - Checks all the installed provisioning profiles (or if the user specified - the PROVISIONING_PROFILE variable, only consult it) and select the most - specific that correspond to the bundle identifier. 
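
The "most specific profile" selection rule that `_FindProvisioningProfile`'s docstring describes can be sketched roughly as follows. `pick_best_profile`, the `TEAM123456.*` patterns, and the flat dict shape are all hypothetical; real profiles carry their application-identifier pattern inside an embedded plist, and the assumption here is simply that a longer matching pattern is more specific.

```python
import fnmatch

def pick_best_profile(profiles, bundle_identifier):
    """Sketch: select the most specific provisioning profile for a bundle id.

    |profiles| maps an application-identifier pattern (possibly ending in
    '*', e.g. 'TEAM123456.com.example.*') to some profile data.  The longest
    matching pattern wins, mirroring the behaviour described above.
    """
    best_pattern, best_data = None, None
    for pattern, data in profiles.items():
        # Strip the team-id prefix before matching against CFBundleIdentifier.
        app_id = pattern.split(".", 1)[1]
        if fnmatch.fnmatch(bundle_identifier, app_id):
            if best_pattern is None or len(pattern) > len(best_pattern):
                best_pattern, best_data = pattern, data
    return best_data

profiles = {
    "TEAM123456.*": "wildcard-profile",
    "TEAM123456.com.example.*": "app-family-profile",
}
print(pick_best_profile(profiles, "com.example.app"))  # app-family-profile
```
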
+ Checks all the installed provisioning profiles (or if the user specified + the PROVISIONING_PROFILE variable, only consult it) and select the most + specific that correspond to the bundle identifier. - Args: - profile: string, optional, short name of the .mobileprovision file - to use, if empty or the file is missing, the best file installed - will be used - bundle_identifier: string, value of CFBundleIdentifier from Info.plist + Args: + profile: string, optional, short name of the .mobileprovision file + to use, if empty or the file is missing, the best file installed + will be used + bundle_identifier: string, value of CFBundleIdentifier from Info.plist - Returns: - A tuple of the path to the selected provisioning profile, the data of - the embedded plist in the provisioning profile and the team identifier - to use for code signing. + Returns: + A tuple of the path to the selected provisioning profile, the data of + the embedded plist in the provisioning profile and the team identifier + to use for code signing. - Raises: - SystemExit: if no .mobileprovision can be used to sign the bundle. - """ + Raises: + SystemExit: if no .mobileprovision can be used to sign the bundle. + """ profiles_dir = os.path.join( os.environ["HOME"], "Library", "MobileDevice", "Provisioning Profiles" ) @@ -552,12 +551,12 @@ def _FindProvisioningProfile(self, profile, bundle_identifier): def _LoadProvisioningProfile(self, profile_path): """Extracts the plist embedded in a provisioning profile. - Args: - profile_path: string, path to the .mobileprovision file + Args: + profile_path: string, path to the .mobileprovision file - Returns: - Content of the plist embedded in the provisioning profile as a dictionary. - """ + Returns: + Content of the plist embedded in the provisioning profile as a dictionary. + """ with tempfile.NamedTemporaryFile() as temp: subprocess.check_call( ["security", "cms", "-D", "-i", profile_path, "-o", temp.name] @@ -580,16 +579,16 @@ def _MergePlist(self, merged_plist, plist): def _LoadPlistMaybeBinary(self, plist_path): """Loads into a memory a plist possibly encoded in binary format. - This is a wrapper around plistlib.readPlist that tries to convert the - plist to the XML format if it can't be parsed (assuming that it is in - the binary format). + This is a wrapper around plistlib.readPlist that tries to convert the + plist to the XML format if it can't be parsed (assuming that it is in + the binary format). - Args: - plist_path: string, path to a plist file, in XML or binary format + Args: + plist_path: string, path to a plist file, in XML or binary format - Returns: - Content of the plist as a dictionary. - """ + Returns: + Content of the plist as a dictionary. + """ try: # First, try to read the file using plistlib that only supports XML, # and if an exception is raised, convert a temporary copy to XML and @@ -605,13 +604,13 @@ def _LoadPlistMaybeBinary(self, plist_path): def _GetSubstitutions(self, bundle_identifier, app_identifier_prefix): """Constructs a dictionary of variable substitutions for Entitlements.plist. - Args: - bundle_identifier: string, value of CFBundleIdentifier from Info.plist - app_identifier_prefix: string, value for AppIdentifierPrefix + Args: + bundle_identifier: string, value of CFBundleIdentifier from Info.plist + app_identifier_prefix: string, value for AppIdentifierPrefix - Returns: - Dictionary of substitutions to apply when generating Entitlements.plist. - """ + Returns: + Dictionary of substitutions to apply when generating Entitlements.plist. 
+ """ return { "CFBundleIdentifier": bundle_identifier, "AppIdentifierPrefix": app_identifier_prefix, @@ -620,9 +619,9 @@ def _GetSubstitutions(self, bundle_identifier, app_identifier_prefix): def _GetCFBundleIdentifier(self): """Extracts CFBundleIdentifier value from Info.plist in the bundle. - Returns: - Value of CFBundleIdentifier in the Info.plist located in the bundle. - """ + Returns: + Value of CFBundleIdentifier in the Info.plist located in the bundle. + """ info_plist_path = os.path.join( os.environ["TARGET_BUILD_DIR"], os.environ["INFOPLIST_PATH"] ) @@ -632,19 +631,19 @@ def _GetCFBundleIdentifier(self): def _InstallEntitlements(self, entitlements, substitutions, overrides): """Generates and install the ${BundleName}.xcent entitlements file. - Expands variables "$(variable)" pattern in the source entitlements file, - add extra entitlements defined in the .mobileprovision file and the copy - the generated plist to "${BundlePath}.xcent". + Expands variables "$(variable)" pattern in the source entitlements file, + add extra entitlements defined in the .mobileprovision file and the copy + the generated plist to "${BundlePath}.xcent". - Args: - entitlements: string, optional, path to the Entitlements.plist template - to use, defaults to "${SDKROOT}/Entitlements.plist" - substitutions: dictionary, variable substitutions - overrides: dictionary, values to add to the entitlements + Args: + entitlements: string, optional, path to the Entitlements.plist template + to use, defaults to "${SDKROOT}/Entitlements.plist" + substitutions: dictionary, variable substitutions + overrides: dictionary, values to add to the entitlements - Returns: - Path to the generated entitlements file. - """ + Returns: + Path to the generated entitlements file. + """ source_path = entitlements target_path = os.path.join( os.environ["BUILT_PRODUCTS_DIR"], os.environ["PRODUCT_NAME"] + ".xcent" @@ -664,15 +663,15 @@ def _InstallEntitlements(self, entitlements, substitutions, overrides): def _ExpandVariables(self, data, substitutions): """Expands variables "$(variable)" in data. - Args: - data: object, can be either string, list or dictionary - substitutions: dictionary, variable substitutions to perform + Args: + data: object, can be either string, list or dictionary + substitutions: dictionary, variable substitutions to perform - Returns: - Copy of data where each references to "$(variable)" has been replaced - by the corresponding value found in substitutions, or left intact if - the key was not found. - """ + Returns: + Copy of data where each references to "$(variable)" has been replaced + by the corresponding value found in substitutions, or left intact if + the key was not found. + """ if isinstance(data, str): for key, value in substitutions.items(): data = data.replace("$(%s)" % key, value) @@ -691,15 +690,15 @@ def NextGreaterPowerOf2(x): def WriteHmap(output_name, filelist): """Generates a header map based on |filelist|. - Per Mark Mentovai: - A header map is structured essentially as a hash table, keyed by names used - in #includes, and providing pathnames to the actual files. + Per Mark Mentovai: + A header map is structured essentially as a hash table, keyed by names used + in #includes, and providing pathnames to the actual files. 
- The implementation below and the comment above comes from inspecting: - http://www.opensource.apple.com/source/distcc/distcc-2503/distcc_dist/include_server/headermap.py?txt - while also looking at the implementation in clang in: - https://llvm.org/svn/llvm-project/cfe/trunk/lib/Lex/HeaderMap.cpp - """ + The implementation below and the comment above comes from inspecting: + http://www.opensource.apple.com/source/distcc/distcc-2503/distcc_dist/include_server/headermap.py?txt + while also looking at the implementation in clang in: + https://llvm.org/svn/llvm-project/cfe/trunk/lib/Lex/HeaderMap.cpp + """ magic = 1751998832 version = 1 _reserved = 0 diff --git a/tools/gyp/pylib/gyp/msvs_emulation.py b/tools/gyp/pylib/gyp/msvs_emulation.py index f8b3b87d943d79..7c461a8fdf72d8 100644 --- a/tools/gyp/pylib/gyp/msvs_emulation.py +++ b/tools/gyp/pylib/gyp/msvs_emulation.py @@ -74,8 +74,7 @@ def EncodeRspFileList(args, quote_cmd): program = call + " " + os.path.normpath(program) else: program = os.path.normpath(args[0]) - return (program + " " - + " ".join(QuoteForRspFile(arg, quote_cmd) for arg in args[1:])) + return program + " " + " ".join(QuoteForRspFile(arg, quote_cmd) for arg in args[1:]) def _GenericRetrieve(root, default, path): @@ -934,14 +933,17 @@ def GetRuleShellFlags(self, rule): includes whether it should run under cygwin (msvs_cygwin_shell), and whether the commands should be quoted (msvs_quote_cmd).""" # If the variable is unset, or set to 1 we use cygwin - cygwin = int(rule.get("msvs_cygwin_shell", - self.spec.get("msvs_cygwin_shell", 1))) != 0 + cygwin = ( + int(rule.get("msvs_cygwin_shell", self.spec.get("msvs_cygwin_shell", 1))) + != 0 + ) # Default to quoting. There's only a few special instances where the # target command uses non-standard command line parsing and handle quotes # and quote escaping differently. quote_cmd = int(rule.get("msvs_quote_cmd", 1)) - assert quote_cmd != 0 or cygwin != 1, \ - "msvs_quote_cmd=0 only applicable for msvs_cygwin_shell=0" + assert quote_cmd != 0 or cygwin != 1, ( + "msvs_quote_cmd=0 only applicable for msvs_cygwin_shell=0" + ) return MsvsSettings.RuleShellFlags(cygwin, quote_cmd) def _HasExplicitRuleForExtension(self, spec, extension): @@ -1129,8 +1131,7 @@ def _ExtractImportantEnvironment(output_of_set): for required in ("SYSTEMROOT", "TEMP", "TMP"): if required not in env: raise Exception( - 'Environment variable "%s" ' - "required to be set to valid path" % required + 'Environment variable "%s" required to be set to valid path' % required ) return env diff --git a/tools/gyp/pylib/gyp/simple_copy.py b/tools/gyp/pylib/gyp/simple_copy.py index 729cec0636273b..8b026642fc5ef0 100644 --- a/tools/gyp/pylib/gyp/simple_copy.py +++ b/tools/gyp/pylib/gyp/simple_copy.py @@ -17,8 +17,8 @@ class Error(Exception): def deepcopy(x): """Deep copy operation on gyp objects such as strings, ints, dicts - and lists. More than twice as fast as copy.deepcopy but much less - generic.""" + and lists. More than twice as fast as copy.deepcopy but much less + generic.""" try: return _deepcopy_dispatch[type(x)](x) diff --git a/tools/gyp/pylib/gyp/win_tool.py b/tools/gyp/pylib/gyp/win_tool.py index 3004f533ca9fe3..43665577bdddaf 100755 --- a/tools/gyp/pylib/gyp/win_tool.py +++ b/tools/gyp/pylib/gyp/win_tool.py @@ -9,7 +9,6 @@ These functions are executed via gyp-win-tool when using the ninja generator. """ - import os import re import shutil @@ -33,11 +32,11 @@ def main(args): class WinTool: """This class performs all the Windows tooling steps. 
The methods can either - be executed directly, or dispatched from an argument list.""" + be executed directly, or dispatched from an argument list.""" def _UseSeparateMspdbsrv(self, env, args): """Allows to use a unique instance of mspdbsrv.exe per linker instead of a - shared one.""" + shared one.""" if len(args) < 1: raise Exception("Not enough arguments") @@ -114,9 +113,9 @@ def _on_error(fn, path, excinfo): def ExecLinkWrapper(self, arch, use_separate_mspdbsrv, *args): """Filter diagnostic output from link that looks like: - ' Creating library ui.dll.lib and object ui.dll.exp' - This happens when there are exports from the dll or exe. - """ + ' Creating library ui.dll.lib and object ui.dll.exp' + This happens when there are exports from the dll or exe. + """ env = self._GetEnv(arch) if use_separate_mspdbsrv == "True": self._UseSeparateMspdbsrv(env, args) @@ -158,10 +157,10 @@ def ExecLinkWithManifests( mt, rc, intermediate_manifest, - *manifests + *manifests, ): """A wrapper for handling creating a manifest resource and then executing - a link command.""" + a link command.""" # The 'normal' way to do manifests is to have link generate a manifest # based on gathering dependencies from the object files, then merge that # manifest with other manifests supplied as sources, convert the merged @@ -245,8 +244,8 @@ def dump(filename): def ExecManifestWrapper(self, arch, *args): """Run manifest tool with environment set. Strip out undesirable warning - (some XML blocks are recognized by the OS loader, but not the manifest - tool).""" + (some XML blocks are recognized by the OS loader, but not the manifest + tool).""" env = self._GetEnv(arch) popen = subprocess.Popen( args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT @@ -259,8 +258,8 @@ def ExecManifestWrapper(self, arch, *args): def ExecManifestToRc(self, arch, *args): """Creates a resource file pointing a SxS assembly manifest. - |args| is tuple containing path to resource file, path to manifest file - and resource name which can be "1" (for executables) or "2" (for DLLs).""" + |args| is tuple containing path to resource file, path to manifest file + and resource name which can be "1" (for executables) or "2" (for DLLs).""" manifest_path, resource_path, resource_name = args with open(resource_path, "w") as output: output.write( @@ -270,8 +269,8 @@ def ExecManifestToRc(self, arch, *args): def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl, *flags): """Filter noisy filenames output from MIDL compile step that isn't - quietable via command line flags. - """ + quietable via command line flags. + """ args = ( ["midl", "/nologo"] + list(flags) @@ -327,7 +326,7 @@ def ExecAsmWrapper(self, arch, *args): def ExecRcWrapper(self, arch, *args): """Filter logo banner from invocations of rc.exe. Older versions of RC - don't support the /nologo flag.""" + don't support the /nologo flag.""" env = self._GetEnv(arch) popen = subprocess.Popen( args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT @@ -344,7 +343,7 @@ def ExecRcWrapper(self, arch, *args): def ExecActionWrapper(self, arch, rspfile, *dir): """Runs an action command line from a response file using the environment - for |arch|. If |dir| is supplied, use that as the working directory.""" + for |arch|. If |dir| is supplied, use that as the working directory.""" env = self._GetEnv(arch) # TODO(scottmg): This is a temporary hack to get some specific variables # through to actions that are set after gyp-time. http://crbug.com/333738. 
@@ -357,7 +356,7 @@ def ExecActionWrapper(self, arch, rspfile, *dir): def ExecClCompile(self, project_dir, selected_files): """Executed by msvs-ninja projects when the 'ClCompile' target is used to - build selected C/C++ files.""" + build selected C/C++ files.""" project_dir = os.path.relpath(project_dir, BASE_DIR) selected_files = selected_files.split(";") ninja_targets = [ diff --git a/tools/gyp/pylib/gyp/xcode_emulation.py b/tools/gyp/pylib/gyp/xcode_emulation.py index 08e645c57d5cda..192a523529fddd 100644 --- a/tools/gyp/pylib/gyp/xcode_emulation.py +++ b/tools/gyp/pylib/gyp/xcode_emulation.py @@ -7,7 +7,6 @@ other build systems, such as make and ninja. """ - import copy import os import os.path @@ -31,7 +30,7 @@ def XcodeArchsVariableMapping(archs, archs_including_64_bit=None): """Constructs a dictionary with expansion for $(ARCHS_STANDARD) variable, - and optionally for $(ARCHS_STANDARD_INCLUDING_64_BIT).""" + and optionally for $(ARCHS_STANDARD_INCLUDING_64_BIT).""" mapping = {"$(ARCHS_STANDARD)": archs} if archs_including_64_bit: mapping["$(ARCHS_STANDARD_INCLUDING_64_BIT)"] = archs_including_64_bit @@ -40,10 +39,10 @@ def XcodeArchsVariableMapping(archs, archs_including_64_bit=None): class XcodeArchsDefault: """A class to resolve ARCHS variable from xcode_settings, resolving Xcode - macros and implementing filtering by VALID_ARCHS. The expansion of macros - depends on the SDKROOT used ("macosx", "iphoneos", "iphonesimulator") and - on the version of Xcode. - """ + macros and implementing filtering by VALID_ARCHS. The expansion of macros + depends on the SDKROOT used ("macosx", "iphoneos", "iphonesimulator") and + on the version of Xcode. + """ # Match variable like $(ARCHS_STANDARD). variable_pattern = re.compile(r"\$\([a-zA-Z_][a-zA-Z0-9_]*\)$") @@ -82,8 +81,8 @@ def _ExpandArchs(self, archs, sdkroot): def ActiveArchs(self, archs, valid_archs, sdkroot): """Expands variables references in ARCHS, and filter by VALID_ARCHS if it - is defined (if not set, Xcode accept any value in ARCHS, otherwise, only - values present in VALID_ARCHS are kept).""" + is defined (if not set, Xcode accept any value in ARCHS, otherwise, only + values present in VALID_ARCHS are kept).""" expanded_archs = self._ExpandArchs(archs or self._default, sdkroot or "") if valid_archs: filtered_archs = [] @@ -96,24 +95,24 @@ def ActiveArchs(self, archs, valid_archs, sdkroot): def GetXcodeArchsDefault(): """Returns the |XcodeArchsDefault| object to use to expand ARCHS for the - installed version of Xcode. The default values used by Xcode for ARCHS - and the expansion of the variables depends on the version of Xcode used. + installed version of Xcode. The default values used by Xcode for ARCHS + and the expansion of the variables depends on the version of Xcode used. - For all version anterior to Xcode 5.0 or posterior to Xcode 5.1 included - uses $(ARCHS_STANDARD) if ARCHS is unset, while Xcode 5.0 to 5.0.2 uses - $(ARCHS_STANDARD_INCLUDING_64_BIT). This variable was added to Xcode 5.0 - and deprecated with Xcode 5.1. + For all version anterior to Xcode 5.0 or posterior to Xcode 5.1 included + uses $(ARCHS_STANDARD) if ARCHS is unset, while Xcode 5.0 to 5.0.2 uses + $(ARCHS_STANDARD_INCLUDING_64_BIT). This variable was added to Xcode 5.0 + and deprecated with Xcode 5.1. - For "macosx" SDKROOT, all version starting with Xcode 5.0 includes 64-bit - architecture as part of $(ARCHS_STANDARD) and default to only building it. 
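
The `ARCHS` handling described for `XcodeArchsVariableMapping` and `ActiveArchs` reduces to two steps: expand `$(ARCHS_STANDARD)`-style variables through a per-SDK mapping, then filter by `VALID_ARCHS` when it is set. A minimal sketch, with an assumed mapping (real values depend on the Xcode version and SDKROOT):

```python
import re

# Same shape as XcodeArchsDefault.variable_pattern in the hunk above.
VARIABLE = re.compile(r"\$\([a-zA-Z_][a-zA-Z0-9_]*\)$")

def active_archs(archs, valid_archs, mapping):
    """Expand $(ARCHS_*) variables, then keep only VALID_ARCHS entries."""
    expanded = []
    for arch in archs:
        if VARIABLE.match(arch):
            expanded.extend(mapping.get(arch, []))
        else:
            expanded.append(arch)
    if valid_archs:
        expanded = [a for a in expanded if a in valid_archs]
    return expanded

mapping = {"$(ARCHS_STANDARD)": ["x86_64", "arm64"]}
print(active_archs(["$(ARCHS_STANDARD)"], ["arm64"], mapping))  # ['arm64']
```
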
+ For "macosx" SDKROOT, all version starting with Xcode 5.0 includes 64-bit + architecture as part of $(ARCHS_STANDARD) and default to only building it. - For "iphoneos" and "iphonesimulator" SDKROOT, 64-bit architectures are part - of $(ARCHS_STANDARD_INCLUDING_64_BIT) from Xcode 5.0. From Xcode 5.1, they - are also part of $(ARCHS_STANDARD). + For "iphoneos" and "iphonesimulator" SDKROOT, 64-bit architectures are part + of $(ARCHS_STANDARD_INCLUDING_64_BIT) from Xcode 5.0. From Xcode 5.1, they + are also part of $(ARCHS_STANDARD). - All these rules are coded in the construction of the |XcodeArchsDefault| - object to use depending on the version of Xcode detected. The object is - for performance reason.""" + All these rules are coded in the construction of the |XcodeArchsDefault| + object to use depending on the version of Xcode detected. The object is + for performance reason.""" global XCODE_ARCHS_DEFAULT_CACHE if XCODE_ARCHS_DEFAULT_CACHE: return XCODE_ARCHS_DEFAULT_CACHE @@ -190,8 +189,8 @@ def __init__(self, spec): def _ConvertConditionalKeys(self, configname): """Converts or warns on conditional keys. Xcode supports conditional keys, - such as CODE_SIGN_IDENTITY[sdk=iphoneos*]. This is a partial implementation - with some keys converted while the rest force a warning.""" + such as CODE_SIGN_IDENTITY[sdk=iphoneos*]. This is a partial implementation + with some keys converted while the rest force a warning.""" settings = self.xcode_settings[configname] conditional_keys = [key for key in settings if key.endswith("]")] for key in conditional_keys: @@ -256,13 +255,13 @@ def _IsIosWatchApp(self): def GetFrameworkVersion(self): """Returns the framework version of the current target. Only valid for - bundles.""" + bundles.""" assert self._IsBundle() return self.GetPerTargetSetting("FRAMEWORK_VERSION", default="A") def GetWrapperExtension(self): """Returns the bundle extension (.app, .framework, .plugin, etc). Only - valid for bundles.""" + valid for bundles.""" assert self._IsBundle() if self.spec["type"] in ("loadable_module", "shared_library"): default_wrapper_extension = { @@ -297,13 +296,13 @@ def GetFullProductName(self): def GetWrapperName(self): """Returns the directory name of the bundle represented by this target. - Only valid for bundles.""" + Only valid for bundles.""" assert self._IsBundle() return self.GetProductName() + self.GetWrapperExtension() def GetBundleContentsFolderPath(self): """Returns the qualified path to the bundle's contents folder. E.g. - Chromium.app/Contents or Foo.bundle/Versions/A. Only valid for bundles.""" + Chromium.app/Contents or Foo.bundle/Versions/A. Only valid for bundles.""" if self.isIOS: return self.GetWrapperName() assert self._IsBundle() @@ -317,7 +316,7 @@ def GetBundleContentsFolderPath(self): def GetBundleResourceFolder(self): """Returns the qualified path to the bundle's resource folder. E.g. - Chromium.app/Contents/Resources. Only valid for bundles.""" + Chromium.app/Contents/Resources. Only valid for bundles.""" assert self._IsBundle() if self.isIOS: return self.GetBundleContentsFolderPath() @@ -325,7 +324,7 @@ def GetBundleResourceFolder(self): def GetBundleExecutableFolderPath(self): """Returns the qualified path to the bundle's executables folder. E.g. - Chromium.app/Contents/MacOS. Only valid for bundles.""" + Chromium.app/Contents/MacOS. 
Only valid for bundles.""" assert self._IsBundle() if self.spec["type"] in ("shared_library") or self.isIOS: return self.GetBundleContentsFolderPath() @@ -334,25 +333,25 @@ def GetBundleExecutableFolderPath(self): def GetBundleJavaFolderPath(self): """Returns the qualified path to the bundle's Java resource folder. - E.g. Chromium.app/Contents/Resources/Java. Only valid for bundles.""" + E.g. Chromium.app/Contents/Resources/Java. Only valid for bundles.""" assert self._IsBundle() return os.path.join(self.GetBundleResourceFolder(), "Java") def GetBundleFrameworksFolderPath(self): """Returns the qualified path to the bundle's frameworks folder. E.g, - Chromium.app/Contents/Frameworks. Only valid for bundles.""" + Chromium.app/Contents/Frameworks. Only valid for bundles.""" assert self._IsBundle() return os.path.join(self.GetBundleContentsFolderPath(), "Frameworks") def GetBundleSharedFrameworksFolderPath(self): """Returns the qualified path to the bundle's frameworks folder. E.g, - Chromium.app/Contents/SharedFrameworks. Only valid for bundles.""" + Chromium.app/Contents/SharedFrameworks. Only valid for bundles.""" assert self._IsBundle() return os.path.join(self.GetBundleContentsFolderPath(), "SharedFrameworks") def GetBundleSharedSupportFolderPath(self): """Returns the qualified path to the bundle's shared support folder. E.g, - Chromium.app/Contents/SharedSupport. Only valid for bundles.""" + Chromium.app/Contents/SharedSupport. Only valid for bundles.""" assert self._IsBundle() if self.spec["type"] == "shared_library": return self.GetBundleResourceFolder() @@ -361,19 +360,19 @@ def GetBundleSharedSupportFolderPath(self): def GetBundlePlugInsFolderPath(self): """Returns the qualified path to the bundle's plugins folder. E.g, - Chromium.app/Contents/PlugIns. Only valid for bundles.""" + Chromium.app/Contents/PlugIns. Only valid for bundles.""" assert self._IsBundle() return os.path.join(self.GetBundleContentsFolderPath(), "PlugIns") def GetBundleXPCServicesFolderPath(self): """Returns the qualified path to the bundle's XPC services folder. E.g, - Chromium.app/Contents/XPCServices. Only valid for bundles.""" + Chromium.app/Contents/XPCServices. Only valid for bundles.""" assert self._IsBundle() return os.path.join(self.GetBundleContentsFolderPath(), "XPCServices") def GetBundlePlistPath(self): """Returns the qualified path to the bundle's plist file. E.g. - Chromium.app/Contents/Info.plist. Only valid for bundles.""" + Chromium.app/Contents/Info.plist. Only valid for bundles.""" assert self._IsBundle() if ( self.spec["type"] in ("executable", "loadable_module") @@ -439,7 +438,7 @@ def GetMachOType(self): def _GetBundleBinaryPath(self): """Returns the name of the bundle binary of by this target. - E.g. Chromium.app/Contents/MacOS/Chromium. Only valid for bundles.""" + E.g. Chromium.app/Contents/MacOS/Chromium. Only valid for bundles.""" assert self._IsBundle() return os.path.join( self.GetBundleExecutableFolderPath(), self.GetExecutableName() @@ -470,14 +469,14 @@ def _GetStandaloneExecutablePrefix(self): def _GetStandaloneBinaryPath(self): """Returns the name of the non-bundle binary represented by this target. - E.g. hello_world. Only valid for non-bundles.""" + E.g. hello_world. 
Only valid for non-bundles.""" assert not self._IsBundle() assert self.spec["type"] in { "executable", "shared_library", "static_library", "loadable_module", - }, ("Unexpected type %s" % self.spec["type"]) + }, "Unexpected type %s" % self.spec["type"] target = self.spec["target_name"] if self.spec["type"] in {"loadable_module", "shared_library", "static_library"}: if target[:3] == "lib": @@ -490,7 +489,7 @@ def _GetStandaloneBinaryPath(self): def GetExecutableName(self): """Returns the executable name of the bundle represented by this target. - E.g. Chromium.""" + E.g. Chromium.""" if self._IsBundle(): return self.spec.get("product_name", self.spec["target_name"]) else: @@ -498,7 +497,7 @@ def GetExecutableName(self): def GetExecutablePath(self): """Returns the qualified path to the primary executable of the bundle - represented by this target. E.g. Chromium.app/Contents/MacOS/Chromium.""" + represented by this target. E.g. Chromium.app/Contents/MacOS/Chromium.""" if self._IsBundle(): return self._GetBundleBinaryPath() else: @@ -568,7 +567,7 @@ def _AppendPlatformVersionMinFlags(self, lst): def GetCflags(self, configname, arch=None): """Returns flags that need to be added to .c, .cc, .m, and .mm - compilations.""" + compilations.""" # This functions (and the similar ones below) do not offer complete # emulation of all xcode_settings keys. They're implemented on demand. @@ -863,7 +862,7 @@ def GetInstallName(self): def _MapLinkerFlagFilename(self, ldflag, gyp_to_build_path): """Checks if ldflag contains a filename and if so remaps it from - gyp-directory-relative to build-directory-relative.""" + gyp-directory-relative to build-directory-relative.""" # This list is expanded on demand. # They get matched as: # -exported_symbols_list file @@ -895,13 +894,13 @@ def _MapLinkerFlagFilename(self, ldflag, gyp_to_build_path): def GetLdflags(self, configname, product_dir, gyp_to_build_path, arch=None): """Returns flags that need to be passed to the linker. - Args: - configname: The name of the configuration to get ld flags for. - product_dir: The directory where products such static and dynamic - libraries are placed. This is added to the library search path. - gyp_to_build_path: A function that converts paths relative to the - current gyp file to paths relative to the build directory. - """ + Args: + configname: The name of the configuration to get ld flags for. + product_dir: The directory where products such static and dynamic + libraries are placed. This is added to the library search path. + gyp_to_build_path: A function that converts paths relative to the + current gyp file to paths relative to the build directory. + """ self.configname = configname ldflags = [] @@ -1001,9 +1000,9 @@ def GetLdflags(self, configname, product_dir, gyp_to_build_path, arch=None): def GetLibtoolflags(self, configname): """Returns flags that need to be passed to the static linker. - Args: - configname: The name of the configuration to get ld flags for. - """ + Args: + configname: The name of the configuration to get ld flags for. + """ self.configname = configname libtoolflags = [] @@ -1016,7 +1015,7 @@ def GetLibtoolflags(self, configname): def GetPerTargetSettings(self): """Gets a list of all the per-target settings. 
This will only fetch keys - whose values are the same across all configurations.""" + whose values are the same across all configurations.""" first_pass = True result = {} for configname in sorted(self.xcode_settings.keys()): @@ -1039,7 +1038,7 @@ def GetPerConfigSetting(self, setting, configname, default=None): def GetPerTargetSetting(self, setting, default=None): """Tries to get xcode_settings.setting from spec. Assumes that the setting - has the same value in all configurations and throws otherwise.""" + has the same value in all configurations and throws otherwise.""" is_first_pass = True result = None for configname in sorted(self.xcode_settings.keys()): @@ -1057,15 +1056,14 @@ def GetPerTargetSetting(self, setting, default=None): def _GetStripPostbuilds(self, configname, output_binary, quiet): """Returns a list of shell commands that contain the shell commands - necessary to strip this target's binary. These should be run as postbuilds - before the actual postbuilds run.""" + necessary to strip this target's binary. These should be run as postbuilds + before the actual postbuilds run.""" self.configname = configname result = [] if self._Test("DEPLOYMENT_POSTPROCESSING", "YES", default="NO") and self._Test( "STRIP_INSTALLED_PRODUCT", "YES", default="NO" ): - default_strip_style = "debugging" if ( self.spec["type"] == "loadable_module" or self._IsIosAppExtension() @@ -1092,8 +1090,8 @@ def _GetStripPostbuilds(self, configname, output_binary, quiet): def _GetDebugInfoPostbuilds(self, configname, output, output_binary, quiet): """Returns a list of shell commands that contain the shell commands - necessary to massage this target's debug information. These should be run - as postbuilds before the actual postbuilds run.""" + necessary to massage this target's debug information. These should be run + as postbuilds before the actual postbuilds run.""" self.configname = configname # For static libraries, no dSYMs are created. @@ -1114,7 +1112,7 @@ def _GetDebugInfoPostbuilds(self, configname, output, output_binary, quiet): def _GetTargetPostbuilds(self, configname, output, output_binary, quiet=False): """Returns a list of shell commands that contain the shell commands - to run as postbuilds for this target, before the actual postbuilds.""" + to run as postbuilds for this target, before the actual postbuilds.""" # dSYMs need to build before stripping happens. return self._GetDebugInfoPostbuilds( configname, output, output_binary, quiet @@ -1122,11 +1120,10 @@ def _GetTargetPostbuilds(self, configname, output, output_binary, quiet=False): def _GetIOSPostbuilds(self, configname, output_binary): """Return a shell command to codesign the iOS output binary so it can - be deployed to a device. This should be run as the very last step of the - build.""" + be deployed to a device. 
This should be run as the very last step of the + build.""" if not ( - (self.isIOS - and (self.spec["type"] == "executable" or self._IsXCTest())) + (self.isIOS and (self.spec["type"] == "executable" or self._IsXCTest())) or self.IsIosFramework() ): return [] @@ -1240,7 +1237,7 @@ def AddImplicitPostbuilds( self, configname, output, output_binary, postbuilds=[], quiet=False ): """Returns a list of shell commands that should run before and after - |postbuilds|.""" + |postbuilds|.""" assert output_binary is not None pre = self._GetTargetPostbuilds(configname, output, output_binary, quiet) post = self._GetIOSPostbuilds(configname, output_binary) @@ -1276,8 +1273,8 @@ def _AdjustLibrary(self, library, config_name=None): def AdjustLibraries(self, libraries, config_name=None): """Transforms entries like 'Cocoa.framework' in libraries into entries like - '-framework Cocoa', 'libcrypto.dylib' into '-lcrypto', etc. - """ + '-framework Cocoa', 'libcrypto.dylib' into '-lcrypto', etc. + """ libraries = [self._AdjustLibrary(library, config_name) for library in libraries] return libraries @@ -1342,10 +1339,10 @@ def GetExtraPlistItems(self, configname=None): def _DefaultSdkRoot(self): """Returns the default SDKROOT to use. - Prior to version 5.0.0, if SDKROOT was not explicitly set in the Xcode - project, then the environment variable was empty. Starting with this - version, Xcode uses the name of the newest SDK installed. - """ + Prior to version 5.0.0, if SDKROOT was not explicitly set in the Xcode + project, then the environment variable was empty. Starting with this + version, Xcode uses the name of the newest SDK installed. + """ xcode_version, _ = XcodeVersion() if xcode_version < "0500": return "" @@ -1370,39 +1367,39 @@ def _DefaultSdkRoot(self): class MacPrefixHeader: """A class that helps with emulating Xcode's GCC_PREFIX_HEADER feature. - This feature consists of several pieces: - * If GCC_PREFIX_HEADER is present, all compilations in that project get an - additional |-include path_to_prefix_header| cflag. - * If GCC_PRECOMPILE_PREFIX_HEADER is present too, then the prefix header is - instead compiled, and all other compilations in the project get an - additional |-include path_to_compiled_header| instead. - + Compiled prefix headers have the extension gch. There is one gch file for - every language used in the project (c, cc, m, mm), since gch files for - different languages aren't compatible. - + gch files themselves are built with the target's normal cflags, but they - obviously don't get the |-include| flag. Instead, they need a -x flag that - describes their language. - + All o files in the target need to depend on the gch file, to make sure - it's built before any o file is built. - - This class helps with some of these tasks, but it needs help from the build - system for writing dependencies to the gch files, for writing build commands - for the gch files, and for figuring out the location of the gch files. - """ + This feature consists of several pieces: + * If GCC_PREFIX_HEADER is present, all compilations in that project get an + additional |-include path_to_prefix_header| cflag. + * If GCC_PRECOMPILE_PREFIX_HEADER is present too, then the prefix header is + instead compiled, and all other compilations in the project get an + additional |-include path_to_compiled_header| instead. + + Compiled prefix headers have the extension gch. There is one gch file for + every language used in the project (c, cc, m, mm), since gch files for + different languages aren't compatible. 
+ + gch files themselves are built with the target's normal cflags, but they + obviously don't get the |-include| flag. Instead, they need a -x flag that + describes their language. + + All o files in the target need to depend on the gch file, to make sure + it's built before any o file is built. + + This class helps with some of these tasks, but it needs help from the build + system for writing dependencies to the gch files, for writing build commands + for the gch files, and for figuring out the location of the gch files. + """ def __init__( self, xcode_settings, gyp_path_to_build_path, gyp_path_to_build_output ): """If xcode_settings is None, all methods on this class are no-ops. - Args: - gyp_path_to_build_path: A function that takes a gyp-relative path, - and returns a path relative to the build directory. - gyp_path_to_build_output: A function that takes a gyp-relative path and - a language code ('c', 'cc', 'm', or 'mm'), and that returns a path - to where the output of precompiling that path for that language - should be placed (without the trailing '.gch'). - """ + Args: + gyp_path_to_build_path: A function that takes a gyp-relative path, + and returns a path relative to the build directory. + gyp_path_to_build_output: A function that takes a gyp-relative path and + a language code ('c', 'cc', 'm', or 'mm'), and that returns a path + to where the output of precompiling that path for that language + should be placed (without the trailing '.gch'). + """ # This doesn't support per-configuration prefix headers. Good enough # for now. self.header = None @@ -1447,9 +1444,9 @@ def _Gch(self, lang, arch): def GetObjDependencies(self, sources, objs, arch=None): """Given a list of source files and the corresponding object files, returns - a list of (source, object, gch) tuples, where |gch| is the build-directory - relative path to the gch file each object file depends on. |compilable[i]| - has to be the source file belonging to |objs[i]|.""" + a list of (source, object, gch) tuples, where |gch| is the build-directory + relative path to the gch file each object file depends on. |compilable[i]| + has to be the source file belonging to |objs[i]|.""" if not self.header or not self.compile_headers: return [] @@ -1470,8 +1467,8 @@ def GetObjDependencies(self, sources, objs, arch=None): def GetPchBuildCommands(self, arch=None): """Returns [(path_to_gch, language_flag, language, header)]. - |path_to_gch| and |header| are relative to the build directory. - """ + |path_to_gch| and |header| are relative to the build directory. + """ if not self.header or not self.compile_headers: return [] return [ @@ -1555,8 +1552,8 @@ def CLTVersion(): def GetStdoutQuiet(cmdlist): """Returns the content of standard output returned by invoking |cmdlist|. - Ignores the stderr. - Raises |GypError| if the command return with a non-zero return code.""" + Ignores the stderr. + Raises |GypError| if the command return with a non-zero return code.""" job = subprocess.Popen(cmdlist, stdout=subprocess.PIPE, stderr=subprocess.PIPE) out = job.communicate()[0].decode("utf-8") if job.returncode != 0: @@ -1566,7 +1563,7 @@ def GetStdoutQuiet(cmdlist): def GetStdout(cmdlist): """Returns the content of standard output returned by invoking |cmdlist|. 
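
`GetStdout` and `GetStdoutQuiet` are small enough to restate as a runnable snippet. `GypError` is defined locally here only so the sketch runs standalone; in gyp it comes from the common module.

```python
import subprocess

class GypError(Exception):
    """Stand-in for gyp's GypError so the sketch is self-contained."""

def get_stdout(cmdlist):
    """Capture a command's stdout as text; raise GypError on non-zero exit.

    GetStdoutQuiet additionally pipes stderr so it is swallowed.
    """
    job = subprocess.Popen(cmdlist, stdout=subprocess.PIPE)
    out = job.communicate()[0].decode("utf-8")
    if job.returncode != 0:
        raise GypError("Error %d running %s" % (job.returncode, cmdlist[0]))
    return out.rstrip("\n")

print(get_stdout(["uname", "-s"]))  # e.g. 'Darwin' or 'Linux'
```
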
- Raises |GypError| if the command return with a non-zero return code.""" + Raises |GypError| if the command return with a non-zero return code.""" job = subprocess.Popen(cmdlist, stdout=subprocess.PIPE) out = job.communicate()[0].decode("utf-8") if job.returncode != 0: @@ -1577,9 +1574,9 @@ def GetStdout(cmdlist): def MergeGlobalXcodeSettingsToSpec(global_dict, spec): """Merges the global xcode_settings dictionary into each configuration of the - target represented by spec. For keys that are both in the global and the local - xcode_settings dict, the local key gets precedence. - """ + target represented by spec. For keys that are both in the global and the local + xcode_settings dict, the local key gets precedence. + """ # The xcode generator special-cases global xcode_settings and does something # that amounts to merging in the global xcode_settings into each local # xcode_settings dict. @@ -1594,9 +1591,9 @@ def MergeGlobalXcodeSettingsToSpec(global_dict, spec): def IsMacBundle(flavor, spec): """Returns if |spec| should be treated as a bundle. - Bundles are directories with a certain subdirectory structure, instead of - just a single file. Bundle rules do not produce a binary but also package - resources into that directory.""" + Bundles are directories with a certain subdirectory structure, instead of + just a single file. Bundle rules do not produce a binary but also package + resources into that directory.""" is_mac_bundle = ( int(spec.get("mac_xctest_bundle", 0)) != 0 or int(spec.get("mac_xcuitest_bundle", 0)) != 0 @@ -1613,14 +1610,14 @@ def IsMacBundle(flavor, spec): def GetMacBundleResources(product_dir, xcode_settings, resources): """Yields (output, resource) pairs for every resource in |resources|. - Only call this for mac bundle targets. - - Args: - product_dir: Path to the directory containing the output bundle, - relative to the build directory. - xcode_settings: The XcodeSettings of the current target. - resources: A list of bundle resources, relative to the build directory. - """ + Only call this for mac bundle targets. + + Args: + product_dir: Path to the directory containing the output bundle, + relative to the build directory. + xcode_settings: The XcodeSettings of the current target. + resources: A list of bundle resources, relative to the build directory. + """ dest = os.path.join(product_dir, xcode_settings.GetBundleResourceFolder()) for res in resources: output = dest @@ -1651,24 +1648,24 @@ def GetMacBundleResources(product_dir, xcode_settings, resources): def GetMacInfoPlist(product_dir, xcode_settings, gyp_path_to_build_path): """Returns (info_plist, dest_plist, defines, extra_env), where: - * |info_plist| is the source plist path, relative to the - build directory, - * |dest_plist| is the destination plist path, relative to the - build directory, - * |defines| is a list of preprocessor defines (empty if the plist - shouldn't be preprocessed, - * |extra_env| is a dict of env variables that should be exported when - invoking |mac_tool copy-info-plist|. - - Only call this for mac bundle targets. - - Args: - product_dir: Path to the directory containing the output bundle, - relative to the build directory. - xcode_settings: The XcodeSettings of the current target. - gyp_to_build_path: A function that converts paths relative to the - current gyp file to paths relative to the build directory. 
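
The `(output, resource)` pairing that `GetMacBundleResources` yields can be sketched as below. `bundle_resource_outputs` is hypothetical and simplified: directory resources and `.lproj` localization handling are omitted, and the `.xib` to compiled `.nib` rename is included as an illustrative assumption about how compiled resources change name inside the bundle.

```python
import os

def bundle_resource_outputs(product_dir, resource_folder, resources):
    """Sketch: map each bundle resource to its output path in Resources."""
    dest = os.path.join(product_dir, resource_folder)
    for res in resources:
        output = os.path.join(dest, os.path.basename(res))
        if output.endswith(".xib"):
            # Compiled resources land under their compiled name (assumption).
            output = os.path.splitext(output)[0] + ".nib"
        yield output, res

pairs = bundle_resource_outputs(
    "out/Release", "Chromium.app/Contents/Resources", ["res/MainMenu.xib"]
)
print(list(pairs))
# [('out/Release/Chromium.app/Contents/Resources/MainMenu.nib', 'res/MainMenu.xib')]
```
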
- """ + * |info_plist| is the source plist path, relative to the + build directory, + * |dest_plist| is the destination plist path, relative to the + build directory, + * |defines| is a list of preprocessor defines (empty if the plist + shouldn't be preprocessed, + * |extra_env| is a dict of env variables that should be exported when + invoking |mac_tool copy-info-plist|. + + Only call this for mac bundle targets. + + Args: + product_dir: Path to the directory containing the output bundle, + relative to the build directory. + xcode_settings: The XcodeSettings of the current target. + gyp_to_build_path: A function that converts paths relative to the + current gyp file to paths relative to the build directory. + """ info_plist = xcode_settings.GetPerTargetSetting("INFOPLIST_FILE") if not info_plist: return None, None, [], {} @@ -1706,18 +1703,18 @@ def _GetXcodeEnv( xcode_settings, built_products_dir, srcroot, configuration, additional_settings=None ): """Return the environment variables that Xcode would set. See - http://developer.apple.com/library/mac/#documentation/DeveloperTools/Reference/XcodeBuildSettingRef/1-Build_Setting_Reference/build_setting_ref.html#//apple_ref/doc/uid/TP40003931-CH3-SW153 - for a full list. - - Args: - xcode_settings: An XcodeSettings object. If this is None, this function - returns an empty dict. - built_products_dir: Absolute path to the built products dir. - srcroot: Absolute path to the source root. - configuration: The build configuration name. - additional_settings: An optional dict with more values to add to the - result. - """ + http://developer.apple.com/library/mac/#documentation/DeveloperTools/Reference/XcodeBuildSettingRef/1-Build_Setting_Reference/build_setting_ref.html#//apple_ref/doc/uid/TP40003931-CH3-SW153 + for a full list. + + Args: + xcode_settings: An XcodeSettings object. If this is None, this function + returns an empty dict. + built_products_dir: Absolute path to the built products dir. + srcroot: Absolute path to the source root. + configuration: The build configuration name. + additional_settings: An optional dict with more values to add to the + result. + """ if not xcode_settings: return {} @@ -1771,17 +1768,17 @@ def _GetXcodeEnv( ) env["CONTENTS_FOLDER_PATH"] = xcode_settings.GetBundleContentsFolderPath() env["EXECUTABLE_FOLDER_PATH"] = xcode_settings.GetBundleExecutableFolderPath() - env[ - "UNLOCALIZED_RESOURCES_FOLDER_PATH" - ] = xcode_settings.GetBundleResourceFolder() + env["UNLOCALIZED_RESOURCES_FOLDER_PATH"] = ( + xcode_settings.GetBundleResourceFolder() + ) env["JAVA_FOLDER_PATH"] = xcode_settings.GetBundleJavaFolderPath() env["FRAMEWORKS_FOLDER_PATH"] = xcode_settings.GetBundleFrameworksFolderPath() - env[ - "SHARED_FRAMEWORKS_FOLDER_PATH" - ] = xcode_settings.GetBundleSharedFrameworksFolderPath() - env[ - "SHARED_SUPPORT_FOLDER_PATH" - ] = xcode_settings.GetBundleSharedSupportFolderPath() + env["SHARED_FRAMEWORKS_FOLDER_PATH"] = ( + xcode_settings.GetBundleSharedFrameworksFolderPath() + ) + env["SHARED_SUPPORT_FOLDER_PATH"] = ( + xcode_settings.GetBundleSharedSupportFolderPath() + ) env["PLUGINS_FOLDER_PATH"] = xcode_settings.GetBundlePlugInsFolderPath() env["XPCSERVICES_FOLDER_PATH"] = xcode_settings.GetBundleXPCServicesFolderPath() env["INFOPLIST_PATH"] = xcode_settings.GetBundlePlistPath() @@ -1817,8 +1814,8 @@ def _GetXcodeEnv( def _NormalizeEnvVarReferences(str): """Takes a string containing variable references in the form ${FOO}, $(FOO), - or $FOO, and returns a string with all variable references in the form ${FOO}. 
- """ + or $FOO, and returns a string with all variable references in the form ${FOO}. + """ # $FOO -> ${FOO} str = re.sub(r"\$([a-zA-Z_][a-zA-Z0-9_]*)", r"${\1}", str) @@ -1834,9 +1831,9 @@ def _NormalizeEnvVarReferences(str): def ExpandEnvVars(string, expansions): """Expands ${VARIABLES}, $(VARIABLES), and $VARIABLES in string per the - expansions list. If the variable expands to something that references - another variable, this variable is expanded as well if it's in env -- - until no variables present in env are left.""" + expansions list. If the variable expands to something that references + another variable, this variable is expanded as well if it's in env -- + until no variables present in env are left.""" for k, v in reversed(expansions): string = string.replace("${" + k + "}", v) string = string.replace("$(" + k + ")", v) @@ -1846,11 +1843,11 @@ def ExpandEnvVars(string, expansions): def _TopologicallySortedEnvVarKeys(env): """Takes a dict |env| whose values are strings that can refer to other keys, - for example env['foo'] = '$(bar) and $(baz)'. Returns a list L of all keys of - env such that key2 is after key1 in L if env[key2] refers to env[key1]. + for example env['foo'] = '$(bar) and $(baz)'. Returns a list L of all keys of + env such that key2 is after key1 in L if env[key2] refers to env[key1]. - Throws an Exception in case of dependency cycles. - """ + Throws an Exception in case of dependency cycles. + """ # Since environment variables can refer to other variables, the evaluation # order is important. Below is the logic to compute the dependency graph # and sort it. @@ -1891,7 +1888,7 @@ def GetSortedXcodeEnv( def GetSpecPostbuildCommands(spec, quiet=False): """Returns the list of postbuilds explicitly defined on |spec|, in a form - executable by a shell.""" + executable by a shell.""" postbuilds = [] for postbuild in spec.get("postbuilds", []): if not quiet: @@ -1905,7 +1902,7 @@ def GetSpecPostbuildCommands(spec, quiet=False): def _HasIOSTarget(targets): """Returns true if any target contains the iOS specific key - IPHONEOS_DEPLOYMENT_TARGET.""" + IPHONEOS_DEPLOYMENT_TARGET.""" for target_dict in targets.values(): for config in target_dict["configurations"].values(): if config.get("xcode_settings", {}).get("IPHONEOS_DEPLOYMENT_TARGET"): @@ -1915,7 +1912,7 @@ def _HasIOSTarget(targets): def _AddIOSDeviceConfigurations(targets): """Clone all targets and append -iphoneos to the name. Configure these targets - to build for iOS devices and use correct architectures for those builds.""" + to build for iOS devices and use correct architectures for those builds.""" for target_dict in targets.values(): toolset = target_dict["toolset"] configs = target_dict["configurations"] @@ -1931,7 +1928,7 @@ def _AddIOSDeviceConfigurations(targets): def CloneConfigurationForDeviceAndEmulator(target_dicts): """If |target_dicts| contains any iOS targets, automatically create -iphoneos - targets for iOS device builds.""" + targets for iOS device builds.""" if _HasIOSTarget(target_dicts): return _AddIOSDeviceConfigurations(target_dicts) return target_dicts diff --git a/tools/gyp/pylib/gyp/xcode_ninja.py b/tools/gyp/pylib/gyp/xcode_ninja.py index ae3079d85a48cb..1a97a06c51d9f5 100644 --- a/tools/gyp/pylib/gyp/xcode_ninja.py +++ b/tools/gyp/pylib/gyp/xcode_ninja.py @@ -21,7 +21,7 @@ def _WriteWorkspace(main_gyp, sources_gyp, params): - """ Create a workspace to wrap main and sources gyp paths. 
""" + """Create a workspace to wrap main and sources gyp paths.""" (build_file_root, build_file_ext) = os.path.splitext(main_gyp) workspace_path = build_file_root + ".xcworkspace" options = params["options"] @@ -57,7 +57,7 @@ def _WriteWorkspace(main_gyp, sources_gyp, params): def _TargetFromSpec(old_spec, params): - """ Create fake target for xcode-ninja wrapper. """ + """Create fake target for xcode-ninja wrapper.""" # Determine ninja top level build dir (e.g. /path/to/out). ninja_toplevel = None jobs = 0 @@ -102,9 +102,9 @@ def _TargetFromSpec(old_spec, params): new_xcode_settings[key] = old_xcode_settings[key] ninja_target["configurations"][config] = {} - ninja_target["configurations"][config][ - "xcode_settings" - ] = new_xcode_settings + ninja_target["configurations"][config]["xcode_settings"] = ( + new_xcode_settings + ) ninja_target["mac_bundle"] = old_spec.get("mac_bundle", 0) ninja_target["mac_xctest_bundle"] = old_spec.get("mac_xctest_bundle", 0) @@ -137,13 +137,13 @@ def _TargetFromSpec(old_spec, params): def IsValidTargetForWrapper(target_extras, executable_target_pattern, spec): """Limit targets for Xcode wrapper. - Xcode sometimes performs poorly with too many targets, so only include - proper executable targets, with filters to customize. - Arguments: - target_extras: Regular expression to always add, matching any target. - executable_target_pattern: Regular expression limiting executable targets. - spec: Specifications for target. - """ + Xcode sometimes performs poorly with too many targets, so only include + proper executable targets, with filters to customize. + Arguments: + target_extras: Regular expression to always add, matching any target. + executable_target_pattern: Regular expression limiting executable targets. + spec: Specifications for target. + """ target_name = spec.get("target_name") # Always include targets matching target_extras. if target_extras is not None and re.search(target_extras, target_name): @@ -154,7 +154,6 @@ def IsValidTargetForWrapper(target_extras, executable_target_pattern, spec): spec.get("type", "") == "executable" and spec.get("product_extension", "") != "bundle" ): - # If there is a filter and the target does not match, exclude the target. if executable_target_pattern is not None: if not re.search(executable_target_pattern, target_name): @@ -166,14 +165,14 @@ def IsValidTargetForWrapper(target_extras, executable_target_pattern, spec): def CreateWrapper(target_list, target_dicts, data, params): """Initialize targets for the ninja wrapper. - This sets up the necessary variables in the targets to generate Xcode projects - that use ninja as an external builder. - Arguments: - target_list: List of target pairs: 'base/base.gyp:base'. - target_dicts: Dict of target properties keyed on target pair. - data: Dict of flattened build files keyed on gyp path. - params: Dict of global options for gyp. - """ + This sets up the necessary variables in the targets to generate Xcode projects + that use ninja as an external builder. + Arguments: + target_list: List of target pairs: 'base/base.gyp:base'. + target_dicts: Dict of target properties keyed on target pair. + data: Dict of flattened build files keyed on gyp path. + params: Dict of global options for gyp. 
+ """ orig_gyp = params["build_files"][0] for gyp_name, gyp_dict in data.items(): if gyp_name == orig_gyp: diff --git a/tools/gyp/pylib/gyp/xcodeproj_file.py b/tools/gyp/pylib/gyp/xcodeproj_file.py index 0376693d95a073..11e2be07372230 100644 --- a/tools/gyp/pylib/gyp/xcodeproj_file.py +++ b/tools/gyp/pylib/gyp/xcodeproj_file.py @@ -176,12 +176,12 @@ def cmp(x, y): def SourceTreeAndPathFromPath(input_path): """Given input_path, returns a tuple with sourceTree and path values. - Examples: - input_path (source_tree, output_path) - '$(VAR)/path' ('VAR', 'path') - '$(VAR)' ('VAR', None) - 'path' (None, 'path') - """ + Examples: + input_path (source_tree, output_path) + '$(VAR)/path' ('VAR', 'path') + '$(VAR)' ('VAR', None) + 'path' (None, 'path') + """ if source_group_match := _path_leading_variable.match(input_path): source_tree = source_group_match.group(1) @@ -200,70 +200,70 @@ def ConvertVariablesToShellSyntax(input_string): class XCObject: """The abstract base of all class types used in Xcode project files. - Class variables: - _schema: A dictionary defining the properties of this class. The keys to - _schema are string property keys as used in project files. Values - are a list of four or five elements: - [ is_list, property_type, is_strong, is_required, default ] - is_list: True if the property described is a list, as opposed - to a single element. - property_type: The type to use as the value of the property, - or if is_list is True, the type to use for each - element of the value's list. property_type must - be an XCObject subclass, or one of the built-in - types str, int, or dict. - is_strong: If property_type is an XCObject subclass, is_strong - is True to assert that this class "owns," or serves - as parent, to the property value (or, if is_list is - True, values). is_strong must be False if - property_type is not an XCObject subclass. - is_required: True if the property is required for the class. - Note that is_required being True does not preclude - an empty string ("", in the case of property_type - str) or list ([], in the case of is_list True) from - being set for the property. - default: Optional. If is_required is True, default may be set - to provide a default value for objects that do not supply - their own value. If is_required is True and default - is not provided, users of the class must supply their own - value for the property. - Note that although the values of the array are expressed in - boolean terms, subclasses provide values as integers to conserve - horizontal space. - _should_print_single_line: False in XCObject. Subclasses whose objects - should be written to the project file in the - alternate single-line format, such as - PBXFileReference and PBXBuildFile, should - set this to True. - _encode_transforms: Used by _EncodeString to encode unprintable characters. - The index into this list is the ordinal of the - character to transform; each value is a string - used to represent the character in the output. XCObject - provides an _encode_transforms list suitable for most - XCObject subclasses. - _alternate_encode_transforms: Provided for subclasses that wish to use - the alternate encoding rules. Xcode seems - to use these rules when printing objects in - single-line format. Subclasses that desire - this behavior should set _encode_transforms - to _alternate_encode_transforms. - _hashables: A list of XCObject subclasses that can be hashed by ComputeIDs - to construct this object's ID. 
Most classes that need custom - hashing behavior should do it by overriding Hashables, - but in some cases an object's parent may wish to push a - hashable value into its child, and it can do so by appending - to _hashables. - Attributes: - id: The object's identifier, a 24-character uppercase hexadecimal string. - Usually, objects being created should not set id until the entire - project file structure is built. At that point, UpdateIDs() should - be called on the root object to assign deterministic values for id to - each object in the tree. - parent: The object's parent. This is set by a parent XCObject when a child - object is added to it. - _properties: The object's property dictionary. An object's properties are - described by its class' _schema variable. - """ + Class variables: + _schema: A dictionary defining the properties of this class. The keys to + _schema are string property keys as used in project files. Values + are a list of four or five elements: + [ is_list, property_type, is_strong, is_required, default ] + is_list: True if the property described is a list, as opposed + to a single element. + property_type: The type to use as the value of the property, + or if is_list is True, the type to use for each + element of the value's list. property_type must + be an XCObject subclass, or one of the built-in + types str, int, or dict. + is_strong: If property_type is an XCObject subclass, is_strong + is True to assert that this class "owns," or serves + as parent, to the property value (or, if is_list is + True, values). is_strong must be False if + property_type is not an XCObject subclass. + is_required: True if the property is required for the class. + Note that is_required being True does not preclude + an empty string ("", in the case of property_type + str) or list ([], in the case of is_list True) from + being set for the property. + default: Optional. If is_required is True, default may be set + to provide a default value for objects that do not supply + their own value. If is_required is True and default + is not provided, users of the class must supply their own + value for the property. + Note that although the values of the array are expressed in + boolean terms, subclasses provide values as integers to conserve + horizontal space. + _should_print_single_line: False in XCObject. Subclasses whose objects + should be written to the project file in the + alternate single-line format, such as + PBXFileReference and PBXBuildFile, should + set this to True. + _encode_transforms: Used by _EncodeString to encode unprintable characters. + The index into this list is the ordinal of the + character to transform; each value is a string + used to represent the character in the output. XCObject + provides an _encode_transforms list suitable for most + XCObject subclasses. + _alternate_encode_transforms: Provided for subclasses that wish to use + the alternate encoding rules. Xcode seems + to use these rules when printing objects in + single-line format. Subclasses that desire + this behavior should set _encode_transforms + to _alternate_encode_transforms. + _hashables: A list of XCObject subclasses that can be hashed by ComputeIDs + to construct this object's ID. Most classes that need custom + hashing behavior should do it by overriding Hashables, + but in some cases an object's parent may wish to push a + hashable value into its child, and it can do so by appending + to _hashables. + Attributes: + id: The object's identifier, a 24-character uppercase hexadecimal string. 
+ Usually, objects being created should not set id until the entire + project file structure is built. At that point, UpdateIDs() should + be called on the root object to assign deterministic values for id to + each object in the tree. + parent: The object's parent. This is set by a parent XCObject when a child + object is added to it. + _properties: The object's property dictionary. An object's properties are + described by its class' _schema variable. + """ _schema = {} _should_print_single_line = False @@ -305,12 +305,12 @@ def __repr__(self): def Copy(self): """Make a copy of this object. - The new object will have its own copy of lists and dicts. Any XCObject - objects owned by this object (marked "strong") will be copied in the - new object, even those found in lists. If this object has any weak - references to other XCObjects, the same references are added to the new - object without making a copy. - """ + The new object will have its own copy of lists and dicts. Any XCObject + objects owned by this object (marked "strong") will be copied in the + new object, even those found in lists. If this object has any weak + references to other XCObjects, the same references are added to the new + object without making a copy. + """ that = self.__class__(id=self.id, parent=self.parent) for key, value in self._properties.items(): @@ -359,9 +359,9 @@ def Copy(self): def Name(self): """Return the name corresponding to an object. - Not all objects necessarily need to be nameable, and not all that do have - a "name" property. Override as needed. - """ + Not all objects necessarily need to be nameable, and not all that do have + a "name" property. Override as needed. + """ # If the schema indicates that "name" is required, try to access the # property even if it doesn't exist. This will result in a KeyError @@ -377,12 +377,12 @@ def Name(self): def Comment(self): """Return a comment string for the object. - Most objects just use their name as the comment, but PBXProject uses - different values. + Most objects just use their name as the comment, but PBXProject uses + different values. - The returned comment is not escaped and does not have any comment marker - strings applied to it. - """ + The returned comment is not escaped and does not have any comment marker + strings applied to it. + """ return self.Name() @@ -402,26 +402,26 @@ def HashablesForChild(self): def ComputeIDs(self, recursive=True, overwrite=True, seed_hash=None): """Set "id" properties deterministically. - An object's "id" property is set based on a hash of its class type and - name, as well as the class type and name of all ancestor objects. As - such, it is only advisable to call ComputeIDs once an entire project file - tree is built. + An object's "id" property is set based on a hash of its class type and + name, as well as the class type and name of all ancestor objects. As + such, it is only advisable to call ComputeIDs once an entire project file + tree is built. - If recursive is True, recurse into all descendant objects and update their - hashes. + If recursive is True, recurse into all descendant objects and update their + hashes. - If overwrite is True, any existing value set in the "id" property will be - replaced. - """ + If overwrite is True, any existing value set in the "id" property will be + replaced. + """ def _HashUpdate(hash, data): """Update hash with data's length and contents. 
- If the hash were updated only with the value of data, it would be - possible for clowns to induce collisions by manipulating the names of - their objects. By adding the length, it's exceedingly less likely that - ID collisions will be encountered, intentionally or not. - """ + If the hash were updated only with the value of data, it would be + possible for clowns to induce collisions by manipulating the names of + their objects. By adding the length, it's exceedingly less likely that + ID collisions will be encountered, intentionally or not. + """ hash.update(struct.pack(">i", len(data))) if isinstance(data, str): @@ -464,8 +464,7 @@ def _HashUpdate(hash, data): self.id = "%08X%08X%08X" % tuple(id_ints) def EnsureNoIDCollisions(self): - """Verifies that no two objects have the same ID. Checks all descendants. - """ + """Verifies that no two objects have the same ID. Checks all descendants.""" ids = {} descendants = self.Descendants() @@ -498,8 +497,8 @@ def Children(self): def Descendants(self): """Returns a list of all of this object's descendants, including this - object. - """ + object. + """ children = self.Children() descendants = [self] @@ -515,8 +514,8 @@ def PBXProjectAncestor(self): def _EncodeComment(self, comment): """Encodes a comment to be placed in the project file output, mimicking - Xcode behavior. - """ + Xcode behavior. + """ # This mimics Xcode behavior by wrapping the comment in "/*" and "*/". If # the string already contains a "*/", it is turned into "(*)/". This keeps @@ -543,8 +542,8 @@ def _EncodeTransform(self, match): def _EncodeString(self, value): """Encodes a string to be placed in the project file output, mimicking - Xcode behavior. - """ + Xcode behavior. + """ # Use quotation marks when any character outside of the range A-Z, a-z, 0-9, # $ (dollar sign), . (period), and _ (underscore) is present. Also use @@ -585,18 +584,18 @@ def _XCPrint(self, file, tabs, line): def _XCPrintableValue(self, tabs, value, flatten_list=False): """Returns a representation of value that may be printed in a project file, - mimicking Xcode's behavior. + mimicking Xcode's behavior. - _XCPrintableValue can handle str and int values, XCObjects (which are - made printable by returning their id property), and list and dict objects - composed of any of the above types. When printing a list or dict, and - _should_print_single_line is False, the tabs parameter is used to determine - how much to indent the lines corresponding to the items in the list or - dict. + _XCPrintableValue can handle str and int values, XCObjects (which are + made printable by returning their id property), and list and dict objects + composed of any of the above types. When printing a list or dict, and + _should_print_single_line is False, the tabs parameter is used to determine + how much to indent the lines corresponding to the items in the list or + dict. - If flatten_list is True, single-element lists will be transformed into - strings. - """ + If flatten_list is True, single-element lists will be transformed into + strings. + """ printable = "" comment = None @@ -657,12 +656,12 @@ def _XCPrintableValue(self, tabs, value, flatten_list=False): def _XCKVPrint(self, file, tabs, key, value): """Prints a key and value, members of an XCObject's _properties dictionary, - to file. + to file. - tabs is an int identifying the indentation level. If the class' - _should_print_single_line variable is True, tabs is ignored and the - key-value pair will be followed by a space instead of a newline. 
- """ + tabs is an int identifying the indentation level. If the class' + _should_print_single_line variable is True, tabs is ignored and the + key-value pair will be followed by a space instead of a newline. + """ if self._should_print_single_line: printable = "" @@ -720,8 +719,8 @@ def _XCKVPrint(self, file, tabs, key, value): def Print(self, file=sys.stdout): """Prints a reprentation of this object to file, adhering to Xcode output - formatting. - """ + formatting. + """ self.VerifyHasRequiredProperties() @@ -759,15 +758,15 @@ def Print(self, file=sys.stdout): def UpdateProperties(self, properties, do_copy=False): """Merge the supplied properties into the _properties dictionary. - The input properties must adhere to the class schema or a KeyError or - TypeError exception will be raised. If adding an object of an XCObject - subclass and the schema indicates a strong relationship, the object's - parent will be set to this object. + The input properties must adhere to the class schema or a KeyError or + TypeError exception will be raised. If adding an object of an XCObject + subclass and the schema indicates a strong relationship, the object's + parent will be set to this object. - If do_copy is True, then lists, dicts, strong-owned XCObjects, and - strong-owned XCObjects in lists will be copied instead of having their - references added. - """ + If do_copy is True, then lists, dicts, strong-owned XCObjects, and + strong-owned XCObjects in lists will be copied instead of having their + references added. + """ if properties is None: return @@ -908,8 +907,8 @@ def AppendProperty(self, key, value): def VerifyHasRequiredProperties(self): """Ensure that all properties identified as required by the schema are - set. - """ + set. + """ # TODO(mark): A stronger verification mechanism is needed. Some # subclasses need to perform validation beyond what the schema can enforce. @@ -920,7 +919,7 @@ def VerifyHasRequiredProperties(self): def _SetDefaultsFromSchema(self): """Assign object default values according to the schema. This will not - overwrite properties that have already been set.""" + overwrite properties that have already been set.""" defaults = {} for property, attributes in self._schema.items(): @@ -942,7 +941,7 @@ def _SetDefaultsFromSchema(self): class XCHierarchicalElement(XCObject): """Abstract base for PBXGroup and PBXFileReference. Not represented in a - project file.""" + project file.""" # TODO(mark): Do name and path belong here? Probably so. # If path is set and name is not, name may have a default value. Name will @@ -1008,27 +1007,27 @@ def Name(self): def Hashables(self): """Custom hashables for XCHierarchicalElements. - XCHierarchicalElements are special. Generally, their hashes shouldn't - change if the paths don't change. The normal XCObject implementation of - Hashables adds a hashable for each object, which means that if - the hierarchical structure changes (possibly due to changes caused when - TakeOverOnlyChild runs and encounters slight changes in the hierarchy), - the hashes will change. For example, if a project file initially contains - a/b/f1 and a/b becomes collapsed into a/b, f1 will have a single parent - a/b. If someone later adds a/f2 to the project file, a/b can no longer be - collapsed, and f1 winds up with parent b and grandparent a. That would - be sufficient to change f1's hash. 
- - To counteract this problem, hashables for all XCHierarchicalElements except - for the main group (which has neither a name nor a path) are taken to be - just the set of path components. Because hashables are inherited from - parents, this provides assurance that a/b/f1 has the same set of hashables - whether its parent is b or a/b. - - The main group is a special case. As it is permitted to have no name or - path, it is permitted to use the standard XCObject hash mechanism. This - is not considered a problem because there can be only one main group. - """ + XCHierarchicalElements are special. Generally, their hashes shouldn't + change if the paths don't change. The normal XCObject implementation of + Hashables adds a hashable for each object, which means that if + the hierarchical structure changes (possibly due to changes caused when + TakeOverOnlyChild runs and encounters slight changes in the hierarchy), + the hashes will change. For example, if a project file initially contains + a/b/f1 and a/b becomes collapsed into a/b, f1 will have a single parent + a/b. If someone later adds a/f2 to the project file, a/b can no longer be + collapsed, and f1 winds up with parent b and grandparent a. That would + be sufficient to change f1's hash. + + To counteract this problem, hashables for all XCHierarchicalElements except + for the main group (which has neither a name nor a path) are taken to be + just the set of path components. Because hashables are inherited from + parents, this provides assurance that a/b/f1 has the same set of hashables + whether its parent is b or a/b. + + The main group is a special case. As it is permitted to have no name or + path, it is permitted to use the standard XCObject hash mechanism. This + is not considered a problem because there can be only one main group. + """ if self == self.PBXProjectAncestor()._properties["mainGroup"]: # super @@ -1157,12 +1156,12 @@ def FullPath(self): class PBXGroup(XCHierarchicalElement): """ - Attributes: - _children_by_path: Maps pathnames of children of this PBXGroup to the - actual child XCHierarchicalElement objects. - _variant_children_by_name_and_path: Maps (name, path) tuples of - PBXVariantGroup children to the actual child PBXVariantGroup objects. - """ + Attributes: + _children_by_path: Maps pathnames of children of this PBXGroup to the + actual child XCHierarchicalElement objects. + _variant_children_by_name_and_path: Maps (name, path) tuples of + PBXVariantGroup children to the actual child PBXVariantGroup objects. + """ _schema = XCHierarchicalElement._schema.copy() _schema.update( @@ -1281,20 +1280,20 @@ def GetChildByRemoteObject(self, remote_object): def AddOrGetFileByPath(self, path, hierarchical): """Returns an existing or new file reference corresponding to path. - If hierarchical is True, this method will create or use the necessary - hierarchical group structure corresponding to path. Otherwise, it will - look in and create an item in the current group only. + If hierarchical is True, this method will create or use the necessary + hierarchical group structure corresponding to path. Otherwise, it will + look in and create an item in the current group only. - If an existing matching reference is found, it is returned, otherwise, a - new one will be created, added to the correct group, and returned. + If an existing matching reference is found, it is returned, otherwise, a + new one will be created, added to the correct group, and returned. 
- If path identifies a directory by virtue of carrying a trailing slash, - this method returns a PBXFileReference of "folder" type. If path - identifies a variant, by virtue of it identifying a file inside a directory - with an ".lproj" extension, this method returns a PBXVariantGroup - containing the variant named by path, and possibly other variants. For - all other paths, a "normal" PBXFileReference will be returned. - """ + If path identifies a directory by virtue of carrying a trailing slash, + this method returns a PBXFileReference of "folder" type. If path + identifies a variant, by virtue of it identifying a file inside a directory + with an ".lproj" extension, this method returns a PBXVariantGroup + containing the variant named by path, and possibly other variants. For + all other paths, a "normal" PBXFileReference will be returned. + """ # Adding or getting a directory? Directories end with a trailing slash. is_dir = False @@ -1379,15 +1378,15 @@ def AddOrGetFileByPath(self, path, hierarchical): def AddOrGetVariantGroupByNameAndPath(self, name, path): """Returns an existing or new PBXVariantGroup for name and path. - If a PBXVariantGroup identified by the name and path arguments is already - present as a child of this object, it is returned. Otherwise, a new - PBXVariantGroup with the correct properties is created, added as a child, - and returned. + If a PBXVariantGroup identified by the name and path arguments is already + present as a child of this object, it is returned. Otherwise, a new + PBXVariantGroup with the correct properties is created, added as a child, + and returned. - This method will generally be called by AddOrGetFileByPath, which knows - when to create a variant group based on the structure of the pathnames - passed to it. - """ + This method will generally be called by AddOrGetFileByPath, which knows + when to create a variant group based on the structure of the pathnames + passed to it. + """ key = (name, path) if key in self._variant_children_by_name_and_path: @@ -1405,19 +1404,19 @@ def AddOrGetVariantGroupByNameAndPath(self, name, path): def TakeOverOnlyChild(self, recurse=False): """If this PBXGroup has only one child and it's also a PBXGroup, take - it over by making all of its children this object's children. - - This function will continue to take over only children when those children - are groups. If there are three PBXGroups representing a, b, and c, with - c inside b and b inside a, and a and b have no other children, this will - result in a taking over both b and c, forming a PBXGroup for a/b/c. - - If recurse is True, this function will recurse into children and ask them - to collapse themselves by taking over only children as well. Assuming - an example hierarchy with files at a/b/c/d1, a/b/c/d2, and a/b/c/d3/e/f - (d1, d2, and f are files, the rest are groups), recursion will result in - a group for a/b/c containing a group for d3/e. - """ + it over by making all of its children this object's children. + + This function will continue to take over only children when those children + are groups. If there are three PBXGroups representing a, b, and c, with + c inside b and b inside a, and a and b have no other children, this will + result in a taking over both b and c, forming a PBXGroup for a/b/c. + + If recurse is True, this function will recurse into children and ask them + to collapse themselves by taking over only children as well. 
Assuming + an example hierarchy with files at a/b/c/d1, a/b/c/d2, and a/b/c/d3/e/f + (d1, d2, and f are files, the rest are groups), recursion will result in + a group for a/b/c containing a group for d3/e. + """ # At this stage, check that child class types are PBXGroup exactly, # instead of using isinstance. The only subclass of PBXGroup, @@ -1716,16 +1715,16 @@ def DefaultConfiguration(self): def HasBuildSetting(self, key): """Determines the state of a build setting in all XCBuildConfiguration - child objects. + child objects. - If all child objects have key in their build settings, and the value is the - same in all child objects, returns 1. + If all child objects have key in their build settings, and the value is the + same in all child objects, returns 1. - If no child objects have the key in their build settings, returns 0. + If no child objects have the key in their build settings, returns 0. - If some, but not all, child objects have the key in their build settings, - or if any children have different values for the key, returns -1. - """ + If some, but not all, child objects have the key in their build settings, + or if any children have different values for the key, returns -1. + """ has = None value = None @@ -1751,9 +1750,9 @@ def HasBuildSetting(self, key): def GetBuildSetting(self, key): """Gets the build setting for key. - All child XCConfiguration objects must have the same value set for the - setting, or a ValueError will be raised. - """ + All child XCConfiguration objects must have the same value set for the + setting, or a ValueError will be raised. + """ # TODO(mark): This is wrong for build settings that are lists. The list # contents should be compared (and a list copy returned?) @@ -1770,31 +1769,30 @@ def GetBuildSetting(self, key): def SetBuildSetting(self, key, value): """Sets the build setting for key to value in all child - XCBuildConfiguration objects. - """ + XCBuildConfiguration objects. + """ for configuration in self._properties["buildConfigurations"]: configuration.SetBuildSetting(key, value) def AppendBuildSetting(self, key, value): """Appends value to the build setting for key, which is treated as a list, - in all child XCBuildConfiguration objects. - """ + in all child XCBuildConfiguration objects. + """ for configuration in self._properties["buildConfigurations"]: configuration.AppendBuildSetting(key, value) def DelBuildSetting(self, key): """Deletes the build setting key from all child XCBuildConfiguration - objects. - """ + objects. + """ for configuration in self._properties["buildConfigurations"]: configuration.DelBuildSetting(key) def SetBaseConfiguration(self, value): - """Sets the build configuration in all child XCBuildConfiguration objects. - """ + """Sets the build configuration in all child XCBuildConfiguration objects.""" for configuration in self._properties["buildConfigurations"]: configuration.SetBaseConfiguration(value) @@ -1834,14 +1832,14 @@ def Hashables(self): class XCBuildPhase(XCObject): """Abstract base for build phase classes. Not represented in a project - file. + file. - Attributes: - _files_by_path: A dict mapping each path of a child in the files list by - path (keys) to the corresponding PBXBuildFile children (values). - _files_by_xcfilelikeelement: A dict mapping each XCFileLikeElement (keys) - to the corresponding PBXBuildFile children (values). - """ + Attributes: + _files_by_path: A dict mapping each path of a child in the files list by + path (keys) to the corresponding PBXBuildFile children (values). 
+ _files_by_xcfilelikeelement: A dict mapping each XCFileLikeElement (keys) + to the corresponding PBXBuildFile children (values). + """ # TODO(mark): Some build phase types, like PBXShellScriptBuildPhase, don't # actually have a "files" list. XCBuildPhase should not have "files" but @@ -1880,8 +1878,8 @@ def FileGroup(self, path): def _AddPathToDict(self, pbxbuildfile, path): """Adds path to the dict tracking paths belonging to this build phase. - If the path is already a member of this build phase, raises an exception. - """ + If the path is already a member of this build phase, raises an exception. + """ if path in self._files_by_path: raise ValueError("Found multiple build files with path " + path) @@ -1890,28 +1888,28 @@ def _AddPathToDict(self, pbxbuildfile, path): def _AddBuildFileToDicts(self, pbxbuildfile, path=None): """Maintains the _files_by_path and _files_by_xcfilelikeelement dicts. - If path is specified, then it is the path that is being added to the - phase, and pbxbuildfile must contain either a PBXFileReference directly - referencing that path, or it must contain a PBXVariantGroup that itself - contains a PBXFileReference referencing the path. - - If path is not specified, either the PBXFileReference's path or the paths - of all children of the PBXVariantGroup are taken as being added to the - phase. - - If the path is already present in the phase, raises an exception. - - If the PBXFileReference or PBXVariantGroup referenced by pbxbuildfile - are already present in the phase, referenced by a different PBXBuildFile - object, raises an exception. This does not raise an exception when - a PBXFileReference or PBXVariantGroup reappear and are referenced by the - same PBXBuildFile that has already introduced them, because in the case - of PBXVariantGroup objects, they may correspond to multiple paths that are - not all added simultaneously. When this situation occurs, the path needs - to be added to _files_by_path, but nothing needs to change in - _files_by_xcfilelikeelement, and the caller should have avoided adding - the PBXBuildFile if it is already present in the list of children. - """ + If path is specified, then it is the path that is being added to the + phase, and pbxbuildfile must contain either a PBXFileReference directly + referencing that path, or it must contain a PBXVariantGroup that itself + contains a PBXFileReference referencing the path. + + If path is not specified, either the PBXFileReference's path or the paths + of all children of the PBXVariantGroup are taken as being added to the + phase. + + If the path is already present in the phase, raises an exception. + + If the PBXFileReference or PBXVariantGroup referenced by pbxbuildfile + are already present in the phase, referenced by a different PBXBuildFile + object, raises an exception. This does not raise an exception when + a PBXFileReference or PBXVariantGroup reappear and are referenced by the + same PBXBuildFile that has already introduced them, because in the case + of PBXVariantGroup objects, they may correspond to multiple paths that are + not all added simultaneously. When this situation occurs, the path needs + to be added to _files_by_path, but nothing needs to change in + _files_by_xcfilelikeelement, and the caller should have avoided adding + the PBXBuildFile if it is already present in the list of children. 
+ """ xcfilelikeelement = pbxbuildfile._properties["fileRef"] @@ -2102,9 +2100,9 @@ def FileGroup(self, path): def SetDestination(self, path): """Set the dstSubfolderSpec and dstPath properties from path. - path may be specified in the same notation used for XCHierarchicalElements, - specifically, "$(DIR)/path". - """ + path may be specified in the same notation used for XCHierarchicalElements, + specifically, "$(DIR)/path". + """ if path_tree_match := self.path_tree_re.search(path): path_tree = path_tree_match.group(1) @@ -2178,9 +2176,7 @@ def SetDestination(self, path): subfolder = 0 relative_path = path[1:] else: - raise ValueError( - f"Can't use path {path} in a {self.__class__.__name__}" - ) + raise ValueError(f"Can't use path {path} in a {self.__class__.__name__}") self._properties["dstPath"] = relative_path self._properties["dstSubfolderSpec"] = subfolder @@ -2530,9 +2526,9 @@ def __init__( # loadable modules, but there's precedent: Python loadable modules on # Mac OS X use an .so extension. if self._properties["productType"] == "com.googlecode.gyp.xcode.bundle": - self._properties[ - "productType" - ] = "com.apple.product-type.library.dynamic" + self._properties["productType"] = ( + "com.apple.product-type.library.dynamic" + ) self.SetBuildSetting("MACH_O_TYPE", "mh_bundle") self.SetBuildSetting("DYLIB_CURRENT_VERSION", "") self.SetBuildSetting("DYLIB_COMPATIBILITY_VERSION", "") @@ -2540,9 +2536,10 @@ def __init__( force_extension = suffix[1:] if ( - self._properties["productType"] in { + self._properties["productType"] + in { "com.apple.product-type-bundle.unit.test", - "com.apple.product-type-bundle.ui-testing" + "com.apple.product-type-bundle.ui-testing", } ) and force_extension is None: force_extension = suffix[1:] @@ -2694,10 +2691,8 @@ def AddDependency(self, other): other._properties["productType"] == static_library_type or ( ( - other._properties["productType"] in { - shared_library_type, - framework_type - } + other._properties["productType"] + in {shared_library_type, framework_type} ) and ( (not other.HasBuildSetting("MACH_O_TYPE")) @@ -2706,7 +2701,6 @@ def AddDependency(self, other): ) ) ): - file_ref = other.GetProperty("productReference") pbxproject = self.PBXProjectAncestor() @@ -2732,13 +2726,13 @@ class PBXProject(XCContainerPortal): # PBXContainerItemProxy. """ - Attributes: - path: "sample.xcodeproj". TODO(mark) Document me! - _other_pbxprojects: A dictionary, keyed by other PBXProject objects. Each - value is a reference to the dict in the - projectReferences list associated with the keyed - PBXProject. - """ + Attributes: + path: "sample.xcodeproj". TODO(mark) Document me! + _other_pbxprojects: A dictionary, keyed by other PBXProject objects. Each + value is a reference to the dict in the + projectReferences list associated with the keyed + PBXProject. + """ _schema = XCContainerPortal._schema.copy() _schema.update( @@ -2833,17 +2827,17 @@ def ProjectsGroup(self): def RootGroupForPath(self, path): """Returns a PBXGroup child of this object to which path should be added. - This method is intended to choose between SourceGroup and - IntermediatesGroup on the basis of whether path is present in a source - directory or an intermediates directory. For the purposes of this - determination, any path located within a derived file directory such as - PROJECT_DERIVED_FILE_DIR is treated as being in an intermediates - directory. 
+ This method is intended to choose between SourceGroup and + IntermediatesGroup on the basis of whether path is present in a source + directory or an intermediates directory. For the purposes of this + determination, any path located within a derived file directory such as + PROJECT_DERIVED_FILE_DIR is treated as being in an intermediates + directory. - The returned value is a two-element tuple. The first element is the - PBXGroup, and the second element specifies whether that group should be - organized hierarchically (True) or as a single flat list (False). - """ + The returned value is a two-element tuple. The first element is the + PBXGroup, and the second element specifies whether that group should be + organized hierarchically (True) or as a single flat list (False). + """ # TODO(mark): make this a class variable and bind to self on call? # Also, this list is nowhere near exhaustive. @@ -2869,11 +2863,11 @@ def RootGroupForPath(self, path): def AddOrGetFileInRootGroup(self, path): """Returns a PBXFileReference corresponding to path in the correct group - according to RootGroupForPath's heuristics. + according to RootGroupForPath's heuristics. - If an existing PBXFileReference for path exists, it will be returned. - Otherwise, one will be created and returned. - """ + If an existing PBXFileReference for path exists, it will be returned. + Otherwise, one will be created and returned. + """ (group, hierarchical) = self.RootGroupForPath(path) return group.AddOrGetFileByPath(path, hierarchical) @@ -2923,17 +2917,17 @@ def SortGroups(self): def AddOrGetProjectReference(self, other_pbxproject): """Add a reference to another project file (via PBXProject object) to this - one. + one. - Returns [ProductGroup, ProjectRef]. ProductGroup is a PBXGroup object in - this project file that contains a PBXReferenceProxy object for each - product of each PBXNativeTarget in the other project file. ProjectRef is - a PBXFileReference to the other project file. + Returns [ProductGroup, ProjectRef]. ProductGroup is a PBXGroup object in + this project file that contains a PBXReferenceProxy object for each + product of each PBXNativeTarget in the other project file. ProjectRef is + a PBXFileReference to the other project file. - If this project file already references the other project file, the - existing ProductGroup and ProjectRef are returned. The ProductGroup will - still be updated if necessary. - """ + If this project file already references the other project file, the + existing ProductGroup and ProjectRef are returned. The ProductGroup will + still be updated if necessary. + """ if "projectReferences" not in self._properties: self._properties["projectReferences"] = [] @@ -2985,7 +2979,7 @@ def AddOrGetProjectReference(self, other_pbxproject): # Xcode seems to sort this list case-insensitively self._properties["projectReferences"] = sorted( self._properties["projectReferences"], - key=lambda x: x["ProjectRef"].Name().lower() + key=lambda x: x["ProjectRef"].Name().lower(), ) else: # The link already exists. Pull out the relevant data. @@ -3010,11 +3004,8 @@ def _AllSymrootsUnique(self, target, inherit_unique_symroot): # define an explicit value for 'SYMROOT'. 
symroots = self._DefinedSymroots(target) for s in self._DefinedSymroots(target): - if ( - (s is not None - and not self._IsUniqueSymrootForTarget(s)) - or (s is None - and not inherit_unique_symroot) + if (s is not None and not self._IsUniqueSymrootForTarget(s)) or ( + s is None and not inherit_unique_symroot ): return False return True if symroots else inherit_unique_symroot @@ -3118,7 +3109,8 @@ def CompareProducts(x, y, remote_products): product_group._properties["children"] = sorted( product_group._properties["children"], key=cmp_to_key( - lambda x, y, rp=remote_products: CompareProducts(x, y, rp)), + lambda x, y, rp=remote_products: CompareProducts(x, y, rp) + ), ) @@ -3152,9 +3144,7 @@ def Print(self, file=sys.stdout): self._XCPrint(file, 0, "{ ") else: self._XCPrint(file, 0, "{\n") - for property, value in sorted( - self._properties.items() - ): + for property, value in sorted(self._properties.items()): if property == "objects": self._PrintObjects(file) else: @@ -3180,9 +3170,7 @@ def _PrintObjects(self, file): for class_name in sorted(objects_by_class): self._XCPrint(file, 0, "\n") self._XCPrint(file, 0, "/* Begin " + class_name + " section */\n") - for object in sorted( - objects_by_class[class_name], key=attrgetter("id") - ): + for object in sorted(objects_by_class[class_name], key=attrgetter("id")): object.Print(file) self._XCPrint(file, 0, "/* End " + class_name + " section */\n") diff --git a/tools/gyp/pylib/gyp/xml_fix.py b/tools/gyp/pylib/gyp/xml_fix.py index 530196366946d8..d7e3b5a95604f7 100644 --- a/tools/gyp/pylib/gyp/xml_fix.py +++ b/tools/gyp/pylib/gyp/xml_fix.py @@ -9,7 +9,6 @@ TODO(bradnelson): Consider dropping this when we drop XP support. """ - import xml.dom.minidom diff --git a/tools/gyp/pyproject.toml b/tools/gyp/pyproject.toml index b233d8504df687..3a029c4fc5140c 100644 --- a/tools/gyp/pyproject.toml +++ b/tools/gyp/pyproject.toml @@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta" [project] name = "gyp-next" -version = "0.20.3" +version = "0.20.4" authors = [ { name="Node.js contributors", email="ryzokuken@disroot.org" }, ] diff --git a/tools/gyp/test_gyp.py b/tools/gyp/test_gyp.py index 8e910a2b76290d..70c81ae8ca3bf9 100755 --- a/tools/gyp/test_gyp.py +++ b/tools/gyp/test_gyp.py @@ -5,7 +5,6 @@ """gyptest.py -- test runner for GYP tests.""" - import argparse import os import platform diff --git a/tools/gyp/tools/graphviz.py b/tools/gyp/tools/graphviz.py index f19426b69faa6d..ed1c7ab3cd10b5 100755 --- a/tools/gyp/tools/graphviz.py +++ b/tools/gyp/tools/graphviz.py @@ -8,7 +8,6 @@ generate input suitable for graphviz to render a dependency graph of targets.""" - import collections import json import sys @@ -22,7 +21,7 @@ def ParseTarget(target): def LoadEdges(filename, targets): """Load the edges map from the dump file, and filter it to only - show targets in |targets| and their depedendents.""" + show targets in |targets| and their depedendents.""" file = open("dump.json") edges = json.load(file) @@ -43,7 +42,7 @@ def LoadEdges(filename, targets): def WriteGraph(edges): """Print a graphviz graph to stdout. - |edges| is a map of target to a list of other targets it depends on.""" + |edges| is a map of target to a list of other targets it depends on.""" # Bucket targets by file. files = collections.defaultdict(list) @@ -64,9 +63,7 @@ def WriteGraph(edges): # the display by making it a box without an internal node. 
target = targets[0] build_file, target_name, toolset = ParseTarget(target) - print( - f' "{target}" [shape=box, label="{filename}\\n{target_name}"]' - ) + print(f' "{target}" [shape=box, label="{filename}\\n{target_name}"]') else: # Group multiple nodes together in a subgraph. print(' subgraph "cluster_%s" {' % filename) diff --git a/tools/gyp/tools/pretty_gyp.py b/tools/gyp/tools/pretty_gyp.py index a023887205c719..562a73ee672aba 100755 --- a/tools/gyp/tools/pretty_gyp.py +++ b/tools/gyp/tools/pretty_gyp.py @@ -6,7 +6,6 @@ """Pretty-prints the contents of a GYP file.""" - import re import sys @@ -49,7 +48,7 @@ def mask_quotes(input): def do_split(input, masked_input, search_re): output = [] mask_output = [] - for (line, masked_line) in zip(input, masked_input): + for line, masked_line in zip(input, masked_input): m = search_re.match(masked_line) while m: split = len(m.group(1)) @@ -63,13 +62,13 @@ def do_split(input, masked_input, search_re): def split_double_braces(input): """Masks out the quotes and comments, and then splits appropriate - lines (lines that matche the double_*_brace re's above) before - indenting them below. + lines (lines that matche the double_*_brace re's above) before + indenting them below. - These are used to split lines which have multiple braces on them, so - that the indentation looks prettier when all laid out (e.g. closing - braces make a nice diagonal line). - """ + These are used to split lines which have multiple braces on them, so + that the indentation looks prettier when all laid out (e.g. closing + braces make a nice diagonal line). + """ double_open_brace_re = re.compile(r"(.*?[\[\{\(,])(\s*)([\[\{\(])") double_close_brace_re = re.compile(r"(.*?[\]\}\)],?)(\s*)([\]\}\)])") @@ -85,8 +84,8 @@ def split_double_braces(input): def count_braces(line): """keeps track of the number of braces on a given line and returns the result. - It starts at zero and subtracts for closed braces, and adds for open braces. - """ + It starts at zero and subtracts for closed braces, and adds for open braces. + """ open_braces = ["[", "(", "{"] close_braces = ["]", ")", "}"] closing_prefix_re = re.compile(r"[^\s\]\}\)]\s*[\]\}\)]+,?\s*$") diff --git a/tools/gyp/tools/pretty_sln.py b/tools/gyp/tools/pretty_sln.py index 850fb150f4a8be..70c91aefad46ed 100755 --- a/tools/gyp/tools/pretty_sln.py +++ b/tools/gyp/tools/pretty_sln.py @@ -6,13 +6,12 @@ """Prints the information in a sln file in a diffable way. - It first outputs each projects in alphabetical order with their - dependencies. +It first outputs each projects in alphabetical order with their +dependencies. - Then it outputs a possible build order. +Then it outputs a possible build order. 
""" - import os import re import sys @@ -113,7 +112,7 @@ def PrintDependencies(projects, deps): print("---------------------------------------") print("-- --") - for (project, dep_list) in sorted(deps.items()): + for project, dep_list in sorted(deps.items()): print("Project : %s" % project) print("Path : %s" % projects[project][0]) if dep_list: @@ -131,7 +130,7 @@ def PrintBuildOrder(projects, deps): print("-- --") built = [] - for (project, _) in sorted(deps.items()): + for project, _ in sorted(deps.items()): if project not in built: BuildProject(project, built, projects, deps) @@ -139,7 +138,6 @@ def PrintBuildOrder(projects, deps): def PrintVCProj(projects): - for project in projects: print("-------------------------------------") print("-------------------------------------") diff --git a/tools/gyp/tools/pretty_vcproj.py b/tools/gyp/tools/pretty_vcproj.py index c7427ed2d8b805..82d47a0bdd4b35 100755 --- a/tools/gyp/tools/pretty_vcproj.py +++ b/tools/gyp/tools/pretty_vcproj.py @@ -6,13 +6,12 @@ """Make the format of a vcproj really pretty. - This script normalize and sort an xml. It also fetches all the properties - inside linked vsprops and include them explicitly in the vcproj. +This script normalize and sort an xml. It also fetches all the properties +inside linked vsprops and include them explicitly in the vcproj. - It outputs the resulting xml to stdout. +It outputs the resulting xml to stdout. """ - import os import sys from xml.dom.minidom import Node, parse @@ -48,11 +47,11 @@ def get_string(node): node_string += node.getAttribute("Name") all_nodes = [] - for (name, value) in node.attributes.items(): + for name, value in node.attributes.items(): all_nodes.append((name, value)) all_nodes.sort(CmpTuple()) - for (name, value) in all_nodes: + for name, value in all_nodes: node_string += name node_string += value @@ -81,10 +80,10 @@ def PrettyPrintNode(node, indent=0): print("{}<{}".format(" " * indent, node.nodeName)) all_attributes = [] - for (name, value) in node.attributes.items(): + for name, value in node.attributes.items(): all_attributes.append((name, value)) all_attributes.sort(CmpTuple()) - for (name, value) in all_attributes: + for name, value in all_attributes: print('{} {}="{}"'.format(" " * indent, name, value)) print("%s>" % (" " * indent)) if node.nodeValue: @@ -130,7 +129,7 @@ def FixFilenames(filenames, current_directory): def AbsoluteNode(node): """Makes all the properties we know about in this node absolute.""" if node.attributes: - for (name, value) in node.attributes.items(): + for name, value in node.attributes.items(): if name in [ "InheritedPropertySheets", "RelativePath", @@ -163,7 +162,7 @@ def CleanupVcproj(node): # Fix all the semicolon separated attributes to be sorted, and we also # remove the dups. if node.attributes: - for (name, value) in node.attributes.items(): + for name, value in node.attributes.items(): sorted_list = sorted(value.split(";")) unique_list = [] for i in sorted_list: @@ -252,7 +251,7 @@ def MergeAttributes(node1, node2): if not node2.attributes: return - for (name, value2) in node2.attributes.items(): + for name, value2 in node2.attributes.items(): # Don't merge the 'Name' attribute. 
if name == "Name": continue From 90d0a1b2e93a3820a7bce03aeb79df1f91090e4a Mon Sep 17 00:00:00 2001 From: Edy Silva Date: Mon, 1 Sep 2025 23:57:19 -0300 Subject: [PATCH 041/103] src,sqlite: refactor value conversion PR-URL: https://github.com/nodejs/node/pull/59659 Reviewed-By: Anna Henningsen Reviewed-By: Zeyu "Alex" Yang --- src/node_sqlite.cc | 71 ++++++++++++++++++++++------------------------ src/node_sqlite.h | 4 +-- 2 files changed, 36 insertions(+), 39 deletions(-) diff --git a/src/node_sqlite.cc b/src/node_sqlite.cc index e779cc0d782c08..49b7f9a36c85da 100644 --- a/src/node_sqlite.cc +++ b/src/node_sqlite.cc @@ -35,11 +35,14 @@ using v8::HandleScope; using v8::Int32; using v8::Integer; using v8::Isolate; +using v8::JustVoid; using v8::Local; using v8::LocalVector; +using v8::Maybe; using v8::MaybeLocal; using v8::Name; using v8::NewStringType; +using v8::Nothing; using v8::Null; using v8::Number; using v8::Object; @@ -2009,6 +2012,20 @@ MaybeLocal StatementSync::ColumnNameToName(const int column) { void StatementSync::MemoryInfo(MemoryTracker* tracker) const {} +Maybe ExtractRowValues(Isolate* isolate, + int num_cols, + StatementSync* stmt, + LocalVector* row_values) { + row_values->clear(); + row_values->reserve(num_cols); + for (int i = 0; i < num_cols; ++i) { + Local val; + if (!stmt->ColumnToValue(i).ToLocal(&val)) return Nothing(); + row_values->emplace_back(val); + } + return JustVoid(); +} + void StatementSync::All(const FunctionCallbackInfo& args) { StatementSync* stmt; ASSIGN_OR_RETURN_UNWRAP(&stmt, args.This()); @@ -2026,24 +2043,19 @@ void StatementSync::All(const FunctionCallbackInfo& args) { auto reset = OnScopeLeave([&]() { sqlite3_reset(stmt->statement_); }); int num_cols = sqlite3_column_count(stmt->statement_); LocalVector rows(isolate); + LocalVector row_values(isolate); + LocalVector row_keys(isolate); - if (stmt->return_arrays_) { - while ((r = sqlite3_step(stmt->statement_)) == SQLITE_ROW) { - LocalVector array_values(isolate); - array_values.reserve(num_cols); - for (int i = 0; i < num_cols; ++i) { - Local val; - if (!stmt->ColumnToValue(i).ToLocal(&val)) return; - array_values.emplace_back(val); - } + while ((r = sqlite3_step(stmt->statement_)) == SQLITE_ROW) { + auto maybe_row_values = + ExtractRowValues(env->isolate(), num_cols, stmt, &row_values); + if (maybe_row_values.IsNothing()) return; + + if (stmt->return_arrays_) { Local row_array = - Array::New(isolate, array_values.data(), array_values.size()); + Array::New(isolate, row_values.data(), row_values.size()); rows.emplace_back(row_array); - } - } else { - LocalVector row_keys(isolate); - - while ((r = sqlite3_step(stmt->statement_)) == SQLITE_ROW) { + } else { if (row_keys.size() == 0) { row_keys.reserve(num_cols); for (int i = 0; i < num_cols; ++i) { @@ -2053,14 +2065,6 @@ void StatementSync::All(const FunctionCallbackInfo& args) { } } - LocalVector row_values(isolate); - row_values.reserve(num_cols); - for (int i = 0; i < num_cols; ++i) { - Local val; - if (!stmt->ColumnToValue(i).ToLocal(&val)) return; - row_values.emplace_back(val); - } - DCHECK_EQ(row_keys.size(), row_values.size()); Local row_obj = Object::New( isolate, Null(isolate), row_keys.data(), row_values.data(), num_cols); @@ -2538,28 +2542,21 @@ void StatementSyncIterator::Next(const FunctionCallbackInfo& args) { int num_cols = sqlite3_column_count(iter->stmt_->statement_); Local row_value; + LocalVector row_keys(isolate); + LocalVector row_values(isolate); + + auto maybe_row_values = + ExtractRowValues(isolate, num_cols, 
iter->stmt_.get(), &row_values); + if (maybe_row_values.IsNothing()) return; if (iter->stmt_->return_arrays_) { - LocalVector array_values(isolate); - array_values.reserve(num_cols); - for (int i = 0; i < num_cols; ++i) { - Local val; - if (!iter->stmt_->ColumnToValue(i).ToLocal(&val)) return; - array_values.emplace_back(val); - } - row_value = Array::New(isolate, array_values.data(), array_values.size()); + row_value = Array::New(isolate, row_values.data(), row_values.size()); } else { - LocalVector row_keys(isolate); - LocalVector row_values(isolate); row_keys.reserve(num_cols); - row_values.reserve(num_cols); for (int i = 0; i < num_cols; ++i) { Local key; if (!iter->stmt_->ColumnNameToName(i).ToLocal(&key)) return; - Local val; - if (!iter->stmt_->ColumnToValue(i).ToLocal(&val)) return; row_keys.emplace_back(key); - row_values.emplace_back(val); } DCHECK_EQ(row_keys.size(), row_values.size()); diff --git a/src/node_sqlite.h b/src/node_sqlite.h index 34845accf3f782..3a9f08c16573b2 100644 --- a/src/node_sqlite.h +++ b/src/node_sqlite.h @@ -173,6 +173,8 @@ class StatementSync : public BaseObject { const v8::FunctionCallbackInfo& args); static void SetReadBigInts(const v8::FunctionCallbackInfo& args); static void SetReturnArrays(const v8::FunctionCallbackInfo& args); + v8::MaybeLocal ColumnToValue(const int column); + v8::MaybeLocal ColumnNameToName(const int column); void Finalize(); bool IsFinalized(); @@ -190,8 +192,6 @@ class StatementSync : public BaseObject { std::optional> bare_named_params_; bool BindParams(const v8::FunctionCallbackInfo& args); bool BindValue(const v8::Local& value, const int index); - v8::MaybeLocal ColumnToValue(const int column); - v8::MaybeLocal ColumnNameToName(const int column); friend class StatementSyncIterator; }; From 15fa779ac535fde135411fbe4fda935dcaac1232 Mon Sep 17 00:00:00 2001 From: Chengzhong Wu Date: Tue, 2 Sep 2025 11:19:25 +0100 Subject: [PATCH 042/103] src: fix race on process exit and off thread CA loading When calling `process.exit()` or on uncaught exceptions as soon as the process starts, the process will try to terminate immediately. In this case, there could be a race condition on the unfinished off-thread system CA loader which tries to access the OpenSSL API which has been de-inited on the main thread. 
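As an illustration of the required ordering, here is a minimal sketch
(the loader handle below is hypothetical; only
`CleanupCachedRootCertificates()` is the actual entry point touched by
this change): the exit path has to wait for any in-flight off-thread
loading before OpenSSL is torn down.

    // Illustrative sketch only: serialize process teardown against the
    // background loader so that it never touches OpenSSL after de-init.
    #include <future>

    std::future<void> ca_loading;  // hypothetical handle to the off-thread load

    void CleanupCachedRootCertificates() {
      if (ca_loading.valid()) {
        ca_loading.wait();  // block until the off-thread loader has finished
      }
      // ...release any cached certificates while OpenSSL is still usable...
    }
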
PR-URL: https://github.com/nodejs/node/pull/59632 Refs: https://github.com/nodejs/node/pull/59550 Reviewed-By: Joyee Cheung --- src/api/environment.cc | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/src/api/environment.cc b/src/api/environment.cc index be2745fb44759a..d10f861c96931d 100644 --- a/src/api/environment.cc +++ b/src/api/environment.cc @@ -1,4 +1,7 @@ #include +#if HAVE_OPENSSL +#include "crypto/crypto_util.h" +#endif // HAVE_OPENSSL #include "env_properties.h" #include "node.h" #include "node_builtins.h" @@ -1024,6 +1027,11 @@ void DefaultProcessExitHandlerInternal(Environment* env, ExitCode exit_code) { // in node_v8_platform-inl.h uv_library_shutdown(); DisposePlatform(); + +#if HAVE_OPENSSL + crypto::CleanupCachedRootCertificates(); +#endif // HAVE_OPENSSL + Exit(exit_code); } From 4a317150d5aede8dee12c2ba52e6a65316c5ce95 Mon Sep 17 00:00:00 2001 From: hqzing Date: Wed, 3 Sep 2025 00:27:04 +0800 Subject: [PATCH 043/103] build: fix 'implicit-function-declaration' on OpenHarmony platform MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59547 Reviewed-By: Michaël Zasso Reviewed-By: Luigi Pinca Reviewed-By: Ulises Gascón Reviewed-By: Rafael Gonzaga --- deps/uvwasi/uvwasi.gyp | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/deps/uvwasi/uvwasi.gyp b/deps/uvwasi/uvwasi.gyp index 4dfe3c46d51818..1b1cd2a7299439 100644 --- a/deps/uvwasi/uvwasi.gyp +++ b/deps/uvwasi/uvwasi.gyp @@ -28,7 +28,7 @@ 'include_dirs': ['include'] }, 'conditions': [ - [ 'OS=="linux"', { + [ 'OS=="linux" or OS=="openharmony"', { 'defines': [ '_GNU_SOURCE', '_POSIX_C_SOURCE=200112', From e7bf712c5702aec11df9bf9a0e3d7b0c2e7152ae Mon Sep 17 00:00:00 2001 From: Josh Kelley Date: Tue, 2 Sep 2025 14:12:17 -0400 Subject: [PATCH 044/103] doc: update "Type stripping in dependencies" section The docs state, "To discourage package authors from publishing packages written in TypeScript, Node.js will _by default_ refuse to handle TypeScript files inside folders under a `node_modules` path" (emphasis added). This suggests that there's a way to override that default. However, as far as I can tell, there is not. PR-URL: https://github.com/nodejs/node/pull/59652 Reviewed-By: Marco Ippolito Reviewed-By: Zeyu "Alex" Yang Reviewed-By: Luigi Pinca Reviewed-By: Ethan Arrowood --- doc/api/typescript.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/doc/api/typescript.md b/doc/api/typescript.md index 642e3f26db7cb3..dbfc286d5f1ca4 100644 --- a/doc/api/typescript.md +++ b/doc/api/typescript.md @@ -202,8 +202,8 @@ are enabled by default. ### Type stripping in dependencies To discourage package authors from publishing packages written in TypeScript, -Node.js will by default refuse to handle TypeScript files inside folders under -a `node_modules` path. +Node.js refuses to handle TypeScript files inside folders under a `node_modules` +path. 
### Paths aliases From 7bbbcf66660b9fc3e9d7aa5cbdb55e8abb77c335 Mon Sep 17 00:00:00 2001 From: Bruno Rodrigues Date: Wed, 3 Sep 2025 16:18:05 -0300 Subject: [PATCH 045/103] benchmark: sqlite prevent create both tables on prepare selects MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59709 Reviewed-By: Vinícius Lourenço Claro Cardoso Reviewed-By: Rafael Gonzaga --- benchmark/sqlite/sqlite-prepare-select-all.js | 45 +++++++++++-------- benchmark/sqlite/sqlite-prepare-select-get.js | 45 +++++++++++-------- 2 files changed, 52 insertions(+), 38 deletions(-) diff --git a/benchmark/sqlite/sqlite-prepare-select-all.js b/benchmark/sqlite/sqlite-prepare-select-all.js index e7ea882f16fd83..c8487dca471b84 100644 --- a/benchmark/sqlite/sqlite-prepare-select-all.js +++ b/benchmark/sqlite/sqlite-prepare-select-all.js @@ -26,25 +26,33 @@ const bench = common.createBenchmark(main, { function main(conf) { const db = new sqlite.DatabaseSync(':memory:'); - db.exec('CREATE TABLE foo (text_column TEXT, integer_column INTEGER, real_column REAL, blob_column BLOB)'); - const fooInsertStatement = db.prepare( - 'INSERT INTO foo (text_column, integer_column, real_column, blob_column) VALUES (?, ?, ?, ?)', - ); - - for (let i = 0; i < conf.tableSeedSize; i++) { - fooInsertStatement.run( - crypto.randomUUID(), - Math.floor(Math.random() * 100), - Math.random(), - Buffer.from('example blob data'), + // Create only the necessary table for the benchmark type. + // If the statement includes 'foo_large', create the foo_large table; otherwise, create the foo table. + if (conf.statement.includes('foo_large')) { + db.exec('CREATE TABLE foo_large (text_8kb_column TEXT)'); + const fooLargeInsertStatement = db.prepare( + 'INSERT INTO foo_large (text_8kb_column) VALUES (?)', + ); + const largeText = 'a'.repeat(8 * 1024); + for (let i = 0; i < conf.tableSeedSize; i++) { + fooLargeInsertStatement.run(largeText); + } + } else { + db.exec( + 'CREATE TABLE foo (text_column TEXT, integer_column INTEGER, real_column REAL, blob_column BLOB)', + ); + const fooInsertStatement = db.prepare( + 'INSERT INTO foo (text_column, integer_column, real_column, blob_column) VALUES (?, ?, ?, ?)', ); - } - db.exec('CREATE TABLE foo_large (text_8kb_column TEXT)'); - const fooLargeInsertStatement = db.prepare('INSERT INTO foo_large (text_8kb_column) VALUES (?)'); - const largeText = 'a'.repeat(8 * 1024); - for (let i = 0; i < conf.tableSeedSize; i++) { - fooLargeInsertStatement.run(largeText); + for (let i = 0; i < conf.tableSeedSize; i++) { + fooInsertStatement.run( + crypto.randomUUID(), + Math.floor(Math.random() * 100), + Math.random(), + Buffer.from('example blob data'), + ); + } } let i; @@ -53,8 +61,7 @@ function main(conf) { const stmt = db.prepare(conf.statement); bench.start(); - for (i = 0; i < conf.n; i += 1) - deadCodeElimination = stmt.all(); + for (i = 0; i < conf.n; i += 1) deadCodeElimination = stmt.all(); bench.end(conf.n); assert.ok(deadCodeElimination !== undefined); diff --git a/benchmark/sqlite/sqlite-prepare-select-get.js b/benchmark/sqlite/sqlite-prepare-select-get.js index 2308fe8947654b..0fff29ce5686da 100644 --- a/benchmark/sqlite/sqlite-prepare-select-get.js +++ b/benchmark/sqlite/sqlite-prepare-select-get.js @@ -20,25 +20,33 @@ const bench = common.createBenchmark(main, { function main(conf) { const db = new sqlite.DatabaseSync(':memory:'); - db.exec('CREATE TABLE foo (text_column TEXT, integer_column INTEGER, real_column REAL, blob_column 
BLOB)'); - const fooInsertStatement = db.prepare( - 'INSERT INTO foo (text_column, integer_column, real_column, blob_column) VALUES (?, ?, ?, ?)', - ); - - for (let i = 0; i < conf.tableSeedSize; i++) { - fooInsertStatement.run( - crypto.randomUUID(), - Math.floor(Math.random() * 100), - Math.random(), - Buffer.from('example blob data'), + // Create only the necessary table for the benchmark type. + // If the statement includes 'foo_large', create the foo_large table; otherwise, create the foo table. + if (conf.statement.includes('foo_large')) { + db.exec('CREATE TABLE foo_large (text_8kb_column TEXT)'); + const fooLargeInsertStatement = db.prepare( + 'INSERT INTO foo_large (text_8kb_column) VALUES (?)', + ); + const largeText = 'a'.repeat(8 * 1024); + for (let i = 0; i < conf.tableSeedSize; i++) { + fooLargeInsertStatement.run(largeText); + } + } else { + db.exec( + 'CREATE TABLE foo (text_column TEXT, integer_column INTEGER, real_column REAL, blob_column BLOB)', + ); + const fooInsertStatement = db.prepare( + 'INSERT INTO foo (text_column, integer_column, real_column, blob_column) VALUES (?, ?, ?, ?)', ); - } - db.exec('CREATE TABLE foo_large (text_8kb_column TEXT)'); - const fooLargeInsertStatement = db.prepare('INSERT INTO foo_large (text_8kb_column) VALUES (?)'); - const largeText = 'a'.repeat(8 * 1024); - for (let i = 0; i < conf.tableSeedSize; i++) { - fooLargeInsertStatement.run(largeText); + for (let i = 0; i < conf.tableSeedSize; i++) { + fooInsertStatement.run( + crypto.randomUUID(), + Math.floor(Math.random() * 100), + Math.random(), + Buffer.from('example blob data'), + ); + } } let i; @@ -47,8 +55,7 @@ function main(conf) { const stmt = db.prepare(conf.statement); bench.start(); - for (i = 0; i < conf.n; i += 1) - deadCodeElimination = stmt.get(); + for (i = 0; i < conf.n; i += 1) deadCodeElimination = stmt.get(); bench.end(conf.n); assert.ok(deadCodeElimination !== undefined); From 34d752586fcc677c42c398972b92852eb6d9ca97 Mon Sep 17 00:00:00 2001 From: Thomas Klausner Date: Thu, 4 Sep 2025 10:00:30 +0200 Subject: [PATCH 046/103] src: fix build on NetBSD MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Include missing cmath.h header. 
Fixes: https://github.com/nodejs/node/issues/59714
PR-URL: https://github.com/nodejs/node/pull/59718
Reviewed-By: Michaël Zasso
Reviewed-By: Ulises Gascón
Reviewed-By: Richard Lau
Reviewed-By: Anna Henningsen
Reviewed-By: Luigi Pinca
---
 src/tracing/traced_value.cc | 1 +
 1 file changed, 1 insertion(+)

diff --git a/src/tracing/traced_value.cc b/src/tracing/traced_value.cc
index 46c15ae8f1669c..a7c9b9b5b30ac1 100644
--- a/src/tracing/traced_value.cc
+++ b/src/tracing/traced_value.cc
@@ -9,6 +9,7 @@
 #include
 #endif

+#include <cmath>
 #include

 #include "node_metadata.h"

From 8dbd0f13e8cf4799b9cf1ed54f48872afde05d1e Mon Sep 17 00:00:00 2001
From: Antoine du Hamel
Date: Thu, 4 Sep 2025 10:52:20 +0200
Subject: [PATCH 047/103] tools: add sccache to `test-internet` workflow
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

PR-URL: https://github.com/nodejs/node/pull/59720
Reviewed-By: Michaël Zasso
Reviewed-By: Marco Ippolito
Reviewed-By: Moshe Atlow
Reviewed-By: Luigi Pinca
---
 .github/workflows/test-internet.yml | 9 +++++++--
 1 file changed, 7 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/test-internet.yml b/.github/workflows/test-internet.yml
index 7d25113c024f78..450f6d59a231e2 100644
--- a/.github/workflows/test-internet.yml
+++ b/.github/workflows/test-internet.yml
@@ -33,8 +33,9 @@ concurrency:
 env:
   PYTHON_VERSION: '3.12'
   FLAKY_TESTS: keep_retrying
-  CC: clang
-  CXX: clang++
+  CC: sccache clang
+  CXX: sccache clang++
+  SCCACHE_GHA_ENABLED: 'true'

 permissions:
   contents: read
@@ -51,6 +52,10 @@ jobs:
         uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
+      - name: Set up sccache
+        uses: Mozilla-Actions/sccache-action@7d986dd989559c6ecdb630a3fd2557667be217ad # v0.0.9
+        with:
+          version: v0.10.0
       - name: Environment Information
         run: npx envinfo
       - name: Build

From 7f347fc551a8932026ff202f18654703bd510eec Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Micha=C3=ABl=20Zasso?=
Date: Thu, 4 Sep 2025 11:28:24 +0200
Subject: [PATCH 048/103] build: fix getting OpenSSL version on Windows

Node.js on Windows is built with `clang`, not `gcc`.

PR-URL: https://github.com/nodejs/node/pull/59609
Reviewed-By: Stefan Stojanovic
Reviewed-By: James M Snell
Reviewed-By: Luigi Pinca
---
 configure.py | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/configure.py b/configure.py
index b54a2f5961f9bc..b39be54f7654f3 100755
--- a/configure.py
+++ b/configure.py
@@ -20,9 +20,9 @@
 original_argv = sys.argv[1:]

 # gcc and g++ as defaults matches what GYP's Makefile generator does,
-# except on OS X.
-CC = os.environ.get('CC', 'cc' if sys.platform == 'darwin' else 'gcc')
-CXX = os.environ.get('CXX', 'c++' if sys.platform == 'darwin' else 'g++')
+# except on macOS and Windows.
+CC = os.environ.get('CC', 'cc' if sys.platform == 'darwin' else 'clang' if sys.platform == 'win32' else 'gcc')
+CXX = os.environ.get('CXX', 'c++' if sys.platform == 'darwin' else 'clang' if sys.platform == 'win32' else 'g++')

 tools_path = Path('tools')

From 2cd6a3b7ecaf5bb43eb433da7263e31b8ccb3d60 Mon Sep 17 00:00:00 2001
From: Anna Henningsen
Date: Thu, 4 Sep 2025 11:35:39 +0200
Subject: [PATCH 049/103] src: track async resources via pointers to
 stack-allocated handles
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

This addresses an existing `TODO` to remove the need for a separate
`LocalVector`.
Since all relevant handles are already present on the stack, we can
track their addresses instead of storing the objects in a separate
container.

PR-URL: https://github.com/nodejs/node/pull/59704
Reviewed-By: Gerhard Stöbich
Reviewed-By: Chengzhong Wu
---
 src/api/callback.cc | 23 ++++++++++++++++-------
 src/async_wrap.cc | 2 +-
 src/env-inl.h | 6 ++++--
 src/env.cc | 21 ++++++++++-----------
 src/env.h | 15 +++++----------
 src/node_internals.h | 11 +++++++++--
 src/node_task_queue.cc | 3 ++-
 7 files changed, 47 insertions(+), 34 deletions(-)

diff --git a/src/api/callback.cc b/src/api/callback.cc
index 6ca7cda3a5d3f1..57c9359be9fd95 100644
--- a/src/api/callback.cc
+++ b/src/api/callback.cc
@@ -48,17 +48,26 @@ InternalCallbackScope::InternalCallbackScope(AsyncWrap* async_wrap, int flags)
       flags,
       async_wrap->context_frame()) {}

-InternalCallbackScope::InternalCallbackScope(Environment* env,
-                                             Local<Object> object,
-                                             const async_context& asyncContext,
-                                             int flags,
-                                             Local<Value> context_frame)
+InternalCallbackScope::InternalCallbackScope(
+    Environment* env,
+    std::variant<Local<Object>, Local<Object>*> object,
+    const async_context& asyncContext,
+    int flags,
+    Local<Value> context_frame)
     : env_(env),
       async_context_(asyncContext),
-      object_(object),
       skip_hooks_(flags & kSkipAsyncHooks),
       skip_task_queues_(flags & kSkipTaskQueues) {
   CHECK_NOT_NULL(env);
+
+  if (std::holds_alternative<Local<Object>>(object)) {
+    object_storage_ = std::get<Local<Object>>(object);
+    object_ = &object_storage_;
+  } else {
+    object_ = std::get<Local<Object>*>(object);
+    CHECK_NOT_NULL(object_);
+  }
+
   env->PushAsyncCallbackScope();

   if (!env->can_call_into_js()) {
@@ -85,7 +94,7 @@ InternalCallbackScope::InternalCallbackScope(Environment* env,
       isolate, async_context_frame::exchange(isolate, context_frame));

   env->async_hooks()->push_async_context(
-      async_context_.async_id, async_context_.trigger_async_id, object);
+      async_context_.async_id, async_context_.trigger_async_id, object_);

   pushed_ids_ = true;

diff --git a/src/async_wrap.cc b/src/async_wrap.cc
index d9b2c76ede38c0..3dba0373de71ee 100644
--- a/src/async_wrap.cc
+++ b/src/async_wrap.cc
@@ -278,7 +278,7 @@ void AsyncWrap::PushAsyncContext(const FunctionCallbackInfo<Value>& args) {
   // then the checks in push_async_ids() and pop_async_id() will.
   double async_id = args[0]->NumberValue(env->context()).FromJust();
   double trigger_async_id = args[1]->NumberValue(env->context()).FromJust();
-  env->async_hooks()->push_async_context(async_id, trigger_async_id, {});
+  env->async_hooks()->push_async_context(async_id, trigger_async_id, nullptr);
 }

diff --git a/src/env-inl.h b/src/env-inl.h
index 0eee9a3ed9d1fc..8e86b1d2a5ea9c 100644
--- a/src/env-inl.h
+++ b/src/env-inl.h
@@ -106,8 +106,10 @@ v8::Local<v8::Object> AsyncHooks::js_execution_async_resources() {
 }

 v8::Local<v8::Object> AsyncHooks::native_execution_async_resource(size_t i) {
-  if (i >= native_execution_async_resources_.size()) return {};
-  return native_execution_async_resources_[i];
+  if (i >= native_execution_async_resources_.size() ||
+      native_execution_async_resources_[i] == nullptr)
+    return {};
+  return *native_execution_async_resources_[i];
 }

 inline v8::Local<v8::String> AsyncHooks::provider_string(int idx) {

diff --git a/src/env.cc b/src/env.cc
index a8ad8c7d895ffb..53f0bf7fc1e5c8 100644
--- a/src/env.cc
+++ b/src/env.cc
@@ -122,7 +122,8 @@ void Environment::ResetPromiseHooks(Local<Function> init,
 // Remember to keep this code aligned with pushAsyncContext() in JS.
 void AsyncHooks::push_async_context(double async_id,
                                     double trigger_async_id,
-                                    Local<Object> resource) {
+                                    Local<Object>* resource) {
+  CHECK_IMPLIES(resource != nullptr, !resource->IsEmpty());
   // Since async_hooks is experimental, do only perform the check
   // when async_hooks is enabled.
   if (fields_[kCheck] > 0) {
@@ -140,14 +141,14 @@ void AsyncHooks::push_async_context(double async_id,
 #ifdef DEBUG
   for (uint32_t i = offset; i < native_execution_async_resources_.size(); i++)
-    CHECK(native_execution_async_resources_[i].IsEmpty());
+    CHECK_NULL(native_execution_async_resources_[i]);
 #endif

   // When this call comes from JS (as a way of increasing the stack size),
   // `resource` will be empty, because JS caches these values anyway.
-  if (!resource.IsEmpty()) {
+  if (resource != nullptr) {
     native_execution_async_resources_.resize(offset + 1);
-    // Caveat: This is a v8::Local<> assignment, we do not keep a v8::Global<>!
+    // Caveat: This is a v8::Local<>* assignment, we do not keep a v8::Global<>!
     native_execution_async_resources_[offset] = resource;
   }
 }
@@ -172,11 +173,11 @@ bool AsyncHooks::pop_async_context(double async_id) {
   fields_[kStackLength] = offset;

   if (offset < native_execution_async_resources_.size() &&
-      !native_execution_async_resources_[offset].IsEmpty()) [[likely]] {
+      native_execution_async_resources_[offset] != nullptr) [[likely]] {
 #ifdef DEBUG
     for (uint32_t i = offset + 1; i < native_execution_async_resources_.size();
          i++) {
-      CHECK(native_execution_async_resources_[i].IsEmpty());
+      CHECK_NULL(native_execution_async_resources_[i]);
     }
 #endif
     native_execution_async_resources_.resize(offset);
@@ -1717,7 +1718,6 @@ AsyncHooks::AsyncHooks(Isolate* isolate, const SerializeInfo* info)
       fields_(isolate, kFieldsCount, MAYBE_FIELD_PTR(info, fields)),
       async_id_fields_(
           isolate, kUidFieldsCount, MAYBE_FIELD_PTR(info, async_id_fields)),
-      native_execution_async_resources_(isolate),
       info_(info) {
   HandleScope handle_scope(isolate);
   if (info == nullptr) {
@@ -1806,10 +1806,9 @@ AsyncHooks::SerializeInfo AsyncHooks::Serialize(Local<Context> context,
           native_execution_async_resources_.size());
   for (size_t i = 0; i < native_execution_async_resources_.size(); i++) {
     info.native_execution_async_resources[i] =
-        native_execution_async_resources_[i].IsEmpty() ? SIZE_MAX :
-            creator->AddData(
-                context,
-                native_execution_async_resources_[i]);
+        native_execution_async_resources_[i] == nullptr
+            ? SIZE_MAX
+            : creator->AddData(context, *native_execution_async_resources_[i]);
   }

   // At the moment, promise hooks are not supported in the startup snapshot.

diff --git a/src/env.h b/src/env.h
index bb8d206e7f35a5..f3a2d221f4bb52 100644
--- a/src/env.h
+++ b/src/env.h
@@ -59,6 +59,7 @@
 #include
 #include
 #include
+#include <deque>
 #include
 #include
 #include
@@ -324,7 +325,7 @@ class AsyncHooks : public MemoryRetainer {
   // `pop_async_context()` or `clear_async_id_stack()` are called.
   void push_async_context(double async_id,
                           double trigger_async_id,
-                          v8::Local<v8::Object> execution_async_resource);
+                          v8::Local<v8::Object>* execution_async_resource);
   bool pop_async_context(double async_id);
   void clear_async_id_stack();  // Used in fatal exceptions.
@@ -386,15 +387,9 @@ class AsyncHooks : public MemoryRetainer {

   v8::Global<v8::Object> js_execution_async_resources_;

-  // TODO(@jasnell): Note that this is technically illegal use of
-  // v8::Locals which should be kept on the stack. Here, the entries
-  // in this object grows and shrinks with the C stack, and entries
-  // will be in the right handle scopes, but v8::Locals are supposed
-  // to remain on the stack and not the heap.
For general purposes
-  // this *should* be ok but may need to be looked at further should
-  // v8 become stricter in the future about v8::Locals being held in
-  // the stack.
-  v8::LocalVector<v8::Object> native_execution_async_resources_;
+  // We avoid storing the handles directly here, because they are already
+  // properly allocated on the stack, we just need access to them here.
+  std::deque<v8::Local<v8::Object>*> native_execution_async_resources_;

   // Non-empty during deserialization
   const SerializeInfo* info_ = nullptr;

diff --git a/src/node_internals.h b/src/node_internals.h
index 275534285ec28f..12ea72b61b0a5e 100644
--- a/src/node_internals.h
+++ b/src/node_internals.h
@@ -37,6 +37,7 @@
 #include
 #include
+#include <variant>
 #include

 struct sockaddr;
@@ -245,9 +246,14 @@ class InternalCallbackScope {
     // compatibility issues, but it shouldn't.)
     kSkipTaskQueues = 2
   };
+  // You need to either guarantee that this `InternalCallbackScope` is
+  // stack-allocated itself, OR that `object` is a pointer to a stack-allocated
+  // `v8::Local<v8::Object>` which outlives this scope (e.g. for the
+  // public `CallbackScope` which indirectly allocates an instance of
+  // this class for ABI stability purposes).
   InternalCallbackScope(
       Environment* env,
-      v8::Local<v8::Object> object,
+      std::variant<v8::Local<v8::Object>, v8::Local<v8::Object>*> object,
       const async_context& asyncContext,
       int flags = kNoFlags,
       v8::Local<v8::Value> context_frame = v8::Local<v8::Value>());
@@ -263,7 +269,8 @@
  private:
   Environment* env_;
   async_context async_context_;
-  v8::Local<v8::Object> object_;
+  v8::Local<v8::Object> object_storage_;
+  v8::Local<v8::Object>* object_;
   bool skip_hooks_;
   bool skip_task_queues_;
   bool failed_ = false;

diff --git a/src/node_task_queue.cc b/src/node_task_queue.cc
index 0a5aba6e31fa79..c4257110d8b520 100644
--- a/src/node_task_queue.cc
+++ b/src/node_task_queue.cc
@@ -100,10 +100,11 @@ void PromiseRejectCallback(PromiseRejectMessage message) {
   if (!GetAssignedPromiseAsyncId(env, promise, env->trigger_async_id_symbol())
            .To(&trigger_async_id)) return;

+  Local<Object> promise_as_obj = promise;
   if (async_id != AsyncWrap::kInvalidAsyncId &&
       trigger_async_id != AsyncWrap::kInvalidAsyncId) {
     env->async_hooks()->push_async_context(
-        async_id, trigger_async_id, promise);
+        async_id, trigger_async_id, &promise_as_obj);
   }

   USE(callback->Call(

From dcdb259e858d5347822f3560d11d96390c1ed61e Mon Sep 17 00:00:00 2001
From: Moshe Atlow
Date: Thu, 4 Sep 2025 12:35:48 +0300
Subject: [PATCH 050/103] test_runner: fix todo inheritance
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

PR-URL: https://github.com/nodejs/node/pull/59721
Reviewed-By: Antoine du Hamel
Reviewed-By: Chemi Atlow
Reviewed-By: Ulises Gascón
Reviewed-By: Pietro Marchini
Reviewed-By: Marco Ippolito
---
 lib/internal/test_runner/test.js | 2 +-
 test/fixtures/test-runner/output/name_pattern.snapshot | 10 +++++-----
 test/fixtures/test-runner/todo_exit_code.js | 6 ++++++
 test/parallel/test-runner-exit-code.js | 4 ++--
 4 files changed, 14 insertions(+), 8 deletions(-)

diff --git a/lib/internal/test_runner/test.js b/lib/internal/test_runner/test.js
index 8e189b62cad817..d62421f7b42e15 100644
--- a/lib/internal/test_runner/test.js
+++ b/lib/internal/test_runner/test.js
@@ -647,7 +647,7 @@ class Test extends AsyncResource {
     this.expectedAssertions = plan;
     this.cancelled = false;
     this.skipped = skip !== undefined && skip !== false;
-    this.isTodo = todo !== undefined && todo !== false;
+    this.isTodo = (todo !== undefined && todo !== false) || this.parent?.isTodo;
     this.startTime = null;
     this.endTime = null;
     this.passed = false;
diff --git 
a/test/fixtures/test-runner/output/name_pattern.snapshot b/test/fixtures/test-runner/output/name_pattern.snapshot
index e965b470730488..80f310ad268eac 100644
--- a/test/fixtures/test-runner/output/name_pattern.snapshot
+++ b/test/fixtures/test-runner/output/name_pattern.snapshot
@@ -146,20 +146,20 @@ ok 9 - no
     ...
 # Subtest: no with todo
     # Subtest: yes
-    ok 1 - yes
+    ok 1 - yes # TODO
       ---
       duration_ms: *
       type: 'test'
       ...
 # Subtest: maybe
     # Subtest: yes
-    ok 1 - yes
+    ok 1 - yes # TODO
       ---
       duration_ms: *
       type: 'test'
       ...
     1..1
-ok 2 - maybe
+ok 2 - maybe # TODO
   ---
   duration_ms: *
   type: 'suite'
@@ -193,9 +193,9 @@ ok 11 - DescribeForMatchWithAncestors
 1..11
 # tests 18
 # suites 12
-# pass 16
+# pass 14
 # fail 0
 # cancelled 0
 # skipped 2
-# todo 0
+# todo 2
 # duration_ms *
diff --git a/test/fixtures/test-runner/todo_exit_code.js b/test/fixtures/test-runner/todo_exit_code.js
index 6577eefe52f7dc..77f519058e9760 100644
--- a/test/fixtures/test-runner/todo_exit_code.js
+++ b/test/fixtures/test-runner/todo_exit_code.js
@@ -13,3 +13,9 @@ test.todo('should fail without effecting exit code', () => {
 test('empty string todo', { todo: '' }, () => {
   throw new Error('Fail but not badly')
 });
+
+describe.todo('should inherit todo', () => {
+  test('should fail without harming suite', () => {
+    throw new Error('Fail but not badly');
+  });
+});
diff --git a/test/parallel/test-runner-exit-code.js b/test/parallel/test-runner-exit-code.js
index d2f0251e5fb30c..792c5f1717bd60 100644
--- a/test/parallel/test-runner-exit-code.js
+++ b/test/parallel/test-runner-exit-code.js
@@ -58,10 +58,10 @@ if (process.argv[2] === 'child') {
   assert.strictEqual(child.status, 0);
   assert.strictEqual(child.signal, null);
   const stdout = child.stdout.toString();
-  assert.match(stdout, /tests 3/);
+  assert.match(stdout, /tests 4/);
   assert.match(stdout, /pass 0/);
   assert.match(stdout, /fail 0/);
-  assert.match(stdout, /todo 3/);
+  assert.match(stdout, /todo 4/);

   child = spawnSync(process.execPath, [__filename, 'child', 'fail']);
   assert.strictEqual(child.status, 1);

From dbe6e63baf7740035cb5ceb8a945044141523288 Mon Sep 17 00:00:00 2001
From: Joyee Cheung
Date: Thu, 4 Sep 2025 12:01:32 +0200
Subject: [PATCH 051/103] esm: fix missed renaming in ModuleJob.runSync
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

https://github.com/nodejs/node/pull/59675 missed a case when renaming
.async to .hasAsyncGraph. This fixes that and adds a test that would
previously crash with the missed rename.
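To make the regression pattern easier to see, here is a minimal, hypothetical
reduction (the `job` object and its properties are stand-ins for the real
`ModuleJob` and module wrap; this sketch is not part of the actual patch):

```js
// Hypothetical reduction of the missed-rename bug: the flag is cached under
// the new property name, but the guard still reads the old one, so the guard
// is always falsy and ERR_REQUIRE_ASYNC_MODULE is never thrown here.
const job = { hasAsyncGraph: undefined, isGraphAsync: () => true };

job.hasAsyncGraph ??= job.isGraphAsync(); // cached under the renamed property
if (job.async) { // stale name: always undefined, so this branch never runs
  throw new Error('ERR_REQUIRE_ASYNC_MODULE');
}
// A second require() of the TLA module then proceeded past this check and
// crashed later, instead of rethrowing the expected error.
```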
PR-URL: https://github.com/nodejs/node/pull/59724
Refs: https://github.com/nodejs/node/pull/59675
Reviewed-By: Chengzhong Wu
Reviewed-By: Ulises Gascón
Reviewed-By: Marco Ippolito
---
 lib/internal/modules/esm/module_job.js | 2 +-
 .../test-import-require-tla-twice.js | 21 +++++++++++++++++++
 .../import-require-tla-twice/hook.js | 6 ++++++
 .../import-require-tla-twice/require-tla.js | 11 ++++++++++
 .../import-require-tla-twice/tla.mjs | 1 +
 5 files changed, 40 insertions(+), 1 deletion(-)
 create mode 100644 test/es-module/test-import-require-tla-twice.js
 create mode 100644 test/fixtures/es-modules/import-require-tla-twice/hook.js
 create mode 100644 test/fixtures/es-modules/import-require-tla-twice/require-tla.js
 create mode 100644 test/fixtures/es-modules/import-require-tla-twice/tla.mjs

diff --git a/lib/internal/modules/esm/module_job.js b/lib/internal/modules/esm/module_job.js
index 3b48233d636c32..82f8961b16ba0b 100644
--- a/lib/internal/modules/esm/module_job.js
+++ b/lib/internal/modules/esm/module_job.js
@@ -335,7 +335,7 @@ class ModuleJob extends ModuleJobBase {
     const parentFilename = urlToFilename(parent?.filename);

     this.module.hasAsyncGraph ??= this.module.isGraphAsync();
-    if (this.module.async && !getOptionValue('--experimental-print-required-tla')) {
+    if (this.module.hasAsyncGraph && !getOptionValue('--experimental-print-required-tla')) {
       throw new ERR_REQUIRE_ASYNC_MODULE(filename, parentFilename);
     }
     if (status === kInstantiated) {
diff --git a/test/es-module/test-import-require-tla-twice.js b/test/es-module/test-import-require-tla-twice.js
new file mode 100644
index 00000000000000..a7cb32a983e3c1
--- /dev/null
+++ b/test/es-module/test-import-require-tla-twice.js
@@ -0,0 +1,21 @@
+'use strict';
+// This tests that the require() in imported CJS can retry loading an ESM with TLA
+// twice and get the correct error both times.
+ +require('../common'); +const { spawnSyncAndAssert } = require('../common/child_process'); +const fixtures = require('../common/fixtures'); +const assert = require('assert'); + +spawnSyncAndAssert( + process.execPath, + ['--import', fixtures.fileURL('es-modules', 'import-require-tla-twice', 'hook.js'), + fixtures.path('es-modules', 'import-require-tla-twice', 'require-tla.js'), + ], + { + stdout(output) { + const matches = output.matchAll(/e\.code === ERR_REQUIRE_ASYNC_MODULE true/g); + assert.strictEqual([...matches].length, 2); + } + } +); diff --git a/test/fixtures/es-modules/import-require-tla-twice/hook.js b/test/fixtures/es-modules/import-require-tla-twice/hook.js new file mode 100644 index 00000000000000..b1a32daad22fda --- /dev/null +++ b/test/fixtures/es-modules/import-require-tla-twice/hook.js @@ -0,0 +1,6 @@ +const { registerHooks } = require('module'); +registerHooks({ + load(url, context, nextLoad) { + return nextLoad(url, context); + } +}); diff --git a/test/fixtures/es-modules/import-require-tla-twice/require-tla.js b/test/fixtures/es-modules/import-require-tla-twice/require-tla.js new file mode 100644 index 00000000000000..1959ac4779bc10 --- /dev/null +++ b/test/fixtures/es-modules/import-require-tla-twice/require-tla.js @@ -0,0 +1,11 @@ +try { + require('./tla.mjs'); +} catch (e) { + console.log('e.code === ERR_REQUIRE_ASYNC_MODULE', e.code === 'ERR_REQUIRE_ASYNC_MODULE'); +} + +try { + require('./tla.mjs'); +} catch (e) { + console.log('e.code === ERR_REQUIRE_ASYNC_MODULE', e.code === 'ERR_REQUIRE_ASYNC_MODULE'); +} diff --git a/test/fixtures/es-modules/import-require-tla-twice/tla.mjs b/test/fixtures/es-modules/import-require-tla-twice/tla.mjs new file mode 100644 index 00000000000000..143bce35866c48 --- /dev/null +++ b/test/fixtures/es-modules/import-require-tla-twice/tla.mjs @@ -0,0 +1 @@ +await Promise.resolve('1'); From 5806ea02afb0a91e8e427080cc6b0ee9646f74bf Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 4 Sep 2025 11:25:33 +0000 Subject: [PATCH 052/103] meta: bump actions/checkout from 4.2.2 to 5.0.0 Bumps [actions/checkout](https://github.com/actions/checkout) from 4.2.2 to 5.0.0. - [Release notes](https://github.com/actions/checkout/releases) - [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md) - [Commits](https://github.com/actions/checkout/compare/11bd71901bbe5b1630ceea73d27597364c9af683...08c6903cd8c0fde910a37f88322edcfb5dd907a8) --- updated-dependencies: - dependency-name: actions/checkout dependency-version: 5.0.0 dependency-type: direct:production update-type: version-update:semver-major ... 
Signed-off-by: dependabot[bot] PR-URL: https://github.com/nodejs/node/pull/59725 Reviewed-By: Antoine du Hamel Reviewed-By: Luigi Pinca Reviewed-By: Rafael Gonzaga --- .github/workflows/auto-start-ci.yml | 2 +- .github/workflows/build-tarball.yml | 4 ++-- .github/workflows/codeql.yml | 2 +- .github/workflows/commit-lint.yml | 2 +- .github/workflows/commit-queue.yml | 2 +- .../workflows/coverage-linux-without-intl.yml | 2 +- .github/workflows/coverage-linux.yml | 2 +- .github/workflows/coverage-windows.yml | 2 +- .github/workflows/create-release-proposal.yml | 2 +- .github/workflows/daily-wpt-fyi.yml | 6 +++--- .github/workflows/daily.yml | 2 +- .github/workflows/doc.yml | 2 +- .../workflows/find-inactive-collaborators.yml | 2 +- .github/workflows/find-inactive-tsc.yml | 4 ++-- .github/workflows/license-builder.yml | 2 +- .github/workflows/lint-release-proposal.yml | 2 +- .github/workflows/linters.yml | 20 +++++++++---------- .github/workflows/notify-on-push.yml | 2 +- .github/workflows/scorecard.yml | 2 +- .github/workflows/test-internet.yml | 2 +- .github/workflows/test-linux.yml | 2 +- .github/workflows/test-macos.yml | 2 +- .github/workflows/timezone-update.yml | 4 ++-- .github/workflows/tools.yml | 2 +- .github/workflows/update-openssl.yml | 2 +- .github/workflows/update-v8.yml | 2 +- .github/workflows/update-wpt.yml | 2 +- 27 files changed, 41 insertions(+), 41 deletions(-) diff --git a/.github/workflows/auto-start-ci.yml b/.github/workflows/auto-start-ci.yml index 2588bc82da3f66..077824621de03c 100644 --- a/.github/workflows/auto-start-ci.yml +++ b/.github/workflows/auto-start-ci.yml @@ -45,7 +45,7 @@ jobs: if: needs.get-prs-for-ci.outputs.numbers != '' runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false diff --git a/.github/workflows/build-tarball.yml b/.github/workflows/build-tarball.yml index c5911684070fba..2ad8feb31495d0 100644 --- a/.github/workflows/build-tarball.yml +++ b/.github/workflows/build-tarball.yml @@ -42,7 +42,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-24.04 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} @@ -72,7 +72,7 @@ jobs: needs: build-tarball runs-on: ubuntu-24.04 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} diff --git a/.github/workflows/codeql.yml b/.github/workflows/codeql.yml index 969ec37ee63ece..7726d5040d7a58 100644 --- a/.github/workflows/codeql.yml +++ b/.github/workflows/codeql.yml @@ -23,7 +23,7 @@ jobs: steps: - name: Checkout repository - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 # Initializes the CodeQL tools for scanning. 
- name: Initialize CodeQL diff --git a/.github/workflows/commit-lint.yml b/.github/workflows/commit-lint.yml index 4cd44c6d18538f..6703752b8eef7e 100644 --- a/.github/workflows/commit-lint.yml +++ b/.github/workflows/commit-lint.yml @@ -17,7 +17,7 @@ jobs: run: | echo "plusOne=$((${{ github.event.pull_request.commits }} + 1))" >> $GITHUB_OUTPUT echo "minusOne=$((${{ github.event.pull_request.commits }} - 1))" >> $GITHUB_OUTPUT - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: fetch-depth: ${{ steps.nb-of-commits.outputs.plusOne }} persist-credentials: false diff --git a/.github/workflows/commit-queue.yml b/.github/workflows/commit-queue.yml index f1690f648d33b6..5dd31f4f9486b7 100644 --- a/.github/workflows/commit-queue.yml +++ b/.github/workflows/commit-queue.yml @@ -59,7 +59,7 @@ jobs: if: needs.get_mergeable_prs.outputs.numbers != '' runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: # A personal token is required because pushing with GITHUB_TOKEN will # prevent commits from running CI after they land. It needs diff --git a/.github/workflows/coverage-linux-without-intl.yml b/.github/workflows/coverage-linux-without-intl.yml index 6c44a51d81798c..4a95db7c2fc9a7 100644 --- a/.github/workflows/coverage-linux-without-intl.yml +++ b/.github/workflows/coverage-linux-without-intl.yml @@ -48,7 +48,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-24.04 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} diff --git a/.github/workflows/coverage-linux.yml b/.github/workflows/coverage-linux.yml index 6c1ef24648a85b..4e378ebb02b20c 100644 --- a/.github/workflows/coverage-linux.yml +++ b/.github/workflows/coverage-linux.yml @@ -48,7 +48,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-24.04 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} diff --git a/.github/workflows/coverage-windows.yml b/.github/workflows/coverage-windows.yml index 6bffbbbccc7dcb..67b895d0b66635 100644 --- a/.github/workflows/coverage-windows.yml +++ b/.github/workflows/coverage-windows.yml @@ -45,7 +45,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: windows-2025 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} diff --git a/.github/workflows/create-release-proposal.yml b/.github/workflows/create-release-proposal.yml index 58cf5a0bba55f6..f88992dcdb017e 100644 --- a/.github/workflows/create-release-proposal.yml +++ b/.github/workflows/create-release-proposal.yml @@ -33,7 +33,7 @@ jobs: RELEASE_LINE: ${{ inputs.release-line }} runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: ref: ${{ env.STAGING_BRANCH }} 
persist-credentials: false diff --git a/.github/workflows/daily-wpt-fyi.yml b/.github/workflows/daily-wpt-fyi.yml index 3cc6e6c54cd28c..64ab04bae7f0f3 100644 --- a/.github/workflows/daily-wpt-fyi.yml +++ b/.github/workflows/daily-wpt-fyi.yml @@ -63,7 +63,7 @@ jobs: SHORT_SHA=$(node -p 'process.version.split(/-nightly\d{8}/)[1]') echo "NIGHTLY_REF=$(gh api /repos/nodejs/node/commits/$SHORT_SHA --jq '.sha')" >> $GITHUB_ENV - name: Checkout ${{ steps.setup-node.outputs.node-version }} - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false ref: ${{ env.NIGHTLY_REF || steps.setup-node.outputs.node-version }} @@ -79,7 +79,7 @@ jobs: run: rm -rf wpt working-directory: test/fixtures - name: Checkout epochs/daily WPT - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: repository: web-platform-tests/wpt persist-credentials: false @@ -104,7 +104,7 @@ jobs: run: rm -rf deps/undici - name: Checkout undici if: ${{ env.WPT_REPORT != '' }} - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: repository: nodejs/undici persist-credentials: false diff --git a/.github/workflows/daily.yml b/.github/workflows/daily.yml index ae2b7a577f7b0f..43f8bb3df1eb2d 100644 --- a/.github/workflows/daily.yml +++ b/.github/workflows/daily.yml @@ -15,7 +15,7 @@ jobs: build-lto: runs-on: ubuntu-24.04 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Use Node.js ${{ env.NODE_VERSION }} diff --git a/.github/workflows/doc.yml b/.github/workflows/doc.yml index 1793edd12ce261..3ca1a569ea3bbd 100644 --- a/.github/workflows/doc.yml +++ b/.github/workflows/doc.yml @@ -24,7 +24,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Use Node.js ${{ env.NODE_VERSION }} diff --git a/.github/workflows/find-inactive-collaborators.yml b/.github/workflows/find-inactive-collaborators.yml index 269bc2db172be8..725824e6f18cfb 100644 --- a/.github/workflows/find-inactive-collaborators.yml +++ b/.github/workflows/find-inactive-collaborators.yml @@ -19,7 +19,7 @@ jobs: runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: fetch-depth: 0 persist-credentials: false diff --git a/.github/workflows/find-inactive-tsc.yml b/.github/workflows/find-inactive-tsc.yml index 9e294994ba8180..4ee5d2a595c6ee 100644 --- a/.github/workflows/find-inactive-tsc.yml +++ b/.github/workflows/find-inactive-tsc.yml @@ -20,13 +20,13 @@ jobs: steps: - name: Checkout the repo - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: fetch-depth: 0 persist-credentials: false - name: Clone nodejs/TSC repository - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 
# v5.0.0 with: fetch-depth: 0 path: .tmp diff --git a/.github/workflows/license-builder.yml b/.github/workflows/license-builder.yml index c62e9b1f08fe54..6c7dc8721d382b 100644 --- a/.github/workflows/license-builder.yml +++ b/.github/workflows/license-builder.yml @@ -17,7 +17,7 @@ jobs: if: github.repository == 'nodejs/node' runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - run: ./tools/license-builder.sh # Run the license builder tool diff --git a/.github/workflows/lint-release-proposal.yml b/.github/workflows/lint-release-proposal.yml index 101fa9964f0c73..c3e0ee34d18188 100644 --- a/.github/workflows/lint-release-proposal.yml +++ b/.github/workflows/lint-release-proposal.yml @@ -23,7 +23,7 @@ jobs: contents: read pull-requests: read steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Lint release commit title format diff --git a/.github/workflows/linters.yml b/.github/workflows/linters.yml index b3db61eca640f7..9f7030b4ad0fbd 100644 --- a/.github/workflows/linters.yml +++ b/.github/workflows/linters.yml @@ -25,7 +25,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Use Node.js ${{ env.NODE_VERSION }} @@ -40,7 +40,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} @@ -55,7 +55,7 @@ jobs: if: ${{ github.event.pull_request && github.event.pull_request.draft == false && github.base_ref == github.event.repository.default_branch }} runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: fetch-depth: 0 persist-credentials: false @@ -93,7 +93,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Use Node.js ${{ env.NODE_VERSION }} @@ -118,7 +118,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} @@ -135,7 +135,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Use Python ${{ env.PYTHON_VERSION }} @@ -153,7 +153,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: 
actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - run: shellcheck -V @@ -163,7 +163,7 @@ jobs: if: github.event.pull_request.draft == false runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - uses: mszostok/codeowners-validator@7f3f5e28c6d7b8dfae5731e54ce2272ca384592f @@ -173,7 +173,7 @@ jobs: if: ${{ github.event.pull_request }} runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: fetch-depth: 2 persist-credentials: false @@ -182,7 +182,7 @@ jobs: lint-readme: runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Get team members if possible diff --git a/.github/workflows/notify-on-push.yml b/.github/workflows/notify-on-push.yml index bdda058f104795..d21f3e7943ccfb 100644 --- a/.github/workflows/notify-on-push.yml +++ b/.github/workflows/notify-on-push.yml @@ -35,7 +35,7 @@ jobs: contents: read pull-requests: write steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Check commit message diff --git a/.github/workflows/scorecard.yml b/.github/workflows/scorecard.yml index 45a5ab30e74ff6..37f60e52cdac7c 100644 --- a/.github/workflows/scorecard.yml +++ b/.github/workflows/scorecard.yml @@ -38,7 +38,7 @@ jobs: egress-policy: audit # TODO: change to 'egress-policy: block' after couple of runs - name: Checkout code - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false diff --git a/.github/workflows/test-internet.yml b/.github/workflows/test-internet.yml index 450f6d59a231e2..759bbfee76b573 100644 --- a/.github/workflows/test-internet.yml +++ b/.github/workflows/test-internet.yml @@ -45,7 +45,7 @@ jobs: if: github.repository == 'nodejs/node' || github.event_name != 'schedule' runs-on: ubuntu-24.04 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Set up Python ${{ env.PYTHON_VERSION }} diff --git a/.github/workflows/test-linux.yml b/.github/workflows/test-linux.yml index 789689c25cd4bf..88da01978c3917 100644 --- a/.github/workflows/test-linux.yml +++ b/.github/workflows/test-linux.yml @@ -43,7 +43,7 @@ jobs: matrix: os: [ubuntu-24.04, ubuntu-24.04-arm] steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false path: node diff --git a/.github/workflows/test-macos.yml b/.github/workflows/test-macos.yml index c74e200acb0295..510cb96411062f 100644 --- a/.github/workflows/test-macos.yml +++ b/.github/workflows/test-macos.yml @@ -47,7 +47,7 @@ jobs: CXX: sccache g++ SCCACHE_GHA_ENABLED: 'true' steps: - - uses: 
actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false path: node diff --git a/.github/workflows/timezone-update.yml b/.github/workflows/timezone-update.yml index e951b848ad4155..51dd8f155b89fa 100644 --- a/.github/workflows/timezone-update.yml +++ b/.github/workflows/timezone-update.yml @@ -20,12 +20,12 @@ jobs: steps: - name: Checkout nodejs/node - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Checkout unicode-org/icu-data - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: path: icu-data persist-credentials: false diff --git a/.github/workflows/tools.yml b/.github/workflows/tools.yml index e6bb4cef52c5bd..2e8ffd22534691 100644 --- a/.github/workflows/tools.yml +++ b/.github/workflows/tools.yml @@ -288,7 +288,7 @@ jobs: run: | git config --global user.name "Node.js GitHub Bot" git config --global user.email "github-bot@iojs.org" - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 if: github.event_name == 'schedule' || inputs.id == 'all' || inputs.id == matrix.id with: persist-credentials: false diff --git a/.github/workflows/update-openssl.yml b/.github/workflows/update-openssl.yml index f4f198104bfd23..ee9a3e0fa11c03 100644 --- a/.github/workflows/update-openssl.yml +++ b/.github/workflows/update-openssl.yml @@ -14,7 +14,7 @@ jobs: if: github.repository == 'nodejs/node' runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Check and download new OpenSSL version diff --git a/.github/workflows/update-v8.yml b/.github/workflows/update-v8.yml index 0b290e41ada4a0..77a36ed5aa4a8f 100644 --- a/.github/workflows/update-v8.yml +++ b/.github/workflows/update-v8.yml @@ -16,7 +16,7 @@ jobs: if: github.repository == 'nodejs/node' runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false - name: Cache node modules and update-v8 diff --git a/.github/workflows/update-wpt.yml b/.github/workflows/update-wpt.yml index bd43791e6c860b..488b2f88282f4f 100644 --- a/.github/workflows/update-wpt.yml +++ b/.github/workflows/update-wpt.yml @@ -27,7 +27,7 @@ jobs: subsystem: ${{ fromJSON(github.event.inputs.subsystems || '["url", "urlpattern", "WebCryptoAPI"]') }} steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0 with: persist-credentials: false From 34f7ab55021c2127fa24398d2c1601a976875290 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 4 Sep 2025 11:33:38 +0000 Subject: [PATCH 053/103] meta: bump actions/cache from 4.2.3 to 4.2.4 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [actions/cache](https://github.com/actions/cache) from 4.2.3 to 4.2.4. 
- [Release notes](https://github.com/actions/cache/releases) - [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md) - [Commits](https://github.com/actions/cache/compare/5a3ec84eff668545956fd18022155c47e93e2684...0400d5f644dc74513175e3cd8d07132dd4860809) --- updated-dependencies: - dependency-name: actions/cache dependency-version: 4.2.4 dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] PR-URL: https://github.com/nodejs/node/pull/59727 Reviewed-By: Rafael Gonzaga Reviewed-By: Ulises Gascón Reviewed-By: Luigi Pinca --- .github/workflows/update-v8.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/update-v8.yml b/.github/workflows/update-v8.yml index 77a36ed5aa4a8f..d45ecd102a016b 100644 --- a/.github/workflows/update-v8.yml +++ b/.github/workflows/update-v8.yml @@ -20,7 +20,7 @@ jobs: with: persist-credentials: false - name: Cache node modules and update-v8 - uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3 + uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4 id: cache-v8-npm env: cache-name: cache-v8-npm From 01b66b122ee94a257645b74d8428ae361bd76121 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 4 Sep 2025 11:42:21 +0000 Subject: [PATCH 054/103] meta: bump github/codeql-action from 3.29.2 to 3.30.0 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [github/codeql-action](https://github.com/github/codeql-action) from 3.29.2 to 3.30.0. - [Release notes](https://github.com/github/codeql-action/releases) - [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md) - [Commits](https://github.com/github/codeql-action/compare/181d5eefc20863364f96762470ba6f862bdef56b...2d92b76c45b91eb80fc44c74ce3fce0ee94e8f9d) --- updated-dependencies: - dependency-name: github/codeql-action dependency-version: 3.30.0 dependency-type: direct:production update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] PR-URL: https://github.com/nodejs/node/pull/59728 Reviewed-By: Ulises Gascón Reviewed-By: Luigi Pinca --- .github/workflows/codeql.yml | 6 +++--- .github/workflows/scorecard.yml | 2 +- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/.github/workflows/codeql.yml b/.github/workflows/codeql.yml index 7726d5040d7a58..e4e2eabfe7ef1d 100644 --- a/.github/workflows/codeql.yml +++ b/.github/workflows/codeql.yml @@ -27,15 +27,15 @@ jobs: # Initializes the CodeQL tools for scanning. 
- name: Initialize CodeQL - uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2 + uses: github/codeql-action/init@2d92b76c45b91eb80fc44c74ce3fce0ee94e8f9d # v3.30.0 with: languages: ${{ matrix.language }} config-file: ./.github/codeql-config.yml - name: Autobuild - uses: github/codeql-action/autobuild@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2 + uses: github/codeql-action/autobuild@2d92b76c45b91eb80fc44c74ce3fce0ee94e8f9d # v3.30.0 - name: Perform CodeQL Analysis - uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2 + uses: github/codeql-action/analyze@2d92b76c45b91eb80fc44c74ce3fce0ee94e8f9d # v3.30.0 with: category: /language:${{matrix.language}} diff --git a/.github/workflows/scorecard.yml b/.github/workflows/scorecard.yml index 37f60e52cdac7c..2b4dfa7189645e 100644 --- a/.github/workflows/scorecard.yml +++ b/.github/workflows/scorecard.yml @@ -73,6 +73,6 @@ jobs: # Upload the results to GitHub's code scanning dashboard. - name: Upload to code-scanning - uses: github/codeql-action/upload-sarif@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2 + uses: github/codeql-action/upload-sarif@2d92b76c45b91eb80fc44c74ce3fce0ee94e8f9d # v3.30.0 with: sarif_file: results.sarif From 0bf9775ee2dba9dce47b834ba5201c532bf66faf Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Thu, 4 Sep 2025 13:58:50 +0200 Subject: [PATCH 055/103] sea: implement sea.getAssetKeys() This adds a new API to allow the bundled script in SEA to query the list of assets. PR-URL: https://github.com/nodejs/node/pull/59661 Refs: https://github.com/nodejs/single-executable/discussions/112 Reviewed-By: Darshan Sen --- doc/api/single-executable-applications.md | 23 +++++- lib/sea.js | 16 +++- src/node_sea.cc | 22 ++++++ test/fixtures/sea/get-asset-keys.js | 9 +++ test/parallel/test-sea-get-asset-keys.js | 11 +++ test/sequential/sequential.status | 2 + ...executable-application-asset-keys-empty.js | 65 ++++++++++++++++ ...ingle-executable-application-asset-keys.js | 74 +++++++++++++++++++ 8 files changed, 218 insertions(+), 4 deletions(-) create mode 100644 test/fixtures/sea/get-asset-keys.js create mode 100644 test/parallel/test-sea-get-asset-keys.js create mode 100644 test/sequential/test-single-executable-application-asset-keys-empty.js create mode 100644 test/sequential/test-single-executable-application-asset-keys.js diff --git a/doc/api/single-executable-applications.md b/doc/api/single-executable-applications.md index 41e4bd7383ef7f..d1000bcbef269a 100644 --- a/doc/api/single-executable-applications.md +++ b/doc/api/single-executable-applications.md @@ -221,7 +221,10 @@ executable, users can retrieve the assets using the [`sea.getAsset()`][] and The single-executable application can access the assets as follows: ```cjs -const { getAsset, getAssetAsBlob, getRawAsset } = require('node:sea'); +const { getAsset, getAssetAsBlob, getRawAsset, getAssetKeys } = require('node:sea'); +// Get all asset keys. +const keys = getAssetKeys(); +console.log(keys); // ['a.jpg', 'b.txt'] // Returns a copy of the data in an ArrayBuffer. const image = getAsset('a.jpg'); // Returns a string decoded from the asset as UTF8. @@ -232,8 +235,8 @@ const blob = getAssetAsBlob('a.jpg'); const raw = getRawAsset('a.jpg'); ``` -See documentation of the [`sea.getAsset()`][], [`sea.getAssetAsBlob()`][] and [`sea.getRawAsset()`][] -APIs for more information. 
+See documentation of the [`sea.getAsset()`][], [`sea.getAssetAsBlob()`][], +[`sea.getRawAsset()`][] and [`sea.getAssetKeys()`][] APIs for more information. ### Startup snapshot support @@ -429,6 +432,19 @@ writes to the returned array buffer is likely to result in a crash. `assets` field in the single-executable application configuration. * Returns: {ArrayBuffer} +### `sea.getAssetKeys()` + + + +* Returns {string\[]} An array containing all the keys of the assets + embedded in the executable. If no assets are embedded, returns an empty array. + +This method can be used to retrieve an array of all the keys of assets +embedded into the single-executable application. +An error is thrown when not running inside a single-executable application. + ### `require(id)` in the injected main script is not file based `require()` in the injected main script is not the same as the [`require()`][] @@ -503,6 +519,7 @@ to help us document them. [`require.main`]: modules.md#accessing-the-main-module [`sea.getAsset()`]: #seagetassetkey-encoding [`sea.getAssetAsBlob()`]: #seagetassetasblobkey-options +[`sea.getAssetKeys()`]: #seagetassetkeys [`sea.getRawAsset()`]: #seagetrawassetkey [`v8.startupSnapshot.setDeserializeMainFunction()`]: v8.md#v8startupsnapshotsetdeserializemainfunctioncallback-data [`v8.startupSnapshot` API]: v8.md#startup-snapshot-api diff --git a/lib/sea.js b/lib/sea.js index f7727014c4e3c9..5da9a75d095d7a 100644 --- a/lib/sea.js +++ b/lib/sea.js @@ -3,7 +3,7 @@ const { ArrayBufferPrototypeSlice, } = primordials; -const { isSea, getAsset: getAssetInternal } = internalBinding('sea'); +const { isSea, getAsset: getAssetInternal, getAssetKeys: getAssetKeysInternal } = internalBinding('sea'); const { TextDecoder } = require('internal/encoding'); const { validateString } = require('internal/validators'); const { @@ -68,9 +68,23 @@ function getAssetAsBlob(key, options) { return new Blob([asset], options); } +/** + * Returns an array of all the keys of assets embedded into the + * single-executable application. 
+ * @returns {string[]}
+ */
+function getAssetKeys() {
+  if (!isSea()) {
+    throw new ERR_NOT_IN_SINGLE_EXECUTABLE_APPLICATION();
+  }
+
+  return getAssetKeysInternal() || [];
+}
+
 module.exports = {
   isSea,
   getAsset,
   getRawAsset,
   getAssetAsBlob,
+  getAssetKeys,
 };
diff --git a/src/node_sea.cc b/src/node_sea.cc
index 49071304262f10..e66a2299db74eb 100644
--- a/src/node_sea.cc
+++ b/src/node_sea.cc
@@ -29,6 +29,7 @@
 #include

 using node::ExitCode;
+using v8::Array;
 using v8::ArrayBuffer;
 using v8::BackingStore;
 using v8::Context;
@@ -807,6 +808,25 @@ void GetAsset(const FunctionCallbackInfo<Value>& args) {
   args.GetReturnValue().Set(ab);
 }

+void GetAssetKeys(const FunctionCallbackInfo<Value>& args) {
+  CHECK_EQ(args.Length(), 0);
+  Isolate* isolate = args.GetIsolate();
+  SeaResource sea_resource = FindSingleExecutableResource();
+
+  Local<Context> context = isolate->GetCurrentContext();
+  LocalVector<Value> keys(isolate);
+  keys.reserve(sea_resource.assets.size());
+  for (const auto& [key, _] : sea_resource.assets) {
+    Local<Value> key_str;
+    if (!ToV8Value(context, key).ToLocal(&key_str)) {
+      return;
+    }
+    keys.push_back(key_str);
+  }
+  Local<Array> result = Array::New(isolate, keys.data(), keys.size());
+  args.GetReturnValue().Set(result);
+}
+
 MaybeLocal<Value> LoadSingleExecutableApplication(
     const StartExecutionCallbackInfo& info) {
   // Here we are currently relying on the fact that in NodeMainInstance::Run(),
@@ -858,12 +878,14 @@ void Initialize(Local<Object> target,
             "isExperimentalSeaWarningNeeded",
             IsExperimentalSeaWarningNeeded);
   SetMethod(context, target, "getAsset", GetAsset);
+  SetMethod(context, target, "getAssetKeys", GetAssetKeys);
 }

 void RegisterExternalReferences(ExternalReferenceRegistry* registry) {
   registry->Register(IsSea);
   registry->Register(IsExperimentalSeaWarningNeeded);
   registry->Register(GetAsset);
+  registry->Register(GetAssetKeys);
 }

 }  // namespace sea
diff --git a/test/fixtures/sea/get-asset-keys.js b/test/fixtures/sea/get-asset-keys.js
new file mode 100644
index 00000000000000..2330b8fa3d9549
--- /dev/null
+++ b/test/fixtures/sea/get-asset-keys.js
@@ -0,0 +1,9 @@
+'use strict';
+
+const { isSea, getAssetKeys } = require('node:sea');
+const assert = require('node:assert');
+
+assert(isSea());
+
+const keys = getAssetKeys();
+console.log('Asset keys:', JSON.stringify(keys.sort()));
diff --git a/test/parallel/test-sea-get-asset-keys.js b/test/parallel/test-sea-get-asset-keys.js
new file mode 100644
index 00000000000000..8fe7fed4f0c78a
--- /dev/null
+++ b/test/parallel/test-sea-get-asset-keys.js
@@ -0,0 +1,11 @@
+'use strict';
+
+require('../common');
+
+const { getAssetKeys } = require('node:sea');
+const assert = require('node:assert');
+
+// Test that getAssetKeys throws when not in SEA
+assert.throws(() => getAssetKeys(), {
+  code: 'ERR_NOT_IN_SINGLE_EXECUTABLE_APPLICATION'
+});
diff --git a/test/sequential/sequential.status b/test/sequential/sequential.status
index 63414f94193595..b1cdd9c368d5d6 100644
--- a/test/sequential/sequential.status
+++ b/test/sequential/sequential.status
@@ -57,6 +57,8 @@ test-watch-mode-inspect: SKIP
 test-single-executable-application: SKIP
 test-single-executable-application-assets: SKIP
 test-single-executable-application-assets-raw: SKIP
+test-single-executable-application-asset-keys-empty: SKIP
+test-single-executable-application-asset-keys: SKIP
 test-single-executable-application-disable-experimental-sea-warning: SKIP
 test-single-executable-application-empty: SKIP
 test-single-executable-application-exec-argv: SKIP
diff --git 
a/test/sequential/test-single-executable-application-asset-keys-empty.js b/test/sequential/test-single-executable-application-asset-keys-empty.js new file mode 100644 index 00000000000000..77892699a95794 --- /dev/null +++ b/test/sequential/test-single-executable-application-asset-keys-empty.js @@ -0,0 +1,65 @@ +'use strict'; + +// This test verifies that the `getAssetKeys()` function works correctly +// in a single executable application without any assets. + +require('../common'); + +const { + generateSEA, + skipIfSingleExecutableIsNotSupported, +} = require('../common/sea'); + +skipIfSingleExecutableIsNotSupported(); + +const tmpdir = require('../common/tmpdir'); + +const { copyFileSync, writeFileSync, existsSync } = require('fs'); +const { + spawnSyncAndExitWithoutError, + spawnSyncAndAssert, +} = require('../common/child_process'); +const assert = require('assert'); +const fixtures = require('../common/fixtures'); + +const seaPrepBlob = tmpdir.resolve('sea-prep.blob'); +const outputFile = tmpdir.resolve(process.platform === 'win32' ? 'sea.exe' : 'sea'); + +tmpdir.refresh(); +copyFileSync(fixtures.path('sea', 'get-asset-keys.js'), tmpdir.resolve('sea.js')); + +writeFileSync(tmpdir.resolve('sea-config.json'), ` +{ + "main": "sea.js", + "output": "sea-prep.blob" +} +`, 'utf8'); + +spawnSyncAndExitWithoutError( + process.execPath, + ['--experimental-sea-config', 'sea-config.json'], + { + env: { + NODE_DEBUG_NATIVE: 'SEA', + ...process.env, + }, + cwd: tmpdir.path + }, + {}); + +assert(existsSync(seaPrepBlob)); + +generateSEA(outputFile, process.execPath, seaPrepBlob); + +spawnSyncAndAssert( + outputFile, + { + env: { + ...process.env, + NODE_DEBUG_NATIVE: 'SEA', + } + }, + { + stdout: /Asset keys: \[\]/, + } +); diff --git a/test/sequential/test-single-executable-application-asset-keys.js b/test/sequential/test-single-executable-application-asset-keys.js new file mode 100644 index 00000000000000..e210f61579b342 --- /dev/null +++ b/test/sequential/test-single-executable-application-asset-keys.js @@ -0,0 +1,74 @@ +'use strict'; + +// This test verifies that the `getAssetKeys()` function works correctly +// in a single executable application with assets. + +require('../common'); + +const { + generateSEA, + skipIfSingleExecutableIsNotSupported, +} = require('../common/sea'); + +skipIfSingleExecutableIsNotSupported(); + +const tmpdir = require('../common/tmpdir'); + +const { copyFileSync, writeFileSync, existsSync } = require('fs'); +const { + spawnSyncAndExitWithoutError, + spawnSyncAndAssert, +} = require('../common/child_process'); +const assert = require('assert'); +const fixtures = require('../common/fixtures'); + +const configFile = tmpdir.resolve('sea-config.json'); +const seaPrepBlob = tmpdir.resolve('sea-prep.blob'); +const outputFile = tmpdir.resolve(process.platform === 'win32' ? 
'sea.exe' : 'sea'); + +tmpdir.refresh(); +copyFileSync(fixtures.path('sea', 'get-asset-keys.js'), tmpdir.resolve('sea.js')); +writeFileSync(tmpdir.resolve('asset-1.txt'), 'This is asset 1'); +writeFileSync(tmpdir.resolve('asset-2.txt'), 'This is asset 2'); +writeFileSync(tmpdir.resolve('asset-3.txt'), 'This is asset 3'); + +writeFileSync(configFile, ` +{ + "main": "sea.js", + "output": "sea-prep.blob", + "assets": { + "asset-1.txt": "asset-1.txt", + "asset-2.txt": "asset-2.txt", + "asset-3.txt": "asset-3.txt" + } +} +`, 'utf8'); + +spawnSyncAndExitWithoutError( + process.execPath, + ['--experimental-sea-config', 'sea-config.json'], + { + env: { + NODE_DEBUG_NATIVE: 'SEA', + ...process.env, + }, + cwd: tmpdir.path + }, + {}); + +assert(existsSync(seaPrepBlob)); + +generateSEA(outputFile, process.execPath, seaPrepBlob); + +spawnSyncAndAssert( + outputFile, + { + env: { + ...process.env, + NODE_DEBUG_NATIVE: 'SEA', + } + }, + { + stdout: /Asset keys: \["asset-1\.txt","asset-2\.txt","asset-3\.txt"\]/, + } +); From 45d148d9be58bc60fd954a61c59e1c89c86b864a Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 4 Sep 2025 12:54:13 +0000 Subject: [PATCH 056/103] meta: bump actions/download-artifact from 4.3.0 to 5.0.0 Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 4.3.0 to 5.0.0. - [Release notes](https://github.com/actions/download-artifact/releases) - [Commits](https://github.com/actions/download-artifact/compare/d3f86a106a0bac45b974a628896c90dbdf5c8093...634f93cb2916e3fdff6788551b99b062d0335ce0) --- updated-dependencies: - dependency-name: actions/download-artifact dependency-version: 5.0.0 dependency-type: direct:production update-type: version-update:semver-major ... Signed-off-by: dependabot[bot] PR-URL: https://github.com/nodejs/node/pull/59729 Reviewed-By: Luigi Pinca Reviewed-By: Antoine du Hamel --- .github/workflows/build-tarball.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/build-tarball.yml b/.github/workflows/build-tarball.yml index 2ad8feb31495d0..c9b38f314589b1 100644 --- a/.github/workflows/build-tarball.yml +++ b/.github/workflows/build-tarball.yml @@ -86,7 +86,7 @@ jobs: - name: Environment Information run: npx envinfo - name: Download tarball - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0 + uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0 with: name: tarballs path: tarballs From 9b5eb6eb5043bb528bb8e3aa2dd8830c4d902f33 Mon Sep 17 00:00:00 2001 From: Nam Yooseong <102887277+meteorqz6@users.noreply.github.com> Date: Fri, 5 Sep 2025 13:38:43 +0900 Subject: [PATCH 057/103] doc: fix missing links in the `errors` page PR-URL: https://github.com/nodejs/node/pull/59427 Reviewed-By: Daeyeon Jeong --- doc/api/errors.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/doc/api/errors.md b/doc/api/errors.md index 3bef4a5cdfe2b4..3971a5e2aae48f 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -2183,8 +2183,8 @@ contains the URL that failed to parse. ### `ERR_INVALID_URL_PATTERN` -An invalid URLPattern was passed to the [WHATWG][WHATWG URL API] \[`URLPattern` -constructor]\[`new URLPattern(input)`] to be parsed. +An invalid URLPattern was passed to the [WHATWG][WHATWG URL API] +[`URLPattern` constructor][`new URLPattern(input)`] to be parsed. @@ -4404,6 +4404,7 @@ An error occurred trying to allocate memory. This should never happen. 
 [`net.Socket.write()`]: net.md#socketwritedata-encoding-callback
 [`net`]: net.md
 [`new URL(input)`]: url.md#new-urlinput-base
+[`new URLPattern(input)`]: url.md#new-urlpatternstring-baseurl-options
 [`new URLSearchParams(iterable)`]: url.md#new-urlsearchparamsiterable
 [`package.json`]: packages.md#nodejs-packagejson-field-definitions
 [`postMessage()`]: worker_threads.md#portpostmessagevalue-transferlist

From bb0755df37be8a7c82753f6aa66a79d1a92ce660 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 5 Sep 2025 08:03:11 +0200
Subject: [PATCH 058/103] meta: bump `codecov/codecov-action`

Bumps `codecov/codecov-action` from 5.4.3 to 5.5.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/39a2af19d...5.5.0)

Signed-off-by: dependabot[bot]
PR-URL: https://github.com/nodejs/node/pull/59726
Refs: https://github.com/codecov/codecov-action/commit/fdcc8476540edceab3de004e990f80d881c6cc00
Reviewed-By: Antoine du Hamel
Reviewed-By: Rich Trott
---
 .github/workflows/coverage-linux-without-intl.yml | 2 +-
 .github/workflows/coverage-linux.yml | 2 +-
 .github/workflows/coverage-windows.yml | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/.github/workflows/coverage-linux-without-intl.yml b/.github/workflows/coverage-linux-without-intl.yml
index 4a95db7c2fc9a7..3127971713fb54 100644
--- a/.github/workflows/coverage-linux-without-intl.yml
+++ b/.github/workflows/coverage-linux-without-intl.yml
@@ -79,6 +79,6 @@ jobs:
       - name: Clean tmp
         run: rm -rf coverage/tmp && rm -rf out
       - name: Upload
-        uses: codecov/codecov-action@39a2af19d997be74586469d4062e173ecae614f6 # v5.4.3
+        uses: codecov/codecov-action@fdcc8476540edceab3de004e990f80d881c6cc00 # v5.5.0
         with:
           directory: ./coverage
diff --git a/.github/workflows/coverage-linux.yml b/.github/workflows/coverage-linux.yml
index 4e378ebb02b20c..edefa86ae2ef29 100644
--- a/.github/workflows/coverage-linux.yml
+++ b/.github/workflows/coverage-linux.yml
@@ -79,6 +79,6 @@ jobs:
       - name: Clean tmp
         run: rm -rf coverage/tmp && rm -rf out
       - name: Upload
-        uses: codecov/codecov-action@39a2af19d997be74586469d4062e173ecae614f6 # v5.4.3
+        uses: codecov/codecov-action@fdcc8476540edceab3de004e990f80d881c6cc00 # v5.5.0
         with:
           directory: ./coverage
diff --git a/.github/workflows/coverage-windows.yml b/.github/workflows/coverage-windows.yml
index 67b895d0b66635..81b4fc4001eebb 100644
--- a/.github/workflows/coverage-windows.yml
+++ b/.github/workflows/coverage-windows.yml
@@ -71,6 +71,6 @@ jobs:
       - name: Clean tmp
         run: npx rimraf ./coverage/tmp
       - name: Upload
-        uses: codecov/codecov-action@39a2af19d997be74586469d4062e173ecae614f6 # v5.4.3
+        uses: codecov/codecov-action@fdcc8476540edceab3de004e990f80d881c6cc00 # v5.5.0
         with:
           directory: ./coverage

From 41245ad4c741a164a1860d1922b944ec90a05c07 Mon Sep 17 00:00:00 2001
From: Richard Lau
Date: Fri, 5 Sep 2025 13:39:32 +0100
Subject: [PATCH 059/103] test: skip more sea tests on Linux ppc64le

These tests are failing when compiled with clang. Skip for now to avoid
breaking the CI when we switch over to building with clang.
PR-URL: https://github.com/nodejs/node/pull/59755 Refs: https://github.com/nodejs/node/issues/59561 Reviewed-By: Antoine du Hamel Reviewed-By: Ruben Bridgewater --- test/sequential/sequential.status | 3 +++ 1 file changed, 3 insertions(+) diff --git a/test/sequential/sequential.status b/test/sequential/sequential.status index b1cdd9c368d5d6..e87e76b34aff7b 100644 --- a/test/sequential/sequential.status +++ b/test/sequential/sequential.status @@ -63,6 +63,9 @@ test-single-executable-application-disable-experimental-sea-warning: SKIP test-single-executable-application-empty: SKIP test-single-executable-application-exec-argv: SKIP test-single-executable-application-exec-argv-empty: SKIP +test-single-executable-application-exec-argv-extension-cli: SKIP +test-single-executable-application-exec-argv-extension-env: SKIP +test-single-executable-application-exec-argv-extension-none: SKIP test-single-executable-application-inspect-in-sea-flags: SKIP test-single-executable-application-inspect: SKIP test-single-executable-application-snapshot: SKIP From 70d2d6d47966a567eacaf505b3f9223d78250f34 Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Fri, 5 Sep 2025 19:03:16 +0200 Subject: [PATCH 060/103] url: add err.input to ERR_INVALID_FILE_URL_PATH Otherwise there's no information from the error about what exactly is the invalid URL. PR-URL: https://github.com/nodejs/node/pull/59730 Reviewed-By: Yagiz Nizipli Reviewed-By: Anna Henningsen --- doc/api/errors.md | 3 +++ lib/internal/errors.js | 5 +++- lib/internal/url.js | 7 +++--- test/parallel/test-url-fileurltopath.js | 18 ------------- .../test-url-invalid-file-url-path-input.js | 25 +++++++++++++++++++ 5 files changed, 36 insertions(+), 22 deletions(-) create mode 100644 test/parallel/test-url-invalid-file-url-path-input.js diff --git a/doc/api/errors.md b/doc/api/errors.md index 3971a5e2aae48f..89c5b4d1e0117f 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -1996,6 +1996,9 @@ A Node.js API that consumes `file:` URLs (such as certain functions in the [`fs`][] module) encountered a file URL with an incompatible path. The exact semantics for determining whether a path can be used is platform-dependent. +The thrown error object includes an `input` property that contains the URL object +of the invalid `file:` URL. 
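A quick, hedged illustration of the behavior documented above (it assumes a build with this patch applied; on earlier releases the `input` property is absent):

```js
const { fileURLToPath } = require('node:url');

try {
  // %2F is an encoded '/', which cannot appear inside a path component.
  fileURLToPath('file:///a%2F/');
} catch (err) {
  console.log(err.code);        // 'ERR_INVALID_FILE_URL_PATH'
  console.log(err.input.href);  // 'file:///a%2F/' (err.input is a WHATWG URL object)
}
```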
+ ### `ERR_INVALID_HANDLE_TYPE` diff --git a/lib/internal/errors.js b/lib/internal/errors.js index db4d47381b06eb..d9446fb9c8e633 100644 --- a/lib/internal/errors.js +++ b/lib/internal/errors.js @@ -1476,7 +1476,10 @@ E('ERR_INVALID_FD', E('ERR_INVALID_FD_TYPE', 'Unsupported fd type: %s', TypeError); E('ERR_INVALID_FILE_URL_HOST', 'File URL host must be "localhost" or empty on %s', TypeError); -E('ERR_INVALID_FILE_URL_PATH', 'File URL path %s', TypeError); +E('ERR_INVALID_FILE_URL_PATH', function(reason, input) { + this.input = input; + return `File URL path ${reason}`; +}, TypeError); E('ERR_INVALID_HANDLE_TYPE', 'This handle type cannot be sent', TypeError); E('ERR_INVALID_HTTP_TOKEN', '%s must be a valid HTTP token ["%s"]', TypeError, HideStackFramesError); E('ERR_INVALID_IP_ADDRESS', 'Invalid IP address: %s', TypeError); diff --git a/lib/internal/url.js b/lib/internal/url.js index 77f148144b391d..9105940b2a45a0 100644 --- a/lib/internal/url.js +++ b/lib/internal/url.js @@ -1463,7 +1463,7 @@ function getPathFromURLWin32(url) { if ((pathname[n + 1] === '2' && third === 102) || // 2f 2F / (pathname[n + 1] === '5' && third === 99)) { // 5c 5C \ throw new ERR_INVALID_FILE_URL_PATH( - 'must not include encoded \\ or / characters', + 'must not include encoded \\ or / characters', url, ); } } @@ -1484,7 +1484,7 @@ function getPathFromURLWin32(url) { const sep = StringPrototypeCharAt(pathname, 2); if (letter < CHAR_LOWERCASE_A || letter > CHAR_LOWERCASE_Z || // a..z A..Z (sep !== ':')) { - throw new ERR_INVALID_FILE_URL_PATH('must be absolute'); + throw new ERR_INVALID_FILE_URL_PATH('must be absolute', url); } return StringPrototypeSlice(pathname, 1); } @@ -1551,7 +1551,7 @@ function getPathBufferFromURLWin32(url) { if (letter < CHAR_LOWERCASE_A || letter > CHAR_LOWERCASE_Z || // a..z A..Z (sep !== CHAR_COLON)) { - throw new ERR_INVALID_FILE_URL_PATH('must be absolute'); + throw new ERR_INVALID_FILE_URL_PATH('must be absolute', url); } // Now, we'll just return everything except the first byte of @@ -1570,6 +1570,7 @@ function getPathFromURLPosix(url) { if (pathname[n + 1] === '2' && third === 102) { throw new ERR_INVALID_FILE_URL_PATH( 'must not include encoded / characters', + url, ); } } diff --git a/test/parallel/test-url-fileurltopath.js b/test/parallel/test-url-fileurltopath.js index 338efacaa1a62c..e042e1aa6c222c 100644 --- a/test/parallel/test-url-fileurltopath.js +++ b/test/parallel/test-url-fileurltopath.js @@ -31,24 +31,6 @@ test('fileURLToPath with host', () => { } }); -test('fileURLToPath with invalid path', () => { - if (isWindows) { - assert.throws(() => url.fileURLToPath('file:///C:/a%2F/'), { - code: 'ERR_INVALID_FILE_URL_PATH' - }); - assert.throws(() => url.fileURLToPath('file:///C:/a%5C/'), { - code: 'ERR_INVALID_FILE_URL_PATH' - }); - assert.throws(() => url.fileURLToPath('file:///?:/'), { - code: 'ERR_INVALID_FILE_URL_PATH' - }); - } else { - assert.throws(() => url.fileURLToPath('file:///a%2F/'), { - code: 'ERR_INVALID_FILE_URL_PATH' - }); - } -}); - const windowsTestCases = [ // Lowercase ascii alpha { path: 'C:\\foo', fileURL: 'file:///C:/foo' }, diff --git a/test/parallel/test-url-invalid-file-url-path-input.js b/test/parallel/test-url-invalid-file-url-path-input.js new file mode 100644 index 00000000000000..f47df380d7eff3 --- /dev/null +++ b/test/parallel/test-url-invalid-file-url-path-input.js @@ -0,0 +1,25 @@ +'use strict'; + +// This tests that url.fileURLToPath() throws ERR_INVALID_FILE_URL_PATH +// for invalid file URL paths along with the input property. 
+ +const { isWindows } = require('../common'); +const assert = require('assert'); +const url = require('url'); + +const inputs = []; + +if (isWindows) { + inputs.push('file:///C:/a%2F/', 'file:///C:/a%5C/', 'file:///?:/'); +} else { + inputs.push('file:///a%2F/'); +} + +for (const input of inputs) { + assert.throws(() => url.fileURLToPath(input), (err) => { + assert.strictEqual(err.code, 'ERR_INVALID_FILE_URL_PATH'); + assert(err.input instanceof URL); + assert.strictEqual(err.input.href, input); + return true; + }); +} From 65e4e68c90ab1369f8928559673f5f4ec32b3549 Mon Sep 17 00:00:00 2001 From: Ruben Bridgewater Date: Fri, 5 Sep 2025 13:15:12 -0400 Subject: [PATCH 061/103] util: hide duplicated stack frames when using util.inspect MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Long stack traces often have duplicated stack frames from recursive calls. These make it difficult to identify important parts of the stack. This hides the duplicated ones and notifies the user which lines were hidden. PR-URL: https://github.com/nodejs/node/pull/59447 Reviewed-By: James M Snell Reviewed-By: Michaël Zasso Reviewed-By: Chengzhong Wu Reviewed-By: Moshe Atlow Reviewed-By: Jordan Harband --- lib/internal/util/inspect.js | 134 ++++++++++++++++++++++- test/parallel/test-util-inspect.js | 169 +++++++++++++++++++++++++++++ 2 files changed, 300 insertions(+), 3 deletions(-) diff --git a/lib/internal/util/inspect.js b/lib/internal/util/inspect.js index bc25cfdd3899b6..adbf5b72fa1435 100644 --- a/lib/internal/util/inspect.js +++ b/lib/internal/util/inspect.js @@ -1315,13 +1315,122 @@ function identicalSequenceRange(a, b) { len++; } if (len > 3) { - return { len, offset: i }; + return [len, i]; } } } } - return { len: 0, offset: 0 }; + return [0, 0]; +} + +function getDuplicateErrorFrameRanges(frames) { + // Build a map: frame line -> sorted list of indices where it occurs + const result = []; + const lineToPositions = new SafeMap(); + + for (let i = 0; i < frames.length; i++) { + const positions = lineToPositions.get(frames[i]); + if (positions === undefined) { + lineToPositions.set(frames[i], [i]); + } else { + positions[positions.length] = i; + } + } + + const minimumDuplicateRange = 3; + // Not enough duplicate lines to consider collapsing + if (frames.length - lineToPositions.size <= minimumDuplicateRange) { + return result; + } + + for (let i = 0; i < frames.length - minimumDuplicateRange; i++) { + const positions = lineToPositions.get(frames[i]); + // Find the next occurrence of the same line after i, if any + if (positions.length === 1 || positions[positions.length - 1] === i) { + continue; + } + + const current = positions.indexOf(i) + 1; + if (current === positions.length) { + continue; + } + + // Theoretical maximum range, adjusted while iterating + let range = positions[positions.length - 1] - i; + if (range < minimumDuplicateRange) { + continue; + } + let extraSteps; + if (current + 1 < positions.length) { + // Optimize initial step size by choosing the greatest common divisor (GCD) + // of all candidate distances to the same frame line. This tends to match + // the true repeating block size and minimizes fallback iterations. 
+ let gcdRange = 0; + for (let j = current; j < positions.length; j++) { + let distance = positions[j] - i; + while (distance !== 0) { + const remainder = gcdRange % distance; + if (gcdRange !== 0) { + // Add other possible ranges as fallback + extraSteps ??= new SafeSet(); + extraSteps.add(gcdRange); + } + gcdRange = distance; + distance = remainder; + } + if (gcdRange === 1) break; + } + range = gcdRange; + if (extraSteps) { + extraSteps.delete(range); + extraSteps = [...extraSteps]; + } + } + let maxRange = range; + let maxDuplicates = 0; + + let duplicateRanges = 0; + + for (let nextStart = i + range; /* ignored */ ; nextStart += range) { + let equalFrames = 0; + for (let j = 0; j < range; j++) { + if (frames[i + j] !== frames[nextStart + j]) { + break; + } + equalFrames++; + } + // Adjust the range to match different type of ranges. + if (equalFrames !== range) { + if (!extraSteps?.length) { + break; + } + // Memorize former range in case the smaller one would hide less. + if (duplicateRanges !== 0 && maxRange * maxDuplicates < range * duplicateRanges) { + maxRange = range; + maxDuplicates = duplicateRanges; + } + range = extraSteps.pop(); + nextStart = i; + duplicateRanges = 0; + continue; + } + duplicateRanges++; + } + + if (maxDuplicates !== 0 && maxRange * maxDuplicates >= range * duplicateRanges) { + range = maxRange; + duplicateRanges = maxDuplicates; + } + + if (duplicateRanges * range >= 3) { + result.push(i + range, range, duplicateRanges); + // Skip over the collapsed portion to avoid overlapping matches. + i += range * (duplicateRanges + 1) - 1; + } + } + + return result; } function getStackString(ctx, error) { @@ -1355,7 +1464,7 @@ function getStackFrames(ctx, err, stack) { const causeStackStart = StringPrototypeIndexOf(causeStack, '\n at'); if (causeStackStart !== -1) { const causeFrames = StringPrototypeSplit(StringPrototypeSlice(causeStack, causeStackStart + 1), '\n'); - const { len, offset } = identicalSequenceRange(frames, causeFrames); + const { 0: len, 1: offset } = identicalSequenceRange(frames, causeFrames); if (len > 0) { const skipped = len - 2; const msg = ` ... ${skipped} lines matching cause stack trace ...`; @@ -1363,6 +1472,25 @@ function getStackFrames(ctx, err, stack) { } } } + + // Remove recursive repetitive stack frames in long stacks + if (frames.length > 10) { + const ranges = getDuplicateErrorFrameRanges(frames); + + for (let i = ranges.length - 3; i >= 0; i -= 3) { + const offset = ranges[i]; + const length = ranges[i + 1]; + const duplicateRanges = ranges[i + 2]; + + const msg = ` ... collapsed ${length * duplicateRanges} duplicate lines ` + + 'matching above ' + + (duplicateRanges > 1 ? + `${length} lines ${duplicateRanges} times...` : + 'lines ...'); + frames.splice(offset, length * duplicateRanges, ctx.stylize(msg, 'undefined')); + } + } + return frames; } diff --git a/test/parallel/test-util-inspect.js b/test/parallel/test-util-inspect.js index 5aafb4378c18e1..054526268a0ae9 100644 --- a/test/parallel/test-util-inspect.js +++ b/test/parallel/test-util-inspect.js @@ -2920,6 +2920,175 @@ assert.strictEqual( process.cwd = originalCWD; } +{ + // Use a fake stack to verify the expected colored outcome. + const err = new Error('Hide duplicate frames in long stack'); + err.stack = [ + 'Error: Hide duplicate frames in long stack', + ' at A. (/foo/node_modules/bar/baz.js:2:7)', + ' at A. 
(/foo/node_modules/bar/baz.js:2:7)', + ' at Module._compile (node:internal/modules/cjs/loader:827:30)', + ' at Fancy (node:vm:697:32)', + ' at tryModuleLoad (node:internal/modules/cjs/foo:629:12)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Fancy (node:vm:697:32)', + ' at tryModuleLoad (node:internal/modules/cjs/foo:629:12)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + ' at /test/test-util-inspect.js:2239:9', + ' at getActual (node:assert:592:5)', + ' at /test/test-util-inspect.js:2239:9', + ' at getActual (node:assert:592:5)', + ' at /test/test-util-inspect.js:2239:9', + ' at getActual (node:assert:592:5)', + ].join('\n'); + + assert.strictEqual( + util.inspect(err, { colors: true }), + 'Error: Hide duplicate frames in long stack\n' + + ' at A. (/foo/node_modules/\x1B[4mbar\x1B[24m/baz.js:2:7)\n' + + ' at A. (/foo/node_modules/\x1B[4mbar\x1B[24m/baz.js:2:7)\n' + + '\x1B[90m at Module._compile (node:internal/modules/cjs/loader:827:30)\x1B[39m\n' + + '\x1B[90m at Fancy (node:vm:697:32)\x1B[39m\n' + + ' at tryModuleLoad (node:internal/modules/cjs/foo:629:12)\n' + + '\x1B[90m at Function.Module._load (node:internal/modules/cjs/loader:621:3)\x1B[39m\n' + + '\x1B[90m ... collapsed 3 duplicate lines matching above lines ...\x1B[39m\n' + + + '\x1B[90m at Function.Module._load (node:internal/modules/cjs/loader:621:3)\x1B[39m\n' + + '\x1B[90m ... collapsed 5 duplicate lines matching above 1 lines 5 times...\x1B[39m\n' + + + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)\n' + + '\x1B[90m at require (node:internal/modules/helpers:14:16)\x1B[39m\n' + + ' at Array.forEach ()\n' + + '\x1B[90m at require (node:internal/modules/helpers:14:16)\x1B[39m\n' + + ' at Array.forEach ()\n' + + ' at foobar/test/parallel/test-util-inspect.js:2760:12\n' + + ' at Object. 
(foobar/node_modules/\x1B[4mm\x1B[24m/folder/file.js:2753:10)\n' + + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)\n' + + '\x1B[90m ... collapsed 10 duplicate lines matching above 5 lines 2 times...\x1B[39m\n' + + + '\x1B[90m at require (node:internal/modules/helpers:14:16)\x1B[39m\n' + + ' at Array.forEach ()\n' + + ' at foobar/test/parallel/test-util-inspect.js:2760:12\n' + + ' at Object. (foobar/node_modules/\x1B[4mm\x1B[24m/folder/file.js:2753:10)\n' + + ' at /test/test-util-inspect.js:2239:9\n' + + '\x1B[90m at getActual (node:assert:592:5)\x1B[39m\n' + + '\x1B[90m ... collapsed 4 duplicate lines matching above 2 lines 2 times...\x1B[39m', + ); + + // Use a fake stack to verify the expected colored outcome. + const err2 = new Error('Hide duplicate frames in long stack'); + err2.stack = [ + 'Error: Hide duplicate frames in long stack', + ' at A. (/foo/node_modules/bar/baz.js:2:7)', + ' at A. (/foo/node_modules/bar/baz.js:2:7)', + ' at Module._compile (node:internal/modules/cjs/loader:827:30)', + + // 3 + ' at Fancy (node:vm:697:32)', + ' at tryModuleLoad (node:internal/modules/cjs/foo:629:12)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Fancy (node:vm:697:32)', + ' at tryModuleLoad (node:internal/modules/cjs/foo:629:12)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + + // 6 * 1 + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + ' at Function.Module._load (node:internal/modules/cjs/loader:621:3)', + + // 10 + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)', + ' at require (node:internal/modules/helpers:14:16)', + ' at Array.forEach ()', + ` at foobar/test/parallel/test-util-inspect.js:2760:12`, + ` at Object. (foobar/node_modules/m/folder/file.js:2753:10)`, + + // 2 * 2 + ' at /test/test-util-inspect.js:2239:9', + ' at getActual (node:assert:592:5)', + ' at /test/test-util-inspect.js:2239:9', + ' at getActual (node:assert:592:5)', + ' at /test/test-util-inspect.js:2239:9', + ' at getActual (node:assert:592:5)', + ].join('\n'); + + assert.strictEqual( + util.inspect(err2, { colors: true }), + 'Error: Hide duplicate frames in long stack\n' + + ' at A. (/foo/node_modules/\x1B[4mbar\x1B[24m/baz.js:2:7)\n' + + ' at A. 
(/foo/node_modules/\x1B[4mbar\x1B[24m/baz.js:2:7)\n' + + '\x1B[90m at Module._compile (node:internal/modules/cjs/loader:827:30)\x1B[39m\n' + + '\x1B[90m at Fancy (node:vm:697:32)\x1B[39m\n' + + ' at tryModuleLoad (node:internal/modules/cjs/foo:629:12)\n' + + '\x1B[90m at Function.Module._load (node:internal/modules/cjs/loader:621:3)\x1B[39m\n' + + '\x1B[90m ... collapsed 3 duplicate lines matching above lines ...\x1B[39m\n' + + '\x1B[90m at Function.Module._load (node:internal/modules/cjs/loader:621:3)\x1B[39m\n' + + '\x1B[90m ... collapsed 6 duplicate lines matching above 1 lines 6 times...\x1B[39m\n' + + '\x1B[90m at require (node:internal/modules/helpers:14:16)\x1B[39m\n' + + ' at Array.forEach ()\n' + + ' at foobar/test/parallel/test-util-inspect.js:2760:12\n' + + ' at Object. (foobar/node_modules/\x1B[4mm\x1B[24m/folder/file.js:2753:10)\n' + + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)\n' + + ' at Module.require [as weird/name] (node:internal/aaaaa/loader:735:19)\n' + + '\x1B[90m at require (node:internal/modules/helpers:14:16)\x1B[39m\n' + + ' at Array.forEach ()\n' + + ' at foobar/test/parallel/test-util-inspect.js:2760:12\n' + + ' at Object. (foobar/node_modules/\x1B[4mm\x1B[24m/folder/file.js:2753:10)\n' + + '\x1B[90m ... collapsed 10 duplicate lines matching above lines ...\x1B[39m\n' + + ' at /test/test-util-inspect.js:2239:9\n' + + '\x1B[90m at getActual (node:assert:592:5)\x1B[39m\n' + + '\x1B[90m ... collapsed 4 duplicate lines matching above 2 lines 2 times...\x1B[39m', + ); +} + { // Cross platform checks. const err = new Error('foo'); From a414c1eb51f0dda71336db6937a1a7d0be3d9b50 Mon Sep 17 00:00:00 2001 From: Kingsword Date: Sat, 6 Sep 2025 01:37:04 +0800 Subject: [PATCH 062/103] repl: fix REPL completion under unary expressions PR-URL: https://github.com/nodejs/node/pull/59744 Fixes: https://github.com/nodejs/node/issues/59735 Reviewed-By: Ruben Bridgewater Reviewed-By: Anna Henningsen --- lib/repl.js | 10 ++ ...est-repl-tab-complete-unary-expressions.js | 141 ++++++++++++++++++ 2 files changed, 151 insertions(+) create mode 100644 test/parallel/test-repl-tab-complete-unary-expressions.js diff --git a/lib/repl.js b/lib/repl.js index 3d66e928601f07..b95beb45643eb2 100644 --- a/lib/repl.js +++ b/lib/repl.js @@ -1744,6 +1744,16 @@ function findExpressionCompleteTarget(code) { return findExpressionCompleteTarget(lastDeclarationInitCode); } + // If the last statement is an expression statement with a unary operator (delete, typeof, etc.) + // we want to extract the argument for completion (e.g. for `delete obj.prop` we want `obj.prop`) + if (lastBodyStatement.type === 'ExpressionStatement' && + lastBodyStatement.expression.type === 'UnaryExpression' && + lastBodyStatement.expression.argument) { + const argument = lastBodyStatement.expression.argument; + const argumentCode = code.slice(argument.start, argument.end); + return findExpressionCompleteTarget(argumentCode); + } + // If any of the above early returns haven't activated then it means that // the potential complete target is the full code (e.g. the code represents // a simple partial identifier, a member expression, etc...) 
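The repl.js hunk above peels a trailing unary expression down to its argument before completion runs. Here is a standalone sketch of the same idea; it uses the public `acorn` package rather than the parser bundled inside the repl, so treat it as an approximation of the approach, not the repl's actual code path:

```js
// npm install acorn
const { parse } = require('acorn');

// For `delete obj.prop`, complete against `obj.prop`;
// for `typeof x`, complete against `x`.
function completionTarget(code) {
  const ast = parse(code, { ecmaVersion: 'latest' });
  const last = ast.body[ast.body.length - 1];
  if (last && last.type === 'ExpressionStatement' &&
      last.expression.type === 'UnaryExpression' &&
      last.expression.argument) {
    // Slice the unary operand out of the source text and complete on that.
    const { start, end } = last.expression.argument;
    return code.slice(start, end);
  }
  return code;
}

console.log(completionTarget('delete globalThis._')); // 'globalThis._'
console.log(completionTarget('typeof globalThis'));   // 'globalThis'
```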
diff --git a/test/parallel/test-repl-tab-complete-unary-expressions.js b/test/parallel/test-repl-tab-complete-unary-expressions.js new file mode 100644 index 00000000000000..d84f0672b98151 --- /dev/null +++ b/test/parallel/test-repl-tab-complete-unary-expressions.js @@ -0,0 +1,141 @@ +'use strict'; + +const common = require('../common'); +const assert = require('assert'); +const repl = require('repl'); +const { describe, it } = require('node:test'); + +// This test verifies that tab completion works correctly with unary expressions +// like delete, typeof, void, etc. This is a regression test for the issue where +// typing "delete globalThis._" and then backspacing and typing "globalThis" +// would cause "globalThis is not defined" error. + +describe('REPL tab completion with unary expressions', () => { + it('should handle delete operator correctly', (t, done) => { + const r = repl.start({ + prompt: '', + input: process.stdin, + output: process.stdout, + terminal: false, + }); + + // Test delete with member expression + r.complete( + 'delete globalThis._', + common.mustSucceed((completions) => { + assert.strictEqual(completions[1], 'globalThis._'); + + // Test delete with identifier + r.complete( + 'delete globalThis', + common.mustSucceed((completions) => { + assert.strictEqual(completions[1], 'globalThis'); + r.close(); + done(); + }) + ); + }) + ); + }); + + it('should handle typeof operator correctly', (t, done) => { + const r = repl.start({ + prompt: '', + input: process.stdin, + output: process.stdout, + terminal: false, + }); + + r.complete( + 'typeof globalThis', + common.mustSucceed((completions) => { + assert.strictEqual(completions[1], 'globalThis'); + r.close(); + done(); + }) + ); + }); + + it('should handle void operator correctly', (t, done) => { + const r = repl.start({ + prompt: '', + input: process.stdin, + output: process.stdout, + terminal: false, + }); + + r.complete( + 'void globalThis', + common.mustSucceed((completions) => { + assert.strictEqual(completions[1], 'globalThis'); + r.close(); + done(); + }) + ); + }); + + it('should handle other unary operators correctly', (t, done) => { + const r = repl.start({ + prompt: '', + input: process.stdin, + output: process.stdout, + terminal: false, + }); + + const unaryOperators = [ + '!globalThis', + '+globalThis', + '-globalThis', + '~globalThis', + ]; + + let testIndex = 0; + + function testNext() { + if (testIndex >= unaryOperators.length) { + r.close(); + done(); + return; + } + + const testCase = unaryOperators[testIndex++]; + r.complete( + testCase, + common.mustSucceed((completions) => { + assert.strictEqual(completions[1], 'globalThis'); + testNext(); + }) + ); + } + + testNext(); + }); + + it('should still evaluate globalThis correctly after unary expression completion', (t, done) => { + const r = repl.start({ + prompt: '', + input: process.stdin, + output: process.stdout, + terminal: false, + }); + + // First trigger completion with delete + r.complete( + 'delete globalThis._', + common.mustSucceed(() => { + // Then evaluate globalThis + r.eval( + 'globalThis', + r.context, + 'test.js', + common.mustSucceed((result) => { + assert.strictEqual(typeof result, 'object'); + assert.ok(result !== null); + r.close(); + done(); + }) + ); + }) + ); + }); +}); From 45af6966aef3573729217e99ae5ab5024b3a7819 Mon Sep 17 00:00:00 2001 From: npm CLI robot Date: Fri, 5 Sep 2025 13:29:22 -0700 Subject: [PATCH 063/103] deps: upgrade npm to 11.6.0 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 
8bit PR-URL: https://github.com/nodejs/node/pull/59750 Reviewed-By: Luigi Pinca Reviewed-By: Jordan Harband Reviewed-By: Michaël Zasso Reviewed-By: Rafael Gonzaga --- deps/npm/bin/npm.ps1 | 10 +- deps/npm/bin/npx.ps1 | 10 +- deps/npm/docs/content/commands/npm-ls.md | 2 +- deps/npm/docs/content/commands/npm.md | 2 +- .../npm/docs/content/configuring-npm/npmrc.md | 5 +- deps/npm/docs/output/commands/npm-access.html | 4 +- .../npm/docs/output/commands/npm-adduser.html | 4 +- deps/npm/docs/output/commands/npm-audit.html | 4 +- deps/npm/docs/output/commands/npm-bugs.html | 4 +- deps/npm/docs/output/commands/npm-cache.html | 4 +- deps/npm/docs/output/commands/npm-ci.html | 4 +- .../docs/output/commands/npm-completion.html | 4 +- deps/npm/docs/output/commands/npm-config.html | 4 +- deps/npm/docs/output/commands/npm-dedupe.html | 4 +- .../docs/output/commands/npm-deprecate.html | 4 +- deps/npm/docs/output/commands/npm-diff.html | 4 +- .../docs/output/commands/npm-dist-tag.html | 4 +- deps/npm/docs/output/commands/npm-docs.html | 4 +- deps/npm/docs/output/commands/npm-doctor.html | 4 +- deps/npm/docs/output/commands/npm-edit.html | 4 +- deps/npm/docs/output/commands/npm-exec.html | 4 +- .../npm/docs/output/commands/npm-explain.html | 4 +- .../npm/docs/output/commands/npm-explore.html | 4 +- .../docs/output/commands/npm-find-dupes.html | 4 +- deps/npm/docs/output/commands/npm-fund.html | 4 +- .../docs/output/commands/npm-help-search.html | 4 +- deps/npm/docs/output/commands/npm-help.html | 4 +- deps/npm/docs/output/commands/npm-init.html | 4 +- .../output/commands/npm-install-ci-test.html | 4 +- .../output/commands/npm-install-test.html | 4 +- .../npm/docs/output/commands/npm-install.html | 4 +- deps/npm/docs/output/commands/npm-link.html | 4 +- deps/npm/docs/output/commands/npm-login.html | 4 +- deps/npm/docs/output/commands/npm-logout.html | 4 +- deps/npm/docs/output/commands/npm-ls.html | 6 +- deps/npm/docs/output/commands/npm-org.html | 4 +- .../docs/output/commands/npm-outdated.html | 4 +- deps/npm/docs/output/commands/npm-owner.html | 4 +- deps/npm/docs/output/commands/npm-pack.html | 4 +- deps/npm/docs/output/commands/npm-ping.html | 4 +- deps/npm/docs/output/commands/npm-pkg.html | 4 +- deps/npm/docs/output/commands/npm-prefix.html | 4 +- .../npm/docs/output/commands/npm-profile.html | 4 +- deps/npm/docs/output/commands/npm-prune.html | 4 +- .../npm/docs/output/commands/npm-publish.html | 4 +- deps/npm/docs/output/commands/npm-query.html | 4 +- .../npm/docs/output/commands/npm-rebuild.html | 4 +- deps/npm/docs/output/commands/npm-repo.html | 4 +- .../npm/docs/output/commands/npm-restart.html | 4 +- deps/npm/docs/output/commands/npm-root.html | 4 +- deps/npm/docs/output/commands/npm-run.html | 4 +- deps/npm/docs/output/commands/npm-sbom.html | 4 +- deps/npm/docs/output/commands/npm-search.html | 4 +- .../docs/output/commands/npm-shrinkwrap.html | 4 +- deps/npm/docs/output/commands/npm-star.html | 4 +- deps/npm/docs/output/commands/npm-stars.html | 4 +- deps/npm/docs/output/commands/npm-start.html | 4 +- deps/npm/docs/output/commands/npm-stop.html | 4 +- deps/npm/docs/output/commands/npm-team.html | 4 +- deps/npm/docs/output/commands/npm-test.html | 4 +- deps/npm/docs/output/commands/npm-token.html | 4 +- .../docs/output/commands/npm-undeprecate.html | 4 +- .../docs/output/commands/npm-uninstall.html | 4 +- .../docs/output/commands/npm-unpublish.html | 4 +- deps/npm/docs/output/commands/npm-unstar.html | 4 +- deps/npm/docs/output/commands/npm-update.html | 4 +- 
.../npm/docs/output/commands/npm-version.html | 4 +- deps/npm/docs/output/commands/npm-view.html | 4 +- deps/npm/docs/output/commands/npm-whoami.html | 4 +- deps/npm/docs/output/commands/npm.html | 6 +- deps/npm/docs/output/commands/npx.html | 4 +- .../docs/output/configuring-npm/folders.html | 4 +- .../docs/output/configuring-npm/install.html | 4 +- .../output/configuring-npm/npm-global.html | 4 +- .../docs/output/configuring-npm/npm-json.html | 4 +- .../configuring-npm/npm-shrinkwrap-json.html | 4 +- .../docs/output/configuring-npm/npmrc.html | 9 +- .../output/configuring-npm/package-json.html | 4 +- .../configuring-npm/package-lock-json.html | 4 +- deps/npm/docs/output/using-npm/config.html | 4 +- .../using-npm/dependency-selectors.html | 4 +- .../npm/docs/output/using-npm/developers.html | 4 +- deps/npm/docs/output/using-npm/logging.html | 4 +- deps/npm/docs/output/using-npm/orgs.html | 4 +- .../docs/output/using-npm/package-spec.html | 4 +- deps/npm/docs/output/using-npm/registry.html | 4 +- deps/npm/docs/output/using-npm/removal.html | 4 +- deps/npm/docs/output/using-npm/scope.html | 4 +- deps/npm/docs/output/using-npm/scripts.html | 4 +- .../npm/docs/output/using-npm/workspaces.html | 4 +- deps/npm/lib/cli/exit-handler.js | 21 --- deps/npm/lib/utils/format.js | 5 +- deps/npm/lib/utils/oidc.js | 57 ++++---- deps/npm/man/man1/npm-access.1 | 2 +- deps/npm/man/man1/npm-adduser.1 | 2 +- deps/npm/man/man1/npm-audit.1 | 2 +- deps/npm/man/man1/npm-bugs.1 | 2 +- deps/npm/man/man1/npm-cache.1 | 2 +- deps/npm/man/man1/npm-ci.1 | 2 +- deps/npm/man/man1/npm-completion.1 | 2 +- deps/npm/man/man1/npm-config.1 | 2 +- deps/npm/man/man1/npm-dedupe.1 | 2 +- deps/npm/man/man1/npm-deprecate.1 | 2 +- deps/npm/man/man1/npm-diff.1 | 2 +- deps/npm/man/man1/npm-dist-tag.1 | 2 +- deps/npm/man/man1/npm-docs.1 | 2 +- deps/npm/man/man1/npm-doctor.1 | 2 +- deps/npm/man/man1/npm-edit.1 | 2 +- deps/npm/man/man1/npm-exec.1 | 2 +- deps/npm/man/man1/npm-explain.1 | 2 +- deps/npm/man/man1/npm-explore.1 | 2 +- deps/npm/man/man1/npm-find-dupes.1 | 2 +- deps/npm/man/man1/npm-fund.1 | 2 +- deps/npm/man/man1/npm-help-search.1 | 2 +- deps/npm/man/man1/npm-help.1 | 2 +- deps/npm/man/man1/npm-init.1 | 2 +- deps/npm/man/man1/npm-install-ci-test.1 | 2 +- deps/npm/man/man1/npm-install-test.1 | 2 +- deps/npm/man/man1/npm-install.1 | 2 +- deps/npm/man/man1/npm-link.1 | 2 +- deps/npm/man/man1/npm-login.1 | 2 +- deps/npm/man/man1/npm-logout.1 | 2 +- deps/npm/man/man1/npm-ls.1 | 4 +- deps/npm/man/man1/npm-org.1 | 2 +- deps/npm/man/man1/npm-outdated.1 | 2 +- deps/npm/man/man1/npm-owner.1 | 2 +- deps/npm/man/man1/npm-pack.1 | 2 +- deps/npm/man/man1/npm-ping.1 | 2 +- deps/npm/man/man1/npm-pkg.1 | 2 +- deps/npm/man/man1/npm-prefix.1 | 2 +- deps/npm/man/man1/npm-profile.1 | 2 +- deps/npm/man/man1/npm-prune.1 | 2 +- deps/npm/man/man1/npm-publish.1 | 2 +- deps/npm/man/man1/npm-query.1 | 2 +- deps/npm/man/man1/npm-rebuild.1 | 2 +- deps/npm/man/man1/npm-repo.1 | 2 +- deps/npm/man/man1/npm-restart.1 | 2 +- deps/npm/man/man1/npm-root.1 | 2 +- deps/npm/man/man1/npm-run.1 | 2 +- deps/npm/man/man1/npm-sbom.1 | 2 +- deps/npm/man/man1/npm-search.1 | 2 +- deps/npm/man/man1/npm-shrinkwrap.1 | 2 +- deps/npm/man/man1/npm-star.1 | 2 +- deps/npm/man/man1/npm-stars.1 | 2 +- deps/npm/man/man1/npm-start.1 | 2 +- deps/npm/man/man1/npm-stop.1 | 2 +- deps/npm/man/man1/npm-team.1 | 2 +- deps/npm/man/man1/npm-test.1 | 2 +- deps/npm/man/man1/npm-token.1 | 2 +- deps/npm/man/man1/npm-undeprecate.1 | 2 +- deps/npm/man/man1/npm-uninstall.1 | 2 +- 
deps/npm/man/man1/npm-unpublish.1 | 2 +- deps/npm/man/man1/npm-unstar.1 | 2 +- deps/npm/man/man1/npm-update.1 | 2 +- deps/npm/man/man1/npm-version.1 | 2 +- deps/npm/man/man1/npm-view.1 | 2 +- deps/npm/man/man1/npm-whoami.1 | 2 +- deps/npm/man/man1/npm.1 | 4 +- deps/npm/man/man1/npx.1 | 2 +- deps/npm/man/man5/folders.5 | 2 +- deps/npm/man/man5/install.5 | 2 +- deps/npm/man/man5/npm-global.5 | 2 +- deps/npm/man/man5/npm-json.5 | 2 +- deps/npm/man/man5/npm-shrinkwrap-json.5 | 2 +- deps/npm/man/man5/npmrc.5 | 5 +- deps/npm/man/man5/package-json.5 | 2 +- deps/npm/man/man5/package-lock-json.5 | 2 +- deps/npm/man/man7/config.7 | 2 +- deps/npm/man/man7/dependency-selectors.7 | 2 +- deps/npm/man/man7/developers.7 | 2 +- deps/npm/man/man7/logging.7 | 2 +- deps/npm/man/man7/orgs.7 | 2 +- deps/npm/man/man7/package-spec.7 | 2 +- deps/npm/man/man7/registry.7 | 2 +- deps/npm/man/man7/removal.7 | 2 +- deps/npm/man/man7/scope.7 | 2 +- deps/npm/man/man7/scripts.7 | 2 +- deps/npm/man/man7/workspaces.7 | 2 +- .../arborist/lib/arborist/build-ideal-tree.js | 27 +++- .../@npmcli/arborist/lib/arborist/reify.js | 1 + .../@npmcli/arborist/package.json | 2 +- .../@npmcli/config/lib/env-replace.js | 8 +- .../node_modules/@npmcli/config/package.json | 2 +- deps/npm/node_modules/libnpmdiff/package.json | 4 +- deps/npm/node_modules/libnpmexec/package.json | 4 +- deps/npm/node_modules/libnpmfund/package.json | 4 +- deps/npm/node_modules/libnpmpack/package.json | 4 +- deps/npm/package.json | 14 +- deps/npm/test/fixtures/mock-oidc.js | 15 +- deps/npm/test/lib/cli/exit-handler.js | 135 +----------------- deps/npm/test/lib/commands/publish.js | 97 +++++++++++++ deps/npm/test/lib/utils/display.js | 2 + 192 files changed, 471 insertions(+), 486 deletions(-) diff --git a/deps/npm/bin/npm.ps1 b/deps/npm/bin/npm.ps1 index 5993adaf556621..efed03fe5655ec 100644 --- a/deps/npm/bin/npm.ps1 +++ b/deps/npm/bin/npm.ps1 @@ -1,5 +1,7 @@ #!/usr/bin/env pwsh +Set-StrictMode -Version 'Latest' + $NODE_EXE="$PSScriptRoot/node.exe" if (-not (Test-Path $NODE_EXE)) { $NODE_EXE="$PSScriptRoot/node" @@ -27,7 +29,7 @@ if ($MyInvocation.ExpectingInput) { # takes pipeline input } elseif (-not $MyInvocation.Line) { # used "-File" argument & $NODE_EXE $NPM_CLI_JS $args } else { # used "-Command" argument - if ($MyInvocation.Statement) { + if (($MyInvocation | Get-Member -Name 'Statement') -and $MyInvocation.Statement) { $NPM_ORIGINAL_COMMAND = $MyInvocation.Statement } else { $NPM_ORIGINAL_COMMAND = ( @@ -38,9 +40,9 @@ if ($MyInvocation.ExpectingInput) { # takes pipeline input $NODE_EXE = $NODE_EXE.Replace("``", "````") $NPM_CLI_JS = $NPM_CLI_JS.Replace("``", "````") - $NPM_NO_REDIRECTS_COMMAND = [Management.Automation.Language.Parser]::ParseInput($NPM_ORIGINAL_COMMAND, [ref] $null, [ref] $null). - EndBlock.Statements.PipelineElements.CommandElements.Extent.Text -join ' ' - $NPM_ARGS = $NPM_NO_REDIRECTS_COMMAND.Substring($MyInvocation.InvocationName.Length).Trim() + $NPM_COMMAND_ARRAY = [Management.Automation.Language.Parser]::ParseInput($NPM_ORIGINAL_COMMAND, [ref] $null, [ref] $null). 
+    EndBlock.Statements.PipelineElements.CommandElements.Extent.Text
+  $NPM_ARGS = ($NPM_COMMAND_ARRAY | Select-Object -Skip 1) -join ' '
 
   Invoke-Expression "& `"$NODE_EXE`" `"$NPM_CLI_JS`" $NPM_ARGS"
 }
diff --git a/deps/npm/bin/npx.ps1 b/deps/npm/bin/npx.ps1
index cc1aa047bdc217..3fe7b5435763a0 100644
--- a/deps/npm/bin/npx.ps1
+++ b/deps/npm/bin/npx.ps1
@@ -1,5 +1,7 @@
 #!/usr/bin/env pwsh
 
+Set-StrictMode -Version 'Latest'
+
 $NODE_EXE="$PSScriptRoot/node.exe"
 if (-not (Test-Path $NODE_EXE)) {
   $NODE_EXE="$PSScriptRoot/node"
@@ -27,7 +29,7 @@ if ($MyInvocation.ExpectingInput) { # takes pipeline input
 } elseif (-not $MyInvocation.Line) { # used "-File" argument
   & $NODE_EXE $NPX_CLI_JS $args
 } else { # used "-Command" argument
-  if ($MyInvocation.Statement) {
+  if (($MyInvocation | Get-Member -Name 'Statement') -and $MyInvocation.Statement) {
     $NPX_ORIGINAL_COMMAND = $MyInvocation.Statement
   } else {
     $NPX_ORIGINAL_COMMAND = (
@@ -38,9 +40,9 @@ if ($MyInvocation.ExpectingInput) { # takes pipeline input
   $NODE_EXE = $NODE_EXE.Replace("``", "````")
   $NPX_CLI_JS = $NPX_CLI_JS.Replace("``", "````")
 
-  $NPX_NO_REDIRECTS_COMMAND = [Management.Automation.Language.Parser]::ParseInput($NPX_ORIGINAL_COMMAND, [ref] $null, [ref] $null).
-    EndBlock.Statements.PipelineElements.CommandElements.Extent.Text -join ' '
-  $NPX_ARGS = $NPX_NO_REDIRECTS_COMMAND.Substring($MyInvocation.InvocationName.Length).Trim()
+  $NPX_COMMAND_ARRAY = [Management.Automation.Language.Parser]::ParseInput($NPX_ORIGINAL_COMMAND, [ref] $null, [ref] $null).
+    EndBlock.Statements.PipelineElements.CommandElements.Extent.Text
+  $NPX_ARGS = ($NPX_COMMAND_ARRAY | Select-Object -Skip 1) -join ' '
 
   Invoke-Expression "& `"$NODE_EXE`" `"$NPX_CLI_JS`" $NPX_ARGS"
 }
diff --git a/deps/npm/docs/content/commands/npm-ls.md b/deps/npm/docs/content/commands/npm-ls.md
index 628747ba12aca2..e20c0629defa8a 100644
--- a/deps/npm/docs/content/commands/npm-ls.md
+++ b/deps/npm/docs/content/commands/npm-ls.md
@@ -27,7 +27,7 @@ packages will *also* show the paths to the specified packages.
 For example, running `npm ls promzard` in npm's source tree will show:
 
 ```bash
-npm@11.5.1 /path/to/npm
+npm@11.6.0 /path/to/npm
 └─┬ init-package-json@0.0.4
   └── promzard@0.1.5
 ```
diff --git a/deps/npm/docs/content/commands/npm.md b/deps/npm/docs/content/commands/npm.md
index 218f7b06b5c8c2..3c230f2bbb3e73 100644
--- a/deps/npm/docs/content/commands/npm.md
+++ b/deps/npm/docs/content/commands/npm.md
@@ -14,7 +14,7 @@ Note: This command is unaware of workspaces.
 
 ### Version
 
-11.5.1
+11.6.0
 
 ### Description
diff --git a/deps/npm/docs/content/configuring-npm/npmrc.md b/deps/npm/docs/content/configuring-npm/npmrc.md
index cd31ae886f1320..47e126f3c3ab03 100644
--- a/deps/npm/docs/content/configuring-npm/npmrc.md
+++ b/deps/npm/docs/content/configuring-npm/npmrc.md
@@ -25,11 +25,14 @@ The four relevant files are:
 * npm builtin config file (`/path/to/npm/npmrc`)
 
 All npm config files are an ini-formatted list of `key = value` parameters.
-Environment variables can be replaced using `${VARIABLE_NAME}`. For
+Environment variables can be replaced using `${VARIABLE_NAME}`. By default,
+if the variable is not defined, it is left unreplaced. Adding `?` after the
+variable name forces it to evaluate to an empty string instead.
 For example:
 
 ```bash
 cache = ${HOME}/.npm-packages
+node-options = "${NODE_OPTIONS?} --use-system-ca"
 ```
 
 Each of these files is loaded, and config options are resolved in priority
diff --git a/deps/npm/docs/output/commands/npm-access.html b/deps/npm/docs/output/commands/npm-access.html
index 3bc103fb1f65ed..9120fd4d0ba7d7 100644
--- a/deps/npm/docs/output/commands/npm-access.html
+++ b/deps/npm/docs/output/commands/npm-access.html
@@ -141,9 +141,9 @@
-

+

npm-access - @11.5.1 + @11.6.0

Set access level on published packages
diff --git a/deps/npm/docs/output/commands/npm-adduser.html b/deps/npm/docs/output/commands/npm-adduser.html index ca6a06bc441c26..369a5d00c955e4 100644 --- a/deps/npm/docs/output/commands/npm-adduser.html +++ b/deps/npm/docs/output/commands/npm-adduser.html @@ -141,9 +141,9 @@
-

+

npm-adduser - @11.5.1 + @11.6.0

Add a registry user account
diff --git a/deps/npm/docs/output/commands/npm-audit.html b/deps/npm/docs/output/commands/npm-audit.html index c74ce4dcecbc17..a0fca3a0bd6c21 100644 --- a/deps/npm/docs/output/commands/npm-audit.html +++ b/deps/npm/docs/output/commands/npm-audit.html @@ -141,9 +141,9 @@
-

+

npm-audit - @11.5.1 + @11.6.0

Run a security audit
diff --git a/deps/npm/docs/output/commands/npm-bugs.html b/deps/npm/docs/output/commands/npm-bugs.html index 221cefb803b39b..f31d886e341e03 100644 --- a/deps/npm/docs/output/commands/npm-bugs.html +++ b/deps/npm/docs/output/commands/npm-bugs.html @@ -141,9 +141,9 @@
-

+

npm-bugs - @11.5.1 + @11.6.0

Report bugs for a package in a web browser
diff --git a/deps/npm/docs/output/commands/npm-cache.html b/deps/npm/docs/output/commands/npm-cache.html index 675128b4b476df..bef085de809b31 100644 --- a/deps/npm/docs/output/commands/npm-cache.html +++ b/deps/npm/docs/output/commands/npm-cache.html @@ -141,9 +141,9 @@
-

+

npm-cache - @11.5.1 + @11.6.0

Manipulates packages cache
diff --git a/deps/npm/docs/output/commands/npm-ci.html b/deps/npm/docs/output/commands/npm-ci.html index 3df7f0b1d2701c..0874f99054f2ba 100644 --- a/deps/npm/docs/output/commands/npm-ci.html +++ b/deps/npm/docs/output/commands/npm-ci.html @@ -141,9 +141,9 @@
-

+

npm-ci - @11.5.1 + @11.6.0

Clean install a project
diff --git a/deps/npm/docs/output/commands/npm-completion.html b/deps/npm/docs/output/commands/npm-completion.html index ee3d7698937a63..ed557856ae4c3f 100644 --- a/deps/npm/docs/output/commands/npm-completion.html +++ b/deps/npm/docs/output/commands/npm-completion.html @@ -141,9 +141,9 @@
-

+

npm-completion - @11.5.1 + @11.6.0

Tab Completion for npm
diff --git a/deps/npm/docs/output/commands/npm-config.html b/deps/npm/docs/output/commands/npm-config.html index 4a956bbaf4ad84..6ab7887034539f 100644 --- a/deps/npm/docs/output/commands/npm-config.html +++ b/deps/npm/docs/output/commands/npm-config.html @@ -141,9 +141,9 @@
-

+

npm-config - @11.5.1 + @11.6.0

Manage the npm configuration files
diff --git a/deps/npm/docs/output/commands/npm-dedupe.html b/deps/npm/docs/output/commands/npm-dedupe.html index fe16a99afe74d3..03ddc38834deb8 100644 --- a/deps/npm/docs/output/commands/npm-dedupe.html +++ b/deps/npm/docs/output/commands/npm-dedupe.html @@ -141,9 +141,9 @@
-

+

npm-dedupe - @11.5.1 + @11.6.0

Reduce duplication in the package tree
diff --git a/deps/npm/docs/output/commands/npm-deprecate.html b/deps/npm/docs/output/commands/npm-deprecate.html index 260296635182b7..1e2ef57bd7958f 100644 --- a/deps/npm/docs/output/commands/npm-deprecate.html +++ b/deps/npm/docs/output/commands/npm-deprecate.html @@ -141,9 +141,9 @@
-

+

npm-deprecate - @11.5.1 + @11.6.0

Deprecate a version of a package
diff --git a/deps/npm/docs/output/commands/npm-diff.html b/deps/npm/docs/output/commands/npm-diff.html index 5338e8dce63240..4fbaaa64035518 100644 --- a/deps/npm/docs/output/commands/npm-diff.html +++ b/deps/npm/docs/output/commands/npm-diff.html @@ -141,9 +141,9 @@
-

+

npm-diff - @11.5.1 + @11.6.0

The registry diff command
diff --git a/deps/npm/docs/output/commands/npm-dist-tag.html b/deps/npm/docs/output/commands/npm-dist-tag.html index 8d1be0ad219250..d332861a536aa5 100644 --- a/deps/npm/docs/output/commands/npm-dist-tag.html +++ b/deps/npm/docs/output/commands/npm-dist-tag.html @@ -141,9 +141,9 @@
-

+

npm-dist-tag - @11.5.1 + @11.6.0

Modify package distribution tags
diff --git a/deps/npm/docs/output/commands/npm-docs.html b/deps/npm/docs/output/commands/npm-docs.html index f84820654e8376..7d3d72c9693356 100644 --- a/deps/npm/docs/output/commands/npm-docs.html +++ b/deps/npm/docs/output/commands/npm-docs.html @@ -141,9 +141,9 @@
-

+

npm-docs - @11.5.1 + @11.6.0

Open documentation for a package in a web browser
diff --git a/deps/npm/docs/output/commands/npm-doctor.html b/deps/npm/docs/output/commands/npm-doctor.html index e4a646fd2dd41b..159dbd0852672d 100644 --- a/deps/npm/docs/output/commands/npm-doctor.html +++ b/deps/npm/docs/output/commands/npm-doctor.html @@ -141,9 +141,9 @@
-

+

npm-doctor - @11.5.1 + @11.6.0

Check the health of your npm environment
diff --git a/deps/npm/docs/output/commands/npm-edit.html b/deps/npm/docs/output/commands/npm-edit.html index a5feaa33665158..7ad19af79da767 100644 --- a/deps/npm/docs/output/commands/npm-edit.html +++ b/deps/npm/docs/output/commands/npm-edit.html @@ -141,9 +141,9 @@
-

+

npm-edit - @11.5.1 + @11.6.0

Edit an installed package
diff --git a/deps/npm/docs/output/commands/npm-exec.html b/deps/npm/docs/output/commands/npm-exec.html index e506c90124484e..cea8b6403f5f89 100644 --- a/deps/npm/docs/output/commands/npm-exec.html +++ b/deps/npm/docs/output/commands/npm-exec.html @@ -141,9 +141,9 @@
-

+

npm-exec - @11.5.1 + @11.6.0

Run a command from a local or remote npm package
diff --git a/deps/npm/docs/output/commands/npm-explain.html b/deps/npm/docs/output/commands/npm-explain.html index 1668633c10c662..bfb7ae24b153fe 100644 --- a/deps/npm/docs/output/commands/npm-explain.html +++ b/deps/npm/docs/output/commands/npm-explain.html @@ -141,9 +141,9 @@
-

+

npm-explain - @11.5.1 + @11.6.0

Explain installed packages
diff --git a/deps/npm/docs/output/commands/npm-explore.html b/deps/npm/docs/output/commands/npm-explore.html index 161b036a7fe775..a2e9644fa58a69 100644 --- a/deps/npm/docs/output/commands/npm-explore.html +++ b/deps/npm/docs/output/commands/npm-explore.html @@ -141,9 +141,9 @@
-

+

npm-explore - @11.5.1 + @11.6.0

Browse an installed package
diff --git a/deps/npm/docs/output/commands/npm-find-dupes.html b/deps/npm/docs/output/commands/npm-find-dupes.html index 2be0a026c69a3f..3cfffe82b7b87b 100644 --- a/deps/npm/docs/output/commands/npm-find-dupes.html +++ b/deps/npm/docs/output/commands/npm-find-dupes.html @@ -141,9 +141,9 @@
-

+

npm-find-dupes - @11.5.1 + @11.6.0

Find duplication in the package tree
diff --git a/deps/npm/docs/output/commands/npm-fund.html b/deps/npm/docs/output/commands/npm-fund.html index 3fc3e336da48c9..7ce5998a990879 100644 --- a/deps/npm/docs/output/commands/npm-fund.html +++ b/deps/npm/docs/output/commands/npm-fund.html @@ -141,9 +141,9 @@
-

+

npm-fund - @11.5.1 + @11.6.0

Retrieve funding information
diff --git a/deps/npm/docs/output/commands/npm-help-search.html b/deps/npm/docs/output/commands/npm-help-search.html index 3116f4120887e4..739be98ea53a7e 100644 --- a/deps/npm/docs/output/commands/npm-help-search.html +++ b/deps/npm/docs/output/commands/npm-help-search.html @@ -141,9 +141,9 @@
-

+

npm-help-search - @11.5.1 + @11.6.0

Search npm help documentation
diff --git a/deps/npm/docs/output/commands/npm-help.html b/deps/npm/docs/output/commands/npm-help.html index 4c217add7e3f8b..abad90cf954b1f 100644 --- a/deps/npm/docs/output/commands/npm-help.html +++ b/deps/npm/docs/output/commands/npm-help.html @@ -141,9 +141,9 @@
-

+

npm-help - @11.5.1 + @11.6.0

Get help on npm
diff --git a/deps/npm/docs/output/commands/npm-init.html b/deps/npm/docs/output/commands/npm-init.html index 28a3a4fb655d60..b599af959585ac 100644 --- a/deps/npm/docs/output/commands/npm-init.html +++ b/deps/npm/docs/output/commands/npm-init.html @@ -141,9 +141,9 @@
-

+

npm-init - @11.5.1 + @11.6.0

Create a package.json file
diff --git a/deps/npm/docs/output/commands/npm-install-ci-test.html b/deps/npm/docs/output/commands/npm-install-ci-test.html index 1f163982f65be9..d4a3997fd36444 100644 --- a/deps/npm/docs/output/commands/npm-install-ci-test.html +++ b/deps/npm/docs/output/commands/npm-install-ci-test.html @@ -141,9 +141,9 @@
-

+

npm-install-ci-test - @11.5.1 + @11.6.0

Install a project with a clean slate and run tests
diff --git a/deps/npm/docs/output/commands/npm-install-test.html b/deps/npm/docs/output/commands/npm-install-test.html index ed4baca4d8cb20..03b72dde78338b 100644 --- a/deps/npm/docs/output/commands/npm-install-test.html +++ b/deps/npm/docs/output/commands/npm-install-test.html @@ -141,9 +141,9 @@
-

+

npm-install-test - @11.5.1 + @11.6.0

Install package(s) and run tests
diff --git a/deps/npm/docs/output/commands/npm-install.html b/deps/npm/docs/output/commands/npm-install.html index 097b413d53ce53..fe31fbefd78dc2 100644 --- a/deps/npm/docs/output/commands/npm-install.html +++ b/deps/npm/docs/output/commands/npm-install.html @@ -141,9 +141,9 @@
-

+

npm-install - @11.5.1 + @11.6.0

Install a package
diff --git a/deps/npm/docs/output/commands/npm-link.html b/deps/npm/docs/output/commands/npm-link.html index 8c6963e10d038e..58df5e7aaff3bd 100644 --- a/deps/npm/docs/output/commands/npm-link.html +++ b/deps/npm/docs/output/commands/npm-link.html @@ -141,9 +141,9 @@
-

+

npm-link - @11.5.1 + @11.6.0

Symlink a package folder
diff --git a/deps/npm/docs/output/commands/npm-login.html b/deps/npm/docs/output/commands/npm-login.html index 44e2fe5b95c0d2..11593bc3c557d2 100644 --- a/deps/npm/docs/output/commands/npm-login.html +++ b/deps/npm/docs/output/commands/npm-login.html @@ -141,9 +141,9 @@
-

+

npm-login - @11.5.1 + @11.6.0

Login to a registry user account
diff --git a/deps/npm/docs/output/commands/npm-logout.html b/deps/npm/docs/output/commands/npm-logout.html index 90c426863f7e9b..7fad965c711c9f 100644 --- a/deps/npm/docs/output/commands/npm-logout.html +++ b/deps/npm/docs/output/commands/npm-logout.html @@ -141,9 +141,9 @@
-

+

npm-logout - @11.5.1 + @11.6.0

Log out of the registry
diff --git a/deps/npm/docs/output/commands/npm-ls.html b/deps/npm/docs/output/commands/npm-ls.html index 85221ba60a7e88..42fdcfb7f31ece 100644 --- a/deps/npm/docs/output/commands/npm-ls.html +++ b/deps/npm/docs/output/commands/npm-ls.html @@ -141,9 +141,9 @@
-npm-ls @11.5.1
+npm-ls @11.6.0

List installed packages
@@ -168,7 +168,7 @@

Description

the results to only the paths to the packages named. Note that nested packages will also show the paths to the specified packages. For example, running npm ls promzard in npm's source tree will show:

-npm@11.5.1 /path/to/npm
+npm@11.6.0 /path/to/npm
 └─┬ init-package-json@0.0.4
   └── promzard@0.1.5
 
diff --git a/deps/npm/docs/output/commands/npm-org.html b/deps/npm/docs/output/commands/npm-org.html index 51fc57e6b05776..c63af4e03dbbfa 100644 --- a/deps/npm/docs/output/commands/npm-org.html +++ b/deps/npm/docs/output/commands/npm-org.html @@ -141,9 +141,9 @@
-npm-org @11.5.1
+npm-org @11.6.0

Manage orgs
diff --git a/deps/npm/docs/output/commands/npm-outdated.html b/deps/npm/docs/output/commands/npm-outdated.html index 8f1c40aa72922c..30964a0a554773 100644 --- a/deps/npm/docs/output/commands/npm-outdated.html +++ b/deps/npm/docs/output/commands/npm-outdated.html @@ -141,9 +141,9 @@
-npm-outdated @11.5.1
+npm-outdated @11.6.0

Check for outdated packages
diff --git a/deps/npm/docs/output/commands/npm-owner.html b/deps/npm/docs/output/commands/npm-owner.html index ccc611dbf61a67..7c4605ba9c749c 100644 --- a/deps/npm/docs/output/commands/npm-owner.html +++ b/deps/npm/docs/output/commands/npm-owner.html @@ -141,9 +141,9 @@
-npm-owner @11.5.1
+npm-owner @11.6.0

Manage package owners
diff --git a/deps/npm/docs/output/commands/npm-pack.html b/deps/npm/docs/output/commands/npm-pack.html index 6a9165dfcd17ee..e2c477f7e47a99 100644 --- a/deps/npm/docs/output/commands/npm-pack.html +++ b/deps/npm/docs/output/commands/npm-pack.html @@ -141,9 +141,9 @@
-npm-pack @11.5.1
+npm-pack @11.6.0

Create a tarball from a package
diff --git a/deps/npm/docs/output/commands/npm-ping.html b/deps/npm/docs/output/commands/npm-ping.html index 5862980f080da4..9fc082a35d7672 100644 --- a/deps/npm/docs/output/commands/npm-ping.html +++ b/deps/npm/docs/output/commands/npm-ping.html @@ -141,9 +141,9 @@
-npm-ping @11.5.1
+npm-ping @11.6.0

Ping npm registry
diff --git a/deps/npm/docs/output/commands/npm-pkg.html b/deps/npm/docs/output/commands/npm-pkg.html index 3ea7c0658e1c50..ae9c9d3439805d 100644 --- a/deps/npm/docs/output/commands/npm-pkg.html +++ b/deps/npm/docs/output/commands/npm-pkg.html @@ -141,9 +141,9 @@
-npm-pkg @11.5.1
+npm-pkg @11.6.0

Manages your package.json
diff --git a/deps/npm/docs/output/commands/npm-prefix.html b/deps/npm/docs/output/commands/npm-prefix.html index ac6a195b142e18..74229cd63224bc 100644 --- a/deps/npm/docs/output/commands/npm-prefix.html +++ b/deps/npm/docs/output/commands/npm-prefix.html @@ -141,9 +141,9 @@
-npm-prefix @11.5.1
+npm-prefix @11.6.0

Display prefix
diff --git a/deps/npm/docs/output/commands/npm-profile.html b/deps/npm/docs/output/commands/npm-profile.html index 7ac7aaca433db5..fc1de44eb9172c 100644 --- a/deps/npm/docs/output/commands/npm-profile.html +++ b/deps/npm/docs/output/commands/npm-profile.html @@ -141,9 +141,9 @@
-npm-profile @11.5.1
+npm-profile @11.6.0

Change settings on your registry profile
diff --git a/deps/npm/docs/output/commands/npm-prune.html b/deps/npm/docs/output/commands/npm-prune.html index a1c56766648686..4796be5f8bca83 100644 --- a/deps/npm/docs/output/commands/npm-prune.html +++ b/deps/npm/docs/output/commands/npm-prune.html @@ -141,9 +141,9 @@
-npm-prune @11.5.1
+npm-prune @11.6.0

Remove extraneous packages
diff --git a/deps/npm/docs/output/commands/npm-publish.html b/deps/npm/docs/output/commands/npm-publish.html index f2a075fdc52661..3ec9644ce5a709 100644 --- a/deps/npm/docs/output/commands/npm-publish.html +++ b/deps/npm/docs/output/commands/npm-publish.html @@ -141,9 +141,9 @@
-npm-publish @11.5.1
+npm-publish @11.6.0

Publish a package
diff --git a/deps/npm/docs/output/commands/npm-query.html b/deps/npm/docs/output/commands/npm-query.html index a196efe399d3b9..a1f7287a5e6fcd 100644 --- a/deps/npm/docs/output/commands/npm-query.html +++ b/deps/npm/docs/output/commands/npm-query.html @@ -141,9 +141,9 @@
-npm-query @11.5.1
+npm-query @11.6.0

Dependency selector query
diff --git a/deps/npm/docs/output/commands/npm-rebuild.html b/deps/npm/docs/output/commands/npm-rebuild.html index 3d2ad013a064f9..75e45aef86ef26 100644 --- a/deps/npm/docs/output/commands/npm-rebuild.html +++ b/deps/npm/docs/output/commands/npm-rebuild.html @@ -141,9 +141,9 @@
-npm-rebuild @11.5.1
+npm-rebuild @11.6.0

Rebuild a package
diff --git a/deps/npm/docs/output/commands/npm-repo.html b/deps/npm/docs/output/commands/npm-repo.html index 0ea012c5205adc..4fd54856948978 100644 --- a/deps/npm/docs/output/commands/npm-repo.html +++ b/deps/npm/docs/output/commands/npm-repo.html @@ -141,9 +141,9 @@
-npm-repo @11.5.1
+npm-repo @11.6.0

Open package repository page in the browser
diff --git a/deps/npm/docs/output/commands/npm-restart.html b/deps/npm/docs/output/commands/npm-restart.html index eaac195de4bae8..1b760bed85f9b2 100644 --- a/deps/npm/docs/output/commands/npm-restart.html +++ b/deps/npm/docs/output/commands/npm-restart.html @@ -141,9 +141,9 @@
-npm-restart @11.5.1
+npm-restart @11.6.0

Restart a package
diff --git a/deps/npm/docs/output/commands/npm-root.html b/deps/npm/docs/output/commands/npm-root.html index 7b6093cda28e0b..b8e84524154d3d 100644 --- a/deps/npm/docs/output/commands/npm-root.html +++ b/deps/npm/docs/output/commands/npm-root.html @@ -141,9 +141,9 @@
-npm-root @11.5.1
+npm-root @11.6.0

Display npm root
diff --git a/deps/npm/docs/output/commands/npm-run.html b/deps/npm/docs/output/commands/npm-run.html index 018d5ed6bda120..092fe83ebb7855 100644 --- a/deps/npm/docs/output/commands/npm-run.html +++ b/deps/npm/docs/output/commands/npm-run.html @@ -141,9 +141,9 @@
-npm-run @11.5.1
+npm-run @11.6.0

Run arbitrary package scripts
diff --git a/deps/npm/docs/output/commands/npm-sbom.html b/deps/npm/docs/output/commands/npm-sbom.html index a3e44f0ec23df5..7cbaab52174349 100644 --- a/deps/npm/docs/output/commands/npm-sbom.html +++ b/deps/npm/docs/output/commands/npm-sbom.html @@ -141,9 +141,9 @@
-npm-sbom @11.5.1
+npm-sbom @11.6.0

Generate a Software Bill of Materials (SBOM)
diff --git a/deps/npm/docs/output/commands/npm-search.html b/deps/npm/docs/output/commands/npm-search.html index 1dbc2fcd74042c..a63b14047d174c 100644 --- a/deps/npm/docs/output/commands/npm-search.html +++ b/deps/npm/docs/output/commands/npm-search.html @@ -141,9 +141,9 @@
-npm-search @11.5.1
+npm-search @11.6.0

Search for packages
diff --git a/deps/npm/docs/output/commands/npm-shrinkwrap.html b/deps/npm/docs/output/commands/npm-shrinkwrap.html index 88c25fa7be9231..f78d98eaf5db74 100644 --- a/deps/npm/docs/output/commands/npm-shrinkwrap.html +++ b/deps/npm/docs/output/commands/npm-shrinkwrap.html @@ -141,9 +141,9 @@
-npm-shrinkwrap @11.5.1
+npm-shrinkwrap @11.6.0

Lock down dependency versions for publication
diff --git a/deps/npm/docs/output/commands/npm-star.html b/deps/npm/docs/output/commands/npm-star.html index 15c1362721b666..8c0b45c5f15587 100644 --- a/deps/npm/docs/output/commands/npm-star.html +++ b/deps/npm/docs/output/commands/npm-star.html @@ -141,9 +141,9 @@
-npm-star @11.5.1
+npm-star @11.6.0

Mark your favorite packages
diff --git a/deps/npm/docs/output/commands/npm-stars.html b/deps/npm/docs/output/commands/npm-stars.html index d7fdd21457910f..40b8be5aa37b25 100644 --- a/deps/npm/docs/output/commands/npm-stars.html +++ b/deps/npm/docs/output/commands/npm-stars.html @@ -141,9 +141,9 @@
-npm-stars @11.5.1
+npm-stars @11.6.0

View packages marked as favorites
diff --git a/deps/npm/docs/output/commands/npm-start.html b/deps/npm/docs/output/commands/npm-start.html index 4c9be28e94301c..e70474c7725ec4 100644 --- a/deps/npm/docs/output/commands/npm-start.html +++ b/deps/npm/docs/output/commands/npm-start.html @@ -141,9 +141,9 @@
-npm-start @11.5.1
+npm-start @11.6.0

Start a package
diff --git a/deps/npm/docs/output/commands/npm-stop.html b/deps/npm/docs/output/commands/npm-stop.html index 4ee13d0edceb20..58c85f41aecb10 100644 --- a/deps/npm/docs/output/commands/npm-stop.html +++ b/deps/npm/docs/output/commands/npm-stop.html @@ -141,9 +141,9 @@
-npm-stop @11.5.1
+npm-stop @11.6.0

Stop a package
diff --git a/deps/npm/docs/output/commands/npm-team.html b/deps/npm/docs/output/commands/npm-team.html index afb95a9c50a94e..d4ade5c1810d5b 100644 --- a/deps/npm/docs/output/commands/npm-team.html +++ b/deps/npm/docs/output/commands/npm-team.html @@ -141,9 +141,9 @@
-npm-team @11.5.1
+npm-team @11.6.0

Manage organization teams and team memberships
diff --git a/deps/npm/docs/output/commands/npm-test.html b/deps/npm/docs/output/commands/npm-test.html index 181c14e1011a70..66e8047c79551e 100644 --- a/deps/npm/docs/output/commands/npm-test.html +++ b/deps/npm/docs/output/commands/npm-test.html @@ -141,9 +141,9 @@
-npm-test @11.5.1
+npm-test @11.6.0

Test a package
diff --git a/deps/npm/docs/output/commands/npm-token.html b/deps/npm/docs/output/commands/npm-token.html index e493aba66e7b5b..324c436c5f9505 100644 --- a/deps/npm/docs/output/commands/npm-token.html +++ b/deps/npm/docs/output/commands/npm-token.html @@ -141,9 +141,9 @@
-npm-token @11.5.1
+npm-token @11.6.0

Manage your authentication tokens
diff --git a/deps/npm/docs/output/commands/npm-undeprecate.html b/deps/npm/docs/output/commands/npm-undeprecate.html index bff4c3722d20f1..088507a4ae6699 100644 --- a/deps/npm/docs/output/commands/npm-undeprecate.html +++ b/deps/npm/docs/output/commands/npm-undeprecate.html @@ -141,9 +141,9 @@
-npm-undeprecate @11.5.1
+npm-undeprecate @11.6.0

Undeprecate a version of a package
diff --git a/deps/npm/docs/output/commands/npm-uninstall.html b/deps/npm/docs/output/commands/npm-uninstall.html index 1a6bdfc68a8ae7..043adb00ac9fd3 100644 --- a/deps/npm/docs/output/commands/npm-uninstall.html +++ b/deps/npm/docs/output/commands/npm-uninstall.html @@ -141,9 +141,9 @@
-npm-uninstall @11.5.1
+npm-uninstall @11.6.0

Remove a package
diff --git a/deps/npm/docs/output/commands/npm-unpublish.html b/deps/npm/docs/output/commands/npm-unpublish.html index 806f6ff9c1c8a1..ba5ec3630407db 100644 --- a/deps/npm/docs/output/commands/npm-unpublish.html +++ b/deps/npm/docs/output/commands/npm-unpublish.html @@ -141,9 +141,9 @@
-npm-unpublish @11.5.1
+npm-unpublish @11.6.0

Remove a package from the registry
diff --git a/deps/npm/docs/output/commands/npm-unstar.html b/deps/npm/docs/output/commands/npm-unstar.html index ed467c5e617aee..5f5bae1545df8a 100644 --- a/deps/npm/docs/output/commands/npm-unstar.html +++ b/deps/npm/docs/output/commands/npm-unstar.html @@ -141,9 +141,9 @@
-npm-unstar @11.5.1
+npm-unstar @11.6.0

Remove an item from your favorite packages
diff --git a/deps/npm/docs/output/commands/npm-update.html b/deps/npm/docs/output/commands/npm-update.html index 44a28c7538abf6..d3bb32456efcb6 100644 --- a/deps/npm/docs/output/commands/npm-update.html +++ b/deps/npm/docs/output/commands/npm-update.html @@ -141,9 +141,9 @@
-npm-update @11.5.1
+npm-update @11.6.0

Update packages
diff --git a/deps/npm/docs/output/commands/npm-version.html b/deps/npm/docs/output/commands/npm-version.html index b521405869cba3..8927c0a3d51801 100644 --- a/deps/npm/docs/output/commands/npm-version.html +++ b/deps/npm/docs/output/commands/npm-version.html @@ -141,9 +141,9 @@
-npm-version @11.5.1
+npm-version @11.6.0

Bump a package version
diff --git a/deps/npm/docs/output/commands/npm-view.html b/deps/npm/docs/output/commands/npm-view.html index eec33cff83e10e..b0f8fce4a9b809 100644 --- a/deps/npm/docs/output/commands/npm-view.html +++ b/deps/npm/docs/output/commands/npm-view.html @@ -141,9 +141,9 @@
-npm-view @11.5.1
+npm-view @11.6.0

View registry info
diff --git a/deps/npm/docs/output/commands/npm-whoami.html b/deps/npm/docs/output/commands/npm-whoami.html index 5612f62241450a..e3066809e076e6 100644 --- a/deps/npm/docs/output/commands/npm-whoami.html +++ b/deps/npm/docs/output/commands/npm-whoami.html @@ -141,9 +141,9 @@
-npm-whoami @11.5.1
+npm-whoami @11.6.0

Display npm username
diff --git a/deps/npm/docs/output/commands/npm.html b/deps/npm/docs/output/commands/npm.html index 7776359185350c..ffc364b516cdc6 100644 --- a/deps/npm/docs/output/commands/npm.html +++ b/deps/npm/docs/output/commands/npm.html @@ -141,9 +141,9 @@
-npm @11.5.1
+npm @11.6.0

javascript package manager
@@ -158,7 +158,7 @@

Table of contents

Note: This command is unaware of workspaces.

Version

-11.5.1
+11.6.0

Description

npm is the package manager for the Node JavaScript platform. It puts modules in place so that node can find them, and manages dependency conflicts intelligently.
diff --git a/deps/npm/docs/output/commands/npx.html b/deps/npm/docs/output/commands/npx.html index 9d393a2b498cf3..be136f3519185c 100644 --- a/deps/npm/docs/output/commands/npx.html +++ b/deps/npm/docs/output/commands/npx.html @@ -141,9 +141,9 @@

-npx @11.5.1
+npx @11.6.0

Run a command from a local or remote npm package
diff --git a/deps/npm/docs/output/configuring-npm/folders.html b/deps/npm/docs/output/configuring-npm/folders.html index 7619de3198aafc..7affb09c1e17dc 100644 --- a/deps/npm/docs/output/configuring-npm/folders.html +++ b/deps/npm/docs/output/configuring-npm/folders.html @@ -141,9 +141,9 @@
-folders @11.5.1
+folders @11.6.0

Folder Structures Used by npm
diff --git a/deps/npm/docs/output/configuring-npm/install.html b/deps/npm/docs/output/configuring-npm/install.html index ebeb2a6ba06b46..b331947e777875 100644 --- a/deps/npm/docs/output/configuring-npm/install.html +++ b/deps/npm/docs/output/configuring-npm/install.html @@ -141,9 +141,9 @@
-install @11.5.1
+install @11.6.0

Download and install node and npm
diff --git a/deps/npm/docs/output/configuring-npm/npm-global.html b/deps/npm/docs/output/configuring-npm/npm-global.html index 7619de3198aafc..7affb09c1e17dc 100644 --- a/deps/npm/docs/output/configuring-npm/npm-global.html +++ b/deps/npm/docs/output/configuring-npm/npm-global.html @@ -141,9 +141,9 @@
-folders @11.5.1
+folders @11.6.0

Folder Structures Used by npm
diff --git a/deps/npm/docs/output/configuring-npm/npm-json.html b/deps/npm/docs/output/configuring-npm/npm-json.html index 899eddd3373689..d6bbb8e2d10fe4 100644 --- a/deps/npm/docs/output/configuring-npm/npm-json.html +++ b/deps/npm/docs/output/configuring-npm/npm-json.html @@ -141,9 +141,9 @@
-package.json @11.5.1
+package.json @11.6.0

Specifics of npm's package.json handling
diff --git a/deps/npm/docs/output/configuring-npm/npm-shrinkwrap-json.html b/deps/npm/docs/output/configuring-npm/npm-shrinkwrap-json.html index 339b2865962291..cae31ba8b02fa5 100644 --- a/deps/npm/docs/output/configuring-npm/npm-shrinkwrap-json.html +++ b/deps/npm/docs/output/configuring-npm/npm-shrinkwrap-json.html @@ -141,9 +141,9 @@
-npm-shrinkwrap.json @11.5.1
+npm-shrinkwrap.json @11.6.0

A publishable lockfile
diff --git a/deps/npm/docs/output/configuring-npm/npmrc.html b/deps/npm/docs/output/configuring-npm/npmrc.html index 555546c0359187..5eca1206c6e13b 100644 --- a/deps/npm/docs/output/configuring-npm/npmrc.html +++ b/deps/npm/docs/output/configuring-npm/npmrc.html @@ -141,9 +141,9 @@
-npmrc @11.5.1
+npmrc @11.6.0

The npm config files
@@ -169,9 +169,12 @@

Files

  • npm builtin config file (/path/to/npm/npmrc)
• All npm config files are an ini-formatted list of key = value parameters.
-Environment variables can be replaced using ${VARIABLE_NAME}. For
+Environment variables can be replaced using ${VARIABLE_NAME}. By default,
+if the variable is not defined, it is left unreplaced. By adding ? after the
+variable name, it can be forced to evaluate to an empty string instead. For
 example:

    cache = ${HOME}/.npm-packages
    +node-options = "${NODE_OPTIONS?} --use-system-ca"
     

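The ? form changes only what happens when the variable is undefined. A minimal JavaScript sketch of the substitution semantics just described (hand-rolled for illustration, not npm's actual env-replace implementation, which also handles escaping):

  // ${VAR} is left as-is when VAR is undefined; ${VAR?} falls back to ''.
  const replaceEnv = (str, env) =>
    str.replace(/\$\{([^${}?]+)(\?)?\}/g, (orig, name, optional) => {
      if (env[name] !== undefined) return env[name]
      return optional ? '' : orig
    })

  replaceEnv('cache = ${HOME}/.npm-packages', { HOME: '/home/u' })
  // -> 'cache = /home/u/.npm-packages'
  replaceEnv('node-options = "${NODE_OPTIONS?} --use-system-ca"', {})
  // -> 'node-options = " --use-system-ca"'

This is what makes the node-options line above safe on machines where NODE_OPTIONS is not set: the reference collapses to an empty string instead of leaving a literal ${NODE_OPTIONS} in the option string.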
Each of these files is loaded, and config options are resolved in priority order. For example, a setting in the userconfig file would override the setting in the globalconfig file.
diff --git a/deps/npm/docs/output/configuring-npm/package-json.html b/deps/npm/docs/output/configuring-npm/package-json.html index 899eddd3373689..d6bbb8e2d10fe4 100644 --- a/deps/npm/docs/output/configuring-npm/package-json.html +++ b/deps/npm/docs/output/configuring-npm/package-json.html @@ -141,9 +141,9 @@

-package.json @11.5.1
+package.json @11.6.0

    Specifics of npm's package.json handling
    diff --git a/deps/npm/docs/output/configuring-npm/package-lock-json.html b/deps/npm/docs/output/configuring-npm/package-lock-json.html index 29cf287806c8e3..a148128dc2af99 100644 --- a/deps/npm/docs/output/configuring-npm/package-lock-json.html +++ b/deps/npm/docs/output/configuring-npm/package-lock-json.html @@ -141,9 +141,9 @@
-package-lock.json @11.5.1
+package-lock.json @11.6.0

    A manifestation of the manifest
    diff --git a/deps/npm/docs/output/using-npm/config.html b/deps/npm/docs/output/using-npm/config.html index 6fbf5de5120c88..38e66508c0dce6 100644 --- a/deps/npm/docs/output/using-npm/config.html +++ b/deps/npm/docs/output/using-npm/config.html @@ -141,9 +141,9 @@
-config @11.5.1
+config @11.6.0

    More than you probably want to know about npm configuration
    diff --git a/deps/npm/docs/output/using-npm/dependency-selectors.html b/deps/npm/docs/output/using-npm/dependency-selectors.html index 82f000eec57a94..18e4bf6acd9c61 100644 --- a/deps/npm/docs/output/using-npm/dependency-selectors.html +++ b/deps/npm/docs/output/using-npm/dependency-selectors.html @@ -141,9 +141,9 @@
-Dependency Selector Syntax & Querying @11.5.1
+Dependency Selector Syntax & Querying @11.6.0

    Dependency Selector Syntax & Querying
    diff --git a/deps/npm/docs/output/using-npm/developers.html b/deps/npm/docs/output/using-npm/developers.html index 815cdd863efa89..6c63eeb468cb3a 100644 --- a/deps/npm/docs/output/using-npm/developers.html +++ b/deps/npm/docs/output/using-npm/developers.html @@ -141,9 +141,9 @@
-developers @11.5.1
+developers @11.6.0

    Developer Guide
    diff --git a/deps/npm/docs/output/using-npm/logging.html b/deps/npm/docs/output/using-npm/logging.html index eab52de4c8ffcb..ec048a9f8bcbe4 100644 --- a/deps/npm/docs/output/using-npm/logging.html +++ b/deps/npm/docs/output/using-npm/logging.html @@ -141,9 +141,9 @@
-Logging @11.5.1
+Logging @11.6.0

    Why, What & How We Log
    diff --git a/deps/npm/docs/output/using-npm/orgs.html b/deps/npm/docs/output/using-npm/orgs.html index 40b46e2d71efad..21a778a2040326 100644 --- a/deps/npm/docs/output/using-npm/orgs.html +++ b/deps/npm/docs/output/using-npm/orgs.html @@ -141,9 +141,9 @@
-orgs @11.5.1
+orgs @11.6.0

    Working with Teams & Orgs
    diff --git a/deps/npm/docs/output/using-npm/package-spec.html b/deps/npm/docs/output/using-npm/package-spec.html index a223f97e98ec81..70369d48a3bf0c 100644 --- a/deps/npm/docs/output/using-npm/package-spec.html +++ b/deps/npm/docs/output/using-npm/package-spec.html @@ -141,9 +141,9 @@
-package-spec @11.5.1
+package-spec @11.6.0

    Package name specifier
    diff --git a/deps/npm/docs/output/using-npm/registry.html b/deps/npm/docs/output/using-npm/registry.html index 8b3b25394d656e..3b88eb3660d70a 100644 --- a/deps/npm/docs/output/using-npm/registry.html +++ b/deps/npm/docs/output/using-npm/registry.html @@ -141,9 +141,9 @@
-registry @11.5.1
+registry @11.6.0

    The JavaScript Package Registry
    diff --git a/deps/npm/docs/output/using-npm/removal.html b/deps/npm/docs/output/using-npm/removal.html index 766d293de99079..df419cbf9bca91 100644 --- a/deps/npm/docs/output/using-npm/removal.html +++ b/deps/npm/docs/output/using-npm/removal.html @@ -141,9 +141,9 @@
-removal @11.5.1
+removal @11.6.0

    Cleaning the Slate
    diff --git a/deps/npm/docs/output/using-npm/scope.html b/deps/npm/docs/output/using-npm/scope.html index adcc7badd9c8fc..4304c4ff21ea59 100644 --- a/deps/npm/docs/output/using-npm/scope.html +++ b/deps/npm/docs/output/using-npm/scope.html @@ -141,9 +141,9 @@
-scope @11.5.1
+scope @11.6.0

    Scoped packages
    diff --git a/deps/npm/docs/output/using-npm/scripts.html b/deps/npm/docs/output/using-npm/scripts.html index 61427ad5a18a98..6b15a205e6d8c0 100644 --- a/deps/npm/docs/output/using-npm/scripts.html +++ b/deps/npm/docs/output/using-npm/scripts.html @@ -141,9 +141,9 @@
-scripts @11.5.1
+scripts @11.6.0

    How npm handles the "scripts" field
    diff --git a/deps/npm/docs/output/using-npm/workspaces.html b/deps/npm/docs/output/using-npm/workspaces.html index 7c3f6c391c809c..3d0a9d4b05b9eb 100644 --- a/deps/npm/docs/output/using-npm/workspaces.html +++ b/deps/npm/docs/output/using-npm/workspaces.html @@ -141,9 +141,9 @@
-workspaces @11.5.1
+workspaces @11.6.0

    Working with workspaces
diff --git a/deps/npm/lib/cli/exit-handler.js b/deps/npm/lib/cli/exit-handler.js
index efb09138aec282..e76b08c80a635b 100644
--- a/deps/npm/lib/cli/exit-handler.js
+++ b/deps/npm/lib/cli/exit-handler.js
@@ -43,16 +43,6 @@ class ExitHandler {
   registerUncaughtHandlers () {
     this.#process.on('uncaughtException', this.#handleExit)
     this.#process.on('unhandledRejection', this.#handleExit)
-
-    // Handle signals that might bypass normal exit flow
-    // These signals can cause the process to exit without calling the exit handler
-    const signalsToHandle = ['SIGTERM', 'SIGINT', 'SIGHUP']
-    for (const signal of signalsToHandle) {
-      this.#process.on(signal, () => {
-        // Call the exit handler to ensure proper cleanup
-        this.#handleExit(new Error(`Process received ${signal}`))
-      })
-    }
   }

   exit (err) {
@@ -67,17 +57,6 @@ class ExitHandler {
     this.#process.off('exit', this.#handleProcesExitAndReset)
     this.#process.off('uncaughtException', this.#handleExit)
     this.#process.off('unhandledRejection', this.#handleExit)
-
-    const signalsToCleanup = ['SIGTERM', 'SIGINT', 'SIGHUP']
-    for (const signal of signalsToCleanup) {
-      try {
-        this.#process.off(signal, this.#handleExit)
-      } catch (err) {
-        // Ignore errors during cleanup - this is defensive programming for edge cases
-        // where the process object might be in an unexpected state during shutdown
-      }
-    }
-
     if (this.#loaded) {
       this.#npm.unload()
     }
diff --git a/deps/npm/lib/utils/format.js b/deps/npm/lib/utils/format.js
index aaecfe1ba0e7a7..9216c7918678ac 100644
--- a/deps/npm/lib/utils/format.js
+++ b/deps/npm/lib/utils/format.js
@@ -1,4 +1,7 @@
+// All logging goes through here, both to console and log files
+
 const { formatWithOptions: baseFormatWithOptions } = require('node:util')
+const { redactLog } = require('@npmcli/redact')

 // These are most assuredly not a mistake
 // https://eslint.org/docs/latest/rules/no-control-regex
@@ -40,7 +43,7 @@ function STRIP_C01 (str) {
 const formatWithOptions = ({ prefix: prefixes = [], eol = '\n', ...options }, ...args) => {
   const prefix = prefixes.filter(p => p != null).join(' ')
-  const formatted = STRIP_C01(baseFormatWithOptions(options, ...args))
+  const formatted = redactLog(STRIP_C01(baseFormatWithOptions(options, ...args)))
   // Splitting could be changed to only `\n` once we are sure we only emit unix newlines.
   // The eol param to this function will put the correct newlines in place for the returned string.
   const lines = formatted.split(/\r?\n/)
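The format.js hunk above means every formatted line now passes through a redaction step before it reaches the console or a log file. A rough sketch of that pipeline shape, with an illustrative stand-in for @npmcli/redact's redactLog (the real patterns it masks are more extensive):

  const { formatWithOptions } = require('node:util')

  // Stand-in redaction: mask npm tokens and basic-auth credentials in URLs.
  const redactLog = (str) => str
    .replace(/\bnpm_[A-Za-z0-9]{36}\b/g, 'npm_***')
    .replace(/(https?:\/\/)[^@/]+@/g, '$1***@')

  const format = (...args) => redactLog(formatWithOptions({}, ...args))

  console.log(format('fetch %s', 'https://user:pass@registry.example/pkg'))
  // -> fetch https://***@registry.example/pkg

Redacting at this single choke point, after formatting, catches secrets no matter which argument they arrived in.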
diff --git a/deps/npm/lib/utils/oidc.js b/deps/npm/lib/utils/oidc.js
index 859d596243433f..24524f4b4bf72d 100644
--- a/deps/npm/lib/utils/oidc.js
+++ b/deps/npm/lib/utils/oidc.js
@@ -3,6 +3,7 @@ const npmFetch = require('npm-registry-fetch')
 const ciInfo = require('ci-info')
 const fetch = require('make-fetch-happen')
 const npa = require('npm-package-arg')
+const libaccess = require('libnpmaccess')

 /**
  * Handles OpenID Connect (OIDC) token retrieval and exchange for CI environments.
@@ -108,31 +109,6 @@ async function oidc ({ packageName, registry, opts, config }) {
     return undefined
   }

-  // this checks if the user configured provenance or it's the default unset value
-  const isDefaultProvenance = config.isDefault('provenance')
-  const provenanceIntent = config.get('provenance')
-  let enableProvenance = false
-
-  // if provenance is the default value or the user explicitly set it
-  if (isDefaultProvenance || provenanceIntent) {
-    const [headerB64, payloadB64] = idToken.split('.')
-    if (headerB64 && payloadB64) {
-      const payloadJson = Buffer.from(payloadB64, 'base64').toString('utf8')
-      try {
-        const payload = JSON.parse(payloadJson)
-        if (ciInfo.GITHUB_ACTIONS && payload.repository_visibility === 'public') {
-          enableProvenance = true
-        }
-        // only set provenance for gitlab if SIGSTORE_ID_TOKEN is available
-        if (ciInfo.GITLAB && payload.project_visibility === 'public' && process.env.SIGSTORE_ID_TOKEN) {
-          enableProvenance = true
-        }
-      } catch (e) {
-        // Failed to parse idToken payload as JSON
-      }
-    }
-  }
-
   const parsedRegistry = new URL(registry)
   const regKey = `//${parsedRegistry.host}${parsedRegistry.pathname}`
   const authTokenKey = `${regKey}:_authToken`
@@ -155,12 +131,6 @@ async function oidc ({ packageName, registry, opts, config }) {
       return undefined
     }

-    if (enableProvenance) {
-      // Repository is public, setting provenance
-      opts.provenance = true
-      config.set('provenance', true, 'user')
-    }
-
     /*
      * The "opts" object is a clone of npm.flatOptions and is passed through the `publish` command,
      * eventually reaching `otplease`. To ensure the token is accessible during the publishing process,
@@ -170,6 +140,31 @@ async function oidc ({ packageName, registry, opts, config }) {
     opts[authTokenKey] = response.token
     config.set(authTokenKey, response.token, 'user')
     log.verbose('oidc', `Successfully retrieved and set token`)
+
+    try {
+      const isDefaultProvenance = config.isDefault('provenance')
+      if (isDefaultProvenance) {
+        const [headerB64, payloadB64] = idToken.split('.')
+        if (headerB64 && payloadB64) {
+          const payloadJson = Buffer.from(payloadB64, 'base64').toString('utf8')
+          const payload = JSON.parse(payloadJson)
+          if (
+            (ciInfo.GITHUB_ACTIONS && payload.repository_visibility === 'public') ||
+            // only set provenance for gitlab if the repo is public and SIGSTORE_ID_TOKEN is available
+            (ciInfo.GITLAB && payload.project_visibility === 'public' && process.env.SIGSTORE_ID_TOKEN)
+          ) {
+            const visibility = await libaccess.getVisibility(packageName, opts)
+            if (visibility?.public) {
+              log.verbose('oidc', `Enabling provenance`)
+              opts.provenance = true
+              config.set('provenance', true, 'user')
+            }
+          }
+        }
+      }
+    } catch (error) {
+      log.verbose('oidc', `Failed to set provenance with message: ${error?.message || 'Unknown error'}`)
+    }
   } catch (error) {
     log.verbose('oidc', `Failure with message: ${error?.message || 'Unknown error'}`)
   }
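The relocated provenance logic above now runs only after a successful token exchange, and it adds a second gate: besides the repository being public (read from the ID token claims), the package itself must be public according to libnpmaccess. The claim check rests on the fact that a JWT payload is just base64-encoded JSON; a self-contained sketch (the claim name follows the GitHub Actions case above):

  // Decode the middle segment of a JWT without verifying the signature.
  // That is fine here: the token is only inspected, not trusted for auth.
  const decodeJwtPayload = (idToken) => {
    const [, payloadB64] = idToken.split('.')
    if (!payloadB64) return null
    return JSON.parse(Buffer.from(payloadB64, 'base64').toString('utf8'))
  }

  // Fake token so the sketch runs standalone.
  const payloadB64 = Buffer.from(JSON.stringify({ repository_visibility: 'public' })).toString('base64')
  console.log(decodeJwtPayload(`x.${payloadB64}.y`).repository_visibility)
  // -> public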
"NPM@11.5.1" "" +.TH "NPM-ADDUSER" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-adduser\fR - Add a registry user account .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-audit.1 b/deps/npm/man/man1/npm-audit.1 index ee55dd1025340f..f3b272769ef0e6 100644 --- a/deps/npm/man/man1/npm-audit.1 +++ b/deps/npm/man/man1/npm-audit.1 @@ -1,4 +1,4 @@ -.TH "NPM-AUDIT" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-AUDIT" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-audit\fR - Run a security audit .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-bugs.1 b/deps/npm/man/man1/npm-bugs.1 index 7999999069e711..c723ac32275a42 100644 --- a/deps/npm/man/man1/npm-bugs.1 +++ b/deps/npm/man/man1/npm-bugs.1 @@ -1,4 +1,4 @@ -.TH "NPM-BUGS" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-BUGS" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-bugs\fR - Report bugs for a package in a web browser .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-cache.1 b/deps/npm/man/man1/npm-cache.1 index 65ffbed6067093..6fa76be9d108b2 100644 --- a/deps/npm/man/man1/npm-cache.1 +++ b/deps/npm/man/man1/npm-cache.1 @@ -1,4 +1,4 @@ -.TH "NPM-CACHE" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-CACHE" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-cache\fR - Manipulates packages cache .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-ci.1 b/deps/npm/man/man1/npm-ci.1 index d2aca2fb64b30b..b5dcd8e778f8d6 100644 --- a/deps/npm/man/man1/npm-ci.1 +++ b/deps/npm/man/man1/npm-ci.1 @@ -1,4 +1,4 @@ -.TH "NPM-CI" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-CI" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-ci\fR - Clean install a project .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-completion.1 b/deps/npm/man/man1/npm-completion.1 index d58c21cf2d07c0..95363c9b4cdb2b 100644 --- a/deps/npm/man/man1/npm-completion.1 +++ b/deps/npm/man/man1/npm-completion.1 @@ -1,4 +1,4 @@ -.TH "NPM-COMPLETION" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-COMPLETION" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-completion\fR - Tab Completion for npm .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-config.1 b/deps/npm/man/man1/npm-config.1 index 67a88bef0641d5..899be37170d346 100644 --- a/deps/npm/man/man1/npm-config.1 +++ b/deps/npm/man/man1/npm-config.1 @@ -1,4 +1,4 @@ -.TH "NPM-CONFIG" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-CONFIG" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-config\fR - Manage the npm configuration files .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-dedupe.1 b/deps/npm/man/man1/npm-dedupe.1 index f1b26e49581a01..6ff1d61304c70f 100644 --- a/deps/npm/man/man1/npm-dedupe.1 +++ b/deps/npm/man/man1/npm-dedupe.1 @@ -1,4 +1,4 @@ -.TH "NPM-DEDUPE" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-DEDUPE" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-dedupe\fR - Reduce duplication in the package tree .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-deprecate.1 b/deps/npm/man/man1/npm-deprecate.1 index 0ce359a24088e0..ff2b04d24f5795 100644 --- a/deps/npm/man/man1/npm-deprecate.1 +++ b/deps/npm/man/man1/npm-deprecate.1 @@ -1,4 +1,4 @@ -.TH "NPM-DEPRECATE" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-DEPRECATE" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-deprecate\fR - Deprecate a version of a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-diff.1 b/deps/npm/man/man1/npm-diff.1 index 3b3a6df1fe0280..88874ef44c17f0 100644 --- a/deps/npm/man/man1/npm-diff.1 +++ b/deps/npm/man/man1/npm-diff.1 @@ -1,4 +1,4 @@ -.TH "NPM-DIFF" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-DIFF" "1" "September 
2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-diff\fR - The registry diff command .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-dist-tag.1 b/deps/npm/man/man1/npm-dist-tag.1 index 73a9dc901d7159..4b1eb615a009ab 100644 --- a/deps/npm/man/man1/npm-dist-tag.1 +++ b/deps/npm/man/man1/npm-dist-tag.1 @@ -1,4 +1,4 @@ -.TH "NPM-DIST-TAG" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-DIST-TAG" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-dist-tag\fR - Modify package distribution tags .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-docs.1 b/deps/npm/man/man1/npm-docs.1 index d4cf707da810db..f6fe1e7c3c3695 100644 --- a/deps/npm/man/man1/npm-docs.1 +++ b/deps/npm/man/man1/npm-docs.1 @@ -1,4 +1,4 @@ -.TH "NPM-DOCS" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-DOCS" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-docs\fR - Open documentation for a package in a web browser .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-doctor.1 b/deps/npm/man/man1/npm-doctor.1 index f850d1783a3f90..d9173c84ecf5e4 100644 --- a/deps/npm/man/man1/npm-doctor.1 +++ b/deps/npm/man/man1/npm-doctor.1 @@ -1,4 +1,4 @@ -.TH "NPM-DOCTOR" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-DOCTOR" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-doctor\fR - Check the health of your npm environment .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-edit.1 b/deps/npm/man/man1/npm-edit.1 index eaf818feab014e..905da8480a8616 100644 --- a/deps/npm/man/man1/npm-edit.1 +++ b/deps/npm/man/man1/npm-edit.1 @@ -1,4 +1,4 @@ -.TH "NPM-EDIT" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-EDIT" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-edit\fR - Edit an installed package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-exec.1 b/deps/npm/man/man1/npm-exec.1 index 72a408edebac22..d0abe0ee1ba752 100644 --- a/deps/npm/man/man1/npm-exec.1 +++ b/deps/npm/man/man1/npm-exec.1 @@ -1,4 +1,4 @@ -.TH "NPM-EXEC" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-EXEC" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-exec\fR - Run a command from a local or remote npm package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-explain.1 b/deps/npm/man/man1/npm-explain.1 index 05eca8209d08b2..f97f7c5d93f38c 100644 --- a/deps/npm/man/man1/npm-explain.1 +++ b/deps/npm/man/man1/npm-explain.1 @@ -1,4 +1,4 @@ -.TH "NPM-EXPLAIN" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-EXPLAIN" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-explain\fR - Explain installed packages .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-explore.1 b/deps/npm/man/man1/npm-explore.1 index 2579b707ecbc35..186e59faf40229 100644 --- a/deps/npm/man/man1/npm-explore.1 +++ b/deps/npm/man/man1/npm-explore.1 @@ -1,4 +1,4 @@ -.TH "NPM-EXPLORE" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-EXPLORE" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-explore\fR - Browse an installed package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-find-dupes.1 b/deps/npm/man/man1/npm-find-dupes.1 index 992e6ba3c1e9ba..1bc945b85f748b 100644 --- a/deps/npm/man/man1/npm-find-dupes.1 +++ b/deps/npm/man/man1/npm-find-dupes.1 @@ -1,4 +1,4 @@ -.TH "NPM-FIND-DUPES" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-FIND-DUPES" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-find-dupes\fR - Find duplication in the package tree .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-fund.1 b/deps/npm/man/man1/npm-fund.1 index 7145eb2771bd94..59d984c1bc4bc2 100644 --- a/deps/npm/man/man1/npm-fund.1 +++ b/deps/npm/man/man1/npm-fund.1 @@ -1,4 +1,4 @@ -.TH "NPM-FUND" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-FUND" "1" 
"September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-fund\fR - Retrieve funding information .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-help-search.1 b/deps/npm/man/man1/npm-help-search.1 index db61cc6bc1eefd..078f6fd76f06c2 100644 --- a/deps/npm/man/man1/npm-help-search.1 +++ b/deps/npm/man/man1/npm-help-search.1 @@ -1,4 +1,4 @@ -.TH "NPM-HELP-SEARCH" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-HELP-SEARCH" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-help-search\fR - Search npm help documentation .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-help.1 b/deps/npm/man/man1/npm-help.1 index cd995949f9e3ee..9a747c4787f341 100644 --- a/deps/npm/man/man1/npm-help.1 +++ b/deps/npm/man/man1/npm-help.1 @@ -1,4 +1,4 @@ -.TH "NPM-HELP" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-HELP" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-help\fR - Get help on npm .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-init.1 b/deps/npm/man/man1/npm-init.1 index 6df4f462224309..f602fea0c1fa3c 100644 --- a/deps/npm/man/man1/npm-init.1 +++ b/deps/npm/man/man1/npm-init.1 @@ -1,4 +1,4 @@ -.TH "NPM-INIT" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-INIT" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-init\fR - Create a package.json file .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-install-ci-test.1 b/deps/npm/man/man1/npm-install-ci-test.1 index db2ff4fded9e3b..24f091012f57c8 100644 --- a/deps/npm/man/man1/npm-install-ci-test.1 +++ b/deps/npm/man/man1/npm-install-ci-test.1 @@ -1,4 +1,4 @@ -.TH "NPM-INSTALL-CI-TEST" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-INSTALL-CI-TEST" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-install-ci-test\fR - Install a project with a clean slate and run tests .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-install-test.1 b/deps/npm/man/man1/npm-install-test.1 index ae5d7e425ac070..7b369331f22c86 100644 --- a/deps/npm/man/man1/npm-install-test.1 +++ b/deps/npm/man/man1/npm-install-test.1 @@ -1,4 +1,4 @@ -.TH "NPM-INSTALL-TEST" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-INSTALL-TEST" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-install-test\fR - Install package(s) and run tests .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-install.1 b/deps/npm/man/man1/npm-install.1 index f21a7146ba0036..72f8c382fbeda2 100644 --- a/deps/npm/man/man1/npm-install.1 +++ b/deps/npm/man/man1/npm-install.1 @@ -1,4 +1,4 @@ -.TH "NPM-INSTALL" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-INSTALL" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-install\fR - Install a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-link.1 b/deps/npm/man/man1/npm-link.1 index 328ab37d283260..d2250ea1da9bce 100644 --- a/deps/npm/man/man1/npm-link.1 +++ b/deps/npm/man/man1/npm-link.1 @@ -1,4 +1,4 @@ -.TH "NPM-LINK" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-LINK" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-link\fR - Symlink a package folder .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-login.1 b/deps/npm/man/man1/npm-login.1 index fdf7dc536a5238..a599bdcd825422 100644 --- a/deps/npm/man/man1/npm-login.1 +++ b/deps/npm/man/man1/npm-login.1 @@ -1,4 +1,4 @@ -.TH "NPM-LOGIN" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-LOGIN" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-login\fR - Login to a registry user account .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-logout.1 b/deps/npm/man/man1/npm-logout.1 index 1fda83a77a0b03..5704d10ef90e9d 100644 --- a/deps/npm/man/man1/npm-logout.1 +++ b/deps/npm/man/man1/npm-logout.1 @@ -1,4 +1,4 @@ -.TH "NPM-LOGOUT" 
"1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-LOGOUT" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-logout\fR - Log out of the registry .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-ls.1 b/deps/npm/man/man1/npm-ls.1 index f917683046da16..7619aaa6cbd504 100644 --- a/deps/npm/man/man1/npm-ls.1 +++ b/deps/npm/man/man1/npm-ls.1 @@ -1,4 +1,4 @@ -.TH "NPM-LS" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-LS" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-ls\fR - List installed packages .SS "Synopsis" @@ -20,7 +20,7 @@ Positional arguments are \fBname@version-range\fR identifiers, which will limit .P .RS 2 .nf -npm@11.5.1 /path/to/npm +npm@11.6.0 /path/to/npm └─┬ init-package-json@0.0.4 └── promzard@0.1.5 .fi diff --git a/deps/npm/man/man1/npm-org.1 b/deps/npm/man/man1/npm-org.1 index 47eccc70240181..f7459a8d9756bf 100644 --- a/deps/npm/man/man1/npm-org.1 +++ b/deps/npm/man/man1/npm-org.1 @@ -1,4 +1,4 @@ -.TH "NPM-ORG" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-ORG" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-org\fR - Manage orgs .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-outdated.1 b/deps/npm/man/man1/npm-outdated.1 index 44109de6f347ef..609909fb621d36 100644 --- a/deps/npm/man/man1/npm-outdated.1 +++ b/deps/npm/man/man1/npm-outdated.1 @@ -1,4 +1,4 @@ -.TH "NPM-OUTDATED" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-OUTDATED" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-outdated\fR - Check for outdated packages .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-owner.1 b/deps/npm/man/man1/npm-owner.1 index 81ef6cf66d9871..2cdd0b034d2491 100644 --- a/deps/npm/man/man1/npm-owner.1 +++ b/deps/npm/man/man1/npm-owner.1 @@ -1,4 +1,4 @@ -.TH "NPM-OWNER" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-OWNER" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-owner\fR - Manage package owners .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-pack.1 b/deps/npm/man/man1/npm-pack.1 index 8709e59b7d1168..3d80e9cfdcdac7 100644 --- a/deps/npm/man/man1/npm-pack.1 +++ b/deps/npm/man/man1/npm-pack.1 @@ -1,4 +1,4 @@ -.TH "NPM-PACK" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-PACK" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-pack\fR - Create a tarball from a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-ping.1 b/deps/npm/man/man1/npm-ping.1 index d78ca5567945ce..fb6a7403ef1807 100644 --- a/deps/npm/man/man1/npm-ping.1 +++ b/deps/npm/man/man1/npm-ping.1 @@ -1,4 +1,4 @@ -.TH "NPM-PING" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-PING" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-ping\fR - Ping npm registry .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-pkg.1 b/deps/npm/man/man1/npm-pkg.1 index 58ded5febd5d21..b6ecf7f0a62ea8 100644 --- a/deps/npm/man/man1/npm-pkg.1 +++ b/deps/npm/man/man1/npm-pkg.1 @@ -1,4 +1,4 @@ -.TH "NPM-PKG" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-PKG" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-pkg\fR - Manages your package.json .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-prefix.1 b/deps/npm/man/man1/npm-prefix.1 index 7cd525b761868e..dd5d059be7992c 100644 --- a/deps/npm/man/man1/npm-prefix.1 +++ b/deps/npm/man/man1/npm-prefix.1 @@ -1,4 +1,4 @@ -.TH "NPM-PREFIX" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-PREFIX" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-prefix\fR - Display prefix .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-profile.1 b/deps/npm/man/man1/npm-profile.1 index e79de330b16ffd..910f02a0298166 100644 --- a/deps/npm/man/man1/npm-profile.1 +++ b/deps/npm/man/man1/npm-profile.1 @@ -1,4 +1,4 
@@ -.TH "NPM-PROFILE" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-PROFILE" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-profile\fR - Change settings on your registry profile .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-prune.1 b/deps/npm/man/man1/npm-prune.1 index 9c86cc4c67590d..8355e9a60e842d 100644 --- a/deps/npm/man/man1/npm-prune.1 +++ b/deps/npm/man/man1/npm-prune.1 @@ -1,4 +1,4 @@ -.TH "NPM-PRUNE" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-PRUNE" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-prune\fR - Remove extraneous packages .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-publish.1 b/deps/npm/man/man1/npm-publish.1 index 2b4323e35f09cc..d4cdfa376a573a 100644 --- a/deps/npm/man/man1/npm-publish.1 +++ b/deps/npm/man/man1/npm-publish.1 @@ -1,4 +1,4 @@ -.TH "NPM-PUBLISH" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-PUBLISH" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-publish\fR - Publish a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-query.1 b/deps/npm/man/man1/npm-query.1 index e0403860218cdb..4ce7630af04230 100644 --- a/deps/npm/man/man1/npm-query.1 +++ b/deps/npm/man/man1/npm-query.1 @@ -1,4 +1,4 @@ -.TH "NPM-QUERY" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-QUERY" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-query\fR - Dependency selector query .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-rebuild.1 b/deps/npm/man/man1/npm-rebuild.1 index 31d59b66ab3654..6b7865221e0aed 100644 --- a/deps/npm/man/man1/npm-rebuild.1 +++ b/deps/npm/man/man1/npm-rebuild.1 @@ -1,4 +1,4 @@ -.TH "NPM-REBUILD" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-REBUILD" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-rebuild\fR - Rebuild a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-repo.1 b/deps/npm/man/man1/npm-repo.1 index d2276b2038e16d..34a39483bc1429 100644 --- a/deps/npm/man/man1/npm-repo.1 +++ b/deps/npm/man/man1/npm-repo.1 @@ -1,4 +1,4 @@ -.TH "NPM-REPO" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-REPO" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-repo\fR - Open package repository page in the browser .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-restart.1 b/deps/npm/man/man1/npm-restart.1 index c783c1efcd8b32..42a2af54be24e9 100644 --- a/deps/npm/man/man1/npm-restart.1 +++ b/deps/npm/man/man1/npm-restart.1 @@ -1,4 +1,4 @@ -.TH "NPM-RESTART" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-RESTART" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-restart\fR - Restart a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-root.1 b/deps/npm/man/man1/npm-root.1 index 870f1a3ac87ae5..2f7b66913f6ac6 100644 --- a/deps/npm/man/man1/npm-root.1 +++ b/deps/npm/man/man1/npm-root.1 @@ -1,4 +1,4 @@ -.TH "NPM-ROOT" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-ROOT" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-root\fR - Display npm root .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-run.1 b/deps/npm/man/man1/npm-run.1 index 20a2d5396dee92..c74fa22db30182 100644 --- a/deps/npm/man/man1/npm-run.1 +++ b/deps/npm/man/man1/npm-run.1 @@ -1,4 +1,4 @@ -.TH "NPM-RUN" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-RUN" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-run\fR - Run arbitrary package scripts .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-sbom.1 b/deps/npm/man/man1/npm-sbom.1 index b78f930ef67ed8..8f32919d9010d1 100644 --- a/deps/npm/man/man1/npm-sbom.1 +++ b/deps/npm/man/man1/npm-sbom.1 @@ -1,4 +1,4 @@ -.TH "NPM-SBOM" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-SBOM" "1" "September 2025" "NPM@11.6.0" "" .SH 
"NAME" \fBnpm-sbom\fR - Generate a Software Bill of Materials (SBOM) .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-search.1 b/deps/npm/man/man1/npm-search.1 index 71e1eb03da636a..f10adcf2ff14d0 100644 --- a/deps/npm/man/man1/npm-search.1 +++ b/deps/npm/man/man1/npm-search.1 @@ -1,4 +1,4 @@ -.TH "NPM-SEARCH" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-SEARCH" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-search\fR - Search for packages .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-shrinkwrap.1 b/deps/npm/man/man1/npm-shrinkwrap.1 index c3db5d6b2b4fb3..805c49dd282f26 100644 --- a/deps/npm/man/man1/npm-shrinkwrap.1 +++ b/deps/npm/man/man1/npm-shrinkwrap.1 @@ -1,4 +1,4 @@ -.TH "NPM-SHRINKWRAP" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-SHRINKWRAP" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-shrinkwrap\fR - Lock down dependency versions for publication .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-star.1 b/deps/npm/man/man1/npm-star.1 index 039a9d77aee542..4a6c5bdbec1014 100644 --- a/deps/npm/man/man1/npm-star.1 +++ b/deps/npm/man/man1/npm-star.1 @@ -1,4 +1,4 @@ -.TH "NPM-STAR" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-STAR" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-star\fR - Mark your favorite packages .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-stars.1 b/deps/npm/man/man1/npm-stars.1 index 3d01cceeb71486..bf67a21b1f8c06 100644 --- a/deps/npm/man/man1/npm-stars.1 +++ b/deps/npm/man/man1/npm-stars.1 @@ -1,4 +1,4 @@ -.TH "NPM-STARS" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-STARS" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-stars\fR - View packages marked as favorites .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-start.1 b/deps/npm/man/man1/npm-start.1 index b1c8d43f2908be..b2311aa3d04a46 100644 --- a/deps/npm/man/man1/npm-start.1 +++ b/deps/npm/man/man1/npm-start.1 @@ -1,4 +1,4 @@ -.TH "NPM-START" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-START" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-start\fR - Start a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-stop.1 b/deps/npm/man/man1/npm-stop.1 index 0ef270c59023fc..0ca1a2cc7a7e26 100644 --- a/deps/npm/man/man1/npm-stop.1 +++ b/deps/npm/man/man1/npm-stop.1 @@ -1,4 +1,4 @@ -.TH "NPM-STOP" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-STOP" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-stop\fR - Stop a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-team.1 b/deps/npm/man/man1/npm-team.1 index 797e36745ba9f7..c13f764184a99c 100644 --- a/deps/npm/man/man1/npm-team.1 +++ b/deps/npm/man/man1/npm-team.1 @@ -1,4 +1,4 @@ -.TH "NPM-TEAM" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-TEAM" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-team\fR - Manage organization teams and team memberships .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-test.1 b/deps/npm/man/man1/npm-test.1 index 700ff4b7ead51c..6d51b5cd49e47e 100644 --- a/deps/npm/man/man1/npm-test.1 +++ b/deps/npm/man/man1/npm-test.1 @@ -1,4 +1,4 @@ -.TH "NPM-TEST" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-TEST" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-test\fR - Test a package .SS "Synopsis" diff --git a/deps/npm/man/man1/npm-token.1 b/deps/npm/man/man1/npm-token.1 index 8834947b79c86f..c370a5f4130927 100644 --- a/deps/npm/man/man1/npm-token.1 +++ b/deps/npm/man/man1/npm-token.1 @@ -1,4 +1,4 @@ -.TH "NPM-TOKEN" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPM-TOKEN" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-token\fR - Manage your authentication tokens .SS "Synopsis" diff --git 
diff --git a/deps/npm/man/man1/npm-undeprecate.1 b/deps/npm/man/man1/npm-undeprecate.1 index f0be1531efffb6..74d6468caf27d4 100644 --- a/deps/npm/man/man1/npm-undeprecate.1 +++ b/deps/npm/man/man1/npm-undeprecate.1 @@ -1,4 +1,4 @@
-.TH "NPM-UNDEPRECATE" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-UNDEPRECATE" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-undeprecate\fR - Undeprecate a version of a package .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-uninstall.1 b/deps/npm/man/man1/npm-uninstall.1 index 3c6357d135a1ca..e9e58db20892c7 100644 --- a/deps/npm/man/man1/npm-uninstall.1 +++ b/deps/npm/man/man1/npm-uninstall.1 @@ -1,4 +1,4 @@
-.TH "NPM-UNINSTALL" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-UNINSTALL" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-uninstall\fR - Remove a package .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-unpublish.1 b/deps/npm/man/man1/npm-unpublish.1 index cd56006c8caa3d..3fe1b4703c4b57 100644 --- a/deps/npm/man/man1/npm-unpublish.1 +++ b/deps/npm/man/man1/npm-unpublish.1 @@ -1,4 +1,4 @@
-.TH "NPM-UNPUBLISH" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-UNPUBLISH" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-unpublish\fR - Remove a package from the registry .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-unstar.1 b/deps/npm/man/man1/npm-unstar.1 index 7a6273221b19ea..7adbc92198f482 100644 --- a/deps/npm/man/man1/npm-unstar.1 +++ b/deps/npm/man/man1/npm-unstar.1 @@ -1,4 +1,4 @@
-.TH "NPM-UNSTAR" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-UNSTAR" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-unstar\fR - Remove an item from your favorite packages .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-update.1 b/deps/npm/man/man1/npm-update.1 index e54e5c2c19a3ed..2e18852e3b7e33 100644 --- a/deps/npm/man/man1/npm-update.1 +++ b/deps/npm/man/man1/npm-update.1 @@ -1,4 +1,4 @@
-.TH "NPM-UPDATE" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-UPDATE" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-update\fR - Update packages .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-version.1 b/deps/npm/man/man1/npm-version.1 index bc2a20603fcf26..aa6e8ca38048b6 100644 --- a/deps/npm/man/man1/npm-version.1 +++ b/deps/npm/man/man1/npm-version.1 @@ -1,4 +1,4 @@
-.TH "NPM-VERSION" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-VERSION" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-version\fR - Bump a package version .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-view.1 b/deps/npm/man/man1/npm-view.1 index d1e83d5f40c07c..44cb8ccad70b24 100644 --- a/deps/npm/man/man1/npm-view.1 +++ b/deps/npm/man/man1/npm-view.1 @@ -1,4 +1,4 @@
-.TH "NPM-VIEW" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-VIEW" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-view\fR - View registry info .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-whoami.1 b/deps/npm/man/man1/npm-whoami.1 index f748178998a735..a382a63182c1ab 100644 --- a/deps/npm/man/man1/npm-whoami.1 +++ b/deps/npm/man/man1/npm-whoami.1 @@ -1,4 +1,4 @@
-.TH "NPM-WHOAMI" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM-WHOAMI" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm-whoami\fR - Display npm username .SS "Synopsis"
diff --git a/deps/npm/man/man1/npm.1 b/deps/npm/man/man1/npm.1 index 0b3ebfd4ed7cac..cbab59dd3cd0bd 100644 --- a/deps/npm/man/man1/npm.1 +++ b/deps/npm/man/man1/npm.1 @@ -1,4 +1,4 @@
-.TH "NPM" "1" "July 2025" "NPM@11.5.1" ""
+.TH "NPM" "1" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBnpm\fR - javascript package manager .SS "Synopsis"
@@ -12,7 +12,7 @@ npm Note: This command is unaware of workspaces. .SS "Version" .P
-11.5.1
+11.6.0
 .SS "Description" .P npm is the package manager for the Node JavaScript platform. It puts modules in place so that node can find them, and manages dependency conflicts intelligently.
workspaces. .SS "Version" .P -11.5.1 +11.6.0 .SS "Description" .P npm is the package manager for the Node JavaScript platform. It puts modules in place so that node can find them, and manages dependency conflicts intelligently. diff --git a/deps/npm/man/man1/npx.1 b/deps/npm/man/man1/npx.1 index 7b86cacea20021..c87f4a046be75b 100644 --- a/deps/npm/man/man1/npx.1 +++ b/deps/npm/man/man1/npx.1 @@ -1,4 +1,4 @@ -.TH "NPX" "1" "July 2025" "NPM@11.5.1" "" +.TH "NPX" "1" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpx\fR - Run a command from a local or remote npm package .SS "Synopsis" diff --git a/deps/npm/man/man5/folders.5 b/deps/npm/man/man5/folders.5 index 35abf7890468ef..c74c26fc961328 100644 --- a/deps/npm/man/man5/folders.5 +++ b/deps/npm/man/man5/folders.5 @@ -1,4 +1,4 @@ -.TH "FOLDERS" "5" "July 2025" "NPM@11.5.1" "" +.TH "FOLDERS" "5" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBfolders\fR - Folder Structures Used by npm .SS "Description" diff --git a/deps/npm/man/man5/install.5 b/deps/npm/man/man5/install.5 index dcc827ba9615ac..28a3f8995de5f3 100644 --- a/deps/npm/man/man5/install.5 +++ b/deps/npm/man/man5/install.5 @@ -1,4 +1,4 @@ -.TH "INSTALL" "5" "July 2025" "NPM@11.5.1" "" +.TH "INSTALL" "5" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBinstall\fR - Download and install node and npm .SS "Description" diff --git a/deps/npm/man/man5/npm-global.5 b/deps/npm/man/man5/npm-global.5 index 35abf7890468ef..c74c26fc961328 100644 --- a/deps/npm/man/man5/npm-global.5 +++ b/deps/npm/man/man5/npm-global.5 @@ -1,4 +1,4 @@ -.TH "FOLDERS" "5" "July 2025" "NPM@11.5.1" "" +.TH "FOLDERS" "5" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBfolders\fR - Folder Structures Used by npm .SS "Description" diff --git a/deps/npm/man/man5/npm-json.5 b/deps/npm/man/man5/npm-json.5 index 86974c10a31b58..ea073800fcc7a6 100644 --- a/deps/npm/man/man5/npm-json.5 +++ b/deps/npm/man/man5/npm-json.5 @@ -1,4 +1,4 @@ -.TH "PACKAGE.JSON" "5" "July 2025" "NPM@11.5.1" "" +.TH "PACKAGE.JSON" "5" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBpackage.json\fR - Specifics of npm's package.json handling .SS "Description" diff --git a/deps/npm/man/man5/npm-shrinkwrap-json.5 b/deps/npm/man/man5/npm-shrinkwrap-json.5 index b52ced318945d6..0b1f7e858fc13b 100644 --- a/deps/npm/man/man5/npm-shrinkwrap-json.5 +++ b/deps/npm/man/man5/npm-shrinkwrap-json.5 @@ -1,4 +1,4 @@ -.TH "NPM-SHRINKWRAP.JSON" "5" "July 2025" "NPM@11.5.1" "" +.TH "NPM-SHRINKWRAP.JSON" "5" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpm-shrinkwrap.json\fR - A publishable lockfile .SS "Description" diff --git a/deps/npm/man/man5/npmrc.5 b/deps/npm/man/man5/npmrc.5 index bc9f9d48229a66..3a232549d1731e 100644 --- a/deps/npm/man/man5/npmrc.5 +++ b/deps/npm/man/man5/npmrc.5 @@ -1,4 +1,4 @@ -.TH "NPMRC" "5" "July 2025" "NPM@11.5.1" "" +.TH "NPMRC" "5" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBnpmrc\fR - The npm config files .SS "Description" @@ -23,11 +23,12 @@ npm builtin config file (\fB/path/to/npm/npmrc\fR) .RE 0 .P -All npm config files are an ini-formatted list of \fBkey = value\fR parameters. Environment variables can be replaced using \fB${VARIABLE_NAME}\fR. For example: +All npm config files are an ini-formatted list of \fBkey = value\fR parameters. Environment variables can be replaced using \fB${VARIABLE_NAME}\fR. By default if the variable is not defined, it is left unreplaced. By adding \fB?\fR after variable name they can be forced to evaluate to an empty string instead. 
diff --git a/deps/npm/man/man5/package-json.5 b/deps/npm/man/man5/package-json.5 index 86974c10a31b58..ea073800fcc7a6 100644 --- a/deps/npm/man/man5/package-json.5 +++ b/deps/npm/man/man5/package-json.5 @@ -1,4 +1,4 @@
-.TH "PACKAGE.JSON" "5" "July 2025" "NPM@11.5.1" ""
+.TH "PACKAGE.JSON" "5" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBpackage.json\fR - Specifics of npm's package.json handling .SS "Description"
diff --git a/deps/npm/man/man5/package-lock-json.5 b/deps/npm/man/man5/package-lock-json.5 index 0817708a68a5e0..bb4be95f167e50 100644 --- a/deps/npm/man/man5/package-lock-json.5 +++ b/deps/npm/man/man5/package-lock-json.5 @@ -1,4 +1,4 @@
-.TH "PACKAGE-LOCK.JSON" "5" "July 2025" "NPM@11.5.1" ""
+.TH "PACKAGE-LOCK.JSON" "5" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBpackage-lock.json\fR - A manifestation of the manifest .SS "Description"
diff --git a/deps/npm/man/man7/config.7 b/deps/npm/man/man7/config.7 index 233ab61b966aeb..c0452c42dae516 100644 --- a/deps/npm/man/man7/config.7 +++ b/deps/npm/man/man7/config.7 @@ -1,4 +1,4 @@
-.TH "CONFIG" "7" "July 2025" "NPM@11.5.1" ""
+.TH "CONFIG" "7" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBconfig\fR - More than you probably want to know about npm configuration .SS "Description"
diff --git a/deps/npm/man/man7/dependency-selectors.7 b/deps/npm/man/man7/dependency-selectors.7 index b249ea851bceec..a42f051056bd8c 100644 --- a/deps/npm/man/man7/dependency-selectors.7 +++ b/deps/npm/man/man7/dependency-selectors.7 @@ -1,4 +1,4 @@
-.TH "QUERYING" "7" "July 2025" "NPM@11.5.1" ""
+.TH "QUERYING" "7" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBQuerying\fR - Dependency Selector Syntax & Querying .SS "Description"
diff --git a/deps/npm/man/man7/developers.7 b/deps/npm/man/man7/developers.7 index 6961a751b7db9a..6f4e03a4227d91 100644 --- a/deps/npm/man/man7/developers.7 +++ b/deps/npm/man/man7/developers.7 @@ -1,4 +1,4 @@
-.TH "DEVELOPERS" "7" "July 2025" "NPM@11.5.1" ""
+.TH "DEVELOPERS" "7" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBdevelopers\fR - Developer Guide .SS "Description"
diff --git a/deps/npm/man/man7/logging.7 b/deps/npm/man/man7/logging.7 index fa6bcdb2d35597..45d16bd8ef178a 100644 --- a/deps/npm/man/man7/logging.7 +++ b/deps/npm/man/man7/logging.7 @@ -1,4 +1,4 @@
-.TH "LOGGING" "7" "July 2025" "NPM@11.5.1" ""
+.TH "LOGGING" "7" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBLogging\fR - Why, What & How We Log .SS "Description"
diff --git a/deps/npm/man/man7/orgs.7 b/deps/npm/man/man7/orgs.7 index 38ddc1cbe05bf6..3ad5af51c4db2a 100644 --- a/deps/npm/man/man7/orgs.7 +++ b/deps/npm/man/man7/orgs.7 @@ -1,4 +1,4 @@
-.TH "ORGS" "7" "July 2025" "NPM@11.5.1" ""
+.TH "ORGS" "7" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBorgs\fR - Working with Teams & Orgs .SS "Description"
diff --git a/deps/npm/man/man7/package-spec.7 b/deps/npm/man/man7/package-spec.7 index a3bd8733cd8945..aef208443c42ef 100644 --- a/deps/npm/man/man7/package-spec.7 +++ b/deps/npm/man/man7/package-spec.7 @@ -1,4 +1,4 @@
-.TH "PACKAGE-SPEC" "7" "July 2025" "NPM@11.5.1" ""
+.TH "PACKAGE-SPEC" "7" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBpackage-spec\fR - Package name specifier .SS "Description"
diff --git a/deps/npm/man/man7/registry.7 b/deps/npm/man/man7/registry.7 index 13ded7b2a54790..89b8352bdf723d 100644 --- a/deps/npm/man/man7/registry.7 +++ b/deps/npm/man/man7/registry.7 @@ -1,4 +1,4 @@
-.TH "REGISTRY" "7" "July 2025" "NPM@11.5.1" ""
+.TH "REGISTRY" "7" "September 2025" "NPM@11.6.0" ""
 .SH "NAME" \fBregistry\fR - The JavaScript Package Registry .SS "Description"
"NPM@11.5.1" "" +.TH "REGISTRY" "7" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBregistry\fR - The JavaScript Package Registry .SS "Description" diff --git a/deps/npm/man/man7/removal.7 b/deps/npm/man/man7/removal.7 index 2a5ca6b22f0538..cb24155064a63e 100644 --- a/deps/npm/man/man7/removal.7 +++ b/deps/npm/man/man7/removal.7 @@ -1,4 +1,4 @@ -.TH "REMOVAL" "7" "July 2025" "NPM@11.5.1" "" +.TH "REMOVAL" "7" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBremoval\fR - Cleaning the Slate .SS "Synopsis" diff --git a/deps/npm/man/man7/scope.7 b/deps/npm/man/man7/scope.7 index 949a1cc5cea850..8cd35fc929b448 100644 --- a/deps/npm/man/man7/scope.7 +++ b/deps/npm/man/man7/scope.7 @@ -1,4 +1,4 @@ -.TH "SCOPE" "7" "July 2025" "NPM@11.5.1" "" +.TH "SCOPE" "7" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBscope\fR - Scoped packages .SS "Description" diff --git a/deps/npm/man/man7/scripts.7 b/deps/npm/man/man7/scripts.7 index 7f530e4a5d37b5..78678944424b1e 100644 --- a/deps/npm/man/man7/scripts.7 +++ b/deps/npm/man/man7/scripts.7 @@ -1,4 +1,4 @@ -.TH "SCRIPTS" "7" "July 2025" "NPM@11.5.1" "" +.TH "SCRIPTS" "7" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBscripts\fR - How npm handles the "scripts" field .SS "Description" diff --git a/deps/npm/man/man7/workspaces.7 b/deps/npm/man/man7/workspaces.7 index 2c1c874b6d4062..35cf265c1abcde 100644 --- a/deps/npm/man/man7/workspaces.7 +++ b/deps/npm/man/man7/workspaces.7 @@ -1,4 +1,4 @@ -.TH "WORKSPACES" "7" "July 2025" "NPM@11.5.1" "" +.TH "WORKSPACES" "7" "September 2025" "NPM@11.6.0" "" .SH "NAME" \fBworkspaces\fR - Working with workspaces .SS "Description" diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js index 1edd0b643b60d5..281f62b116bd3d 100644 --- a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js +++ b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js @@ -1238,15 +1238,19 @@ This is a one-time fix-up, please be patient... // Check if the target is within the project root isProjectInternalFileSpec = targetPath.startsWith(resolvedProjectRoot + sep) || targetPath === resolvedProjectRoot } + + // When using --install-links, we need to handle transitive file dependencies specially + // If the parent was installed (not linked) due to --install-links, and this is a file: dep, we should also install it rather than link it + const parentWasInstalled = parent && !parent.isLink && parent.resolved?.startsWith('file:') + const isTransitiveFileDep = spec.type === 'directory' && parentWasInstalled && installLinks + // Decide whether to link or copy the dependency - const shouldLink = isWorkspace || isProjectInternalFileSpec || !installLinks + const shouldLink = (isWorkspace || isProjectInternalFileSpec || !installLinks) && !isTransitiveFileDep if (spec.type === 'directory' && shouldLink) { return this.#linkFromSpec(name, spec, parent, edge) } - // if the spec matches a workspace name, then see if the workspace node will - // satisfy the edge. if it does, we return the workspace node to make sure it - // takes priority. + // if the spec matches a workspace name, then see if the workspace node will satisfy the edge. if it does, we return the workspace node to make sure it takes priority. 
     if (isWorkspace) {
       const existingNode = this.idealTree.edgesOut.get(spec.name).to
       if (existingNode && existingNode.isWorkspace && existingNode.satisfies(edge)) {
@@ -1254,6 +1258,15 @@ This is a one-time fix-up, please be patient...
       }
     }
 
+    // For file: dependencies that we're installing (not linking), ensure proper resolution
+    if (isTransitiveFileDep && edge) {
+      // For transitive file deps, resolve relative to the parent's original source location
+      const parentOriginalPath = parent.resolved.slice(5) // Remove 'file:' prefix
+      const relativePath = edge.rawSpec.slice(5) // Remove 'file:' prefix
+      const absolutePath = resolve(parentOriginalPath, relativePath)
+      spec = npa.resolve(name, `file:${absolutePath}`)
+    }
+
     // spec isn't a directory, and either isn't a workspace or the workspace we have
     // doesn't satisfy the edge. try to fetch a manifest and build a node from that.
     return this.#fetchManifest(spec)
@@ -1306,6 +1319,12 @@ This is a one-time fix-up, please be patient...
       .sort(({ name: a }, { name: b }) => localeCompare(a, b))
 
     for (const edge of peerEdges) {
+      // node.parent gets mutated during loop execution due to recursive #nodeFromEdge calls.
+      // When a compatible peer is found (e.g. a@1.1.0 replaces a@1.2.0), the original node loses its parent.
+      // If the node has been detached/removed from the tree and has no parent, there is no need to check its remaining edgesOut.
+      if (!node.parent) {
+        break
+      }
       // already placed this one, and we're happy with it.
       if (edge.valid && edge.to) {
         continue
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js
index 7f3fa461b0667a..5da8e72bfa5672 100644
--- a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js
@@ -885,6 +885,7 @@ module.exports = cls => class Reifier extends cls {
         // Replace the host with the registry host while keeping the path intact
         resolvedURL.hostname = registryURL.hostname
         resolvedURL.port = registryURL.port
+        resolvedURL.protocol = registryURL.protocol
 
         // Make sure we don't double-include the path if it's already there
         const registryPath = registryURL.pathname.replace(/\/$/, '')
diff --git a/deps/npm/node_modules/@npmcli/arborist/package.json b/deps/npm/node_modules/@npmcli/arborist/package.json
index 3f9282e99a55c6..7e98d0e7d75713 100644
--- a/deps/npm/node_modules/@npmcli/arborist/package.json
+++ b/deps/npm/node_modules/@npmcli/arborist/package.json
@@ -1,6 +1,6 @@
 {
   "name": "@npmcli/arborist",
-  "version": "9.1.3",
+  "version": "9.1.4",
   "description": "Manage node_modules trees",
   "dependencies": {
     "@isaacs/string-locale-compare": "^1.1.0",
diff --git a/deps/npm/node_modules/@npmcli/config/lib/env-replace.js b/deps/npm/node_modules/@npmcli/config/lib/env-replace.js
index c851f6e4d15011..c347be480ed688 100644
--- a/deps/npm/node_modules/@npmcli/config/lib/env-replace.js
+++ b/deps/npm/node_modules/@npmcli/config/lib/env-replace.js
@@ -1,9 +1,11 @@
 // replace any ${ENV} values with the appropriate environ.
+// an optional "?" modifier can be used like this: ${ENV?}; if the variable is not defined, it evaluates to an empty string.
 
-const envExpr = /(?<!\\)(\\*)\$\{([^${}]+)\}/g
+const envExpr = /(?<!\\)(\\*)\$\{([^${}]+?)(\?)?\}/g
 
-module.exports = (f, env) => f.replace(envExpr, (orig, esc, name) => {
-  const val = env[name] !== undefined ? env[name] : `$\{${name}}`
+module.exports = (f, env) => f.replace(envExpr, (orig, esc, name, modifier) => {
+  const fallback = modifier === '?' ? '' : `$\{${name}}`
+  const val = env[name] !== undefined ?
env[name] : fallback // consume the escape chars that are relevant. if (esc.length % 2) { diff --git a/deps/npm/node_modules/@npmcli/config/package.json b/deps/npm/node_modules/@npmcli/config/package.json index fc6c9fd10ee7f4..5cb8925d4cf4bc 100644 --- a/deps/npm/node_modules/@npmcli/config/package.json +++ b/deps/npm/node_modules/@npmcli/config/package.json @@ -1,6 +1,6 @@ { "name": "@npmcli/config", - "version": "10.3.1", + "version": "10.4.0", "files": [ "bin/", "lib/" diff --git a/deps/npm/node_modules/libnpmdiff/package.json b/deps/npm/node_modules/libnpmdiff/package.json index c89c809e456da6..87c467b5a9783e 100644 --- a/deps/npm/node_modules/libnpmdiff/package.json +++ b/deps/npm/node_modules/libnpmdiff/package.json @@ -1,6 +1,6 @@ { "name": "libnpmdiff", - "version": "8.0.6", + "version": "8.0.7", "description": "The registry diff", "repository": { "type": "git", @@ -47,7 +47,7 @@ "tap": "^16.3.8" }, "dependencies": { - "@npmcli/arborist": "^9.1.3", + "@npmcli/arborist": "^9.1.4", "@npmcli/installed-package-contents": "^3.0.0", "binary-extensions": "^3.0.0", "diff": "^7.0.0", diff --git a/deps/npm/node_modules/libnpmexec/package.json b/deps/npm/node_modules/libnpmexec/package.json index 49b188d9199129..91fb9eb8e9e3a0 100644 --- a/deps/npm/node_modules/libnpmexec/package.json +++ b/deps/npm/node_modules/libnpmexec/package.json @@ -1,6 +1,6 @@ { "name": "libnpmexec", - "version": "10.1.5", + "version": "10.1.6", "files": [ "bin/", "lib/" @@ -60,7 +60,7 @@ "tap": "^16.3.8" }, "dependencies": { - "@npmcli/arborist": "^9.1.3", + "@npmcli/arborist": "^9.1.4", "@npmcli/package-json": "^6.1.1", "@npmcli/run-script": "^9.0.1", "ci-info": "^4.0.0", diff --git a/deps/npm/node_modules/libnpmfund/package.json b/deps/npm/node_modules/libnpmfund/package.json index d888665298a9aa..10c769275c4996 100644 --- a/deps/npm/node_modules/libnpmfund/package.json +++ b/deps/npm/node_modules/libnpmfund/package.json @@ -1,6 +1,6 @@ { "name": "libnpmfund", - "version": "7.0.6", + "version": "7.0.7", "main": "lib/index.js", "files": [ "bin/", @@ -46,7 +46,7 @@ "tap": "^16.3.8" }, "dependencies": { - "@npmcli/arborist": "^9.1.3" + "@npmcli/arborist": "^9.1.4" }, "engines": { "node": "^20.17.0 || >=22.9.0" diff --git a/deps/npm/node_modules/libnpmpack/package.json b/deps/npm/node_modules/libnpmpack/package.json index 1aa091fbb5d6bb..a48d3d983707e3 100644 --- a/deps/npm/node_modules/libnpmpack/package.json +++ b/deps/npm/node_modules/libnpmpack/package.json @@ -1,6 +1,6 @@ { "name": "libnpmpack", - "version": "9.0.6", + "version": "9.0.7", "description": "Programmatic API for the bits behind npm pack", "author": "GitHub Inc.", "main": "lib/index.js", @@ -37,7 +37,7 @@ "bugs": "https://github.com/npm/libnpmpack/issues", "homepage": "https://npmjs.com/package/libnpmpack", "dependencies": { - "@npmcli/arborist": "^9.1.3", + "@npmcli/arborist": "^9.1.4", "@npmcli/run-script": "^9.0.1", "npm-package-arg": "^12.0.0", "pacote": "^21.0.0" diff --git a/deps/npm/package.json b/deps/npm/package.json index ad5800a1058865..76ebe1ab9c6c72 100644 --- a/deps/npm/package.json +++ b/deps/npm/package.json @@ -1,5 +1,5 @@ { - "version": "11.5.1", + "version": "11.6.0", "name": "npm", "description": "a package manager for JavaScript", "workspaces": [ @@ -52,8 +52,8 @@ }, "dependencies": { "@isaacs/string-locale-compare": "^1.1.0", - "@npmcli/arborist": "^9.1.3", - "@npmcli/config": "^10.3.1", + "@npmcli/arborist": "^9.1.4", + "@npmcli/config": "^10.4.0", "@npmcli/fs": "^4.0.0", "@npmcli/map-workspaces": "^4.0.2", 
"@npmcli/package-json": "^6.2.0", @@ -77,11 +77,11 @@ "is-cidr": "^5.1.1", "json-parse-even-better-errors": "^4.0.0", "libnpmaccess": "^10.0.1", - "libnpmdiff": "^8.0.6", - "libnpmexec": "^10.1.5", - "libnpmfund": "^7.0.6", + "libnpmdiff": "^8.0.7", + "libnpmexec": "^10.1.6", + "libnpmfund": "^7.0.7", "libnpmorg": "^8.0.0", - "libnpmpack": "^9.0.6", + "libnpmpack": "^9.0.7", "libnpmpublish": "^11.1.0", "libnpmsearch": "^9.0.0", "libnpmteam": "^8.0.1", diff --git a/deps/npm/test/fixtures/mock-oidc.js b/deps/npm/test/fixtures/mock-oidc.js index 0d1726a2f91cd8..3af720670b9470 100644 --- a/deps/npm/test/fixtures/mock-oidc.js +++ b/deps/npm/test/fixtures/mock-oidc.js @@ -39,10 +39,11 @@ const mockOidc = async (t, { config = {}, packageJson = {}, load = {}, - mockGithubOidcOptions = null, - mockOidcTokenExchangeOptions = null, + mockGithubOidcOptions = false, + mockOidcTokenExchangeOptions = false, publishOptions = {}, provenance = false, + oidcVisibilityOptions = false, }) => { const github = oidcOptions.github ?? false const gitlab = oidcOptions.gitlab ?? false @@ -113,9 +114,17 @@ const mockOidc = async (t, { }) } + if (oidcVisibilityOptions) { + registry.getVisibility({ spec: packageName, visibility: oidcVisibilityOptions }) + } + registry.publish(packageName, publishOptions) - if ((github || gitlab) && provenance) { + /** + * this will nock / mock all the successful requirements for provenance and + * assumes when a test has "provenance true" that these calls are expected + */ + if (provenance) { registry.getVisibility({ spec: packageName, visibility: { public: true } }) mockProvenance(t, { oidcURL: ACTIONS_ID_TOKEN_REQUEST_URL, diff --git a/deps/npm/test/lib/cli/exit-handler.js b/deps/npm/test/lib/cli/exit-handler.js index f8b112beab0a2c..484704c7352790 100644 --- a/deps/npm/test/lib/cli/exit-handler.js +++ b/deps/npm/test/lib/cli/exit-handler.js @@ -4,7 +4,7 @@ const EventEmitter = require('node:events') const os = require('node:os') const t = require('tap') const fsMiniPass = require('fs-minipass') -const { output, time, log } = require('proc-log') +const { output, time } = require('proc-log') const errorMessage = require('../../../lib/utils/error-message.js') const ExecCommand = require('../../../lib/commands/exec.js') const { load: loadMockNpm } = require('../../fixtures/mock-npm') @@ -707,136 +707,3 @@ t.test('do no fancy handling for shellouts', async t => { }) }) }) - -t.test('container scenarios that trigger exit handler bug', async t => { - t.test('process.exit() called before exit handler cleanup', async (t) => { - // Simulates when npm process exits directly without going through proper cleanup - - let exitHandlerNeverCalledLogged = false - let npmBugReportLogged = false - - await mockExitHandler(t, { - config: { loglevel: 'notice' }, - }) - - // Override log.error to capture the specific error messages - const originalLogError = log.error - log.error = (prefix, msg) => { - if (msg === 'Exit handler never called!') { - exitHandlerNeverCalledLogged = true - } - if (msg === 'This is an error with npm itself. Please report this error at:') { - npmBugReportLogged = true - } - return originalLogError(prefix, msg) - } - - t.teardown(() => { - log.error = originalLogError - }) - - // This happens when containers are stopped/killed before npm can clean up properly - process.emit('exit', 1) - - // Verify the bug is detected and logged correctly - t.equal(exitHandlerNeverCalledLogged, true, 'should log "Exit handler never called!" 
error') - t.equal(npmBugReportLogged, true, 'should log npm bug report message') - }) - - t.test('SIGTERM signal is handled properly', (t) => { - // This test verifies that our fix handles SIGTERM signals - - const ExitHandler = tmock(t, '{LIB}/cli/exit-handler.js') - const exitHandler = new ExitHandler({ process }) - - const initialSigtermCount = process.listeners('SIGTERM').length - const initialSigintCount = process.listeners('SIGINT').length - const initialSighupCount = process.listeners('SIGHUP').length - - // Register signal handlers - exitHandler.registerUncaughtHandlers() - - const finalSigtermCount = process.listeners('SIGTERM').length - const finalSigintCount = process.listeners('SIGINT').length - const finalSighupCount = process.listeners('SIGHUP').length - - // Verify the fix: signal handlers should be registered - t.ok(finalSigtermCount > initialSigtermCount, 'SIGTERM handler should be registered') - t.ok(finalSigintCount > initialSigintCount, 'SIGINT handler should be registered') - t.ok(finalSighupCount > initialSighupCount, 'SIGHUP handler should be registered') - - // Clean up listeners to avoid affecting other tests - const sigtermListeners = process.listeners('SIGTERM') - const sigintListeners = process.listeners('SIGINT') - const sighupListeners = process.listeners('SIGHUP') - - for (const listener of sigtermListeners) { - process.removeListener('SIGTERM', listener) - } - for (const listener of sigintListeners) { - process.removeListener('SIGINT', listener) - } - for (const listener of sighupListeners) { - process.removeListener('SIGHUP', listener) - } - - t.end() - }) - - t.test('signal handler execution', async (t) => { - const ExitHandler = tmock(t, '{LIB}/cli/exit-handler.js') - const exitHandler = new ExitHandler({ process }) - - // Register signal handlers - exitHandler.registerUncaughtHandlers() - - process.emit('SIGTERM') - process.emit('SIGINT') - process.emit('SIGHUP') - - // Clean up listeners - process.removeAllListeners('SIGTERM') - process.removeAllListeners('SIGINT') - process.removeAllListeners('SIGHUP') - - t.pass('signal handlers executed successfully') - t.end() - }) - - t.test('hanging async operation interrupted by signal', async (t) => { - // This test simulates the scenario where npm hangs on a long operation and receives SIGTERM/SIGKILL before it can complete - - let exitHandlerNeverCalledLogged = false - - const { exitHandler } = await mockExitHandler(t, { - config: { loglevel: 'notice' }, - }) - - // Override log.error to detect the bug message - const originalLogError = log.error - log.error = (prefix, msg) => { - if (msg === 'Exit handler never called!') { - exitHandlerNeverCalledLogged = true - } - return originalLogError(prefix, msg) - } - - t.teardown(() => { - log.error = originalLogError - }) - - // Track if exit handler was called properly - let exitHandlerCalled = false - exitHandler.exit = () => { - exitHandlerCalled = true - } - - // Simulate sending signal to the process without proper cleanup - // This mimics what happens when a container is terminated - process.emit('exit', 1) - - // Verify the bug conditions - t.equal(exitHandlerCalled, false, 'exit handler should not be called in this scenario') - t.equal(exitHandlerNeverCalledLogged, true, 'should detect and log the exit handler bug') - }) -}) diff --git a/deps/npm/test/lib/commands/publish.js b/deps/npm/test/lib/commands/publish.js index f228bfaa599140..b06655d346026e 100644 --- a/deps/npm/test/lib/commands/publish.js +++ b/deps/npm/test/lib/commands/publish.js @@ -1317,6 
+1317,7 @@ t.test('oidc token exchange - no provenance', t => { }) t.test('oidc token exchange - provenance', (t) => { + const githubPrivateIdToken = githubIdToken({ visibility: 'private' }) const githubPublicIdToken = githubIdToken({ visibility: 'public' }) const gitlabPublicIdToken = gitlabIdToken({ visibility: 'public' }) const SIGSTORE_ID_TOKEN = sigstoreIdToken() @@ -1340,6 +1341,7 @@ t.test('oidc token exchange - provenance', (t) => { token: 'exchange-token', }, provenance: true, + oidcVisibilityOptions: { public: true }, })) t.test('default registry success gitlab', oidcPublishTest({ @@ -1357,6 +1359,7 @@ t.test('oidc token exchange - provenance', (t) => { token: 'exchange-token', }, provenance: true, + oidcVisibilityOptions: { public: true }, })) t.test('default registry success gitlab without SIGSTORE_ID_TOKEN', oidcPublishTest({ @@ -1376,6 +1379,10 @@ t.test('oidc token exchange - provenance', (t) => { provenance: false, })) + /** + * when the user sets provenance to true or false + * the OIDC flow should not concern itself with provenance at all + */ t.test('setting provenance true in config should enable provenance', oidcPublishTest({ oidcOptions: { github: true }, config: { @@ -1475,5 +1482,95 @@ t.test('oidc token exchange - provenance', (t) => { provenance: false, })) + t.test('attempt to publish a private package with OIDC provenance should be false', oidcPublishTest({ + oidcOptions: { github: true }, + config: { + '//registry.npmjs.org/:_authToken': 'existing-fallback-token', + }, + mockGithubOidcOptions: { + audience: 'npm:registry.npmjs.org', + idToken: githubPublicIdToken, + }, + mockOidcTokenExchangeOptions: { + idToken: githubPublicIdToken, + body: { + token: 'exchange-token', + }, + }, + publishOptions: { + token: 'exchange-token', + }, + provenance: false, + oidcVisibilityOptions: { public: false }, + })) + + /** this call shows that if the repo is private, the visibility check will not be called */ + t.test('attempt to publish a private repository with OIDC provenance should be false', oidcPublishTest({ + oidcOptions: { github: true }, + config: { + '//registry.npmjs.org/:_authToken': 'existing-fallback-token', + }, + mockGithubOidcOptions: { + audience: 'npm:registry.npmjs.org', + idToken: githubPrivateIdToken, + }, + mockOidcTokenExchangeOptions: { + idToken: githubPrivateIdToken, + body: { + token: 'exchange-token', + }, + }, + publishOptions: { + token: 'exchange-token', + }, + provenance: false, + })) + + const provenanceFailures = [[ + new Error('Valid error'), + 'verbose oidc Failed to set provenance with message: Valid error', + ], [ + 'Valid error', + 'verbose oidc Failed to set provenance with message: Unknown error', + ]] + + provenanceFailures.forEach(([error, logMessage], index) => { + t.test(`provenance visibility check failure, coverage for try-catch ${index}`, async t => { + const { npm, logs, joinedOutput } = await mockOidc(t, { + load: { + mocks: { + libnpmaccess: { + getVisibility: () => { + throw error + }, + }, + }, + }, + oidcOptions: { github: true }, + config: { + '//registry.npmjs.org/:_authToken': 'existing-fallback-token', + }, + mockGithubOidcOptions: { + audience: 'npm:registry.npmjs.org', + idToken: githubPublicIdToken, + }, + mockOidcTokenExchangeOptions: { + idToken: githubPublicIdToken, + body: { + token: 'exchange-token', + }, + }, + publishOptions: { + token: 'exchange-token', + }, + provenance: false, + }) + + await npm.exec('publish', []) + t.match(joinedOutput(), '+ @npmcli/test-package@1.0.0') + t.ok(logs.includes(logMessage)) 
+ }) + }) + t.end() }) diff --git a/deps/npm/test/lib/utils/display.js b/deps/npm/test/lib/utils/display.js index 78bffa0221d03a..26f52b17a85283 100644 --- a/deps/npm/test/lib/utils/display.js +++ b/deps/npm/test/lib/utils/display.js @@ -37,7 +37,9 @@ t.test('can log cleanly', async (t) => { const { log, logs } = await mockDisplay(t) log.error('', 'test\x00message') + log.info('', 'fetch DELETE 200 https://registry.npmjs.org/-/user/token/npm_000000000000000000000000000000000000 477ms') t.match(logs.error, ['test^@message']) + t.match(logs.info, ['fetch DELETE 200 https://registry.npmjs.org/-/user/token/npm_*** 477ms']) }) t.test('can handle special eresolves', async (t) => { From 6313706c695a6fd2061187f85c55858e8c2a9beb Mon Sep 17 00:00:00 2001 From: "Node.js GitHub Bot" Date: Sat, 6 Sep 2025 11:43:17 +0100 Subject: [PATCH 064/103] test: update WPT for urlpattern to cff1ac1123 PR-URL: https://github.com/nodejs/node/pull/59602 Reviewed-By: Moshe Atlow Reviewed-By: Luigi Pinca --- test/fixtures/wpt/README.md | 2 +- .../resources/urlpatterntestdata.json | 42 +++++++++++++++++++ test/fixtures/wpt/versions.json | 2 +- 3 files changed, 44 insertions(+), 2 deletions(-) diff --git a/test/fixtures/wpt/README.md b/test/fixtures/wpt/README.md index 4044b555c7a0d5..5a6ab4e5522197 100644 --- a/test/fixtures/wpt/README.md +++ b/test/fixtures/wpt/README.md @@ -29,7 +29,7 @@ Last update: - resources: https://github.com/web-platform-tests/wpt/tree/1d2c5fb36a/resources - streams: https://github.com/web-platform-tests/wpt/tree/bc9dcbbf1a/streams - url: https://github.com/web-platform-tests/wpt/tree/9504a83e01/url -- urlpattern: https://github.com/web-platform-tests/wpt/tree/84b75f0880/urlpattern +- urlpattern: https://github.com/web-platform-tests/wpt/tree/cff1ac1123/urlpattern - user-timing: https://github.com/web-platform-tests/wpt/tree/5ae85bf826/user-timing - wasm/jsapi: https://github.com/web-platform-tests/wpt/tree/cde25e7e3c/wasm/jsapi - wasm/webapi: https://github.com/web-platform-tests/wpt/tree/fd1b23eeaa/wasm/webapi diff --git a/test/fixtures/wpt/urlpattern/resources/urlpatterntestdata.json b/test/fixtures/wpt/urlpattern/resources/urlpatterntestdata.json index a613b6a74b5a90..363ec15f2b2787 100644 --- a/test/fixtures/wpt/urlpattern/resources/urlpatterntestdata.json +++ b/test/fixtures/wpt/urlpattern/resources/urlpatterntestdata.json @@ -2991,5 +2991,47 @@ "pattern": [{ "pathname": "/([\\d&&[0-1]])" }], "inputs": [{ "pathname": "/3" }], "expected_match": null + }, + { + "pattern": [{ "protocol": "http", "hostname": "example.com/ignoredpath" }], + "inputs": ["http://example.com/"], + "expected_obj": { + "protocol": "http", + "hostname": "example.com", + "pathname": "*" + }, + "expected_match": { + "protocol": { "input": "http", "groups": {} }, + "hostname": { "input": "example.com", "groups": {} }, + "pathname": { "input": "/", "groups": { "0": "/" } } + } + }, + { + "pattern": [{ "protocol": "http", "hostname": "example.com\\?ignoredsearch" }], + "inputs": ["http://example.com/"], + "expected_obj": { + "protocol": "http", + "hostname": "example.com", + "search": "*" + }, + "expected_match": { + "protocol": { "input": "http", "groups": {} }, + "hostname": { "input": "example.com", "groups": {} }, + "pathname": { "input": "/", "groups": { "0": "/" } } + } + }, + { + "pattern": [{ "protocol": "http", "hostname": "example.com#ignoredhash" }], + "inputs": ["http://example.com/"], + "expected_obj": { + "protocol": "http", + "hostname": "example.com", + "hash": "*" + }, + "expected_match": { + "protocol": 
{ "input": "http", "groups": {} }, + "hostname": { "input": "example.com", "groups": {} }, + "pathname": { "input": "/", "groups": { "0": "/" } } + } } ] diff --git a/test/fixtures/wpt/versions.json b/test/fixtures/wpt/versions.json index 521ec897f9a7be..7fa0e4572b7703 100644 --- a/test/fixtures/wpt/versions.json +++ b/test/fixtures/wpt/versions.json @@ -76,7 +76,7 @@ "path": "url" }, "urlpattern": { - "commit": "84b75f08801c8cfe407c25614c49132075b8afab", + "commit": "cff1ac112375b8089fe9f6bf434f602cf7ef98be", "path": "urlpattern" }, "user-timing": { From f12c1ad9614ef9d375a6c779f360dc25c4ed7e52 Mon Sep 17 00:00:00 2001 From: "Node.js GitHub Bot" Date: Sat, 6 Sep 2025 13:30:23 +0100 Subject: [PATCH 065/103] deps: update googletest to eb2d85e MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/59335 Reviewed-By: Luigi Pinca Reviewed-By: Rafael Gonzaga Reviewed-By: Michaël Zasso --- deps/googletest/include/gtest/gtest-typed-test.h | 8 ++++---- deps/googletest/include/gtest/gtest.h | 5 ++++- 2 files changed, 8 insertions(+), 5 deletions(-) diff --git a/deps/googletest/include/gtest/gtest-typed-test.h b/deps/googletest/include/gtest/gtest-typed-test.h index 442e00bd3478a5..ae24f94915d004 100644 --- a/deps/googletest/include/gtest/gtest-typed-test.h +++ b/deps/googletest/include/gtest/gtest-typed-test.h @@ -48,15 +48,15 @@ template class FooTest : public testing::Test { public: ... - typedef std::list List; + using List = ::std::list; static T shared_; T value_; }; // Next, associate a list of types with the test suite, which will be -// repeated for each type in the list. The typedef is necessary for +// repeated for each type in the list. The using-declaration is necessary for // the macro to parse correctly. -typedef testing::Types MyTypes; +using MyTypes = ::testing::Types; TYPED_TEST_SUITE(FooTest, MyTypes); // If the type list contains only one type, you can write that type @@ -157,7 +157,7 @@ REGISTER_TYPED_TEST_SUITE_P(FooTest, // argument to the INSTANTIATE_* macro is a prefix that will be added // to the actual test suite name. Remember to pick unique prefixes for // different instances. -typedef testing::Types MyTypes; +using MyTypes = ::testing::Types; INSTANTIATE_TYPED_TEST_SUITE_P(My, FooTest, MyTypes); // If the type list contains only one type, you can write that type diff --git a/deps/googletest/include/gtest/gtest.h b/deps/googletest/include/gtest/gtest.h index cbe680c1adb266..69994ee9dc3a36 100644 --- a/deps/googletest/include/gtest/gtest.h +++ b/deps/googletest/include/gtest/gtest.h @@ -1610,6 +1610,8 @@ GTEST_API_ AssertionResult DoubleNearPredFormat(const char* expr1, double val1, double val2, double abs_error); +using GoogleTest_NotSupported_OnFunctionReturningNonVoid = void; + // INTERNAL IMPLEMENTATION - DO NOT USE IN USER CODE. // A class that enables one to stream messages to assertion macros class GTEST_API_ AssertHelper { @@ -1621,7 +1623,8 @@ class GTEST_API_ AssertHelper { // Message assignment is a semantic trick to enable assertion // streaming; see the GTEST_MESSAGE_ macro below. 
-  void operator=(const Message& message) const;
+  GoogleTest_NotSupported_OnFunctionReturningNonVoid operator=(
+      const Message& message) const;

  private:
   // We put our data in a struct so that the size of the AssertHelper class can
From 3e4b1e732c69c9395674ba46aac7e03623b1b68b Mon Sep 17 00:00:00 2001
From: Filip Skokan
Date: Sun, 7 Sep 2025 00:43:15 +0200
Subject: [PATCH 066/103] crypto: add KMAC Web Cryptography algorithms

PR-URL: https://github.com/nodejs/node/pull/59647
Reviewed-By: Anna Henningsen
Reviewed-By: James M Snell
---
 deps/ncrypto/ncrypto.cc                       |  90 +++++++
 deps/ncrypto/ncrypto.h                        |  52 +++++
 doc/api/webcrypto.md                          | 163 ++++++++++++-
 lib/internal/crypto/keys.js                   |   6 +-
 lib/internal/crypto/mac.js                    |  97 ++++++--
 lib/internal/crypto/util.js                   |  24 ++
 lib/internal/crypto/webcrypto.js              |  40 +++-
 lib/internal/crypto/webidl.js                 | 104 +++++----
 node.gyp                                      |   2 +
 src/crypto/crypto_kmac.cc                     | 220 ++++++++++++++++++
 src/crypto/crypto_kmac.h                      |  79 +++++++
 src/node_crypto.cc                            |   5 +-
 src/node_crypto.h                             |   1 +
 test/fixtures/crypto/kmac.js                  | 120 ++++++++++
 .../webcrypto/supports-modern-algorithms.mjs  |  25 ++
 .../test-crypto-key-objects-to-crypto-key.js  |  37 ++-
 test/parallel/test-webcrypto-derivekey.js     |  23 +-
 test/parallel/test-webcrypto-export-import.js | 112 ++++++++-
 test/parallel/test-webcrypto-keygen-kmac.js   |  50 ++++
 test/parallel/test-webcrypto-keygen.js        |  10 +
 .../test-webcrypto-sign-verify-kmac.js        | 193 +++++++++++++++
 test/parallel/test-webcrypto-sign-verify.js   |  24 ++
 tools/doc/type-parser.mjs                     |   4 +
 23 files changed, 1402 insertions(+), 79 deletions(-)
 create mode 100644 src/crypto/crypto_kmac.cc
 create mode 100644 src/crypto/crypto_kmac.h
 create mode 100644 test/fixtures/crypto/kmac.js
 create mode 100644 test/parallel/test-webcrypto-keygen-kmac.js
 create mode 100644 test/parallel/test-webcrypto-sign-verify-kmac.js

diff --git a/deps/ncrypto/ncrypto.cc b/deps/ncrypto/ncrypto.cc
index d7a26edd4cfe49..c94f0725bd3d05 100644
--- a/deps/ncrypto/ncrypto.cc
+++ b/deps/ncrypto/ncrypto.cc
@@ -4413,6 +4413,96 @@ HMACCtxPointer HMACCtxPointer::New() {
   return HMACCtxPointer(HMAC_CTX_new());
 }
 
+#if OPENSSL_VERSION_MAJOR >= 3
+EVPMacPointer::EVPMacPointer(EVP_MAC* mac) : mac_(mac) {}
+
+EVPMacPointer::EVPMacPointer(EVPMacPointer&& other) noexcept
+    : mac_(std::move(other.mac_)) {}
+
+EVPMacPointer& EVPMacPointer::operator=(EVPMacPointer&& other) noexcept {
+  if (this == &other) return *this;
+  mac_ = std::move(other.mac_);
+  return *this;
+}
+
+EVPMacPointer::~EVPMacPointer() {
+  mac_.reset();
+}
+
+void EVPMacPointer::reset(EVP_MAC* mac) {
+  mac_.reset(mac);
+}
+
+EVP_MAC* EVPMacPointer::release() {
+  return mac_.release();
+}
+
+EVPMacPointer EVPMacPointer::Fetch(const char* algorithm) {
+  return EVPMacPointer(EVP_MAC_fetch(nullptr, algorithm, nullptr));
+}
+
+EVPMacCtxPointer::EVPMacCtxPointer(EVP_MAC_CTX* ctx) : ctx_(ctx) {}
+
+EVPMacCtxPointer::EVPMacCtxPointer(EVPMacCtxPointer&& other) noexcept
+    : ctx_(std::move(other.ctx_)) {}
+
+EVPMacCtxPointer& EVPMacCtxPointer::operator=(
+    EVPMacCtxPointer&& other) noexcept {
+  if (this == &other) return *this;
+  ctx_ = std::move(other.ctx_);
+  return *this;
+}
+
+EVPMacCtxPointer::~EVPMacCtxPointer() {
+  ctx_.reset();
+}
+
+void EVPMacCtxPointer::reset(EVP_MAC_CTX* ctx) {
+  ctx_.reset(ctx);
+}
+
+EVP_MAC_CTX* EVPMacCtxPointer::release() {
+  return ctx_.release();
+}
+
+bool EVPMacCtxPointer::init(const Buffer<const void>& key,
+                            const OSSL_PARAM* params) {
+  if (!ctx_) return false;
+  return EVP_MAC_init(ctx_.get(),
+                      static_cast<const unsigned char*>(key.data),
+                      key.len,
+                      params) == 1;
+}
+
+bool EVPMacCtxPointer::update(const Buffer<const void>& data) {
+  if (!ctx_) return false;
+  return EVP_MAC_update(ctx_.get(),
+                        static_cast<const unsigned char*>(data.data),
+                        data.len) == 1;
+}
+
+DataPointer EVPMacCtxPointer::final(size_t length) {
+  if (!ctx_) return {};
+  auto buf = DataPointer::Alloc(length);
+  if (!buf) return {};
+
+  size_t result_len = length;
+  if (EVP_MAC_final(ctx_.get(),
+                    static_cast<unsigned char*>(buf.get()),
+                    &result_len,
+                    length) != 1) {
+    return {};
+  }
+
+  return buf;
+}
+
+EVPMacCtxPointer EVPMacCtxPointer::New(EVP_MAC* mac) {
+  if (!mac) return EVPMacCtxPointer();
+  return EVPMacCtxPointer(EVP_MAC_CTX_new(mac));
+}
+#endif  // OPENSSL_VERSION_MAJOR >= 3
+
 DataPointer hashDigest(const Buffer<const unsigned char>& buf,
                        const EVP_MD* md) {
   if (md == nullptr) return {};
diff --git a/deps/ncrypto/ncrypto.h b/deps/ncrypto/ncrypto.h
index 6c62aa28a50e5a..14f5c1170a81e5 100644
--- a/deps/ncrypto/ncrypto.h
+++ b/deps/ncrypto/ncrypto.h
@@ -229,6 +229,8 @@ class DataPointer;
 class DHPointer;
 class ECKeyPointer;
 class EVPKeyPointer;
+class EVPMacCtxPointer;
+class EVPMacPointer;
 class EVPMDCtxPointer;
 class SSLCtxPointer;
 class SSLPointer;
@@ -1451,6 +1453,56 @@ class HMACCtxPointer final {
   DeleteFnPtr<HMAC_CTX, HMAC_CTX_free> ctx_;
 };
 
+#if OPENSSL_VERSION_MAJOR >= 3
+class EVPMacPointer final {
+ public:
+  EVPMacPointer() = default;
+  explicit EVPMacPointer(EVP_MAC* mac);
+  EVPMacPointer(EVPMacPointer&& other) noexcept;
+  EVPMacPointer& operator=(EVPMacPointer&& other) noexcept;
+  NCRYPTO_DISALLOW_COPY(EVPMacPointer)
+  ~EVPMacPointer();
+
+  inline bool operator==(std::nullptr_t) noexcept { return mac_ == nullptr; }
+  inline operator bool() const { return mac_ != nullptr; }
+  inline EVP_MAC* get() const { return mac_.get(); }
+  inline operator EVP_MAC*() const { return mac_.get(); }
+  void reset(EVP_MAC* mac = nullptr);
+  EVP_MAC* release();
+
+  static EVPMacPointer Fetch(const char* algorithm);
+
+ private:
+  DeleteFnPtr<EVP_MAC, EVP_MAC_free> mac_;
+};
+
+class EVPMacCtxPointer final {
+ public:
+  EVPMacCtxPointer() = default;
+  explicit EVPMacCtxPointer(EVP_MAC_CTX* ctx);
+  EVPMacCtxPointer(EVPMacCtxPointer&& other) noexcept;
+  EVPMacCtxPointer& operator=(EVPMacCtxPointer&& other) noexcept;
+  NCRYPTO_DISALLOW_COPY(EVPMacCtxPointer)
+  ~EVPMacCtxPointer();
+
+  inline bool operator==(std::nullptr_t) noexcept { return ctx_ == nullptr; }
+  inline operator bool() const { return ctx_ != nullptr; }
+  inline EVP_MAC_CTX* get() const { return ctx_.get(); }
+  inline operator EVP_MAC_CTX*() const { return ctx_.get(); }
+  void reset(EVP_MAC_CTX* ctx = nullptr);
+  EVP_MAC_CTX* release();
+
+  bool init(const Buffer<const void>& key, const OSSL_PARAM* params = nullptr);
+  bool update(const Buffer<const void>& data);
+  DataPointer final(size_t length);
+
+  static EVPMacCtxPointer New(EVP_MAC* mac);
+
+ private:
+  DeleteFnPtr<EVP_MAC_CTX, EVP_MAC_CTX_free> ctx_;
+};
+#endif  // OPENSSL_VERSION_MAJOR >= 3
+
 #ifndef OPENSSL_NO_ENGINE
 class EnginePointer final {
  public:
diff --git a/doc/api/webcrypto.md b/doc/api/webcrypto.md
index 5e24f9da78dcfa..e4bc6f5f30daf8 100644
--- a/doc/api/webcrypto.md
+++ b/doc/api/webcrypto.md
@@ -2,6 +2,9 @@
-* Type: {KeyAlgorithm|RsaHashedKeyAlgorithm|EcKeyAlgorithm|AesKeyAlgorithm|HmacKeyAlgorithm}
+* Type: {KeyAlgorithm|RsaHashedKeyAlgorithm|EcKeyAlgorithm|AesKeyAlgorithm|HmacKeyAlgorithm|KmacKeyAlgorithm}
@@ -735,6 +744,8 @@ Valid key usages depend on the key algorithm (identified by
 | `'Ed448'`[^secure-curves]    | | ✔ | | | |
 | `'HDKF'`                     | | | ✔ | | |
 | `'HMAC'`                     | | ✔ | | | |
+| `'KMAC128'`[^modern-algos]   | | ✔ | | | |
+| `'KMAC256'`[^modern-algos]   | | ✔ | | | |
 | `'ML-DSA-44'`[^modern-algos] | | ✔ | | | |
`'ML-DSA-65'`[^modern-algos] | | ✔ | | | | | `'ML-DSA-87'`[^modern-algos] | | ✔ | | | | @@ -830,7 +841,7 @@ added: v24.7.0 * `decapsulationAlgorithm` {string|Algorithm} * `decapsulationKey` {CryptoKey} * `ciphertext` {ArrayBuffer|TypedArray|DataView|Buffer} -* `sharedKeyAlgorithm` {string|Algorithm|HmacImportParams|AesDerivedKeyParams} +* `sharedKeyAlgorithm` {string|Algorithm|HmacImportParams|AesDerivedKeyParams|KmacImportParams} * `extractable` {boolean} * `usages` {string\[]} See [Key usages][]. * Returns: {Promise} Fulfills with {CryptoKey} upon success. @@ -945,7 +956,7 @@ changes: * `algorithm` {EcdhKeyDeriveParams|HkdfParams|Pbkdf2Params|Argon2Params} * `baseKey` {CryptoKey} -* `derivedKeyAlgorithm` {string|Algorithm|HmacImportParams|AesDerivedKeyParams} +* `derivedKeyAlgorithm` {string|Algorithm|HmacImportParams|AesDerivedKeyParams|KmacImportParams} * `extractable` {boolean} * `keyUsages` {string\[]} See [Key usages][]. * Returns: {Promise} Fulfills with a {CryptoKey} upon success. @@ -1036,7 +1047,7 @@ added: v24.7.0 * `encapsulationAlgorithm` {string|Algorithm} * `encapsulationKey` {CryptoKey} -* `sharedKeyAlgorithm` {string|Algorithm|HmacImportParams|AesDerivedKeyParams} +* `sharedKeyAlgorithm` {string|Algorithm|HmacImportParams|AesDerivedKeyParams|KmacImportParams} * `extractable` {boolean} * `usages` {string\[]} See [Key usages][]. * Returns: {Promise} Fulfills with {EncapsulatedKey} upon success. @@ -1084,6 +1095,9 @@ The algorithms currently supported include: -* `algorithm` {string|Algorithm|RsaHashedKeyGenParams|EcKeyGenParams|HmacKeyGenParams|AesKeyGenParams} +* `algorithm` {string|Algorithm|RsaHashedKeyGenParams|EcKeyGenParams|HmacKeyGenParams|AesKeyGenParams|KmacKeyGenParams} @@ -1216,12 +1235,17 @@ The {CryptoKey} (secret key) generating algorithms supported include: * `'AES-OCB'`[^modern-algos] * `'ChaCha20-Poly1305'`[^modern-algos] * `'HMAC'` +* `'KMAC128'`[^modern-algos] +* `'KMAC256'`[^modern-algos] ### `subtle.importKey(format, keyData, algorithm, extractable, keyUsages)` -* `algorithm` {string|Algorithm|RsaHashedImportParams|EcKeyImportParams|HmacImportParams} +* `algorithm` {string|Algorithm|RsaHashedImportParams|EcKeyImportParams|HmacImportParams|KmacImportParams} @@ -1282,6 +1306,8 @@ The algorithms currently supported include: | `'Ed448'`[^secure-curves] | ✔ | ✔ | ✔ | ✔ | | ✔ | | | `'HDKF'` | | | | ✔ | ✔ | | | | `'HMAC'` | | | ✔ | ✔ | ✔ | | | +| `'KMAC128'`[^modern-algos] | | | ✔ | | ✔ | | | +| `'KMAC256'`[^modern-algos] | | | ✔ | | ✔ | | | | `'ML-DSA-44'`[^modern-algos] | ✔ | ✔ | ✔ | | | ✔ | ✔ | | `'ML-DSA-65'`[^modern-algos] | ✔ | ✔ | ✔ | | | ✔ | ✔ | | `'ML-DSA-87'`[^modern-algos] | ✔ | ✔ | ✔ | | | ✔ | ✔ | @@ -1300,6 +1326,9 @@ The algorithms currently supported include: -* `algorithm` {string|Algorithm|RsaPssParams|EcdsaParams|Ed448Params|ContextParams} +* `algorithm` {string|Algorithm|RsaPssParams|EcdsaParams|Ed448Params|ContextParams|KmacParams} * `key` {CryptoKey} * `data` {ArrayBuffer|TypedArray|DataView|Buffer} * Returns: {Promise} Fulfills with an {ArrayBuffer} upon success. 
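This patch adds KMAC entries to the sign/verify algorithm lists in the hunk below. A hedged sketch of their use follows; the parameter shapes are taken from the `KmacKeyGenParams` and `KmacParams` sections added later in this patch, and availability depends on the OpenSSL 3 build backing the Node.js binary:

```mjs
// Sketch: MAC a message with KMAC128 through the Web Crypto API.
const { subtle } = globalThis.crypto;

// KmacKeyGenParams: `length` is optional and omitted here.
const key = await subtle.generateKey({ name: 'KMAC128' }, false, ['sign', 'verify']);
const data = new TextEncoder().encode('hello');

// Per KmacParams, `length` is the MAC output length in bytes.
const mac = await subtle.sign({ name: 'KMAC128', length: 32 }, key, data);
console.log(await subtle.verify({ name: 'KMAC128', length: 32 }, key, mac, data)); // true
```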
@@ -1330,6 +1359,8 @@ The algorithms currently supported include:
 * `'Ed25519'`
 * `'Ed448'`[^secure-curves]
 * `'HMAC'`
+* `'KMAC128'`[^modern-algos]
+* `'KMAC256'`[^modern-algos]
 * `'ML-DSA-44'`[^modern-algos]
 * `'ML-DSA-65'`[^modern-algos]
 * `'ML-DSA-87'`[^modern-algos]
@@ -1357,7 +1388,7 @@ changes:
 * `unwrapAlgo` {string|Algorithm|RsaOaepParams|AesCtrParams|AesCbcParams|AeadParams}
-* `unwrappedKeyAlgo` {string|Algorithm|RsaHashedImportParams|EcKeyImportParams|HmacImportParams}
+* `unwrappedKeyAlgo` {string|Algorithm|RsaHashedImportParams|EcKeyImportParams|HmacImportParams|KmacImportParams}
@@ -1397,6 +1428,8 @@ The unwrapped key algorithms supported include:
 * `'Ed25519'`
 * `'Ed448'`[^secure-curves]
 * `'HMAC'`
+* `'KMAC128'`[^modern-algos]
+* `'KMAC256'`[^modern-algos]
 * `'ML-DSA-44'`[^modern-algos]
 * `'ML-DSA-65'`[^modern-algos]
 * `'ML-DSA-87'`[^modern-algos]
@@ -1414,6 +1447,9 @@ The unwrapped key algorithms supported include:
+
+#### `kmacImportParams.length`
+
+
+
+* Type: {number}
+
+The number of bits in the KMAC key. This is optional and should
+be omitted in most cases.
+
+#### `kmacImportParams.name`
+
+
+
+* Type: {string} Must be `'KMAC128'` or `'KMAC256'`.
+
+### Class: `KmacKeyAlgorithm`
+
+
+
+#### `kmacKeyAlgorithm.length`
+
+
+
+* Type: {number}
+
+The length of the KMAC key in bits.
+
+#### `kmacKeyAlgorithm.name`
+
+
+
+* Type: {string}
+
+### Class: `KmacKeyGenParams`
+
+
+
+#### `kmacKeyGenParams.length`
+
+
+
+* Type: {number}
+
+The number of bits to generate for the KMAC key. If omitted,
+the length will be determined by the KMAC algorithm used.
+This is optional and should be omitted in most cases.
+
+#### `kmacKeyGenParams.name`
+
+
+
+* Type: {string} Must be `'KMAC128'` or `'KMAC256'`.
+
+### Class: `KmacParams`
+
+
+
+#### `kmacParams.algorithm`
+
+
+
+* Type: {string} Must be `'KMAC128'` or `'KMAC256'`.
+
+#### `kmacParams.customization`
+
+
+
+* Type: {ArrayBuffer|TypedArray|DataView|Buffer|undefined}
+
+The `customization` member represents the optional customization string.
+
+#### `kmacParams.length`
+
+
+
+* Type: {number}
+
+The length of the output in bytes. This must be a positive integer.
+
 ### Class: `Pbkdf2Params`
-* `algorithm` {string|Algorithm|RsaPssParams|EcdsaParams|Ed448Params|ContextParams|KmacParams}
+* `algorithm` {string|Algorithm|RsaPssParams|EcdsaParams|ContextParams|KmacParams}
 * `key` {CryptoKey}
 * `data` {ArrayBuffer|TypedArray|DataView|Buffer}
 * Returns: {Promise} Fulfills with an {ArrayBuffer} upon success.
@@ -1462,7 +1462,7 @@ changes:
-* `algorithm` {string|Algorithm|RsaPssParams|EcdsaParams|Ed448Params|ContextParams}
+* `algorithm` {string|Algorithm|RsaPssParams|EcdsaParams|ContextParams|KmacParams}
 * `key` {CryptoKey}
 * `signature` {ArrayBuffer|TypedArray|DataView|Buffer}
 * `data` {ArrayBuffer|TypedArray|DataView|Buffer}
@@ -1829,20 +1829,23 @@ added: v24.7.0
 added: v24.7.0
 -->
-* Type: {string} Must be `'ML-DSA-44'`[^modern-algos], `'ML-DSA-65'`[^modern-algos], or `'ML-DSA-87'`[^modern-algos].
+* Type: {string} Must be `'Ed448'`[^secure-curves], `'ML-DSA-44'`[^modern-algos],
+  `'ML-DSA-65'`[^modern-algos], or `'ML-DSA-87'`[^modern-algos].
 
 #### `contextParams.context`
 
 * Type: {ArrayBuffer|TypedArray|DataView|Buffer|undefined}
 
 The `context` member represents the optional context data to associate with
 the message.
-The Node.js Web Crypto API implementation only supports zero-length context
-which is equivalent to not providing context at all.
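With the zero-length restriction removed above, a context can now accompany ML-DSA (and Ed448) signatures. A hedged sketch follows; the algorithm choice and context bytes are illustrative assumptions, and the parameter shape follows the `ContextParams` documentation above:

```mjs
// Sketch: bind application context bytes into an ML-DSA signature.
const { subtle } = globalThis.crypto;

const { privateKey, publicKey } =
  await subtle.generateKey('ML-DSA-65', false, ['sign', 'verify']);
const context = new TextEncoder().encode('example-protocol/v1');
const data = new TextEncoder().encode('payload');

const signature = await subtle.sign({ name: 'ML-DSA-65', context }, privateKey, data);
// Verification must supply the same context bytes.
console.log(await subtle.verify({ name: 'ML-DSA-65', context }, publicKey, signature, data)); // true
```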
### Class: `CShakeParams` @@ -2023,37 +2026,6 @@ added: v15.0.0 * Type: {string} Must be one of `'P-256'`, `'P-384'`, `'P-521'`. -### Class: `Ed448Params` - - - -#### `ed448Params.name` - - - -* Type: {string} Must be `'Ed448'`[^secure-curves]. - -#### `ed448Params.context` - - - -* Type: {ArrayBuffer|TypedArray|DataView|Buffer|undefined} - -The `context` member represents the optional context data to associate with -the message. -The Node.js Web Crypto API implementation only supports zero-length context -which is equivalent to not providing context at all. - ### Class: `EncapsulatedBits` -* `operation` {string} "encrypt", "decrypt", "sign", "verify", "digest", "generateKey", "deriveKey", "deriveBits", "importKey", "exportKey", "getPublicKey", "wrapKey", or "unwrapKey" +* `operation` {string} "encrypt", "decrypt", "sign", "verify", "digest", "generateKey", "deriveKey", "deriveBits", "importKey", "exportKey", "getPublicKey", "wrapKey", "unwrapKey", "encapsulateBits", "encapsulateKey", "decapsulateBits", or "decapsulateKey" * `algorithm` {string|Algorithm} -* `lengthOrAdditionalAlgorithm` {null|number|string|Algorithm|undefined} Depending on the operation this is either ignored, the value of the length argument when operation is "deriveBits", the algorithm of key to be derived when operation is "deriveKey", the algorithm of key to be exported before wrapping when operation is "wrapKey", or the algorithm of key to be imported after unwrapping when operation is "unwrapKey". **Default:** `null` when operation is "deriveBits", `undefined` otherwise. +* `lengthOrAdditionalAlgorithm` {null|number|string|Algorithm|undefined} Depending on the operation this is either ignored, the value of the length argument when operation is "deriveBits", the algorithm of key to be derived when operation is "deriveKey", the algorithm of key to be exported before wrapping when operation is "wrapKey", the algorithm of key to be imported after unwrapping when operation is "unwrapKey", or the algorithm of key to be imported after en/decapsulating a key when operation is "encapsulateKey" or "decapsulateKey". **Default:** `null` when operation is "deriveBits", `undefined` otherwise. * Returns: {boolean} Indicating whether the implementation supports the given operation @@ -811,6 +811,8 @@ Allows feature detection in Web Crypto API, which can be used to detect whether a given algorithm identifier (including its parameters) is supported for the given operation. +See [Checking for runtime algorithm support][] for an example use of this method. + ### `subtle.decapsulateBits(decapsulationAlgorithm, decapsulationKey, ciphertext)` +A temporary symmetric secret key (represented as {ArrayBuffer}) for message encryption +and the ciphertext (that can be transmitted to the message recipient along with the +message) encrypted by this shared key. The recipient uses their private key to determine +what the shared key is which then allows them to decrypt the message. + #### `encapsulatedBits.ciphertext` +A temporary symmetric secret key (represented as {CryptoKey}) for message encryption +and the ciphertext (that can be transmitted to the message recipient along with the +message) encrypted by this shared key. The recipient uses their private key to determine +what the shared key is which then allows them to decrypt the message. 
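A hedged sketch of the round trip described here, assuming an ML-KEM key pair and that the result destructures to `sharedKey` and `ciphertext` as the property documentation suggests:

```mjs
// Sketch: the sender encapsulates a fresh AES key; the recipient recovers it.
const { subtle } = globalThis.crypto;

const { publicKey, privateKey } =
  await subtle.generateKey('ML-KEM-768', false, ['encapsulateKey', 'decapsulateKey']);

// Sender side: `sharedKey` encrypts the message; `ciphertext` travels alongside it.
const { sharedKey, ciphertext } = await subtle.encapsulateKey(
  'ML-KEM-768', publicKey, { name: 'AES-GCM', length: 256 }, false, ['encrypt']);

// Recipient side: the same AES key is recovered from the ciphertext.
const recovered = await subtle.decapsulateKey(
  'ML-KEM-768', privateKey, ciphertext, { name: 'AES-GCM', length: 256 }, false, ['decrypt']);
console.log(recovered.algorithm.name); // 'AES-GCM'
```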
+ #### `encapsulatedKey.ciphertext` * `stream` {Readable|Duplex|ReadableStream} -* Returns: {boolean} +* Returns: {boolean|null} - Only returns `null` if `stream` is not a valid `Readable`, `Duplex` or `ReadableStream`. Returns whether the stream is readable. +### `stream.isWritable(stream)` + +* `stream` {Writable|Duplex|WritableStream} +* Returns: {boolean|null} - Only returns `null` if `stream` is not a valid `Writable`, `Duplex` or `WritableStream`. + +Returns whether the stream is writable. + ### `stream.Readable.from(iterable[, options])` -Type: Documentation-only - -These methods were deprecated because they can be used in a way which does not -hold the channel reference alive long enough to receive the events. +Type: Deprecation revoked -Use [`diagnostics_channel.subscribe(name, onMessage)`][] or -[`diagnostics_channel.unsubscribe(name, onMessage)`][] which does the same -thing instead. +These methods were deprecated because their use could leave the channel object +vulnerable to being garbage-collected if not strongly referenced by the user. +The deprecation was revoked because channel objects are now resistant to +garbage collection when the channel has active subscribers. ### DEP0164: `process.exit(code)`, `process.exitCode` coercion to integer @@ -4075,8 +4076,6 @@ an internal nodejs implementation rather than a public facing API, use `node:htt [`crypto.setEngine()`]: crypto.md#cryptosetengineengine-flags [`decipher.final()`]: crypto.md#decipherfinaloutputencoding [`decipher.setAuthTag()`]: crypto.md#deciphersetauthtagbuffer-encoding -[`diagnostics_channel.subscribe(name, onMessage)`]: diagnostics_channel.md#diagnostics_channelsubscribename-onmessage -[`diagnostics_channel.unsubscribe(name, onMessage)`]: diagnostics_channel.md#diagnostics_channelunsubscribename-onmessage [`dirent.parentPath`]: fs.md#direntparentpath [`dns.lookup()`]: dns.md#dnslookuphostname-options-callback [`dnsPromises.lookup()`]: dns.md#dnspromiseslookuphostname-options diff --git a/doc/api/diagnostics_channel.md b/doc/api/diagnostics_channel.md index 1c742f7f972ce0..b5be9c02f96312 100644 --- a/doc/api/diagnostics_channel.md +++ b/doc/api/diagnostics_channel.md @@ -373,13 +373,17 @@ channel.publish({ added: - v15.1.0 - v14.17.0 -deprecated: - - v18.7.0 - - v16.17.0 +changes: + - version: REPLACEME + pr-url: https://github.com/nodejs/node/pull/59758 + description: Deprecation revoked. + - version: + - v18.7.0 + - v16.17.0 + pr-url: https://github.com/nodejs/node/pull/44943 + description: Documentation-only deprecation. --> -> Stability: 0 - Deprecated: Use [`diagnostics_channel.subscribe(name, onMessage)`][] - * `onMessage` {Function} The handler to receive channel messages * `message` {any} The message data * `name` {string|symbol} The name of the channel @@ -414,10 +418,15 @@ channel.subscribe((message, name) => { added: - v15.1.0 - v14.17.0 -deprecated: - - v18.7.0 - - v16.17.0 changes: + - version: REPLACEME + pr-url: https://github.com/nodejs/node/pull/59758 + description: Deprecation revoked. + - version: + - v18.7.0 + - v16.17.0 + pr-url: https://github.com/nodejs/node/pull/44943 + description: Documentation-only deprecation. - version: - v17.1.0 - v16.14.0 @@ -426,8 +435,6 @@ changes: description: Added return value. Added to channels without subscribers. --> -> Stability: 0 - Deprecated: Use [`diagnostics_channel.unsubscribe(name, onMessage)`][] - * `onMessage` {Function} The previous subscribed handler to remove * Returns: {boolean} `true` if the handler was found, `false` otherwise. 
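With the deprecation revoked, subscribing through the channel object is again a supported pattern; the boolean return value documented above reports whether the handler was found:

```js
const diagnostics_channel = require('node:diagnostics_channel');

const channel = diagnostics_channel.channel('my-app:task');
const onMessage = (message, name) => {
  console.log(name, message);
};

channel.subscribe(onMessage);                 // the subscription keeps the channel alive
channel.publish({ id: 1 });                   // -> 'my-app:task' { id: 1 }
console.log(channel.unsubscribe(onMessage));  // -> true
```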
@@ -1423,7 +1430,6 @@ Emitted when a new thread is created. [`diagnostics_channel.channel(name)`]: #diagnostics_channelchannelname [`diagnostics_channel.subscribe(name, onMessage)`]: #diagnostics_channelsubscribename-onmessage [`diagnostics_channel.tracingChannel()`]: #diagnostics_channeltracingchannelnameorchannels -[`diagnostics_channel.unsubscribe(name, onMessage)`]: #diagnostics_channelunsubscribename-onmessage [`end` event]: #endevent [`error` event]: #errorevent [`net.Server.listen()`]: net.md#serverlisten From ad5cfcc9018b136c2f1ad82729baeb9d0787c5f7 Mon Sep 17 00:00:00 2001 From: Lee Jiho Date: Tue, 9 Sep 2025 01:12:56 +0900 Subject: [PATCH 090/103] typings: add missing properties in ConfigBinding PR-URL: https://github.com/nodejs/node/pull/59585 Reviewed-By: Daeyeon Jeong --- typings/internalBinding/config.d.ts | 3 +++ typings/internalBinding/os.d.ts | 1 + 2 files changed, 4 insertions(+) diff --git a/typings/internalBinding/config.d.ts b/typings/internalBinding/config.d.ts index 7c93e34238915e..68c1002f413652 100644 --- a/typings/internalBinding/config.d.ts +++ b/typings/internalBinding/config.d.ts @@ -1,12 +1,15 @@ export interface ConfigBinding { isDebugBuild: boolean; + openSSLIsBoringSSL: boolean; hasOpenSSL: boolean; fipsMode: boolean; hasIntl: boolean; + hasSmallICU: boolean; hasTracing: boolean; hasNodeOptions: boolean; hasInspector: boolean; noBrowserGlobals: boolean; bits: number; hasDtrace: boolean; + getDefaultLocale(): string; } diff --git a/typings/internalBinding/os.d.ts b/typings/internalBinding/os.d.ts index 91e907ef3c1205..afaf2e7b3b6b1b 100644 --- a/typings/internalBinding/os.d.ts +++ b/typings/internalBinding/os.d.ts @@ -22,4 +22,5 @@ export interface OSBinding { getPriority(pid: number, ctx: InternalOSBinding.OSContext): number | undefined; getOSInformation(ctx: InternalOSBinding.OSContext): [sysname: string, version: string, release: string]; isBigEndian: boolean; + getAvailableParallelism(): number; } From 15e547b3a45193295e02c0574bdc27ae062cb8a8 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?=EB=B0=A9=EC=A7=84=ED=98=81?= Date: Tue, 9 Sep 2025 01:13:05 +0900 Subject: [PATCH 091/103] typings: add typing for 'uv' PR-URL: https://github.com/nodejs/node/pull/59606 Reviewed-By: Daeyeon Jeong --- typings/globals.d.ts | 2 + typings/internalBinding/uv.d.ts | 90 +++++++++++++++++++++++++++++++++ 2 files changed, 92 insertions(+) create mode 100644 typings/internalBinding/uv.d.ts diff --git a/typings/globals.d.ts b/typings/globals.d.ts index e9875cdcc3bfa1..16625b08c8cfd0 100644 --- a/typings/globals.d.ts +++ b/typings/globals.d.ts @@ -23,6 +23,7 @@ import { TypesBinding } from './internalBinding/types'; import { URLBinding } from './internalBinding/url'; import { URLPatternBinding } from "./internalBinding/url_pattern"; import { UtilBinding } from './internalBinding/util'; +import { UVBinding } from './internalBinding/uv'; import { WASIBinding } from './internalBinding/wasi'; import { WorkerBinding } from './internalBinding/worker'; import { ModulesBinding } from './internalBinding/modules'; @@ -55,6 +56,7 @@ interface InternalBindingMap { url: URLBinding; url_pattern: URLPatternBinding; util: UtilBinding; + uv: UVBinding; wasi: WASIBinding; worker: WorkerBinding; zlib: ZlibBinding; diff --git a/typings/internalBinding/uv.d.ts b/typings/internalBinding/uv.d.ts new file mode 100644 index 00000000000000..93812fce414676 --- /dev/null +++ b/typings/internalBinding/uv.d.ts @@ -0,0 +1,90 @@ +export interface UVBinding { + errname(err: number): string; + getErrorMap(): Record; + 
getErrorMessage(err: number): string; + + UV_E2BIG: -7; + UV_EACCES: -13; + UV_EADDRINUSE: -48; + UV_EADDRNOTAVAIL: -49; + UV_EAFNOSUPPORT: -47; + UV_EAGAIN: -35; + UV_EAI_ADDRFAMILY: -3000; + UV_EAI_AGAIN: -3001; + UV_EAI_BADFLAGS: -3002; + UV_EAI_BADHINTS: -3013; + UV_EAI_CANCELED: -3003; + UV_EAI_FAIL: -3004; + UV_EAI_FAMILY: -3005; + UV_EAI_MEMORY: -3006; + UV_EAI_NODATA: -3007; + UV_EAI_NONAME: -3008; + UV_EAI_OVERFLOW: -3009; + UV_EAI_PROTOCOL: -3014; + UV_EAI_SERVICE: -3010; + UV_EAI_SOCKTYPE: -3011; + UV_EALREADY: -37; + UV_EBADF: -9; + UV_EBUSY: -16; + UV_ECANCELED: -89; + UV_ECHARSET: -4080; + UV_ECONNABORTED: -53; + UV_ECONNREFUSED: -61; + UV_ECONNRESET: -54; + UV_EDESTADDRREQ: -39; + UV_EEXIST: -17; + UV_EFAULT: -14; + UV_EFBIG: -27; + UV_EFTYPE: -79; + UV_EHOSTDOWN: -64; + UV_EHOSTUNREACH: -65; + UV_EILSEQ: -92; + UV_EINTR: -4; + UV_EINVAL: -22; + UV_EIO: -5; + UV_EISCONN: -56; + UV_EISDIR: -21; + UV_ELOOP: -62; + UV_EMFILE: -24; + UV_EMLINK: -31; + UV_EMSGSIZE: -40; + UV_ENAMETOOLONG: -63; + UV_ENETDOWN: -50; + UV_ENETUNREACH: -51; + UV_ENFILE: -23; + UV_ENOBUFS: -55; + UV_ENODATA: -96; + UV_ENODEV: -19; + UV_ENOENT: -2; + UV_ENOMEM: -12; + UV_ENONET: -4056; + UV_ENOPROTOOPT: -42; + UV_ENOSPC: -28; + UV_ENOSYS: -78; + UV_ENOTCONN: -57; + UV_ENOTDIR: -20; + UV_ENOTEMPTY: -66; + UV_ENOTSOCK: -38; + UV_ENOTSUP: -45; + UV_ENOTTY: -25; + UV_ENXIO: -6; + UV_EOF: -4095; + UV_EOVERFLOW: -84; + UV_EPERM: -1; + UV_EPIPE: -32; + UV_EPROTO: -100; + UV_EPROTONOSUPPORT: -43; + UV_EPROTOTYPE: -41; + UV_ERANGE: -34; + UV_EREMOTEIO: -4030; + UV_EROFS: -30; + UV_ESHUTDOWN: -58; + UV_ESOCKTNOSUPPORT: -44; + UV_ESPIPE: -29; + UV_ESRCH: -3; + UV_ETIMEDOUT: -60; + UV_ETXTBSY: -26; + UV_EUNATCH: -4023; + UV_EXDEV: -18; + UV_UNKNOWN: -4094; +} From f667215583a857d37ddab83197b7504906e252a0 Mon Sep 17 00:00:00 2001 From: Lee Jiho Date: Tue, 9 Sep 2025 03:44:18 +0900 Subject: [PATCH 092/103] path: refactor path joining logic for clarity and performance PR-URL: https://github.com/nodejs/node/pull/59781 Reviewed-By: James M Snell Reviewed-By: Minwoo Jung Reviewed-By: Luigi Pinca --- lib/path.js | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/lib/path.js b/lib/path.js index be6fd8b24ac8bf..63b037cddfb986 100644 --- a/lib/path.js +++ b/lib/path.js @@ -24,6 +24,7 @@ const { ArrayPrototypeIncludes, ArrayPrototypeJoin, + ArrayPrototypePush, ArrayPrototypeSlice, FunctionPrototypeBind, StringPrototypeCharCodeAt, @@ -506,22 +507,21 @@ const win32 = { if (args.length === 0) return '.'; - let joined; - let firstPart; + const path = []; for (let i = 0; i < args.length; ++i) { const arg = args[i]; validateString(arg, 'path'); if (arg.length > 0) { - if (joined === undefined) - joined = firstPart = arg; - else - joined += `\\${arg}`; + ArrayPrototypePush(path, arg); } } - if (joined === undefined) + if (path.length === 0) return '.'; + const firstPart = path[0]; + let joined = ArrayPrototypeJoin(path, '\\'); + // Make sure that the joined path doesn't start with two slashes, because // normalize() will mistake it for a UNC path then. // From e476e43c178ae13f92274b53ceb2cbf6ba6dfab9 Mon Sep 17 00:00:00 2001 From: sangwook <73056306+Han5991@users.noreply.github.com> Date: Tue, 9 Sep 2025 05:10:50 +0900 Subject: [PATCH 093/103] util: fix numericSeparator with negative fractional numbers Fix util.inspect() formatting bug where negative fractional numbers between -1 and 0 lost their minus sign when numericSeparator was true. 
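For illustration, the behavior after this change (before it, the first call printed '0.123_4', dropping the sign):

```js
const util = require('node:util');

console.log(util.inspect(-0.1234, { numericSeparator: true }));
// -> '-0.123_4'
console.log(util.inspect(-123456789.12345678, { numericSeparator: true }));
// -> '-123_456_789.123_456_78'
```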
Fixed formatNumber function to preserve sign by using original string representation. Also corrected test expectations for scientific notation to not apply numeric separators. Fixes: https://github.com/nodejs/node/issues/59376 PR-URL: https://github.com/nodejs/node/pull/59379 Reviewed-By: Ruben Bridgewater Reviewed-By: James M Snell --- lib/internal/util/inspect.js | 23 ++++++++++++++--------- test/parallel/test-util-inspect.js | 21 +++++++++++++++++++++ 2 files changed, 35 insertions(+), 9 deletions(-) diff --git a/lib/internal/util/inspect.js b/lib/internal/util/inspect.js index adbf5b72fa1435..d7c8539fbfc1b8 100644 --- a/lib/internal/util/inspect.js +++ b/lib/internal/util/inspect.js @@ -1851,23 +1851,28 @@ function formatNumber(fn, number, numericSeparator) { } return fn(`${number}`, 'number'); } + + const numberString = String(number); const integer = MathTrunc(number); - const string = String(integer); + if (integer === number) { - if (!NumberIsFinite(number) || StringPrototypeIncludes(string, 'e')) { - return fn(string, 'number'); + if (!NumberIsFinite(number) || StringPrototypeIncludes(numberString, 'e')) { + return fn(numberString, 'number'); } - return fn(`${addNumericSeparator(string)}`, 'number'); + return fn(addNumericSeparator(numberString), 'number'); } if (NumberIsNaN(number)) { - return fn(string, 'number'); + return fn(numberString, 'number'); } + + const decimalIndex = StringPrototypeIndexOf(numberString, '.'); + const integerPart = StringPrototypeSlice(numberString, 0, decimalIndex); + const fractionalPart = StringPrototypeSlice(numberString, decimalIndex + 1); + return fn(`${ - addNumericSeparator(string) + addNumericSeparator(integerPart) }.${ - addNumericSeparatorEnd( - StringPrototypeSlice(String(number), string.length + 1), - ) + addNumericSeparatorEnd(fractionalPart) }`, 'number'); } diff --git a/test/parallel/test-util-inspect.js b/test/parallel/test-util-inspect.js index 054526268a0ae9..22aa57612a3694 100644 --- a/test/parallel/test-util-inspect.js +++ b/test/parallel/test-util-inspect.js @@ -3493,6 +3493,27 @@ assert.strictEqual( util.inspect(-123456789.12345678, { numericSeparator: true }), '-123_456_789.123_456_78' ); + + // Regression test for https://github.com/nodejs/node/issues/59376 + // numericSeparator should work correctly for negative fractional numbers + { + // Test the exact values from the GitHub issue + const values = [0.1234, -0.12, -0.123, -0.1234, -1.234]; + assert.strictEqual( + util.inspect(values, { numericSeparator: true }), + '[ 0.123_4, -0.12, -0.123, -0.123_4, -1.234 ]' + ); + + // Test individual negative fractional numbers between -1 and 0 + assert.strictEqual( + util.inspect(-0.1234, { numericSeparator: true }), + '-0.123_4' + ); + assert.strictEqual( + util.inspect(-0.12345, { numericSeparator: true }), + '-0.123_45' + ); + } } // Regression test for https://github.com/nodejs/node/issues/41244 From d9138723695cd6b897e52e34f397ba57e4aae89d Mon Sep 17 00:00:00 2001 From: Haram Jeong <91401364+haramj@users.noreply.github.com> Date: Tue, 9 Sep 2025 09:36:14 +0900 Subject: [PATCH 094/103] assert: cap input size in myersDiff to avoid Int32Array overflow PR-URL: https://github.com/nodejs/node/pull/59578 Reviewed-By: James M Snell Reviewed-By: Ruben Bridgewater --- lib/internal/assert/myers_diff.js | 16 +++++++++++++++- test/parallel/test-assert-myers-diff.js | 25 +++++++++++++++++++++++++ 2 files changed, 40 insertions(+), 1 deletion(-) create mode 100644 test/parallel/test-assert-myers-diff.js diff --git 
a/lib/internal/assert/myers_diff.js b/lib/internal/assert/myers_diff.js index 6fcfc4e84456fe..ee6359042e31b8 100644 --- a/lib/internal/assert/myers_diff.js +++ b/lib/internal/assert/myers_diff.js @@ -6,6 +6,12 @@ const { StringPrototypeEndsWith, } = primordials; +const { + codes: { + ERR_OUT_OF_RANGE, + }, +} = require('internal/errors'); + const colors = require('internal/util/colors'); const kNopLinesToCollapse = 5; @@ -29,7 +35,15 @@ function myersDiff(actual, expected, checkCommaDisparity = false) { const actualLength = actual.length; const expectedLength = expected.length; const max = actualLength + expectedLength; - // TODO(BridgeAR): Cap the input in case the values go beyond the limit of 2^31 - 1. + + if (max > 2 ** 31 - 1) { + throw new ERR_OUT_OF_RANGE( + 'myersDiff input size', + '< 2^31', + max, + ); + } + const v = new Int32Array(2 * max + 1); const trace = []; diff --git a/test/parallel/test-assert-myers-diff.js b/test/parallel/test-assert-myers-diff.js new file mode 100644 index 00000000000000..31db3cd704ae06 --- /dev/null +++ b/test/parallel/test-assert-myers-diff.js @@ -0,0 +1,25 @@ +// Flags: --expose-internals +'use strict'; + +const common = require('../common'); +const assert = require('assert'); + +const { myersDiff } = require('internal/assert/myers_diff'); + +{ + const arr1 = { length: 2 ** 31 - 1 }; + const arr2 = { length: 2 }; + const max = arr1.length + arr2.length; + assert.throws( + () => { + myersDiff(arr1, arr2); + }, + common.expectsError({ + code: 'ERR_OUT_OF_RANGE', + name: 'RangeError', + message: 'The value of "myersDiff input size" ' + + 'is out of range. It must be < 2^31. ' + + `Received ${max}` + }) + ); +} From 0d1e53d935693e1b7e2f3c11064de66ad40dfec5 Mon Sep 17 00:00:00 2001 From: "Node.js GitHub Bot" Date: Tue, 9 Sep 2025 02:03:59 +0100 Subject: [PATCH 095/103] deps: update uvwasi to 0.0.23 PR-URL: https://github.com/nodejs/node/pull/59791 Reviewed-By: Antoine du Hamel Reviewed-By: Luigi Pinca Reviewed-By: Colin Ihrig Reviewed-By: Marco Ippolito Reviewed-By: Rafael Gonzaga --- deps/uvwasi/include/uvwasi.h | 53 ++++++++++++++++- deps/uvwasi/include/wasi_serdes.h | 9 +++ deps/uvwasi/include/wasi_types.h | 6 ++ deps/uvwasi/src/path_resolver.c | 36 ++++++++++- deps/uvwasi/src/uvwasi.c | 99 ++++++++++++++++++++++--------- 5 files changed, 171 insertions(+), 32 deletions(-) diff --git a/deps/uvwasi/include/uvwasi.h b/deps/uvwasi/include/uvwasi.h index 7dbcc667793a07..2a9d45207b2109 100644 --- a/deps/uvwasi/include/uvwasi.h +++ b/deps/uvwasi/include/uvwasi.h @@ -11,7 +11,7 @@ extern "C" { #define UVWASI_VERSION_MAJOR 0 #define UVWASI_VERSION_MINOR 0 -#define UVWASI_VERSION_PATCH 21 +#define UVWASI_VERSION_PATCH 23 #define UVWASI_VERSION_HEX ((UVWASI_VERSION_MAJOR << 16) | \ (UVWASI_VERSION_MINOR << 8) | \ (UVWASI_VERSION_PATCH)) @@ -77,124 +77,159 @@ typedef struct uvwasi_options_s { } uvwasi_options_t; /* Embedder API. */ +UVWASI_EXPORT uvwasi_errno_t uvwasi_init(uvwasi_t* uvwasi, const uvwasi_options_t* options); +UVWASI_EXPORT void uvwasi_destroy(uvwasi_t* uvwasi); +UVWASI_EXPORT void uvwasi_options_init(uvwasi_options_t* options); /* Use int instead of uv_file to avoid needing uv.h */ +UVWASI_EXPORT uvwasi_errno_t uvwasi_embedder_remap_fd(uvwasi_t* uvwasi, const uvwasi_fd_t fd, int new_host_fd); +UVWASI_EXPORT const char* uvwasi_embedder_err_code_to_string(uvwasi_errno_t code); /* WASI system call API. 
*/ +UVWASI_EXPORT uvwasi_errno_t uvwasi_args_get(uvwasi_t* uvwasi, char** argv, char* argv_buf); +UVWASI_EXPORT uvwasi_errno_t uvwasi_args_sizes_get(uvwasi_t* uvwasi, uvwasi_size_t* argc, uvwasi_size_t* argv_buf_size); +UVWASI_EXPORT uvwasi_errno_t uvwasi_clock_res_get(uvwasi_t* uvwasi, uvwasi_clockid_t clock_id, uvwasi_timestamp_t* resolution); +UVWASI_EXPORT uvwasi_errno_t uvwasi_clock_time_get(uvwasi_t* uvwasi, uvwasi_clockid_t clock_id, uvwasi_timestamp_t precision, uvwasi_timestamp_t* time); +UVWASI_EXPORT uvwasi_errno_t uvwasi_environ_get(uvwasi_t* uvwasi, char** environment, char* environ_buf); +UVWASI_EXPORT uvwasi_errno_t uvwasi_environ_sizes_get(uvwasi_t* uvwasi, uvwasi_size_t* environ_count, uvwasi_size_t* environ_buf_size); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_advise(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_filesize_t offset, uvwasi_filesize_t len, uvwasi_advice_t advice); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_allocate(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_filesize_t offset, uvwasi_filesize_t len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_close(uvwasi_t* uvwasi, uvwasi_fd_t fd); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_datasync(uvwasi_t* uvwasi, uvwasi_fd_t fd); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_fdstat_get(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_fdstat_t* buf); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_fdstat_set_flags(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_fdflags_t flags); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_fdstat_set_rights(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_rights_t fs_rights_base, uvwasi_rights_t fs_rights_inheriting ); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_filestat_get(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_filestat_t* buf); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_filestat_set_size(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_filesize_t st_size); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_filestat_set_times(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_timestamp_t st_atim, uvwasi_timestamp_t st_mtim, uvwasi_fstflags_t fst_flags); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_pread(uvwasi_t* uvwasi, uvwasi_fd_t fd, const uvwasi_iovec_t* iovs, uvwasi_size_t iovs_len, uvwasi_filesize_t offset, uvwasi_size_t* nread); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_prestat_get(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_prestat_t* buf); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_prestat_dir_name(uvwasi_t* uvwasi, uvwasi_fd_t fd, char* path, uvwasi_size_t path_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_pwrite(uvwasi_t* uvwasi, uvwasi_fd_t fd, const uvwasi_ciovec_t* iovs, uvwasi_size_t iovs_len, uvwasi_filesize_t offset, uvwasi_size_t* nwritten); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_read(uvwasi_t* uvwasi, uvwasi_fd_t fd, const uvwasi_iovec_t* iovs, uvwasi_size_t iovs_len, uvwasi_size_t* nread); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_readdir(uvwasi_t* uvwasi, uvwasi_fd_t fd, void* buf, uvwasi_size_t buf_len, uvwasi_dircookie_t cookie, uvwasi_size_t* bufused); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_renumber(uvwasi_t* uvwasi, uvwasi_fd_t from, uvwasi_fd_t to); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_seek(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_filedelta_t offset, uvwasi_whence_t whence, uvwasi_filesize_t* newoffset); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_sync(uvwasi_t* uvwasi, uvwasi_fd_t fd); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_tell(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_filesize_t* offset); +UVWASI_EXPORT uvwasi_errno_t uvwasi_fd_write(uvwasi_t* uvwasi, uvwasi_fd_t fd, const uvwasi_ciovec_t* iovs, uvwasi_size_t iovs_len, uvwasi_size_t* nwritten); +UVWASI_EXPORT 
uvwasi_errno_t uvwasi_path_create_directory(uvwasi_t* uvwasi, uvwasi_fd_t fd, const char* path, uvwasi_size_t path_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_filestat_get(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_lookupflags_t flags, const char* path, uvwasi_size_t path_len, uvwasi_filestat_t* buf); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_filestat_set_times(uvwasi_t* uvwasi, uvwasi_fd_t fd, uvwasi_lookupflags_t flags, @@ -203,6 +238,7 @@ uvwasi_errno_t uvwasi_path_filestat_set_times(uvwasi_t* uvwasi, uvwasi_timestamp_t st_atim, uvwasi_timestamp_t st_mtim, uvwasi_fstflags_t fst_flags); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_link(uvwasi_t* uvwasi, uvwasi_fd_t old_fd, uvwasi_lookupflags_t old_flags, @@ -211,6 +247,7 @@ uvwasi_errno_t uvwasi_path_link(uvwasi_t* uvwasi, uvwasi_fd_t new_fd, const char* new_path, uvwasi_size_t new_path_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_open(uvwasi_t* uvwasi, uvwasi_fd_t dirfd, uvwasi_lookupflags_t dirflags, @@ -221,6 +258,7 @@ uvwasi_errno_t uvwasi_path_open(uvwasi_t* uvwasi, uvwasi_rights_t fs_rights_inheriting, uvwasi_fdflags_t fs_flags, uvwasi_fd_t* fd); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_readlink(uvwasi_t* uvwasi, uvwasi_fd_t fd, const char* path, @@ -228,10 +266,12 @@ uvwasi_errno_t uvwasi_path_readlink(uvwasi_t* uvwasi, char* buf, uvwasi_size_t buf_len, uvwasi_size_t* bufused); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_remove_directory(uvwasi_t* uvwasi, uvwasi_fd_t fd, const char* path, uvwasi_size_t path_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_rename(uvwasi_t* uvwasi, uvwasi_fd_t old_fd, const char* old_path, @@ -239,31 +279,40 @@ uvwasi_errno_t uvwasi_path_rename(uvwasi_t* uvwasi, uvwasi_fd_t new_fd, const char* new_path, uvwasi_size_t new_path_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_symlink(uvwasi_t* uvwasi, const char* old_path, uvwasi_size_t old_path_len, uvwasi_fd_t fd, const char* new_path, uvwasi_size_t new_path_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_path_unlink_file(uvwasi_t* uvwasi, uvwasi_fd_t fd, const char* path, uvwasi_size_t path_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_poll_oneoff(uvwasi_t* uvwasi, const uvwasi_subscription_t* in, uvwasi_event_t* out, uvwasi_size_t nsubscriptions, uvwasi_size_t* nevents); +UVWASI_EXPORT uvwasi_errno_t uvwasi_proc_exit(uvwasi_t* uvwasi, uvwasi_exitcode_t rval); +UVWASI_EXPORT uvwasi_errno_t uvwasi_proc_raise(uvwasi_t* uvwasi, uvwasi_signal_t sig); +UVWASI_EXPORT uvwasi_errno_t uvwasi_random_get(uvwasi_t* uvwasi, void* buf, uvwasi_size_t buf_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_sched_yield(uvwasi_t* uvwasi); +UVWASI_EXPORT uvwasi_errno_t uvwasi_sock_accept(uvwasi_t* uvwasi, uvwasi_fd_t sock, uvwasi_fdflags_t flags, uvwasi_fd_t* fd); +UVWASI_EXPORT uvwasi_errno_t uvwasi_sock_recv(uvwasi_t* uvwasi, uvwasi_fd_t sock, const uvwasi_iovec_t* ri_data, @@ -271,12 +320,14 @@ uvwasi_errno_t uvwasi_sock_recv(uvwasi_t* uvwasi, uvwasi_riflags_t ri_flags, uvwasi_size_t* ro_datalen, uvwasi_roflags_t* ro_flags); +UVWASI_EXPORT uvwasi_errno_t uvwasi_sock_send(uvwasi_t* uvwasi, uvwasi_fd_t sock, const uvwasi_ciovec_t* si_data, uvwasi_size_t si_data_len, uvwasi_siflags_t si_flags, uvwasi_size_t* so_datalen); +UVWASI_EXPORT uvwasi_errno_t uvwasi_sock_shutdown(uvwasi_t* uvwasi, uvwasi_fd_t sock, uvwasi_sdflags_t how); diff --git a/deps/uvwasi/include/wasi_serdes.h b/deps/uvwasi/include/wasi_serdes.h index 038ae74786ad0b..b482918855379e 100644 --- a/deps/uvwasi/include/wasi_serdes.h +++ b/deps/uvwasi/include/wasi_serdes.h @@ -6,7 +6,9 @@ /* Basic uint{8,16,32,64}_t 
read/write functions. */ #define BASIC_TYPE(name, type) \ + UVWASI_EXPORT \ void uvwasi_serdes_write_##name(void* ptr, size_t offset, type value); \ + UVWASI_EXPORT \ type uvwasi_serdes_read_##name(const void* ptr, size_t offset); \ #define BASIC_TYPE_UVWASI(type) BASIC_TYPE(type, uvwasi_##type) @@ -83,15 +85,18 @@ BASIC_TYPE_UVWASI(whence_t) /* WASI structure read/write functions. */ #define STRUCT(name) \ + UVWASI_EXPORT \ void uvwasi_serdes_write_##name(void* ptr, \ size_t offset, \ const uvwasi_##name* value); \ + UVWASI_EXPORT \ void uvwasi_serdes_read_##name(const void* ptr, \ size_t offset, \ uvwasi_##name* value); /* iovs currently only need to be read from WASM memory. */ #define IOVS_STRUCT(name) \ + UVWASI_EXPORT \ uvwasi_errno_t uvwasi_serdes_read_##name(const void* ptr, \ size_t end, \ size_t offset, \ @@ -124,12 +129,14 @@ STRUCT(subscription_t) #undef STRUCT #undef IOVS_STRUCT +UVWASI_EXPORT uvwasi_errno_t uvwasi_serdes_readv_ciovec_t(const void* ptr, size_t end, size_t offset, uvwasi_ciovec_t* iovs, uvwasi_size_t iovs_len); +UVWASI_EXPORT uvwasi_errno_t uvwasi_serdes_readv_iovec_t(const void* ptr, size_t end, size_t offset, @@ -137,7 +144,9 @@ uvwasi_errno_t uvwasi_serdes_readv_iovec_t(const void* ptr, uvwasi_size_t iovs_len); /* Helper functions for memory bounds checking. */ +UVWASI_EXPORT int uvwasi_serdes_check_bounds(size_t offset, size_t end, size_t size); +UVWASI_EXPORT int uvwasi_serdes_check_array_bounds(size_t offset, size_t end, size_t size, diff --git a/deps/uvwasi/include/wasi_types.h b/deps/uvwasi/include/wasi_types.h index 045c55288056c2..85eadd73b9493f 100644 --- a/deps/uvwasi/include/wasi_types.h +++ b/deps/uvwasi/include/wasi_types.h @@ -322,4 +322,10 @@ typedef uint8_t uvwasi_whence_t; #define UVWASI_WHENCE_CUR 1 #define UVWASI_WHENCE_END 2 +#ifdef NEED_UVWASI_EXPORT + #define UVWASI_EXPORT __attribute__((visibility("default"))) +#else + #define UVWASI_EXPORT +#endif /* NEED_UVWASI_EXPORT */ + #endif /* __UVWASI_WASI_TYPES_H__ */ diff --git a/deps/uvwasi/src/path_resolver.c b/deps/uvwasi/src/path_resolver.c index deb3f603821f92..8a73f03d5406c9 100644 --- a/deps/uvwasi/src/path_resolver.c +++ b/deps/uvwasi/src/path_resolver.c @@ -72,16 +72,23 @@ uvwasi_errno_t uvwasi__normalize_path(const char* path, uvwasi_size_t path_len, char* normalized_path, uvwasi_size_t normalized_len) { + /* Normalizes path and stores the resulting buffer in normalized_path. + the sizes of the buffers must correspond to strlen() of the relevant + buffers, i.e. there must be room in the relevant buffers for a + NULL-byte. */ const char* cur; char* ptr; char* next; char* last; size_t cur_len; int is_absolute; + int has_trailing_slash; if (path_len > normalized_len) return UVWASI_ENOBUFS; + has_trailing_slash = path_len > 0 && IS_SLASH(path[path_len - 1]); + is_absolute = uvwasi__is_absolute_path(path, path_len); normalized_path[0] = '\0'; ptr = normalized_path; @@ -156,6 +163,12 @@ uvwasi_errno_t uvwasi__normalize_path(const char* path, *ptr = '\0'; } + if (has_trailing_slash && !IS_SLASH(*(ptr - 1))) { + *ptr = '/'; + ptr++; + *ptr = '\0'; + } + return UVWASI_ESUCCESS; } @@ -171,7 +184,9 @@ static int uvwasi__is_path_sandboxed(const char* path, return path == strstr(path, fd_path) ? 1 : 0; /* Handle relative fds that normalized to '.' */ - if (fd_path_len == 1 && fd_path[0] == '.') { + if ((fd_path_len == 1 && fd_path[0] == '.') + || (fd_path_len == 2 && fd_path[0] == '.' && fd_path[1] == '/') + ) { /* If the fd's path is '.', then any path does not begin with '..' is OK. 
*/ if ((path_len == 2 && path[0] == '.' && path[1] == '.') || (path_len > 2 && path[0] == '.' && path[1] == '.' && path[2] == '/')) { @@ -334,7 +349,8 @@ static uvwasi_errno_t uvwasi__resolve_path_to_host( char** resolved_path, uvwasi_size_t* resolved_len ) { - /* Return the normalized path, but resolved to the host's real path. */ + /* Return the normalized path, but resolved to the host's real path. + `path` must be a NULL-terminated string. */ char* res_path; char* stripped_path; int real_path_len; @@ -348,7 +364,11 @@ static uvwasi_errno_t uvwasi__resolve_path_to_host( fake_path_len = strlen(fd->normalized_path); /* If the fake path is '.' just ignore it. */ - if (fake_path_len == 1 && fd->normalized_path[0] == '.') { + if ((fake_path_len == 1 && fd->normalized_path[0] == '.') + || (fake_path_len == 2 + && fd->normalized_path[0] == '.' + && fd->normalized_path[1] == '/') + ) { fake_path_len = 0; } @@ -425,10 +445,20 @@ uvwasi_errno_t uvwasi__resolve_path(const uvwasi_t* uvwasi, normalized_parent = NULL; resolved_link_target = NULL; + if (uvwasi__is_absolute_path(input, input_len)) { + *resolved_path = NULL; + return UVWASI_ENOTCAPABLE; + } + start: normalized_path = NULL; err = UVWASI_ESUCCESS; + if (input_len != strnlen(input, input_len - 1) + 1) { + err = UVWASI_EINVAL; + goto exit; + } + if (1 == uvwasi__is_absolute_path(input, input_len)) { err = uvwasi__normalize_absolute_path(uvwasi, fd, diff --git a/deps/uvwasi/src/uvwasi.c b/deps/uvwasi/src/uvwasi.c index 948c1355c9ccf7..c33f5c4dee954b 100644 --- a/deps/uvwasi/src/uvwasi.c +++ b/deps/uvwasi/src/uvwasi.c @@ -395,6 +395,10 @@ uvwasi_errno_t uvwasi_init(uvwasi_t* uvwasi, const uvwasi_options_t* options) { if (options->preopen_socketc > 0) { uvwasi->loop = uvwasi__malloc(uvwasi, sizeof(uv_loop_t)); + + if (uvwasi->loop == NULL) + return UVWASI_ENOMEM; + r = uv_loop_init(uvwasi->loop); if (r != 0) { err = uvwasi__translate_uv_error(r); @@ -803,7 +807,7 @@ uvwasi_errno_t uvwasi_fd_close(uvwasi_t* uvwasi, uvwasi_fd_t fd) { uv_mutex_unlock(&wrap->mutex); if (err != UVWASI_ESUCCESS) { goto exit; - } + } } if (r != 0) { @@ -1392,6 +1396,7 @@ uvwasi_errno_t uvwasi_fd_readdir(uvwasi_t* uvwasi, #if defined(UVWASI_FD_READDIR_SUPPORTED) /* TODO(cjihrig): Avoid opening and closing the directory on each call. */ struct uvwasi_fd_wrap_t* wrap; + uvwasi_dircookie_t cur_cookie; uvwasi_dirent_t dirent; uv_dirent_t dirents[UVWASI__READDIR_NUM_ENTRIES]; uv_dir_t* dir; @@ -1400,7 +1405,6 @@ uvwasi_errno_t uvwasi_fd_readdir(uvwasi_t* uvwasi, size_t name_len; size_t available; size_t size_to_cp; - long tell; int i; int r; #endif /* defined(UVWASI_FD_READDIR_SUPPORTED) */ @@ -1440,8 +1444,22 @@ uvwasi_errno_t uvwasi_fd_readdir(uvwasi_t* uvwasi, uv_fs_req_cleanup(&req); /* Seek to the proper location in the directory. */ - if (cookie != UVWASI_DIRCOOKIE_START) - seekdir(dir->dir, cookie); + cur_cookie = 0; + while (cur_cookie < cookie) { + r = uv_fs_readdir(NULL, &req, dir, NULL); + if (r < 0) { + err = uvwasi__translate_uv_error(r); + uv_fs_req_cleanup(&req); + goto exit; + } + + cur_cookie += (uvwasi_dircookie_t)r; + uv_fs_req_cleanup(&req); + + if (r == 0) { + break; + } + } /* Read the directory entries into the provided buffer. 
*/ err = UVWASI_ESUCCESS; @@ -1456,15 +1474,9 @@ uvwasi_errno_t uvwasi_fd_readdir(uvwasi_t* uvwasi, available = 0; for (i = 0; i < r; i++) { - tell = telldir(dir->dir); - if (tell < 0) { - err = uvwasi__translate_uv_error(uv_translate_sys_error(errno)); - uv_fs_req_cleanup(&req); - goto exit; - } - + cur_cookie++; name_len = strlen(dirents[i].name); - dirent.d_next = (uvwasi_dircookie_t) tell; + dirent.d_next = (uvwasi_dircookie_t) cur_cookie; /* TODO(cjihrig): libuv doesn't provide d_ino, and d_type is not supported on all platforms. Use stat()? */ dirent.d_ino = 0; @@ -2105,8 +2117,13 @@ uvwasi_errno_t uvwasi_path_open(uvwasi_t* uvwasi, if (err != UVWASI_ESUCCESS) goto close_file_and_error_exit; - if ((o_flags & UVWASI_O_DIRECTORY) != 0 && - filetype != UVWASI_FILETYPE_DIRECTORY) { + if ( + (filetype != UVWASI_FILETYPE_DIRECTORY) + && ( + (o_flags & UVWASI_O_DIRECTORY) != 0 + || (resolved_path[strlen(resolved_path) - 1] == '/') + ) + ) { err = UVWASI_ENOTDIR; goto close_file_and_error_exit; } @@ -2361,6 +2378,7 @@ uvwasi_errno_t uvwasi_path_symlink(uvwasi_t* uvwasi, const char* new_path, uvwasi_size_t new_path_len) { char* truncated_old_path; + char* resolved_old_path; char* resolved_new_path; struct uvwasi_fd_wrap_t* wrap; uvwasi_errno_t err; @@ -2387,40 +2405,61 @@ uvwasi_errno_t uvwasi_path_symlink(uvwasi_t* uvwasi, if (err != UVWASI_ESUCCESS) return err; + resolved_old_path = NULL; + resolved_new_path = NULL; + truncated_old_path = NULL; + truncated_old_path = uvwasi__malloc(uvwasi, old_path_len + 1); if (truncated_old_path == NULL) { - uv_mutex_unlock(&wrap->mutex); - return UVWASI_ENOMEM; + err = UVWASI_ENOMEM; + goto exit; } memcpy(truncated_old_path, old_path, old_path_len); truncated_old_path[old_path_len] = '\0'; + if (old_path_len > 0 && old_path[0] == '/') { + err = UVWASI_EPERM; + goto exit; + } + + err = uvwasi__resolve_path(uvwasi, + wrap, + old_path, + old_path_len, + &resolved_old_path, + 0); + if (err != UVWASI_ESUCCESS) + goto exit; + err = uvwasi__resolve_path(uvwasi, wrap, new_path, new_path_len, &resolved_new_path, 0); - if (err != UVWASI_ESUCCESS) { - uv_mutex_unlock(&wrap->mutex); - uvwasi__free(uvwasi, truncated_old_path); - return err; - } + if (err != UVWASI_ESUCCESS) + goto exit; + /* Windows support may require setting the flags option. 
*/ r = uv_fs_symlink(NULL, &req, truncated_old_path, resolved_new_path, 0, NULL); + uv_fs_req_cleanup(&req); + if (r != 0) { + err = uvwasi__translate_uv_error(r); + goto exit; + } + + err = UVWASI_ESUCCESS; +exit: uv_mutex_unlock(&wrap->mutex); - uvwasi__free(uvwasi, truncated_old_path); + uvwasi__free(uvwasi, resolved_old_path); uvwasi__free(uvwasi, resolved_new_path); - uv_fs_req_cleanup(&req); - if (r != 0) - return uvwasi__translate_uv_error(r); + uvwasi__free(uvwasi, truncated_old_path); - return UVWASI_ESUCCESS; + return err; } - uvwasi_errno_t uvwasi_path_unlink_file(uvwasi_t* uvwasi, uvwasi_fd_t fd, const char* path, @@ -2793,7 +2832,7 @@ uvwasi_errno_t uvwasi_sock_shutdown(uvwasi_t* uvwasi, uv_mutex_unlock(&wrap->mutex); - if (shutdown_data.status != 0) + if (shutdown_data.status != 0) return uvwasi__translate_uv_error(shutdown_data.status); return UVWASI_ESUCCESS; @@ -2832,6 +2871,10 @@ uvwasi_errno_t uvwasi_sock_accept(uvwasi_t* uvwasi, sock_loop = uv_handle_get_loop((uv_handle_t*) wrap->sock); uv_tcp_t* uv_connect_sock = (uv_tcp_t*) uvwasi__malloc(uvwasi, sizeof(uv_tcp_t)); + + if (uv_connect_sock == NULL) + return UVWASI_ENOMEM; + uv_tcp_init(sock_loop, uv_connect_sock); r = uv_accept((uv_stream_t*) wrap->sock, (uv_stream_t*) uv_connect_sock); From 44c24657d32631aae959b216442e0b865240ed62 Mon Sep 17 00:00:00 2001 From: James M Snell Date: Mon, 8 Sep 2025 20:30:43 -0700 Subject: [PATCH 096/103] src: fixup node_messaging error handling MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Replace ToLocalChecked uses PR-URL: https://github.com/nodejs/node/pull/59792 Reviewed-By: Daeyeon Jeong Reviewed-By: Juan José Arboleda --- src/node_messaging.cc | 41 ++++++++++++++++++++++++++++------------- 1 file changed, 28 insertions(+), 13 deletions(-) diff --git a/src/node_messaging.cc b/src/node_messaging.cc index 084ff03dbe8124..ad4a7b3974df90 100644 --- a/src/node_messaging.cc +++ b/src/node_messaging.cc @@ -328,6 +328,7 @@ class SerializerDelegate : public ValueSerializer::Delegate { if (JSTransferable::IsJSTransferable(env_, context_, object)) { BaseObjectPtr js_transferable = JSTransferable::Wrap(env_, object); + if (!js_transferable) return Nothing(); return WriteHostObject(js_transferable); } @@ -550,6 +551,7 @@ Maybe Message::Serialize(Environment* env, return Nothing(); } host_object = JSTransferable::Wrap(env, entry); + if (!host_object) return Nothing(); } if (env->message_port_constructor_template()->HasInstance(entry) && @@ -1245,27 +1247,38 @@ Local GetMessagePortConstructorTemplate( BaseObjectPtr JSTransferable::Wrap(Environment* env, Local target) { Local context = env->context(); - Local wrapper_val = - target->GetPrivate(context, env->js_transferable_wrapper_private_symbol()) - .ToLocalChecked(); + Local wrapper_val; + if (!target + ->GetPrivate(context, env->js_transferable_wrapper_private_symbol()) + .ToLocal(&wrapper_val)) { + return {}; + } DCHECK(wrapper_val->IsObject() || wrapper_val->IsUndefined()); BaseObjectPtr wrapper; if (wrapper_val->IsObject()) { wrapper = BaseObjectPtr{Unwrap(wrapper_val)}; } else { - Local wrapper_obj = env->js_transferable_constructor_template() - ->GetFunction(context) - .ToLocalChecked() - ->NewInstance(context) - .ToLocalChecked(); + Local ctor; + if (!env->js_transferable_constructor_template() + ->GetFunction(context) + .ToLocal(&ctor)) { + return {}; + } + Local wrapper_obj; + if (!ctor->NewInstance(context).ToLocal(&wrapper_obj)) { + return {}; + } // Make sure the JSTransferable wrapper 
object is not garbage collected // until the strong BaseObjectPtr's reference count is decreased to 0. wrapper = MakeDetachedBaseObject(env, wrapper_obj, target); - target - ->SetPrivate( - context, env->js_transferable_wrapper_private_symbol(), wrapper_obj) - .ToChecked(); + if (target + ->SetPrivate(context, + env->js_transferable_wrapper_private_symbol(), + wrapper_obj) + .IsNothing()) { + return {}; + } } return wrapper; } @@ -1396,7 +1409,9 @@ Maybe JSTransferable::NestedTransferables() const { if (!JSTransferable::IsJSTransferable(env(), context, obj)) { continue; } - ret.emplace_back(JSTransferable::Wrap(env(), obj)); + auto wrapped = JSTransferable::Wrap(env(), obj); + if (!wrapped) return Nothing(); + ret.emplace_back(wrapped); } return Just(ret); } From 83d11f8a7ae5658e5f7625701e1adeeab5b1553f Mon Sep 17 00:00:00 2001 From: hotpineapple <77835382+hotpineapple@users.noreply.github.com> Date: Tue, 9 Sep 2025 15:30:41 +0900 Subject: [PATCH 097/103] tools: print appropriate output when test aborted MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Distinguish an aborted test run from one in which all tests passed. PR-URL: https://github.com/nodejs/node/pull/59794 Reviewed-By: Daeyeon Jeong Reviewed-By: Antoine du Hamel Reviewed-By: Juan José Arboleda Reviewed-By: Luigi Pinca --- tools/test.py | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/tools/test.py b/tools/test.py index 59801e1b07498b..d1684e86d14ffe 100755 --- a/tools/test.py +++ b/tools/test.py @@ -174,7 +174,7 @@ def Run(self, tasks) -> Dict: raise self.Done() return { - 'allPassed': not self.failed, + 'allPassed': not self.failed and not self.shutdown_event.is_set(), 'failed': self.failed, } @@ -1843,11 +1843,12 @@ def should_keep(case): if result['allPassed']: print("\nAll tests passed.") - else: + elif result['failed']: print("\nFailed tests:") for failure in result['failed']: print(EscapeCommand(failure.command)) - + else: + print("\nTest aborted.") return exitcode From bcb29fb84f911e421f645e331389b6fc88ec4092 Mon Sep 17 00:00:00 2001 From: Yaksh Bariya Date: Tue, 9 Sep 2025 17:47:47 +0530 Subject: [PATCH 098/103] src: correctly report memory changes to V8 Call `V8::ExternalMemoryAccounter::Update` instead of `V8::ExternalMemoryAccounter::Increase` to report the memory difference to V8. Calling `V8::ExternalMemoryAccounter::Increase` with a signed integer on 32-bit platforms can cause a GC to be triggered inside a GC, leading to a crash in certain cases. During GC, native objects are destructed. In the destructor of the `CompressionStream` class used by zlib, memory release information is passed to `V8::ExternalMemoryAccounter::Increase()` instead of `V8::ExternalMemoryAccounter::Decrease()`, which trips V8's memory limits, thus triggering a GC inside a GC, which leads to a crash. The bug was initially introduced in commit 1d5d7b6eedb2274c9ad48b5f378598a10479e4a7. For the full report, see https://hackerone.com/reports/3302484 PR-URL: https://github.com/nodejs/node/pull/59623 Reviewed-By: Chengzhong Wu --- src/node_mem-inl.h | 2 +- src/node_zlib.cc | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/src/node_mem-inl.h b/src/node_mem-inl.h index 06871d031d36f6..70d28dd524be84 100644 --- a/src/node_mem-inl.h +++ b/src/node_mem-inl.h @@ -59,7 +59,7 @@ void* NgLibMemoryManager::ReallocImpl(void* ptr, // Environment*/Isolate* parameter and call the V8 method transparently.
const int64_t new_size = size - previous_size; manager->IncreaseAllocatedSize(new_size); - manager->env()->external_memory_accounter()->Increase( + manager->env()->external_memory_accounter()->Update( manager->env()->isolate(), new_size); *reinterpret_cast<size_t*>(mem) = size; mem += sizeof(size_t); diff --git a/src/node_zlib.cc b/src/node_zlib.cc index c088c54753989f..b8617093bdf5a6 100644 --- a/src/node_zlib.cc +++ b/src/node_zlib.cc @@ -644,7 +644,7 @@ class CompressionStream : public AsyncWrap, public ThreadPoolWork { if (report == 0) return; CHECK_IMPLIES(report < 0, zlib_memory_ >= static_cast<size_t>(-report)); zlib_memory_ += report; - AsyncWrap::env()->external_memory_accounter()->Increase( + AsyncWrap::env()->external_memory_accounter()->Update( AsyncWrap::env()->isolate(), report); } From 8671a6cdb3144bd7e27f6f6aaa32f67850c4434c Mon Sep 17 00:00:00 2001 From: Rafael Gonzaga Date: Tue, 9 Sep 2025 14:06:08 -0300 Subject: [PATCH 099/103] doc: stabilize --disable-sigusr1 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Signed-off-by: RafaelGSS PR-URL: https://github.com/nodejs/node/pull/59707 Reviewed-By: Luigi Pinca Reviewed-By: Benjamin Gruenbaum Reviewed-By: Ulises Gascón Reviewed-By: Juan José Arboleda Reviewed-By: Antoine du Hamel --- doc/api/cli.md | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/doc/api/cli.md b/doc/api/cli.md index 01735cafddf762..55485005e5eae6 100644 --- a/doc/api/cli.md +++ b/doc/api/cli.md @@ -592,10 +592,12 @@ property throw an exception with the code `ERR_PROTO_ACCESS`. added: - v23.7.0 - v22.14.0 +changes: + - version: REPLACEME + pr-url: https://github.com/nodejs/node/pull/59707 + description: The option is no longer experimental. --> -> Stability: 1.2 - Release candidate - Disable the ability of starting a debugging session by sending a `SIGUSR1` signal to the process. From 2865d2ac2049e58e3d80ee7ca81dfe84118acd57 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Tue, 9 Sep 2025 21:23:09 +0200 Subject: [PATCH 100/103] http: unbreak keepAliveTimeoutBuffer MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Adds a guard: `server.keepAliveTimeout` and `server.keepAliveTimeoutBuffer` now fall back to safe defaults (0 and 1e3, respectively) unless they are non-negative finite numbers. PR-URL: https://github.com/nodejs/node/pull/59784 Reviewed-By: Gerhard Stöbich Reviewed-By: Rafael Gonzaga --- lib/_http_server.js | 9 +++++++-- 1 file changed, 7 insertions(+), 2 deletions(-) diff --git a/lib/_http_server.js b/lib/_http_server.js index de71fdf7228b05..f3b8e9c35950d3 100644 --- a/lib/_http_server.js +++ b/lib/_http_server.js @@ -1025,11 +1025,16 @@ function resOnFinish(req, res, socket, state, server) { socket.end(); } } else if (state.outgoing.length === 0) { - if (server.keepAliveTimeout && typeof socket.setTimeout === 'function') { + const keepAliveTimeout = NumberIsFinite(server.keepAliveTimeout) && server.keepAliveTimeout >= 0 ? + server.keepAliveTimeout : 0; + const keepAliveTimeoutBuffer = NumberIsFinite(server.keepAliveTimeoutBuffer) && server.keepAliveTimeoutBuffer >= 0 ? + server.keepAliveTimeoutBuffer : 1e3; + + if (keepAliveTimeout && typeof socket.setTimeout === 'function') { // Extend the internal timeout by the configured buffer to reduce // the likelihood of ECONNRESET errors. // This allows fine-tuning beyond the advertised keepAliveTimeout.
- socket.setTimeout(server.keepAliveTimeout + server.keepAliveTimeoutBuffer); + socket.setTimeout(keepAliveTimeout + keepAliveTimeoutBuffer); state.keepAliveTimeoutSet = true; } } else { From 92128a8fe23fd2f6bce0145acdaf750a594b275f Mon Sep 17 00:00:00 2001 From: James M Snell Date: Tue, 9 Sep 2025 22:34:54 -0700 Subject: [PATCH 101/103] src: use DictionaryTemplate for node_url_pattern MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Improved API and better performance: ``` url/urlpattern-parse.js n=100000 ... *** 11.59 % ±0.96% ±1.28% ±1.66% url/urlpattern-parse.js n=100000 ... *** 9.28 % ±0.94% ±1.25% ±1.63% url/urlpattern-parse.js n=100000 ... *** 6.97 % ±0.97% ±1.29% ±1.70% url/urlpattern-parse.js n=100000 ... *** 7.56 % ±0.92% ±1.22% ±1.59% url/urlpattern-test.js n=100000 ... *** 2.84 % ±1.50% ±2.00% ±2.61% url/urlpattern-test.js n=100000 ... *** 4.13 % ±2.14% ±2.84% ±3.70% url/urlpattern-test.js n=100000 ... *** 4.76 % ±1.43% ±1.91% ±2.49% url/urlpattern-test.js n=100000 ... *** 4.44 % ±1.26% ±1.68% ±2.19% ``` PR-URL: https://github.com/nodejs/node/pull/59802 Reviewed-By: Anna Henningsen Reviewed-By: Yagiz Nizipli --- src/env_properties.h | 60 ++++------------ src/node_sqlite.cc | 140 ++++++++++++++++++------------------- src/node_url_pattern.cc | 93 ++++++++++++------------- src/node_util.cc | 37 ++++++---- src/node_v8.cc | 151 ++++++++++++++++++++++------------------ src/node_worker.cc | 83 ++++++++++++---------- src/util-inl.h | 25 +++++++ src/util.h | 14 ++++ 8 files changed, 319 insertions(+), 284 deletions(-) diff --git a/src/env_properties.h b/src/env_properties.h index ea2eff221fd3ea..7ef519960070c0 100644 --- a/src/env_properties.h +++ b/src/env_properties.h @@ -84,7 +84,6 @@ V(base_string, "base") \ V(base_url_string, "baseURL") \ V(bits_string, "bits") \ - V(block_list_string, "blockList") \ V(buffer_string, "buffer") \ V(bytes_parsed_string, "bytesParsed") \ V(bytes_read_string, "bytesRead") \ @@ -93,11 +92,9 @@ V(cached_data_produced_string, "cachedDataProduced") \ V(cached_data_rejected_string, "cachedDataRejected") \ V(cached_data_string, "cachedData") \ - V(cache_key_string, "cacheKey") \ V(cert_usage_string, "certUsage") \ V(change_string, "change") \ V(changes_string, "changes") \ - V(channel_string, "channel") \ V(chunks_sent_since_last_write_string, "chunksSentSinceLastWrite") \ V(client_id_string, "clientId") \ V(clone_unsupported_type_str, "Cannot clone object of unsupported type.") \ @@ -106,9 +103,6 @@ "transferList") \ V(clone_untransferable_str, "Found invalid value in transferList.") \ V(code_string, "code") \ - V(column_number_string, "columnNumber") \ - V(column_string, "column") \ - V(commonjs_string, "commonjs") \ V(config_string, "config") \ V(constants_string, "constants") \ V(crypto_dh_string, "dh") \ @@ -140,7 +134,6 @@ V(crypto_rsa_pss_string, "rsa-pss") \ V(cwd_string, "cwd") \ V(data_string, "data") \ - V(database_string, "database") \ V(default_is_true_string, "defaultIsTrue") \ V(deserialize_info_string, "deserializeInfo") \ V(dest_string, "dest") \ @@ -167,10 +160,8 @@ V(ecdh_string, "ECDH") \ V(emit_string, "emit") \ V(emit_warning_string, "emitWarning") \ - V(empty_object_string, "{}") \ V(encoding_string, "encoding") \ V(entries_string, "entries") \ - V(entry_type_string, "entryType") \ V(env_pairs_string, "envPairs") \ V(env_var_settings_string, "envVarSettings") \ V(err_sqlite_error_string, "ERR_SQLITE_ERROR") \ @@ -198,10 +189,8 @@ V(fingerprint_string, "fingerprint") \ V(flags_string, "flags") \ 
V(flowlabel_string, "flowlabel") \ - V(fragment_string, "fragment") \ V(frames_received_string, "framesReceived") \ V(frames_sent_string, "framesSent") \ - V(function_name_string, "functionName") \ V(function_string, "function") \ V(get_string, "get") \ V(get_data_clone_error_string, "_getDataCloneError") \ @@ -229,14 +218,10 @@ V(infoaccess_string, "infoAccess") \ V(inherit_string, "inherit") \ V(input_string, "input") \ - V(inputs_string, "inputs") \ - V(internal_binding_string, "internalBinding") \ - V(internal_string, "internal") \ V(inverse_string, "inverse") \ V(ipv4_string, "IPv4") \ V(ipv6_string, "IPv6") \ V(isclosing_string, "isClosing") \ - V(isfinished_string, "isFinished") \ V(issuer_string, "issuer") \ V(issuercert_string, "issuerCertificate") \ V(iterator_string, "Iterator") \ @@ -268,7 +253,6 @@ V(last_insert_rowid_string, "lastInsertRowid") \ V(length_string, "length") \ V(library_string, "library") \ - V(line_number_string, "lineNumber") \ V(loop_count, "loopCount") \ V(mac_string, "mac") \ V(match_string, "match") \ @@ -286,12 +270,10 @@ V(modulus_length_string, "modulusLength") \ V(name_string, "name") \ V(named_curve_string, "namedCurve") \ - V(netmask_string, "netmask") \ V(next_string, "next") \ V(nistcurve_string, "nistCurve") \ V(node_string, "node") \ V(nsname_string, "nsname") \ - V(num_cols_string, "num_cols") \ V(object_string, "Object") \ V(ocsp_request_string, "OCSPRequest") \ V(oncertcb_string, "oncertcb") \ @@ -349,7 +331,6 @@ V(psk_string, "psk") \ V(pubkey_string, "pubkey") \ V(public_exponent_string, "publicExponent") \ - V(query_string, "query") \ V(rate_string, "rate") \ V(raw_string, "raw") \ V(read_host_object_string, "_readHostObject") \ @@ -367,17 +348,11 @@ "export * from 'original'; export { default } from 'original'; export " \ "const __esModule = true;") \ V(require_string, "require") \ - V(resolve_string, "resolve") \ V(resource_string, "resource") \ V(result_string, "result") \ V(retry_string, "retry") \ V(return_arrays_string, "returnArrays") \ - V(return_string, "return") \ V(salt_length_string, "saltLength") \ - V(scheme_string, "scheme") \ - V(scopeid_string, "scopeid") \ - V(script_id_string, "scriptId") \ - V(script_name_string, "scriptName") \ V(search_string, "search") \ V(selector_string, "selector") \ V(serial_number_string, "serialNumber") \ @@ -399,9 +374,7 @@ V(stack_string, "stack") \ V(standard_name_string, "standardName") \ V(start_string, "start") \ - V(start_time_string, "startTime") \ V(state_string, "state") \ - V(statement_string, "statement") \ V(stats_string, "stats") \ V(status_string, "status") \ V(stdio_string, "stdio") \ @@ -430,12 +403,6 @@ V(type_string, "type") \ V(uid_string, "uid") \ V(unknown_string, "") \ - V(url_special_ftp_string, "ftp:") \ - V(url_special_file_string, "file:") \ - V(url_special_http_string, "http:") \ - V(url_special_https_string, "https:") \ - V(url_special_ws_string, "ws:") \ - V(url_special_wss_string, "wss:") \ V(url_string, "url") \ V(username_string, "username") \ V(valid_from_string, "valid_from") \ @@ -449,57 +416,58 @@ V(wrap_string, "wrap") \ V(writable_string, "writable") \ V(write_host_object_string, "_writeHostObject") \ - V(write_queue_size_string, "writeQueueSize") \ - V(x_forwarded_string, "x-forwarded-for") + V(write_queue_size_string, "writeQueueSize") #define PER_ISOLATE_TEMPLATE_PROPERTIES(V) \ V(async_wrap_ctor_template, v8::FunctionTemplate) \ - V(async_wrap_object_ctor_template, v8::FunctionTemplate) \ V(binding_data_default_template, v8::ObjectTemplate) \ 
V(blob_constructor_template, v8::FunctionTemplate) \ V(blob_reader_constructor_template, v8::FunctionTemplate) \ V(blocklist_constructor_template, v8::FunctionTemplate) \ + V(callsite_template, v8::DictionaryTemplate) \ V(contextify_global_template, v8::ObjectTemplate) \ V(contextify_wrapper_template, v8::ObjectTemplate) \ + V(cpu_usage_template, v8::DictionaryTemplate) \ V(crypto_key_object_handle_constructor, v8::FunctionTemplate) \ V(env_proxy_template, v8::ObjectTemplate) \ V(env_proxy_ctor_template, v8::FunctionTemplate) \ V(dir_instance_template, v8::ObjectTemplate) \ V(fd_constructor_template, v8::ObjectTemplate) \ V(fdclose_constructor_template, v8::ObjectTemplate) \ - V(fdentry_constructor_template, v8::FunctionTemplate) \ V(filehandlereadwrap_template, v8::ObjectTemplate) \ + V(free_list_statistics_template, v8::DictionaryTemplate) \ V(fsreqpromise_constructor_template, v8::ObjectTemplate) \ V(handle_wrap_ctor_template, v8::FunctionTemplate) \ + V(heap_statistics_template, v8::DictionaryTemplate) \ + V(v8_heap_statistics_template, v8::DictionaryTemplate) \ V(histogram_ctor_template, v8::FunctionTemplate) \ V(http2settings_constructor_template, v8::ObjectTemplate) \ V(http2stream_constructor_template, v8::ObjectTemplate) \ V(http2ping_constructor_template, v8::ObjectTemplate) \ V(i18n_converter_template, v8::ObjectTemplate) \ V(intervalhistogram_constructor_template, v8::FunctionTemplate) \ + V(iter_template, v8::DictionaryTemplate) \ V(js_transferable_constructor_template, v8::FunctionTemplate) \ V(libuv_stream_wrap_ctor_template, v8::FunctionTemplate) \ V(lock_holder_constructor_template, v8::FunctionTemplate) \ V(message_port_constructor_template, v8::FunctionTemplate) \ V(module_wrap_constructor_template, v8::FunctionTemplate) \ - V(microtask_queue_ctor_template, v8::FunctionTemplate) \ + V(object_stats_template, v8::DictionaryTemplate) \ + V(page_stats_template, v8::DictionaryTemplate) \ V(pipe_constructor_template, v8::FunctionTemplate) \ - V(promise_wrap_template, v8::ObjectTemplate) \ - V(sab_lifetimepartner_constructor_template, v8::FunctionTemplate) \ V(script_context_constructor_template, v8::FunctionTemplate) \ V(secure_context_constructor_template, v8::FunctionTemplate) \ V(shutdown_wrap_template, v8::ObjectTemplate) \ V(socketaddress_constructor_template, v8::FunctionTemplate) \ + V(space_stats_template, v8::DictionaryTemplate) \ + V(sqlite_column_template, v8::DictionaryTemplate) \ V(sqlite_statement_sync_constructor_template, v8::FunctionTemplate) \ V(sqlite_statement_sync_iterator_constructor_template, v8::FunctionTemplate) \ V(sqlite_session_constructor_template, v8::FunctionTemplate) \ - V(streambaseentry_ctor_template, v8::FunctionTemplate) \ V(streambaseoutputstream_constructor_template, v8::ObjectTemplate) \ - V(streamentry_ctor_template, v8::FunctionTemplate) \ - V(streamentry_opaque_ctor_template, v8::FunctionTemplate) \ - V(qlogoutputstream_constructor_template, v8::ObjectTemplate) \ V(tcp_constructor_template, v8::FunctionTemplate) \ V(tty_constructor_template, v8::FunctionTemplate) \ + V(urlpatternresult_template, v8::DictionaryTemplate) \ V(write_wrap_template, v8::ObjectTemplate) \ V(worker_cpu_profile_taker_template, v8::ObjectTemplate) \ V(worker_cpu_usage_taker_template, v8::ObjectTemplate) \ @@ -516,11 +484,9 @@ V(async_hooks_init_function, v8::Function) \ V(async_hooks_promise_resolve_function, v8::Function) \ V(buffer_prototype_object, v8::Object) \ - V(crypto_key_object_constructor, v8::Function) \ V(crypto_key_object_private_constructor, 
v8::Function) \ V(crypto_key_object_public_constructor, v8::Function) \ V(crypto_key_object_secret_constructor, v8::Function) \ - V(domexception_function, v8::Function) \ V(enhance_fatal_stack_after_inspector, v8::Function) \ V(enhance_fatal_stack_before_inspector, v8::Function) \ V(get_source_map_error_source, v8::Function) \ @@ -557,7 +523,6 @@ V(primordials_safe_set_prototype_object, v8::Object) \ V(primordials_safe_weak_map_prototype_object, v8::Object) \ V(primordials_safe_weak_set_prototype_object, v8::Object) \ - V(promise_hook_handler, v8::Function) \ V(promise_reject_callback, v8::Function) \ V(snapshot_serialize_callback, v8::Function) \ V(snapshot_deserialize_callback, v8::Function) \ @@ -568,7 +533,6 @@ V(tls_wrap_constructor_function, v8::Function) \ V(trace_category_state_function, v8::Function) \ V(udp_constructor_function, v8::Function) \ - V(url_constructor_function, v8::Function) \ V(wasm_streaming_compilation_impl, v8::Function) \ V(wasm_streaming_object_constructor, v8::Function) diff --git a/src/node_sqlite.cc b/src/node_sqlite.cc index 49b7f9a36c85da..c902e0d76c064b 100644 --- a/src/node_sqlite.cc +++ b/src/node_sqlite.cc @@ -24,6 +24,7 @@ using v8::BigInt; using v8::Boolean; using v8::ConstructorBehavior; using v8::Context; +using v8::DictionaryTemplate; using v8::DontDelete; using v8::Exception; using v8::Function; @@ -119,6 +120,18 @@ using v8::Value; } \ } while (0) +namespace { +Local getLazyIterTemplate(Environment* env) { + auto iter_template = env->iter_template(); + if (iter_template.IsEmpty()) { + static constexpr std::string_view iter_keys[] = {"done", "value"}; + iter_template = DictionaryTemplate::New(env->isolate(), iter_keys); + env->set_iter_template(iter_template); + } + return iter_template; +} +} // namespace + inline MaybeLocal CreateSQLiteError(Isolate* isolate, const char* message) { Local js_msg; @@ -2231,58 +2244,35 @@ void StatementSync::Columns(const FunctionCallbackInfo& args) { int num_cols = sqlite3_column_count(stmt->statement_); Isolate* isolate = env->isolate(); LocalVector cols(isolate); - LocalVector col_keys(isolate, - {env->column_string(), - env->database_string(), - env->name_string(), - env->table_string(), - env->type_string()}); - Local value; + auto sqlite_column_template = env->sqlite_column_template(); + if (sqlite_column_template.IsEmpty()) { + static constexpr std::string_view col_keys[] = { + "column", "database", "name", "table", "type"}; + sqlite_column_template = DictionaryTemplate::New(isolate, col_keys); + env->set_sqlite_column_template(sqlite_column_template); + } cols.reserve(num_cols); for (int i = 0; i < num_cols; ++i) { - LocalVector col_values(isolate); - col_values.reserve(col_keys.size()); - - if (!NullableSQLiteStringToValue( - isolate, sqlite3_column_origin_name(stmt->statement_, i)) - .ToLocal(&value)) { - return; - } - col_values.emplace_back(value); - - if (!NullableSQLiteStringToValue( - isolate, sqlite3_column_database_name(stmt->statement_, i)) - .ToLocal(&value)) { - return; - } - col_values.emplace_back(value); - - if (!stmt->ColumnNameToName(i).ToLocal(&value)) { - return; - } - col_values.emplace_back(value); - - if (!NullableSQLiteStringToValue( - isolate, sqlite3_column_table_name(stmt->statement_, i)) - .ToLocal(&value)) { - return; - } - col_values.emplace_back(value); - - if (!NullableSQLiteStringToValue( - isolate, sqlite3_column_decltype(stmt->statement_, i)) - .ToLocal(&value)) { + MaybeLocal values[] = { + NullableSQLiteStringToValue( + isolate, 
sqlite3_column_origin_name(stmt->statement_, i)), + NullableSQLiteStringToValue( + isolate, sqlite3_column_database_name(stmt->statement_, i)), + stmt->ColumnNameToName(i), + NullableSQLiteStringToValue( + isolate, sqlite3_column_table_name(stmt->statement_, i)), + NullableSQLiteStringToValue( + isolate, sqlite3_column_decltype(stmt->statement_, i)), + }; + + Local col; + if (!NewDictionaryInstanceNullProto( + env->context(), sqlite_column_template, values) + .ToLocal(&col)) { return; } - col_values.emplace_back(value); - - Local column = Object::New(isolate, - Null(isolate), - col_keys.data(), - col_values.data(), - col_keys.size()); - cols.emplace_back(column); + cols.emplace_back(col); } args.GetReturnValue().Set(Array::New(isolate, cols.data(), cols.size())); @@ -2514,15 +2504,19 @@ void StatementSyncIterator::Next(const FunctionCallbackInfo& args) { THROW_AND_RETURN_ON_BAD_STATE( env, iter->stmt_->IsFinalized(), "statement has been finalized"); Isolate* isolate = env->isolate(); - LocalVector keys(isolate, {env->done_string(), env->value_string()}); + + auto iter_template = getLazyIterTemplate(env); if (iter->done_) { - LocalVector values(isolate, - {Boolean::New(isolate, true), Null(isolate)}); - DCHECK_EQ(values.size(), keys.size()); - Local result = Object::New( - isolate, Null(isolate), keys.data(), values.data(), keys.size()); - args.GetReturnValue().Set(result); + MaybeLocal values[]{ + Boolean::New(isolate, true), + Null(isolate), + }; + Local result; + if (NewDictionaryInstanceNullProto(env->context(), iter_template, values) + .ToLocal(&result)) { + args.GetReturnValue().Set(result); + } return; } @@ -2531,12 +2525,12 @@ void StatementSyncIterator::Next(const FunctionCallbackInfo& args) { CHECK_ERROR_OR_THROW( env->isolate(), iter->stmt_->db_.get(), r, SQLITE_DONE, void()); sqlite3_reset(iter->stmt_->statement_); - LocalVector values(isolate, - {Boolean::New(isolate, true), Null(isolate)}); - DCHECK_EQ(values.size(), keys.size()); - Local result = Object::New( - isolate, Null(isolate), keys.data(), values.data(), keys.size()); - args.GetReturnValue().Set(result); + MaybeLocal values[] = {Boolean::New(isolate, true), Null(isolate)}; + Local result; + if (NewDictionaryInstanceNullProto(env->context(), iter_template, values) + .ToLocal(&result)) { + args.GetReturnValue().Set(result); + } return; } @@ -2564,11 +2558,12 @@ void StatementSyncIterator::Next(const FunctionCallbackInfo& args) { isolate, Null(isolate), row_keys.data(), row_values.data(), num_cols); } - LocalVector values(isolate, {Boolean::New(isolate, false), row_value}); - DCHECK_EQ(keys.size(), values.size()); - Local result = Object::New( - isolate, Null(isolate), keys.data(), values.data(), keys.size()); - args.GetReturnValue().Set(result); + MaybeLocal values[] = {Boolean::New(isolate, false), row_value}; + Local result; + if (NewDictionaryInstanceNullProto(env->context(), iter_template, values) + .ToLocal(&result)) { + args.GetReturnValue().Set(result); + } } void StatementSyncIterator::Return(const FunctionCallbackInfo& args) { @@ -2581,14 +2576,15 @@ void StatementSyncIterator::Return(const FunctionCallbackInfo& args) { sqlite3_reset(iter->stmt_->statement_); iter->done_ = true; - LocalVector keys(isolate, {env->done_string(), env->value_string()}); - LocalVector values(isolate, - {Boolean::New(isolate, true), Null(isolate)}); - DCHECK_EQ(keys.size(), values.size()); - Local result = Object::New( - isolate, Null(isolate), keys.data(), values.data(), keys.size()); - args.GetReturnValue().Set(result); + auto 
iter_template = getLazyIterTemplate(env); + MaybeLocal values[] = {Boolean::New(isolate, true), Null(isolate)}; + + Local result; + if (NewDictionaryInstanceNullProto(env->context(), iter_template, values) + .ToLocal(&result)) { + args.GetReturnValue().Set(result); + } } Session::Session(Environment* env, diff --git a/src/node_url_pattern.cc b/src/node_url_pattern.cc index e84e9b0de9ab20..f1bddaeab0260e 100644 --- a/src/node_url_pattern.cc +++ b/src/node_url_pattern.cc @@ -54,13 +54,13 @@ namespace node::url_pattern { using v8::Array; using v8::Context; +using v8::DictionaryTemplate; using v8::DontDelete; using v8::FunctionCallbackInfo; using v8::FunctionTemplate; using v8::Global; using v8::Isolate; using v8::Local; -using v8::LocalVector; using v8::MaybeLocal; using v8::Name; using v8::NewStringType; @@ -396,56 +396,49 @@ MaybeLocal URLPattern::URLPatternComponentResult::ToJSObject( MaybeLocal URLPattern::URLPatternResult::ToJSValue( Environment* env, const ada::url_pattern_result& result) { auto isolate = env->isolate(); - Local names[] = { - env->inputs_string(), - env->protocol_string(), - env->username_string(), - env->password_string(), - env->hostname_string(), - env->port_string(), - env->pathname_string(), - env->search_string(), - env->hash_string(), - }; - LocalVector inputs(isolate, result.inputs.size()); - size_t index = 0; - for (auto& input : result.inputs) { - if (std::holds_alternative(input)) { - auto input_str = std::get(input); - if (!ToV8Value(env->context(), input_str).ToLocal(&inputs[index])) { - return {}; - } - } else { - DCHECK(std::holds_alternative(input)); - auto init = std::get(input); - if (!URLPatternInit::ToJsObject(env, init).ToLocal(&inputs[index])) { - return {}; - } - } - index++; - } - LocalVector values(isolate, arraysize(names)); - values[0] = Array::New(isolate, inputs.data(), inputs.size()); - if (!URLPatternComponentResult::ToJSObject(env, result.protocol) - .ToLocal(&values[1]) || - !URLPatternComponentResult::ToJSObject(env, result.username) - .ToLocal(&values[2]) || - !URLPatternComponentResult::ToJSObject(env, result.password) - .ToLocal(&values[3]) || - !URLPatternComponentResult::ToJSObject(env, result.hostname) - .ToLocal(&values[4]) || - !URLPatternComponentResult::ToJSObject(env, result.port) - .ToLocal(&values[5]) || - !URLPatternComponentResult::ToJSObject(env, result.pathname) - .ToLocal(&values[6]) || - !URLPatternComponentResult::ToJSObject(env, result.search) - .ToLocal(&values[7]) || - !URLPatternComponentResult::ToJSObject(env, result.hash) - .ToLocal(&values[8])) { - return {}; + + auto tmpl = env->urlpatternresult_template(); + if (tmpl.IsEmpty()) { + static constexpr std::string_view namesVec[] = { + "inputs", + "protocol", + "username", + "password", + "hostname", + "port", + "pathname", + "search", + "hash", + }; + tmpl = DictionaryTemplate::New(isolate, namesVec); + env->set_urlpatternresult_template(tmpl); } - return Object::New( - isolate, Object::New(isolate), names, values.data(), values.size()); + + size_t index = 0; + MaybeLocal vals[] = { + Array::New(env->context(), + result.inputs.size(), + [&index, &inputs = result.inputs, env]() { + auto& input = inputs[index++]; + if (std::holds_alternative(input)) { + auto input_str = std::get(input); + return ToV8Value(env->context(), input_str); + } else { + DCHECK( + std::holds_alternative(input)); + auto init = std::get(input); + return URLPatternInit::ToJsObject(env, init); + } + }), + URLPatternComponentResult::ToJSObject(env, result.protocol), + 
URLPatternComponentResult::ToJSObject(env, result.username),
+      URLPatternComponentResult::ToJSObject(env, result.password),
+      URLPatternComponentResult::ToJSObject(env, result.hostname),
+      URLPatternComponentResult::ToJSObject(env, result.port),
+      URLPatternComponentResult::ToJSObject(env, result.pathname),
+      URLPatternComponentResult::ToJSObject(env, result.search),
+      URLPatternComponentResult::ToJSObject(env, result.hash)};
+  return NewDictionaryInstanceNullProto(env->context(), tmpl, vals);
 }
 
 std::optional
diff --git a/src/node_util.cc b/src/node_util.cc
index 1972d30b9b3899..36bd7c0028153a 100644
--- a/src/node_util.cc
+++ b/src/node_util.cc
@@ -15,6 +15,7 @@ using v8::BigInt;
 using v8::Boolean;
 using v8::CFunction;
 using v8::Context;
+using v8::DictionaryTemplate;
 using v8::External;
 using v8::FunctionCallbackInfo;
 using v8::IndexFilter;
@@ -23,6 +24,7 @@ using v8::Isolate;
 using v8::KeyCollectionMode;
 using v8::Local;
 using v8::LocalVector;
+using v8::MaybeLocal;
 using v8::Name;
 using v8::Object;
 using v8::ObjectTemplate;
@@ -263,6 +265,20 @@ static void GetCallSites(const FunctionCallbackInfo<Value>& args) {
   const int frame_count = stack->GetFrameCount();
   LocalVector<Object> callsite_objects(isolate);
 
+  auto callsite_template = env->callsite_template();
+  if (callsite_template.IsEmpty()) {
+    static constexpr std::string_view names[] = {
+        "functionName",
+        "scriptId",
+        "scriptName",
+        "lineNumber",
+        "columnNumber",
+        // TODO(legendecas): deprecate CallSite.column.
+        "column"};
+    callsite_template = DictionaryTemplate::New(isolate, names);
+    env->set_callsite_template(callsite_template);
+  }
+
   // Frame 0 is node:util. It should be skipped.
   for (int i = 1; i < frame_count; ++i) {
     Local<StackFrame> stack_frame = stack->GetFrame(isolate, i);
@@ -279,16 +295,7 @@ static void GetCallSites(const FunctionCallbackInfo<Value>& args) {
     std::string script_id = std::to_string(stack_frame->GetScriptId());
 
-    Local<Name> names[] = {
-        env->function_name_string(),
-        env->script_id_string(),
-        env->script_name_string(),
-        env->line_number_string(),
-        env->column_number_string(),
-        // TODO(legendecas): deprecate CallSite.column.
-        env->column_string(),
-    };
-    Local<Value> values[] = {
+    MaybeLocal<Value> values[] = {
         function_name,
         OneByteString(isolate, script_id),
         script_name,
@@ -297,10 +304,14 @@ static void GetCallSites(const FunctionCallbackInfo<Value>& args) {
         // TODO(legendecas): deprecate CallSite.column.
         Integer::NewFromUnsigned(isolate, stack_frame->GetColumn()),
     };
-    Local<Object> obj = Object::New(
-        isolate, v8::Null(isolate), names, values, arraysize(names));
-    callsite_objects.push_back(obj);
+    Local<Object> callsite;
+    if (!NewDictionaryInstanceNullProto(
+             env->context(), callsite_template, values)
+             .ToLocal(&callsite)) {
+      return;
+    }
+    callsite_objects.push_back(callsite);
   }
 
   Local<Array> callsites =
diff --git a/src/node_v8.cc b/src/node_v8.cc
index 9f3c721680fabf..4d2c86e2da7429 100644
--- a/src/node_v8.cc
+++ b/src/node_v8.cc
@@ -35,6 +35,7 @@ using v8::Array;
 using v8::BigInt;
 using v8::CFunction;
 using v8::Context;
+using v8::DictionaryTemplate;
 using v8::FunctionCallbackInfo;
 using v8::FunctionTemplate;
 using v8::HandleScope;
@@ -46,7 +47,6 @@ using v8::Isolate;
 using v8::Local;
 using v8::LocalVector;
 using v8::MaybeLocal;
-using v8::Name;
 using v8::Object;
 using v8::ScriptCompiler;
 using v8::String;
@@ -326,9 +326,61 @@ static void SetHeapStatistics(JSONWriter* writer, Isolate* isolate) {
 static MaybeLocal<Object> ConvertHeapStatsToJSObject(
     Isolate* isolate, const cppgc::HeapStatistics& stats) {
   Local<Context> context = isolate->GetCurrentContext();
+  Environment* env = Environment::GetCurrent(isolate);
   // Space Statistics
   LocalVector<Value> space_statistics_array(isolate);
   space_statistics_array.reserve(stats.space_stats.size());
+
+  auto object_stats_template = env->object_stats_template();
+  auto page_stats_tmpl = env->page_stats_template();
+  auto free_list_statistics_template = env->free_list_statistics_template();
+  auto space_stats_tmpl = env->space_stats_template();
+  auto heap_stats_tmpl = env->v8_heap_statistics_template();
+  if (object_stats_template.IsEmpty()) {
+    static constexpr std::string_view object_stats_names[] = {"allocated_bytes",
+                                                              "object_count"};
+    object_stats_template =
+        DictionaryTemplate::New(isolate, object_stats_names);
+    env->set_object_stats_template(object_stats_template);
+  }
+  if (page_stats_tmpl.IsEmpty()) {
+    static constexpr std::string_view page_stats_names[] = {
+        "committed_size_bytes",
+        "resident_size_bytes",
+        "used_size_bytes",
+        "object_statistics"};
+    page_stats_tmpl = DictionaryTemplate::New(isolate, page_stats_names);
+    env->set_page_stats_template(page_stats_tmpl);
+  }
+  if (free_list_statistics_template.IsEmpty()) {
+    std::string_view free_list_statistics_names[] = {
+        "bucket_size", "free_count", "free_size"};
+    free_list_statistics_template =
+        DictionaryTemplate::New(isolate, free_list_statistics_names);
+    env->set_free_list_statistics_template(free_list_statistics_template);
+  }
+  if (space_stats_tmpl.IsEmpty()) {
+    static constexpr std::string_view space_stats_names[] = {
+        "name",
+        "committed_size_bytes",
+        "resident_size_bytes",
+        "used_size_bytes",
+        "page_stats",
+        "free_list_stats"};
+    space_stats_tmpl = DictionaryTemplate::New(isolate, space_stats_names);
+    env->set_space_stats_template(space_stats_tmpl);
+  }
+  if (heap_stats_tmpl.IsEmpty()) {
+    static constexpr std::string_view heap_statistics_names[] = {
+        "committed_size_bytes",
+        "resident_size_bytes",
+        "used_size_bytes",
+        "space_statistics",
+        "type_names"};
+    heap_stats_tmpl = DictionaryTemplate::New(isolate, heap_statistics_names);
+    env->set_v8_heap_statistics_template(heap_stats_tmpl);
+  }
+
   for (size_t i = 0; i < stats.space_stats.size(); i++) {
     const cppgc::HeapStatistics::SpaceStatistics& space_stats =
         stats.space_stats[i];
@@ -344,30 +396,22 @@ static MaybeLocal<Object> ConvertHeapStatsToJSObject(
     for (size_t k = 0; k < page_stats.object_statistics.size(); k++) {
       const cppgc::HeapStatistics::ObjectStatsEntry& object_stats =
           page_stats.object_statistics[k];
-      Local<Name> object_stats_names[] = {
-          FIXED_ONE_BYTE_STRING(isolate, "allocated_bytes"),
-          FIXED_ONE_BYTE_STRING(isolate, "object_count")};
-      Local<Value> object_stats_values[] = {
+      MaybeLocal<Value> object_stats_values[] = {
          Uint32::NewFromUnsigned(
              isolate, static_cast<uint32_t>(object_stats.allocated_bytes)),
          Uint32::NewFromUnsigned(
              isolate, static_cast<uint32_t>(object_stats.object_count))};
-      Local<Object> object_stats_object =
-          Object::New(isolate,
-                      Null(isolate),
-                      object_stats_names,
-                      object_stats_values,
-                      arraysize(object_stats_names));
+      Local<Object> object_stats_object;
+      if (!NewDictionaryInstanceNullProto(
+               context, object_stats_template, object_stats_values)
+               .ToLocal(&object_stats_object)) {
+        return MaybeLocal<Object>();
+      }
       object_statistics_array.emplace_back(object_stats_object);
     }
 
     // Set page statistics
-    Local<Name> page_stats_names[] = {
-        FIXED_ONE_BYTE_STRING(isolate, "committed_size_bytes"),
-        FIXED_ONE_BYTE_STRING(isolate, "resident_size_bytes"),
-        FIXED_ONE_BYTE_STRING(isolate, "used_size_bytes"),
-        FIXED_ONE_BYTE_STRING(isolate, "object_statistics")};
-    Local<Value> page_stats_values[] = {
+    MaybeLocal<Value> page_stats_values[] = {
        Uint32::NewFromUnsigned(
            isolate, static_cast<uint32_t>(page_stats.committed_size_bytes)),
        Uint32::NewFromUnsigned(
@@ -377,21 +421,17 @@ static MaybeLocal<Object> ConvertHeapStatsToJSObject(
        Array::New(isolate,
                   object_statistics_array.data(),
                   object_statistics_array.size())};
-    Local<Object> page_stats_object =
-        Object::New(isolate,
-                    Null(isolate),
-                    page_stats_names,
-                    page_stats_values,
-                    arraysize(page_stats_names));
+    Local<Object> page_stats_object;
+    if (!NewDictionaryInstanceNullProto(
+             context, page_stats_tmpl, page_stats_values)
+             .ToLocal(&page_stats_object)) {
+      return MaybeLocal<Object>();
+    }
     page_statistics_array.emplace_back(page_stats_object);
   }
 
   // Free List Statistics
-  Local<Name> free_list_statistics_names[] = {
-      FIXED_ONE_BYTE_STRING(isolate, "bucket_size"),
-      FIXED_ONE_BYTE_STRING(isolate, "free_count"),
-      FIXED_ONE_BYTE_STRING(isolate, "free_size")};
-  Local<Value> free_list_statistics_values[] = {
+  MaybeLocal<Value> free_list_statistics_values[] = {
      ToV8ValuePrimitiveArray(
          context, space_stats.free_list_stats.bucket_size, isolate),
      ToV8ValuePrimitiveArray(
@@ -399,28 +439,21 @@ static MaybeLocal<Object> ConvertHeapStatsToJSObject(
          context, space_stats.free_list_stats.free_count, isolate),
      ToV8ValuePrimitiveArray(
          context, space_stats.free_list_stats.free_size, isolate)};
 
-  Local<Object> free_list_statistics_obj =
-      Object::New(isolate,
-                  Null(isolate),
-                  free_list_statistics_names,
-                  free_list_statistics_values,
-                  arraysize(free_list_statistics_names));
+  Local<Object> free_list_statistics_obj;
+  if (!NewDictionaryInstanceNullProto(context,
+                                      free_list_statistics_template,
+                                      free_list_statistics_values)
+           .ToLocal(&free_list_statistics_obj)) {
+    return MaybeLocal<Object>();
+  }
 
   // Set Space Statistics
-  Local<Name> space_stats_names[] = {
-      FIXED_ONE_BYTE_STRING(isolate, "name"),
-      FIXED_ONE_BYTE_STRING(isolate, "committed_size_bytes"),
-      FIXED_ONE_BYTE_STRING(isolate, "resident_size_bytes"),
-      FIXED_ONE_BYTE_STRING(isolate, "used_size_bytes"),
-      FIXED_ONE_BYTE_STRING(isolate, "page_stats"),
-      FIXED_ONE_BYTE_STRING(isolate, "free_list_stats")};
-
   Local<Value> name_value;
   if (!ToV8Value(context, stats.space_stats[i].name, isolate)
           .ToLocal(&name_value)) {
     return MaybeLocal<Object>();
   }
-  Local<Value> space_stats_values[] = {
+  MaybeLocal<Value> space_stats_values[] = {
      name_value,
      Uint32::NewFromUnsigned(
          isolate,
@@ -436,29 +469,21 @@ static MaybeLocal<Object> ConvertHeapStatsToJSObject(
          page_statistics_array.size()),
      free_list_statistics_obj,
   };
-  Local<Object> space_stats_object =
-      Object::New(isolate,
-                  Null(isolate),
-                  space_stats_names,
-                  space_stats_values,
-                  arraysize(space_stats_names));
+  Local<Object> space_stats_object;
+  if (!NewDictionaryInstanceNullProto(
+           context, space_stats_tmpl, space_stats_values)
+           .ToLocal(&space_stats_object)) {
+    return MaybeLocal<Object>();
+  }
   space_statistics_array.emplace_back(space_stats_object);
 }
 
-  // Set heap statistics
-  Local<Name> heap_statistics_names[] = {
-      FIXED_ONE_BYTE_STRING(isolate, "committed_size_bytes"),
-      FIXED_ONE_BYTE_STRING(isolate, "resident_size_bytes"),
-      FIXED_ONE_BYTE_STRING(isolate, "used_size_bytes"),
-      FIXED_ONE_BYTE_STRING(isolate, "space_statistics"),
-      FIXED_ONE_BYTE_STRING(isolate, "type_names")};
-
   Local<Value> type_names_value;
   if (!ToV8Value(context, stats.type_names, isolate)
           .ToLocal(&type_names_value)) {
     return MaybeLocal<Object>();
   }
-  Local<Value> heap_statistics_values[] = {
+  MaybeLocal<Value> heap_statistics_values[] = {
      Uint32::NewFromUnsigned(
          isolate, static_cast<uint32_t>(stats.committed_size_bytes)),
      Uint32::NewFromUnsigned(isolate,
@@ -470,14 +495,8 @@ static MaybeLocal<Object> ConvertHeapStatsToJSObject(
          space_statistics_array.size()),
      type_names_value};
 
-  Local<Object> heap_statistics_object =
-      Object::New(isolate,
-                  Null(isolate),
-                  heap_statistics_names,
-                  heap_statistics_values,
-                  arraysize(heap_statistics_names));
-
-  return heap_statistics_object;
+  return NewDictionaryInstanceNullProto(
+      context, heap_stats_tmpl, heap_statistics_values);
 }
 
 static void GetCppHeapStatistics(const FunctionCallbackInfo<Value>& args) {
diff --git a/src/node_worker.cc b/src/node_worker.cc
index 0b606092a466e2..9518ab9d812f21 100644
--- a/src/node_worker.cc
+++ b/src/node_worker.cc
@@ -27,16 +27,18 @@ using v8::Context;
 using v8::CpuProfile;
 using v8::CpuProfilingResult;
 using v8::CpuProfilingStatus;
+using v8::DictionaryTemplate;
 using v8::Float64Array;
 using v8::FunctionCallbackInfo;
 using v8::FunctionTemplate;
 using v8::HandleScope;
+using v8::HeapStatistics;
 using v8::Integer;
 using v8::Isolate;
 using v8::Local;
 using v8::Locker;
 using v8::Maybe;
-using v8::Name;
+using v8::MaybeLocal;
 using v8::NewStringType;
 using v8::Null;
 using v8::Number;
@@ -874,11 +876,17 @@ void Worker::CpuUsage(const FunctionCallbackInfo<Value>& args) {
     argv[0] = UVException(
         isolate, err, "uv_getrusage_thread", nullptr, nullptr, nullptr);
   } else {
-    Local<Name> names[] = {
-        FIXED_ONE_BYTE_STRING(isolate, "user"),
-        FIXED_ONE_BYTE_STRING(isolate, "system"),
-    };
-    Local<Value> values[] = {
+    auto tmpl = env->cpu_usage_template();
+    if (tmpl.IsEmpty()) {
+      static constexpr std::string_view names[] = {
+          "user",
+          "system",
+      };
+      tmpl = DictionaryTemplate::New(isolate, names);
+      env->set_cpu_usage_template(tmpl);
+    }
+
+    MaybeLocal<Value> values[] = {
        Number::New(isolate,
                    1e6 * cpu_usage_stats->ru_utime.tv_sec +
                        cpu_usage_stats->ru_utime.tv_usec),
@@ -886,8 +894,10 @@ void Worker::CpuUsage(const FunctionCallbackInfo<Value>& args) {
                    1e6 * cpu_usage_stats->ru_stime.tv_sec +
                        cpu_usage_stats->ru_stime.tv_usec),
     };
-    argv[1] = Object::New(
-        isolate, Null(isolate), names, values, arraysize(names));
+    if (!NewDictionaryInstanceNullProto(env->context(), tmpl, values)
+             .ToLocal(&argv[1])) {
+      return;
+    }
   }
 
   taker->MakeCallback(env->ondone_string(), arraysize(argv), argv);
@@ -1056,7 +1066,7 @@ void Worker::GetHeapStatistics(const FunctionCallbackInfo<Value>& args) {
       env](Environment* worker_env) mutable {
     // We create a unique pointer to HeapStatistics so that the actual object
     // it's not copied in the lambda, but only the pointer is.
-    auto heap_stats = std::make_unique<v8::HeapStatistics>();
+    auto heap_stats = std::make_unique<HeapStatistics>();
     worker_env->isolate()->GetHeapStatistics(heap_stats.get());
 
     // Here, the worker thread temporarily owns the WorkerHeapStatisticsTaker
@@ -1071,24 +1081,30 @@ void Worker::GetHeapStatistics(const FunctionCallbackInfo<Value>& args) {
           AsyncHooks::DefaultTriggerAsyncIdScope trigger_id_scope(
               taker->get());
 
-          Local<Name> heap_stats_names[] = {
-              FIXED_ONE_BYTE_STRING(isolate, "total_heap_size"),
-              FIXED_ONE_BYTE_STRING(isolate, "total_heap_size_executable"),
-              FIXED_ONE_BYTE_STRING(isolate, "total_physical_size"),
-              FIXED_ONE_BYTE_STRING(isolate, "total_available_size"),
-              FIXED_ONE_BYTE_STRING(isolate, "used_heap_size"),
-              FIXED_ONE_BYTE_STRING(isolate, "heap_size_limit"),
-              FIXED_ONE_BYTE_STRING(isolate, "malloced_memory"),
-              FIXED_ONE_BYTE_STRING(isolate, "peak_malloced_memory"),
-              FIXED_ONE_BYTE_STRING(isolate, "does_zap_garbage"),
-              FIXED_ONE_BYTE_STRING(isolate, "number_of_native_contexts"),
-              FIXED_ONE_BYTE_STRING(isolate, "number_of_detached_contexts"),
-              FIXED_ONE_BYTE_STRING(isolate, "total_global_handles_size"),
-              FIXED_ONE_BYTE_STRING(isolate, "used_global_handles_size"),
-              FIXED_ONE_BYTE_STRING(isolate, "external_memory")};
+          auto tmpl = env->heap_statistics_template();
+          if (tmpl.IsEmpty()) {
+            std::string_view heap_stats_names[] = {
+                "total_heap_size",
+                "total_heap_size_executable",
+                "total_physical_size",
+                "total_available_size",
+                "used_heap_size",
+                "heap_size_limit",
+                "malloced_memory",
+                "peak_malloced_memory",
+                "does_zap_garbage",
+                "number_of_native_contexts",
+                "number_of_detached_contexts",
+                "total_global_handles_size",
+                "used_global_handles_size",
+                "external_memory",
+            };
+            tmpl = DictionaryTemplate::New(isolate, heap_stats_names);
+            env->set_heap_statistics_template(tmpl);
+          }
 
           // Define an array of property values
-          Local<Value> heap_stats_values[] = {
+          MaybeLocal<Value> heap_stats_values[] = {
              Number::New(isolate, heap_stats->total_heap_size()),
              Number::New(isolate, heap_stats->total_heap_size_executable()),
              Number::New(isolate, heap_stats->total_physical_size()),
@@ -1104,16 +1120,13 @@ void Worker::GetHeapStatistics(const FunctionCallbackInfo<Value>& args) {
              Number::New(isolate, heap_stats->used_global_handles_size()),
              Number::New(isolate, heap_stats->external_memory())};
 
-          DCHECK_EQ(arraysize(heap_stats_names), arraysize(heap_stats_values));
-
-          // Create the object with the property names and values
-          Local<Object> stats = Object::New(isolate,
-                                            Null(isolate),
-                                            heap_stats_names,
-                                            heap_stats_values,
-                                            arraysize(heap_stats_names));
-
-          Local<Value> args[] = {stats};
+          Local<Object> obj;
+          if (!NewDictionaryInstanceNullProto(
+                   env->context(), tmpl, heap_stats_values)
+                   .ToLocal(&obj)) {
+            return;
+          }
+          Local<Value> args[] = {obj};
           taker->get()->MakeCallback(
               env->ondone_string(), arraysize(args), args);
           // implicitly delete `taker`
diff --git a/src/util-inl.h b/src/util-inl.h
index 778cc57537a966..fbce06d7cef9c2 100644
--- a/src/util-inl.h
+++ b/src/util-inl.h
@@ -704,6 +704,31 @@ inline std::wstring ConvertToWideString(const std::string& str,
 }
 #endif  // _WIN32
 
+inline v8::MaybeLocal<v8::Object> NewDictionaryInstance(
+    v8::Local<v8::Context> context,
+    v8::Local<v8::DictionaryTemplate> tmpl,
+    v8::MemorySpan<v8::MaybeLocal<v8::Value>> property_values) {
+  for (auto& value : property_values) {
+    if (value.IsEmpty()) return v8::MaybeLocal<v8::Object>();
+  }
+  return tmpl->NewInstance(context, property_values);
+}
+
+inline v8::MaybeLocal<v8::Object> NewDictionaryInstanceNullProto(
+    v8::Local<v8::Context> context,
+    v8::Local<v8::DictionaryTemplate> tmpl,
+    v8::MemorySpan<v8::MaybeLocal<v8::Value>> property_values) {
+  for (auto& value : property_values) {
+    if (value.IsEmpty()) return v8::MaybeLocal<v8::Object>();
+  }
+  v8::Local<v8::Object> obj = tmpl->NewInstance(context, property_values);
+  if (obj->SetPrototypeV2(context, v8::Null(context->GetIsolate()))
+          .IsNothing()) {
+    return v8::MaybeLocal<v8::Object>();
+  }
+  return obj;
+}
+
 }  // namespace node
 
 #endif  // defined(NODE_WANT_INTERNALS) && NODE_WANT_INTERNALS
diff --git a/src/util.h b/src/util.h
index 6c9482893a5129..9eb7034e378f0d 100644
--- a/src/util.h
+++ b/src/util.h
@@ -1030,6 +1030,20 @@ inline bool IsWindowsBatchFile(const char* filename);
 inline std::wstring ConvertToWideString(const std::string& str,
                                         UINT code_page);
 #endif  // _WIN32
 
+// A helper to create a new instance of the dictionary template.
+// Unlike v8::DictionaryTemplate::NewInstance, this method will
+// check that all properties have been set (are not empty MaybeLocals)
+// or will return early with an empty MaybeLocal under the assumption
+// that an error has been thrown.
+inline v8::MaybeLocal<v8::Object> NewDictionaryInstance(
+    v8::Local<v8::Context> context,
+    v8::Local<v8::DictionaryTemplate> tmpl,
+    v8::MemorySpan<v8::MaybeLocal<v8::Value>> property_values);
+inline v8::MaybeLocal<v8::Object> NewDictionaryInstanceNullProto(
+    v8::Local<v8::Context> context,
+    v8::Local<v8::DictionaryTemplate> tmpl,
+    v8::MemorySpan<v8::MaybeLocal<v8::Value>> property_values);
+
 }  // namespace node
 
 #endif  // defined(NODE_WANT_INTERNALS) && NODE_WANT_INTERNALS

From 74a09482de5ac1f0e3f124608b6317898493a6c6 Mon Sep 17 00:00:00 2001
From: Aras Abbasi
Date: Wed, 10 Sep 2025 15:12:28 +0200
Subject: [PATCH 102/103] inspector: undici as shared-library should pass tests

PR-URL: https://github.com/nodejs/node/pull/59837
Reviewed-By: Richard Lau
Reviewed-By: Matteo Collina
Reviewed-By: Chengzhong Wu
---
 test/parallel/test-inspector-network-websocket.js | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/test/parallel/test-inspector-network-websocket.js b/test/parallel/test-inspector-network-websocket.js
index 6f2613a4c27de2..8f0d4cb75dc674 100644
--- a/test/parallel/test-inspector-network-websocket.js
+++ b/test/parallel/test-inspector-network-websocket.js
@@ -11,6 +11,8 @@ const WebSocketServer = require('../common/websocket-server');
 const inspector = require('node:inspector/promises');
 const dc = require('diagnostics_channel');
 
+const nameRE = 'undici' in process.versions ? /^node:internal\/deps\/undici\/undici$/u : /undici/u;
+
 const session = new inspector.Session();
 session.connect();
 
@@ -18,9 +20,9 @@ dc.channel('undici:websocket:socket_error').subscribe((message) => {
   console.error('WebSocket error:', message);
 });
 
-function findFrameInInitiator(scriptName, initiator) {
+function findFrameInInitiator(regex, initiator) {
   const frame = initiator.stack.callFrames.find((it) => {
-    return it.url === scriptName;
+    return regex.test(it.url);
   });
   return frame;
 }
@@ -39,7 +41,7 @@ async function test() {
     assert.ok(message.params.requestId);
     assert.strictEqual(typeof message.params.initiator, 'object');
     assert.strictEqual(message.params.initiator.type, 'script');
-    assert.ok(findFrameInInitiator('node:internal/deps/undici/undici', message.params.initiator));
+    assert.ok(findFrameInInitiator(nameRE, message.params.initiator));
 
     requestId = message.params.requestId;
   }));

From f9ec1b63e17f24a38b7835bb703c7e2bebe0674d Mon Sep 17 00:00:00 2001
From: "Node.js GitHub Bot"
Date: Tue, 9 Sep 2025 07:52:21 +0100
Subject: [PATCH 103/103] 2025-09-10, Version 24.8.0 (Current)

Notable changes:

crypto:
  * (SEMVER-MINOR) support Ed448 and ML-DSA context parameter in node:crypto (Filip Skokan) https://github.com/nodejs/node/pull/59570
  * (SEMVER-MINOR) support Ed448 and ML-DSA context parameter in Web Cryptography (Filip Skokan) https://github.com/nodejs/node/pull/59570
  * (SEMVER-MINOR) add KMAC Web Cryptography algorithms (Filip Skokan) https://github.com/nodejs/node/pull/59647
  * (SEMVER-MINOR) add Argon2 Web Cryptography algorithms (Filip Skokan) https://github.com/nodejs/node/pull/59544
  * (SEMVER-MINOR) support SLH-DSA KeyObject, sign, and verify (Filip Skokan) https://github.com/nodejs/node/pull/59537

inspector:
  * add http2 tracking support (Darshan Sen) https://github.com/nodejs/node/pull/59611

worker:
  * (SEMVER-MINOR) add cpu profile APIs for worker (theanarkh) https://github.com/nodejs/node/pull/59428

PR-URL: https://github.com/nodejs/node/pull/59816
---
 CHANGELOG.md                              |   3 +-
 doc/api/cli.md                            |   2 +-
 doc/api/crypto.md                         |  14 +-
 doc/api/deprecations.md                   |   2 +-
 doc/api/diagnostics_channel.md            |   4 +-
 doc/api/errors.md                         |   6 +-
 doc/api/path.md                           |   2 +-
 doc/api/single-executable-applications.md |   2 +-
 doc/api/v8.md                             |   6 +-
 doc/api/vm.md                             |   6 +-
 doc/api/webcrypto.md                      |  64 ++++-----
 doc/api/worker_threads.md                 |   2 +-
 doc/changelogs/CHANGELOG_V24.md           | 151 ++++++++++++++++++++++
 src/node_version.h                        |   6 +-
 14 files changed, 211 insertions(+), 59 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 843b312c289d76..a4c9d1f4e4bd50 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -40,7 +40,8 @@ release.
-<b><a href="doc/changelogs/CHANGELOG_V24.md#24.7.0">24.7.0</a></b><br/>
+<b><a href="doc/changelogs/CHANGELOG_V24.md#24.8.0">24.8.0</a></b><br/>
+<a href="doc/changelogs/CHANGELOG_V24.md#24.7.0">24.7.0</a><br/>
 <a href="doc/changelogs/CHANGELOG_V24.md#24.6.0">24.6.0</a><br/>
 <a href="doc/changelogs/CHANGELOG_V24.md#24.5.0">24.5.0</a><br/>
 <a href="doc/changelogs/CHANGELOG_V24.md#24.4.1">24.4.1</a><br/>
    diff --git a/doc/api/cli.md b/doc/api/cli.md index 55485005e5eae6..5e61ba21283697 100644 --- a/doc/api/cli.md +++ b/doc/api/cli.md @@ -593,7 +593,7 @@ added: - v23.7.0 - v22.14.0 changes: - - version: REPLACEME + - version: v24.8.0 pr-url: https://github.com/nodejs/node/pull/59707 description: The option is no longer experimental. --> diff --git a/doc/api/crypto.md b/doc/api/crypto.md index 77927d7332518a..32e28e652cb82e 100644 --- a/doc/api/crypto.md +++ b/doc/api/crypto.md @@ -2058,7 +2058,7 @@ Other key details might be exposed via this API using additional attributes. The CPU profile with the given name is already started. @@ -841,7 +841,7 @@ The CPU profile with the given name is already started. ### `ERR_CPU_PROFILE_NOT_STARTED` The CPU profile with the given name is not started. @@ -851,7 +851,7 @@ The CPU profile with the given name is not started. ### `ERR_CPU_PROFILE_TOO_MANY` There are too many CPU profiles being collected. diff --git a/doc/api/path.md b/doc/api/path.md index ffc90ad25c8815..c072c85ee59d5f 100644 --- a/doc/api/path.md +++ b/doc/api/path.md @@ -290,7 +290,7 @@ added: - v22.5.0 - v20.17.0 changes: - - version: REPLACEME + - version: v24.8.0 pr-url: https://github.com/nodejs/node/pull/59572 description: Marking the API stable. --> diff --git a/doc/api/single-executable-applications.md b/doc/api/single-executable-applications.md index d1000bcbef269a..133a6752d2a6ab 100644 --- a/doc/api/single-executable-applications.md +++ b/doc/api/single-executable-applications.md @@ -435,7 +435,7 @@ writes to the returned array buffer is likely to result in a crash. ### `sea.getAssetKeys()` * Returns {string\[]} An array containing all the keys of the assets diff --git a/doc/api/v8.md b/doc/api/v8.md index ff903dbda992c3..3ab0e0f6cfed5c 100644 --- a/doc/api/v8.md +++ b/doc/api/v8.md @@ -1397,13 +1397,13 @@ setTimeout(() => { ## Class: `CPUProfileHandle` ### `cpuProfileHandle.stop()` * Returns: {Promise} @@ -1414,7 +1414,7 @@ profile data. ### `cpuProfileHandle[Symbol.asyncDispose]()` * Returns: {Promise} diff --git a/doc/api/vm.md b/doc/api/vm.md index 95f9f1ff997746..dd36293859fe1e 100644 --- a/doc/api/vm.md +++ b/doc/api/vm.md @@ -923,7 +923,7 @@ the ECMAScript specification. ### `sourceTextModule.instantiate()` * Returns: {undefined} @@ -941,7 +941,7 @@ modules in the cycle before calling this method. ### `sourceTextModule.linkRequests(modules)` * `modules` {vm.Module\[]} Array of `vm.Module` objects that this module depends on. @@ -1103,7 +1103,7 @@ added: - v13.0.0 - v12.16.0 changes: - - version: REPLACEME + - version: v24.8.0 pr-url: https://github.com/nodejs/node/pull/59000 description: No longer need to call `syntheticModule.link()` before calling this method. diff --git a/doc/api/webcrypto.md b/doc/api/webcrypto.md index 7265183226c2f5..18bffdaa2af76b 100644 --- a/doc/api/webcrypto.md +++ b/doc/api/webcrypto.md @@ -2,10 +2,10 @@ #### `argon2Params.associatedData` * Type: {ArrayBuffer|TypedArray|DataView|Buffer} @@ -1768,7 +1768,7 @@ Represents the optional associated data. #### `argon2Params.memory` * Type: {number} @@ -1778,7 +1778,7 @@ Represents the memory size in kibibytes. It must be at least 8 times the degree #### `argon2Params.name` * Type: {string} Must be one of `'Argon2d'`, `'Argon2i'`, or `'Argon2id'`. @@ -1786,7 +1786,7 @@ added: REPLACEME #### `argon2Params.nonce` * Type: {ArrayBuffer|TypedArray|DataView|Buffer} @@ -1796,7 +1796,7 @@ Represents the nonce, which is a salt for password hashing applications. 
#### `argon2Params.parallelism` * Type: {number} @@ -1806,7 +1806,7 @@ Represents the degree of parallelism. #### `argon2Params.passes` * Type: {number} @@ -1816,7 +1816,7 @@ Represents the number of passes. #### `argon2Params.secretValue` * Type: {ArrayBuffer|TypedArray|DataView|Buffer} @@ -1826,7 +1826,7 @@ Represents the optional secret value. #### `argon2Params.version` * Type: {number} @@ -1853,7 +1853,7 @@ added: v24.7.0 @@ -2309,13 +2309,13 @@ added: v15.0.0 ### Class: `KmacImportParams` #### `kmacImportParams.length` * Type: {number} @@ -2326,7 +2326,7 @@ be omitted for most cases. #### `kmacImportParams.name` * Type: {string} Must be `'KMAC128'` or `'KMAC256'`. @@ -2334,13 +2334,13 @@ added: REPLACEME ### Class: `KmacKeyAlgorithm` #### `kmacKeyAlgorithm.length` * Type: {number} @@ -2350,7 +2350,7 @@ The length of the KMAC key in bits. #### `kmacKeyAlgorithm.name` * Type: {string} @@ -2358,13 +2358,13 @@ added: REPLACEME ### Class: `KmacKeyGenParams` #### `kmacKeyGenParams.length` * Type: {number} @@ -2376,7 +2376,7 @@ This is optional and should be omitted for most cases. #### `kmacKeyGenParams.name` * Type: {string} Must be `'KMAC128'` or `'KMAC256'`. @@ -2384,13 +2384,13 @@ added: REPLACEME ### Class: `KmacParams` #### `kmacParams.algorithm` * Type: {string} Must be `'KMAC128'` or `'KMAC256'`. @@ -2398,7 +2398,7 @@ added: REPLACEME #### `kmacParams.customization` * Type: {ArrayBuffer|TypedArray|DataView|Buffer|undefined} @@ -2408,7 +2408,7 @@ The `customization` member represents the optional customization string. #### `kmacParams.length` * Type: {number} diff --git a/doc/api/worker_threads.md b/doc/api/worker_threads.md index 47bc06d3221f5a..48465835210210 100644 --- a/doc/api/worker_threads.md +++ b/doc/api/worker_threads.md @@ -1956,7 +1956,7 @@ If the worker has stopped, the return value is an empty object. ### `worker.startCpuProfile()` * Returns: {Promise} diff --git a/doc/changelogs/CHANGELOG_V24.md b/doc/changelogs/CHANGELOG_V24.md index 550d1492a30f93..212b6efe3de741 100644 --- a/doc/changelogs/CHANGELOG_V24.md +++ b/doc/changelogs/CHANGELOG_V24.md @@ -8,6 +8,7 @@ +24.8.0
    24.7.0
    24.6.0
    24.5.0
@@ -49,6 +50,156 @@
 * [io.js](CHANGELOG_IOJS.md)
 * [Archive](CHANGELOG_ARCHIVE.md)
 
+<a id="24.8.0"></a>
+
+## 2025-09-10, Version 24.8.0 (Current), @targos
+
+### Notable Changes
+
+#### HTTP/2 Network Inspection Support in Node.js
+
+Node.js now supports inspection of HTTP/2 network calls in Chrome DevTools for Node.js.
+
+##### Usage
+
+Write a `test.js` script that makes HTTP/2 requests.
+
+```js
+const http2 = require('node:http2');
+
+const client = http2.connect('https://nghttp2.org');
+
+const req = client.request([
+  ':path', '/',
+  ':method', 'GET',
+]);
+```
+
+Run it with these options:
+
+```bash
+node --inspect-wait --experimental-network-inspection test.js
+```
+
+Open `about:inspect` on Google Chrome and click on `Open dedicated DevTools for Node`.
+The `Network` tab will let you track your HTTP/2 calls.
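+
+As a minimal sketch of a complete exchange (the handler wiring below is just one
+straightforward way to drain the stream, not part of the inspection feature itself),
+the response can be consumed and the session closed so the full request/response
+cycle shows up in the `Network` tab:
+
+```js
+req.on('response', (headers) => {
+  // The response headers also appear in the request's Headers pane in DevTools.
+  console.log('status:', headers[':status']);
+});
+req.setEncoding('utf8');
+// Draining the data keeps the stream flowing so the response body is captured.
+req.on('data', (chunk) => process.stdout.write(chunk));
+req.on('end', () => client.close());
+req.end();
+```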
+
+Contributed by Darshan Sen in [#59611](https://github.com/nodejs/node/pull/59611).
+
+#### Other Notable Changes
+
+* \[[`7a8e2c251d`](https://github.com/nodejs/node/commit/7a8e2c251d)] - **(SEMVER-MINOR)** **crypto**: support Ed448 and ML-DSA context parameter in node:crypto (Filip Skokan) [#59570](https://github.com/nodejs/node/pull/59570)
+* \[[`4b631be0b0`](https://github.com/nodejs/node/commit/4b631be0b0)] - **(SEMVER-MINOR)** **crypto**: support Ed448 and ML-DSA context parameter in Web Cryptography (Filip Skokan) [#59570](https://github.com/nodejs/node/pull/59570)
+* \[[`3e4b1e732c`](https://github.com/nodejs/node/commit/3e4b1e732c)] - **(SEMVER-MINOR)** **crypto**: add KMAC Web Cryptography algorithms (Filip Skokan) [#59647](https://github.com/nodejs/node/pull/59647)
+* \[[`b1d28785b2`](https://github.com/nodejs/node/commit/b1d28785b2)] - **(SEMVER-MINOR)** **crypto**: add Argon2 Web Cryptography algorithms (Filip Skokan) [#59544](https://github.com/nodejs/node/pull/59544)
+* \[[`430691d1af`](https://github.com/nodejs/node/commit/430691d1af)] - **(SEMVER-MINOR)** **crypto**: support SLH-DSA KeyObject, sign, and verify (Filip Skokan) [#59537](https://github.com/nodejs/node/pull/59537)
+* \[[`d6d05ba397`](https://github.com/nodejs/node/commit/d6d05ba397)] - **(SEMVER-MINOR)** **worker**: add cpu profile APIs for worker (theanarkh) [#59428](https://github.com/nodejs/node/pull/59428)
+
+### Commits
+
+* \[[`d913872369`](https://github.com/nodejs/node/commit/d913872369)] - **assert**: cap input size in myersDiff to avoid Int32Array overflow (Haram Jeong) [#59578](https://github.com/nodejs/node/pull/59578)
+* \[[`7bbbcf6666`](https://github.com/nodejs/node/commit/7bbbcf6666)] - **benchmark**: sqlite prevent create both tables on prepare selects (Bruno Rodrigues) [#59709](https://github.com/nodejs/node/pull/59709)
+* \[[`44d7b92271`](https://github.com/nodejs/node/commit/44d7b92271)] - **benchmark**: calibrate config array-vs-concat (Rafael Gonzaga) [#59587](https://github.com/nodejs/node/pull/59587)
+* \[[`7f347fc551`](https://github.com/nodejs/node/commit/7f347fc551)] - **build**: fix getting OpenSSL version on Windows (Michaël Zasso) [#59609](https://github.com/nodejs/node/pull/59609)
+* \[[`4a317150d5`](https://github.com/nodejs/node/commit/4a317150d5)] - **build**: fix 'implicit-function-declaration' on OpenHarmony platform (hqzing) [#59547](https://github.com/nodejs/node/pull/59547)
+* \[[`bda32af587`](https://github.com/nodejs/node/commit/bda32af587)] - **build**: use `windows-2025` runner (Michaël Zasso) [#59673](https://github.com/nodejs/node/pull/59673)
+* \[[`a4a8ed8f6e`](https://github.com/nodejs/node/commit/a4a8ed8f6e)] - **build**: compile bundled uvwasi conditionally (Carlo Cabrera) [#59622](https://github.com/nodejs/node/pull/59622)
+* \[[`d944a87761`](https://github.com/nodejs/node/commit/d944a87761)] - **crypto**: refactor subtle methods to use synchronous import (Filip Skokan) [#59771](https://github.com/nodejs/node/pull/59771)
+* \[[`7a8e2c251d`](https://github.com/nodejs/node/commit/7a8e2c251d)] - **(SEMVER-MINOR)** **crypto**: support Ed448 and ML-DSA context parameter in node:crypto (Filip Skokan) [#59570](https://github.com/nodejs/node/pull/59570)
+* \[[`4b631be0b0`](https://github.com/nodejs/node/commit/4b631be0b0)] - **(SEMVER-MINOR)** **crypto**: support Ed448 and ML-DSA context parameter in Web Cryptography (Filip Skokan) [#59570](https://github.com/nodejs/node/pull/59570)
+* \[[`3e4b1e732c`](https://github.com/nodejs/node/commit/3e4b1e732c)] - **(SEMVER-MINOR)** **crypto**: add KMAC Web Cryptography algorithms (Filip Skokan) [#59647](https://github.com/nodejs/node/pull/59647)
+* \[[`b1d28785b2`](https://github.com/nodejs/node/commit/b1d28785b2)] - **(SEMVER-MINOR)** **crypto**: add Argon2 Web Cryptography algorithms (Filip Skokan) [#59544](https://github.com/nodejs/node/pull/59544)
+* \[[`430691d1af`](https://github.com/nodejs/node/commit/430691d1af)] - **(SEMVER-MINOR)** **crypto**: support SLH-DSA KeyObject, sign, and verify (Filip Skokan) [#59537](https://github.com/nodejs/node/pull/59537)
+* \[[`0d1e53d935`](https://github.com/nodejs/node/commit/0d1e53d935)] - **deps**: update uvwasi to 0.0.23 (Node.js GitHub Bot) [#59791](https://github.com/nodejs/node/pull/59791)
+* \[[`68732cf426`](https://github.com/nodejs/node/commit/68732cf426)] - **deps**: update histogram to 0.11.9 (Node.js GitHub Bot) [#59689](https://github.com/nodejs/node/pull/59689)
+* \[[`f12c1ad961`](https://github.com/nodejs/node/commit/f12c1ad961)] - **deps**: update googletest to eb2d85e (Node.js GitHub Bot) [#59335](https://github.com/nodejs/node/pull/59335)
+* \[[`45af6966ae`](https://github.com/nodejs/node/commit/45af6966ae)] - **deps**: upgrade npm to 11.6.0 (npm team) [#59750](https://github.com/nodejs/node/pull/59750)
+* \[[`57617244a4`](https://github.com/nodejs/node/commit/57617244a4)] - **deps**: V8: cherry-pick 6b1b9bca2a8 (Xiao-Tao) [#59283](https://github.com/nodejs/node/pull/59283)
+* \[[`2e6225a747`](https://github.com/nodejs/node/commit/2e6225a747)] - **deps**: update amaro to 1.1.2 (Node.js GitHub Bot) [#59616](https://github.com/nodejs/node/pull/59616)
+* \[[`1f7f6dfae6`](https://github.com/nodejs/node/commit/1f7f6dfae6)] - **diagnostics\_channel**: revoke DEP0163 (René) [#59758](https://github.com/nodejs/node/pull/59758)
+* \[[`8671a6cdb3`](https://github.com/nodejs/node/commit/8671a6cdb3)] - **doc**: stabilize --disable-sigusr1 (Rafael Gonzaga) [#59707](https://github.com/nodejs/node/pull/59707)
+* \[[`583b1b255d`](https://github.com/nodejs/node/commit/583b1b255d)] - **doc**: update OpenSSL default security level to 2 (Jeetu Suthar) [#59723](https://github.com/nodejs/node/pull/59723)
+* \[[`9b5eb6eb50`](https://github.com/nodejs/node/commit/9b5eb6eb50)] - **doc**: fix missing links in the `errors` page (Nam Yooseong) [#59427](https://github.com/nodejs/node/pull/59427)
+* \[[`e7bf712c57`](https://github.com/nodejs/node/commit/e7bf712c57)] - **doc**: update "Type stripping in dependencies" section (Josh Kelley) [#59652](https://github.com/nodejs/node/pull/59652)
+* \[[`96db47f91e`](https://github.com/nodejs/node/commit/96db47f91e)] - **doc**: add Miles Guicent as triager (Miles Guicent) [#59562](https://github.com/nodejs/node/pull/59562)
+* 
\[[`87f829bd0c`](https://github.com/nodejs/node/commit/87f829bd0c)] - **doc**: mark `path.matchesGlob` as stable (Aviv Keller) [#59572](https://github.com/nodejs/node/pull/59572) +* \[[`062b2f705e`](https://github.com/nodejs/node/commit/062b2f705e)] - **doc**: improve documentation for raw headers in HTTP/2 APIs (Tim Perry) [#59633](https://github.com/nodejs/node/pull/59633) +* \[[`6ab9306370`](https://github.com/nodejs/node/commit/6ab9306370)] - **doc**: update install\_tools.bat free disk space (Stefan Stojanovic) [#59579](https://github.com/nodejs/node/pull/59579) +* \[[`c8d6b60da6`](https://github.com/nodejs/node/commit/c8d6b60da6)] - **doc**: fix quic session instance typo (jakecastelli) [#59642](https://github.com/nodejs/node/pull/59642) +* \[[`61d0a2d1ba`](https://github.com/nodejs/node/commit/61d0a2d1ba)] - **doc**: fix filehandle.read typo (Ruy Adorno) [#59635](https://github.com/nodejs/node/pull/59635) +* \[[`3276bfa0d0`](https://github.com/nodejs/node/commit/3276bfa0d0)] - **doc**: update migration recomendations for `util.is**()` deprecations (Augustin Mauroy) [#59269](https://github.com/nodejs/node/pull/59269) +* \[[`11de6c7ebb`](https://github.com/nodejs/node/commit/11de6c7ebb)] - **doc**: fix missing link to the Error documentation in the `http` page (Alexander Makarenko) [#59080](https://github.com/nodejs/node/pull/59080) +* \[[`f5b6829bba`](https://github.com/nodejs/node/commit/f5b6829bba)] - **doc,crypto**: add description to the KEM and supports() methods (Filip Skokan) [#59644](https://github.com/nodejs/node/pull/59644) +* \[[`5bfdc7ee74`](https://github.com/nodejs/node/commit/5bfdc7ee74)] - **doc,crypto**: cleanup unlinked and self method references webcrypto.md (Filip Skokan) [#59608](https://github.com/nodejs/node/pull/59608) +* \[[`010458d061`](https://github.com/nodejs/node/commit/010458d061)] - **esm**: populate separate cache for require(esm) in imported CJS (Joyee Cheung) [#59679](https://github.com/nodejs/node/pull/59679) +* \[[`dbe6e63baf`](https://github.com/nodejs/node/commit/dbe6e63baf)] - **esm**: fix missed renaming in ModuleJob.runSync (Joyee Cheung) [#59724](https://github.com/nodejs/node/pull/59724) +* \[[`8eb0d9d834`](https://github.com/nodejs/node/commit/8eb0d9d834)] - **fs**: fix wrong order of file names in cpSync error message (Nicholas Paun) [#59775](https://github.com/nodejs/node/pull/59775) +* \[[`e69be5611f`](https://github.com/nodejs/node/commit/e69be5611f)] - **fs**: fix dereference: false on cpSync (Nicholas Paun) [#59681](https://github.com/nodejs/node/pull/59681) +* \[[`2865d2ac20`](https://github.com/nodejs/node/commit/2865d2ac20)] - **http**: unbreak keepAliveTimeoutBuffer (Robert Nagy) [#59784](https://github.com/nodejs/node/pull/59784) +* \[[`ade1175475`](https://github.com/nodejs/node/commit/ade1175475)] - **http**: use cached '1.1' http version string (Robert Nagy) [#59717](https://github.com/nodejs/node/pull/59717) +* \[[`74a09482de`](https://github.com/nodejs/node/commit/74a09482de)] - **inspector**: undici as shared-library should pass tests (Aras Abbasi) [#59837](https://github.com/nodejs/node/pull/59837) +* \[[`772f8f415a`](https://github.com/nodejs/node/commit/772f8f415a)] - **inspector**: add http2 tracking support (Darshan Sen) [#59611](https://github.com/nodejs/node/pull/59611) +* \[[`3d225572d7`](https://github.com/nodejs/node/commit/3d225572d7)] - _**Revert**_ "**lib**: optimize writable stream buffer clearing" (Yoo) [#59743](https://github.com/nodejs/node/pull/59743) +* 
\[[`4fd213ce73`](https://github.com/nodejs/node/commit/4fd213ce73)] - **lib**: fix isReadable and isWritable return type value (Gabriel Quaresma) [#59089](https://github.com/nodejs/node/pull/59089) +* \[[`39befddb87`](https://github.com/nodejs/node/commit/39befddb87)] - **lib**: prefer TypedArrayPrototype primordials (Filip Skokan) [#59766](https://github.com/nodejs/node/pull/59766) +* \[[`0748160d2e`](https://github.com/nodejs/node/commit/0748160d2e)] - **lib**: fix DOMException subclass support (Chengzhong Wu) [#59680](https://github.com/nodejs/node/pull/59680) +* \[[`1a93df808c`](https://github.com/nodejs/node/commit/1a93df808c)] - **lib**: revert to using default derived class constructors (René) [#59650](https://github.com/nodejs/node/pull/59650) +* \[[`bb0755df37`](https://github.com/nodejs/node/commit/bb0755df37)] - **meta**: bump `codecov/codecov-action` (dependabot\[bot]) [#59726](https://github.com/nodejs/node/pull/59726) +* \[[`45d148d9be`](https://github.com/nodejs/node/commit/45d148d9be)] - **meta**: bump actions/download-artifact from 4.3.0 to 5.0.0 (dependabot\[bot]) [#59729](https://github.com/nodejs/node/pull/59729) +* \[[`01b66b122e`](https://github.com/nodejs/node/commit/01b66b122e)] - **meta**: bump github/codeql-action from 3.29.2 to 3.30.0 (dependabot\[bot]) [#59728](https://github.com/nodejs/node/pull/59728) +* \[[`34f7ab5502`](https://github.com/nodejs/node/commit/34f7ab5502)] - **meta**: bump actions/cache from 4.2.3 to 4.2.4 (dependabot\[bot]) [#59727](https://github.com/nodejs/node/pull/59727) +* \[[`5806ea02af`](https://github.com/nodejs/node/commit/5806ea02af)] - **meta**: bump actions/checkout from 4.2.2 to 5.0.0 (dependabot\[bot]) [#59725](https://github.com/nodejs/node/pull/59725) +* \[[`f667215583`](https://github.com/nodejs/node/commit/f667215583)] - **path**: refactor path joining logic for clarity and performance (Lee Jiho) [#59781](https://github.com/nodejs/node/pull/59781) +* \[[`0340fe92a6`](https://github.com/nodejs/node/commit/0340fe92a6)] - **repl**: do not cause side effects in tab completion (Anna Henningsen) [#59774](https://github.com/nodejs/node/pull/59774) +* \[[`a414c1eb51`](https://github.com/nodejs/node/commit/a414c1eb51)] - **repl**: fix REPL completion under unary expressions (Kingsword) [#59744](https://github.com/nodejs/node/pull/59744) +* \[[`c206f8dd87`](https://github.com/nodejs/node/commit/c206f8dd87)] - **repl**: add isValidParentheses check before wrap input (Xuguang Mei) [#59607](https://github.com/nodejs/node/pull/59607) +* \[[`0bf9775ee2`](https://github.com/nodejs/node/commit/0bf9775ee2)] - **sea**: implement sea.getAssetKeys() (Joyee Cheung) [#59661](https://github.com/nodejs/node/pull/59661) +* \[[`bf26b478d8`](https://github.com/nodejs/node/commit/bf26b478d8)] - **sea**: allow using inspector command line flags with SEA (Joyee Cheung) [#59568](https://github.com/nodejs/node/pull/59568) +* \[[`92128a8fe2`](https://github.com/nodejs/node/commit/92128a8fe2)] - **src**: use DictionaryTemplate for node\_url\_pattern (James M Snell) [#59802](https://github.com/nodejs/node/pull/59802) +* \[[`bcb29fb84f`](https://github.com/nodejs/node/commit/bcb29fb84f)] - **src**: correctly report memory changes to V8 (Yaksh Bariya) [#59623](https://github.com/nodejs/node/pull/59623) +* \[[`44c24657d3`](https://github.com/nodejs/node/commit/44c24657d3)] - **src**: fixup node\_messaging error handling (James M Snell) [#59792](https://github.com/nodejs/node/pull/59792) +* \[[`2cd6a3b7ec`](https://github.com/nodejs/node/commit/2cd6a3b7ec)] - 
**src**: track async resources via pointers to stack-allocated handles (Anna Henningsen) [#59704](https://github.com/nodejs/node/pull/59704) +* \[[`34d752586f`](https://github.com/nodejs/node/commit/34d752586f)] - **src**: fix build on NetBSD (Thomas Klausner) [#59718](https://github.com/nodejs/node/pull/59718) +* \[[`15fa779ac5`](https://github.com/nodejs/node/commit/15fa779ac5)] - **src**: fix race on process exit and off thread CA loading (Chengzhong Wu) [#59632](https://github.com/nodejs/node/pull/59632) +* \[[`15cbd3966a`](https://github.com/nodejs/node/commit/15cbd3966a)] - **src**: separate module.hasAsyncGraph and module.hasTopLevelAwait (Joyee Cheung) [#59675](https://github.com/nodejs/node/pull/59675) +* \[[`88d1ca8990`](https://github.com/nodejs/node/commit/88d1ca8990)] - **src**: use non-deprecated Get/SetPrototype methods (Michaël Zasso) [#59671](https://github.com/nodejs/node/pull/59671) +* \[[`56ac9a2d46`](https://github.com/nodejs/node/commit/56ac9a2d46)] - **src**: migrate WriteOneByte to WriteOneByteV2 (Chengzhong Wu) [#59634](https://github.com/nodejs/node/pull/59634) +* \[[`3d88aa9f2f`](https://github.com/nodejs/node/commit/3d88aa9f2f)] - **src**: remove duplicate code (theanarkh) [#59649](https://github.com/nodejs/node/pull/59649) +* \[[`0718a70b2a`](https://github.com/nodejs/node/commit/0718a70b2a)] - **src**: add name for more threads (theanarkh) [#59601](https://github.com/nodejs/node/pull/59601) +* \[[`0379a8b254`](https://github.com/nodejs/node/commit/0379a8b254)] - **src**: remove JSONParser (Joyee Cheung) [#59619](https://github.com/nodejs/node/pull/59619) +* \[[`90d0a1b2e9`](https://github.com/nodejs/node/commit/90d0a1b2e9)] - **src,sqlite**: refactor value conversion (Edy Silva) [#59659](https://github.com/nodejs/node/pull/59659) +* \[[`5e025c7ca7`](https://github.com/nodejs/node/commit/5e025c7ca7)] - **stream**: replace manual function validation with validateFunction (방진혁) [#59529](https://github.com/nodejs/node/pull/59529) +* \[[`155a999bed`](https://github.com/nodejs/node/commit/155a999bed)] - **test**: skip tests failing when run under root (Livia Medeiros) [#59779](https://github.com/nodejs/node/pull/59779) +* \[[`6313706c69`](https://github.com/nodejs/node/commit/6313706c69)] - **test**: update WPT for urlpattern to cff1ac1123 (Node.js GitHub Bot) [#59602](https://github.com/nodejs/node/pull/59602) +* \[[`41245ad4c7`](https://github.com/nodejs/node/commit/41245ad4c7)] - **test**: skip more sea tests on Linux ppc64le (Richard Lau) [#59755](https://github.com/nodejs/node/pull/59755) +* \[[`df63d37ec4`](https://github.com/nodejs/node/commit/df63d37ec4)] - **test**: fix internet/test-dns (Michaël Zasso) [#59660](https://github.com/nodejs/node/pull/59660) +* \[[`1f6c335e82`](https://github.com/nodejs/node/commit/1f6c335e82)] - **test**: mark test-inspector-network-fetch as flaky again (Joyee Cheung) [#59640](https://github.com/nodejs/node/pull/59640) +* \[[`1798683df1`](https://github.com/nodejs/node/commit/1798683df1)] - **test**: skip test-fs-cp\* tests that are constantly failing on Windows (Joyee Cheung) [#59637](https://github.com/nodejs/node/pull/59637) +* \[[`4c48ec09e5`](https://github.com/nodejs/node/commit/4c48ec09e5)] - **test**: deflake test-http-keep-alive-empty-line (Luigi Pinca) [#59595](https://github.com/nodejs/node/pull/59595) +* \[[`dcdb259e85`](https://github.com/nodejs/node/commit/dcdb259e85)] - **test\_runner**: fix todo inheritance (Moshe Atlow) [#59721](https://github.com/nodejs/node/pull/59721) +* 
\[[`24177973a2`](https://github.com/nodejs/node/commit/24177973a2)] - **test\_runner**: set mock timer's interval undefined (hotpineapple) [#59479](https://github.com/nodejs/node/pull/59479) +* \[[`83d11f8a7a`](https://github.com/nodejs/node/commit/83d11f8a7a)] - **tools**: print appropriate output when test aborted (hotpineapple) [#59794](https://github.com/nodejs/node/pull/59794) +* \[[`1eca2cc548`](https://github.com/nodejs/node/commit/1eca2cc548)] - **tools**: use sparse checkout in `build-tarball.yml` (Antoine du Hamel) [#59788](https://github.com/nodejs/node/pull/59788) +* \[[`89fa1a929d`](https://github.com/nodejs/node/commit/89fa1a929d)] - **tools**: remove unused actions from `build-tarball.yml` (Antoine du Hamel) [#59787](https://github.com/nodejs/node/pull/59787) +* \[[`794ca3511d`](https://github.com/nodejs/node/commit/794ca3511d)] - **tools**: do not attempt to compress tgz archive (Antoine du Hamel) [#59785](https://github.com/nodejs/node/pull/59785) +* \[[`377bdb9b7e`](https://github.com/nodejs/node/commit/377bdb9b7e)] - **tools**: add v8windbg target (Chengzhong Wu) [#59767](https://github.com/nodejs/node/pull/59767) +* \[[`6696d1d6c9`](https://github.com/nodejs/node/commit/6696d1d6c9)] - **tools**: improve error handling in node\_mksnapshot (James M Snell) [#59437](https://github.com/nodejs/node/pull/59437) +* \[[`8dbd0f13e8`](https://github.com/nodejs/node/commit/8dbd0f13e8)] - **tools**: add sccache to `test-internet` workflow (Antoine du Hamel) [#59720](https://github.com/nodejs/node/pull/59720) +* \[[`6523c2d7d9`](https://github.com/nodejs/node/commit/6523c2d7d9)] - **tools**: update gyp-next to 0.20.4 (Node.js GitHub Bot) [#59690](https://github.com/nodejs/node/pull/59690) +* \[[`19d633f40c`](https://github.com/nodejs/node/commit/19d633f40c)] - **tools**: add script to make reviewing backport PRs easier (Antoine du Hamel) [#59161](https://github.com/nodejs/node/pull/59161) +* \[[`15e547b3a4`](https://github.com/nodejs/node/commit/15e547b3a4)] - **typings**: add typing for 'uv' (방진혁) [#59606](https://github.com/nodejs/node/pull/59606) +* \[[`ad5cfcc901`](https://github.com/nodejs/node/commit/ad5cfcc901)] - **typings**: add missing properties in ConfigBinding (Lee Jiho) [#59585](https://github.com/nodejs/node/pull/59585) +* \[[`70d2d6d479`](https://github.com/nodejs/node/commit/70d2d6d479)] - **url**: add err.input to ERR\_INVALID\_FILE\_URL\_PATH (Joyee Cheung) [#59730](https://github.com/nodejs/node/pull/59730) +* \[[`e476e43c17`](https://github.com/nodejs/node/commit/e476e43c17)] - **util**: fix numericSeparator with negative fractional numbers (sangwook) [#59379](https://github.com/nodejs/node/pull/59379) +* \[[`b2e8f40d15`](https://github.com/nodejs/node/commit/b2e8f40d15)] - **util**: remove unnecessary template strings (btea) [#59201](https://github.com/nodejs/node/pull/59201) +* \[[`6f79450ea2`](https://github.com/nodejs/node/commit/6f79450ea2)] - **util**: remove outdated TODO comment (haramjeong) [#59760](https://github.com/nodejs/node/pull/59760) +* \[[`32731432ef`](https://github.com/nodejs/node/commit/32731432ef)] - **util**: use getOptionValue('--no-deprecation') in deprecated() (haramjeong) [#59760](https://github.com/nodejs/node/pull/59760) +* \[[`65e4e68c90`](https://github.com/nodejs/node/commit/65e4e68c90)] - **util**: hide duplicated stack frames when using util.inspect (Ruben Bridgewater) [#59447](https://github.com/nodejs/node/pull/59447) +* \[[`2086f3365f`](https://github.com/nodejs/node/commit/2086f3365f)] - **vm**: sync-ify SourceTextModule 
linkage (Chengzhong Wu) [#59000](https://github.com/nodejs/node/pull/59000) +* \[[`c16163511d`](https://github.com/nodejs/node/commit/c16163511d)] - **wasi**: fix `clean` target in `test/wasi/Makefile` (Antoine du Hamel) [#59576](https://github.com/nodejs/node/pull/59576) +* \[[`2e54411cb6`](https://github.com/nodejs/node/commit/2e54411cb6)] - **worker**: optimize cpu profile implement (theanarkh) [#59683](https://github.com/nodejs/node/pull/59683) +* \[[`d6d05ba397`](https://github.com/nodejs/node/commit/d6d05ba397)] - **(SEMVER-MINOR)** **worker**: add cpu profile APIs for worker (theanarkh) [#59428](https://github.com/nodejs/node/pull/59428) + ## 2025-08-27, Version 24.7.0 (Current), @targos diff --git a/src/node_version.h b/src/node_version.h index df1790bead9c47..f5639d5f5656af 100644 --- a/src/node_version.h +++ b/src/node_version.h @@ -23,13 +23,13 @@ #define SRC_NODE_VERSION_H_ #define NODE_MAJOR_VERSION 24 -#define NODE_MINOR_VERSION 7 -#define NODE_PATCH_VERSION 1 +#define NODE_MINOR_VERSION 8 +#define NODE_PATCH_VERSION 0 #define NODE_VERSION_IS_LTS 0 #define NODE_VERSION_LTS_CODENAME "" -#define NODE_VERSION_IS_RELEASE 0 +#define NODE_VERSION_IS_RELEASE 1 #ifndef NODE_STRINGIFY #define NODE_STRINGIFY(n) NODE_STRINGIFY_HELPER(n)