
@renovate renovate bot commented Mar 4, 2025

This PR contains the following updates:

Package       Change
transformers  ~=4.40.2 -> ~=4.53.0
transformers  ==4.45.2 -> ==4.53.0
transformers  ==4.41.2 -> ==4.53.0

GitHub Vulnerability Alerts

CVE-2024-11392

Hugging Face Transformers MobileViTV2 Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the handling of configuration files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-24322.

CVE-2024-11394

Hugging Face Transformers Trax Model Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the handling of model files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-25012.

CVE-2024-11393

Hugging Face Transformers MaskFormer Model Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the parsing of model files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-25191.

CVE-2024-12720

A Regular Expression Denial of Service (ReDoS) vulnerability was identified in the huggingface/transformers library, specifically in the file tokenization_nougat_fast.py. The vulnerability occurs in the post_process_single() function, where a regular expression processes specially crafted input. The issue stems from the regex exhibiting exponential time complexity under certain conditions, leading to excessive backtracking. This can result in significantly high CPU usage and potential application downtime, effectively creating a Denial of Service (DoS) scenario. The affected version is v4.46.3.

CVE-2025-1194

A Regular Expression Denial of Service (ReDoS) vulnerability was identified in the huggingface/transformers library, specifically in the file tokenization_gpt_neox_japanese.py of the GPT-NeoX-Japanese model. The vulnerability occurs in the SubWordJapaneseTokenizer class, where regular expressions process specially crafted inputs. The issue stems from a regex exhibiting exponential complexity under certain conditions, leading to excessive backtracking. This can result in high CPU usage and potential application downtime, effectively creating a Denial of Service (DoS) scenario. The affected version is v4.48.1 (latest).
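The failure mode behind these advisories can be reproduced with a classic nested-quantifier pattern. The regex below is illustrative only (it is not the pattern from transformers), but it backtracks exponentially in exactly the way described above, and the linear-time rewrite accepts the same strings:

```python
import re
import time

# Illustrative only: NOT the transformers regex, but a classic
# nested-quantifier pattern with the same exponential failure mode.
EVIL = re.compile(r"(a+)+$")

# A linear-time rewrite that accepts exactly the same strings.
SAFE = re.compile(r"a+$")

def time_match(n: int) -> float:
    """Time how long EVIL takes to *fail* on n 'a's followed by '!'."""
    payload = "a" * n + "!"      # the trailing '!' forces full backtracking
    start = time.perf_counter()
    EVIL.search(payload)         # tries every way of splitting the 'a' run
    return time.perf_counter() - start

# Functional equivalence on benign inputs:
for s in ["", "a", "aaaa", "aab", "ba"]:
    assert bool(EVIL.search(s)) == bool(SAFE.search(s))

# Runtime roughly doubles per extra character for the nested pattern:
assert time_match(20) > time_match(10)
```

The fixes shipped in the releases above take the same general shape: replacing ambiguous nested or overlapping quantifiers with patterns that match each input position at most once.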

CVE-2025-2099

A Regular Expression Denial of Service (ReDoS) exists in the preprocess_string() function of the transformers.testing_utils module. In versions before 4.50.0, the regex used to process code blocks in docstrings contains nested quantifiers that can trigger catastrophic backtracking when given inputs with many newline characters. An attacker who can supply such input to preprocess_string() (or code paths that call it) can force excessive CPU usage and degrade availability.

Fix: released in 4.50.0, which rewrites the regex to avoid the inefficient pattern.

  • Affected: < 4.50.0
  • Patched: 4.50.0

CVE-2025-3263

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically in the get_configuration_file() function within the transformers.configuration_utils module. The affected version is 4.49.0, and the issue is resolved in version 4.51.0. The vulnerability arises from the use of a regular expression pattern config\.(.*)\.json that can be exploited to cause excessive CPU consumption through crafted input strings, leading to catastrophic backtracking. This can result in model serving disruption, resource exhaustion, and increased latency in applications using the library.

CVE-2025-3264

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically in the get_imports() function within dynamic_module_utils.py. This vulnerability affects version 4.49.0 and is fixed in version 4.51.0. The issue arises from a regular expression pattern \s*try\s*:.*?except.*?: used to filter out try/except blocks from Python code, which can be exploited to cause excessive CPU consumption through crafted input strings due to catastrophic backtracking. This vulnerability can lead to remote code loading disruption, resource exhaustion in model serving, supply chain attack vectors, and development pipeline disruption.

CVE-2025-3777

Hugging Face Transformers versions up to 4.49.0 are affected by an improper input validation vulnerability in the image_utils.py file. The vulnerability arises from insecure URL validation using the startswith() method, which can be bypassed through URL username injection. This allows attackers to craft URLs that appear to be from YouTube but resolve to malicious domains, potentially leading to phishing attacks, malware distribution, or data exfiltration. The issue is fixed in version 4.52.1.
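The bypass exploits the URL userinfo field: everything before an "@" in the authority is a username, so a prefix check on the raw string sees the expected origin while the request actually goes elsewhere. The sketch below shows the bypass and a host-based check; the is_youtube() helper is an illustrative shape of the fix, not the library's actual code:

```python
from urllib.parse import urlparse

# A URL whose *userinfo* is crafted to look like a YouTube origin.
# Everything before the '@' is the username; the real host is evil.example.
url = "https://www.youtube.com@evil.example/watch?v=abc"

# The flawed check described above: prefix matching on the raw string.
assert url.startswith("https://www.youtube.com")  # malicious URL slips through

# Hypothetical host-based validation (illustrative, not the library's code):
def is_youtube(url: str) -> bool:
    host = urlparse(url).hostname or ""
    return host in ("www.youtube.com", "youtube.com")

assert not is_youtube(url)                                # rejected
assert is_youtube("https://www.youtube.com/watch?v=abc")  # legit URL passes
```

Parsing the URL first and comparing the extracted hostname avoids the whole class of prefix-matching bypasses (userinfo injection, lookalike subdomains such as www.youtube.com.evil.example, etc.).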

CVE-2025-3933

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically within the DonutProcessor class's token2json() method. This vulnerability affects versions 4.51.3 and earlier, and is fixed in version 4.52.1. The issue arises from the regex pattern <s_(.*?)> which can be exploited to cause excessive CPU consumption through crafted input strings due to catastrophic backtracking. This vulnerability can lead to service disruption, resource exhaustion, and potential API service vulnerabilities, impacting document processing tasks using the Donut model.

CVE-2025-5197

A Regular Expression Denial of Service (ReDoS) vulnerability exists in the Hugging Face Transformers library, specifically in the convert_tf_weight_name_to_pt_weight_name() function. This function, responsible for converting TensorFlow weight names to PyTorch format, uses a regex pattern /[^/]*___([^/]*)/ that can be exploited to cause excessive CPU consumption through crafted input strings due to catastrophic backtracking. The vulnerability affects versions up to 4.51.3 and is fixed in version 4.53.0. This issue can lead to service disruption, resource exhaustion, and potential API service vulnerabilities, impacting model conversion processes between TensorFlow and PyTorch formats.

CVE-2025-6638

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically affecting the MarianTokenizer's remove_language_code() method. This vulnerability is present in version 4.52.4 and has been fixed in version 4.53.0. The issue arises from inefficient regex processing, which can be exploited by crafted input strings containing malformed language code patterns, leading to excessive CPU consumption and potential denial of service.

CVE-2025-6051

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically within the normalize_numbers() method of the EnglishNormalizer class. This vulnerability affects versions up to 4.52.4 and is fixed in version 4.53.0. The issue arises from the method's handling of numeric strings, which can be exploited using crafted input strings containing long sequences of digits, leading to excessive CPU consumption. This vulnerability impacts text-to-speech and number normalization tasks, potentially causing service disruption, resource exhaustion, and API vulnerabilities.

CVE-2025-6921

The huggingface/transformers library, versions prior to 4.53.0, is vulnerable to Regular Expression Denial of Service (ReDoS) in the AdamWeightDecay optimizer. The vulnerability arises from the _do_use_weight_decay method, which processes user-controlled regular expressions in the include_in_weight_decay and exclude_from_weight_decay lists. Malicious regular expressions can cause catastrophic backtracking during the re.search call, leading to 100% CPU utilization and a denial of service. This issue can be exploited by attackers who can control the patterns in these lists, potentially causing the machine learning task to hang and rendering services unresponsive.
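The decision logic at issue can be sketched as follows. Names and signature here are illustrative, not the exact transformers internals; the point is that caller-supplied strings flow into re.search as patterns:

```python
import re

# Hypothetical sketch of the weight-decay decision described above:
# parameter names are tested with re.search against caller-supplied
# pattern lists (illustrative, not the exact transformers code).
def should_apply_weight_decay(param_name, include=None, exclude=None):
    if include and not any(re.search(p, param_name) for p in include):
        return False
    if exclude and any(re.search(p, param_name) for p in exclude):
        return False
    return True

assert should_apply_weight_decay("dense/kernel",
                                 exclude=[r"bias", r"LayerNorm"])
assert not should_apply_weight_decay("layer_0/LayerNorm/gamma",
                                     exclude=[r"bias", r"LayerNorm"])

# The hazard: a caller-controlled pattern such as r"(x+)+y", paired with a
# long parameter name, backtracks catastrophically inside re.search.  When
# plain substring semantics suffice, escaping the patterns removes the risk:
assert should_apply_weight_decay("dense/kernel",
                                 exclude=[re.escape("LayerNorm")])
```

Mitigations along these lines include escaping patterns with re.escape when only literal matching is needed, or validating and length-capping both patterns and parameter names before they reach the regex engine.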


Transformers Regular Expression Denial of Service (ReDoS) vulnerability

CVE-2024-12720 / GHSA-6rvg-6v2m-4j46

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability was identified in the huggingface/transformers library, specifically in the file tokenization_nougat_fast.py. The vulnerability occurs in the post_process_single() function, where a regular expression processes specially crafted input. The issue stems from the regex exhibiting exponential time complexity under certain conditions, leading to excessive backtracking. This can result in significantly high CPU usage and potential application downtime, effectively creating a Denial of Service (DoS) scenario. The affected version is v4.46.3.

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


CVE-2024-11393 / GHSA-wrfc-pvp9-mr9g / PYSEC-2024-228

More information

Details

Hugging Face Transformers MaskFormer Model Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the parsing of model files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-25191.

Severity

  • CVSS Score: 8.8 / 10 (High)
  • Vector String: CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H

References

This data is provided by OSV and the PyPI Advisory Database (CC-BY 4.0).


Deserialization of Untrusted Data in Hugging Face Transformers

CVE-2024-11394 / GHSA-hxxf-235m-72v3 / PYSEC-2024-229

More information

Details

Hugging Face Transformers Trax Model Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the handling of model files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-25012.

Severity

  • CVSS Score: 8.8 / 10 (High)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


CVE-2024-11394 / GHSA-hxxf-235m-72v3 / PYSEC-2024-229

More information

Details

Hugging Face Transformers Trax Model Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the handling of model files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-25012.

Severity

  • CVSS Score: 8.8 / 10 (High)
  • Vector String: CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H

References

This data is provided by OSV and the PyPI Advisory Database (CC-BY 4.0).


Deserialization of Untrusted Data in Hugging Face Transformers

CVE-2024-11392 / GHSA-qxrp-vhvm-j765 / PYSEC-2024-227

More information

Details

Hugging Face Transformers MobileViTV2 Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the handling of configuration files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-24322.

Severity

  • CVSS Score: 7.5 / 10 (High)
  • Vector String: CVSS:3.0/AV:N/AC:H/PR:N/UI:R/S:U/C:H/I:H/A:H

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


CVE-2024-11392 / GHSA-qxrp-vhvm-j765 / PYSEC-2024-227

More information

Details

Hugging Face Transformers MobileViTV2 Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the handling of configuration files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-24322.

Severity

  • CVSS Score: 8.8 / 10 (High)
  • Vector String: CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H

References

This data is provided by OSV and the PyPI Advisory Database (CC-BY 4.0).


Deserialization of Untrusted Data in Hugging Face Transformers

CVE-2024-11393 / GHSA-wrfc-pvp9-mr9g / PYSEC-2024-228

More information

Details

Hugging Face Transformers MaskFormer Model Deserialization of Untrusted Data Remote Code Execution Vulnerability. This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability in that the target must visit a malicious page or open a malicious file.

The specific flaw exists within the parsing of model files. The issue results from the lack of proper validation of user-supplied data, which can result in deserialization of untrusted data. An attacker can leverage this vulnerability to execute code in the context of the current user. Was ZDI-CAN-25191.

Severity

  • CVSS Score: 8.8 / 10 (High)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


CVE-2025-2099 / GHSA-qq3j-4f4f-9583 / PYSEC-2025-40

More information

Details

A vulnerability in the preprocess_string() function of the transformers.testing_utils module in huggingface/transformers version v4.48.3 allows for a Regular Expression Denial of Service (ReDoS) attack. The regular expression used to process code blocks in docstrings contains nested quantifiers, leading to exponential backtracking when processing input with a large number of newline characters. An attacker can exploit this by providing a specially crafted payload, causing high CPU usage and potential application downtime, effectively resulting in a Denial of Service (DoS) scenario.

Severity

  • CVSS Score: 7.5 / 10 (High)
  • Vector String: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H

References

This data is provided by OSV and the PyPI Advisory Database (CC-BY 4.0).


Transformers Regular Expression Denial of Service (ReDoS) vulnerability

CVE-2025-1194 / GHSA-fpwr-67px-3qhx

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability was identified in the huggingface/transformers library, specifically in the file tokenization_gpt_neox_japanese.py of the GPT-NeoX-Japanese model. The vulnerability occurs in the SubWordJapaneseTokenizer class, where regular expressions process specially crafted inputs. The issue stems from a regex exhibiting exponential complexity under certain conditions, leading to excessive backtracking. This can result in high CPU usage and potential application downtime, effectively creating a Denial of Service (DoS) scenario. The affected version is v4.48.1 (latest).

Severity

  • CVSS Score: 4.3 / 10 (Medium)
  • Vector String: CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Hugging Face Transformers Regular Expression Denial of Service

CVE-2025-2099 / GHSA-qq3j-4f4f-9583 / PYSEC-2025-40

More information

Details

A Regular Expression Denial of Service (ReDoS) exists in the preprocess_string() function of the transformers.testing_utils module. In versions before 4.50.0, the regex used to process code blocks in docstrings contains nested quantifiers that can trigger catastrophic backtracking when given inputs with many newline characters. An attacker who can supply such input to preprocess_string() (or code paths that call it) can force excessive CPU usage and degrade availability.

Fix: released in 4.50.0, which rewrites the regex to avoid the inefficient pattern.

  • Affected: < 4.50.0
  • Patched: 4.50.0

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Transformers vulnerable to ReDoS attack through its get_imports() function

CVE-2025-3264 / GHSA-jjph-296x-mrcr

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically in the get_imports() function within dynamic_module_utils.py. This vulnerability affects version 4.49.0 and is fixed in version 4.51.0. The issue arises from a regular expression pattern \s*try\s*:.*?except.*?: used to filter out try/except blocks from Python code, which can be exploited to cause excessive CPU consumption through crafted input strings due to catastrophic backtracking. This vulnerability can lead to remote code loading disruption, resource exhaustion in model serving, supply chain attack vectors, and development pipeline disruption.

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Transformers's ReDoS vulnerability in get_configuration_file can lead to catastrophic backtracking

CVE-2025-3263 / GHSA-q2wp-rjmx-x6x9

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically in the get_configuration_file() function within the transformers.configuration_utils module. The affected version is 4.49.0, and the issue is resolved in version 4.51.0. The vulnerability arises from the use of a regular expression pattern config\.(.*)\.json that can be exploited to cause excessive CPU consumption through crafted input strings, leading to catastrophic backtracking. This can result in model serving disruption, resource exhaustion, and increased latency in applications using the library.

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Transformers's Improper Input Validation vulnerability can be exploited through username injection

CVE-2025-3777 / GHSA-phhr-52qp-3mj4

More information

Details

Hugging Face Transformers versions up to 4.49.0 are affected by an improper input validation vulnerability in the image_utils.py file. The vulnerability arises from insecure URL validation using the startswith() method, which can be bypassed through URL username injection. This allows attackers to craft URLs that appear to be from YouTube but resolve to malicious domains, potentially leading to phishing attacks, malware distribution, or data exfiltration. The issue is fixed in version 4.52.1.

Severity

  • CVSS Score: 3.5 / 10 (Low)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:L/UI:R/S:U/C:L/I:N/A:N

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Transformers is vulnerable to ReDoS attack through its DonutProcessor class

CVE-2025-3933 / GHSA-37mw-44qp-f5jm

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically within the DonutProcessor class's token2json() method. This vulnerability affects versions 4.51.3 and earlier, and is fixed in version 4.52.1. The issue arises from the regex pattern <s_(.*?)> which can be exploited to cause excessive CPU consumption through crafted input strings due to catastrophic backtracking. This vulnerability can lead to service disruption, resource exhaustion, and potential API service vulnerabilities, impacting document processing tasks using the Donut model.
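On well-formed input, the lazy-dot pattern from the advisory behaves the same as a character-class pattern that cannot scan past a ">"; the difference shows on pathological inputs (e.g. a long "<s_aaaa…" run with no closing ">"), where the lazy dot re-scans to the end of the string at every search position. A small comparison, using a toy Donut-style token string:

```python
import re

# The lazy-dot pattern named in the advisory vs. an equivalent
# character-class pattern that stops at the first '>'.
LAZY = re.compile(r"<s_(.*?)>")
SAFE = re.compile(r"<s_([^>]*)>")

# Toy Donut-style output (illustrative, not real model output):
doc = "<s_menu><s_name>Latte</s_name><s_price>4.50</s_price></s_menu>"

# Both extract the same tag names from well-formed input:
assert LAZY.findall(doc) == SAFE.findall(doc) == ["menu", "name", "price"]
```

The character-class form fails fast on unclosed tags because `[^>]*` can never consume a ">", so each search position is abandoned after a single linear scan at most.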

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Hugging Face Transformers vulnerable to Regular Expression Denial of Service (ReDoS) in the AdamWeightDecay optimizer

CVE-2025-6921 / GHSA-4w7r-h757-3r74

More information

Details

The huggingface/transformers library, versions prior to 4.53.0, is vulnerable to Regular Expression Denial of Service (ReDoS) in the AdamWeightDecay optimizer. The vulnerability arises from the _do_use_weight_decay method, which processes user-controlled regular expressions in the include_in_weight_decay and exclude_from_weight_decay lists. Malicious regular expressions can cause catastrophic backtracking during the re.search call, leading to 100% CPU utilization and a denial of service. This issue can be exploited by attackers who can control the patterns in these lists, potentially causing the machine learning task to hang and rendering services unresponsive.

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Hugging Face Transformers Regular Expression Denial of Service (ReDoS) vulnerability

CVE-2025-5197 / GHSA-9356-575x-2w9m

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability exists in the Hugging Face Transformers library, specifically in the convert_tf_weight_name_to_pt_weight_name() function. This function, responsible for converting TensorFlow weight names to PyTorch format, uses a regex pattern /[^/]*___([^/]*)/ that can be exploited to cause excessive CPU consumption through crafted input strings due to catastrophic backtracking. The vulnerability affects versions up to 4.51.3 and is fixed in version 4.53.0. This issue can lead to service disruption, resource exhaustion, and potential API service vulnerabilities, impacting model conversion processes between TensorFlow and PyTorch formats.

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Hugging Face Transformers is vulnerable to ReDoS through its MarianTokenizer

CVE-2025-6638 / GHSA-59p9-h35m-wg4g

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically affecting the MarianTokenizer's remove_language_code() method. This vulnerability is present in version 4.52.4 and has been fixed in version 4.53.0. The issue arises from inefficient regex processing, which can be exploited by crafted input strings containing malformed language code patterns, leading to excessive CPU consumption and potential denial of service.

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Hugging Face Transformers library has Regular Expression Denial of Service

CVE-2025-6051 / GHSA-rcv9-qm8p-9p6j

More information

Details

A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the Hugging Face Transformers library, specifically within the normalize_numbers() method of the EnglishNormalizer class. This vulnerability affects versions up to 4.52.4 and is fixed in version 4.53.0. The issue arises from the method's handling of numeric strings, which can be exploited using crafted input strings containing long sequences of digits, leading to excessive CPU consumption. This vulnerability impacts text-to-speech and number normalization tasks, potentially causing service disruption, resource exhaustion, and API vulnerabilities.

Severity

  • CVSS Score: 5.3 / 10 (Medium)
  • Vector String: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Release Notes

huggingface/transformers (transformers)

v4.53.0

Compare Source

Release v4.53.0

Gemma3n

Gemma 3n models are designed for efficient execution on low-resource devices. They are capable of multimodal input, handling text, image, video, and audio input, and generating text outputs, with open weights for pre-trained and instruction-tuned variants. These models were trained with data in over 140 spoken languages.

Gemma 3n models use selective parameter activation technology to reduce resource requirements. This technique allows the models to operate at an effective size of 2B and 4B parameters, which is lower than the total number of parameters they contain. For more information on Gemma 3n's efficient parameter management technology, see the Gemma 3n page.


from transformers import pipeline
import torch

pipe = pipeline(
    "image-text-to-text",
    torch_dtype=torch.bfloat16,
    model="google/gemma-3n-e4b",
    device="cuda",
)
output = pipe(
    "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/bee.jpg",
    text="<image_soft_token> in this image, there is"
)

print(output)

Dia


Dia is an open-source text-to-speech (TTS) model (1.6B parameters) developed by Nari Labs.
It can generate highly realistic dialogue from a transcript, including nonverbal communication such as laughter and coughing.
Furthermore, emotion and tone control is possible via audio conditioning (voice cloning).

Model Architecture:
Dia is an encoder-decoder transformer based on the original transformer architecture, with some more modern features such as
rotary positional embeddings (RoPE) also included. For its text portion (encoder), a byte tokenizer is utilized, while
for the audio portion (decoder), a pretrained codec model, DAC, is used: DAC encodes speech into discrete codebook
tokens and decodes them back into audio.

Kyutai Speech-to-Text

Kyutai STT is a speech-to-text model architecture based on the Mimi codec, which encodes audio into discrete tokens in a streaming fashion, and a Moshi-like autoregressive decoder. Kyutai has released two model checkpoints:

  • kyutai/stt-1b-en_fr: a 1B-parameter model capable of transcribing both English and French
  • kyutai/stt-2.6b-en: a 2.6B-parameter model focused solely on English, optimized for maximum transcription accuracy

Read more about the model in the documentation.

V-JEPA 2

V-JEPA 2 is a self-supervised approach to training video encoders developed by FAIR, Meta. Using internet-scale video data, V-JEPA 2 attains state-of-the-art performance on motion understanding and human action anticipation tasks. V-JEPA 2-AC is a latent action-conditioned world model post-trained from V-JEPA 2 (using a small amount of robot trajectory interaction data) that solves robot manipulation tasks without environment-specific data collection or task-specific training or calibration.

Read more about the model in the documentation.

Arcee


Arcee is a decoder-only transformer model based on the Llama architecture with a key modification: it uses ReLU² (ReLU-squared) activation in the MLP blocks instead of SiLU, following recent research showing improved training efficiency with squared activations. This architecture is designed for efficient training and inference while maintaining the proven stability of the Llama design.

The Arcee model is architecturally similar to Llama but uses x * relu(x) in MLP layers for improved gradient flow and is optimized for efficiency in both training and inference scenarios.
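The activation swap described above is easy to state numerically: x * relu(x) is zero for non-positive inputs (like ReLU) and grows quadratically for positive ones, and it coincides with relu(x)² everywhere. A minimal scalar sketch, with SiLU included for comparison:

```python
import math

def relu(x: float) -> float:
    return max(x, 0.0)

def relu_squared(x: float) -> float:
    # The Arcee MLP activation as described above: x * relu(x).
    return x * relu(x)

def silu(x: float) -> float:
    # Llama's usual activation, for comparison: x * sigmoid(x).
    return x / (1.0 + math.exp(-x))

# Identical to relu(x)**2 everywhere: zero for x <= 0, quadratic for x > 0.
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    assert relu_squared(x) == relu(x) ** 2
assert relu_squared(3.0) == 9.0
assert relu_squared(-3.0) == 0.0
```

Unlike SiLU, which is smooth but never exactly zero for negative inputs, ReLU² produces exact zeros, which is part of the efficiency argument for squared activations.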

Read more about the model in the documentation.

ColQwen2

ColQwen2 is a variant of the ColPali model designed to retrieve documents by analyzing their visual features. Unlike traditional systems that rely heavily on text extraction and OCR, ColQwen2 treats each page as an image. It uses the Qwen2-VL backbone to capture not only text, but also the layout, tables, charts, and other visual elements to create detailed multi-vector embeddings that can be used for retrieval by computing pairwise late interaction similarity scores. This offers a more comprehensive understanding of documents and enables more efficient and accurate retrieval.
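The late-interaction scoring mentioned above ("MaxSim", as in ColBERT/ColPali) can be sketched in a few lines: each query-token embedding is matched against its best document-patch embedding, and the per-token maxima are summed. The vectors below are toy values, not real ColQwen2 embeddings:

```python
# Minimal late-interaction ("MaxSim") scoring sketch.  Toy 2-d vectors,
# assumed already normalized; real embeddings are high-dimensional.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def late_interaction_score(query_vecs, doc_vecs):
    # For each query vector, take its best match among the doc vectors,
    # then sum those maxima across the query.
    return sum(max(dot(q, d) for d in doc_vecs) for q in query_vecs)

query = [[1.0, 0.0], [0.0, 1.0]]   # two query-token embeddings
doc_a = [[0.9, 0.1], [0.1, 0.9]]   # document with well-matching patches
doc_b = [[0.5, 0.5], [0.5, 0.5]]   # less specific document

# doc_a scores 0.9 + 0.9; doc_b scores 0.5 + 0.5 — doc_a ranks higher.
assert late_interaction_score(query, doc_a) > late_interaction_score(query, doc_b)
```

Because each query token is scored independently, the document's multi-vector embeddings can be precomputed and indexed, with only the cheap max/sum reduction done per query.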


Read more about the model in the documentation.

MiniMax


MiniMax is a powerful language model with 456 billion total parameters, of which 45.9 billion are activated per token. To better unlock the long-context capabilities of the model, MiniMax adopts a hybrid architecture that combines Lightning Attention, Softmax Attention, and Mixture-of-Experts (MoE). Leveraging advanced parallel strategies and innovative compute-communication overlap methods such as Linear Attention Sequence Parallelism Plus (LASP+), varlen ring attention, and Expert Tensor Parallel (ETP), MiniMax extends its training context length to 1 million tokens and can handle a context of up to 4 million tokens during inference. On various academic benchmarks, MiniMax also demonstrates the performance of a top-tier model.

The architecture of MiniMax is briefly described as follows:

  • Total Parameters: 456B
  • Activated Parameters per Token: 45.9B
  • Number of Layers: 80
  • Hybrid Attention: a softmax attention layer is placed after every 7 lightning attention layers.
    • Number of attention heads: 64
    • Attention head dimension: 128
  • Mixture of Experts:
    • Number of experts: 32
    • Expert hidden dimension: 9216
    • Top-2 routing strategy
  • Positional Encoding: Rotary Position Embedding (RoPE) applied to half of the attention head dimension with a base frequency of 10,000,000
  • Hidden Size: 6144
  • Vocab Size: 200,064
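The hybrid attention schedule and top-2 expert routing described above can be sketched as follows (illustrative helper names, not MiniMax's actual code):

```python
def layer_types(num_layers: int, period: int = 8):
    """Hybrid schedule: every `period`-th layer uses softmax attention;
    the 7 layers before it use lightning (linear) attention."""
    return ["softmax" if (i + 1) % period == 0 else "lightning"
            for i in range(num_layers)]

def top2_experts(router_logits):
    """Top-2 MoE routing: indices of the two highest router logits,
    highest first."""
    return sorted(range(len(router_logits)),
                  key=lambda i: router_logits[i], reverse=True)[:2]
```

With 80 layers and a period of 8, the schedule yields 10 softmax attention layers interleaved among 70 lightning attention layers.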

For more details refer to the release blog post.

Read more about the model in the documentation.

Encoder-Decoder Gemma


T5Gemma (aka encoder-decoder Gemma) was proposed in a research paper by Google. It is a family of encoder-decoder large language models, developed by adapting pretrained decoder-only models into encoder-decoder models. T5Gemma includes pretrained and instruction-tuned variants. The architecture is based on the transformer encoder-decoder design following T5, with improvements from Gemma 2: GQA, RoPE, GeGLU activation, RMSNorm, and interleaved local/global attention.

T5Gemma has two groups of model sizes: 1) Gemma 2 sizes (2B-2B, 9B-2B, and 9B-9B), which are based on the official Gemma 2 models (2B and 9B); and 2) T5 sizes (Small, Base, Large, and XL), which are pretrained under the Gemma 2 framework following the T5 configuration. In addition, we also provide a model at ML size (medium-large, ~2B total parameters), which sits between T5 Large and T5 XL.

The pretrained variants are trained with two objectives, separately: prefix language modeling with knowledge distillation (PrefixLM) and UL2. We release both variants for each model size. The instruction-tuned variants were post-trained with supervised fine-tuning and reinforcement learning.
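The PrefixLM objective can be illustrated with a minimal data-formatting sketch (a hypothetical helper, not the actual training code): a token sequence is split so the prefix is fed to the encoder and the continuation becomes the decoder target.

```python
def prefix_lm_split(tokens, prefix_len):
    """Split a token sequence for prefix language modeling:
    tokens[:prefix_len] is the encoder input (fully visible),
    tokens[prefix_len:] is the decoder target (predicted left to right)."""
    return tokens[:prefix_len], tokens[prefix_len:]
```

This formatting is what lets a decoder-only pretraining corpus be reused to adapt the model into an encoder-decoder one.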

Read more about the model in the documentation.

GLM-4.1V

Configuration

📅 Schedule: Branch creation - "" in timezone America/Toronto, Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about these updates again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.


renovate bot commented Mar 4, 2025

⚠️ Artifact update problem

Renovate failed to update artifacts related to this branch. You probably do not want to merge this PR as-is.

♻ Renovate will retry this branch, including artifacts, only when one of the following happens:

  • any of the package files in this branch needs updating, or
  • the branch becomes conflicted, or
  • you click the rebase/retry checkbox if found above, or
  • you rename this PR's title to start with "rebase!" to trigger it manually

The artifact failure details are included below:

File name: model-servers/vllm/0.6.4/Pipfile.lock
Command failed: pipenv lock
Creating a virtualenv for this project
Pipfile: 
/tmp/renovate/repos/github/redhat-ai-dev/developer-images/model-servers/vllm/0.6
.4/Pipfile
Using /usr/local/bin/python3.11.13 to create virtualenv...
created virtual environment CPython3.11.13.final.0-64 in 713ms
  creator CPython3Posix(dest=/runner/cache/others/virtualenvs/0.6.4-d_r4MF2m, 
clear=False, no_vcs_ignore=False, global=False)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, via=copy, 
app_data_dir=/tmp/containerbase/cache/.local/share/virtualenv)
    added seed packages: pip==25.2, setuptools==80.9.0
  activators 
BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator
,PythonActivator

✔ Successfully created virtual environment!
Virtualenv location: /runner/cache/others/virtualenvs/0.6.4-d_r4MF2m
Locking  dependencies...
CRITICAL:pipenv.patched.pip._internal.resolution.resolvelib.factory:Cannot 
install -r /tmp/pipenv-a8w8wc3f-requirements/pipenv-mqskb0n7-constraints.txt 
(line 13), -r /tmp/pipenv-a8w8wc3f-requirements/pipenv-mqskb0n7-constraints.txt 
(line 17), -r /tmp/pipenv-a8w8wc3f-requirements/pipenv-mqskb0n7-constraints.txt 
(line 23) and torch==2.3.0+cu121 because these package versions have conflicting
dependencies.
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 451, in main
[ResolutionFailure]:       _main(
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 436, in _main
[ResolutionFailure]:       resolve_packages(
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 400, in resolve_packages
[ResolutionFailure]:       results, resolver = resolve_deps(
[ResolutionFailure]:       ^^^^^^^^^^^^^
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 1083, in resolve_deps
[ResolutionFailure]:       results, hashes, internal_resolver = 
actually_resolve_deps(
[ResolutionFailure]:       ^^^^^^^^^^^^^^^^^^^^^^
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 811, in actually_resolve_deps
[ResolutionFailure]:       resolver.resolve()
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 471, in resolve
[ResolutionFailure]:       raise ResolutionFailure(message=e)
Your dependencies could not be resolved. You likely have a mismatch in your 
sub-dependencies.
You can use $ pipenv run pip install <requirement_name> to bypass this 
mechanism, then run $ pipenv graph to inspect the versions actually installed in
the virtualenv.
Hint: try $ pipenv lock --pre if it is a pre-release dependency.
ERROR: ResolutionImpossible: for help visit 
https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-depende
ncy-conflicts

Your dependencies could not be resolved. You likely have a mismatch in your 
sub-dependencies.
You can use $ pipenv run pip install <requirement_name> to bypass this 
mechanism, then run $ pipenv graph to inspect the versions actually installed in
the virtualenv.
Hint: try $ pipenv lock --pre if it is a pre-release dependency.
ERROR: Failed to lock Pipfile.lock!

File name: model-servers/vllm/0.6.6/Pipfile.lock
Command failed: pipenv lock
Creating a virtualenv for this project
Pipfile: 
/tmp/renovate/repos/github/redhat-ai-dev/developer-images/model-servers/vllm/0.6
.6/Pipfile
Using /usr/local/bin/python3.11.13 to create virtualenv...
created virtual environment CPython3.11.13.final.0-64 in 242ms
  creator CPython3Posix(dest=/runner/cache/others/virtualenvs/0.6.6-39QtryhM, 
clear=False, no_vcs_ignore=False, global=False)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, via=copy, 
app_data_dir=/tmp/containerbase/cache/.local/share/virtualenv)
    added seed packages: pip==25.2, setuptools==80.9.0
  activators 
BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator
,PythonActivator

✔ Successfully created virtual environment!
Virtualenv location: /runner/cache/others/virtualenvs/0.6.6-39QtryhM
Locking  dependencies...
CRITICAL:pipenv.patched.pip._internal.resolution.resolvelib.factory:Cannot 
install -r /tmp/pipenv-4_g2uo17-requirements/pipenv-qpl5_o5j-constraints.txt 
(line 8) and torch==2.3.0+cu121 because these package versions have conflicting 
dependencies.
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 451, in main
[ResolutionFailure]:       _main(
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 436, in _main
[ResolutionFailure]:       resolve_packages(
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 400, in resolve_packages
[ResolutionFailure]:       results, resolver = resolve_deps(
[ResolutionFailure]:       ^^^^^^^^^^^^^
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 1083, in resolve_deps
[ResolutionFailure]:       results, hashes, internal_resolver = 
actually_resolve_deps(
[ResolutionFailure]:       ^^^^^^^^^^^^^^^^^^^^^^
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 811, in actually_resolve_deps
[ResolutionFailure]:       resolver.resolve()
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 471, in resolve
[ResolutionFailure]:       raise ResolutionFailure(message=e)
Your dependencies could not be resolved. You likely have a mismatch in your 
sub-dependencies.
You can use $ pipenv run pip install <requirement_name> to bypass this 
mechanism, then run $ pipenv graph to inspect the versions actually installed in
the virtualenv.
Hint: try $ pipenv lock --pre if it is a pre-release dependency.
ERROR: ResolutionImpossible: for help visit 
https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-depende
ncy-conflicts

Your dependencies could not be resolved. You likely have a mismatch in your 
sub-dependencies.
You can use $ pipenv run pip install <requirement_name> to bypass this 
mechanism, then run $ pipenv graph to inspect the versions actually installed in
the virtualenv.
Hint: try $ pipenv lock --pre if it is a pre-release dependency.
ERROR: Failed to lock Pipfile.lock!

File name: model-servers/vllm/0.8.4/Pipfile.lock
Command failed: pipenv lock
Creating a virtualenv for this project
Pipfile: 
/tmp/renovate/repos/github/redhat-ai-dev/developer-images/model-servers/vllm/0.8
.4/Pipfile
Using /usr/local/bin/python3.11.13 to create virtualenv...
created virtual environment CPython3.11.13.final.0-64 in 237ms
  creator CPython3Posix(dest=/runner/cache/others/virtualenvs/0.8.4-lR4GvG4y, 
clear=False, no_vcs_ignore=False, global=False)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, via=copy, 
app_data_dir=/tmp/containerbase/cache/.local/share/virtualenv)
    added seed packages: pip==25.2, setuptools==80.9.0
  activators 
BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator
,PythonActivator

✔ Successfully created virtual environment!
Virtualenv location: /runner/cache/others/virtualenvs/0.8.4-lR4GvG4y
Locking  dependencies...
CRITICAL:pipenv.patched.pip._internal.resolution.resolvelib.factory:Cannot 
install -r /tmp/pipenv-s0yp4bd7-requirements/pipenv-9qvmj54w-constraints.txt 
(line 8) and fastapi~=0.111.0 because these package versions have conflicting 
dependencies.
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 451, in main
[ResolutionFailure]:       _main(
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 436, in _main
[ResolutionFailure]:       resolve_packages(
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/resolver.py", line 400, in resolve_packages
[ResolutionFailure]:       results, resolver = resolve_deps(
[ResolutionFailure]:       ^^^^^^^^^^^^^
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 1083, in resolve_deps
[ResolutionFailure]:       results, hashes, internal_resolver = 
actually_resolve_deps(
[ResolutionFailure]:       ^^^^^^^^^^^^^^^^^^^^^^
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 811, in actually_resolve_deps
[ResolutionFailure]:       resolver.resolve()
[ResolutionFailure]:   File 
"/opt/containerbase/tools/pipenv/2025.0.4/3.11.13/lib/python3.11/site-packages/p
ipenv/utils/resolver.py", line 471, in resolve
[ResolutionFailure]:       raise ResolutionFailure(message=e)
Your dependencies could not be resolved. You likely have a mismatch in your 
sub-dependencies.
You can use $ pipenv run pip install <requirement_name> to bypass this 
mechanism, then run $ pipenv graph to inspect the versions actually installed in
the virtualenv.
Hint: try $ pipenv lock --pre if it is a pre-release dependency.
ERROR: ResolutionImpossible: for help visit 
https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-depende
ncy-conflicts

Your dependencies could not be resolved. You likely have a mismatch in your 
sub-dependencies.
You can use $ pipenv run pip install <requirement_name> to bypass this 
mechanism, then run $ pipenv graph to inspect the versions actually installed in
the virtualenv.
Hint: try $ pipenv lock --pre if it is a pre-release dependency.
ERROR: Failed to lock Pipfile.lock!


@thepetk thepetk left a comment


lgtm

@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 54c16f5 to f88c921 Compare March 7, 2025 03:45
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Mar 7, 2025
@thepetk thepetk changed the title Update dependency transformers to v4.48.0 [SECURITY] rebase! Update dependency transformers to v4.48.0 [SECURITY] Mar 7, 2025
@renovate renovate bot changed the title rebase! Update dependency transformers to v4.48.0 [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Mar 7, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from f88c921 to 48cb960 Compare March 13, 2025 07:21
@renovate renovate bot changed the title Update dependency transformers to v4.48.0 [SECURITY] Update dependency transformers [SECURITY] Mar 13, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 48cb960 to aef2157 Compare March 15, 2025 03:20
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Mar 15, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from aef2157 to 4c8e625 Compare March 18, 2025 20:15
@renovate renovate bot changed the title Update dependency transformers to v4.48.0 [SECURITY] Update dependency transformers [SECURITY] Mar 18, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 4c8e625 to cc82e9a Compare March 21, 2025 23:40
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Mar 21, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from cc82e9a to 020c12d Compare March 25, 2025 16:48
@renovate renovate bot changed the title Update dependency transformers to v4.48.0 [SECURITY] Update dependency transformers [SECURITY] Mar 25, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 020c12d to 1ad7b15 Compare March 29, 2025 04:06
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Mar 29, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 1ad7b15 to 465bc72 Compare April 2, 2025 00:04
@renovate renovate bot changed the title Update dependency transformers to v4.48.0 [SECURITY] Update dependency transformers [SECURITY] Apr 2, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 465bc72 to 6e5b47f Compare April 3, 2025 23:56
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Apr 3, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 6e5b47f to b1be93e Compare April 8, 2025 20:13
@renovate renovate bot changed the title Update dependency transformers to v4.48.0 [SECURITY] Update dependency transformers [SECURITY] Apr 8, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from b1be93e to 3ee72ed Compare April 12, 2025 07:49
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Apr 12, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 3ee72ed to d0b788a Compare April 13, 2025 20:46
@renovate renovate bot changed the title Update dependency transformers to v4.48.0 [SECURITY] Update dependency transformers [SECURITY] Apr 13, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from d0b788a to ebdfd8f Compare April 18, 2025 04:13
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.48.0 [SECURITY] Apr 18, 2025
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.50.0 [SECURITY] Jun 8, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 1d21438 to af70393 Compare June 22, 2025 08:09
@renovate renovate bot changed the title Update dependency transformers to v4.50.0 [SECURITY] Update dependency transformers [SECURITY] Jun 22, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from af70393 to 59cb5e8 Compare June 30, 2025 14:57
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 59cb5e8 to 4fc3c92 Compare July 11, 2025 16:23
@renovate renovate bot requested a review from a team as a code owner July 11, 2025 16:23
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch 3 times, most recently from e6e8f20 to 51d73f3 Compare August 8, 2025 03:17
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch 2 times, most recently from 5d9d06a to c079476 Compare August 15, 2025 14:42
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.53.0 [SECURITY] Aug 15, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from c079476 to 6e66bb7 Compare August 19, 2025 16:42
@renovate renovate bot changed the title Update dependency transformers to v4.53.0 [SECURITY] Update dependency transformers [SECURITY] Aug 19, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 6e66bb7 to c02d11b Compare August 19, 2025 23:48
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.53.0 [SECURITY] Aug 19, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from c02d11b to 4d1fca8 Compare September 15, 2025 13:14
@renovate renovate bot added the renovatebot label Sep 15, 2025
@renovate renovate bot changed the title Update dependency transformers to v4.53.0 [SECURITY] Update dependency transformers [SECURITY] Sep 15, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 4d1fca8 to 737b982 Compare September 15, 2025 14:22
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.53.0 [SECURITY] Sep 15, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 737b982 to 536512c Compare September 25, 2025 21:26
@renovate renovate bot changed the title Update dependency transformers to v4.53.0 [SECURITY] Update dependency transformers [SECURITY] Sep 25, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 536512c to 4e2d0da Compare September 26, 2025 00:38
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.53.0 [SECURITY] Sep 26, 2025
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from 4e2d0da to ec62d2a Compare October 9, 2025 10:11
@renovate renovate bot changed the title Update dependency transformers to v4.53.0 [SECURITY] Update dependency transformers [SECURITY] Oct 9, 2025
Signed-off-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
@renovate renovate bot force-pushed the renovate/pypi-transformers-vulnerability branch from ec62d2a to 8f7bf47 Compare October 9, 2025 13:41
@renovate renovate bot changed the title Update dependency transformers [SECURITY] Update dependency transformers to v4.53.0 [SECURITY] Oct 10, 2025