Add messageRateLimit option to consumer configuration #6394

Open
elettrico opened this issue Jan 21, 2025 · 0 comments
Labels: proposal (Enhancement idea or proposal)

Comments


elettrico commented Jan 21, 2025

Proposed change

Currently, JetStream supports limiting the delivery rate of messages with the RateLimit consumer setting, which is expressed in bits per second (bps). While this is effective for managing bandwidth, it gives no direct control over the number of messages delivered per second. It would be useful to be able to configure a consumer to limit delivery in terms of messages per second, via a new consumer configuration option such as MessageRateLimit.
This feature should also apply to pull consumers, ensuring that they do not receive more than the specified number of messages per second. The server could enforce the limit by pacing message delivery to the defined threshold, regardless of how frequently the client issues pull requests. This would keep behavior consistent across push and pull consumers while simplifying client-side rate control logic.

Example (Java):

ConsumerConfiguration cc = ConsumerConfiguration.builder()
    .durable("my-consumer")
    .filterSubject("my-subject")
    .messageRateLimit(5)  // Deliver up to 5 messages per second
    .build();
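
For comparison, here is a minimal sketch of how the existing bandwidth-based limit is set today, assuming the current Java client exposes it through the builder's rateLimit option (the value is in bits per second, mapping to the server's RateLimit consumer setting):

ConsumerConfiguration cc = ConsumerConfiguration.builder()
    .durable("my-consumer")
    .filterSubject("my-subject")
    .rateLimit(1_000_000)  // Limit delivery bandwidth to ~1 Mbps; says nothing about message count
    .build();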

Use case

This would provide finer control for applications where message frequency matters more than data size. With such an option, developers could directly specify the number of messages they need per second, e.g. in rate-limited processing pipelines, time-sensitive analytics systems, or scenarios where external dependencies impose strict request rate limits. It would also help smooth out message bursts, reducing variability in message rates and protecting the client from being overloaded.
Similar behavior exists in some other messaging systems (e.g., RabbitMQ's consumer prefetch limits), but it typically requires manual throttling or handling at the client level. Supporting this natively in JetStream would simplify client implementations and improve usability.
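
To make that concrete, below is a rough sketch of the manual throttling a pull consumer currently has to implement on the client side to stay near a messages-per-second budget. It assumes the io.nats.client Java API, a server at nats://localhost:4222, and the same placeholder subject and durable names as the example above; a native messageRateLimit would make this pacing loop unnecessary.

import io.nats.client.Connection;
import io.nats.client.JetStream;
import io.nats.client.JetStreamSubscription;
import io.nats.client.Message;
import io.nats.client.Nats;
import io.nats.client.PullSubscribeOptions;

import java.time.Duration;
import java.util.List;

public class ManualRateLimitExample {
    public static void main(String[] args) throws Exception {
        int messagesPerSecond = 5; // same budget as the proposed messageRateLimit(5)

        try (Connection nc = Nats.connect("nats://localhost:4222")) {
            JetStream js = nc.jetStream();

            PullSubscribeOptions pullOptions = PullSubscribeOptions.builder()
                .durable("my-consumer")
                .build();
            JetStreamSubscription sub = js.subscribe("my-subject", pullOptions);

            while (true) {
                long windowStart = System.nanoTime();

                // Pull at most one second's worth of the budget per window.
                List<Message> batch = sub.fetch(messagesPerSecond, Duration.ofSeconds(1));
                for (Message msg : batch) {
                    // process(msg) ...
                    msg.ack();
                }

                // Sleep out the remainder of the one-second window so the
                // effective consumption rate stays at or below the budget.
                long elapsedMs = (System.nanoTime() - windowStart) / 1_000_000;
                if (elapsedMs < 1000) {
                    Thread.sleep(1000 - elapsedMs);
                }
            }
        }
    }
}

With a server-enforced messageRateLimit, the server would pace deliveries to the requested threshold regardless of how aggressively the client pulls, which is exactly the behavior the proposal describes for pull consumers.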

Contribution

I am not familiar with the NATS codebase or Go, so my contribution would likely be limited to refining the idea and discussing potential use cases.

elettrico added the proposal label on Jan 21, 2025