diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index a18f057f9..147124234 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -29,8 +29,9 @@ jobs: include: - os: macos-11.0 - os: windows-2019 - features: cmake-build,libz-static,curl-static - rdkafka-sys-features: cmake-build,libz-static,curl-static + # The Windows build should automatically use CMake + features: libz-static,curl-static + rdkafka-sys-features: libz-static,curl-static - os: ubuntu-20.04 features: tracing - os: ubuntu-20.04 diff --git a/README.md b/README.md index 6070ef6a7..aa4755480 100644 --- a/README.md +++ b/README.md @@ -1,289 +1,290 @@ -# rust-rdkafka - -[![crates.io](https://img.shields.io/crates/v/rdkafka.svg)](https://crates.io/crates/rdkafka) -[![docs.rs](https://docs.rs/rdkafka/badge.svg)](https://docs.rs/rdkafka/) -[![Build Status](https://travis-ci.org/fede1024/rust-rdkafka.svg?branch=master)](https://travis-ci.org/fede1024/rust-rdkafka) -[![coverate](https://codecov.io/gh/fede1024/rust-rdkafka/graphs/badge.svg?branch=master)](https://codecov.io/gh/fede1024/rust-rdkafka/) -[![Join the chat at https://gitter.im/rust-rdkafka/Lobby](https://badges.gitter.im/rust-rdkafka/Lobby.svg)](https://gitter.im/rust-rdkafka/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) - -A fully asynchronous, [futures]-enabled [Apache Kafka] client -library for Rust based on [librdkafka]. - -## The library - -`rust-rdkafka` provides a safe Rust interface to librdkafka. This version -is compatible with librdkafka v1.9.2+. - -### Documentation - -- [Current master branch](https://fede1024.github.io/rust-rdkafka/) -- [Latest release](https://docs.rs/rdkafka/) -- [Changelog](https://github.com/fede1024/rust-rdkafka/blob/master/changelog.md) - -### Features - -The main features provided at the moment are: - -- Support for all Kafka versions since 0.8.x. 
For more information about - broker compatibility options, check the [librdkafka - documentation][broker-compat]. -- Consume from single or multiple topics. -- Automatic consumer rebalancing. -- Customizable rebalance, with pre and post rebalance callbacks. -- Synchronous or asynchronous message production. -- Customizable offset commit. -- Create and delete topics and add and edit partitions. -- Alter broker and topic configurations. -- Access to cluster metadata (list of topic-partitions, replicas, active - brokers etc). -- Access to group metadata (list groups, list members of groups, hostnames, - etc.). -- Access to producer and consumer metrics, errors and callbacks. -- Exactly-once semantics (EOS) via idempotent and transactional producers - and read-committed consumers. - -### One million messages per second - -`rust-rdkafka` is designed to be easy and safe to use thanks to the -abstraction layer written in Rust, while at the same time being extremely -fast thanks to the librdkafka C library. - -Here are some benchmark results using the [`BaseProducer`], -sending data to a single Kafka 0.11 process running in localhost (default -configuration, 3 partitions). Hardware: Dell laptop, with Intel Core -i7-4712HQ @ 2.30GHz. - -- Scenario: produce 5 million messages, 10 bytes each, wait for all of them to be acked - - 1045413 messages/s, 9.970 MB/s (average over 5 runs) - -- Scenario: produce 100000 messages, 10 KB each, wait for all of them to be acked - - 24623 messages/s, 234.826 MB/s (average over 5 runs) - -For more numbers, check out the [kafka-benchmark] project. - -### Client types - -`rust-rdkafka` provides low level and high level consumers and producers. - -Low level: - -* [`BaseConsumer`]: a simple wrapper around the librdkafka consumer. It - must be periodically `poll()`ed in order to execute callbacks, rebalances - and to receive messages. -* [`BaseProducer`]: a simple wrapper around the librdkafka producer. 
As in - the consumer case, the user must call `poll()` periodically to execute - delivery callbacks. -* [`ThreadedProducer`]: a `BaseProducer` with a separate thread dedicated to - polling the producer. - -High level: - - * [`StreamConsumer`]: a [`Stream`] of messages that takes care of - polling the consumer automatically. - * [`FutureProducer`]: a [`Future`] that will be completed once - the message is delivered to Kafka (or failed). - -For more information about consumers and producers, refer to their -module-level documentation. - -*Warning*: the library is under active development and the APIs are likely -to change. - -### Asynchronous data processing with Tokio - -[Tokio] is a platform for fast processing of asynchronous events in Rust. -The interfaces exposed by the [`StreamConsumer`] and the [`FutureProducer`] -allow rust-rdkafka users to easily integrate Kafka consumers and producers -within the Tokio platform, and write asynchronous message processing code. -Note that rust-rdkafka can be used without Tokio. - -To see rust-rdkafka in action with Tokio, check out the -[asynchronous processing example] in the examples folder. - -### At-least-once delivery - -At-least-once delivery semantics are common in many streaming applications: -every message is guaranteed to be processed at least once; in case of -temporary failure, the message can be re-processed and/or re-delivered, -but no message will be lost. - -In order to implement at-least-once delivery the stream processing -application has to carefully commit the offset only once the message has -been processed. Committing the offset too early, instead, might cause -message loss, since upon recovery the consumer will start from the next -message, skipping the one where the failure occurred. - -To see how to implement at-least-once delivery with `rdkafka`, check out the -[at-least-once delivery example] in the examples folder. 
To know more about -delivery semantics, check the [message delivery semantics] chapter in the -Kafka documentation. - -### Exactly-once semantics - -Exactly-once semantics (EOS) can be achieved using transactional producers, -which allow produced records and consumer offsets to be committed or aborted -atomically. Consumers that set their `isolation.level` to `read_committed` -will only observe committed messages. - -EOS is useful in read-process-write scenarios that require messages to be -processed exactly once. - -To learn more about using transactions in rust-rdkafka, see the -[Transactions](producer-transactions) section of the producer documentation. - -### Users - -Here are some of the projects using rust-rdkafka: - -- [timely-dataflow]: a distributed data-parallel compute engine. See also - the [blog post][timely-blog] announcing its Kafka integration. -- [kafka-view]: a web interface for Kafka clusters. -- [kafka-benchmark]: a high performance benchmarking tool for Kafka. -- [callysto]: Stream processing framework in Rust. -- [bytewax]: Python stream processing framework using Timely Dataflow. - -*If you are using rust-rdkafka, please let us know!* - -## Installation - -Add this to your `Cargo.toml`: - -```toml -[dependencies] -rdkafka = { version = "0.25", features = ["cmake-build"] } -``` - -This crate will compile librdkafka from sources and link it statically to -your executable. To compile librdkafka you'll need: - -* the GNU toolchain -* GNU `make` -* `pthreads` -* `zlib`: optional, but included by default (feature: `libz`) -* `cmake`: optional, *not* included by default (feature: `cmake-build`) -* `libssl-dev`: optional, *not* included by default (feature: `ssl`) -* `libsasl2-dev`: optional, *not* included by default (feature: `gssapi`) -* `libzstd-dev`: optional, *not* included by default (feature: `zstd-pkg-config`) - -Note that using the CMake build system, via the `cmake-build` feature, is -encouraged if you can take the dependency on CMake. 
- -By default a submodule with the librdkafka sources pinned to a specific -commit will be used to compile and statically link the library. The -`dynamic-linking` feature can be used to instead dynamically link rdkafka to -the system's version of librdkafka. Example: - -```toml -[dependencies] -rdkafka = { version = "0.25", features = ["dynamic-linking"] } -``` - -For a full listing of features, consult the [rdkafka-sys crate's -documentation][rdkafka-sys-features]. All of rdkafka-sys features are -re-exported as rdkafka features. - -### Minimum supported Rust version (MSRV) - -The current minimum supported Rust version (MSRV) is 1.61.0. Note that -bumping the MSRV is not considered a breaking change. Any release of -rust-rdkafka may bump the MSRV. - -### Asynchronous runtimes - -Some features of the [`StreamConsumer`] and [`FutureProducer`] depend on -Tokio, which can be a heavyweight dependency for users who only intend to -use the low-level consumers and producers. The Tokio integration is -enabled by default, but can be disabled by turning off default features: - -```toml -[dependencies] -rdkafka = { version = "0.25", default-features = false } -``` - -If you would like to use an asynchronous runtime besides Tokio, you can -integrate it with rust-rdkafka by providing a shim that implements the -[`AsyncRuntime`] trait. See the following examples for details: - - * [smol][runtime-smol] - * [async-std][runtime-async-std] - -## Examples - -You can find examples in the [`examples`] folder. To run them: - -```bash -cargo run --example -- -``` - -## Debugging - -rust-rdkafka uses the [`log`] crate to handle logging. -Optionally, enable the `tracing` feature to emit [`tracing`] -events as opposed to [`log`] records. - -In test and examples, rust-rdkafka uses the [`env_logger`] crate -to format logs. 
In those contexts, logging can be enabled -using the `RUST_LOG` environment variable, for example: - -```bash -RUST_LOG="librdkafka=trace,rdkafka::client=debug" cargo test -``` - -This will configure the logging level of librdkafka to trace, and the level -of the client module of the Rust client to debug. To actually receive logs -from librdkafka, you also have to set the `debug` option in the producer or -consumer configuration (see librdkafka -[configuration][librdkafka-config]). - -To enable debugging in your project, make sure you initialize the logger -with `env_logger::init()`, or the equivalent for any `log`-compatible -logging framework. - -[`AsyncRuntime`]: https://docs.rs/rdkafka/*/rdkafka/util/trait.AsyncRuntime.html -[`BaseConsumer`]: https://docs.rs/rdkafka/*/rdkafka/consumer/base_consumer/struct.BaseConsumer.html -[`BaseProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/base_producer/struct.BaseProducer.html -[`Future`]: https://doc.rust-lang.org/stable/std/future/trait.Future.html -[`FutureProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/future_producer/struct.FutureProducer.html -[`Stream`]: https://docs.rs/futures/*/futures/stream/trait.Stream.html -[`StreamConsumer`]: https://docs.rs/rdkafka/*/rdkafka/consumer/stream_consumer/struct.StreamConsumer.html -[`ThreadedProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/base_producer/struct.ThreadedProducer.html -[`log`]: https://docs.rs/log -[`tracing`]: https://docs.rs/tracing -[`env_logger`]: https://docs.rs/env_logger -[Apache Kafka]: https://kafka.apache.org -[asynchronous processing example]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/asynchronous_processing.rs -[at-least-once delivery example]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/at_least_once.rs -[runtime-smol]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/runtime_smol.rs -[runtime-async-std]: 
https://github.com/fede1024/rust-rdkafka/blob/master/examples/runtime_async_std.rs -[broker-compat]: https://github.com/edenhill/librdkafka/blob/master/INTRODUCTION.md#broker-version-compatibility -[bytewax]: https://github.com/bytewax/bytewax -[callysto]: https://github.com/vertexclique/callysto -[`examples`]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/ -[futures]: https://github.com/rust-lang/futures-rs -[kafka-benchmark]: https://github.com/fede1024/kafka-benchmark -[kafka-view]: https://github.com/fede1024/kafka-view -[librdkafka]: https://github.com/edenhill/librdkafka -[librdkafka-config]: https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md -[message delivery semantics]: https://kafka.apache.org/0101/documentation.html#semantics -[producer-transactions]: https://docs.rs/rdkafka/*/rdkafka/producer/#transactions -[rdkafka-sys-features]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/README.md#features -[rdkafka-sys-known-issues]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/README.md#known-issues -[smol]: https://docs.rs/smol -[timely-blog]: https://github.com/frankmcsherry/blog/blob/master/posts/2017-11-08.md -[timely-dataflow]: https://github.com/frankmcsherry/timely-dataflow -[Tokio]: https://tokio.rs/ - -## rdkafka-sys - -See [rdkafka-sys](https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys). - -## Contributors - -Thanks to: -* Thijs Cadier - [thijsc](https://github.com/thijsc) - -## Alternatives - -* [kafka-rust]: a pure Rust implementation of the Kafka client. 
- -[kafka-rust]: https://github.com/spicavigo/kafka-rust +# rust-rdkafka + +[![crates.io](https://img.shields.io/crates/v/rdkafka.svg)](https://crates.io/crates/rdkafka) +[![docs.rs](https://docs.rs/rdkafka/badge.svg)](https://docs.rs/rdkafka/) +[![Build Status](https://travis-ci.org/fede1024/rust-rdkafka.svg?branch=master)](https://travis-ci.org/fede1024/rust-rdkafka) +[![coverage](https://codecov.io/gh/fede1024/rust-rdkafka/graphs/badge.svg?branch=master)](https://codecov.io/gh/fede1024/rust-rdkafka/) +[![Join the chat at https://gitter.im/rust-rdkafka/Lobby](https://badges.gitter.im/rust-rdkafka/Lobby.svg)](https://gitter.im/rust-rdkafka/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) + +A fully asynchronous, [futures]-enabled [Apache Kafka] client +library for Rust based on [librdkafka]. + +## The library + +`rust-rdkafka` provides a safe Rust interface to librdkafka. This version +is compatible with librdkafka v1.9.2+. + +### Documentation + +- [Current master branch](https://fede1024.github.io/rust-rdkafka/) +- [Latest release](https://docs.rs/rdkafka/) +- [Changelog](https://github.com/fede1024/rust-rdkafka/blob/master/changelog.md) + +### Features + +The main features provided at the moment are: + +- Support for all Kafka versions since 0.8.x. For more information about + broker compatibility options, check the [librdkafka + documentation][broker-compat]. +- Consume from single or multiple topics. +- Automatic consumer rebalancing. +- Customizable rebalance, with pre and post rebalance callbacks. +- Synchronous or asynchronous message production. +- Customizable offset commit. +- Create and delete topics and add and edit partitions. +- Alter broker and topic configurations. +- Access to cluster metadata (list of topic-partitions, replicas, active + brokers etc). +- Access to group metadata (list groups, list members of groups, hostnames, + etc.). +- Access to producer and consumer metrics, errors and callbacks. 
+- Exactly-once semantics (EOS) via idempotent and transactional producers + and read-committed consumers. + +### One million messages per second + +`rust-rdkafka` is designed to be easy and safe to use thanks to the +abstraction layer written in Rust, while at the same time being extremely +fast thanks to the librdkafka C library. + +Here are some benchmark results using the [`BaseProducer`], +sending data to a single Kafka 0.11 process running in localhost (default +configuration, 3 partitions). Hardware: Dell laptop, with Intel Core +i7-4712HQ @ 2.30GHz. + +- Scenario: produce 5 million messages, 10 bytes each, wait for all of them to be acked + - 1045413 messages/s, 9.970 MB/s (average over 5 runs) + +- Scenario: produce 100000 messages, 10 KB each, wait for all of them to be acked + - 24623 messages/s, 234.826 MB/s (average over 5 runs) + +For more numbers, check out the [kafka-benchmark] project. + +### Client types + +`rust-rdkafka` provides low level and high level consumers and producers. + +Low level: + +* [`BaseConsumer`]: a simple wrapper around the librdkafka consumer. It + must be periodically `poll()`ed in order to execute callbacks, rebalances + and to receive messages. +* [`BaseProducer`]: a simple wrapper around the librdkafka producer. As in + the consumer case, the user must call `poll()` periodically to execute + delivery callbacks. +* [`ThreadedProducer`]: a `BaseProducer` with a separate thread dedicated to + polling the producer. + +High level: + + * [`StreamConsumer`]: a [`Stream`] of messages that takes care of + polling the consumer automatically. + * [`FutureProducer`]: a [`Future`] that will be completed once + the message is delivered to Kafka (or failed). + +For more information about consumers and producers, refer to their +module-level documentation. + +*Warning*: the library is under active development and the APIs are likely +to change. 
+ +### Asynchronous data processing with Tokio + +[Tokio] is a platform for fast processing of asynchronous events in Rust. +The interfaces exposed by the [`StreamConsumer`] and the [`FutureProducer`] +allow rust-rdkafka users to easily integrate Kafka consumers and producers +within the Tokio platform, and write asynchronous message processing code. +Note that rust-rdkafka can be used without Tokio. + +To see rust-rdkafka in action with Tokio, check out the +[asynchronous processing example] in the examples folder. + +### At-least-once delivery + +At-least-once delivery semantics are common in many streaming applications: +every message is guaranteed to be processed at least once; in case of +temporary failure, the message can be re-processed and/or re-delivered, +but no message will be lost. + +In order to implement at-least-once delivery the stream processing +application has to carefully commit the offset only once the message has +been processed. Committing the offset too early, instead, might cause +message loss, since upon recovery the consumer will start from the next +message, skipping the one where the failure occurred. + +To see how to implement at-least-once delivery with `rdkafka`, check out the +[at-least-once delivery example] in the examples folder. To know more about +delivery semantics, check the [message delivery semantics] chapter in the +Kafka documentation. + +### Exactly-once semantics + +Exactly-once semantics (EOS) can be achieved using transactional producers, +which allow produced records and consumer offsets to be committed or aborted +atomically. Consumers that set their `isolation.level` to `read_committed` +will only observe committed messages. + +EOS is useful in read-process-write scenarios that require messages to be +processed exactly once. + +To learn more about using transactions in rust-rdkafka, see the +[Transactions][producer-transactions] section of the producer documentation. 
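The commit-ordering rule from the at-least-once section above can be illustrated with a toy model. This is a plain-Rust sketch, not the rdkafka API: `run` and its crash simulation are hypothetical names, but the offset arithmetic mirrors what a real consumer does when it commits before versus after processing.

```rust
// Toy simulation of offset-commit ordering (not the rdkafka API).
// "Processing" a message is pushing it into a Vec; a crash may occur
// while handling a given offset, before the work for it is done.
fn run(messages: &[&str], commit_first: bool, crash_at: usize) -> (Vec<String>, usize) {
    let mut processed = Vec::new();
    let mut committed = 0; // offset the consumer resumes from after a restart
    for (offset, msg) in messages.iter().enumerate() {
        if commit_first {
            committed = offset + 1; // commit before processing: unsafe
        }
        if offset == crash_at {
            return (processed, committed); // simulated crash mid-message
        }
        processed.push(msg.to_string()); // the message is now fully processed
        committed = offset + 1; // commit after processing: at-least-once
    }
    (processed, committed)
}

fn main() {
    let msgs = ["a", "b", "c"];
    // Crash while handling offset 1 ("b"), which was never processed.
    let (_, resume_unsafe) = run(&msgs, true, 1);
    let (_, resume_safe) = run(&msgs, false, 1);
    // Committing early resumes at offset 2, silently skipping "b".
    assert_eq!(resume_unsafe, 2);
    // Committing after processing resumes at offset 1, so "b" is retried.
    assert_eq!(resume_safe, 1);
    println!("unsafe resumes at {resume_unsafe}, safe at {resume_safe}");
}
```

The analogue in the real client is committing an offset (through the consumer's commit methods, or auto-commit configured with care) only after the handler for that message has succeeded.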
+ +### Users + +Here are some of the projects using rust-rdkafka: + +- [timely-dataflow]: a distributed data-parallel compute engine. See also + the [blog post][timely-blog] announcing its Kafka integration. +- [kafka-view]: a web interface for Kafka clusters. +- [kafka-benchmark]: a high performance benchmarking tool for Kafka. +- [callysto]: Stream processing framework in Rust. +- [bytewax]: Python stream processing framework using Timely Dataflow. + +*If you are using rust-rdkafka, please let us know!* + +## Installation + +Add this to your `Cargo.toml`: + +```toml +[dependencies] +rdkafka = { version = "0.25", features = ["cmake-build"] } +``` + +This crate will compile librdkafka from sources and link it statically to +your executable. To compile librdkafka you'll need: + +* the GNU toolchain +* GNU `make` +* `pthreads` +* `zlib`: optional, but included by default (feature: `libz`) +* `cmake`: optional¹, *not* included by default (feature: `cmake-build`) +* `libssl-dev`: optional, *not* included by default (feature: `ssl`) +* `libsasl2-dev`: optional, *not* included by default (feature: `gssapi`) +* `libzstd-dev`: optional, *not* included by default (feature: `zstd-pkg-config`) + +Note that using the CMake build system, via the `cmake-build` feature, is +encouraged if you can take the dependency on CMake. ¹Windows is only supported +via CMake, where it is the default build system. + +By default a submodule with the librdkafka sources pinned to a specific +commit will be used to compile and statically link the library. The +`dynamic-linking` feature can be used to instead dynamically link rdkafka to +the system's version of librdkafka. Example: + +```toml +[dependencies] +rdkafka = { version = "0.25", features = ["dynamic-linking"] } +``` + +For a full listing of features, consult the [rdkafka-sys crate's +documentation][rdkafka-sys-features]. All of rdkafka-sys features are +re-exported as rdkafka features. 
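Since all rdkafka-sys features are re-exported, the build-time options above can be enabled directly on the `rdkafka` dependency. A sketch combining the CMake build with TLS and SASL support (the version number is illustrative):

```toml
[dependencies]
# cmake-build selects the CMake build system; ssl and gssapi enable
# TLS and SASL GSSAPI support in the bundled librdkafka.
rdkafka = { version = "0.25", features = ["cmake-build", "ssl", "gssapi"] }
```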
+ +### Minimum supported Rust version (MSRV) + +The current minimum supported Rust version (MSRV) is 1.61.0. Note that +bumping the MSRV is not considered a breaking change. Any release of +rust-rdkafka may bump the MSRV. + +### Asynchronous runtimes + +Some features of the [`StreamConsumer`] and [`FutureProducer`] depend on +Tokio, which can be a heavyweight dependency for users who only intend to +use the low-level consumers and producers. The Tokio integration is +enabled by default, but can be disabled by turning off default features: + +```toml +[dependencies] +rdkafka = { version = "0.25", default-features = false } +``` + +If you would like to use an asynchronous runtime besides Tokio, you can +integrate it with rust-rdkafka by providing a shim that implements the +[`AsyncRuntime`] trait. See the following examples for details: + + * [smol][runtime-smol] + * [async-std][runtime-async-std] + +## Examples + +You can find examples in the [`examples`] folder. To run them: + +```bash +cargo run --example <example_name> -- <example_args> +``` + +## Debugging + +rust-rdkafka uses the [`log`] crate to handle logging. +Optionally, enable the `tracing` feature to emit [`tracing`] +events as opposed to [`log`] records. + +In tests and examples, rust-rdkafka uses the [`env_logger`] crate +to format logs. In those contexts, logging can be enabled +using the `RUST_LOG` environment variable, for example: + +```bash +RUST_LOG="librdkafka=trace,rdkafka::client=debug" cargo test +``` + +This will configure the logging level of librdkafka to trace, and the level +of the client module of the Rust client to debug. To actually receive logs +from librdkafka, you also have to set the `debug` option in the producer or +consumer configuration (see librdkafka +[configuration][librdkafka-config]). + +To enable debugging in your project, make sure you initialize the logger +with `env_logger::init()`, or the equivalent for any `log`-compatible +logging framework. 
+ +[`AsyncRuntime`]: https://docs.rs/rdkafka/*/rdkafka/util/trait.AsyncRuntime.html +[`BaseConsumer`]: https://docs.rs/rdkafka/*/rdkafka/consumer/base_consumer/struct.BaseConsumer.html +[`BaseProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/base_producer/struct.BaseProducer.html +[`Future`]: https://doc.rust-lang.org/stable/std/future/trait.Future.html +[`FutureProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/future_producer/struct.FutureProducer.html +[`Stream`]: https://docs.rs/futures/*/futures/stream/trait.Stream.html +[`StreamConsumer`]: https://docs.rs/rdkafka/*/rdkafka/consumer/stream_consumer/struct.StreamConsumer.html +[`ThreadedProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/base_producer/struct.ThreadedProducer.html +[`log`]: https://docs.rs/log +[`tracing`]: https://docs.rs/tracing +[`env_logger`]: https://docs.rs/env_logger +[Apache Kafka]: https://kafka.apache.org +[asynchronous processing example]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/asynchronous_processing.rs +[at-least-once delivery example]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/at_least_once.rs +[runtime-smol]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/runtime_smol.rs +[runtime-async-std]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/runtime_async_std.rs +[broker-compat]: https://github.com/edenhill/librdkafka/blob/master/INTRODUCTION.md#broker-version-compatibility +[bytewax]: https://github.com/bytewax/bytewax +[callysto]: https://github.com/vertexclique/callysto +[`examples`]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/ +[futures]: https://github.com/rust-lang/futures-rs +[kafka-benchmark]: https://github.com/fede1024/kafka-benchmark +[kafka-view]: https://github.com/fede1024/kafka-view +[librdkafka]: https://github.com/edenhill/librdkafka +[librdkafka-config]: https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md +[message delivery semantics]: 
https://kafka.apache.org/0101/documentation.html#semantics +[producer-transactions]: https://docs.rs/rdkafka/*/rdkafka/producer/#transactions +[rdkafka-sys-features]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/README.md#features +[rdkafka-sys-known-issues]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/README.md#known-issues +[smol]: https://docs.rs/smol +[timely-blog]: https://github.com/frankmcsherry/blog/blob/master/posts/2017-11-08.md +[timely-dataflow]: https://github.com/frankmcsherry/timely-dataflow +[Tokio]: https://tokio.rs/ + +## rdkafka-sys + +See [rdkafka-sys](https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys). + +## Contributors + +Thanks to: +* Thijs Cadier - [thijsc](https://github.com/thijsc) + +## Alternatives + +* [kafka-rust]: a pure Rust implementation of the Kafka client. + +[kafka-rust]: https://github.com/spicavigo/kafka-rust diff --git a/changelog.md b/changelog.md index c87f5afc4..f68c80cb4 100644 --- a/changelog.md +++ b/changelog.md @@ -7,6 +7,7 @@ See also the [rdkafka-sys changelog](rdkafka-sys/changelog.md). * Update MSRV to 1.70 * Remove testing for old Kafka versions (before 3.0). Add tests for 3.7. * Fix test dependency on docker compose. +* Automatically use CMake for Windows targets ## 0.36.2 (2024-01-16) diff --git a/rdkafka-sys/Cargo.toml b/rdkafka-sys/Cargo.toml index 7ea379ef8..f16161b0a 100644 --- a/rdkafka-sys/Cargo.toml +++ b/rdkafka-sys/Cargo.toml @@ -26,6 +26,12 @@ sasl2-sys = { version = "0.1.6", optional = true } pkg-config = "0.3.9" cmake = { version = "0.1.0", optional = true } +# Windows only supports the CMake build. As we can't conditionally +# make cmake-build a default feature only on some platforms, we instead +# have to make the cmake dependency required. 
+[target.'cfg(target_os = "windows")'.build-dependencies] +cmake = { version = "0.1.0", optional = false } + [lib] name = "rdkafka_sys" path = "src/lib.rs" diff --git a/rdkafka-sys/README.md b/rdkafka-sys/README.md index af475d049..4aac4ea4a 100644 --- a/rdkafka-sys/README.md +++ b/rdkafka-sys/README.md @@ -1,94 +1,95 @@ -# rdkafka-sys - -Low level bindings to [librdkafka](https://github.com/edenhill/librdkafka), -a C library for the [Apache Kafka] protocol with producer, consumer, and -admin clients. - -For a safe wrapper, see the [rdkafka] crate. - -## Version - -The rdkafka-sys version number is in the format `X.Y.Z+RX.RY.RZ`, where -`X.Y.Z` is the version of this crate and follows SemVer conventions, while -`RX.RY.RZ` is the version of the bundled librdkafka. - -Note that versions before v2.0.0+1.4.2 did not follow this convention, and -instead directly correspond to the bundled librdkafka version. - -## Build - -### Known issues - -* When any of librdkafka's optional dependencies are enabled, like libz or - OpenSSL, if you have multiple versions of that library installed upon your - system, librdkafka's build system may disagree with Cargo about which - version of the library to use! **This can result in subtly broken - builds,** if librdkafka compiles against the headers for one version but - Cargo links against a different version. For complete confidence when - building release binaries, use an environment like a Docker container or a - chroot jail where you can guarantee that only one version of each - dependency is present. The current design of Cargo unfortunately makes - this nearly impossible to fix. - -* Windows is only supported when using the CMake build system via the - `cmake-build` Cargo feature. - -### Features - -By default a submodule with the librdkafka sources will be used to compile -and statically link the library. 
- -The **`dynamic-linking`** feature can be used to link rdkafka to a locally -installed version of librdkafka: if the feature is enabled, the build script -will use `pkg-config` to check the version of the library installed in the -system, and it will configure the compiler to dynamically link against it. -The system version of librdkafka must exactly match the version of -librdkafka bundled with this crate. - -The **`cmake-build`** feature builds librdkafka with its [CMake] build -system, rather than its default [mklove]-based build system. This feature -requires that CMake is installed on the build machine. - -The following features directly correspond to librdkafka features (i.e., -flags you would pass to `configure` if you were compiling manually). - - * The **`ssl`** feature enables SSL support. By default, the system's - OpenSSL library is dynamically linked, but static linking of the version - bundled with the [openssl-sys] crate can be requested with the - `ssl-vendored` feature. - * The **`gssapi`** feature enables SASL GSSAPI support with Cyrus - libsasl2. By default the system's libsasl2 is dynamically linked, but - static linking of the version bundled with the [sasl2-sys] crate can be - requested with the `gssapi-vendored` feature. - * The **`libz`** feature enables support for zlib compression. This - feature is enabled by default. By default, the system's libz is - dynamically linked, but static linking of the version bundled with the - [libz-sys] crate can be requested with the `libz-static` feature. - * The **`curl`** feature enables the HTTP client via curl. By default, the - system's curl is dynamically linked, but static linking of the version - bundled with the [curl-sys] create can be requested with the - `curl-static` feature. - * The **`zstd`** feature enables support for ZSTD compression. 
By default, - this builds and statically links the version bundled with the [zstd-sys] - crate, but dynamic linking of the system's version can be requested with - the `zstd-pkg-config` feature. - * The **`external-lz4`** feature statically links against the copy of - liblz4 bundled with the [lz4-sys] crate. By default, librdkafka - statically links against its own bundled version of liblz4. Due to - limitations with lz4-sys, it is not yet possible to dynamically link - against the system's version of liblz4. - -All features are disabled by default unless noted otherwise above. The build -process is defined in [`build.rs`]. - -[`build.rs`]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/build.rs -[Apache Kafka]: https://kafka.apache.org -[CMake]: https://cmake.org -[libz-sys]: https://crates.io/crates/libz-sys -[curl-sys]: https://crates.io/crates/curl-sys -[lz4-sys]: https://crates.io/crates/lz4-sys -[mklove]: https://github.com/edenhill/mklove -[openssl-sys]: https://crates.io/crates/openssl-sys -[rdkafka]: https://docs.rs/rdkafka -[sasl2-sys]: https://docs.rs/sasl2-sys -[zstd-sys]: https://crates.io/crates/zstd-sys +# rdkafka-sys + +Low level bindings to [librdkafka](https://github.com/edenhill/librdkafka), +a C library for the [Apache Kafka] protocol with producer, consumer, and +admin clients. + +For a safe wrapper, see the [rdkafka] crate. + +## Version + +The rdkafka-sys version number is in the format `X.Y.Z+RX.RY.RZ`, where +`X.Y.Z` is the version of this crate and follows SemVer conventions, while +`RX.RY.RZ` is the version of the bundled librdkafka. + +Note that versions before v2.0.0+1.4.2 did not follow this convention, and +instead directly correspond to the bundled librdkafka version. 
+
+## Build
+
+### Known issues
+
+* When any of librdkafka's optional dependencies are enabled, like libz or
+  OpenSSL, if you have multiple versions of that library installed on your
+  system, librdkafka's build system may disagree with Cargo about which
+  version of the library to use! **This can result in subtly broken
+  builds** if librdkafka compiles against the headers for one version but
+  Cargo links against a different version. For complete confidence when
+  building release binaries, use an environment like a Docker container or a
+  chroot jail where you can guarantee that only one version of each
+  dependency is present. The current design of Cargo unfortunately makes
+  this nearly impossible to fix.
+
+* Windows builds always use the CMake build system, regardless of whether
+  the `cmake-build` Cargo feature is enabled or not. This is because Windows
+  is only supported with CMake as the build system.
+
+### Features
+
+By default, a submodule with the librdkafka sources will be used to compile
+and statically link the library.
+
+The **`dynamic-linking`** feature can be used to link rdkafka to a locally
+installed version of librdkafka: if the feature is enabled, the build script
+will use `pkg-config` to check the version of the library installed in the
+system, and it will configure the compiler to dynamically link against it.
+The system version of librdkafka must exactly match the version of
+librdkafka bundled with this crate.
+
+The **`cmake-build`** feature builds librdkafka with its [CMake] build
+system, rather than its default [mklove]-based build system. This feature
+requires that CMake is installed on the build machine.
+
+The following features directly correspond to librdkafka features (i.e.,
+flags you would pass to `configure` if you were compiling manually).
+
+ * The **`ssl`** feature enables SSL support. By default, the system's
+   OpenSSL library is dynamically linked, but static linking of the version
+   bundled with the [openssl-sys] crate can be requested with the
+   `ssl-vendored` feature.
+ * The **`gssapi`** feature enables SASL GSSAPI support with Cyrus
+   libsasl2. By default, the system's libsasl2 is dynamically linked, but
+   static linking of the version bundled with the [sasl2-sys] crate can be
+   requested with the `gssapi-vendored` feature.
+ * The **`libz`** feature enables support for zlib compression. This
+   feature is enabled by default. By default, the system's libz is
+   dynamically linked, but static linking of the version bundled with the
+   [libz-sys] crate can be requested with the `libz-static` feature.
+ * The **`curl`** feature enables the HTTP client via curl. By default, the
+   system's curl is dynamically linked, but static linking of the version
+   bundled with the [curl-sys] crate can be requested with the
+   `curl-static` feature.
+ * The **`zstd`** feature enables support for ZSTD compression. By default,
+   this builds and statically links the version bundled with the [zstd-sys]
+   crate, but dynamic linking of the system's version can be requested with
+   the `zstd-pkg-config` feature.
+ * The **`external-lz4`** feature statically links against the copy of
+   liblz4 bundled with the [lz4-sys] crate. By default, librdkafka
+   statically links against its own bundled version of liblz4. Due to
+   limitations with lz4-sys, it is not yet possible to dynamically link
+   against the system's version of liblz4.
+
+All features are disabled by default unless noted otherwise above. The build
+process is defined in [`build.rs`].
+ +[`build.rs`]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/build.rs +[Apache Kafka]: https://kafka.apache.org +[CMake]: https://cmake.org +[libz-sys]: https://crates.io/crates/libz-sys +[curl-sys]: https://crates.io/crates/curl-sys +[lz4-sys]: https://crates.io/crates/lz4-sys +[mklove]: https://github.com/edenhill/mklove +[openssl-sys]: https://crates.io/crates/openssl-sys +[rdkafka]: https://docs.rs/rdkafka +[sasl2-sys]: https://docs.rs/sasl2-sys +[zstd-sys]: https://crates.io/crates/zstd-sys diff --git a/rdkafka-sys/build.rs b/rdkafka-sys/build.rs index b7c3e42ca..a5afc5002 100644 --- a/rdkafka-sys/build.rs +++ b/rdkafka-sys/build.rs @@ -1,8 +1,6 @@ use std::borrow::Borrow; use std::env; use std::ffi::OsStr; -#[cfg(feature = "cmake-build")] -use std::fs; use std::path::{Path, PathBuf}; use std::process::{self, Command}; @@ -88,7 +86,7 @@ fn main() { } } -#[cfg(not(feature = "cmake-build"))] +#[cfg(all(not(feature = "cmake-build"), not(target_os = "windows")))] fn build_librdkafka() { let mut configure_flags: Vec = Vec::new(); @@ -200,8 +198,9 @@ fn build_librdkafka() { println!("cargo:root={}", out_dir); } -#[cfg(feature = "cmake-build")] +#[cfg(any(feature = "cmake-build", target_os = "windows"))] fn build_librdkafka() { + use std::fs; let mut config = cmake::Config::new("librdkafka"); let mut cmake_library_paths = vec![]; diff --git a/rdkafka-sys/src/lib.rs b/rdkafka-sys/src/lib.rs index 565f00bce..270a6ac95 100644 --- a/rdkafka-sys/src/lib.rs +++ b/rdkafka-sys/src/lib.rs @@ -28,8 +28,9 @@ //! dependency is present. The current design of Cargo unfortunately makes //! this nearly impossible to fix. //! -//! * Windows is only supported when using the CMake build system via the -//! `cmake-build` Cargo feature. +//! * Windows builds always use the CMake build system, regardless of whether +//! the `cmake-build` Cargo feature is enabled or not. This is because Windows +//! is only supported with CMake as the build system. //! //! 
### Features
 //!
diff --git a/src/lib.rs b/src/lib.rs
index 46709c5a7..d33f74ff8 100644
--- a/src/lib.rs
+++ b/src/lib.rs
@@ -152,13 +152,14 @@
 //! * GNU `make`
 //! * `pthreads`
 //! * `zlib`: optional, but included by default (feature: `libz`)
-//! * `cmake`: optional, *not* included by default (feature: `cmake-build`)
+//! * `cmake`: optional¹, *not* included by default (feature: `cmake-build`)
 //! * `libssl-dev`: optional, *not* included by default (feature: `ssl`)
 //! * `libsasl2-dev`: optional, *not* included by default (feature: `gssapi`)
 //! * `libzstd-dev`: optional, *not* included by default (feature: `zstd-pkg-config`)
 //!
 //! Note that using the CMake build system, via the `cmake-build` feature, is
-//! encouraged if you can take the dependency on CMake.
+//! encouraged if you can take the dependency on CMake. ¹On Windows, CMake is
+//! always used, regardless of whether the `cmake-build` feature is enabled.
 //!
 //! By default a submodule with the librdkafka sources pinned to a specific
 //! commit will be used to compile and statically link the library. The
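
As a standalone illustration of the gating this diff introduces in `rdkafka-sys/build.rs` (this is a sketch, not the build script itself), the build-system choice can be modeled as a plain function: CMake is selected whenever the `cmake-build` feature is enabled or the target is Windows, and the mklove-based `configure` build is used otherwise. The function name `uses_cmake` is hypothetical; in the real build script this decision happens at compile time via the two mutually exclusive `#[cfg(...)]`-gated `build_librdkafka` definitions.

```rust
/// Sketch of the PR's cfg gating, mirroring
/// `#[cfg(any(feature = "cmake-build", target_os = "windows"))]`
/// as a runtime check. Hypothetical helper for illustration only.
fn uses_cmake(cmake_build_feature: bool, target_os: &str) -> bool {
    cmake_build_feature || target_os == "windows"
}

fn main() {
    // Windows always builds with CMake, regardless of the Cargo feature.
    assert!(uses_cmake(false, "windows"));
    // On other platforms, CMake is opt-in via the `cmake-build` feature...
    assert!(uses_cmake(true, "linux"));
    // ...and the mklove-based build remains the default.
    assert!(!uses_cmake(false, "linux"));
}
```

Because the two `build_librdkafka` cfg predicates are exact complements, exactly one definition is compiled for any feature/target combination, which is what lets the diff drop the `cmake-build` requirement from the Windows CI matrix.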