Rust macro utility for batching expensive async operations.
Install with Cargo:

```sh
cargo add batched
```

Or add this to your `Cargo.toml`:

```toml
[dependencies]
batched = "0.1.3"
```
- `window`: Minimum amount of time (in milliseconds) the background thread waits before processing a batch.
- `limit`: Maximum number of items that can be grouped and processed in a single batch.
- `concurrent`: Maximum number of batched tasks running concurrently (default: `Infinity`).
- `boxed`: Automatically wraps the return type in an `Arc`.
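As a rough sketch of how the options combine (the function name, its body, and the exact `concurrent = N` syntax are illustrative assumptions, not taken verbatim from the crate docs):

```rust
use batched::batched;

// Hypothetical target function: wait at least 50 ms before processing a
// batch, group at most 500 items per batch, and run at most 4 batches
// concurrently.
#[batched(window = 50, limit = 500, concurrent = 4)]
async fn lookup_batched(keys: Vec<String>) -> Vec<Option<String>> {
    keys.into_iter().map(|_| None).collect()
}
```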
The target function must take a single argument: a vector of items (`Vec<T>`). The return value is propagated to all async calls made for the batch items. The return type must implement `Clone` so the result can be propagated; if it cannot implement `Clone`, use the `boxed` option to automatically wrap your return type in an `Arc`.
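For illustration, a non-`Clone` return type combined with `boxed` could look like the sketch below. The `Report` type and the function names are hypothetical, and the assumption that callers then receive an `Arc<Report>` is inferred from the description above.

```rust
use std::sync::Arc;
use batched::batched;

// A return type that does not implement Clone.
struct Report {
    raw: Vec<u8>,
}

// With `boxed`, the single Report produced for the whole batch is wrapped
// in an Arc and shared with every caller instead of being cloned.
#[batched(window = 100, limit = 1_000, boxed)]
async fn build_report_batched(lines: Vec<String>) -> Report {
    Report { raw: lines.join("\n").into_bytes() }
}

async fn handle(line: String) -> usize {
    // Assumed shape of the generated single-item function (`build_report`),
    // returning Arc<Report> because of `boxed`.
    let report: Arc<Report> = build_report(line).await;
    report.raw.len()
}
```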
- Built for async environments (tokio); it will not work without a tokio async runtime.
- The target function must be `async`, and its name should end with `_batched`.
- Not supported inside structs:
```rust
struct A;

impl A {
    // NOT SUPPORTED
    #[batched(window = 1000, limit = 100)]
    fn operation() {
        ...
    }
}
```
```rust
use batched::batched;

#[batched(window = 100, limit = 1000)]
async fn add_batched(numbers: Vec<u32>) -> u32 {
    numbers.iter().sum()
}

#[tokio::main]
async fn main() {
    for _ in 0..99 {
        tokio::task::spawn(async move {
            add(1).await
        });
    }

    let result = add(1).await;
    assert_eq!(result, 100);
}
```
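Here the 99 spawned calls and the final call in `main` each pass a single `u32`. Assuming they all arrive within the same 100 ms window (and stay under the 1000-item limit), they are grouped into one batch, so `add_batched` receives a vector of one hundred `1`s and every caller gets back the same sum of `100`.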
```rust
use batched::batched;
use sqlx::PgPool;

// Creates functions [`insert_message`] and [`insert_message_multiple`]
#[batched(window = 100, limit = 100_000, boxed)]
async fn insert_message_batched(messages: Vec<String>) -> Result<(), anyhow::Error> {
    let pool = PgPool::connect("postgres://user:password@localhost/dbname").await?;
    let mut query = String::from("INSERT INTO messages (content) VALUES ");
    ...
}
```
#[post("/message")]
async fn service(message: String) -> Result<(), anyhow::Error> {
insert_message(message).await?;
Ok(())
}
#[post("/bulk_messages")]
async fn service(messages: Vec<String>) -> Result<(), anyhow::Error> {
insert_message_multiple(messages).await?;
Ok(())
}
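Each handler awaits its own call, but under load the macro groups every message arriving within the 100 ms window (up to the 100_000-item limit) into a single invocation of `insert_message_batched`, so many concurrent requests share one bulk `INSERT` instead of issuing one query each.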