Class: LeakyBucketRateLimiter
Package: `@litert/concurrent`
Import path: `@litert/concurrent`
Source: packages/partials/concurrent/src/Classes/LeakyBucketRateLimiter.ts
Implements: `IAsyncRateLimiter`
An asynchronous rate limiter using the leaky bucket algorithm. Rather than failing immediately, calls wait (sleep) until their scheduled slot arrives. If the virtual queue is deeper than capacity slots, the call is rejected.
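The scheduling rule behind this behaviour can be sketched in a few lines. This is a simplified, illustrative model only, not the library's source: `MiniLeakyBucket` is an invented name, and the exact over-capacity boundary in the real class may differ.

```ts
// Simplified model of leaky-bucket scheduling (illustrative only;
// the real implementation lives in @litert/concurrent).
class MiniLeakyBucket {

    private _nextSlotAt = 0; // ms timestamp of the next free slot

    private readonly _capacity: number;

    private readonly _leakIntervalMs: number;

    public constructor(capacity: number, leakIntervalMs: number) {

        this._capacity = capacity;
        this._leakIntervalMs = leakIntervalMs;
    }

    public async challenge(): Promise<void> {

        const now = Date.now();
        const slot = Math.max(now, this._nextSlotAt);

        // Current depth of the virtual queue, measured in slots.
        const depth = Math.ceil((slot - now) / this._leakIntervalMs);

        if (depth >= this._capacity) {

            // The real class throws E_RATE_LIMITED (or a custom error) here.
            throw new Error('rate limited');
        }

        // Claim the slot, then sleep until it arrives.
        this._nextSlotAt = slot + this._leakIntervalMs;

        if (slot > now) {

            await new Promise<void>((resolve) => setTimeout(resolve, slot - now));
        }
    }
}
```

The key point is that the limiter keeps no real queue: it only tracks the timestamp of the next free slot, and each accepted call sleeps until its claimed slot.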
Constructor
```ts
new LeakyBucketRateLimiter(opts: ILeakyBucketRateLimiterOptions)
```

See `ILeakyBucketRateLimiterOptions` below.
Throws:
`TypeError`, if `capacity` or `leakIntervalMs` is not a positive safe integer.
Methods
challenge()
```ts
challenge(): Promise<void>
```

Waits until the next slot is available, then resolves.

Throws: `E_RATE_LIMITED` (or the custom `errorCtorOnLimited` error) if the queue is already at capacity.
isBlocking()
```ts
isBlocking(): boolean
```

Returns `true` if any future `challenge()` would need to wait (i.e. the next slot is in the future).
isIdle()
```ts
isIdle(): boolean
```

Returns `true` if the next slot is now or already past (no backlog).
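Since `isIdle()` is effectively the complement of `isBlocking()`, callers can consult either to choose between the rate-limited fast path and an immediate fallback instead of sleeping. A sketch of that pattern against a minimal, invented `IPacer` interface (the real class implements `IAsyncRateLimiter`; `fetchFresh` and its callbacks are hypothetical):

```ts
// Minimal view of the limiter surface this pattern needs
// (invented interface; stands in for the real limiter class).
interface IPacer {

    isBlocking(): boolean;

    isIdle(): boolean;

    challenge(): Promise<void>;
}

/**
 * Use the rate-limited path only when no waiting is required;
 * otherwise serve a fallback immediately instead of sleeping.
 */
async function fetchFresh(
    pacer: IPacer,
    load: () => Promise<string>,
    fallback: () => string,
): Promise<string> {

    if (pacer.isBlocking()) {

        // challenge() would sleep here, so take the fallback.
        return fallback();
    }

    // No backlog: this challenge resolves immediately.
    await pacer.challenge();
    return load();
}
```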
call(fn)
```ts
call<TFn extends IFunction>(fn: TFn): Promise<Awaited<ReturnType<TFn>>>
```

Awaits `challenge()`, then invokes `fn` and returns its result.
reset()
```ts
reset(): void
```

Resets the internal schedule so the next call can proceed immediately.
wrap(fn)
```ts
wrap<T extends IFunction>(fn: T): IToPromise<T>
```

Returns an async wrapper that awaits `challenge()` before each invocation of `fn`.
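Conceptually, `wrap()` is `call()` partially applied: every invocation of the returned function pays the rate-limit cost first. A simplified model of the pattern (not the library's source; `IPacer` and `wrapWithPacer` are invented names):

```ts
// Minimal view of the limiter surface (invented interface).
interface IPacer {

    challenge(): Promise<void>;
}

/**
 * Return an async wrapper that awaits the rate limiter
 * before every invocation of `fn`.
 */
function wrapWithPacer<A extends unknown[], R>(
    pacer: IPacer,
    fn: (...args: A) => R | Promise<R>,
): (...args: A) => Promise<R> {

    return async (...args: A): Promise<R> => {

        // Sleep until our slot arrives, or reject if over capacity.
        await pacer.challenge();
        return fn(...args);
    };
}
```

In the real API, `limiter.wrap(fn)` returns exactly this kind of wrapper, typed as `IToPromise<T>`.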
Scoped Types
Interface ILeakyBucketRateLimiterOptions
Source: LeakyBucketRateLimiter.ts
```ts
import type { ILeakyBucketRateLimiterOptions } from '@litert/concurrent';

interface ILeakyBucketRateLimiterOptions {
    capacity: number;
    leakIntervalMs: number;
    errorCtorOnLimited?: IConstructor<Error>;
}
```

| Property | Type | Description |
|---|---|---|
| `capacity` | `number` | Maximum pending tasks (queue depth). |
| `leakIntervalMs` | `number` | Milliseconds between each task being "leaked" through. |
| `errorCtorOnLimited` | `IConstructor<Error>` (optional) | Custom error class; defaults to `E_RATE_LIMITED`. |
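A hedged sketch of supplying a custom error through `errorCtorOnLimited`: the `QuotaExceededError` class and the option values below are invented for illustration, and only the options shape follows the interface above.

```ts
// An application-defined error to surface instead of E_RATE_LIMITED.
class QuotaExceededError extends Error {

    public constructor() {

        super('API quota exceeded, please retry later.');
        this.name = 'QuotaExceededError';
    }
}

// Options matching the ILeakyBucketRateLimiterOptions shape.
const opts = {
    capacity: 10,         // up to 10 tasks may queue
    leakIntervalMs: 100,  // one task drains every 100 ms
    errorCtorOnLimited: QuotaExceededError,
};
```

Passing `opts` to `new LeakyBucketRateLimiter(...)` would then make over-capacity `challenge()` calls reject with `QuotaExceededError` rather than the default `E_RATE_LIMITED`.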
Example
```ts
import { LeakyBucketRateLimiter } from '@litert/concurrent';

const limiter = new LeakyBucketRateLimiter({
    capacity: 5,         // at most 5 requests queued
    leakIntervalMs: 200, // one request every 200ms
});

// Each call waits its turn.
await limiter.challenge();
await sendRequest(); // sendRequest: your own I/O function
```