Class: LeakyBucketRateLimiterManager

Package: @litert/concurrent
Import path: @litert/concurrent
Source: packages/partials/concurrent/src/Classes/LeakyBucketRateLimiterManager.ts
Implements: IAsyncRateLimiterManager
Manages a collection of per-key leaky-bucket rate limiters. Each key gets its own scheduling queue so concurrent calls to different keys do not interfere.
Constructor
new LeakyBucketRateLimiterManager(opts: ILeakyBucketRateLimiterManagerOptions)

See ILeakyBucketRateLimiterManagerOptions for the available options.
Throws:
TypeError: thrown if capacity or leakIntervalMs is not a positive safe integer.
TypeError: thrown if cleanDelayMs is negative.
Methods
challenge(key)
challenge(key: string): Promise<void>

Waits until the next available slot for key, then resolves. Throws if the queue for that key is already at capacity.
Throws: E_RATE_LIMITED (or a custom error).
isBlocking(key)
isBlocking(key: string): boolean

Returns true if a challenge for key would need to wait.
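A common use of isBlocking is load shedding: answer the caller immediately instead of queueing when the key is saturated. The helper below is an illustrative sketch against the documented method shapes (the IKeyLimiter interface and acquireOrShed name are invented for this example):

```typescript
// Minimal surface matching the documented isBlocking/challenge methods.
interface IKeyLimiter {
    isBlocking(key: string): boolean;
    challenge(key: string): Promise<void>;
}

// Shed load instead of waiting: return false right away if a challenge
// for this key would have to queue, otherwise take the slot.
async function acquireOrShed(limiter: IKeyLimiter, key: string): Promise<boolean> {
    if (limiter.isBlocking(key)) {
        return false;
    }
    await limiter.challenge(key);
    return true;
}
```

Note the check-then-act window: challenge can still throw if another caller fills the queue between the isBlocking check and the challenge call, so callers should still handle the rejection.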
call(key, fn)
call<T extends IFunction>(key: string, fn: T): Promise<Awaited<ReturnType<T>>>

Awaits challenge(key), then calls fn and resolves with its result.
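The description suggests call is a straightforward composition of challenge and the wrapped function. A hedged sketch of that composition (callThrough and the AnyFn alias are illustrative, standing in for the library's IFunction):

```typescript
// Any callable; stands in for the library's IFunction type.
type AnyFn = (...args: unknown[]) => unknown;

// Sketch of how call() composes challenge() with the wrapped function:
// wait for a slot under `key`, then invoke fn and pass its result through.
async function callThrough<T extends AnyFn>(
    limiter: { challenge(key: string): Promise<void> },
    key: string,
    fn: T,
): Promise<Awaited<ReturnType<T>>> {
    await limiter.challenge(key); // may reject with E_RATE_LIMITED
    return await fn() as Awaited<ReturnType<T>>;
}
```

Because the challenge happens first, a rejected (rate-limited) call never invokes fn at all.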
reset(key)
reset(key: string): void

Resets the schedule for key so the next call proceeds immediately.
size()
size(): number

Returns the number of active key contexts.
clean()
clean(): void

Removes key contexts that have been idle for at least cleanDelayMs milliseconds.
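One way to picture the size()/clean() bookkeeping is a table of contexts stamped with the time they last became idle; clean() drops entries older than cleanDelayMs. ContextTable below is an illustrative standalone model of that idea, not the library's code:

```typescript
// Illustrative model of idle-context cleanup: each key context records
// when it last became idle; clean() drops contexts that have been idle
// for at least cleanDelayMs milliseconds.
class ContextTable {

    private readonly idleSince: Map<string, number> = new Map();

    private readonly cleanDelayMs: number;

    public constructor(cleanDelayMs: number) {
        this.cleanDelayMs = cleanDelayMs;
    }

    // Mark a key context as having just become idle.
    public touch(key: string): void {
        this.idleSince.set(key, Date.now());
    }

    public size(): number {
        return this.idleSince.size;
    }

    public clean(): void {
        const now = Date.now();
        for (const [key, since] of this.idleSince) {
            if (now - since >= this.cleanDelayMs) {
                this.idleSince.delete(key);
            }
        }
    }
}
```

In a host application, clean() would typically run on a timer, for example setInterval(() => manager.clean(), cleanDelayMs), so stale per-key state does not accumulate.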
Example
import { LeakyBucketRateLimiterManager } from '@litert/concurrent';

const manager = new LeakyBucketRateLimiterManager({
    capacity: 5,
    leakIntervalMs: 200,
    cleanDelayMs: 10_000,
});

// Throttle API calls per user: each userId gets its own queue, and the
// challenge rejects once 5 calls for that user are already waiting.
// (callApi is a placeholder for your own function.)
async function handleApiCall(userId: string) {
    await manager.challenge(userId);
    return callApi(userId);
}