This Convex component pools actions and mutations to restrict parallel requests.
Suppose you have some important async work, like sending verification emails, and some less important async work, like scraping data from an API. If all of these are scheduled with `ctx.scheduler.runAfter`, they'll compete with each other for resources. The emails might be delayed if there are too many scraping requests queued ahead of them.

To resolve this problem, you can separate the work into different pools:
```ts
const emailPool = new Workpool(components.emailWorkpool, {
  maxParallelism: 5,
});
const scrapePool = new Workpool(components.scrapeWorkpool, {
  maxParallelism: 1,
});

export const signUp = mutation({
  handler: async (ctx, args) => {
    const userId = await ctx.db.insert("users", args);
    await emailPool.enqueueAction(internal.auth.sendEmailVerification, {
      userId,
    });
  },
});

export const downloadLatestWeather = mutation({
  handler: async (ctx, args) => {
    for (const city of allCities) {
      await scrapePool.enqueueAction(internal.weather.scrape, { city });
    }
  },
});
```
With limited parallelism, you can also reduce OCC errors from mutations that read and write the same data. Consider this action that calls a mutation to increment a singleton counter. By running the mutation in a workpool with `maxParallelism: 1`, it will never throw an error due to conflicts with parallel mutations:
```ts
const counterPool = new Workpool(components.counterWorkpool, {
  maxParallelism: 1,
});

export const doSomethingAndCount = action({
  handler: async (ctx) => {
    await fetch("https://example.com");
    await counterPool.enqueueMutation(internal.counter.increment, {});
  },
});

// This mutation is prone to conflicting with itself, because it always reads
// and writes the same data. By running it in a workpool with low parallelism,
// it will run serially.
export const increment = internalMutation({
  handler: async (ctx) => {
    const countDoc = await ctx.db.query("counter").unique();
    await ctx.db.patch(countDoc!._id, { count: countDoc!.count + 1 });
  },
});
```
Effectively, Workpool runs async functions much like `ctx.scheduler.runAfter(0, ...)`, but it limits the number of functions that can run in parallel.
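To make the idea concrete, here is a simplified, in-memory sketch of what "runAfter(0) with limited parallelism" means. This is an illustration only, not the component's actual implementation: the real Workpool persists its queue in Convex tables so jobs survive restarts, while `MiniPool` below lives entirely in one process.

```typescript
// A toy pool: run queued async jobs, but never more than
// `maxParallelism` at once. Jobs past the limit wait in a FIFO queue.
class MiniPool {
  private queue: Array<() => Promise<void>> = [];
  private running = 0;

  constructor(private maxParallelism: number) {}

  // Enqueue a job; the returned promise settles when the job eventually runs.
  enqueue<T>(job: () => Promise<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      this.queue.push(async () => {
        try {
          resolve(await job());
        } catch (e) {
          reject(e);
        }
      });
      this.pump();
    });
  }

  // Start queued jobs while there is spare capacity.
  private pump(): void {
    while (this.running < this.maxParallelism && this.queue.length > 0) {
      const next = this.queue.shift()!;
      this.running++;
      next().finally(() => {
        this.running--;
        this.pump();
      });
    }
  }
}
```

With `new MiniPool(2)`, six enqueued jobs run two at a time; the rest wait their turn, which is exactly the throttling behavior the component provides for Convex functions.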
You'll need an existing Convex project to use the component. Convex is a hosted backend platform, including a database, serverless functions, and a ton more you can learn about here. Run `npm create convex` or follow any of the quickstarts to set one up.

See `example/` for a working demo.
- Install the Workpool component:

  ```sh
  npm install @convex-dev/workpool
  ```

- Create a `convex.config.ts` file in your app's `convex/` folder and install the component by calling `use`:

  ```ts
  // convex/convex.config.ts
  import { defineApp } from "convex/server";
  import workpool from "@convex-dev/workpool/convex.config";

  const app = defineApp();
  app.use(workpool, { name: "emailWorkpool" });
  app.use(workpool, { name: "scrapeWorkpool" });
  export default app;
  ```
Then, instantiate a Workpool in your Convex code:

```ts
import { components } from "./_generated/api";
import { Workpool } from "@convex-dev/workpool";

const pool = new Workpool(components.emailWorkpool, {
  maxParallelism: 10,
  // More options are available, such as:
  ttl: 7 * 24 * 60 * 60 * 1000,
});
```
Then you have the following interface on `pool`:
```ts
// Schedule functions to run in the background.
const mutationId = await pool.enqueueMutation(internal.foo.bar, args);
const actionId = await pool.enqueueAction(internal.foo.bar, args);

// Is it done yet? Did it succeed or fail?
const status = await pool.status(mutationId);

// You can cancel the work, if it hasn't finished yet.
await pool.cancel(mutationId);
```
See more example usage in example.ts.
The benefit of Workpool is that it won't fall over if many jobs are scheduled at once, and it allows you to throttle low-priority jobs.

However, Workpool has some overhead and can slow down your workload compared to using `ctx.scheduler` directly. Since each Workpool has some overhead (each runs several functions to coordinate its work), don't create too many of them.
If you're running into issues with too many concurrent functions, there are alternatives to Workpool:

- Try combining multiple mutations into a single mutation, with batching or debouncing.
- Call plain TypeScript functions if possible. In particular, an action calling `ctx.runAction` has more overhead than just calling the action's handler directly.

See best practices for more.
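To illustrate the batching alternative above, here is a hypothetical client-side sketch: instead of enqueueing one mutation per item, collect items for a short window and submit them in a single call. The `Batcher` class and its `flush` callback are assumptions for illustration; in a real app, `flush` would be one Convex mutation that processes the whole batch in a single transaction.

```typescript
// Collect items for `windowMs`, then hand them to `flush` as one batch.
// One flush per window replaces N individual mutation calls.
class Batcher<T> {
  private pending: T[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private flush: (batch: T[]) => Promise<void>,
    private windowMs: number
  ) {}

  add(item: T): void {
    this.pending.push(item);
    // Start the flush timer on the first item of a new batch.
    if (this.timer === null) {
      this.timer = setTimeout(() => {
        const batch = this.pending;
        this.pending = [];
        this.timer = null;
        void this.flush(batch);
      }, this.windowMs);
    }
  }
}
```

Beyond cutting function-call overhead, a batched mutation that touches a contended document (like the counter above) conflicts with itself once per batch instead of once per item.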