
Migrate from fal

If you’re already using the @fal-ai/client library, you can route requests through OpenPixels with a one-line change. Your existing code, model names, and request/response formats all stay the same.

1. Get an API key

Generate your API key here.

2. Add request middleware

```typescript
import { createFalClient } from "@fal-ai/client"

const client = createFalClient({
  credentials: "sk-op-your-openpixels-key",
  requestMiddleware: (req) => ({
    ...req,
    url: req.url
      // Replace the queue host first, so the plain fal.run rule
      // can't match inside queue.fal.run and clobber it.
      .replace("queue.fal.run", "queue-fal.openpixels.ai")
      .replace("fal.run", "fal.openpixels.ai"),
  }),
})
```
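As a quick sanity check, the rewrite can be exercised in isolation. Note that replacing the queue host first matters: `"queue.fal.run"` contains `"fal.run"`, so the plain rule would otherwise fire first and leave the queue rule with nothing to match. The sample paths below are illustrative:

```typescript
// Standalone sketch of the middleware's URL rewrite (sample paths are made up).
const rewrite = (url: string): string =>
  url
    .replace("queue.fal.run", "queue-fal.openpixels.ai") // queue host first
    .replace("fal.run", "fal.openpixels.ai")

console.log(rewrite("https://fal.run/fal-ai/flux/dev"))
// → https://fal.openpixels.ai/fal-ai/flux/dev
console.log(rewrite("https://queue.fal.run/fal-ai/flux/dev"))
// → https://queue-fal.openpixels.ai/fal-ai/flux/dev
```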

That’s it. Everything else works as before:

```typescript
const result = await client.run("fal-ai/flux/dev", {
  input: {
    prompt: "a cat",
    image_size: { width: 1024, height: 1024 },
  },
})
console.log(result.data.images[0].url)
```

What happens under the hood

OpenPixels accepts the same endpoints, request format, and response format as fal. When you call fal-ai/flux/dev, OpenPixels routes the request to the cheapest available provider (which may be fal, or may be a different provider that serves the same model faster or cheaper). If one provider is down, the request automatically falls through to the next one.
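The fallthrough behavior can be pictured as a loop over providers ordered by price. This is only an illustrative sketch, not OpenPixels' actual implementation; the `Provider` shape, prices, and `run` signature are all assumptions:

```typescript
// Illustrative sketch of price-ordered routing with automatic fallthrough.
// Provider names, prices, and the run() signature are hypothetical.
type Provider = { name: string; pricePerImage: number; run: (model: string) => string }

function route(providers: Provider[], model: string): string {
  // Try the cheapest provider first.
  const byPrice = [...providers].sort((a, b) => a.pricePerImage - b.pricePerImage)
  for (const p of byPrice) {
    try {
      return p.run(model)
    } catch {
      // Provider is down or errored; fall through to the next cheapest.
    }
  }
  throw new Error(`no provider available for ${model}`)
}
```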

What’s supported

  • client.run() (sync)
  • client.subscribe() (queue-based polling)
  • client.queue.submit() / client.queue.status() / client.queue.result() (manual queue)
  • All models available on fal that OpenPixels supports
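The manual queue flow above boils down to submit, poll status, fetch result. A hedged sketch of that loop, with an injected `QueueOps` object standing in for `client.queue.submit` / `status` / `result` (the shape and status names here are assumptions, not the library's exact API):

```typescript
// Sketch of the manual queue flow. QueueOps is a stand-in for
// client.queue.submit/status/result; status strings are assumptions.
type QueueOps<T> = {
  submit: (model: string) => string // returns a request id
  status: (id: string) => "IN_QUEUE" | "IN_PROGRESS" | "COMPLETED"
  result: (id: string) => T
}

async function runViaQueue<T>(q: QueueOps<T>, model: string, pollMs = 500): Promise<T> {
  const id = q.submit(model)
  // Poll until the request completes, sleeping between checks.
  while (q.status(id) !== "COMPLETED") {
    await new Promise((r) => setTimeout(r, pollMs))
  }
  return q.result(id)
}
```

In practice `client.subscribe()` wraps this polling for you; the manual queue methods are useful when you want to persist the request id and check back later.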

Bring your own fal key

If you have free credits or volume discounts on fal, you can add your fal key in provider settings. OpenPixels will use your key when routing to fal, and our key for other providers.