Over 5 years ago I wrote about overhauling a JS parsing library by rewriting it in Rust and compiling to Wasm.1 I smugly implied that this would go unnoticed by users:

By inlining as base64 we make the Wasm an implementation detail

This was the wrong mentality 5 years ago, and it’s the wrong mentality today. It took me a couple years to fix the recommendation2, but I still see new tooling built that makes the same mistakes as I did3.

Why isn’t Wasm an implementation detail, and what can we do about it?

Compatibility

Over 99% of browser users have at least some support for Wasm, according to caniuse4.

That percentage drops to 97% if that Wasm uses fixed-width SIMD instructions.

And drops precipitously to 81% when using relaxed SIMD.

These are all features that are standardized as part of Wasm 3.0.

Ideally, one could ship a single Wasm bundle and have the fastest code path selected at runtime, but Wasm compilation fails if the module contains any unsupported opcodes. The best we can do is cater to the lowest common denominator or ship multiple Wasm bundles with our own JS-powered Wasm feature detection.5
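Such detection works by asking the engine to validate the smallest possible module that uses the feature in question. A minimal sketch, borrowing the approach (and the fixed-width SIMD test module bytes) popularized by the wasm-feature-detect library:

```javascript
// A tiny module whose body uses a fixed-width SIMD opcode. If the engine
// rejects it, a full-size build using SIMD won't compile either.
// Byte sequence borrowed from the wasm-feature-detect project.
const simdTestModule = new Uint8Array([
  0, 97, 115, 109, 1, 0, 0, 0, 1, 5, 1, 96, 0, 1, 123, 3,
  2, 1, 0, 10, 10, 1, 8, 0, 65, 0, 253, 15, 253, 98, 11,
]);

// WebAssembly.validate never throws on malformed bytes; it returns false,
// making it a cheap, synchronous feature probe.
const hasSimd = WebAssembly.validate(simdTestModule);
```

The same trick extends to any post-MVP feature: craft the smallest module exercising the opcode and let `WebAssembly.validate` report whether the current engine accepts it.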

Not Just Browsers

It’s 2025 and people have been running JS outside of the browser for more than a decade (AWS Lambda launched in 2014 with Node.js support). Full-featured runtimes like Node.js, Deno, and Bun have great support for Wasm, but they don’t capture the entire market.

There are runtimes like AWS’s LLRT (built on QuickJS) that do not support Wasm and likely never will.6

Then there are platforms that allow the execution of Wasm, but not the compilation. Cloudflare Workers is the prototypical example, which disables compilation for security reasons.7 Any attempt will be met with a “Wasm code generation disallowed by embedder” error.

To execute Wasm on Cloudflare Workers, one simply imports a .wasm file directly. The runtime will hand you back an already compiled WebAssembly.Module. So the library needs to expose a way for users to inject an already compiled module.

import { MyLibrary } from "my-library";
import myLibraryModule from "my-library/lib.wasm";

// Synchronous initialization!
const app = MyLibrary.initSync({ wasm: myLibraryModule });

This prohibition on Wasm compilation does not seem to be well known, as evidenced by my chat with Gemini, where I even push back on its answers:

[Nick]: I have an arraybuffer in cloudflare workers. Will I be able to compile this as a WebAssembly module?

[Gemini]: Yes, you will be able to compile […]

[Nick]: Are you sure? I distinctly remember cloudflare workers forbidding wasm compilation in the worker itself

[Gemini]: You’re touching on a point that was historically true, but the situation has since changed! 🕰️

[Gemini]: Yes, I’m sure that you can now compile WebAssembly modules within a Cloudflare Worker using WebAssembly.compile() or WebAssembly.instantiate()

ChatGPT fared no better:

from the platform side, you’re good: Workers can compile and run WebAssembly modules from an ArrayBuffer or Uint8Array

These are false and misleading statements! Unequivocally, one cannot compile Wasm modules from an ArrayBuffer on Cloudflare Workers.

Performance

Inlining the Wasm in the JS bundle as a base64 string does not come free:

  • Base64 encoded Wasm can’t take advantage of code caching8
  • Additional bandwidth for transmitting the base64 data. This will vary depending on the workload, but I’ve observed 50% increases in bundle size after compression.
  • Additional compute to parse the base64 string and decode it
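To make that decode cost concrete, here is roughly what an inlined build must do on every load before compilation can even begin (a sketch; the base64 payload here is just the 8-byte Wasm header for illustration):

```javascript
// Inlined builds embed the module as a base64 string in the JS bundle...
const wasmBase64 = "AGFzbQEAAAA="; // "\0asm" magic + version 1, for illustration

// ...and must decode it back into bytes at runtime, on every load.
// This work repeats each time, because base64 payloads also miss the
// engine's compiled-code cache.
const wasmBytes = Uint8Array.from(atob(wasmBase64), (c) => c.charCodeAt(0));
// `wasmBytes` would then be handed to WebAssembly.instantiate()
```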

Solution

Ship the Wasm file in the NPM package and expose it through the package.json exports field:

{
  "exports": {
    "./lib.wasm": "./dist/lib.wasm"
  }
}

For those coming from a Rust and wasm-pack background, where a package.json is autogenerated for you: do not publish that generated package, and prioritize migrating off wasm-pack.9

We’ll have users write code suited to their bundler of choice. For Vite targeting a browser, I’d expose an asynchronous init function on my library, where users pass in a bundler-digested asset URL for the Wasm:

import { MyLibrary } from "my-library";
import wasmUrl from "my-library/lib.wasm?url";

const myLibTask = MyLibrary.init({ module_or_path: wasmUrl });

async function someFn() {
    const myLib = await myLibTask;
}

Also expose synchronous initialization for users who are working with already compiled Wasm modules, like on Cloudflare Workers:

import { MyLibrary } from "my-library";
import wasmModule from "my-library/lib.wasm";

const myLib = MyLibrary.initSync(wasmModule);

// function initSync(module: WebAssembly.Module): MyLibrary
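Internally, the library can funnel both entry points into one place. A minimal sketch of what the pair might look like (hypothetical internals: `module_or_path` mirrors the option name used earlier, and a real library would wrap the instance’s exports rather than return it raw):

```javascript
// Synchronous path: the caller already holds a compiled WebAssembly.Module,
// as on Cloudflare Workers where .wasm imports arrive precompiled.
// Instantiation (unlike compilation) is permitted there.
function initSync(module) {
  return new WebAssembly.Instance(module, { /* imports */ });
}

// Asynchronous path: accept either a compiled module or a URL to fetch.
async function init({ module_or_path }) {
  if (module_or_path instanceof WebAssembly.Module) {
    return initSync(module_or_path);
  }
  // Streaming compilation: compile while the bytes are still downloading.
  const response = await fetch(module_or_path);
  const { instance } = await WebAssembly.instantiateStreaming(response, {
    /* imports */
  });
  return instance;
}
```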

Remember how Wasm doesn’t have feature detection? As the Wasm is compiled with different flags, each bundle needs to be exposed separately. Let’s say we have a relaxed SIMD enabled build and one without. We should export both in the package:

{
  "exports": {
    "./lib-simd.wasm": "./dist/lib-simd.wasm",
    "./lib.wasm": "./dist/lib.wasm"
  }
}

And give the user the option of deciding what to pass in. They could pass in both build URLs and have our library do feature detection to know which one to load.

import { MyLibrary } from "my-library";
import wasmUrl from "my-library/lib.wasm?url";
import wasmSimdUrl from "my-library/lib-simd.wasm?url";

const myLibTask = MyLibrary.init({
  wasm: {
    sisd: wasmUrl,
    simd: wasmSimdUrl
  }
});
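Inside init, resolving that option to a single URL is just a lookup keyed off the detection result. A sketch of hypothetical internals, assuming a `supportsSimd` boolean obtained from feature detection (e.g. `WebAssembly.validate` on a tiny SIMD-using module):

```javascript
// Hypothetical internals: resolve the user's `wasm` option to one URL.
// Accepts either a plain URL string or a { sisd, simd } pair.
function pickWasmUrl(wasm, supportsSimd) {
  if (typeof wasm === "string") {
    return wasm; // the user committed to a single build
  }
  // Prefer the SIMD build when the engine validated the opcodes,
  // otherwise fall back to the plain (SISD) build.
  return supportsSimd ? wasm.simd : wasm.sisd;
}
```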

Or when the user can assume SIMD support (or knows it’s unavailable), passing in a single Wasm URL should work too:

import { MyLibrary } from "my-library";
import wasmSimdUrl from "my-library/lib-simd.wasm?url";

// I'm only targeting Chrome 114+, so we can assume relaxed simd
const myLibTask = MyLibrary.init({
  wasm: wasmSimdUrl,
});

This solution covers the widest number of platforms with the best performance. It should be your go-to solution.

Only if it is extremely beneficial should one offer a build that contains the Wasm inlined. Ideally, this build should be described with a word that has a negative connotation, so that it’s clear it should not be the default:

{
  "exports": {
    "./fat-inlined": "./dist/fat-inlined.js"
  }
}

What about ergonomics?

The solution receives inevitable pushback about how DX suffers:

  • Shifting the responsibility of wiring the Wasm to users
  • Taking what may have been a synchronous API for initialization and turning it async

Both complaints are true and valid. Wasm is inherently more difficult to consume than JS and can taint a perfectly synchronous API with async.

For the wiring complaint: in the solution you saw how easy it was for a user to wire up the Wasm. No plugins needed; Vite handles it right out of the box. Best case scenario: users write a single line of code to handle the Wasm import. Worst case scenario: users need to scour the docs for how to achieve the same behavior with their specific bundler.

The second complaint is far more insidious as there are “workarounds”, where one gets a little too clever for their own good.

One workaround is to fold the synchronous functions into the async initialization, so everything becomes async. But the result is that users will be surprised to find they are running potentially heavy compute on the main thread instead of it being offloaded to a background thread.
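The surprise is easy to demonstrate: marking a function `async` changes its signature, not where its work runs. In this sketch (with stand-ins for the Wasm init and a compute-heavy export), the work still executes on the caller’s thread:

```javascript
// Stand-in for a compute-heavy synchronous Wasm export.
let ranOnCallerThread = false;
function heavyWasmParse(input) {
  ranOnCallerThread = true; // proof this ran right here, not in a worker
  return input.length;
}

// Stand-in for asynchronous Wasm initialization.
const initPromise = Promise.resolve();

// The "workaround": an async wrapper around a synchronous function.
async function parse(input) {
  await initPromise;            // genuinely async: wait for initialization
  return heavyWasmParse(input); // still blocks the caller's thread
}
```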

The other solution is to put the Wasm initialization behind a top level await. On the surface this appears quite nice:

import { MyLibrary } from "my-library";

const lib = new MyLibrary();

And the implementation could use something like vite-plugin-wasm, or structure it manually:

import { MyLibraryImpl } from "./lib";
import someWasm from "./lib.wasm";

await MyLibraryImpl.init({ module_or_path: someWasm });

// ...
export class MyLibrary {

}

I’ve documented before how top-level await can wreak havoc on developer intuition10, as it has the side effect of halting module graph evaluation, which can lead to race conditions.

Still, there is resistance:11

JS devs assume APIs are synchronous unless they deal with I/O in some way, and it’s truly rare to have async initialization of synchronous APIs. There’s a lot of muscle memory there to overcome, and I’m trying to avoid the cognitive dissonance.

This argument doesn’t hold water. Papering over the inherently async nature of Wasm in the name of DX is a noble endeavor, but the paper is too thin.

Perhaps it is presumptuous, but I’d wager most developers would be surprised by an import statement making a network call and blocking subsequent statements.

import { abc } from "./my-lib.ts";

console.log("I'm blocked on the network call!");

And given the option to remove the side effect in favor of async initialization, I don’t think it’d even be a contest. I expect imports to be synchronous and so do you. The top-level await is masking an I/O operation and I think we’re in agreement that I/O should be behind an asynchronous interface.

It’s a bummer that Wasm will always have a worse developer experience than something written in pure JS, but that is a price worth paying.

Any improvements on the ergonomics of initialization and bundling of Wasm in JS libraries will inevitably lead to performance, compatibility, and intuition degradation.