Update: Full validation results + Bun runtime benchmarks
Since the initial post, this spike has gone through rigorous validation and expanded into a Bun-native runtime path. Here's everything we found.
What changed since v1
The ESM loader has been hardened:
- Real per-package Assets: getTextAsync, getBinaryAsync, and absoluteFilePath read actual files from the bundle (not stubs), with a per-package asset context matching legacy boot.js behavior.
- Native source maps: process.setSourceMapsEnabled(true) (Node 20+) replaces the legacy source-map-support monkey-patch.
- Input validation: --format=invalid is rejected with a clear error message.
- Documented limitations: METEOR_INSPECT_BRK and METEOR_PARENT_PID are not supported (use standard Node flags instead).
Compatibility validation
Tested on a meteor create --full app with accounts-password + email added. 74 packages in load order, 14 distinct Npm.require patterns (including subpaths like nodemailer/lib/mail-composer, mongodb/package.json, native addons).
| Test | Legacy | ESM | Match |
| --- | --- | --- | --- |
| HTTP 200 | pass | pass | yes |
| HTML boilerplate size | 1713 bytes | 1713 bytes | yes |
| Client JS bundle hash | identical | identical | yes |
| Static JS/CSS served | 200 | 200 | yes |
| DDP connect | pass | pass | yes |
| DDP method call | pass | pass | yes |
| DDP subscription | pass | pass | yes |
| Assets.getTextAsync | returns content | returns content | yes |
| Mongo insertAsync | pass | pass | yes |
| Mongo find + fetchAsync | pass | pass | yes |
| Publication | 2 docs received | 2 docs received | yes |
| Soak 1 min, 20 clients | 22.8% timeout | 23.1% timeout | yes |
The soak test error rates are statistically indistinguishable: the timeouts come from the test harness (a 5 s timeout on Mongo ops under load), not from Meteor. Both formats behave the same under sustained load.
Performance: Node legacy vs Node ESM (30 runs, trimmed mean)
App: meteor create --full + accounts-password + email.
Machine: ThinkPad P52, Linux 6.8.0, Node 22.22.0.
| Metric | Legacy | ESM | Delta |
| --- | --- | --- | --- |
| Cold start | 1,017 ms | 1,041 ms | neutral |
| HTTP boilerplate | 860 req/sec | 725 req/sec | neutral (high variance) |
| DDP mean latency | 0.38 ms | 0.44 ms | neutral (high variance) |
| DDP throughput | 2,640/sec | 2,332/sec | neutral (high variance) |
| RSS memory | 252 MB | 229 MB | -9% (stable) |
Honest assessment: on Node alone, the ESM format is performance-neutral. No significant gains, no significant regressions. HTTP and DDP numbers oscillate between runs (machine load, GC timing). The only stable result across the 10-, 20-, and 30-run campaigns is memory: ESM consistently uses ~7-9% less RSS.
The value of the ESM format on Node is architectural, not performance:
- Boot chain: 1 file (9 KB) replacing 8 files (52 KB)
- No vm.runInThisContext, no Reify runtime, no Module.prototype patching
- Standard import() instead of a hand-rolled module loader
Where it gets interesting: Bun runtime
The ESM format is the foundation for a Bun-native host (branch 3). This is where the real gains appear.
Architecture:
```
Bun.serve(:PORT)
├── Static files → Bun.file() (zero-copy sendfile)
├── Boilerplate  → WebAppInternals.getBoilerplate() direct
├── WebSocket    → BunSocket adapter → StreamServer
└── Other routes → Express via Unix socket (transitional)
```
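The routing in that tree can be sketched as a pure decision function; the route names and the static-asset heuristic here are illustrative, not the actual host code:

```javascript
// Decide which branch of the host handles a request.
function routeRequest(pathname, { isUpgrade = false } = {}) {
  if (isUpgrade) return "bunsocket";          // WebSocket upgrade → BunSocket adapter
  if (pathname === "/") return "boilerplate"; // direct getBoilerplate()
  if (/\.(js|css|map|woff2?|png|svg)$/.test(pathname)) return "static"; // Bun.file()
  return "express";                           // everything else → Express via Unix socket
}
```

Keeping the decision pure makes it trivial to unit-test, while the transitional Express branch shrinks as more routes move to the native host.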
All 4 templates tested (--bare, --minimal, --blaze, --full): HTTP, DDP, MongoDB all work on Bun, with zero Node processes involved.
Bun benchmarks (vs Node legacy)
| Metric | Node legacy | Bun ESM | Delta |
| --- | --- | --- | --- |
| HTTP boilerplate | 884 req/sec | 2,146 req/sec | +143% |
| HTTP static JS (800 KB) | 732 req/sec | 3,419 req/sec | +367% |
| HTTP static CSS (1 KB) | 1,556 req/sec | 17,304 req/sec | +1012% |
| DDP roundtrip mean | 0.49 ms | 0.13 ms | 3.8x faster |
| DDP roundtrip P95 | 0.82 ms | 0.19 ms | 4.3x faster |
| DDP sequential | 2,062/sec | 7,735/sec | 3.75x |
| RSS memory | 305 MB | 191 MB | -37% |
| Cold start | 1,005 ms | 691 ms | -31% |
Realistic workload (multi-client, mixed ops)
| Scenario | Clients x Ops | Node legacy | Bun ESM | Delta |
| --- | --- | --- | --- | --- |
| Small team | 10 x 20 | 3,180/sec | 6,285/sec | +98% |
| Typical SaaS | 50 x 10 | 3,843/sec | 14,832/sec | +286% |
| Busy dashboard | 100 x 5 | 4,754/sec | 12,893/sec | +171% |
| Traffic spike | 200 x 2 | 9,420/sec | 23,714/sec | +152% |
Stability
5-minute soak test, 20 clients, mixed operations (methods + subscriptions + pings):
- 108,922 ops, 0 errors, 0 reconnects
- Throughput constant: 362-364 ops/sec throughout
- RSS stable at 179 MB
- 20/20 clients active at end
What this means
The ESM format itself is a small, low-risk, opt-in change (6 files, ~500 lines, flag-gated). On Node it's performance-neutral with slightly less memory usage.
But it's also the prerequisite for a Bun runtime path that delivers 2-4x improvements on the metrics that matter most for Meteor apps: DDP latency, WebSocket throughput, memory footprint, and static asset serving.
The Bun host is 201 lines of code on top of the ESM format. The gains come from Bun's native WebSocket (vs SockJS), Bun.file() zero-copy serving (vs the Express send module), and direct boilerplate generation (vs the Express middleware stack).
Code
Three branches, each building on the previous one:
- feature/esm-bundle-format: meteor build --format=esm, ESM loader, tests, Assets support. Node only.
- feature/esm-bun-compat: bootPackages/runMain split so the same ESM bundle boots on both Node and Bun.
- feature/bun-only-host: Bun.serve() host, Bun.file() static serving, BunSocket DDP, benchmarks, soak test.
Detailed benchmark results: bun-only/bench/RESULTS.md
Happy to answer questions or share more details about any of the findings.