Using Web Workers for In-Browser Compilation
A technical guide to offloading TypeScript compilation and other heavy processing to Web Workers, keeping the main thread responsive while maintaining a seamless developer experience.
Compiling TypeScript in the browser is computationally expensive. Even a small file with a few imports can take 50-200 milliseconds to parse, type-check, and emit. During that time, the main thread is blocked: the editor stops responding to keystrokes, animations freeze, and the entire interface feels sluggish. Web Workers solve this problem by moving compilation to a background thread, but the solution introduces its own complexity around communication, data serialization, and error handling. This article details how Kodepad uses Web Workers for TypeScript compilation and other heavy processing tasks, covering worker architecture, message protocols, transfer optimizations, and practical patterns for keeping the main thread free.
Worker Architecture Overview
Kodepad uses a dedicated worker for TypeScript compilation. Unlike shared workers or service workers, dedicated workers have a one-to-one relationship with the main thread, which simplifies the communication model. The worker is created once when the application starts and reused for all compilation requests:
// src/workers/compiler-worker.ts
// This file runs in the Web Worker context
import ts from "typescript";
interface CompileRequest {
type: "compile";
id: string;
source: string;
filename: string;
options: ts.CompilerOptions;
}
interface CompileResponse {
type: "compile-result";
id: string;
code: string;
diagnostics: SerializedDiagnostic[];
duration: number;
}
interface SerializedDiagnostic {
messageText: string;
line: number;
column: number;
category: number;
}
// Handle messages from the main thread
self.addEventListener("message", (event: MessageEvent<CompileRequest>) => {
const { type, id, source, filename, options } = event.data;
if (type === "compile") {
const start = performance.now();
const result = compile(source, filename, options);
const response: CompileResponse = {
type: "compile-result",
id,
code: result.code,
diagnostics: result.diagnostics,
duration: performance.now() - start,
};
self.postMessage(response);
}
});
function compile(
source: string,
filename: string,
options: ts.CompilerOptions
): { code: string; diagnostics: SerializedDiagnostic[] } {
let outputCode = "";
const diagnostics: SerializedDiagnostic[] = [];
const host = createInMemoryHost(source, filename);
host.writeFile = (name, text) => {
if (name.endsWith(".js")) outputCode = text;
};
const program = ts.createProgram([filename], options, host);
program.emit();
for (const d of ts.getPreEmitDiagnostics(program)) {
if (d.file && d.start !== undefined) {
const pos = d.file.getLineAndCharacterOfPosition(d.start);
diagnostics.push({
messageText: ts.flattenDiagnosticMessageText(d.messageText, "\n"),
line: pos.line + 1,
column: pos.character + 1,
category: d.category,
});
}
}
return { code: outputCode, diagnostics };
}
function createInMemoryHost(
source: string,
filename: string
): ts.CompilerHost {
const sourceFile = ts.createSourceFile(
filename,
source,
ts.ScriptTarget.ES2020,
true
);
return {
// Lib typings are omitted for brevity; a complete host would also serve lib.d.ts
getSourceFile: (name) => (name === filename ? sourceFile : undefined),
getDefaultLibFileName: () => "lib.d.ts",
writeFile: () => {},
getCurrentDirectory: () => "/",
getCanonicalFileName: (f) => f,
useCaseSensitiveFileNames: () => true,
getNewLine: () => "\n",
fileExists: (name) => name === filename,
readFile: (name) => (name === filename ? source : undefined),
};
}

The worker imports the TypeScript compiler at load time. This means the first message takes slightly longer as the worker initializes, but subsequent compilations start immediately. The TypeScript compiler is approximately 10MB uncompressed, so loading it in a worker avoids blocking the main thread during initialization.
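Because the compiler loads when the worker script is first evaluated, the main thread has no built-in way to know when the worker is ready to accept work. A common pattern is a readiness handshake: the worker posts a "ready" message after initialization, and the client holds calls behind a promise until then. The sketch below shows the deferred-promise gate at the core of that pattern; the names (`createGate`, `ReadyGatedClient`, `markReady`) are illustrative, not Kodepad's actual API.

```typescript
// A minimal deferred "gate": callers await `opened` until open() is called.
interface Gate {
  opened: Promise<void>;
  open: () => void;
}

function createGate(): Gate {
  let open!: () => void;
  // The executor runs synchronously, so `open` is assigned before we return
  const opened = new Promise<void>((resolve) => {
    open = resolve;
  });
  return { opened, open };
}

// Sketch of usage in a worker client: defer calls until the worker signals ready.
class ReadyGatedClient {
  private gate = createGate();

  // Called from the worker's "message" handler on a { type: "ready" } message
  markReady(): void {
    this.gate.open();
  }

  async whenReady<T>(fn: () => T): Promise<T> {
    await this.gate.opened; // resolves immediately once the gate is open
    return fn();
  }
}
```

The same gate can wrap `compile` calls so early keystrokes queue instead of failing while the worker is still loading the compiler.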
The Main Thread Client
The main thread communicates with the worker through a typed client that wraps the raw postMessage API in a promise-based interface:
// src/workers/compiler-client.ts
class CompilerClient {
private worker: Worker;
private pending: Map<string, {
resolve: (result: CompileResult) => void;
reject: (error: Error) => void;
timeout: ReturnType<typeof setTimeout>;
}> = new Map();
constructor() {
this.worker = new Worker(
new URL("./compiler-worker.ts", import.meta.url),
{ type: "module" }
);
this.worker.addEventListener("message", this.handleMessage.bind(this));
this.worker.addEventListener("error", this.handleError.bind(this));
}
compile(
source: string,
filename: string = "main.ts",
options: ts.CompilerOptions = {}
): Promise<CompileResult> {
return new Promise((resolve, reject) => {
const id = crypto.randomUUID();
const timeout = setTimeout(() => {
this.pending.delete(id);
reject(new Error("Compilation timed out after 10 seconds"));
}, 10_000);
this.pending.set(id, { resolve, reject, timeout });
this.worker.postMessage({
type: "compile",
id,
source,
filename,
options,
});
});
}
private handleMessage(event: MessageEvent): void {
const { id, code, diagnostics, duration } = event.data;
const pending = this.pending.get(id);
if (pending) {
clearTimeout(pending.timeout);
this.pending.delete(id);
pending.resolve({ code, diagnostics, duration });
}
}
private handleError(event: ErrorEvent): void {
// Worker-level error: reject all pending compilations
for (const [, pending] of this.pending) {
clearTimeout(pending.timeout);
pending.reject(new Error(`Worker error: ${event.message}`));
}
this.pending.clear();
}
dispose(): void {
for (const [, pending] of this.pending) {
clearTimeout(pending.timeout);
pending.reject(new Error("Compiler disposed"));
}
this.pending.clear();
this.worker.terminate();
}
}
interface CompileResult {
code: string;
diagnostics: SerializedDiagnostic[];
duration: number;
}

The promise-based wrapper provides several benefits over raw message passing. It enables async/await usage in the calling code, it handles timeouts for compilations that hang, and it properly propagates worker-level errors to all pending callers.
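The heart of the wrapper is the pending-request map keyed by request id, and the same shape works for any request/response channel, not just workers. The standalone sketch below isolates that pattern; `PendingRegistry` is a hypothetical name, and the timeout mirrors the 10-second guard used above.

```typescript
// A standalone sketch of the pending-request pattern used by CompilerClient.
class PendingRegistry<T> {
  private pending = new Map<string, {
    resolve: (value: T) => void;
    reject: (error: Error) => void;
    timer: ReturnType<typeof setTimeout>;
  }>();

  // Register a request id; the promise settles via settle()/failAll() or times out.
  register(id: string, timeoutMs: number): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      const timer = setTimeout(() => {
        this.pending.delete(id);
        reject(new Error(`Request ${id} timed out`));
      }, timeoutMs);
      this.pending.set(id, { resolve, reject, timer });
    });
  }

  // Resolve one request by id; returns false if it already settled or timed out.
  settle(id: string, value: T): boolean {
    const entry = this.pending.get(id);
    if (!entry) return false;
    clearTimeout(entry.timer);
    this.pending.delete(id);
    entry.resolve(value);
    return true;
  }

  // Reject everything, e.g. on a worker-level error or disposal.
  failAll(error: Error): void {
    for (const entry of this.pending.values()) {
      clearTimeout(entry.timer);
      entry.reject(error);
    }
    this.pending.clear();
  }
}
```

`settle` returning a boolean makes late or duplicate responses harmless: a response for an id that already timed out is simply ignored.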
Request Cancellation and Deduplication
When the user types quickly, compilation requests pile up. Compiling every intermediate state wastes CPU time in the worker. Kodepad implements request cancellation so that only the most recent compilation request is processed:
class CancelableCompilerClient {
private client: CompilerClient;
private currentRequestId: string | null = null;
constructor(client: CompilerClient) {
this.client = client;
}
async compileLatest(
source: string,
filename?: string
): Promise<CompileResult | null> {
const requestId = crypto.randomUUID();
this.currentRequestId = requestId;
const result = await this.client.compile(source, filename);
// If a newer request has been made, discard this result
if (this.currentRequestId !== requestId) {
return null;
}
return result;
}
}
// Usage in the preview pipeline
class PreviewController {
private compiler: CancelableCompilerClient;
constructor(compiler: CancelableCompilerClient) {
this.compiler = compiler;
}
async onEditorChange(source: string): Promise<void> {
let result: CompileResult | null;
try {
result = await this.compiler.compileLatest(source);
} catch (error) {
// Compilation timed out or the worker failed; leave the preview as-is
console.error("Compilation failed:", error);
return;
}
if (result === null) {
// A newer compilation superseded this one
return;
}
if (result.diagnostics.length === 0) {
this.updatePreview(result.code);
}
this.updateDiagnostics(result.diagnostics);
}
private updatePreview(code: string): void {
// Send compiled code to the sandbox iframe
}
private updateDiagnostics(diagnostics: SerializedDiagnostic[]): void {
// Update editor markers
}
}

This pattern ensures that even if the worker processes requests sequentially, the main thread only acts on the latest result. Stale results are silently discarded.
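Cancellation discards stale work after the fact; a complementary tactic is to not start a compilation on every keystroke in the first place. A trailing-edge debounce delays the compile until the user pauses typing. This is a generic sketch, and the 300ms delay in the usage comment is illustrative, not a value taken from Kodepad.

```typescript
// A trailing-edge debounce: the wrapped function runs only after `delayMs`
// of quiet; earlier pending calls are replaced by the latest one.
function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  delayMs: number
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Usage sketch: wire the editor's change event through the debounce.
// const onChange = debounce((src: string) => controller.onEditorChange(src), 300);
```

Debouncing and cancellation compose well: the debounce cuts the number of requests sent to the worker, and the latest-wins check handles any that still overlap.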
Transferable Objects for Large Payloads
postMessage serializes data using the structured clone algorithm, which copies the data between threads. For large payloads, this copying overhead is significant. Transferable objects, such as ArrayBuffer, can be transferred without copying, moving ownership from one thread to another in constant time:
// Using transferable objects for large compilation outputs
// Worker side
function sendLargeResult(code: string, id: string): void {
const encoder = new TextEncoder();
const encoded = encoder.encode(code);
const buffer = encoded.buffer;
self.postMessage(
{
type: "compile-result",
id,
codeBuffer: buffer,
codeLength: encoded.length,
},
{ transfer: [buffer] }
);
// After transfer, buffer is no longer accessible in the worker
}
// Main thread side
function receiveLargeResult(data: {
codeBuffer: ArrayBuffer;
codeLength: number;
}): string {
const decoder = new TextDecoder();
const view = new Uint8Array(data.codeBuffer, 0, data.codeLength);
return decoder.decode(view);
}

In practice, TypeScript compilation output for playground-sized files is small enough that structured cloning is not a bottleneck. But if you extend the worker to handle tasks like code formatting, bundling, or source map processing, transferable objects can make a measurable difference.
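Transfer semantics can be observed without spinning up a worker: `structuredClone` (available in modern browsers and Node 17+) accepts the same `transfer` option as `postMessage`, and after the call the source ArrayBuffer is detached, reporting a byteLength of 0. The helper name below is illustrative.

```typescript
// Demonstrates transfer semantics using structuredClone (no worker needed).
// After transfer, the source ArrayBuffer is detached: its byteLength drops to 0.
function roundTripWithTransfer(text: string): {
  decoded: string;
  sourceDetached: boolean;
} {
  const encoded = new TextEncoder().encode(text);
  const buffer = encoded.buffer as ArrayBuffer;
  // `transfer` moves the buffer instead of copying it, exactly like
  // postMessage(message, { transfer: [...] }) between threads.
  const clone = structuredClone({ payload: buffer }, { transfer: [buffer] });
  const decoded = new TextDecoder().decode(new Uint8Array(clone.payload));
  return { decoded, sourceDetached: buffer.byteLength === 0 };
}
```

The detached-buffer behavior is also why the worker-side comment above warns that the buffer is no longer accessible after posting: any later read through the original typed array sees an empty, zero-length buffer.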
Worker Pool for Parallel Tasks
A single compilation worker is sufficient for Kodepad's primary use case, but the platform also runs linting, formatting, and type declaration loading. A worker pool distributes these tasks across multiple background threads:
class WorkerPool<TRequest, TResponse> {
private workers: Worker[] = [];
private queue: Array<{
request: TRequest;
resolve: (response: TResponse) => void;
reject: (error: Error) => void;
}> = [];
private available: Set<number> = new Set();
constructor(
workerUrl: URL,
poolSize: number = navigator.hardwareConcurrency || 4
) {
// Cap pool size to avoid overwhelming low-end devices
const size = Math.min(poolSize, 4);
for (let i = 0; i < size; i++) {
const worker = new Worker(workerUrl, { type: "module" });
worker.addEventListener("message", (event) => {
this.onWorkerDone(i, event.data);
});
this.workers.push(worker);
this.available.add(i);
}
}
submit(request: TRequest): Promise<TResponse> {
return new Promise((resolve, reject) => {
const task = { request, resolve, reject };
const workerId = this.getAvailableWorker();
if (workerId !== null) {
this.dispatch(workerId, task);
} else {
this.queue.push(task);
}
});
}
private getAvailableWorker(): number | null {
const first = this.available.values().next();
return first.done ? null : first.value;
}
// Track each worker's in-flight task instead of stashing it on the Worker object
private currentTasks = new Map<number, {
request: TRequest;
resolve: (r: TResponse) => void;
reject: (e: Error) => void;
}>();
private dispatch(
workerId: number,
task: { request: TRequest; resolve: (r: TResponse) => void; reject: (e: Error) => void }
): void {
this.available.delete(workerId);
this.currentTasks.set(workerId, task);
this.workers[workerId].postMessage(task.request);
}
private onWorkerDone(workerId: number, response: TResponse): void {
const task = this.currentTasks.get(workerId);
if (task) {
task.resolve(response);
this.currentTasks.delete(workerId);
}
// Process next queued task or mark worker as available
if (this.queue.length > 0) {
const next = this.queue.shift()!;
this.dispatch(workerId, next);
} else {
this.available.add(workerId);
}
}
dispose(): void {
this.workers.forEach((w) => w.terminate());
this.queue.forEach((t) => t.reject(new Error("Pool disposed")));
this.queue = [];
}
}

The pool size defaults to navigator.hardwareConcurrency but is capped at 4 to avoid creating too many threads on machines with many cores, where the overhead of thread management outweighs the parallelism benefit for our workload.
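The sizing rule is easy to pull out into a pure, unit-testable function. This sketch uses a hypothetical `choosePoolSize` name; the fallback and cap of 4 mirror the constructor above.

```typescript
// The pool-sizing rule as a pure function: fall back to 4 when
// hardwareConcurrency is unavailable, then cap to avoid oversubscription.
function choosePoolSize(hardwareConcurrency?: number, cap: number = 4): number {
  const requested = hardwareConcurrency || 4;
  return Math.max(1, Math.min(requested, cap));
}

// In the browser: choosePoolSize(navigator.hardwareConcurrency)
```

Keeping the rule separate from the pool makes it trivial to tune per task type later, e.g. a smaller cap for memory-heavy workers like bundlers.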
Conclusion
Web Workers are essential for maintaining a responsive code editor while performing heavy computation like TypeScript compilation. The key patterns are straightforward: dedicate a worker to compilation, wrap postMessage in a promise-based client, cancel stale requests so only the latest result matters, and use transferable objects when payload sizes warrant it.
The most impactful improvement is simply moving compilation off the main thread. The editor remains responsive during compilation, and users experience smooth typing regardless of how complex their code is. Everything else, cancellation, pools, transfers, is optimization on top of that fundamental architectural decision. If you are building any tool that runs a compiler, linter, or formatter in the browser, a Web Worker is not an optimization; it is a requirement.
Related Articles
Rapid Prototyping: From Idea to Demo in Minutes
How browser-based code playgrounds enable rapid prototyping workflows that compress the journey from concept to working demo, and why this capability is a competitive advantage for engineering teams.
Code Playgrounds in Developer Education
How code playgrounds are transforming developer education by providing immediate feedback, reducing setup barriers, and enabling interactive learning experiences that scale.
Developer Tools That Boost Productivity
An exploration of how browser-based developer tools like code playgrounds reduce friction in the development workflow, accelerating everything from prototyping to debugging to knowledge sharing.