The Emitter Pipeline: Transformations, Code Generation, and Declaration Files
Prerequisites
- Articles 1-4: Full compiler pipeline through type checking
- Understanding of JavaScript module systems (CommonJS, ESM, AMD)
- Familiarity with source maps and code generation concepts
After scanning, parsing, binding, and type checking, the compiler's final job is to produce output: JavaScript files, declaration files (.d.ts), and source maps. The emitter pipeline doesn't work by walking the AST and printing strings — it works through transformations: a chain of AST-to-AST passes that progressively lower TypeScript and modern JavaScript syntax into the target output level, followed by a printer that serializes the final AST to text.
This architecture is one of the most elegant aspects of the TypeScript compiler. Each transformer handles one concern (strip types, lower class fields, convert JSX, emit CommonJS modules), and they compose cleanly because they all operate on the same AST representation.
emitFiles and the Transformer Chain
The emit pipeline starts with emitFiles() in src/compiler/emitter.ts. It receives an EmitResolver (the checker's emit-facing API), an EmitHost, an EmitTransformers object containing both script and declaration transformers, and various flags.
flowchart LR
Program["program.emit()"] --> EF["emitFiles()"]
EF --> GT["getTransformers(options)"]
GT --> Chain["Transformer Chain"]
Chain --> T1["transformTypeScript"]
T1 --> T2["transformJsx"]
T2 --> T3["transformESNext"]
T3 --> T4["transformClassFields"]
T4 --> T5["transformES20xx"]
T5 --> T6["transformModule"]
T6 --> Printer["Printer → text output"]
Printer --> JS[".js file"]
Printer --> Map[".js.map file"]
EF --> DT["Declaration Transforms"]
DT --> TD["transformDeclarations"]
TD --> DPrinter["Printer → .d.ts output"]
The getTransformers() function in src/compiler/transformer.ts assembles the ordered chain based on compiler options. The ordering is significant — each transformer expects certain syntax to still be present from the original source, and later transformers depend on earlier ones having already lowered their target syntax.
function getScriptTransformers(compilerOptions, customTransformers, emitOnly) {
  const transformers = [];
  addRange(transformers, customTransformers?.before);
  transformers.push(transformTypeScript); // Always first: strip types
  if (compilerOptions.experimentalDecorators) {
    transformers.push(transformLegacyDecorators);
  }
  if (getJSXTransformEnabled(compilerOptions)) {
    transformers.push(transformJsx);
  }
  // ESNext, ESDecorators, ClassFields...
  // Then progressive downleveling: ES2021 → ES2020 → ... → ES2015
  if (languageVersion < ScriptTarget.ES2015) {
    transformers.push(transformES2015);
    transformers.push(transformGenerators);
  }
  transformers.push(getModuleTransformer(moduleKind)); // Always last
  addRange(transformers, customTransformers?.after);
  return transformers;
}
Tip: Custom transformer plugins (from build tools like ts-patch or ttypescript) hook in via the `customTransformers.before` and `customTransformers.after` arrays. Understanding this chain order is critical for writing plugins that interact correctly with built-in transforms.
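As a concrete sketch, here is a hypothetical `before` transformer wired through `ts.transpileModule`, which accepts the same `customTransformers` shape as `program.emit()`. The `stripDebugger` name and behavior are illustrative, not part of the compiler:

```typescript
import * as ts from "typescript";

// A hypothetical `before` transformer that replaces every `debugger;`
// statement with an empty statement:
const stripDebugger: ts.TransformerFactory<ts.SourceFile> = (context) => (sourceFile) => {
  const visit: ts.Visitor = (node) => {
    if (ts.isDebuggerStatement(node)) {
      return context.factory.createEmptyStatement();
    }
    return ts.visitEachChild(node, visit, context);
  };
  return ts.visitNode(sourceFile, visit) as ts.SourceFile;
};

// transpileModule takes the same customTransformers shape as program.emit():
const out = ts.transpileModule("debugger; const x = 1;", {
  compilerOptions: { target: ts.ScriptTarget.ES2020 },
  transformers: { before: [stripDebugger] },
});
// out.outputText no longer contains a debugger statement
```

Because it runs before the built-in chain, a `before` transformer still sees TypeScript syntax; an `after` transformer only sees the already-lowered JavaScript AST.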
Key Transformers: TS Stripping, Modules, JSX, and Downleveling
TypeScript → JavaScript Transform
transformTypeScript is always the first transformer in the chain. It:
- Strips type annotations: Type references, type assertions, `as` expressions, type-only imports/exports — all removed
- Converts enums: `enum Color { Red, Green, Blue }` becomes an IIFE that builds an object with forward and reverse mappings
- Converts namespaces: `namespace Foo { ... }` becomes an IIFE that populates a `Foo` object
- Handles parameter properties: `constructor(public x: number)` generates a `this.x = x` assignment
- Removes `declare` blocks: Ambient declarations produce no output
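The enum conversion is easiest to see from the emitted shape. A hand-written approximation of the IIFE the transformer produces for `enum Color { Red, Green, Blue }` (not the transformer's literal output):

```typescript
// Hand-written approximation of the lowered enum (illustrative):
var Color: any;
(function (Color) {
    Color[Color["Red"] = 0] = "Red";     // forward: Color.Red === 0
    Color[Color["Green"] = 1] = "Green"; // reverse: Color[1] === "Green"
    Color[Color["Blue"] = 2] = "Blue";
})(Color || (Color = {}));
```

Each line assigns the forward mapping (`Color["Red"] = 0`), and because that assignment expression evaluates to the member's value, the outer index assignment builds the reverse mapping in the same statement.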
Module Transforms
The module transformer selection in getModuleTransformer() dispatches based on moduleKind:
flowchart TD
MK["moduleKind"] --> Preserve["Preserve → transformECMAScriptModule"]
MK --> Node["ESNext/ES2022/ES2020/ES2015/Node16/Node18/Node20/NodeNext/CommonJS → transformImpliedNodeFormatDependentModule"]
MK --> System["System → transformSystemModule"]
MK --> Default["AMD/UMD/Default → transformModule"]
The transformImpliedNodeFormatDependentModule is particularly interesting — it wraps both transformModule (CJS output) and transformECMAScriptModule (ESM output), selecting between them based on each file's impliedNodeFormat. This is how --module nodenext can produce CJS for .cts files and ESM for .mts files in the same compilation.
The main transformModule handles CommonJS, AMD, and UMD output — converting import/export statements into require() calls, Object.defineProperty(exports, ...) assignments, and the AMD/UMD factory wrappers.
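A hand-written sketch of the CommonJS shape this lowering produces. The `fakeRequire` and `fakeExports` stubs are illustrative stand-ins for Node's real `require` and `exports`, used here only to keep the example self-contained:

```typescript
// Source being lowered:
//   import { greet } from "./bar";
//   export const msg = greet("world");

// Illustrative stand-ins for Node's real `require` and `exports`:
const fakeExports: Record<string, unknown> = {};
const fakeRequire = (_id: string) => ({ greet: (name: string) => `hello, ${name}` });

// Simplified shape of transformModule's CommonJS output:
Object.defineProperty(fakeExports, "__esModule", { value: true });
fakeExports.msg = void 0;               // hoisted export binding
const bar_1 = fakeRequire("./bar");     // import statement → require() call
fakeExports.msg = bar_1.greet("world"); // export assignment through `exports`
```

The `__esModule` marker is how downstream interop code distinguishes transpiled ES modules from hand-written CommonJS.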
JSX Transform
transformJsx converts JSX syntax into function calls. For --jsx react, <div className="x"> becomes React.createElement("div", { className: "x" }). For --jsx react-jsx, it becomes _jsx("div", { className: "x" }) with an auto-import of the JSX runtime.
Class Fields and Decorators
transformClassFields handles the nuanced lowering of class field declarations, including the interaction between useDefineForClassFields and [[Define]] vs [[Set]] semantics. transformESDecorators implements the TC39 stage 3 decorator proposal, while transformLegacyDecorators handles the older --experimentalDecorators behavior.
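The difference between the two semantics is observable when a base class declares an accessor with the same name as a subclass field. A minimal hand-written sketch of what each lowering does for a field initializer `x = 1` (not the transformer's literal output):

```typescript
let setterCalls = 0;
class Base {
  get x() { return 42; }
  set x(v: number) { setterCalls++; }
}

// [[Set]] semantics (useDefineForClassFields: false): the field initializer
// lowers to a plain assignment, which triggers the inherited setter.
class SubSet extends Base {
  constructor() {
    super();
    (this as any).x = 1;
  }
}

// [[Define]] semantics (useDefineForClassFields: true): the field lowers to
// Object.defineProperty, creating an own data property that shadows the accessor.
class SubDefine extends Base {
  constructor() {
    super();
    Object.defineProperty(this, "x", {
      value: 1, writable: true, enumerable: true, configurable: true,
    });
  }
}

const viaSet = new SubSet();       // runs Base's setter once
const viaDefine = new SubDefine(); // never touches the setter
```

Under [[Set]], reads of `viaSet.x` still go through the inherited getter; under [[Define]], the own data property wins.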
ES Downleveling
The transformES20xx family progressively lowers modern syntax to older targets:
| Transformer | Lowers |
|---|---|
| `transformESNext` | Latest proposals not yet at a stable target |
| `transformES2021` | Logical assignment (`??=`, `\|\|=`, `&&=`) |
| `transformES2020` | Optional chaining (`?.`), nullish coalescing (`??`) |
| `transformES2019` | `Array.flat`, optional catch binding |
| `transformES2018` | Async iteration, object rest/spread |
| `transformES2017` | async/await → promise chains |
| `transformES2016` | Exponentiation operator (`**`) |
| `transformES2015` | Arrow functions, classes, destructuring, for-of, template literals |
| `transformGenerators` | Generator functions → state machines |
Each transform is included only when the target is below the corresponding ES level. Targeting ES2020, for instance, keeps transformESNext and transformES2021 in the chain but skips transformES2020 and every transform below it.
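As an example of downleveling, here is a hand-written approximation of the pattern transformES2020 emits for optional chaining and nullish coalescing (the variable names are illustrative; the real output uses temporaries like `_a`):

```typescript
interface User { name?: string }
const user: User | undefined = { name: "Ada" };

// `user?.name` lowers (roughly) to:
const maybeName = user === null || user === void 0 ? void 0 : user.name;

// `maybeName ?? "anon"` lowers (roughly) to:
const display = maybeName !== null && maybeName !== void 0 ? maybeName : "anon";
```

Note that both lowerings test `null` and `undefined` separately rather than using a loose `== null` check, matching the spec semantics of the original operators.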
Declaration File Generation
Declaration files (.d.ts) are generated by a separate transform chain. transformDeclarations walks the AST and:
- Strips implementation bodies: Function bodies become `;`, class method bodies are removed
- Preserves type signatures: Parameter types, return types, property types, generic constraints
- Resolves inferred types: When a function lacks a return type annotation, the transformer calls back into the checker to compute the inferred type, then uses `expressionToTypeNode` to materialize it as AST syntax
- Filters by visibility: Only exported declarations appear in the output
- Handles re-exports: `export { Foo } from './bar'` must trace to the original declaration
flowchart TD
SF["SourceFile AST"] --> DT["transformDeclarations"]
DT --> Strip["Strip function bodies"]
DT --> Visibility["Filter: only exported"]
DT --> Infer["Resolve inferred types"]
Infer --> E2TN["expressionToTypeNode()"]
E2TN --> Checker["Checker: getTypeOfSymbol()"]
Checker --> TypeNode["TypeNode AST"]
DT --> DTS[".d.ts SourceFile"]
DTS --> Printer["Printer → .d.ts text"]
The expressionToTypeNode bridge (introduced in Part 4) is essential here. Consider:
export function getConfig() {
return { debug: true, level: 3 };
}
The source has no return type annotation, but the .d.ts must include one: export function getConfig(): { debug: boolean; level: number; }. The declaration transformer asks the checker for the inferred return type, then converts that internal Type object back into an AST type node.
Tip: If you're investigating declaration emit issues (especially with `isolatedDeclarations`), the path is: `transformDeclarations` → checker callback → `expressionToTypeNode`. The `isolatedDeclarations` feature, introduced in TypeScript 5.5, restricts this pattern by requiring explicit annotations so that declaration emit doesn't need the checker at all.
Emit Helpers and Source Maps
Runtime Helpers
When a transformer lowers syntax, it sometimes needs runtime helper functions. For example, lowering async/await requires __awaiter, lowering class inheritance requires __extends, and lowering rest parameters requires __rest.
The src/compiler/factory/emitHelpers.ts module defines these helpers. Each helper is an EmitHelper object with a name, text (the helper function body), and flags indicating whether it's scoped or unscoped.
Two strategies for including helpers:
- Inline: The helper function is emitted into each output file that needs it. This is the default behavior.
- External (`--importHelpers`): Output files import helpers from the `tslib` npm package instead. This avoids duplicating helper code across many output files.
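To illustrate what such a helper does, here is a simplified, hand-written version of the `__extends` pattern (illustrative only; the real helper text in emitHelpers.ts also copies static members and validates its inputs):

```typescript
// Simplified sketch of the __extends helper pattern (not the real helper text):
function extendsSketch(d: any, b: any) {
  Object.setPrototypeOf(d, b);              // inherit statics
  d.prototype = Object.create(b.prototype); // inherit instance members
  d.prototype.constructor = d;
}

// ES5-style constructor functions, as transformES2015 would emit them:
function Animal(this: any, name: string) { this.name = name; }
Animal.prototype.speak = function (this: any) { return this.name + " makes a sound"; };

function Dog(this: any, name: string) { Animal.call(this, name); }
extendsSketch(Dog, Animal);

const rex = new (Dog as any)("Rex");
```

With `--importHelpers`, instead of this function text appearing in every output file, the emitter prints `const tslib_1 = require("tslib")` (or the ESM equivalent) and references the shared copy.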
The transformer requests helpers via context.requestEmitHelper(), and the emitter collects them during serialization.
Source Map Generation
The src/compiler/sourcemap.ts module implements source map generation — tracking the position mapping between original TypeScript source positions and emitted JavaScript positions. As the printer walks the transformed AST and writes output text, it records source map entries for each node that has an original position.
The source map generator produces standard V3 source maps with Base64 VLQ-encoded mappings. Source maps can be emitted as separate .js.map files (--sourceMap) or inlined into the JavaScript output as data URIs (--inlineSourceMap). For declaration files, --declarationMap produces .d.ts.map files that map back to the original .ts source — essential for "Go to Definition" navigating from a .d.ts into the real source.
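A minimal decoder sketch for the Base64 VLQ format those mappings use (illustrative; the compiler's real implementation lives in src/compiler/sourcemap.ts):

```typescript
// Minimal Base64 VLQ decoder sketch. Each base64 digit carries 5 data bits
// plus a continuation bit (bit 5); the first digit's lowest data bit is the sign.
const B64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

function decodeVLQ(segment: string): number[] {
  const out: number[] = [];
  let value = 0;
  let shift = 0;
  for (const ch of segment) {
    const digit = B64.indexOf(ch);
    value += (digit & 31) << shift;
    if (digit & 32) {
      shift += 5; // continuation bit set: more digits follow
    } else {
      out.push(value & 1 ? -(value >> 1) : value >> 1);
      value = 0;
      shift = 0;
    }
  }
  return out;
}

decodeVLQ("AAAA"); // → [0, 0, 0, 0]
decodeVLQ("IAAM"); // → [4, 0, 0, 6]
```

A four-field segment like `[4, 0, 0, 6]` encodes deltas for generated column, source file index, source line, and source column.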
flowchart LR
Original["Original AST<br/>(with positions)"] --> Transforms["Transform Chain"]
Transforms --> Transformed["Transformed AST<br/>(with original pointers)"]
Transformed --> Printer["Printer"]
Printer --> JS["JavaScript Text"]
Printer --> SMG["SourceMapGenerator"]
SMG --> SM[".js.map<br/>(VLQ mappings)"]
The key insight: each transformed node carries an original pointer back to the source node it was derived from (set by the NodeFactory.update* methods). The printer uses these pointers to record the correct source positions in the map, even though the AST structure may have changed dramatically through transformations.
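A small sketch of that pointer in action through the public API: `ts.setOriginalNode` attaches the same pointer that the `NodeFactory.update*` methods set internally (the rename transformer here is illustrative, not a built-in transform):

```typescript
import * as ts from "typescript";

const sf = ts.createSourceFile("a.ts", "const x = 1;", ts.ScriptTarget.ES2020);

// An illustrative transformer that renames `x` to `y` while attaching an
// `original` pointer back to the source identifier:
const rename: ts.TransformerFactory<ts.SourceFile> = (context) => (file) => {
  const visit: ts.Visitor = (node) => {
    if (ts.isIdentifier(node) && node.text === "x") {
      return ts.setOriginalNode(context.factory.createIdentifier("y"), node);
    }
    return ts.visitEachChild(node, visit, context);
  };
  return ts.visitNode(file, visit) as ts.SourceFile;
};

const result = ts.transform(sf, [rename]);
const printed = ts.createPrinter().printFile(result.transformed[0]);
// The printed text uses the new name, but the synthesized identifier still
// carries an original pointer to `x`, which source map emit would follow.
```

`ts.getOriginalNode` walks that pointer chain back to the source node, which is exactly what the printer does when recording map entries for synthesized nodes.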
What's Next
We've now traced the complete compilation pipeline — from source text through scanning, parsing, binding, type checking, transformation, and emission. But the TypeScript compiler isn't just a batch tool; it also powers every IDE feature you use daily. In Part 6, we'll explore the Language Service and tsserver — how completions, go-to-definition, find-references, refactorings, and code fixes are built on top of the compiler core, and how the server process manages multiple projects in a workspace.