The Node System and Three Shading Language: How Shaders Are Built from Graphs

Advanced

Prerequisites

  • Articles 1-3
  • Understanding of DAG/graph data structures
  • Basic shader programming concepts (vertex/fragment stages, uniforms, varyings)

The node system is the most ambitious piece of engineering in modern Three.js. Instead of maintaining separate GLSL shader templates for every material/light/effect combination (the WebGLRenderer approach), the new renderer constructs shaders as directed acyclic graphs (DAGs) of Node objects. These graphs are traversed by NodeBuilder through a three-phase pipeline to produce either WGSL (for WebGPU) or GLSL (for the WebGL fallback). On top of this, TSL (Three Shading Language) provides a fluent JavaScript API that makes writing custom shader logic feel like writing math, not managing graph data structures.

The Node Base Class and Type System

Node extends EventDispatcher and serves as the base class for all shader graph nodes. Its constructor accepts a nodeType string that represents the output type of the node — values like 'float', 'vec2', 'vec3', 'vec4', 'mat4', and 'bool' drawn from the constants in src/nodes/core/constants.js:

export const NodeType = {
    BOOLEAN: 'bool',
    INTEGER: 'int',
    FLOAT: 'float',
    VECTOR2: 'vec2',
    VECTOR3: 'vec3',
    VECTOR4: 'vec4',
    MATRIX2: 'mat2',
    MATRIX3: 'mat3',
    MATRIX4: 'mat4'
};

Three lifecycle properties control when a node's update() method fires, defined in NodeUpdateType: NONE (never), FRAME (once per frame), RENDER (once per render call — a frame may have multiple renders for shadows/reflections), and OBJECT (once per object that uses this node).
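The dispatch implied by these four values can be sketched in plain JavaScript. This is a simplified assumption about the scheduler, not the actual three.js code; only the `NodeUpdateType` values mirror `src/nodes/core/constants.js`:

```javascript
// Hypothetical dispatcher illustrating NodeUpdateType semantics.
const NodeUpdateType = { NONE: 'none', FRAME: 'frame', RENDER: 'render', OBJECT: 'object' };

class FakeNode {

	constructor( updateType, label, log ) {

		this.updateType = updateType;
		this.label = label;
		this.log = log;

	}

	update() { this.log.push( this.label ); }

}

// One frame may contain several render calls (shadow maps, reflections),
// and each render call touches several objects.
function runFrame( nodes, renderCount, objectCount ) {

	for ( const n of nodes ) if ( n.updateType === NodeUpdateType.FRAME ) n.update();

	for ( let r = 0; r < renderCount; r ++ ) {

		for ( const n of nodes ) if ( n.updateType === NodeUpdateType.RENDER ) n.update();

		for ( let o = 0; o < objectCount; o ++ ) {

			for ( const n of nodes ) if ( n.updateType === NodeUpdateType.OBJECT ) n.update();

		}

	}

}

const log = [];
const nodes = [
	new FakeNode( NodeUpdateType.FRAME, 'time', log ),
	new FakeNode( NodeUpdateType.RENDER, 'cameraPos', log ),
	new FakeNode( NodeUpdateType.OBJECT, 'modelMatrix', log ),
];

runFrame( nodes, 2, 2 ); // 2 render passes, 2 objects each
console.log( log.join( ',' ) );
// time fires once, cameraPos twice, modelMatrix four times
```

With two render passes over two objects, a FRAME node updates once, a RENDER node twice, and an OBJECT node four times — which is exactly why per-object uniforms like the model matrix use the OBJECT update type.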

The build process is organized into three phases, declared at line 66:

export const defaultBuildStages = [ 'setup', 'analyze', 'generate' ];

The class diagram below summarizes the Node base class and a few representative subclasses:

classDiagram
    class Node {
        +nodeType: string
        +updateType: string
        +version: number
        +global: boolean
        +setup(builder): Node
        +analyze(builder): void
        +generate(builder): string
        +build(builder): string
    }

    class ConstNode {
        +value: any
        +generate(): string
    }

    class OperatorNode {
        +op: string
        +aNode: Node
        +bNode: Node
    }

    class MathNode {
        +method: string
        +generate(): string
    }

    class MaterialNode {
        +scope: string
    }

    Node <|-- ConstNode
    Node <|-- OperatorNode
    Node <|-- MathNode
    Node <|-- MaterialNode

A critical detail: the _parentBuildStage map at line 10 defines stage ordering: analyze's parent is setup, and generate's parent is analyze. This ensures child nodes are processed in the correct stage before their parents reference them.
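The ordering guarantee can be sketched as a parent-chain walk. The map contents below mirror the article; the helper function is an illustrative assumption, not the real builder logic:

```javascript
// Each stage names the stage that must have run before it.
const parentBuildStage = { analyze: 'setup', generate: 'analyze' };

// Returns the chain of stages that must run, in order, to reach `stage` —
// so a node first encountered during 'generate' can be pushed through
// 'setup' and 'analyze' before its parent references it.
function stageChain( stage ) {

	const chain = [];

	for ( let s = stage; s !== undefined; s = parentBuildStage[ s ] ) chain.unshift( s );

	return chain;

}

console.log( stageChain( 'generate' ) ); // [ 'setup', 'analyze', 'generate' ]
```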

Node Categories: Accessors, Math, Lighting, Display

The src/nodes/ directory organizes nodes by function:

Category | Directory | Purpose | Examples
Core | core/ | Base classes and infrastructure | Node, NodeBuilder, ConstNode, StackNode
Accessors | accessors/ | Read scene/object data | MaterialNode, CameraNode, PositionNode, NormalNode
Math | math/ | Mathematical operations | MathNode, OperatorNode, CondNode
Lighting | lighting/ | Light calculations | LightsNode, AnalyticLightNode, ShadowNode
Display | display/ | Output processing | ToneMappingNode, ColorSpaceNode, ViewportDepthNode
Code | code/ | Raw shader code injection | FunctionNode, ExpressionNode
TSL | tsl/ | Three Shading Language runtime | TSLCore, TSLBase
Functions | functions/ | Reusable shader functions | BRDF_GGX, getAlphaHashThreshold

Accessor nodes pull data from the scene graph into the shader. MaterialNode reads material properties (color, roughness, metalness), CameraNode reads camera uniforms (position, projection matrix), and PositionNode provides vertex positions in various coordinate spaces (local, world, view).

Math nodes perform operations. OperatorNode handles binary operators (+, -, *, /), MathNode handles functions (sin, cos, normalize, mix, smoothstep), and CondNode implements ternary selection.

graph TD
    subgraph "Accessor Nodes"
        MC["materialColor"] --> MulOp
        T["texture(map)"] --> MulOp
    end

    subgraph "Math Nodes"
        MulOp["OperatorNode(*)"] --> AddOp["OperatorNode(+)"]
        E["materialEmissive"] --> AddOp
    end

    subgraph "Display Nodes"
        AddOp --> TM["ToneMappingNode"]
        TM --> CS["ColorSpaceNode"]
        CS --> Output["output"]
    end

TSL: Runtime Metaprogramming for Fluent Shaders

TSL is the developer-facing API that makes node graph construction feel like writing shader math in JavaScript. The magic happens in addMethodChaining() inside TSLCore.js:

export function addMethodChaining( name, nodeElement ) {
    if ( NodeElements.has( name ) ) {
        warn( `TSL: Redefinition of method chaining '${ name }'.` );
        return;
    }

    NodeElements.set( name, nodeElement );

    if ( name !== 'assign' ) {
        Node.prototype[ name ] = function ( ...params ) {
            return this.isStackNode 
                ? this.addToStack( nodeElement( ...params ) ) 
                : nodeElement( this, ...params );
        };

        Node.prototype[ name + 'Assign' ] = function ( ...params ) {
            return this.isStackNode 
                ? this.assign( params[ 0 ], nodeElement( ...params ) ) 
                : this.assign( nodeElement( this, ...params ) );
        };
    }
}

This function does two things simultaneously: it registers the node factory function in a NodeElements Map, and it patches Node.prototype to add the method as a chainable call. When you write positionLocal.mul(2.0), what actually happens is:

  1. positionLocal is a Node (specifically a PositionNode)
  2. .mul was added to Node.prototype by addMethodChaining('mul', mulFunction)
  3. The prototype method calls mulFunction(this, 2.0), which creates a new OperatorNode('*', positionLocal, float(2.0)) and returns it

This is pure runtime metaprogramming — there's no compile step, no babel plugin. Every TSL function call builds a node and returns it, so expressions compose naturally:

// This JavaScript expression:
positionLocal.mul( 2.0 ).add( offset ).normalize()

// Produces this node DAG:
// MathNode('normalize',
//   OperatorNode('+',
//     OperatorNode('*', positionLocal, float(2.0)),
//     offset
//   )
// )

Tip: TSL expressions are lazy — they build a graph, they don't execute shader code. The graph only becomes shader code when NodeBuilder traverses it during material compilation. This means you can store and reuse TSL expressions as variables.
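The prototype-patching trick is easy to reproduce in isolation. The following is a toy re-implementation of the addMethodChaining() idea — class and map names are simplified assumptions, not the real TSLCore.js code:

```javascript
// Minimal stand-ins for Node and OperatorNode.
class TinyNode {}

class OpNode extends TinyNode {

	constructor( op, a, b ) {

		super();
		this.op = op;
		this.a = a;
		this.b = b;

	}

}

const elements = new Map();

function addChain( name, factory ) {

	if ( elements.has( name ) ) return; // refuse redefinition, like the real version

	elements.set( name, factory );

	// Patch the prototype so every node instance gains the chainable method.
	TinyNode.prototype[ name ] = function ( ...params ) {

		return factory( this, ...params );

	};

}

addChain( 'mul', ( a, b ) => new OpNode( '*', a, b ) );
addChain( 'add', ( a, b ) => new OpNode( '+', a, b ) );

const pos = new TinyNode();
const expr = pos.mul( 2 ).add( 5 ); // builds a graph; executes no shader code

console.log( expr.op, expr.a.op ); // + *
```

The chained calls return plain objects, so `expr` is a two-level DAG: an add node whose left child is a multiply node whose left child is `pos` — the same shape the real TSL produces for `positionLocal.mul( 2.0 ).add( offset )`.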

NodeBuilder: The DAG-to-Shader Compiler

NodeBuilder is the base class for the compiler that transforms node graphs into shader source code. It's constructed with the 3D object, renderer, and a parser, and provides the infrastructure for variable declaration, uniform binding, varying management, and code emission.

The three build phases work as follows:

  1. Setup: Nodes return their child nodes, establishing the DAG structure. A node might transform itself — for example, MaterialNode in setup might return a TextureNode if the material has a color map.

  2. Analyze: The builder walks the DAG to determine which shader stage each node belongs to (vertex or fragment), detect shared nodes that need to become varyings, and collect uniform declarations.

  3. Generate: Each node emits its shader code string. OperatorNode generates (a * b), MathNode generates normalize(x), accessor nodes generate uniform reads or attribute lookups.
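The generate phase is, at its core, a recursive string emission over the DAG. The sketch below is an illustrative assumption (the real NodeBuilders handle uniforms, varyings, and caching); the one genuine detail is that GLSL spells the reciprocal square root `inversesqrt` while WGSL spells it `inverseSqrt`, which is the kind of difference the backend-specific builders absorb:

```javascript
// Toy node classes: each emits a code string by generating its children first.
class Const {

	constructor( v ) { this.v = v; }
	generate() { return this.v.toFixed( 1 ); }

}

class Op {

	constructor( op, a, b ) { this.op = op; this.a = a; this.b = b; }
	generate( builder ) { return `( ${ this.a.generate( builder ) } ${ this.op } ${ this.b.generate( builder ) } )`; }

}

class Call {

	constructor( fn, arg ) { this.fn = fn; this.arg = arg; }
	generate( builder ) { return `${ builder.fnName( this.fn ) }( ${ this.arg.generate( builder ) } )`; }

}

// One backend-agnostic graph...
const graph = new Call( 'inversesqrt', new Op( '+', new Const( 2 ), new Const( 3 ) ) );

// ...two backend-specific "builders" that only differ in function spelling.
const glsl = { fnName: ( n ) => n };
const wgsl = { fnName: ( n ) => ( { inversesqrt: 'inverseSqrt' } )[ n ] ?? n };

console.log( graph.generate( glsl ) ); // inversesqrt( ( 2.0 + 3.0 ) )
console.log( graph.generate( wgsl ) ); // inverseSqrt( ( 2.0 + 3.0 ) )
```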

flowchart LR
    subgraph "Phase 1: Setup"
        S1["Material slots → Node DAG"]
    end

    subgraph "Phase 2: Analyze"
        S2["Stage assignment<br/>Varying detection<br/>Uniform collection"]
    end

    subgraph "Phase 3: Generate"
        S3["Node → shader code string<br/>Variable declarations<br/>Code assembly"]
    end

    S1 --> S2 --> S3

    S3 --> WGSL["WGSLNodeBuilder<br/>→ WGSL code"]
    S3 --> GLSL["GLSLNodeBuilder<br/>→ GLSL code"]

NodeBuilder is subclassed into two concrete implementations:

  • WGSLNodeBuilder (~2,523 lines) generates WGSL for WebGPU
  • GLSLNodeBuilder (~1,676 lines) generates GLSL for the WebGL fallback

Both receive the same node graph from the same material — the DAG is backend-agnostic, only the code generation is backend-specific.

NodeMaterial: Where Nodes Meet Materials

NodeMaterial extends the base Material class (covered in Part 2) and adds the slot system — a set of nullable node properties that control every stage of the material pipeline:

Slot | Purpose | Default
colorNode | Diffuse color | Material color property
normalNode | Surface normal perturbation | Geometry normals
opacityNode | Alpha value | Material opacity
emissiveNode | Self-illumination | Material emissive color
roughnessNode | PBR roughness | Material roughness
metalnessNode | PBR metalness | Material metalness
outputNode | Final fragment output | Computed lighting result
positionNode | Vertex position override | Geometry position

Each slot accepts any node that resolves to the expected type. Setting material.colorNode = texture(myTexture).mul(color(0xff0000)) replaces the default color logic with a TSL expression that samples a texture and tints it red.

Concrete subclasses like MeshStandardNodeMaterial wire up PBR-specific slot defaults and add additional slots for roughness maps, metalness maps, normal maps, and environment map sampling.

flowchart TB
    subgraph "NodeMaterial Slots"
        PN["positionNode"] --> VS["Vertex Shader"]
        CN["colorNode"] --> FS["Fragment Shader"]
        NN["normalNode"] --> FS
        ON["opacityNode"] --> FS
        EN["emissiveNode"] --> FS
        RN["roughnessNode"] --> FS
        MN["metalnessNode"] --> FS
    end

    VS --> Rasterizer["Rasterization"]
    Rasterizer --> FS
    FS --> OutputN["outputNode"] --> Final["Final Color"]

StandardNodeLibrary: The Rosetta Stone

The StandardNodeLibrary is what enables backward compatibility between scenes written for WebGLRenderer and the new WebGPURenderer. It maps every classic material and light type to a node-based equivalent:

this.addMaterial( MeshStandardNodeMaterial, 'MeshStandardMaterial' );
this.addMaterial( MeshPhysicalNodeMaterial, 'MeshPhysicalMaterial' );
this.addMaterial( MeshBasicNodeMaterial, 'MeshBasicMaterial' );
// ... 10 more material mappings

this.addLight( PointLightNode, PointLight );
this.addLight( DirectionalLightNode, DirectionalLight );
this.addLight( SpotLightNode, SpotLight );
// ... 6 more light mappings

this.addToneMapping( acesFilmicToneMapping, ACESFilmicToneMapping );
// ... 5 more tone mapping mappings

When WebGPURenderer encounters a MeshStandardMaterial in the scene (a class that knows nothing about nodes), the NodeManager consults the StandardNodeLibrary to find the corresponding MeshStandardNodeMaterial and transparently creates a node-based equivalent. The user's code doesn't change at all — the library handles the translation.
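The lookup itself amounts to a map from classic material type to node-material constructor. The sketch below assumes a simplified library shape (the real class does more, including light and tone-mapping lookups); the `addMaterial` signature mirrors the calls quoted above:

```javascript
// Hypothetical classes standing in for the real three.js material classes.
class MeshStandardMaterial {}

class MeshStandardNodeMaterial {}

class NodeLibrary {

	constructor() { this.materials = new Map(); }

	addMaterial( nodeClass, classicName ) { this.materials.set( classicName, nodeClass ); }

	// Given a classic material, return its node-based equivalent (or the
	// original, if no mapping is registered).
	fromMaterial( material ) {

		const NodeClass = this.materials.get( material.constructor.name );
		return NodeClass ? new NodeClass() : material;

	}

}

const library = new NodeLibrary();
library.addMaterial( MeshStandardNodeMaterial, 'MeshStandardMaterial' );

const converted = library.fromMaterial( new MeshStandardMaterial() );
console.log( converted instanceof MeshStandardNodeMaterial ); // true
```

Registering by class name (a string) rather than by class reference keeps the library decoupled from the classic material modules — the classic classes never need to import anything node-related.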

Tip: If you're migrating from WebGLRenderer to WebGPURenderer, you don't need to rewrite your materials. The StandardNodeLibrary handles the conversion automatically. But if you want to unlock the full power of the node system (custom shader effects, compute shaders, procedural geometry), use NodeMaterial and TSL directly.

Example Walkthrough: From TSL Expression to Shader Code

Let's trace how a simple TSL expression becomes shader code. Consider:

material.colorNode = materialColor.mul( texture( map ) );

Step 1: Graph construction. materialColor is a MaterialNode that reads the material's color property. texture(map) creates a TextureNode. .mul() creates an OperatorNode('*', materialColorNode, textureNode).

Step 2: Setup phase. NodeBuilder calls setup() on the root node. The OperatorNode returns itself, and recursively sets up its children. The TextureNode might generate a UV accessor node as a dependency.

Step 3: Analyze phase. The builder determines that the texture sampling must happen in the fragment stage. If any input comes from the vertex stage, it marks those connections as varyings.

Step 4: Generate phase. Each node emits code. For WGSL, it might produce:

let nodeColor = nodeUniform.color * textureSample(nodeTexture, nodeSampler, nodeVarying_uv);

For GLSL, the same graph produces:

vec4 nodeColor = nodeUniform_color * texture(nodeTexture, nodeVarying_uv);

The power of this architecture is that the same semantic graph — "multiply material color by texture sample" — produces correct code for any target language, with all the plumbing (uniforms, samplers, varyings, coordinate transforms) handled automatically.

What's Next

With the renderer and node system covered, we'll shift focus to the supporting infrastructure: the math library that powers all transforms and shading calculations, the camera and light hierarchies, color management, and how lights integrate with the node system for shader-based lighting calculations.