
Apply ControlNet

💡

This documentation covers the current Apply ControlNet node (formerly Apply ControlNet (Advanced)). The earliest Apply ControlNet node has been renamed Apply ControlNet (Old). For compatibility, you may still see Apply ControlNet (Old) in workflow files downloaded from comfyui.org, but it can no longer be found through search or the node list. Please use the Apply ControlNet node instead.


This node applies a ControlNet model (such as Depth, OpenPose, Canny, or HED) to the given conditioning, guiding image generation based on the reference image and the specified strength.

Documentation

  • Class name: ControlNetApply
  • Category: conditioning
  • Output node: False

Using ControlNet requires preprocessing the input image. ComfyUI's built-in nodes do not include preprocessors or ControlNet models, so please first install the ControlNet preprocessors and download the corresponding ControlNet models.

Input Types

| Parameter | Data Type | Function |
| --- | --- | --- |
| positive | CONDITIONING | Positive conditioning data, from CLIP Text Encoder or other conditioning inputs |
| negative | CONDITIONING | Negative conditioning data, from CLIP Text Encoder or other conditioning inputs |
| control_net | CONTROL_NET | The ControlNet model to apply, typically loaded by a ControlNet Loader node |
| image | IMAGE | Image used for ControlNet guidance; must first be processed by the matching preprocessor |
| vae | VAE | VAE model input |
| strength | FLOAT | Controls the strength of the ControlNet's influence, range 0~10. Values between 0.5 and 1.5 are usually reasonable: lower values give the model more freedom, higher values impose stricter constraints, and values that are too high may produce strange images. Test and adjust this value to fine-tune the ControlNet's influence. |
| start_percent | FLOAT | Value 0.000~1.000; determines, as a percentage of the diffusion process, when ControlNet guidance starts. For example, 0.2 means guidance begins influencing image generation at 20% of the process. |
| end_percent | FLOAT | Value 0.000~1.000; determines, as a percentage of the diffusion process, when ControlNet guidance stops. For example, 0.8 means guidance stops influencing image generation at 80% of the process. |
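The arithmetic behind start_percent and end_percent can be illustrated with a short sketch. The helper below is hypothetical (it is not part of ComfyUI's API); it only shows how a start/end percentage maps onto concrete sampling steps.

```python
def controlnet_active(step: int, total_steps: int,
                      start_percent: float, end_percent: float) -> bool:
    """Return True if ControlNet guidance applies at this sampling step.

    `progress` is the fraction of the diffusion process completed so far;
    guidance is active while start_percent <= progress <= end_percent.
    """
    progress = step / total_steps
    return start_percent <= progress <= end_percent

# With start_percent=0.2 and end_percent=0.8 over 50 steps,
# guidance applies roughly from step 10 through step 40.
```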

Output Types

| Parameter | Data Type | Function |
| --- | --- | --- |
| positive | CONDITIONING | Positive conditioning data processed by the ControlNet; can feed the next Apply ControlNet node or a KSampler node |
| negative | CONDITIONING | Negative conditioning data processed by the ControlNet; can feed the next Apply ControlNet node or a KSampler node |
💡

If you want to use T2I-Adapter style models, please use the Apply Style Model node instead.

ComfyUI ControlNet Usage Examples

Visit the following pages for examples:

ControlNet Stage Control Settings

In the node settings, you can see two parameters start_percent and end_percent. These parameters can be used to control the application stage of ControlNet during the generation process. When using ControlNet:

  • You can first set start_percent and end_percent to the default values of 0.000 and 1.000, then adjust these values as needed to see the application effect

Below is a diagram explaining the stage control:

1. Parameter Configuration Reference for Different ControlNet Types

| Type | Recommended Weight | Stage Range | Key Preprocessing Parameters | Best Use Cases | Special Techniques |
| --- | --- | --- | --- | --- | --- |
| Canny | 0.8-1.2 | 0.0-0.4 | Threshold 100/200, sharpen 15% | Architecture / product design | Enable Invert for transparent materials; process complex structures in segments |
| HED | 0.6-0.9 | 0.2-0.7 | Gaussian blur σ=1.5, smooth 20% | Portrait / fashion design | Anime mode for cartoon styles, Realism mode for authentic detail |
| MLSD | 0.7-1.0 | 0.3-0.8 | Min line length 15px, angle tolerance 15° | Engineering drawings | Increase weight by 0.2 for tilted walls; decrease by 0.3 for glass curtain walls |
| Depth | 0.7-1.0 | 0.2-0.9 | MiDaS large model, 3D mapping | VR / medical visualization | Near-view enhancement mode for subject details; ZoeDepth for macro scenes |
| Normal | 0.5-0.8 | 0.4-1.0 | Resolution 2048px, AO 0.3 | Product rendering | Enable Specular for metal materials; multi-light synthesis for a stronger 3D feel |
| Scribble | 0.4-0.7 | 0.5-1.0 | SoftEdge blur 3px, hue tolerance 15% | Concept design | 50% opacity masks for gradients; Pantone library for brand consistency |
| Lineart | 0.6-0.9 | 0.3-1.0 | Anti-aliasing on, line width ±2px | Character art | Anime mode for simplified lines, Realism mode for complex folds |
| OpenPose | 0.9-1.1 | 0.0-0.3 | 25-point skeleton, hand detail enhancement | Motion capture | Motion-blur compensation to prevent ghosting; increase weight to 1.2 for martial arts |
| Segmentation | 0.8-1.0 | 0.0-0.7 | ADEPT 2.0, mask feather 10px | Ad composition | Decrease weight to 0.2 for sky regions; sharpen building edges by 20% |
| Tile | 0.3-0.6 | 0.4-0.9 | 256×256 blocks, 30% repeat rate | Texture generation | Randomize variations for a natural feel; enable seamless tiling for brick walls |
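For quick experimentation, the recommendations above can be captured as a small lookup. This is an illustrative sketch only (the preset table and helper names are ours, not a ComfyUI API); it returns a midpoint strength plus the stage range for a given control type.

```python
# Recommended weight and stage ranges, transcribed from the table above.
CONTROLNET_PRESETS = {
    "canny":        {"weight": (0.8, 1.2), "stage": (0.0, 0.4)},
    "hed":          {"weight": (0.6, 0.9), "stage": (0.2, 0.7)},
    "mlsd":         {"weight": (0.7, 1.0), "stage": (0.3, 0.8)},
    "depth":        {"weight": (0.7, 1.0), "stage": (0.2, 0.9)},
    "normal":       {"weight": (0.5, 0.8), "stage": (0.4, 1.0)},
    "scribble":     {"weight": (0.4, 0.7), "stage": (0.5, 1.0)},
    "lineart":      {"weight": (0.6, 0.9), "stage": (0.3, 1.0)},
    "openpose":     {"weight": (0.9, 1.1), "stage": (0.0, 0.3)},
    "segmentation": {"weight": (0.8, 1.0), "stage": (0.0, 0.7)},
    "tile":         {"weight": (0.3, 0.6), "stage": (0.4, 0.9)},
}

def preset(control_type: str) -> dict:
    """Return a starting configuration: the midpoint of the recommended
    weight range plus the recommended start/end percentages."""
    p = CONTROLNET_PRESETS[control_type.lower()]
    lo, hi = p["weight"]
    return {"strength": round((lo + hi) / 2, 2),
            "start_percent": p["stage"][0],
            "end_percent": p["stage"][1]}
```

For example, `preset("Canny")` yields a strength of 1.0 applied over the 0.0-0.4 stage, which you can then tune per the table's special techniques.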

2. Classic Scene Configuration Templates

2.1 Architectural Visualization Design

| Control Type | Weight | Stage Range | Preprocessing Parameters | Adjustment Tips |
| --- | --- | --- | --- | --- |
| Canny | 1.0 | 0.0-0.4 | Threshold 100/200 | Enable Invert for glass walls |
| Depth | 0.8 | 0.2-0.7 | MiDaS large model | Enhance mid-ground by 20% |
| MLSD | 0.6 | 0.5-0.9 | Min line length 20px | Increase weight to 0.8 for tilted walls |

2.2 Game Character Design

| Control Type | Weight | Stage Range | Preprocessing Parameters | Dynamic Adjustment |
| --- | --- | --- | --- | --- |
| OpenPose | 1.0 | 0.0-0.3 | Full skeleton | Reduce to 0.7 after step 20 |
| Lineart | 0.7 | 0.4-1.0 | Anime mode | +0.1 weight for equipment areas |
| Scribble | 0.5 | 0.5-1.0 | SoftEdge blur 2px | Set color-block boundary strength to 0.3 |

2.3 Product Concept Design

| Control Type | Weight | Stage Range | Preprocessing Parameters | Material Optimization |
| --- | --- | --- | --- | --- |
| HED | 0.9 | 0.0-0.3 | Gaussian blur σ=1.5 | Enable Specular for metal surfaces |
| Normal | 0.7 | 0.2-0.6 | Resolution 2048×2048 | Reduce to 0.5 for plastic materials |
| Depth | 0.6 | 0.5-0.9 | Near-view enhancement | Background blur strength 1.2 |

2.4 Medical Visualization

| Control Type | Weight | Stage Range | Preprocessing Parameters | Precision Control |
| --- | --- | --- | --- | --- |
| Scribble | 0.8 | 0.0-0.5 | Red annotation lines | Organ boundary tolerance ±2px |
| Depth | 0.7 | 0.4-0.8 | CT scan mode | Layer spacing 0.1mm |
| Lineart | 0.9 | 0.7-1.0 | Ultra detail | Blood-vessel path precision 1px |

2.5 Film Scene Composition

| Control Type | Weight | Stage Range | Preprocessing Parameters | Atmosphere Creation |
| --- | --- | --- | --- | --- |
| Seg | 0.9 | 0.0-0.6 | ADEPT model | Reduce sky-region weight to 0.2 |
| Shuffle | 0.6 | 0.3-0.8 | Color temperature 5500K | Neon-light area weight 0.8 |
| Depth | 0.7 | 0.5-1.0 | Dynamic range compression | Foreground sharpening 1.5 |

2.6 E-commerce Advertisement Design

| Control Type | Weight | Stage Range | Preprocessing Parameters | Commercial Optimization |
| --- | --- | --- | --- | --- |
| Canny | 1.2 | 0.0-0.4 | Edge sharpen +15% | Enhanced reflection mode |
| Scribble | 0.7 | 0.3-0.7 | Pantone library | Brand color tolerance ±5% |
| Inpaint | 0.5 | 0.6-1.0 | Feather radius 15px | Text-area protection mask |

3. Expert-Level Adjustment Strategies

3.1 Stage Weight Decay Model

| Generation Progress | Control Type | Decay Curve | Formula Example |
| --- | --- | --- | --- |
| 0-30% | Structure control | Constant strength | `strength = 1.0` |
| 30-70% | Spatial control | Linear decay | `strength = 1.0 - (step - 30) / 40 * 0.5` |
| 70-100% | Detail control | Reverse enhancement | `strength = 0.5 + (step - 70) / 30 * 0.5` |
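The three formulas can be combined into one piecewise schedule. A minimal sketch, assuming `step` is measured as a percentage (0-100) of generation progress:

```python
def staged_strength(step: float) -> float:
    """Piecewise ControlNet strength schedule from the decay-model table."""
    if step <= 30:                           # structure control: constant strength
        return 1.0
    if step <= 70:                           # spatial control: linear decay 1.0 -> 0.5
        return 1.0 - (step - 30) / 40 * 0.5
    return 0.5 + (step - 70) / 30 * 0.5      # detail control: ramps back 0.5 -> 1.0

# staged_strength(50) -> 0.75; staged_strength(70) -> 0.5; staged_strength(100) -> 1.0
```

Note the schedule is continuous at the 30% and 70% boundaries, so the control strength never jumps abruptly mid-generation.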

3.2 Multi-ControlNet Conflict Resolution

| Conflict Type | Visual Manifestation | Resolution Strategy |
| --- | --- | --- |
| Structure vs. space | Floating objects / perspective errors | Set stage interval ≥ 0.15 |
| Space vs. detail | Material distortion / reflection anomalies | Add area masks to isolate control ranges |
| Structure vs. detail | Loss of key features | Increase structure control strength by 20% |
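The "stage interval ≥ 0.15" rule can be checked mechanically when composing multiple ControlNets. A small sketch (the helper name is ours, not a ComfyUI function):

```python
def stage_gap_ok(range_a: tuple, range_b: tuple, min_gap: float = 0.15) -> bool:
    """Check that two ControlNets' (start_percent, end_percent) ranges are
    separated by at least `min_gap`, the spacing suggested above for
    structure/space conflicts."""
    # Sort so range_a is the earlier of the two, then measure the gap.
    (a_start, a_end), (b_start, b_end) = sorted([range_a, range_b])
    return b_start - a_end >= min_gap

# Canny on 0.0-0.3 and Depth on 0.5-1.0 leave a 0.2 gap, which is fine;
# 0.0-0.4 and 0.5-1.0 leave only 0.1, which is too tight.
```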

4. Common Issues Quick Reference

Q1: Control effect suddenly disappears?
✅ Check if end_percent ends too early (recommended ≥0.8)
✅ Confirm no other ControlNet overlaps in the area

Q2: Generation results show ghosting?
✅ Reduce stage overlap (recommended ≤20%)
✅ Set exclusion masks for conflicting ControlNets

Q3: How to optimize for insufficient VRAM?
✅ Use stepped stage configuration (example: 0.0-0.3 → 0.4-0.6 → 0.7-1.0)
✅ Reduce non-critical ControlNet resolution to 512px
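The stepped stage configuration from Q3 can also be generated programmatically. A sketch under the assumption that n controls share the 0~1 range with a fixed gap between them (the function name is ours):

```python
def stepped_stages(n: int, gap: float = 0.1) -> list:
    """Split [0, 1] into n equal stages separated by `gap`, so that only one
    ControlNet is active at any point of the generation (lower peak VRAM)."""
    span = (1.0 - gap * (n - 1)) / n
    stages, start = [], 0.0
    for _ in range(n):
        stages.append((round(start, 3), round(start + span, 3)))
        start += span + gap
    return stages

# stepped_stages(3) -> [(0.0, 0.267), (0.367, 0.633), (0.733, 1.0)],
# a schedule in the spirit of the 0.0-0.3 -> 0.4-0.6 -> 0.7-1.0 example above.
```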

Apply ControlNet (OLD) Node Description

This is an early version of the Apply ControlNet node. Its options have since been updated, but for compatibility, workflows built with the old node will still display it when loaded in ComfyUI. You can switch to the new Apply ControlNet node.

Apply ControlNet (OLD) Input Types

| Parameter | Data Type | Function |
| --- | --- | --- |
| conditioning | CONDITIONING | Conditioning data from CLIP Text Encoder or other conditioning inputs (such as another conditioning node) |
| control_net | CONTROL_NET | The ControlNet model to apply, typically loaded by a ControlNet Loader node |
| image | IMAGE | Image used for ControlNet guidance; must first be processed by the matching preprocessor |
| strength | FLOAT | Controls the strength of the ControlNet's influence, range 0~10. Values between 0.5 and 1.5 are usually reasonable: lower values give the model more freedom, higher values impose stricter constraints, and values that are too high may produce strange images. |

Apply ControlNet (OLD) Output Types

| Parameter | Data Type | Function |
| --- | --- | --- |
| conditioning | CONDITIONING | Conditioning data processed by the ControlNet; can feed the next Apply ControlNet node or a KSampler node |