
Apply ControlNet

💡

This documentation covers the current Apply ControlNet node (formerly Apply ControlNet (Advanced)). The earliest Apply ControlNet node has been renamed Apply ControlNet (Old). For compatibility, you may still see the Apply ControlNet (Old) node in workflows downloaded from comfyui.org, but it can no longer be found through search or the node list. Please use the Apply ControlNet node instead.


This node applies a ControlNet (such as Depth, OpenPose, Canny, or HED) to the given conditioning, guiding image generation according to the reference image and the specified strength.

Documentation

  • Class name: ControlNetApplyAdvanced
  • Category: conditioning
  • Output node: False

Using a ControlNet requires preprocessing the input image. Since ComfyUI's built-in nodes do not include preprocessors or ControlNet models, please first install the ControlNet preprocessors (download preprocessors here) and the corresponding ControlNet models.

Input Types

  • positive (CONDITIONING): Positive conditioning data, from a CLIP Text Encode node or another conditioning input
  • negative (CONDITIONING): Negative conditioning data, from a CLIP Text Encode node or another conditioning input
  • control_net (CONTROL_NET): The ControlNet model to apply, typically from a ControlNet Loader node
  • image (IMAGE): The reference image for the ControlNet; it must first be run through the matching preprocessor
  • vae (VAE): VAE model input
  • strength (FLOAT): Strength of the ControlNet's influence, range 0~10. Values between 0.5 and 1.5 are reasonable: lower values give the model more freedom, higher values impose stricter constraints, and values that are too high may produce strange images. Test and adjust this value to fine-tune the ControlNet's influence.
  • start_percent (FLOAT): Value 0.000~1.000; when to start applying the ControlNet, as a fraction of the diffusion process. For example, 0.2 means ControlNet guidance starts influencing generation at 20% of sampling.
  • end_percent (FLOAT): Value 0.000~1.000; when to stop applying the ControlNet, as a fraction of the diffusion process. For example, 0.8 means ControlNet guidance stops influencing generation at 80% of sampling.
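To make start_percent and end_percent concrete, the sketch below assumes the percentages map linearly onto the sampler's step count; the helper function is hypothetical and not part of ComfyUI's API.

```python
def active_step_range(total_steps: int, start_percent: float, end_percent: float) -> tuple[int, int]:
    """Map start/end percentages onto sampler step indices.

    Hypothetical helper: assumes a linear mapping from the 0.000~1.000
    percentage range onto the sampler's total step count.
    """
    start = round(total_steps * start_percent)
    end = round(total_steps * end_percent)
    return start, end

# With the example values above on a 30-step run, ControlNet guidance
# is active from step 6 through step 24.
print(active_step_range(30, 0.2, 0.8))  # -> (6, 24)
```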

Output Types

  • positive (CONDITIONING): Positive conditioning data processed by the ControlNet; can be passed to the next Apply ControlNet node or to a KSampler node
  • negative (CONDITIONING): Negative conditioning data processed by the ControlNet; can be passed to the next Apply ControlNet node or to a KSampler node
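To show how these inputs and outputs wire together, here is a hypothetical fragment of a ComfyUI API-format workflow, written as a Python dict. The node IDs ("3" through "11") are placeholders, and connections use ComfyUI's [source_node_id, output_index] convention; treat this as a sketch, not a complete workflow.

```python
# Hypothetical API-format workflow fragment (node IDs are placeholders).
workflow_fragment = {
    "10": {
        "class_type": "ControlNetApplyAdvanced",  # the current Apply ControlNet node
        "inputs": {
            "positive": ["3", 0],     # CONDITIONING from a CLIP Text Encode node
            "negative": ["4", 0],     # CONDITIONING from a CLIP Text Encode node
            "control_net": ["5", 0],  # CONTROL_NET from a ControlNet Loader node
            "image": ["6", 0],        # IMAGE already run through a preprocessor
            "vae": ["7", 0],          # VAE model input
            "strength": 1.0,          # within the recommended 0.5~1.5 range
            "start_percent": 0.0,     # apply guidance from the start of sampling
            "end_percent": 1.0,       # ...through to the end of sampling
        },
    },
    "11": {
        "class_type": "KSampler",
        "inputs": {
            # Outputs 0 (positive) and 1 (negative) of Apply ControlNet
            # feed the sampler; model, latent, seed, steps, etc. are
            # omitted from this sketch.
            "positive": ["10", 0],
            "negative": ["10", 1],
        },
    },
}
```

Chaining a second ControlNet would work the same way: its positive/negative inputs would take outputs 0 and 1 of node "10" instead of the sampler doing so.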
💡

If you want to use T2I-Adapter style models, please use the Apply Style Model node instead.

ComfyUI ControlNet Usage Examples

Visit the following pages for examples:

Apply ControlNet (OLD) Node Description

This is an early version of the Apply ControlNet node. Its options have since been updated, but for compatibility, workflows you download that use the old node will still display it in ComfyUI. You can switch it to the new Apply ControlNet node.

Apply ControlNet (OLD) Input Types

  • conditioning (CONDITIONING): Conditioning data from a CLIP Text Encode node or another conditioning input
  • control_net (CONTROL_NET): The ControlNet model to apply, typically from a ControlNet Loader node
  • image (IMAGE): The reference image for the ControlNet; it must first be run through the matching preprocessor
  • strength (FLOAT): Strength of the ControlNet's influence, range 0~10. Values between 0.5 and 1.5 are reasonable: lower values give the model more freedom, higher values impose stricter constraints, and values that are too high may produce strange images.

Apply ControlNet (OLD) Output Types

  • conditioning (CONDITIONING): Conditioning data processed by the ControlNet; can be passed to the next Apply ControlNet node or to a KSampler node