ExpressionEditor Node Documentation
Overview
The ExpressionEditor node is an integral part of the ComfyUI-AdvancedLivePortrait project, which is designed to edit and manipulate facial expressions within images. The node provides an interface for modifying expressions through head rotation, blinking, eyebrow movement, and other facial features. Users can apply these changes directly to a source image or blend in expressions extracted from a sample image, giving creative control over the appearance of faces in photographs and videos.
Functionality
The ExpressionEditor node enables users to:
- Adjust facial expressions including eye and mouth movements.
- Manipulate the rotation of facial features in terms of yaw, pitch, and roll.
- Blend expressions from sample images with the source image.
- Apply additive expressions from previously saved data.
- Preview output images to evaluate changes in real-time.
Inputs
The node accepts the following inputs:
- rotate_pitch, rotate_yaw, rotate_roll: Adjust head rotation in degrees around the x, y, and z axes, respectively.
- blink: Controls the degree of blinking.
- eyebrow: Adjusts eyebrow movements.
- wink: Specifies the intensity of a wink.
- pupil_x, pupil_y: Move the pupils horizontally and vertically.
- aaa, eee, woo, smile: Adjust mouth shapes (the "aaa", "eee", and "woo" sounds) and the degree of smiling.
- src_ratio: Determines the influence of the source image's current expression.
- sample_ratio: Determines the influence of the expressions extracted from the sample image.
- sample_parts: Specifies which parts of the expression are applied from the sample image. Options include "OnlyExpression," "OnlyRotation," "OnlyMouth," "OnlyEyes," and "All."
- crop_factor: A scaling factor for cropping the face region.
Optional image inputs include:
- src_image: The source image for expression editing.
- sample_image: A reference image that can be used to extract expressions.
- motion_link: A motion link from an upstream node whose accumulated expression and pose data are carried into this edit.
- add_exp: A previously saved expression set (for example, one loaded with LoadExpData) whose values are added on top of the current edit.
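As a rough orientation, the sketch below shows how these inputs might be declared through ComfyUI's standard INPUT_TYPES interface. The ranges, defaults, and custom type names ("EDITOR_LINK", "EXP_DATA") are illustrative assumptions rather than the project's actual values; consult the node's source in ComfyUI-AdvancedLivePortrait for the exact definition.

```python
class ExpressionEditor:
    """Interface sketch only; ranges, defaults, and custom type names are assumptions."""

    @classmethod
    def INPUT_TYPES(cls):
        # Generic float control reused for every slider below; the real node
        # defines per-parameter ranges and defaults.
        f = ("FLOAT", {"default": 0.0, "min": -100.0, "max": 100.0, "step": 0.1})
        return {
            "required": {
                "rotate_pitch": f, "rotate_yaw": f, "rotate_roll": f,
                "blink": f, "eyebrow": f, "wink": f,
                "pupil_x": f, "pupil_y": f,
                "aaa": f, "eee": f, "woo": f, "smile": f,
                "src_ratio": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 1.0}),
                "sample_ratio": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 1.0}),
                "sample_parts": (["OnlyExpression", "OnlyRotation", "OnlyMouth", "OnlyEyes", "All"],),
                "crop_factor": ("FLOAT", {"default": 1.7, "min": 1.0, "max": 3.0}),
            },
            "optional": {
                "src_image": ("IMAGE",),
                "sample_image": ("IMAGE",),
                "motion_link": ("EDITOR_LINK",),  # placeholder type name
                "add_exp": ("EXP_DATA",),         # placeholder type name
            },
        }
```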
Outputs
The node generates the following outputs:
- image: The edited image with the applied expression changes.
- motion_link: An updated motion link containing expression and pose changes, which can be used in further processing.
- save_exp: The expression set data reflecting the applied changes, which can be saved for future use.
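Continuing the class sketch above, the three outputs correspond to the node's return declaration. Only the output names and their meaning come from this documentation; the type strings, method name, and category below are placeholders.

```python
    # Continuation of the ExpressionEditor class sketch above.
    RETURN_TYPES = ("IMAGE", "EDITOR_LINK", "EXP_DATA")   # placeholder type names
    RETURN_NAMES = ("image", "motion_link", "save_exp")
    FUNCTION = "run"                                      # method ComfyUI invokes
    CATEGORY = "AdvancedLivePortrait"                     # assumed category label

    def run(self, **kwargs):
        # The real node crops the face, retargets the expression, and returns a
        # tuple matching RETURN_TYPES: (edited image, updated motion link,
        # expression set that can be saved for reuse).
        raise NotImplementedError("sketch only; see the project source for the real implementation")
```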
Usage in ComfyUI Workflows
The ExpressionEditor node can be used in various ComfyUI workflows that involve facial expression manipulation. Users may leverage this node to:
- Create dynamic facial animations from static images.
- Simulate different facial expressions for avatars or characters in creative projects.
- Enhance videos by integrating facial expression adjustments frame by frame.
- Save time by previewing changes instantly and adjusting parameters as needed.
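For programmatic use, a workflow containing the node can be queued against a running ComfyUI instance through the standard /prompt HTTP API. The sketch below assumes a local server on port 8188, uses the built-in LoadImage and SaveImage nodes, shows only representative parameter values, and assumes the edited image is the node's first output slot; a real prompt must supply every required ExpressionEditor input with values valid for the installed version.

```python
# Sketch: queue a LoadImage -> ExpressionEditor -> SaveImage workflow through
# ComfyUI's HTTP API. Node ids, the parameter values, and the assumption that
# the edited image is output slot 0 are illustrative only.
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188/prompt"  # default local ComfyUI address

workflow = {
    "1": {"class_type": "LoadImage",
          "inputs": {"image": "portrait.png"}},           # file in ComfyUI's input folder
    "2": {"class_type": "ExpressionEditor",
          "inputs": {
              "src_image": ["1", 0],                      # IMAGE output of LoadImage
              "rotate_pitch": 0.0, "rotate_yaw": 10.0, "rotate_roll": 0.0,
              "blink": 0.0, "eyebrow": 5.0, "wink": 0.0,
              "pupil_x": 0.0, "pupil_y": 0.0,
              "aaa": 0.0, "eee": 0.0, "woo": 0.0, "smile": 1.0,
              "src_ratio": 1.0, "sample_ratio": 1.0,
              "sample_parts": "All", "crop_factor": 1.7,
          }},
    "3": {"class_type": "SaveImage",
          "inputs": {"images": ["2", 0],                  # edited image (slot 0 assumed)
                     "filename_prefix": "expression_edit"}},
}

req = urllib.request.Request(
    COMFY_URL,
    data=json.dumps({"prompt": workflow}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
print(urllib.request.urlopen(req).read().decode("utf-8"))
```

For frame-by-frame video work, the same prompt can be re-queued with per-frame parameter values, or motion_link can be chained between ExpressionEditor nodes.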
Special Features and Considerations
- Real-Time Preview: The node provides a real-time preview of the adjusted image, helping users make informed decisions quickly.
- Expression Blending: Users can blend existing expressions with those extracted from other images, providing nuanced control over facial dynamics.
- Integration with Other Nodes: The node integrates with LoadExpData and SaveExpData so that expression sets can be saved and reloaded across workflows (see the fragment after this list).
- Parameter Rich: A wide array of parameters allows for detailed control over expression details, enhancing creative flexibility.
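As a hedged illustration of that integration, the fragment below extends the API-format workflow sketch from the usage section: the save_exp output feeds a SaveExpData node, and a set loaded with LoadExpData could be wired into a later add_exp input. The input field names on SaveExpData and LoadExpData ("file_name", "save_exp") are hypothetical; check those nodes' own definitions for the real fields.

```python
# Fragment extending the `workflow` dict from the usage sketch above; the
# input names on SaveExpData and LoadExpData shown here are hypothetical.
workflow.update({
    "4": {"class_type": "SaveExpData",
          "inputs": {"file_name": "my_smile",
                     "save_exp": ["2", 2]}},   # save_exp output of node "2" (slot 2 assumed)
    "5": {"class_type": "LoadExpData",
          "inputs": {"file_name": "my_smile"}},
})
# A later ExpressionEditor node could then receive ["5", 0] on its add_exp input.
```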
By using the ExpressionEditor node, ComfyUI users harness powerful tools for facial expression editing, enabling intuitive and detailed character creation for their projects.