ComfyUI LLM Party

Overview

The ComfyUI LLM Party repository provides a comprehensive set of custom nodes for building workflows around large language models (LLMs) within the ComfyUI environment. It integrates LLM functionality directly into existing image-processing workflows, allowing users to create personalized AI-powered solutions with ease.

Key Features

  • Comprehensive Node Set: A diverse range of nodes is implemented to facilitate LLM-based workflows, spanning API calls, model management, and workflow integration.
  • Versatile LLM Integration: Support for various LLM-related tasks, from basic API interactions and advanced role-setting to localized knowledge base management and complex agent interactions.
  • Model Compatibility: Works with multiple API formats (OpenAI, Ollama, Azure, and others) and with various local models, including those loaded through the Transformers library (see the sketch after this list).
  • Quick Start Packages: Provides a Windows-specific portable package for simplified installation and setup of ComfyUI with LLM capabilities.
  • Interactive Features: Real-time text output from API calls and streaming services for dynamic user interaction.
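
As context for the API compatibility above, the sketch below shows the OpenAI-style chat-completions request that such endpoints share. It is an illustration only, not code from this repository; the base URL (a local Ollama server on its default port) and the model name are assumptions for the example.

    # Minimal sketch of an OpenAI-compatible chat call, the request shape
    # shared by OpenAI, Azure, Ollama, and similar endpoints.
    # The base_url and model below are assumptions for this example.
    from openai import OpenAI  # requires openai >= 1.0

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # assumed: local Ollama endpoint
        api_key="not-needed-for-local",        # real keys required for OpenAI/Azure
    )

    response = client.chat.completions.create(
        model="llama3",  # assumed model name
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize ComfyUI in one sentence."},
        ],
    )
    print(response.choices[0].message.content)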

Installation

To install the ComfyUI LLM Party repository, follow one of these methods:

Method 1: ComfyUI Manager

  1. Search for comfyui_LLM_party in the ComfyUI Manager.
  2. Install it with one click.
  3. Restart ComfyUI.

Method 2: Git Clone

  1. Navigate to the custom_nodes subfolder under the ComfyUI root folder.

  2. Clone the repository using:

    git clone https://github.com/heshengtao/comfyui_LLM_party.git
    

Method 3: Manual Download

  1. Click the Code button in the upper right corner of the GitHub page.
  2. Select Download ZIP.
  3. Unzip the downloaded package into the custom_nodes subfolder under the ComfyUI root folder.

Environment Deployment

  1. Navigate to the comfyui_LLM_party project folder.

  2. Install required third-party libraries with:

    pip install -r requirements.txt
    
  3. If you are using the ComfyUI launcher, run the same installation with its embedded Python:

    path_in_launcher_configuration\python_embeded\python.exe -m pip install -r requirements.txt
    

Node Categories and Examples

The repository includes a wide range of nodes to support diverse functionality. Highlights are summarized below.

Special Features

  • Streaming API Output: View real-time streaming of API-based LLM responses directly in the console (see the sketch after this list).
  • Multi-Platform Integration: Provides tools for connecting to platforms like QQ, Feishu, and Discord for broader application.
  • Powerful Image Support: Integrates image-support capabilities such as image hosting and converting URLs to images.
  • Extensive Model Support: Integrates with a wide range of local and cloud-based models and APIs for enhanced utility.
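
To illustrate the streaming behavior referenced above, here is a minimal sketch of token-by-token streaming against an OpenAI-compatible endpoint. It is not this repository's implementation; the endpoint and model name are the same assumptions used in the earlier example.

    # Minimal streaming sketch against an OpenAI-compatible endpoint.
    # Tokens are printed as they arrive, which is the kind of real-time
    # console output the streaming feature refers to. Endpoint and model
    # are assumptions for this example.
    from openai import OpenAI  # requires openai >= 1.0

    client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed-for-local")

    stream = client.chat.completions.create(
        model="llama3",  # assumed model name
        messages=[{"role": "user", "content": "Write a haiku about image generation."}],
        stream=True,
    )
    for chunk in stream:
        # Each chunk carries an incremental piece of the reply.
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
    print()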

Use in ComfyUI Workflows

  • Rapid LLM Workflow Configuration: Use the extensive suite of nodes to fast-track the creation of custom LLM pipelines (see the programmatic sketch after this list).
  • Integration with Image Processing: Easily combine LLM capabilities with image-based ComfyUI workflows for enriched multimedia applications.
  • Custom AI Assistants: Set up personalized AI solutions using role-setting nodes and multi-agent interactions for specialized tasks.
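
Workflows assembled from these nodes can also be driven programmatically. The sketch below queues a workflow exported with ComfyUI's "Save (API Format)" option against a locally running instance through the standard /prompt endpoint; the file name and the default address are assumptions for the example.

    # Minimal sketch: queue an exported workflow (API-format JSON) on a
    # running ComfyUI instance via its /prompt endpoint.
    # The file name and default address are assumptions for this example.
    import json
    import urllib.request

    with open("llm_party_workflow_api.json", encoding="utf-8") as f:  # assumed file name
        workflow = json.load(f)

    payload = json.dumps({"prompt": workflow}).encode("utf-8")
    req = urllib.request.Request(
        "http://127.0.0.1:8188/prompt",  # ComfyUI's default address
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))  # response includes the queued prompt_id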

For more detailed guidance and tutorials, refer to the additional resources provided in the repository.