Update README.md

Joel Andrés Navarro
2025-06-09 09:25:43 +00:00
committed by system
parent 35e2f6f651
commit 58fed0bce1


@@ -13,10 +13,11 @@ language:
 license: apache-2.0
 ---
-This is a merge of [Wan-AI/Wan2.1-VACE-14B](https://huggingface.co/Wan-AI/Wan2.1-VACE-14B) and [vrgamedevgirl84/Wan14BT2VFusionX](https://huggingface.co/vrgamedevgirl84/Wan14BT2VFusioniX) to provide additional VACE compatibility.
+This is a merge of [Wan-AI/Wan2.1-VACE-14B](https://huggingface.co/Wan-AI/Wan2.1-VACE-14B) scopes and [vrgamedevgirl84/Wan14BT2VFusionX](https://huggingface.co/vrgamedevgirl84/Wan14BT2VFusioniX).
 The process involved extracting VACE scopes and injecting into the target models.
-FP8 model weight was then converted to specific FP8 formats (E4M3FN and E5M2) using a custom ComfyUI node developed by lum3on, available at the [ComfyUI-ModelQuantizer](https://github.com/lum3on/ComfyUI-ModelQuantizer) GitHub repository.
+- FP8 model weight was then converted to specific FP8 formats (E4M3FN and E5M2) using ComfyUI custom node [ComfyUI-ModelQuantizer](https://github.com/lum3on/ComfyUI-ModelQuantizer) by [lum3on](https://github.com/lum3on).
 ## Usage
@@ -36,5 +37,5 @@ The model files can be used in [ComfyUI](https://github.com/comfyanonymous/Comfy
 ## Reference
-- For more information about the GGUF-quantized versions, refer to [QuantStack/Wan-14B-T2V-FusionX-VACE-GGUF](https://huggingface.co/QuantStack/Wan-14B-T2V-FusionX-VACE-GGUF), where the quantization process is explained.
+- For more information about the GGUF-quantized versions, refer to [QuantStack/Wan-14B-T2V-FusionX-VACE-GGUF](https://huggingface.co/QuantStack/Wan-14B-T2V-FusionX-VACE-GGUF).
 - For an overview of Safetensors format, please see the [Safetensors](https://huggingface.co/docs/safetensors/index).
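For orientation, the E4M3FN format mentioned in the diff is an 8-bit float with 1 sign bit, 4 exponent bits (bias 7), 3 mantissa bits, no infinities, and a maximum finite value of 448. The sketch below is a pure-Python illustration of that rounding behavior only; it is not how ComfyUI-ModelQuantizer performs the conversion, and the function name is hypothetical:

```python
import math

def quantize_e4m3fn(x: float) -> float:
    """Round x to the nearest value representable in FP8 E4M3FN.

    Illustrative sketch only: 1 sign bit, 4 exponent bits (bias 7),
    3 mantissa bits, no infinities (values beyond 448 saturate).
    """
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    a = abs(x)
    MAX_FINITE = 448.0        # largest finite E4M3FN value
    MIN_NORMAL = 2.0 ** -6    # exponent field 1 -> 2^(1 - 7)
    if a >= MAX_FINITE:
        return sign * MAX_FINITE          # saturate: the "FN" variant has no inf
    if a < MIN_NORMAL:
        step = 2.0 ** -9                  # subnormal spacing: 2^-6 * 2^-3
        return sign * round(a / step) * step
    e = math.floor(math.log2(a))          # binade of a
    step = 2.0 ** (e - 3)                 # 3 mantissa bits -> 8 steps per binade
    return sign * min(round(a / step) * step, MAX_FINITE)
```

E5M2 trades this precision for range (5 exponent bits, only 2 mantissa bits), which is why the node offers both targets; the choice depends on whether the weights need finer steps or a wider dynamic range.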