Last updated: 2024-04-03 (Wed) 16:06:30

ExLlamaV2

ExLlamaV2 is an inference library for running local LLMs on modern consumer GPUs.

https://github.com/turboderp/exllamav2
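A minimal usage sketch, adapted from the project's README: load a quantized model directory, split it across available GPU memory, and generate text. The model path is a placeholder, and running this requires a CUDA-capable GPU plus EXL2- or GPTQ-format weights, so treat it as an illustration rather than a guaranteed-working snippet.

```python
# Sketch of basic ExLlamaV2 text generation (requires a GPU and
# a local quantized model directory; the path below is a placeholder).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Point the config at a downloaded EXL2/GPTQ model directory.
config = ExLlamaV2Config("/path/to/model")
model = ExLlamaV2(config)

# Lazy cache + autosplit lets the loader spread layers across GPU VRAM.
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

output = generator.generate(prompt="Once upon a time,", max_new_tokens=128)
print(output)
```

The dynamic generator also supports batched prompts and paged attention on recent GPUs; see the repository's examples directory for fuller samples.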

Models

Related