klotz: quantization methods


  1. The article discusses fine-tuning large language models (LLMs) with QLoRA using different quantization methods: AutoRound, AQLM, GPTQ, AWQ, and bitsandbytes. It compares their output quality and speed, and recommends AutoRound as the best balance of the two.
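As context for the bookmark above, QLoRA fine-tuning typically pairs a quantization config with a LoRA adapter config. A minimal sketch, assuming the Hugging Face `transformers` + `peft` stack with bitsandbytes 4-bit quantization (one of the methods the article compares; the specific parameter values here are illustrative, not the article's):

```python
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit NF4 quantization for the frozen base model (bitsandbytes backend).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",            # NormalFloat4, the QLoRA default
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,       # quantize the quantization constants too
)

# Low-rank adapters trained on top of the quantized weights.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
```

These configs would then be passed to `AutoModelForCausalLM.from_pretrained(..., quantization_config=bnb_config)` and `peft.get_peft_model(model, lora_config)` respectively; methods like GPTQ or AWQ swap in their own pre-quantized checkpoints instead of on-the-fly bitsandbytes quantization.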



Propulsed by SemanticScuttle