klotz: lonestriker* + llm* + everyone* + quantization* + moe* + coder* + mistral* + frankenmoe*


  1. Not Mixtral MoE but Merge-kit MoE

    - What makes a perfect MoE: The secret formula
    - Why is a proper merge considered a base model, and how do we distinguish it from a FrankenMoE?
    - Why community-wide collaboration is the only way we will get Mixtral-style MoE right
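
    The distinction the post draws is between a Mixtral-style MoE trained end-to-end and a "FrankenMoE" assembled from existing dense models with mergekit's `mergekit-moe` tool. As a rough illustration (the model names below are placeholders, and field names follow mergekit's documented `mergekit-moe` YAML format), such an assembly is driven by a config like:

    ```yaml
    # Sketch of a mergekit-moe config: combines dense Mistral-7B fine-tunes
    # into a sparse MoE by routing on positive prompts. Model IDs are
    # illustrative placeholders, not a recommendation.
    base_model: mistralai/Mistral-7B-v0.1
    gate_mode: hidden        # route using hidden-state similarity to the prompts
    dtype: bfloat16
    experts:
      - source_model: example-org/mistral-7b-code-finetune
        positive_prompts:
          - "Write a Python function"
          - "Fix this bug"
      - source_model: example-org/mistral-7b-chat-finetune
        positive_prompts:
          - "Explain in simple terms"
    ```

    The gating here is derived from the listed prompts rather than learned jointly with the experts, which is why the post argues such merges should not be conflated with a trained Mixtral base model.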



Propulsed by SemanticScuttle