DavidAU's model collection on Hugging Face includes various AI and ML models, such as GALAXY-XB, Mini-MOEs, TinyLlama, and Psyonic-Cetacean. These models target text generation, single- and multi-LLM configurations, and automation tasks.
This article explains how to use the Sentence Transformers library to finetune and train embedding models for a variety of applications, such as retrieval-augmented generation, semantic search, and semantic textual similarity. It covers the core training components: the dataset format, loss function, training arguments, evaluators, and trainer.
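As a rough illustration of how those components fit together, below is a minimal sketch of a Sentence Transformers (v3-style) finetuning run. The base model, dataset, and hyperparameter values are illustrative assumptions, not taken from the article itself.

```python
# Minimal sketch of the Sentence Transformers training workflow:
# model -> dataset -> loss -> training arguments -> evaluator -> trainer.
# Model name, dataset, and hyperparameters below are assumed examples.
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.evaluation import TripletEvaluator
from sentence_transformers.losses import MultipleNegativesRankingLoss

# 1. Load a base model to finetune into an embedding model.
model = SentenceTransformer("microsoft/mpnet-base")

# 2. Load a dataset whose columns match the chosen loss
#    (here: anchor / positive / negative triplets).
dataset = load_dataset("sentence-transformers/all-nli", "triplet")
train_dataset = dataset["train"].select(range(10_000))
eval_dataset = dataset["dev"]

# 3. Pick a loss function compatible with the dataset format.
loss = MultipleNegativesRankingLoss(model)

# 4. Define training arguments (values here are illustrative).
args = SentenceTransformerTrainingArguments(
    output_dir="models/mpnet-base-all-nli",
    num_train_epochs=1,
    per_device_train_batch_size=32,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    eval_strategy="steps",
    eval_steps=500,
)

# 5. Optionally add an evaluator to track embedding quality during training.
dev_evaluator = TripletEvaluator(
    anchors=eval_dataset["anchor"],
    positives=eval_dataset["positive"],
    negatives=eval_dataset["negative"],
    name="all-nli-dev",
)

# 6. Train with the SentenceTransformerTrainer and save the result.
trainer = SentenceTrainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
    evaluator=dev_evaluator,
)
trainer.train()
model.save_pretrained("models/mpnet-base-all-nli/final")
```

The key design point the article describes is that the loss function dictates the required dataset columns (pairs, triplets, or scored pairs), while the trainer ties the model, data, loss, arguments, and evaluators together.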