• A beginner's guide to understanding Hugging Face Transformers, a library that provides access to thousands of pre-trained transformer models for natural language processing, computer vision, and more.
• The guide covers the basics of Hugging Face Transformers, including what it is, how it works, and how to use it, with a simple example of running Microsoft's Phi-2 LLM in a notebook.
• The guide is designed for non-technical individuals who want to understand open-source machine learning without prior knowledge of Python or machine learning.
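The notebook example the bullets describe can be sketched with the Transformers `pipeline` API. The model id `microsoft/phi-2` is its official Hugging Face Hub name; the instruct-style prompt format follows Phi-2's model card, while the function names and generation settings here are illustrative assumptions, not code from the guide itself.

```python
def build_prompt(question: str) -> str:
    # Phi-2's model card suggests an "Instruct: ... Output:" QA prompt format.
    return f"Instruct: {question}\nOutput:"


def ask_phi2(question: str) -> str:
    # Deferred import: transformers (and torch) are heavy, optional dependencies.
    from transformers import pipeline

    # The first call downloads several GB of weights from the Hugging Face Hub;
    # a GPU helps, but Phi-2 is small enough to run (slowly) on CPU.
    generator = pipeline("text-generation", model="microsoft/phi-2")
    result = generator(build_prompt(question), max_new_tokens=100)
    return result[0]["generated_text"]
```

In a notebook you would then call, for example, `print(ask_phi2("Explain attention in one sentence."))`.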
Get models like Phi-2, Mistral, and LLaVA running locally on a Raspberry Pi with Ollama
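The Ollama route above comes down to a few terminal commands. The installer URL is Ollama's official one, and `phi`, `mistral`, and `llava` are the tags these models carry in Ollama's model library; exact Raspberry Pi performance will depend on the board's RAM.

```shell
# Install Ollama (the official installer script supports Linux,
# including Raspberry Pi OS):
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with Phi-2 (tagged "phi" in Ollama's library):
ollama run phi "Why is the sky blue?"

# The other models mentioned in the article:
ollama run mistral
ollama run llava
```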
"Inside Phi 2: Microsoft's Small Language Model" explores the development of tailored AI models that minimize resource usage, with a focus on small language models (SLMs). The article discusses Microsoft Research's approach to building generative AI models, highlighting the "textbooks are all you need" strategy used to train its Phi series of SLMs. Key aspects include curating high-quality training data, generating synthetic content, and fine-tuning the model with domain-specific information. This approach has produced surprisingly strong results, with some benchmarked SLMs performing on par with, or even better than, much larger LLMs like GPT.