This article explains and visualizes sampling strategies used by Large Language Models (LLMs) to generate text, focusing on parameters like temperature and top-p. By understanding these parameters, users can tailor LLM output for different use cases.
Understand temperature, Top-k, Top-p, frequency penalty, and presence penalty, the key LLM sampling hyperparameters, once and for all with visual examples.
Deep learning powers many NLP tasks, such as machine translation, image captioning, and dialogue systems. In machine translation, a model reads text in the source language (input) and generates text in the target language (output). Similarly, in a dialogue system, the model generates a response given a conversational context. Producing text this way is known as Natural Language Generation (NLG).
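As a preview of where the sampling parameters discussed below come into play, here is a minimal sketch of an autoregressive decoding loop in plain Python/NumPy. The model function `next_token_logits` is a hypothetical stand-in (not from any specific library) that returns unnormalized scores over the vocabulary; temperature rescales those scores before sampling.

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    exp = np.exp(z)
    return exp / exp.sum()

def sample_next_token(logits, temperature=1.0, rng=None):
    # Rescale logits by the temperature, then sample one token id
    # from the resulting probability distribution over the vocabulary.
    rng = rng or np.random.default_rng()
    probs = softmax(np.asarray(logits, dtype=np.float64) / temperature)
    return int(rng.choice(len(probs), p=probs))

def generate(next_token_logits, context, max_new_tokens=20, temperature=1.0):
    # Autoregressive loop: repeatedly ask the (assumed) model for
    # next-token logits, sample a token, and append it to the context.
    tokens = list(context)
    rng = np.random.default_rng(0)
    for _ in range(max_new_tokens):
        logits = next_token_logits(tokens)  # hypothetical model call
        tokens.append(sample_next_token(logits, temperature, rng))
    return tokens
```

The strategies covered in this article (greedy decoding, temperature, Top-k, Top-p, and the repetition penalties) all differ only in how the `sample_next_token` step turns the model's scores into the next token.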