klotz: word2vec* + machine learning*



  1. - Embeddings transform words and sentences into sequences of numbers so that computers can work with language.
    - This technology powers tools like Siri, Alexa, Google Translate, and generative AI systems like ChatGPT, Bard, and DALL-E.
    - Early embeddings were crafted by hand, which was time-consuming and adapted poorly to the nuances of language.
    - The 3D hand-crafted embedding app provides an interactive experience to understand this concept.
    - The star visualization method offers an intuitive way to understand word embeddings.
    - Machine learning models like Word2Vec and GloVe revolutionized the generation of word embeddings from large text datasets.
    - Universal Sentence Encoder (USE) extends the concept of word embeddings to entire sentences.
    - TensorFlow Projector is an advanced tool to interactively explore high-dimensional data like word and sentence embeddings.
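To make the embedding idea concrete, here is a minimal plain-Python sketch comparing words by cosine similarity; the 3-dimensional hand-crafted vectors and their axis meanings are toy values chosen for illustration:

```python
import math

# Toy hand-crafted embeddings: each word is a point in a 3-dimensional
# space whose axes might loosely mean (royalty, gender, edibility).
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.1],
    "apple": [0.0, 0.5, 0.2],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words sit closer together than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

Learned embeddings from Word2Vec or GloVe work the same way, only with hundreds of dimensions fitted from large corpora instead of hand-picked values.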
  2. t-Distributed Stochastic Neighbor Embedding (t-SNE) is a (prize-winning) technique for dimensionality reduction that is particularly well suited for the visualization of high-dimensional datasets. The technique can be implemented via Barnes-Hut approximations, allowing it to be applied on large real-world datasets. We applied it on data sets with up to 30 million examples. The technique and its variants are introduced in the following papers:
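As a sketch of how t-SNE is typically applied in practice (assuming scikit-learn is available; the toy two-cluster data and the parameter values below are illustrative):

```python
import numpy as np
from sklearn.manifold import TSNE

# Toy high-dimensional data: two well-separated clusters of 50-dimensional points.
rng = np.random.default_rng(0)
cluster_a = rng.normal(loc=0.0, scale=0.1, size=(20, 50))
cluster_b = rng.normal(loc=5.0, scale=0.1, size=(20, 50))
data = np.vstack([cluster_a, cluster_b])

# method="barnes_hut" is the Barnes-Hut approximation mentioned above;
# perplexity must be smaller than the number of samples.
tsne = TSNE(n_components=2, perplexity=10, method="barnes_hut", random_state=0)
embedded = tsne.fit_transform(data)
print(embedded.shape)  # one 2-D point per input example
```

The resulting 2-D coordinates are what tools like TensorFlow Projector plot for visual exploration.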
  3. from gensim.models import Phrases  # gensim's collocation detector for common bigrams such as "new_york"
    2021-08-30 by klotz
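The Phrases import above is gensim's phrase (collocation) detector. A minimal pure-Python sketch of the underlying bigram scoring idea, reimplemented here for illustration in the style of the word2vec/gensim default scorer rather than taken from gensim's API:

```python
from collections import Counter

# Tiny toy corpus of tokenized sentences.
sentences = [
    ["new", "york", "is", "big"],
    ["new", "york", "has", "taxis"],
    ["the", "city", "is", "old"],
]

# Count unigrams and adjacent bigrams across the corpus.
unigrams = Counter(w for s in sentences for w in s)
bigrams = Counter((a, b) for s in sentences for a, b in zip(s, s[1:]))

def score(bigram, min_count=1):
    """Bigram score in the style of word2vec/gensim:
    (count(ab) - min_count) * vocab_size / (count(a) * count(b))."""
    a, b = bigram
    return (bigrams[bigram] - min_count) * len(unigrams) / (unigrams[a] * unigrams[b])

# "new york" co-occurs consistently, so it scores higher than a chance pair;
# pairs above a chosen threshold would be joined into a single token "new_york".
print(score(("new", "york")))
print(score(("is", "old")))
```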


