- TabPFN is a novel foundation model designed for small- to medium-sized tabular datasets, with up to 10,000 samples and 500 features.
- It uses a transformer-based architecture with in-context learning (ICL): the labelled training set and the test points are fed to the model together in a single forward pass, with no per-dataset gradient training, allowing it to outperform traditional gradient-boosted decision trees on these datasets.
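To make the ICL interface concrete, here is a minimal sketch. The real TabPFN is a pretrained transformer; in this toy version a 1-nearest-neighbour rule stands in for the model, purely to show the "fit and predict in one call, no gradient training" shape of the interface. The function name `icl_predict` and the synthetic data are illustrative assumptions, not TabPFN's actual API.

```python
import numpy as np

def icl_predict(X_train, y_train, X_test):
    """Toy stand-in for an in-context learner like TabPFN.

    The real model receives the whole labelled training set plus the
    test rows in a single forward pass; here a 1-nearest-neighbour
    rule plays that role for illustration.
    """
    # Pairwise squared distances between each test row and each train row.
    d = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return y_train[d.argmin(axis=1)]  # label of the closest context row

# Tiny synthetic "tabular" task: two well-separated clusters.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 0.1, (20, 4)), rng.normal(1, 0.1, (20, 4))])
y_train = np.array([0] * 20 + [1] * 20)
X_test = np.array([[0.0, 0.0, 0.0, 0.0], [1.0, 1.0, 1.0, 1.0]])

print(icl_predict(X_train, y_train, X_test))  # → [0 1]
```

The point of the sketch is the calling convention: training data is context, not something the model is trained on per task.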
The article addresses two persistent challenges in recommendation systems: handling new users and items (the cold-start problem), and the computational inefficiency and poor scalability of traditional embedding-based models.
ByteDance introduced the Hierarchical Large Language Model (HLLM), designed to improve sequential recommendation. HLLM consists of two components: an Item LLM and a User LLM. The Item LLM extracts detailed features from item text descriptions and produces item embeddings; the User LLM then processes the sequence of these embeddings to predict the user's next interaction. Because item understanding is decoupled from user modelling, this hierarchical design handles new items and users efficiently.
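The two-level flow can be sketched in a few lines. Everything here is a hypothetical stand-in, not ByteDance's implementation: `item_llm` hashes character bigrams instead of running an LLM, and `user_llm` is a recency-weighted average instead of a transformer. The sketch only illustrates the data flow (item text → item embedding → user embedding → next-item scores) and why cold-start items are handled naturally.

```python
import numpy as np

def item_llm(text, dim=256):
    """Hypothetical stand-in for the Item LLM: maps an item description
    to a fixed-size embedding (here: hashed character bigrams)."""
    v = np.zeros(dim)
    for a, b in zip(text, text[1:]):
        v[hash((a, b)) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

def user_llm(item_embeddings):
    """Hypothetical stand-in for the User LLM: consumes the sequence of
    item embeddings and predicts an embedding for the next item
    (here: a recency-weighted average)."""
    E = np.stack(item_embeddings)
    w = np.linspace(0.5, 1.0, len(E))[:, None]  # favour recent items
    u = (w * E).sum(0)
    return u / np.linalg.norm(u)

titles = [
    "sci-fi space opera novel",
    "hard science fiction novel",
    "cyberpunk science fiction novel",
    "cookbook of italian pasta recipes",
]
# The Item LLM runs once per item, so new (cold-start) items get
# embeddings from their text alone, without any interaction history.
catalog = {t: item_llm(t) for t in titles}

history = ["sci-fi space opera novel", "hard science fiction novel"]
u = user_llm([catalog[t] for t in history])

# Score unseen items by similarity to the predicted next-item embedding.
scores = {t: float(u @ e) for t, e in catalog.items() if t not in history}
print(max(scores, key=scores.get))
```

Note the separation of concerns: item embeddings can be precomputed and cached, while the user model only ever sees the (much shorter) sequence of embeddings.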
Explore the architecture of TiDE (Time-series Dense Encoder), an MLP-based encoder-decoder model for long-horizon forecasting, and apply it in a forecasting project using Python.
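TiDE's core building block is a dense residual block: a two-layer MLP with a linear skip connection and layer normalisation, stacked into a dense encoder and decoder. The sketch below uses random weights purely to illustrate the shapes and data flow; the sizes (`lookback`, `horizon`, hidden widths) are arbitrary assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def residual_block(x, hidden, out):
    """One TiDE-style residual block (sketch): dense -> ReLU -> dense,
    plus a linear skip projection and layer normalisation.
    Weights are random here purely to illustrate the shapes."""
    W1 = rng.normal(0, 0.1, (x.shape[-1], hidden))
    W2 = rng.normal(0, 0.1, (hidden, out))
    Ws = rng.normal(0, 0.1, (x.shape[-1], out))  # skip projection
    h = np.maximum(x @ W1, 0.0) @ W2 + x @ Ws
    # Layer normalisation over the feature dimension.
    return (h - h.mean(-1, keepdims=True)) / (h.std(-1, keepdims=True) + 1e-6)

# Toy encoder-decoder over a flattened lookback window:
lookback, horizon = 24, 8
past = rng.normal(size=(1, lookback))            # one univariate window
z = residual_block(past, hidden=32, out=16)      # dense encoder
dec = residual_block(z, hidden=32, out=horizon)  # dense decoder -> forecast
print(dec.shape)  # (1, 8)
```

Because every component is a plain MLP, TiDE avoids the quadratic cost of attention over long windows, which is the design choice the article builds on.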