RAGs to Riches

I’ve been diving into the world of Retrieval Augmented Generation (RAG), a framework that lets large language models (LLMs) pull in external data sources at query time. This is a game-changer for overcoming the limitations of LLMs, particularly knowledge cutoffs. While retraining the model on new data is an option,…
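The core idea can be sketched in a few lines: retrieve the passages most relevant to a query, then prepend them to the prompt so the model can answer from data it was never trained on. This is a minimal sketch only; the toy corpus, the word-overlap scorer (standing in for a real embedding-based vector store), and the function names are all my own illustrative assumptions, not part of any particular RAG library.

```python
# Toy RAG retrieval sketch. Assumption: word-overlap scoring stands in
# for a real embedding similarity search over a vector store.

def retrieve(query, corpus, k=1):
    """Return the k passages sharing the most words with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, corpus):
    """Prepend retrieved context so the LLM can answer past its cutoff."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = [
    "The 2024 release added streaming support.",
    "Cats are mammals.",
]
print(build_prompt("What did the 2024 release add?", corpus))
```

A production setup would swap the overlap scorer for dense embeddings and an approximate-nearest-neighbor index, but the retrieve-then-prompt shape stays the same.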

Overflowing

Abstraction via Repetition

In a fastAI class, I asked the following: do any language models attempt to capture meaning? For instance, “I’m going to the store” is the opposite of “I’m not going to the store.” Or “I barely understand this stuff” and “That ball came so close to my…