LLM

Building an SAT Reading Prep Application

Explore the development of an SAT Reading Prep application using GPT-4o and Cursor.

How Do Hugging Face Transformers Work?

A deep dive into how Hugging Face Transformers works under the hood, exploring its pipeline architecture, model loading process, and the key functionality that makes it a powerful tool for working with transformer models.

Using an LLM as a Reranker

Explore the concept of rerankers, their role in enhancing search results, and how they leverage large language models to improve the relevance and accuracy of information retrieval.

How Is Attention Calculated?

A detailed exploration of how attention is calculated in the Transformer model, as introduced in the paper 'Attention Is All You Need'.
