Understanding Gradient Descent

Gradient Descent is one of the most important optimization methods used in machine learning and modeling. At its core, it is a systematic way for a model to improve step by step by reducing its mistakes over time. Think of Gradient Descent as walking down a hill. This process of slowly moving down the slope—making … Read more
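As a rough sketch of the idea in plain Python (the quadratic loss, starting point, and learning rate are illustrative assumptions, not taken from the full post):

```python
# Minimal gradient descent on a one-dimensional loss: loss(w) = (w - 3)^2.
# Its gradient (slope) is 2 * (w - 3); stepping against the slope walks
# "downhill" toward the minimum at w = 3.

def loss(w):
    return (w - 3.0) ** 2

def gradient(w):
    return 2.0 * (w - 3.0)

w = 10.0              # arbitrary starting point on the "hill"
learning_rate = 0.1   # how big each downhill step is

for step in range(50):
    w -= learning_rate * gradient(w)   # move a small step against the slope

print(f"final w = {w:.4f}, loss = {loss(w):.6f}")   # w ends up close to 3
```

If the learning rate is too large the steps overshoot the bottom of the hill; if it is too small the descent becomes very slow.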

Neural Network Explained

What is a Neural Network? Think of your brain as a giant web of tiny switches called neurons. These neurons are the reason we can see, think, and make decisions. They pass signals to one another and gradually learn from experience. Now, a neural network in a computer is designed to mimic this process. … Read more
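To make the "web of tiny switches" picture concrete, here is a minimal sketch of a single artificial neuron in Python; the inputs, weights, bias, and sigmoid activation are illustrative assumptions rather than anything from the full post:

```python
import math

def sigmoid(x):
    """Squash the combined signal into (0, 1), like a soft on/off switch."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    """One artificial neuron: weigh the incoming signals, add a bias, activate."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Three incoming signals with hand-picked (illustrative) weights.
print(neuron([0.5, 0.8, 0.2], [0.4, -0.6, 0.9], 0.1))
```

A network stacks many such neurons in layers, and learning amounts to gradually adjusting the weights and biases from experience (training data).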

Visualize the LLM

Have you seen how an LLM works internally? Recently I came across a site where the entire flow of an LLM can be visualized. Please check the link below: https://bbycroft.net/llm Transformer-based LLM Architecture (Components): Embedding, Layer Normalization, Self-Attention, Projection, Feed Forward Network (MLP), Transformer Block, Softmax and Output. Large Language Models (LLMs) such as GPT and BERT … Read more
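As a rough sketch of how the listed components fit together, here is a minimal single-head transformer block in Python/NumPy; the shapes, random weights, and the absence of a causal mask and multiple heads are simplifying assumptions for illustration, not the code behind the visualization:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Layer Normalization: zero mean, unit variance per token."""
    return (x - x.mean(axis=-1, keepdims=True)) / (x.std(axis=-1, keepdims=True) + eps)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(x, Wq, Wk, Wv, Wo):
    """Single-head Self-Attention followed by the output Projection."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    weights = softmax(q @ k.T / np.sqrt(k.shape[-1]))   # token-to-token attention
    return (weights @ v) @ Wo

def feed_forward(x, W1, W2):
    """Position-wise Feed Forward Network (MLP) with a ReLU in between."""
    return np.maximum(0.0, x @ W1) @ W2

def transformer_block(x, p):
    """One Transformer Block: attention and MLP, each with layer norm and a residual add."""
    x = x + self_attention(layer_norm(x), *p["attn"])
    x = x + feed_forward(layer_norm(x), *p["mlp"])
    return x

# Tiny illustrative example: 4 tokens from the Embedding step, model width 8.
rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(4, d))
p = {
    "attn": [rng.normal(size=(d, d)) for _ in range(4)],          # Wq, Wk, Wv, Wo
    "mlp":  [rng.normal(size=(d, 4 * d)), rng.normal(size=(4 * d, d))],
}
print(transformer_block(x, p).shape)   # (4, 8): same shape in, same shape out
```

A GPT-style LLM stacks many such blocks and finishes with a Softmax over the vocabulary to produce the output probabilities for the next token.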