How did the Attention Mechanism start an AI frenzy? | LM3
Published 2024-04-15