1:12
YouTube
Rajistics - data science, AI, and machine learning
Compressing AI Models (LLMs) using Distillation, Quantization, and Pruning
A couple of techniques we use to compress models. This saves GPU memory and can reduce the amount of compute needed. Model distillation compresses a large model’s knowledge into a smaller one, quantization reduces memory usage by representing parameters with fewer bits, and pruning streamlines the model by removing less important weights ...
2.8K views
Feb 2, 2025
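The description above names model distillation as transferring a large model's knowledge into a smaller one via its output distribution. As a minimal illustration (not code from the video), the classic distillation loss compares temperature-softened teacher and student distributions; `T=2.0` and the example logits below are arbitrary choices:

```python
import math

def softmax(logits, T=1.0):
    """Softmax with temperature T; higher T softens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# A student that matches the teacher's logits exactly incurs zero loss:
print(distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # 0.0
```

In practice this term is mixed with an ordinary cross-entropy loss on the ground-truth labels, and the student is trained by gradient descent on the combined objective.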
Knowledge Distillation Tutorial
1:18:05
Understanding Knowledge Distillation in Neural Sequence Generation - Microsoft Research
Microsoft
Jan 17, 2020
Knowledge Distillation: Principles, Algorithms, Applications
neptune.ai
Jul 22, 2022
What is Knowledge distillation? | IBM
ibm.com
Apr 16, 2024
Top videos
12:09
How to Distill LLM? LLM Distilling [Explained] Step-by-Step using Python Hugging Face AutoTrain
YouTube
FreeBirds Crew - Data Science and GenAI
6.7K views
Feb 28, 2025
6:05
What is LLM Distillation?
YouTube
New Machina
30.4K views
Feb 2, 2025
29:14
Knowledge Distillation Simplified | Teacher to Student Model for LLMs (Step-by-Step with Demo) #ai
YouTube
Unfold Data Science
1.4K views
6 months ago
Knowledge Distillation Applications
Distillation - Definition, Detailed Process, Types, Uses
byjus.com
Jun 2, 2016
0:28
Microsoft and CMU researchers have begun to unravel three mysteries in deep learning related to ensemble, knowledge distillation, and self-distillation. Discover how their work has led to the first theoretical proof with empirical evidence for ensemble in deep learning: https://aka.ms/AAavuq3
Facebook
Microsoft Research
14K views
2 weeks ago
50:39
Knowledge Distillation as Semiparametric Inference [Talk]
Microsoft
Apr 28, 2021
57:22
MedAI #88: Distilling Step-by-Step! Outperforming LLMs with Smaller
…
8.7K views
Jul 20, 2023
YouTube
Stanford MedAI
23:18
Master the Art of Model Compression with Knowledge Dist
…
1.7K views
Aug 24, 2023
YouTube
DataTrek
16:49
Better not Bigger: Distilling LLMs into Specialized Models
11.7K views
Oct 30, 2023
YouTube
Snorkel AI
24:11
Knowledge Distillation in Machine Learning: Full Tutorial with Code
3.1K views
10 months ago
YouTube
MLWorks
9:51
Knowledge Distillation in Deep Learning - Basics
24.2K views
Sep 11, 2021
YouTube
Dingu Sagar
25:21
Model Distillation: Same LLM Power but 3240x Smaller
22.4K views
Aug 5, 2024
YouTube
Adam Lucek
43:49
MiniLLM: Knowledge Distillation of Large Language Models
6.9K views
Jul 23, 2023
YouTube
Gabriel Mongaras
4:19
AI model distillation
17.8K views
Feb 19, 2025
YouTube
InterSystems Developers
4:17
Distillation Column Modeling in MATLAB and Simulink
22.5K views
Dec 5, 2012
YouTube
APMonitor.com
5:49
What is Transfer Learning? An Introduction.
1.8K views
Jan 8, 2025
YouTube
Don Woodlock
57:02
Model Distillation: From Large Models to Efficient Enterprise Sol
…
2.3K views
Oct 1, 2024
YouTube
Snorkel AI
7:56
How Distillation Makes AI Models Smaller and Cheaper
43 views
5 months ago
YouTube
Pop Culture Files
1:28
DeepSeek and OpenAI: Understanding Model Distillation
15.7K views
Jan 30, 2025
TikTok
mor10web
4:25
Distillation | Definition, Process & Types
30K views
Mar 2, 2016
Study.com
Roger Harris
16:54
Knowledge Distillation - Keras Code Examples
8.6K views
Feb 28, 2021
YouTube
Connor Shorten
3:45
What is Transfer Learning? How Is It Different from Model Distillation?
4 views
6 months ago
YouTube
Allow AI
2:18
How to setup a basic Distillation Column (DISTL model) Aspen Plu
…
1.9K views
Feb 27, 2018
YouTube
Chemical Engineering Guy
45:46
Model Mondays - Fine Tuning & Distillation
1.1K views
7 months ago
YouTube
Microsoft Reactor
13:01
Teacher-Student Neural Networks: The Secret to Supercharged AI
7.2K views
Sep 11, 2023
YouTube
Computing For All
4:53
AI Distillation: Making Models Smaller and More Affordable
3 views
5 months ago
YouTube
TrendTrove
23:48
Knowledge Distillation Demystified: Techniques and Applications
2.8K views
Oct 3, 2024
YouTube
Snorkel AI
15:33
Distillation Column Simulation with Aspen Hysys
26.9K views
Sep 16, 2019
YouTube
Aspen Hysys Pro
1:17:18
How to model Distillation Columns in Aspen Hysys
9.2K views
Aug 9, 2023
YouTube
HotSpot
3:02
Set up of a Distillation Column Model RadFrac in Aspen Plus (Le
…
21.5K views
Feb 27, 2018
YouTube
Chemical Engineering Guy
7:21
Knowledge Distillation in Deep Learning - DistilBERT Explained
19.3K views
Sep 22, 2021
YouTube
Dingu Sagar
36:45
ASTM D-86 Distillation Demonstration
19K views
Sep 9, 2017
YouTube
Andy