Daniel Bourke

AU
@mrdbourke
Video Count: 346
Video Views: 12.3M
Subscribers: 246.0K
Australia Rank: #1,362
Global Rank: #104,200
The Daniel Bourke YouTube channel has 246,000 subscribers. See live statistics and growth insights below.

Daniel Bourke YouTube Statistics & Analytics

Subscribers
246.0K
Total Views
12.3M
Videos
346
Activity
Unknown

Daniel Bourke Content Analysis

Content Type Distribution

Long videos: 75% (30 videos)
Shorts: 25% (10 videos)

📽️ This channel specializes in long-form videos. Deep dives and comprehensive content perform well here.

Content Categories

Primary Category: Science & Technology (100%)
Science & Technology: 40 videos (100%)

🎯 Primary focus: Science & Technology with 40 videos (100% of categorized content).

Latest Video

Long video
Small Language Models (SLMs) Are the Future: Fine-Tuning AI That Runs on Your iPhone
1:04:42

92.7K
Views
2.1K
Likes
1 month ago
Published

In this talk, I go over the rise of small language models (SLMs) and how they can benefit your business or day-to-day life. Talk date: March 12, 2026 at the Queensland AI Meetup. We'll look at case studies such as Sunny, an iOS application which uses a fine-tuned version of MedGemma to privately track skin health on-device. We'll break down why on-device inference matters (privacy, offline access, zero ongoing cost) and compare the economics of local models versus cloud API pricing at scale. Then we'll discuss the hardware and software optimizations required to run a model on a compute-constrained device. Finally, we'll get hands-on and fine-tune a small language model live. We'll walk through how to build a custom dataset, set up supervised fine-tuning using Hugging Face's SFT Trainer, and fine-tune a small model, Gemma 3 270M, in about two minutes on an RTX 6000 Blackwell GPU on Google Colab. We'll compare the base model's outputs with the fine-tuned version's side by side, showing how even a small model can be customized to know specific people, handle edge cases, and refuse to answer questions it shouldn't.
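The memory argument for on-device models above can be made concrete with back-of-the-envelope arithmetic: the weights footprint is roughly parameter count times bytes per parameter at a given precision. This is a sketch, not figures from the talk; the parameter count is illustrative, and real usage adds KV cache and runtime overhead on top.

```python
# Rough weights-only memory footprint of a small language model at
# different numeric precisions. Real memory usage is higher (KV cache,
# activations, runtime overhead); this is an illustrative lower bound.

PARAMS = 270_000_000  # e.g. a ~270M-parameter model such as Gemma 3 270M

BYTES_PER_PARAM = {
    "fp32": 4.0,
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

def footprint_mb(params: int, bytes_per_param: float) -> float:
    """Weights-only footprint in megabytes (1 MB = 1e6 bytes)."""
    return params * bytes_per_param / 1e6

for precision, nbytes in BYTES_PER_PARAM.items():
    print(f"{precision:>10}: {footprint_mb(PARAMS, nbytes):7.1f} MB")
```

At int4 the same 270M-parameter model needs roughly a quarter of the fp16 weights memory, which is why quantization is central to fitting models on phones.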
Links:
Colab Notebook we used - https://dbourke.link/qldai-colab-notebook-llm-fine-tune-march-2026
End-to-end Small Language Model Fine-tuning Tutorial - https://www.learnhuggingface.com/notebooks/hugging_face_llm_full_fine_tune_tutorial

Courses I teach:
Learn AI/ML (beginner-friendly course) - https://dbourke.link/ZTMMLcourse
Learn Hugging Face - https://dbourke.link/ZTMHuggingFace
Learn TensorFlow - https://dbourke.link/ZTMTFcourse
Learn PyTorch - https://dbourke.link/ZTMPyTorch

Connect elsewhere:
Download Nutrify (my startup) - https://apple.co/4ahM7Wc
My website - https://www.mrdbourke.com
X/Twitter - https://www.twitter.com/mrdbourke
LinkedIn - https://www.linkedin.com/in/mrdbourke/
Get email updates on my work - https://dbourke.link/newsletter
Read my novel Charlie Walks - https://www.charliewalks.com

Timestamps:
0:00 - Intro
2:19 - About me
4:09 - Case study: Sunny
7:55 - Benefits of small language models running on-device
8:44 - Cost savings of on-device models
9:29 - Case study: Sunny (hardware overview)
10:55 - Current best practice for running VLMs on iPhone
12:35 - Case study: Sunny (memory usage in Xcode)
13:56 - Case study: Sunny (workflow overview)
15:25 - Jeff Dean on precision
16:46 - Precision breakdown
17:28 - Effects of quantization on model size footprint
17:48 - Saving memory by reducing token usage
19:28 - Before and after different on-device experiments
20:29 - Case studies for other small but useful language models
24:00 - Case study for private VLM-based surveillance
25:04 - Small language models features and benefits
25:47 - How to pick a model for your use case
26:06 - Question: What hardware is required for getting started?
27:21 - Prompting vs fine-tuning vs RAG
28:08 - Live LLM fine-tuning problem overview
28:50 - How I made a dataset for fine-tuning Gemma 3
33:08 - Live fine-tuning code begins in Google Colab
36:56 - Data = a guide for what you want your model to do
40:37 - Question: How do you know if your fine-tuned model is performing well?
44:24 - Comparing the base model to the fine-tuned model
53:23 - Demo'ing our fine-tuned model on Hugging Face Spaces
56:00 - Haiku
57:33 - Contact me
59:05 - Q&A
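The custom-dataset step the talk describes can be sketched as building records in the "messages" chat format that Hugging Face TRL's SFTTrainer accepts, written out as JSON Lines. This is a minimal sketch under that assumption; the system prompt and example question/answer pairs are made-up placeholders, not the dataset from the talk.

```python
import json

# Build a tiny supervised fine-tuning dataset in the "messages" chat
# format used by Hugging Face TRL's SFTTrainer. The pairs below are
# illustrative placeholders, not the talk's actual data.

SYSTEM_PROMPT = "You are a helpful assistant for a skin-health app."

examples = [
    ("What does this app do?",
     "It helps you track your skin health privately, on-device."),
    ("Can you diagnose my rash?",
     "I can't provide a medical diagnosis. Please consult a doctor."),
]

def to_chat_record(question: str, answer: str) -> dict:
    """One training record: system + user + assistant turns."""
    return {
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
            {"role": "assistant", "content": answer},
        ]
    }

records = [to_chat_record(q, a) for q, a in examples]

# JSON Lines is a common on-disk format for such datasets; it can be
# loaded with datasets.load_dataset("json", data_files="train.jsonl").
with open("train.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")

print(f"Wrote {len(records)} records")
```

Records like these are what teach a small model to handle edge cases and refuse questions it shouldn't answer, as demonstrated in the talk's live fine-tune.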

See Top Science & Technology YouTube Channels in Australia

Compare this channel with the leading Science & Technology creators in Australia.

Ranking: Australia
Category: Science & Technology
Category Focus: 100%

Daniel Bourke AI Channel Analysis

Gemini Pro Score: 7.2/10

AI-powered insights analyzing content strategy, audience engagement, and growth potential.

Overall Score
7.2
Consistency
95%
Cadence
2-3/wk
Library
40

Growth Potential

7.5/10

Good content foundation. Increasing upload frequency could boost growth.

Audience Engagement

7.2/10

Moderate engagement levels. Focus on community interaction could improve metrics.

Content Strategy

7/10

Developing content strategy. Consider focusing on specific niches for better targeting.

AI Recommendations

Auto-prioritized by predicted impact

  1. Increase upload frequency to 2-3 videos per week (High Impact, Cadence)
  2. Focus on SEO optimization for better discoverability (High Impact, SEO)
  3. Analyze top-performing content for pattern replication (Medium, Strategy)
  4. Increase community engagement through comments and polls (Medium, Engagement)

Data Source & Accuracy

Source: YouTube Data API v3
Accuracy: Real-time statistics from official YouTube API
Data is updated hourly and sourced directly from official APIs to ensure accuracy and reliability.

Data from YouTube Data API v3 • Updated hourly • Last updated: 12:24 PM