Phi-1: LLM Model for Code by Microsoft (1.3B/350M parameters)
Published 1 year ago • 1.3K plays • Length 41:29
Similar videos
- Microsoft's Phi-1 crushes coding challenges! (13:25)
- Microsoft's Phi-1.5 - shocking power with just 1.3B parameters (crushing LLaMA 2 7B) (6:21)
- Local LLM testing with Phi-1.5 #ai #artificialintelligence #machinelearning (0:58)
- Phi-1.5 - the small model getting big results (17:56)
- LLM explained | What is an LLM (4:17)
- "Okay, but I want Llama 3 for my specific use case" - here's how (24:20)
- Training your own AI model is not as hard as you (probably) think (10:24)
- Llama-3.1 (405B, 70B & 8B) ContinueDev free copilot! Fully local and open source! (10:21)
- Fine-tuning large language models (LLMs) | w/ example code (28:18)
- Why wait for Kosmos-1? Code a vision-LLM w/ ViT, Flan-T5 LLM and BLIP-2: multimodal LLMs (MLLM) (7:28)
- Microsoft's new AI Phi-2: just 2B parameters outperform LLaMA 2 7B & Mistral! (5:21)
- How large language models work (5:34)
- Microsoft Phi-3.1 Mini (3.8B): Phi-3 Mini LLM just got an insane upgrade (beats Llama-3 & Qwen2) (8:16)
- How to use StarCoder in VS Code (0:52)
- Phi-3.5 (MoE, Mini & Vision): the new best small model is finally here! (beats Llama-3.1, Mistral) (12:32)
- Generate code with Microsoft Phi-2 | Data engineering with LLM (8:55)
- Fine-tuning Phi-1.5 with PEFT and QLoRA | Large language model with PyTorch (31:42)
- Write #code with #AI #LLM Hugging Face StarCoder #opensource - part 1 (1:01)
- Textbooks Are All You Need (13:40)