Distilling knowledge from Neural Networks to build smaller and faster models
https://blog.floydhub.com/knowledge-distillation/
Fri 15 Nov 2019 08:20:21 PM CET

This article discusses the GPT-2 and BERT models, as well as using knowledge distillation to create highly accurate student models with fewer parameters than their teachers.
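As a rough sketch of the core idea (not the article's own code): in standard knowledge distillation the student is trained against the teacher's temperature-softened output distribution in addition to the hard labels. The PyTorch function below is illustrative; the name `distillation_loss` and the defaults `T=2.0` and `alpha=0.5` are assumptions, not values from the article.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-target KL loss (teacher -> student) with hard-label CE.

    T and alpha are illustrative hyperparameters; in practice both are tuned.
    """
    # Soften both distributions with temperature T; the T*T factor keeps
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```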

Tags: AI article machine_learning neural_networks