Distillation Can Make AI Models Smaller and Cheaper

News Source : Wired

News Summary

  • Distillation, also called knowledge distillation, is a widely used tool in AI.
  • A big, complicated "teacher" model can be reduced to a leaner "student" model with barely any loss of accuracy.
  • Distillation has become ubiquitous, and it is now offered as a service by companies such as Google, OpenAI, and Amazon.
  • The original distillation paper was never formally published; it is available only as a preprint on arxiv.org.
  • It was written by three researchers at Google, including Geoffrey Hinton, the so-called godfather of AI and a 2024 Nobel laureate.
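The bullets above describe distillation only at a high level: a student model is trained to match a teacher's "softened" output probabilities rather than hard labels. As a rough illustration (not code from the article), the sketch below computes the temperature-scaled distillation loss in the style of Hinton et al.'s soft-target formulation; the logits and temperature value are made up for the example.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Softmax with a temperature; T > 1 softens the distribution."""
    z = logits / temperature
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from the softened teacher distribution to the
    student's, scaled by T^2 as in the soft-target formulation."""
    p = softmax(teacher_logits, temperature)  # teacher "soft targets"
    q = softmax(student_logits, temperature)  # student predictions
    return float(temperature**2 * np.sum(p * np.log(p / q)))

# Hypothetical logits for a 3-class problem (illustrative values only)
teacher = np.array([4.0, 1.0, 0.5])
student = np.array([3.5, 1.2, 0.4])
print(distillation_loss(student, teacher))
```

In practice this term is usually combined with an ordinary cross-entropy loss on the true labels, and the student is trained by gradient descent to drive the combined loss down.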
The original version of this story appeared in Quanta Magazine. The Chinese AI company DeepSeek released a chatbot earlier this year called R1, which drew a huge amount of attention. Most of it focused [+6022 chars]
