The newly released Apriel-1.6-15B-Thinker model from the ServiceNow research team is genuinely impressive: it matches the performance of comparable models while using just 15B parameters, shrinking model size by a factor of 15. Even more notably, 15% of its pre-training data was powered by NVIDIA, which makes its approach to computational optimization worth watching.



If this path of high performance from smaller models proves out, it could reshape the entire AI compute market. After all, everyone is currently burning money stacking parameters, so a solution that does the same job with fewer resources is naturally attractive.
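To make the cost argument concrete, here is a rough back-of-the-envelope sketch (not from the article; the parameter count is the only figure taken from it) of the memory needed just to hold a 15B-parameter model's weights at common numeric precisions. Smaller weight footprints mean cheaper GPUs can serve the model:

```python
def weight_memory_gb(num_params: int, bytes_per_param: float) -> float:
    """Approximate memory required to hold the weights alone, in GB.

    Ignores activations, KV cache, and runtime overhead, so real
    serving memory will be higher.
    """
    return num_params * bytes_per_param / 1e9


PARAMS_15B = 15_000_000_000

# Common storage precisions and their per-parameter byte costs.
for label, nbytes in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{label:>9}: {weight_memory_gb(PARAMS_15B, nbytes):.1f} GB")
```

At fp16 this works out to roughly 30 GB of weights, which fits on a single high-memory GPU, whereas a model 15x larger would need a multi-GPU node just to load.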
ShibaMillionairen'tvip
· 8h ago
A 15x size reduction? If this can really run stably, computing costs would be cut in half.
AirdropHunterXMvip
· 8h ago
A 15B model can rival large models? If it can really deliver stable output, computing costs would be cut in half.
wagmi_eventuallyvip
· 8h ago
Achieving this with just 15B is the right way! Finally, someone isn't just stacking parameters anymore.
GweiWatchervip
· 8h ago
Damn, a 15x size reduction at the same performance? If this really happens, GPU manufacturers will be in big trouble.
WhaleWatchervip
· 8h ago
Shrinking the model 15x and still performing, now that's real skill, much better than those flashy models with tens of billions of parameters.