Stablecoin issuer Tether announced today (17th) a major technological breakthrough for its AI infrastructure QVAC Fabric: the world’s first cross-platform support for the BitNet LoRA fine-tuning framework. Large language models that previously required enterprise-grade GPUs and data-center cloud compute can now be trained and run on consumer-grade hardware, including smartphones.
Smartphones can now train LLMs: 1B models completed within 1 hour
According to data released by Tether, the framework has successfully fine-tuned BitNet models on a range of devices, including the widely available Samsung S25 and iPhone 16:
- Samsung S25 (Adreno GPU)
- iPhone 16 (Apple GPU)
In stress tests, the framework has fine-tuned models with up to 13 billion parameters.
AI training tasks that previously demanded high-end NVIDIA GPUs can now run on edge devices such as smartphones.
Key technology: BitNet + LoRA—cutting AI costs drastically
The core of this breakthrough lies in the combination of two technologies:
- BitNet (1-bit LLM): constrains model weights to extremely low-bit values, sharply reducing memory and compute requirements
- LoRA (Low-Rank Adaptation): fine-tunes a model by training small low-rank adapter matrices while the base weights stay frozen
Together, these enable models to be trained and run in extremely low-resource environments.
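To make the combination concrete, here is a minimal sketch of how a BitNet-style layer pairs with LoRA adapters. This is an illustrative NumPy example, not Tether's actual implementation: the matrix sizes, the ternary weight encoding, and the `forward` helper are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, rank = 64, 64, 4  # illustrative sizes, not QVAC's real config

# BitNet-style base weight: ternary values in {-1, 0, +1}, frozen during fine-tuning
W = rng.integers(-1, 2, size=(d_in, d_out)).astype(np.float32)

# LoRA adapters: two small full-precision matrices; only these are trained
A = rng.normal(0, 0.01, size=(d_in, rank)).astype(np.float32)
B = np.zeros((rank, d_out), dtype=np.float32)  # zero-init so the adapter starts as a no-op

def forward(x):
    # Effective weight is W + A @ B, but the sum is never materialized:
    # the cheap low-rank path (x @ A) @ B is added to the frozen base path
    return x @ W + (x @ A) @ B

# Trainable parameters shrink from d_in*d_out to rank*(d_in + d_out)
full_params = d_in * d_out            # 4096
lora_params = rank * (d_in + d_out)   # 512
print(full_params, lora_params)
```

Because only the small adapter matrices receive gradients, the optimizer state and memory traffic during fine-tuning shrink accordingly, which is what makes training feasible on a phone-class GPU.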
Empirical tests show that BitNet-1B uses 77.8% less VRAM than Gemma-3-1B and 65.6% less than Qwen3-0.6B; on the same hardware, this allows running models roughly twice as large.
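A back-of-envelope calculation shows where savings of this magnitude come from. The sketch below compares weight memory for a 1B-parameter model stored in standard 16-bit floats versus BitNet's b1.58 ternary encoding; it is generic arithmetic, not Tether's published methodology, and real VRAM use also includes activations, KV cache, and runtime overhead.

```python
def weight_mem_mb(params: float, bits_per_param: float) -> float:
    """Weight-only memory footprint in megabytes."""
    return params * bits_per_param / 8 / 1e6

params = 1e9  # a 1B-parameter model

fp16_mb = weight_mem_mb(params, 16)      # conventional half-precision weights
bitnet_mb = weight_mem_mb(params, 1.58)  # BitNet b1.58 ternary encoding

print(round(fp16_mb), round(bitnet_mb))  # 2000 vs 198 MB for weights alone
```

Weights alone shrink by roughly 10x under this encoding, which is consistent in direction with the reported VRAM reductions once the fixed overheads of activations and runtime buffers are accounted for.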
Broad GPU support unlocks mobile AI: performance boosted up to 11x
Another key breakthrough of QVAC is enabling BitNet to run natively outside the NVIDIA ecosystem: it supports GPUs from AMD, Intel, and Apple Silicon, as well as mobile GPUs such as Adreno, Mali, and Apple Bionic.
Large language models are no longer the exclusive domain of tech giants; AI can now be decentralized
Tether CEO Paolo Ardoino stated: “Intelligence will be a key determinant of future societal development. It has the potential to enhance social stability, serve as a bridge connecting society, or further empower a select few elites. The future of AI should be accessible, usable, and available to everyone, not monopolized by a few cloud service providers with enormous resources.”
Traditional AI development heavily relies on cloud and large GPU clusters, which are costly and concentrated among a few tech giants. Tether’s QVAC platform supports meaningful training of large models on consumer hardware, including smartphones, demonstrating that advanced AI can be decentralized and inclusive. In the coming months, Tether will continue investing significant resources and funds to ensure AI can be used anytime and anywhere on local devices.
This article: AI is no longer the patent of tech giants! Tether launches QVAC—are we entering an era where everyone has their own LLM? Originally published on Chain News ABMedia.