$FIL Storage is always in short supply → storage is the "black soil" (the fertile ground) of AI. A 7B model + a LoRA fine-tuning set + a dataset of hundreds of billions of tokens ≈ 50TB;
100,000 open-source models ≈ 500PB, equivalent to roughly 5 AWS S3s
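A quick back-of-envelope check, taking the post's own figures as assumptions (50TB per full model bundle, 100,000 models, 500PB in aggregate); the arithmetic below only shows what those figures imply about each other:

```python
# Back-of-envelope check of the storage figures quoted above.
# All inputs are the post's own numbers, not measured data.
TB_PER_PB = 1_000

models = 100_000  # open-source models, per the post

# The 500PB aggregate implies this average footprint per model:
avg_tb_per_model = 500 * TB_PER_PB / models
print(avg_tb_per_model)          # -> 5.0 TB per model on average

# If every model instead shipped the full ~50TB bundle (weights + LoRA set + dataset):
print(models * 50 / TB_PER_PB)   # -> 5000.0 PB, i.e. about 5 EB
```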
In one sentence, storage is not about "storing movies," but about storing "all of humanity's AI memory."
Chips may end up oversupplied and electricity may run tight, but storage will always be scarce, because knowledge, once created, never disappears; it only grows exponentially. So what do the big players do? They harvest retail investors ("cut the leeks"), they're not there to help you make money!