What’s Driving AI Costs Down and What’s Driving Them Back Up

Developing large language models is expensive work: the cost of running LLMs in the cloud consumed more than half of the revenue Anthropic generated last month, Maria Heeter and I reported yesterday. And that doesn't include the cost of training its models in the first place.

Recent technological advances, however, should make it cheaper to develop and run LLMs. It's not clear whether the new savings will be enough to turn LLM developers into high-margin software businesses, but they should help those companies deal with many of their most pressing costs.

For instance, the falling price of some older AI chips, such as Nvidia’s A100 graphics processing units, has helped cut model training costs at copywriting startup Writer by about 60% over the past three to four months, co-founder and CTO Waseem Alshikh told me. Nvidia also has boosted the quality of its software to help developers train and run LLMs even faster on those chips, he said.
