
Well, it is currently cheaper because it is massively subsidized. That will change when subsidies stop. I don’t think it is a good argument.



The claim was "It is cheaper", not "It will be cheaper". Until it actually _is_ cheaper, it doesn't make much sense to purchase $10k+ in hardware to run local models that are still worse than the frontier offerings.

> Until it actually _is_ cheaper, it doesn't make much sense to purchase

Once it actually is cheaper, demand will rise and push prices back up, so it won't stay cheaper for long. Buying now at least locks in current prices (though demand is already fairly high).


No, it's not. AI _products_ are quite often subsidized; AI _inference_ almost certainly is not.

There is a growing number of independent AI inference providers without VC backing that serve open-weight models on a roughly cost-plus basis, which suggests subsidies are not a significant factor in inference pricing.





