
The latency means this still makes no sense to me. Perhaps it could work for some batch background-processing job, such as research, but that's a stretch.


I think most providers give significant discounts on high-latency batch APIs. A lot of AI workloads feel batch-oriented to me, or could be once they move beyond the prototyping and testing phases. Chat will end up being a small fraction of load in the long term.
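To make the batch-oriented workload concrete: batch APIs typically take a file of many independent requests and return results within a window of hours rather than seconds. A minimal sketch of preparing such a file, loosely modeled on the JSONL request format of OpenAI's Batch API (the exact field names and the model name here are assumptions for illustration):

```python
import json

def make_batch_line(custom_id, prompt, model="gpt-4o-mini"):
    # Each line of the JSONL batch file is one independent request,
    # tagged with a custom_id so results can be matched back later.
    return {
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

def write_batch_file(prompts, path="batch_input.jsonl"):
    # Serialize one request per line; the provider processes the whole
    # file asynchronously, which is what earns the discount.
    with open(path, "w") as f:
        for i, p in enumerate(prompts):
            f.write(json.dumps(make_batch_line(f"req-{i}", p)) + "\n")

write_batch_file(["Summarize document A", "Summarize document B"])
```

The file would then be uploaded and submitted as a batch job; since nothing in the pipeline waits on an individual response, added network latency mostly disappears into the completion window.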


That would imply there's still capacity here on Earth for this type of traffic.





