Don't bother trying their other models available in GCP either. Their embedding models and their un-RLHF'd generic GPT analogues are miles behind the competition. It's incredible how badly Google has dropped the ball on this.
We had some Google people come in at $CORPO_DAYJOB the other day to sell their cloud offerings. The engineer had the gall to say "people ask me, why did Google miss the boat on LLMs? I say, people, we BUILT the boat", while referencing "Attention Is All You Need". Hilarious coping strategy, guys, but having some good academics on your payroll in 2017 doesn't mean you're delivering right now... we'll talk when you actually have something better than 90% of the other LLM offerings out there.
LLMs are in their infancy. ChatGPT is bleeding users at an alarming rate. It's incredible how badly OpenAI has dropped the ball on user retention, product integration, and revenue.
So the raw model output doesn't affect user retention, product integration, or revenue? The quality of the raw output is probably why it's failing so fast and hard.
It also fits with what I know about the business side of things. Google Brain/Research probably spent a lot of time and money in the neural network direction and still didn't have a clear way to productize it, so Google cut them off just before they got there. Other companies were able to start where they left off, since the research was public, and build something in less time, which made investment in that direction look more attractive to higher-ups.
That’s a strange take. Those researchers were given the resources, the right incentives, and the right environment for that kind of research to happen, at Google, at a time when LLMs weren’t all the rage.
“X built a boat == the builders remain at X until they die” is a strange definition of building something, imo.
When a company provides an R&D lab, we still attribute the results of that research to the company as well as to the researchers.
After all, if those individuals hadn't been hired, their positions would have been filled by similar people; but if the lab hadn't brought those researchers together, it's hard to argue they would each have arrived at the same result on their own.
Honestly, it's not as easy as one might think to build a research lab.