Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
Notes -
I think you need to define what you consider "parity" in this context.
The main thing LLMs are good at is generating social media engagement (or rather a simulacrum thereof) and, by extension, attracting VC dollars from less tech-savvy investors hoping to get in early on the next Facebook or Google.
Meanwhile, they remain largely unsuitable for the sorts of tasks people actually want a notional AGI to automate, namely anything requiring both high cognitive load and precision on a short time horizon.
In short, I am only worried about Chinese AI (and GPT-5, for that matter) insofar as they will turn TikTok and Instagram into even more of a hellscape than they already are.
Not surprising, as none of them are generally smarter than humans, and therefore not AGI as typically defined.
If LLMs have moved trillions of dollars on the stock market, they must be doing something pretty substantial.
OpenAI is apparently bringing in billions in revenue, and character.ai reportedly handles about 1/5 of Google's inference request volume (there are a lot of lonely people out there).
https://research.character.ai/optimizing-inference/
Yes, even less tech-savvy investors have realized this is a big thing.
One, "revenue" and "investment" are not the same thing.
Two, your reasoning only holds if one assumes that stock markets are rational. History would seem to indicate that they are not; see the Dutch tulip mania.