This almost feels like the confirmation bias that some religious people have where they see a “miracle” in everything.
These AI researchers have bought into the belief that superhuman AGI is right around the corner. Thus, they will interpret everything in light of that.
This also brings to mind the story of the Googler who was convinced that the internal Google AI had come alive. However, Bard doesn’t give off the same vibes now that people all over are using it.
When you are desperately invested in something being true (like AGI being right around the corner), you will convince yourself that you are seeing it happen. The only real counter to that is exposing it to a lot of outsiders (but then again, you have convinced yourself it is too dangerous for the unwashed masses).
Ugh. We have a working example of a physical system that implements intelligence (the brain), in contrast to no evidence of an all-powerful dude in the sky. Why do these analogies keep popping up?
How can you know that AGI is not around the corner? The compute available to the corporations is already in the ballpark of some estimates of the brain's computational capacity. What's left is unknown unknowns. And the researchers working with the state-of-the-art models have better information to estimate those than you do.
"The googler" you've mentioned wasn't a researcher.
If you claim to have AGI, show it, prove it. Otherwise, I will continue to assume that it's not around the corner.
If you claim that GPT4 is close to AGI (as many have), then you very likely have access to a GPT4 that I don't have access to. The actual usable thing available out there clearly isn't.
Not that long ago some people predicted that software engineers would be out of a job within weeks. "Brilliant" CTOs claimed they would replace developers with ChatGPT. What happened? Nothing.
I'll boldly predict that this time what will happen is exactly nothing again.
I may be wrong, but at least I don't waste my time with the buzz about the next "big thing" which in reality isn't ready for anything useful yet.
Apparently Ilya has been leading people in "feel the AGI" chants, and "I can feel the AGI" is his catchphrase within OA. So yes, some people might have gone off the rocker a little bit.