The term AI itself is a shifting of goalposts. What was AI 50 years ago* is now AGI, so we can call this shit AI though it’s nothing of the sort. And everybody’s falling for the hype: governments, militaries, police forces, care providers, hospitals… not to speak of the insane amounts of energy & resources this wastes, and other highly problematic, erm, problems. What a fucking disaster.
If it weren’t for those huge caveats I’d be all for it. Use it for what it can do (which isn’t all that much), research it. But don’t fall for the shit some tech bro envisions for us.
* tbf fucking around with that term probably isn’t a new thing either, and science itself is divided on how to define it.
The current situation is a bubble built on an overhyped extension of the cloud-compute boom. Major tech companies have poured nearly a trillion dollars of capital expenditure into it over the past 5 years, chasing down this white whale and filling new data centers with Nvidia GPUs. Meanwhile, revenue caps out at maybe $45 billion annually across all of them for “AI” products and services, and that’s before even talking about ongoing operating costs such as power for the data centers, wages for the people running them, or the wages of the people developing services to run on them.
None of this is making any fucking profit, and every attempt to find new revenue either increases their costs even more or falls flat on its face the moment it actually ships. No one wants to call it out at higher levels because NVIDIA is holding up the whole fucking stock market right now, and them crashing because everyone stopped buying new GPUs would hurt everyone else’s growth narrative.
We called the basic movement of the grabbers in Defender “AI” to distinguish it from the fixed movement patterns of Space Invaders. We still call that AI in modern videogames.
It’s also the other way around: what was called AI in the past is now just called bots. Simple algorithms that approximate the appearance of intelligence, even the earliest chess engines for instance, were also called AI.
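For what it’s worth, the core of those earliest chess engines really was this simple: plain minimax search over a game tree. A minimal sketch (the toy tree and scores here are made up for illustration, not from any real engine):

```python
# Minimax: the simple algorithm behind the earliest chess engines.
# A "node" is either a leaf score (int) or a list of child nodes.
# The maximizer picks the highest score, the minimizer the lowest.

def minimax(node, maximizing):
    """Return the best achievable score from this position."""
    if isinstance(node, int):  # leaf: a static evaluation of the position
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# A tiny two-ply game tree; the maximizer moves first.
tree = [[3, 5], [2, 9]]
print(minimax(tree, True))  # → 3 (the minimizer holds each branch to 3 and 2)
```

That’s the whole “intelligence”: exhaustive lookahead plus a scoring function. It only looks smart.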
You’re not wrong, but that’s also a bit misleading. “AI” is the all-encompassing term, while terms like AGI and ASI are subsets. From the 1950s onward, AI was expected to evolve quickly as computing evolved; that never happened. Instead, AI mostly topped out with decision trees, like those used for AI in videogames. ML pried the field back open, but not in the ways we expected.
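To make the decision-tree point concrete, here’s a sketch of what videogame “AI” typically amounts to: a hypothetical guard NPC walking fixed branches. All the names and thresholds are illustrative assumptions, not from any particular game:

```python
# Videogame "AI" as a hand-written decision tree: fixed branches,
# no learning. The NPC, thresholds, and behaviors are hypothetical.

def guard_behavior(distance_to_player, health):
    """Pick a behavior for a guard NPC by walking fixed conditions."""
    if health < 25:
        return "flee"            # low health overrides everything
    if distance_to_player < 5:
        return "attack"          # close enough to engage
    if distance_to_player < 20:
        return "chase"           # spotted, but out of reach
    return "patrol"              # default behavior

print(guard_behavior(3, 80))   # → attack
print(guard_behavior(50, 80))  # → patrol
print(guard_behavior(3, 10))   # → flee
```

This is where the field “topped out” for decades: it reads as intelligent behavior in play, but it’s just nested if-statements.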
AGI and ASI were coined in the early 2000s to set the goal of human-level intelligence apart from other kinds of AI, like videogame AI. This is a natural result of the field advancing in unexpected, divergent directions. It’s not meant to move the goalposts, but to clarify future goals against past progress.
It is entirely possible that we develop multiple approaches to AGI that necessitate new terminology to differentiate them. It’s the nature of all evolution, including technology and language.