Not sure if this is the best community to post in; please let me know if there’s a more appropriate one. AFAIK Aii@programming.dev is meant for news and articles only.
Because AI - in a very broad sense - is useful.
Machine Learning and the training and use of targeted, specialized inferential models is useful. LLMs and generative content models are not.
What! LLMs are extremely useful. They can already:
- Funnel wealth to the richest people
- Create fake money to trade around
- Deplete the world of natural resources
- Make sure consumers cannot buy computer hardware
- Poison the wells of online spaces with garbage content that takes 2 seconds to generate and 2 minutes to read
Let’s not forget about traditional AI, which has served us well for so long that we stopped thinking of it as AI.
What?
As in, I agree with your point. I just want to give a shoutout to the non-ML-based AI.
In the strictest sense of the technical definition: all of what you are describing are algorithmic approaches that are only colloquially referred to as “AI”. Artificial Intelligence is still science fiction. “AI” as it’s being marketed and sold today is categorical snake oil. We are nowhere even close to having a Star Trek ship-wide computer with anything even approaching reliable, reproducible, and safe outputs and capabilities that are fit for purpose - much less anything even remotely akin to a Soong-type Android.
In the strictest sense there is no technical definition because it all depends on what is “intelligence”, which isn’t something we have an easy definition for. A thermostat learning when you want which temperature based on usage stats can absolutely fulfill some definitions of intelligence (perceiving information and adapting behaviour as a result), and is orders of magnitude less complex than neural networks.
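To make that concrete, here's a toy sketch (class name and numbers invented for illustration) of a thermostat that "perceives" user adjustments and adapts its per-hour setpoint accordingly - trivially simple, yet it fits that definition:

```python
from collections import defaultdict

class AdaptiveThermostat:
    """Toy sketch: 'learns' a per-hour setpoint from past user adjustments."""

    def __init__(self, default=20.0):
        self.default = default
        self.history = defaultdict(list)  # hour -> temps the user chose

    def record_adjustment(self, hour, temperature):
        # Perceive: remember what the user asked for at this hour.
        self.history[hour].append(temperature)

    def setpoint(self, hour):
        # Adapt: average past adjustments for this hour, else fall back.
        temps = self.history.get(hour)
        return sum(temps) / len(temps) if temps else self.default

t = AdaptiveThermostat()
t.record_adjustment(7, 21.0)
t.record_adjustment(7, 23.0)
print(t.setpoint(7))   # average of the 7am adjustments -> 22.0
print(t.setpoint(19))  # no data for 7pm yet -> the 20.0 default
```

No neural network anywhere, but it perceives information and adapts its behaviour as a result.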
That’s why this joke definition of AI is still the best: “AI is whatever hasn’t been done yet.”
I have forgotten all working definitions of AI that CS professors gave except for this one 🙃
Putting aside that “AI” doesn’t exist…
For whom is it useful? For what?
Under capitalism “usefulness” often means the destruction of humanity and the planet.
Example: The Role of AI in Israel’s Genocidal Campaign Against Palestinians
I am still waiting for evidence of that. Tried it for a while for general questions and for coding and the results were at best meh, and most of all it was not faster than traditional search.
Even so, if it was really useful, it would still not be worth the fact that it is based on stolen data and the impact to the environment.
AI is a super broad field that encompasses so many technologies. It is not limited to whatever the tech CEOs are pushing.
In this comment section alone, we see a couple of examples of AI used in practical ways.
On a more personal level, surely you’d have played video games before? If you had to face any monster / bot opponents / etc, those are all considered AI. Depending on the game, stages / maps / environments may be procedurally generated - using AI techniques!
There are many more examples (e.g. pathfinding in map apps, translation apps); it’s just that we are all so familiar with them that we stopped thinking of them as AI.
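Pathfinding in particular is decades-old textbook "AI". As a sketch (assuming a toy grid world; real map apps use weighted variants like Dijkstra or A*), here's the breadth-first search idea underneath:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """BFS shortest path on a grid of 0 (open) / 1 (wall) cells."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        # Explore the four neighbouring cells.
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = shortest_path(grid, (0, 0), (2, 0))
print(path)  # routes around the wall in the middle row
```

Plain graph search, no learning involved - and yet it's in every AI textbook.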
So there is plenty of evidence of AI’s usefulness.
Langton’s ant can procedurally generate things, if you set it up right. Would you call that AI?
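For anyone unfamiliar, the entire "generator" is two rules - here's a minimal Python sketch to show how little is going on:

```python
def langtons_ant(steps):
    """Langton's ant: turn right on white, left on black, flip the cell, step."""
    black = set()      # cells currently black; everything else is white
    x, y = 0, 0
    dx, dy = 0, 1      # facing "up"
    for _ in range(steps):
        if (x, y) in black:
            dx, dy = -dy, dx   # turn left on a black cell
            black.discard((x, y))  # flip it back to white
        else:
            dx, dy = dy, -dx   # turn right on a white cell
            black.add((x, y))      # flip it to black
        x, y = x + dx, y + dy  # move forward one cell
    return black

print(len(langtons_ant(11000)))  # complex pattern from two trivial rules
```

Run it long enough and it produces the famous "highway" - emergent complexity from rules far dumber than anything we'd call intelligent.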
As for enemies in gaming, it got called that because game makers wanted to give the appearance of intelligence in enemy encounters. Aspirationally cribbing a word from sci-fi. It could just as accurately have been called “puppet behavior”… more accurately, really.
The point is “AI” is not a useful word. A bunch of different disciplines across computing all use it to describe different things, each trying to cash in on the cultural associations of a term that comes from fiction.
I think what people are struggling to articulate is that, the way AI gets thrown around now, it’s basically being used as a replacement for the word “algorithm”.
It’s obfuscating the truth that this is all (relatively) comprehensible mathematics. Even the black box stuff. Just because the programmer doesn’t know each step the end program takes doesn’t mean they don’t know the principles behind how it was made, or didn’t make deliberate choices to shape the outcome.
There’s some very neat mathematics, yes, and an utterly staggering amount of data and hardware. But at the end of the day it’s still just a (large) algorithm. Calling it AI is dubious at best, and con-artistry at worst.
Fair enough. I was using the new colloquial definition of AI, which actually means LLMs specifically.
I think the broader AI, which includes ML and all your other examples, is indeed very useful.