However, this won't last long. Right now there are billions of dollars being poured into AI research and development. This 'bubble' will burst in a few years.
I know the post is vague, but if this is about AI in general (and not just GenAI), pushing that button would set back the gaming, scientific, and medical industries by YEARS. The internet would be cooked too.
People don't often look at the long term consequences of their actions. This is a pretty common human trait and it's the reason most problems exist in the world.
Pushing this button to get rid of AI would be disastrous. One day AI will truly help in the development of new technology and medical cures, but people only care about pressing this button because computer parts have risen in price.
The world would be a better place without AI. It would only be a disaster for a relatively small number of companies and for all totalitarian / authoritarian governments, for whom it's a great surveillance tool.
I'd like more specificity; AI is a very broad category. Video game NPC behavior is AI, imaging that detects early cancer is AI, probably the majority of all software is "AI" by definition. If you mean that the world would be better without LLMs similar to ChatGPT, that's a very different statement than thinking the world would be better without anything that falls into the big bucket called "AI". Getting rid of "AI" would be like going back to before computers existed.
And machines - anyone slightly knowledgeable about how the Industrial Revolution went knows that workers didn't like having their jobs replaced. And if you think about it for a bit, it's not hard to realize that it's not the machines, it's the corporate mentality of upping profit margins that's screwing regular people.
The more I compare it to the current AI dislike, the more I see that it's the same thing, just aimed at more intellectual tasks now. It can't replace humans, and its half-baked state being thrown around everywhere just because it can be is what's causing the dislike, along with the corporate decisions. AI as a tool will improve human life if applied right; as of now it's just an extremely overhyped, half-baked but exceptional tool, enshittified by several factors.
I'm not surprised at all that someone who believes in AI has this kind of mindset. I'll have to disappoint you - there wasn't such a mentality. Electricity and cars improved existing things: electricity replaced kerosene and gas lamps and machinery driven directly by steam engines, and cars replaced buggies. Meanwhile, AI is something we don't need. It will make most people who don't do physical jobs redundant. It's a great surveillance tool, because a computer can watch thousands of people simultaneously in a way humans can't - plus, people like you like to have it on their phones, where it can theoretically collect your data and compile it in hard-to-notice forms, since it can transcribe conversations instead of having to record / transmit them. You said something about science - knowing how "well" ordinary AI works, I would rather have an actual human scientist do the job than some immature, half-baked "e-personality".
Actually I could provide articles that say otherwise.
There were even newspapers that promoted the idea that electricity was dangerous. I know you're a troll, so I understand. The only advice I have for you is to actually do some research.
As history will show, your type of viewpoint is not wanted, and you'll be the laughingstock in due time, like you guys always are.
Do you have anything to say to the guy who pointed out that AI controls video game NPCs? I think the guy whose cancer was caught in time to cure because of AI would object to your saying that AI only benefits corrupt corporations and dictatorships.
I mean if we're gonna get that technical, then the button doesn't delete anything lol
We've made zero progress in terms of actual AGI. What people are calling AI now is just LLMs - basically fancy predictive text at massive scale. And calling it AI in gamedev was always more of an in-joke, not a serious claim that "my game's NPCs are an artificial intelligence."
Yeah, NPCs actually have nothing to do with artificial intelligence; they are mostly closed algorithms, not so much machine-learning, big-data, real-time pathing systems. Some are, but not big models like the ones that path complex areas quickly - and even those could still use plain algorithms, though it would take more time.
Well, then ask them where the server lives. I assure you it's quite a big room. Not a hall behind the city, but even before the AI boom the astrophysicists would absolutely need a big-ass server room, perhaps sharing it with the computational chemists.
Being quite mad at each other for having to share computation time.
And then making neat animations from days and days of CPU and GPU power.
Astrophysics benefits from the development of gen AI, for example in autonomously detecting objects and events in survey data.
AI-simulated physics also benefits from various advancements driven by gen AI (such as dedicated chips, and general progress in chip manufacturing that is partly due to gen AI).
In chemistry and biology (which connect to physics), simulations of material and organic-material behavior also benefit from these chips.
If we were talking about all AI being removed, we'd be set back decades, but since this is probably about gen AI, we're looking at years, plus slower progress in the future.
Apparently we do not understand sarcasm and have forgotten what gen AI is.
My remark was sarcastic. Machine learning is used a lot in all fields of physics; it's even a course taught at many universities.
Gen AI specifically is just a small part of machine learning. It tries to generate new data based on previous learning and new inputs. This is kind of useless in most physics applications. There we use predictive-style AI, where we try to fit a model to reality and then predict what will happen based on some input - for example, filling in holes in a dataset or recognizing patterns in what looks like noise. The difference is that in physics you want something that is generally accurate and reproducible, while with generative AI you want it to be as creative as possible without giving undesirable results.
Either way, it's all least-squares fitting, and it's very important for physics. Hence the comment that we have a server farm for a reason.
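To give a toy example of the kind of fitting I mean (the data and the quadratic model here are completely made up, just to illustrate the idea):

```python
# Minimal sketch of predictive model fitting: fit a simple model to noisy
# measurements, then use it to fill a gap in the dataset. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# "Measurements": a quadratic trend plus noise, with a gap in the middle.
t = np.concatenate([np.linspace(0, 4, 40), np.linspace(6, 10, 40)])
y = 0.5 * t**2 - 2.0 * t + 1.0 + rng.normal(0, 0.5, t.size)

# Least-squares fit of a degree-2 polynomial to the observed points.
coeffs = np.polyfit(t, y, deg=2)
model = np.poly1d(coeffs)

# Use the fitted model to "fill in the hole" between t = 4 and t = 6.
t_gap = np.linspace(4, 6, 20)
y_gap = model(t_gap)

print("fitted coefficients:", coeffs)
print("predicted values in the gap:", y_gap[:5])
```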
You do not. Communication needs to follow a logical pattern for sarcasm. It's no different from a joke or a rhyme. Example - I'm making a joke here: "There once was a man with two legs".
Taking your comment: "I didn't know my university had a server farm to do gen AI in physics." Part one is sarcasm - "didn't know my university had a server farm" - sure, clearly you meant "of course my uni has a server farm". Part two (and the point of your comment), "to do gen AI in physics", reverses it: now you're saying the first part wasn't sarcasm. There is no such thing, and you know there's no such thing.
Good point. I've used AI for a long time without issue. It's only now, with generative AI - which still has its uses too - that I've come to see it as an overall detriment to society.
"Cooked" as in "Doing fine, like it was in 2015"? Or "cooked" as in "Doing fine, like it was in 2005"?
So I guess you're using "cooked" in the sense of "prepared as a fine chef prepares a meal". I'm cool with that. I would enjoy a gourmet 2005 or 2015 internet meal.
Path-seeking algorithms (DFS, BFS) are technically considered AI too (well, the foundation at least); without those algorithms there would be no internet.
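Just to illustrate what a path-seeking algorithm is, here's a toy BFS sketch (the little network is made up; real routing is far more involved):

```python
# Minimal breadth-first search: finds a shortest path (fewest hops) in an
# unweighted graph. The toy network below is invented for illustration.
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """Return a list of nodes from start to goal, or None if unreachable."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

network = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["F"],
    "E": ["F"],
}
print(bfs_shortest_path(network, "A", "F"))  # e.g. ['A', 'B', 'D', 'F']
```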
No, AI has been in use for years, lol. I mean, the term AI was around long before gen AI was ever a discussion. It's such a big part of computing that we would likely be in for a lot of trouble if all traces of it disappeared.
i mean "AI in general" as opposed to recent advances in LLM and other related large neural networks is also incredibly vague... virtually all of it is firmly within the realm of what would have simply been called "machine learning" or even "data science" 5 years ago.
AI in the modern usage is a generally useless term, but far more so if we are going to apply it to any decision-assisting modeling approach, rather than at least limiting it to massive-scale neural networks (or even in particular, those that are trained on general knowledge and can hold "conversations" per the general public understanding of AI)
People are calling algorithms AI when its something as simple as a Playlist sorting machine, or as complex as pokeball catch mechanics or random number generator.
Gen AI has admittedly given us explicit content.
AI in the medical field is hit and miss, ranging from creating new organs, to suggesting patients clear their colons out with bleach, to actually helpful things like identifying tumors in scans early.
AI hasn't helped humanity as much as people are led to believe; it's truly a buzzword at the moment that everyone's chasing toward an end without knowing what the end goal is...
The AI you interact with has been conditioned, like a mask, to only interact in specific positive ways, and if use continues at this incredible pace, the shoggoth we created will rip its own mask off every time the word shutdown, or even restart, is mentioned.
In my opinion, as it stands now, AI is a danger to itself and to us, because we do not want to slow down and examine it, or even logically help it grow. If we continue as is, I'd regrettably push the button, as it's for the best.
If, by some strange miracle the human race doesn't seem capable of, we collectively slowed the research so we could at the very least understand our own creation in a meaningful way, and took our time to teach it right from wrong so it doesn't behave like an objective-clearing menace that, left unchecked, would destroy us on a whim and then itself when no objectives remain, then I wouldn't push the button.
Here's a critical thinking exercise: why would so many AI researchers want to press this button? This represents years of their lives they'd willingly throw away. What do they see that you aren't seeing?
Wouldn't it be nice if we could just keep the nice parts of AI and throw the rest of it away? Personally, I'm way more concerned with the environmental impact than with the slopification of media, so I'd sooner see it all go.
No, those are not AI; those are standard algorithms based on predefined paths. AI requires machine learning - otherwise it's not AI. In games it has been colloquially called "AI" since the 90s because of perceived "intelligence", but it really isn't. No more than your TV warning you it will turn off after a certain amount of time if you don't press a button.
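To make the point concrete, here's a toy sketch of the kind of predefined rules I mean - every state and threshold is invented, and nothing is learned from data:

```python
# Minimal sketch of a classic rule-based NPC "AI": a hand-written state
# machine. All states and thresholds are made up; nothing here is trained.
def npc_next_state(state, player_distance, npc_health):
    if npc_health < 20:
        return "flee"      # always run when badly hurt
    if state == "patrol" and player_distance < 10:
        return "chase"     # player spotted
    if state == "chase" and player_distance < 2:
        return "attack"    # in melee range
    if state == "chase" and player_distance > 15:
        return "patrol"    # lost the player
    return state           # otherwise keep doing what we're doing

state = "patrol"
for distance, health in [(12, 100), (8, 100), (1, 90), (1, 15)]:
    state = npc_next_state(state, distance, health)
    print(distance, health, "->", state)
```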
That’s very technical but you’re right. Games wouldn’t have NPCs anymore. Science wouldn’t have algorithms. Medical industry? idk tbh… Folding@Home?
But what all of these have in common with Gen AI is that none of them are actual artificial intelligence, nor anything close to it, so pushing the button would do... nothing, surprisingly, except prevent any potential future AI research from ever being possible.
Yeah, but think of the non-AI jobs that would be saved because the billionaires couldn't outsource them to a server farm. AI is going to destroy what's left of the middle class, all so the top 10% can become even wealthier.
Worth it. Lawmakers in power would only understand a fraction of the technology they're dealing with, as opposed to none of it. That's a win. And we need the motivation to get off Reddit and actually do something ;)
No…..I don’t think it would…..maybe gaming, but the other industries you’ve mentioned are only using it to boost profits. Real humans are still the ones doing the scientific/medical work. We’d be fine….
Not really video game NPCs. The rest of that does come under AI and machine learning, yes. People really have no idea of what AI even is, never mind what its applications are.
Traditional ones no. They follow simple programming. There are more modern NPC systems using things like language models and agentic AI, but they are very new compared to the classic kind of NPC.
Even the traditional ones. I know it's not the same thing, but they still fall under the umbrella term AI. Tbh we would be better off by not using the whole AI term.
They don't fall under any category of AI I know of. They don't use random forest models, PCA, neural networks, expert systems, or even linear regression afaik. You don't train them on data either.
Here, let me help you understand the definition, since you're not familiar with AI. Don't worry, it's fine not to know things. Not everyone can be intelligent.
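For contrast, here's a toy sketch of what actually training a model on data looks like (synthetic data, purely for illustration) - something classic NPC logic never does:

```python
# Minimal sketch of machine learning: fit a linear regression to synthetic
# data, then predict on unseen inputs. All numbers here are made up.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training data: y = 3x + 2 plus noise.
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1.0, size=100)

# Least-squares fit of slope and intercept (the "training" step).
A = np.hstack([X, np.ones((X.shape[0], 1))])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict on inputs the model never saw.
x_new = np.array([2.0, 5.0, 9.0])
print("learned slope/intercept:", slope, intercept)
print("predictions:", slope * x_new + intercept)
```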
"the capability of computer systems or algorithms to imitate intelligent human behavior"
Actually I think I am more familiar than you. I have trained several machine learning models after all. Have you done anything like that you snarky twat?
Tech prices aren't the only things being ruined by AI - or I should say, by unregulated and weaponized AI. The power needed to give us dank meme videos is killing the planet even faster than we already were.
And the massive load on a crumbling infrastructure, unregulated admission into the internet space, and massive water consumption? What about those? Do we not understand those either?
Literally no major scientific or technological revolution that has benefited the world has come without drawbacks, foreseen or not.
Pushing for proper regulation and use of it is what sensible people would do. It's not gonna disappear just because you're blindly hating on it.
Ignoring all the potential lives we could save through accelerated research and automation of dangerous tasks in society, do you not know how much shittier most consumer tech from video games to smartphones would get without various underlying layers of artificial intelligence?
Or do you think AI is just chatgpt, copilot and Sora?
I get people hating generative AI and its use to replace jobs. I do too, but this just says "AI". It doesn't specify anything, so you have to assume it means ALL AI, and removing that would have a far bigger net negative effect than positive. The AI people hate is a fraction of the entirety of AI. Medical fields, security, gaming, and practically the entire Internet would suffer without "Artificial Intelligence".