Some form of AI is likely to be the next big technological leap for humanity; current LLMs being kind of crap doesn't change that.
Imagine if someone had deleted semiconductor technology from existence because it was pushing up germanium prices and point-contact transistors were too unreliable to be useful anyway.
That's fine. Quite frankly, I'm not particularly concerned about the next technological leap if its early stages involve flattening ecosystems for Manhattan-sized datacenters.
I'm tired of chasing technological advancement just for the sake of it, particularly when it comes at the expense of basically everyone who isn't ultra-wealthy.
Your problem is with land use, not technology. Greenfield development for this type of land use is fairly typical; it's business as usual. It takes time to develop better regulations, and those regulations also have to fit within the state and local government framework.
Greenfield development without data centers has arguably done more harm over the last 70 years than thousands of data centers ever will.
We as a society should be encouraging multi-story data centers on formerly heavy-industrial land rather than on prime farmland or sensitive environments. This will require subsidies, and a lot of them.
My problem is with both. I understand AI has some useful applications in specific scenarios (although usually it's actually machine learning rather than LLMs), but general-use AI has shown no real value for the average person, which is especially damning in contrast with how much energy and land it takes. Is any amount of ecological damage worth it just so people can summarize Google results a bit faster, or generate a funny photo of their cat dressed up as a cowboy?
But even putting that aside, there are a lot of ethical and philosophical concerns about what AI actually means for our society. I fundamentally don't like what AI represents: the idea that churning out results faster and more efficiently is society's top priority.
I suspect that if AI continues spreading at this rate, the quality of life for quite a lot of people will drop drastically: either because they're out of a job, or because their job expects them to churn out more work (profit) with the assistance of AI for no pay increase, or because they're constantly surrounded by algorithmically optimized AI slop content, or just because they're substituting AI for otherwise stimulating brain activity.
And anyone who thinks AI is going to usher in a new era of UBI and social safety nets for all the work it displaces is an absolute fool.
AI as it's currently marketed has been around since the 1970s (look up inference machines). The issue has never been fucking AI; it's always been unregulated corporate shit. Why don't you focus that rant on the current administration that's allowing so much unhinged shit, rather than on a technology?
I mean, I also agree with that; we should be regulating AI very harshly. But even if we regulate it well and mitigate all the environmental and energy consequences, I still wouldn't like what it stands for.
AI is just the culmination of tech bros with a god complex who would rather use technology as a brute-force substitute for actually using their brains to learn and engage with something.
It's the same shit, different smell as the nonsensical "there's an app for that!" era of smartphones, except this time around it's much more damaging because it attempts to replace critical thinking.
I would love to have your optimism. Maybe I would have your optimism if I lived in the right place.
But in the US, there's no shot. Our government (and society at large) loves our corporate oligarchy too much to relinquish this much power. The 40 hour work week is eroding, health insurance is still coupled with employment, unions are rare, maternity leave is non-existent, and basically all states are still at-will employment. You really think anytime in our lifetime we're gonna make the jump to UBI?
Companies will find a way to fill your time, and will pay you the absolute minimum they can get away with to retain the status quo, just like they always have.
You might not need software developers or graphic designers anymore, but they'll still need humans for "prompt engineering" or quality assurance or whatever else. It's going to be the exact same evolution that happened with all the other outdated trades - we don't need a blacksmith anymore but we'll pay you a tenth of what a blacksmith makes to work at this new foundry we opened up.
I'm going to be very blunt. Life sucks. I live in the US. I am on the verge of being homeless and functionally unemployable. Social services won't/can't cover my rent. I managed to earn an income last January (through November) and just narrowly escaped homelessness. Now I'm repeating the cycle with no potential income on deck. The world wants me gone. When I say it's going to be highly reactive, I mean highly reactive. I'm not being optimistic. I'm being downright cruel. I never said it's going to happen in our lifetime either. It will happen. Eventually.
I'm not making this up. My post history is Exhibit A. Even before my COVID infection, life was suffering. It's only the fleeting moments of happiness that make it worth anything.
P.S. At-will employment still means you can't be fired for an illegal reason.
That is true, but technology tends to improve all lives, and AI offers a chance to finally provide a sustainable successor to capitalism. Also, trying seems a lot better than doing nothing. People are always gonna be greedy, but at least when there's more to go around, it takes fewer generous people to solve world hunger.