This post was submitted on 21 May 2025.
299 points (96.6% liked)

Technology


Absolutely needed: getting high efficiency for this beast... and as it gets better, we'll become too dependent on it.

"all of this growth is for a new technology that’s still finding its footing, and in many applications—education, medical advice, legal analysis—might be the wrong tool for the job,,,"

[–] [email protected] 77 points 2 weeks ago (2 children)

as it gets better

Bold assumption.

[–] [email protected] 33 points 2 weeks ago (40 children)

Historically, AI has always gotten much better, usually after the field collapsed into an AI winter and several years were spent searching for a new technique, at which point the hype cycle repeated. Tech bros want it to get better without the winter stage, though.

[–] [email protected] 29 points 2 weeks ago (2 children)

AI usually got better once people realized it wasn't going to do everything it was hyped up for, but was still useful for a certain set of tasks.

Then it turned from world-changing hotness to super boring tech your washing machine uses to fine-tune its washing program.

[–] [email protected] 33 points 2 weeks ago (1 children)

Like the cliché goes: when it works, we don't call it AI anymore.

[–] [email protected] 5 points 2 weeks ago (1 children)

The smart move is never calling it "AI" in the first place.

[–] [email protected] 10 points 2 weeks ago* (last edited 2 weeks ago)

Unless you're in comp sci, where AI is a field, not a marketing term. And in that case everyone already knows that this isn't "it".

[–] [email protected] 6 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

The major thing that killed 1960s/70s AI was the Vietnam War. MIT's AI Lab (the predecessor of today's CSAIL) was funded heavily by DARPA. When public opinion turned against the war and Congress started shutting off funding, DARPA money stopped flowing into the lab. Congress didn't create an alternative funding path, so the whole thing dried up.

That lab basically created computing as we know it today. It bore fruit, and many companies owe their success to it. There were plenty of promising lines of research still going on.

[–] [email protected] 9 points 2 weeks ago

The spice must flow

[–] [email protected] 14 points 2 weeks ago (1 children)

Yeah, I think there were some efforts, until we found out that adding billions of parameters to a model would let it both write the useless parts of emails that nobody reads and strip out the useless parts of emails that nobody reads.

[–] [email protected] 24 points 2 weeks ago (6 children)

The energy issue almost feels like a red herring to distract everyone from the actual problems with AI, and Lemmy is just gobbling it up every day. It's so tiring.

[–] phoenixz 47 points 2 weeks ago (13 children)

That's because it IS an issue, along with many other issues like disinformation, over-reliance, and it being the wrong tool for (most) jobs.

[–] [email protected] 6 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Lemmy is just gobbling it up every day. It's so tiring.

Are you fucking serious? All I ever see on Lemmy is people saying "AI slop" over and over and over and over again... in like every comment section of every post. It could be a picture that was actually hand-drawn, or a photograph that was definitely not AI, or an article written by someone who merely "sounds like AI". The AI hate on Lemmy WAY overpowers any support.

[–] [email protected] 7 points 2 weeks ago (1 children)

I think you misunderstood me here, as we're in agreement already.

[–] [email protected] 18 points 2 weeks ago (2 children)

How does crypto mining play into all of this electricity demand? I know they used to use a buttload.

[–] [email protected] 14 points 2 weeks ago (1 children)

I found this article from last year: https://www.eia.gov/todayinenergy/detail.php?id=61364

Our preliminary estimates suggest that annual electricity use from cryptocurrency mining probably represents from 0.6% to 2.3% of U.S. electricity consumption.

The wide range shouldn't be too surprising; it's a mess to keep track of, especially with the current administration. Since then, with Trump immediately pledging to support the "industry", I can only imagine it's consuming even more now.
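
To put that range in absolute terms, here's a rough back-of-the-envelope sketch; the ~4,000 TWh figure for total annual U.S. electricity consumption is my own assumption, not a number from the article:

```python
# Rough conversion of the EIA's 0.6%-2.3% range into absolute terms.
# ASSUMPTION: total annual U.S. electricity consumption of ~4,000 TWh,
# an approximation, not a figure from the linked article.
US_ANNUAL_TWH = 4_000

low_share, high_share = 0.006, 0.023
print(f"Crypto mining: ~{low_share * US_ANNUAL_TWH:.0f} to "
      f"~{high_share * US_ANNUAL_TWH:.0f} TWh per year")
# -> roughly 24 to 92 TWh/year, on the order of the annual electricity
#    consumption of a mid-sized country
```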

[–] [email protected] 8 points 2 weeks ago (10 children)

That's a huge amount of electricity even at its lowest. Whether they're building the AI to crypto mine is another question. I could see these sneaky bastards combining the two somehow.

[–] [email protected] 7 points 2 weeks ago

It should be clarified that it's 99.99% Bitcoin mining that's wasting all that energy; any other crypto that still uses mining is basically irrelevant by comparison.

[–] [email protected] 8 points 2 weeks ago* (last edited 2 weeks ago) (10 children)

Solar-powered server farms in space. Self-powered, self-cooling, 'outside the environment'. Is this a stupid idea?

Edit: So it would seem the answer is yes. Good chat :) Thanks.

[–] [email protected] 32 points 2 weeks ago

Launch cost is astronomical.

Maintenance access is horrible.

Temperature delta is insane, up to 250 °C.

[–] [email protected] 27 points 2 weeks ago (1 children)

I don’t understand the self-cooling part. Isn't it harder to keep things cool in space, since there's no conductive or convective cooling? I mean, everything is in a vacuum. The only way for heat to leave is radiation, and that's terribly inefficient. Seems like a massive engineering problem.

[–] [email protected] 8 points 2 weeks ago (1 children)

It is. Infrared radiators weigh a shit ton and are inefficient, big, and unwieldy. Still, they're the only viable option for cooling in space. AI would take a humongous square footage of them just so the GPUs won't melt.
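
A rough sketch of why the square footage gets so huge: radiated power follows the Stefan-Boltzmann law, P = εσAT⁴, so the required area scales directly with the heat load. The 1 MW load and 300 K radiator temperature below are illustrative assumptions, not figures from this thread:

```python
# Back-of-the-envelope radiator sizing via the Stefan-Boltzmann law:
# P = emissivity * sigma * A * T^4  =>  A = P / (emissivity * sigma * T^4)
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 * K^4)

heat_load_w = 1_000_000   # ASSUMPTION: a 1 MW server farm (illustrative)
emissivity = 0.9          # typical value for radiator coatings
radiator_temp_k = 300     # ASSUMPTION: radiator runs at ~27 °C

area_m2 = heat_load_w / (emissivity * SIGMA * radiator_temp_k ** 4)
print(f"Radiator area needed: ~{area_m2:,.0f} m^2 per MW of heat")
# -> ~2,400 m^2 per megawatt of heat rejected, and that ignores absorbed
#    sunlight, which pushes the real requirement even higher
```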

[–] [email protected] 18 points 2 weeks ago* (last edited 2 weeks ago)

You can cool servers way better on Earth than you can in space. Down here you can transfer heat away from the server with conduction and convection, but in space you really only have radiation. Cooling spacecraft is an engineering challenge. One might imagine a server stuck inside a glass thermos that's sitting out in the sun.

[–] [email protected] 15 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

If the end goal is so little Timmy can ask a robot whether Nazis exist and have it spit out misinformation, or so AI bots can flood social media with endless regurgitated bullshit, then no, it's just more garbage in space.

AI is interesting... necessary? A lot of people could be fed and housed for the cost of giant, experimental solar-powered AI computers in space that just give them more excuses not to pay people a living wage.

[–] [email protected] 13 points 2 weeks ago

Afaik space isn't self-cooling. Overheating of spacecraft is a thing. I think they can only cool through infrared radiation or something.

[–] [email protected] 6 points 2 weeks ago

Do you know how much energy you need to launch a kilogram into Earth orbit?
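
For a sense of scale, the theoretical floor is easy to estimate: a kilogram in low Earth orbit carries about ½mv² of kinetic energy at roughly 7.8 km/s. These are standard textbook approximations, not figures from this thread:

```python
# Lower bound: kinetic energy of 1 kg at low-Earth-orbit speed.
# Real launches cost far more, since most propellant energy goes into
# lifting the rocket and the propellant itself.
LEO_SPEED_MS = 7_800                        # approximate LEO orbital speed, m/s

kinetic_j = 0.5 * 1.0 * LEO_SPEED_MS ** 2   # ~30 MJ per kg
kinetic_kwh = kinetic_j / 3.6e6             # ~8.5 kWh per kg
print(f"Ideal minimum: ~{kinetic_j / 1e6:.0f} MJ "
      f"(~{kinetic_kwh:.1f} kWh) per kg")
# Practical rockets are orders of magnitude less efficient than this
# ideal, so the real energy (and dollar) cost per kilogram is far higher.
```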

[–] [email protected] 5 points 2 weeks ago

It's worth it for school essays and prawn Jesus, though.
