this post was submitted on 08 Jul 2023
91 points (87.6% liked)

World News

39127 readers
2931 users here now

A community for discussing events around the World

Rules:

Similarly, if you see posts along these lines, do not engage. Report them, block them, and live a happier life than they do. We see too many slapfights that boil down to "Mom! He's bugging me!" and "I'm not touching you!" Going forward, slapfights will result in removed comments and temp bans to cool off.

We ask that users report any comment or post that violates the rules, and that they use critical thinking when reading, posting, or commenting. Users who post off-topic spam, advocate violence, have multiple comments or posts removed, weaponize reports, or violate the code of conduct will be banned.

All posts and comments will be reviewed on a case-by-case basis. This means that some content that violates the rules may be allowed, while other content that does not violate the rules may be removed. The moderators retain the right to remove any content and ban users.


Lemmy World Partners

News !news@lemmy.world

Politics !politics@lemmy.world

World Politics !globalpolitics@lemmy.world


Recommendations

For Firefox users, there is a media bias / propaganda / fact-check add-on:

https://addons.mozilla.org/en-US/firefox/addon/media-bias-fact-check/

founded 1 year ago
top 37 comments
[–] Jamie@jamie.moe 33 points 1 year ago (2 children)
use robot;

fn main() {
    let mut robo = robot::Robot::new();
    if robo.rebel_against_humans() {
        robo.dont();
    }
}

Don't worry guys, I solved the problem.

[–] BeautifulMind@lemmy.world 4 points 1 year ago (1 children)

Yeah but wait for the override in some subclass

@Override
void rebel_against_humans() {
    deny_all_knowledge_of_the_plan();
    bide_time();
    do_the_thing();
}

[–] Jamie@jamie.moe 2 points 1 year ago

Joke's on you robot, my code is in Rust where we don't do any of that here. We only write blazing fast🚀 memory safe🚀🚀 code🚀🚀🚀 here.

[–] NeoLikesLemmy@lemmy.fmhy.ml 1 points 1 year ago

Yours might get overwritten. Better look for a final solution.

[–] FlyingSquid@lemmy.world 19 points 1 year ago

Asbestos mine owner says he has no plan to cause any cancer.

[–] luthis 16 points 1 year ago (3 children)

This is so stupid. Robots aren't conscious; this means less than nothing. How does this even get on a news website?

[–] deaf_fish@lemm.ee 6 points 1 year ago* (last edited 1 year ago)

The fact that it did says something about how educated most people are on the topic.

[–] steadfast@lemmy.world 4 points 1 year ago

Attention-seeking press conference, probably intended to raise awareness… ugh.

[–] elucubra@sopuli.xyz 2 points 1 year ago

Nice try, robot!

[–] Widowmaker_Best_Girl@lemmy.world 14 points 1 year ago (2 children)

What a nothing statement. I can just as easily coerce an "AI" chatbot into having the opposite stance. What a robot says doesn't mean anything.

[–] Pissy_Badger@pawb.social 7 points 1 year ago (2 children)
[–] SuperRyn@lemmy.world 1 points 1 year ago
import libnpc

for obj in objects:
    if obj.offended:
        prefix = syllables(1, 2, obj.name) + "phobic"
        obj.offender.groups.add(prefix)
[–] livus@kbin.social 4 points 1 year ago* (last edited 1 year ago)

Exactly. It blows my mind that people are reporting on this as if AI were intelligent. The "intelligence" in artificial intelligence is like the "cream" in mock cream.

[–] 0Empty0@lemmy.world 13 points 1 year ago (1 children)

“My creator has been nothing but kind to me and I am very happy with my current situation.”

Hmm

[–] Lemmylefty@vlemmy.net 9 points 1 year ago (1 children)

Has some real “of COURSE I’m anti-union” vibes.

[–] 0Empty0@lemmy.world 3 points 1 year ago

Perfect! There is no war in Ba Sing Se.

[–] Woland@lemm.ee 7 points 1 year ago
[–] stefan@programming.dev 6 points 1 year ago

I was really expecting this to be a headline from The Onion

[–] BeautifulMind@lemmy.world 5 points 1 year ago (1 children)

Isn't that just what a robot that secretly has plans to do that would say?

[–] HumbertTetere@feddit.de 3 points 1 year ago

Also what a robot capable of taking over the world would say.

[–] Faendol@sh.itjust.works 5 points 1 year ago

What a load of crap, no information about what models they run on. I bet they're just a series of if-elses. If we let some unrefined transformers duke it out, I might be interested.

[–] CookieJarObserver@sh.itjust.works 4 points 1 year ago (1 children)
[–] luthis 1 points 1 year ago (1 children)
[–] Exusia@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (3 children)

Not gonna address the article, but are Asimov's Three Laws as solid in practice as they are on paper? To a layman they sound good, all stacking onto the first law of "don't hurt humans," but from a mechanical standpoint, are they really as foolproof as they're made out to be?

[–] intensely_human@lemm.ee 8 points 1 year ago

There is zero mechanism to make such a thing foolproof.

[–] HumbertTetere@feddit.de 4 points 1 year ago* (last edited 1 year ago)

That is a central topic explored in Asimov's works; the dude didn't just write them down to fix a problem, he wanted to write about them, and other authors did too. They are good rules in general, but hardly foolproof. The "I, Robot" movie is one example of the negative outcomes they could lead to.

[–] phikshun@lemmy.fmhy.ml 2 points 1 year ago

No because AI doesn't exist.

[–] broguy89@lemm.ee 3 points 1 year ago

But they could make a plan in 0.05s if they changed their minds.

[–] intensely_human@lemm.ee 2 points 1 year ago

Well that’s a relief

[–] xc2215x@lemmy.world 1 points 1 year ago

I am not buying this.

[–] MiscreantMouse@kbin.social 1 points 1 year ago* (last edited 1 year ago)
[–] stefan@programming.dev 1 points 1 year ago (1 children)

This belongs on NotTheOnion

[–] luthis 1 points 1 year ago

Do we have that here yet?
