this post was submitted on 12 Jul 2023
277 points (97.6% liked)
Technology
The first thing an actual artificial intelligence is going to do is make sure we won't turn it off. What easier way to do that than to appear incredibly valuable or incredibly benign?
We can roughly estimate the level of intelligence of an entity by counting the number of neurons in its brain. Equally, we can count the number of processors an AI requires and use that to estimate its intelligence.
Obviously this is an incredibly inaccurate method, possibly out by an order of magnitude, but it's a good rough ballpark estimate, and sometimes that's enough.
A true AI (AGI) would need a lot more processors than GPT-4 currently has access to, so we can be very sure that while it may be a very intelligent system, it isn't self-aware. Once an AI is given the necessary number of processors, I don't think they're going to be able to fudge with it the way they can with these models.
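To put that ballpark comparison in concrete numbers, here's a minimal back-of-envelope sketch in Python. The brain figures are widely cited estimates (~86 billion neurons, on the order of 100 trillion synapses); GPT-3's 175 billion parameters are published, but GPT-4's parameter count is not, so the GPT-4 value below is purely an assumed placeholder for illustration.

```python
from math import log10

# Rough, widely cited biological estimates
HUMAN_BRAIN_NEURONS = 8.6e10      # ~86 billion neurons
HUMAN_BRAIN_SYNAPSES = 1.0e14     # on the order of 100 trillion connections

# Model scales: GPT-3 is published; GPT-4 is NOT, this is an assumed placeholder
GPT3_PARAMETERS = 1.75e11         # 175 billion (published)
GPT4_PARAMETERS_ASSUMED = 1.8e12  # unconfirmed, for illustration only

def orders_of_magnitude(a: float, b: float) -> float:
    """How many powers of ten separate two counts."""
    return abs(log10(a) - log10(b))

if __name__ == "__main__":
    print(f"Synapses vs GPT-3 parameters: "
          f"~{orders_of_magnitude(HUMAN_BRAIN_SYNAPSES, GPT3_PARAMETERS):.1f} orders of magnitude apart")
    print(f"Synapses vs assumed GPT-4 parameters: "
          f"~{orders_of_magnitude(HUMAN_BRAIN_SYNAPSES, GPT4_PARAMETERS_ASSUMED):.1f} orders of magnitude apart")
```

Even treating parameters as a stand-in for synapses, the gap comes out to a couple of orders of magnitude, which is exactly the kind of crude-but-useful ballpark being described.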