I don't understand the problem. I don't see why virtual "embodiment" would be any more difficult than the rest of a 'brain in a vat' setup.
Agreed, the author simply asserts that brains are interactive as if that's some big gotcha. The whole idea of the vat is that the machine is sophisticated enough to handle these things, at least well enough to fool your own system.
Sanderson's short story 'Perfect State' actively explores the edge cases of what a simulated world would have to do to deal with 'I survived this thing that normally kills everyone'.
I think the idea is probably that it would need a lot of simulated data/interactions. I could see a case for it mattering both for the believability of the simulation (particularly with our pattern-seeking brains) and for the normal, long-term operation of the brain.
On the second part, think about how things like inner ear problems and depersonalization exist, or the agony that comes with a plugged ear or nose. Imagine if you lost the sense of weight, pressure, or temperature... or perhaps even just lost accuracy in it (or gained delay): do you think you would move the same? Do you think it would have any effect on your brain over time? This goes particularly for someone who grew up with a body; I think it's likely most people would feel some kind of dissonance from noticing inconsistencies. (Now, if the idea is that the brain has always been in a vat, that doesn't make as much sense, though that could lead to the argument that it would create developmental differences, or at least a lack of attunement to physical life. EDIT: also garbage-in-garbage-out, particularly even just the quality of socialization for your entire life.)
Of course, I think something like brainVR could work if it were something you were aware of, but even then it would probably be better to just patch in (or not interfere with) sensory data from your body/local environment.
I think the argument comes down to this:
-- That the brain requires something wrapped around it responding to it and pumping the right juices to and fro.
Which is fine and makes total sense. From there, however, the rest of the argument seems to veer completely philosophical, i.e., whether such a "body" changes the definition of what it means to "just" be a brain in a vat. It seems somewhat semantic to me, but I guess they want to make the point that consciousness is not just electrical signals.