Yet it takes an enormous amount of processing power to produce a comment such as this one. How much would it take to reason about why the experiment was structured as it was?
Information theory is all about cutting through the waste of a given computation to compare apples to apples.
I'll replicate an example I posted elsewhere:
Let's say I make a machine that sums two numbers between 0 and 127 and returns the result. Let's say this machine also only understands spoken French. According to information theory, this machine receives 14 bits of information (two 7-bit numbers, all values equally likely) and returns at most 8 bits. The fact that it understands spoken French is irrelevant to the computation and is ignored.
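The bit-counting in that example can be checked directly (my own sketch of the arithmetic, not part of the original comment):

```python
import math

# Inputs: two independent, uniformly distributed integers in 0..127.
# Each carries log2(128) = 7 bits, so the machine receives 14 bits.
input_bits = math.log2(128) + math.log2(128)

# Output: the sum ranges over 0..254, i.e. 255 possible values.
# log2(255) is just under 8, so 8 bits suffice to encode it.
# (The sum is not uniform, so its actual entropy is a bit lower;
# 8 bits is the ceiling.)
output_bits = math.ceil(math.log2(255))

print(input_bits)   # 14.0
print(output_bits)  # 8
```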
That's the same line of reasoning here, and the article makes this clear by noting that brains take in billions of bits of sensory data. But they're not measuring overall processing power, they're measuring cognition, or active thought. Performing a given cognitive task proceeds at about 10 bits/s, which is completely separate from the billions of bits per second of background processing we do.
A lion sucks if measured as a bird.
Crazy how a biological analog lump is capable of even a fraction of what a brain can do.
I could believe that we make 10 decisions per second based on pre-learned information, but we must be able to ingest new information at a much quicker rate.
I mean: look at an image for a second. Can you only remember 10 things about it?
It's hard to speculate on such a short and undoubtedly watered down, press summary. You'd have to read the paper to get the full nuance.
> I mean: look at an image for a second. Can you only remember 10 things about it?
The paper actually discusses the winners of memory championships (memorizing random strings of digits, or the precise order of a shuffled 52-card deck). The winners tend to need study time roughly consistent with an intake rate of 10 bits per second.
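A back-of-the-envelope check for the card-deck case (my own arithmetic; only the 10 bits/s rate comes from the paper):

```python
import math

# A shuffled 52-card deck has 52! equally likely orderings,
# so its entropy is log2(52!) bits.
deck_bits = sum(math.log2(k) for k in range(1, 53))
print(round(deck_bits, 1))  # ≈ 225.6 bits

# At an intake rate of 10 bits/s, fully memorizing the order
# should take roughly deck_bits / 10 seconds.
print(round(deck_bits / 10))  # ≈ 23 seconds
```

That ~23-second figure is in the same ballpark as the study times of top memory athletes, which is the consistency the comment is pointing at.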
It even discusses the man who was given a 45-minute helicopter ride over Rome and asked to draw the buildings from memory. He made certain characteristic mistakes, suggesting he had effectively memorized the positions and architectural styles of about 1000 buildings, each chosen from roughly 1000 possibilities (about 10 bits per building), for an effective bit rate of about 4 bits/s.
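The 4 bits/s figure falls out of those rough numbers directly (my own sketch, using the comment's figures of ~1000 buildings from ~1000 possibilities over 45 minutes):

```python
import math

# ~1000 buildings, each effectively one choice out of ~1000
# possibilities, memorized during a 45-minute flight.
bits_per_building = math.log2(1000)    # ≈ 9.97 bits
total_bits = 1000 * bits_per_building  # ≈ 9966 bits
seconds = 45 * 60                      # 2700 s

rate = total_bits / seconds
print(round(rate, 1))  # ≈ 3.7 bits/s, i.e. roughly 4 bits/s
```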
That experiment suggests that we compress our knowledge by taking shortcuts, some of which are inaccurate. It's much easier to memorize details in a picture where everything looks normal than it is to memorize details about a random assortment of shapes and colors.
So even if I can name 10 things about a picture, it might be that those 10 things aren't sufficiently independent from one another to represent 10 bits of entropy.
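A toy illustration of that independence point (my own example, assuming equiprobable outcomes): two observations that always co-occur carry one bit between them, not two.

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Two binary "things I noticed" about an image.
# Independent case: all four combinations are equally likely.
independent = [(a, b) for a in (0, 1) for b in (0, 1)]
print(entropy(independent))  # 2.0 bits: the pair carries 2 bits

# Perfectly correlated case: the second observation just repeats the first.
correlated = [(0, 0), (1, 1)]
print(entropy(correlated))   # 1.0 bit: "two things", but only 1 bit of entropy
```

So naming ten things about a picture only amounts to ten bits if each thing is genuinely unpredictable given the others.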