Thanks to @General_Effort@lemmy.world for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.
Also, supposing it did, I'm quite sure everyone's brain would function at a different rate. And how do you even measure people who don't have an internal monologue? Seems like there's a lot missing here.
"Bit" in this context refers to the shannon from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits is the information learned from observing an event with a probability of 1 in 2^10, i.e. about a 0.1% chance of occurring. So 10 bits in this context is actually not that small a number.
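To make that concrete, here's a minimal sketch of the surprisal formula behind those numbers (plain Python; the coin-flip and 1-in-1024 probabilities are just the examples above, not anything taken from the paper):

```python
import math

def surprisal_bits(p: float) -> float:
    """Information (in bits, i.e. shannons) gained by observing an event of probability p."""
    return -math.log2(p)

print(surprisal_bits(0.5))       # 1.0  -> a 50% event carries 1 bit
print(surprisal_bits(1 / 1024))  # 10.0 -> a ~0.1% event carries 10 bits
```

The point is just that bits measure how much uncertainty gets resolved, not how many binary switches flip.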
I also don’t have 10 fingers. That doesn’t make any sense - my hands are not numbers!
Ooooor “bits” has a meaning beyond what you assume, but it’s probably just science that’s stupid.
I can tell you’re trying to make a point, but I have no idea what it is.
You say “we don’t think in bits because our brains function nothing like computers”, but bits aren’t strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.
To show this, I chose an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That’s because the concept of “10” is applicable both to math and topics that math can describe, just like “bits” are applicable both to information theory and topics that information theory can describe.
For the record: I didn’t downvote you, it was a fair question to ask.
I also thought about a better analogy - imagine someone tells you they measured the temperature of a distant star, and you say “that’s stupid, you can’t get a thermometer to a star and read the measurement, you’d die”, just because you don’t know how one could measure it.
Bits are binary digits used for mechanical computers. Human brains are constantly changing chemical systems that don't "process" binary bits of information, so it makes no sense as a metric.
It's not about how you measure it, it's about using a unit system that doesn't apply. It's more like trying to calculate how much a star costs in USD.