When you have to pick the shortest queue at a supermarket, what do you do? Nobody starts counting – instead, our brain does something called “number sensing”.
This ability to gauge numbers without counting has been linked to mathematical skill, and it varies widely between people. Researchers studying how number sensing works in the human brain have identified situations in which we are prone to making mistakes.
Try it yourself. In the image below, guess which set has more dots.
If you chose the left set, you have just experienced one way human visual perception is prone to error. In fact, both sets have the same number of dots.
This error in number sensing grows in proportion to the number of objects. According to David Burr of the University of Florence, “A previous study has shown that we can immediately sense – not count – up to four objects without error. With more objects we start to make mistakes. With five objects your sense is off by one, with 50 objects, it is off by ten.” In other words, the error stays a roughly constant fraction of the true number – about 20% in both of Burr’s examples.
A new study by Michael Morgan of the Max Planck Institute in Cologne, just published in the Proceedings of the Royal Society B, sets up an experimental challenge between a human and a model. Images consisting of many white and black dots placed randomly on a grey background were flashed for less than a second – too briefly to count the dots. So while the human observer gauged the number, the model used the contrast between the dots and the background to decide how many dots were present.
Morgan found that the ability to “sense” the number of dots depended on how clearly the dots were separated from one another: blurred images led to more errors than sharp ones.
To test the strengths and weaknesses of the model against the human observer, Morgan varied the images presented: fewer dots, bigger dots, more blur, and so on. The computer model was nearly as good as the best human observer at estimating numbers when the size of the dots or the overall pattern changed.
However, when the researchers varied the amount of blurring in the images, the computer model, which relied only on the contrast between the dots and the background to “sense” the numbers, fell short. To the model, blurred neighbouring dots merge into a single object, throwing off the estimate. Our brain can still identify the separate elements as dots, and is therefore the superior “number sensor”.
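To see why a purely contrast-based counter stumbles on blurred images, here is a minimal sketch in Python – an illustration of the general idea only, not the model Morgan and Tibber actually used, and with all stimulus parameters (image size, dot radius, blur width, contrast threshold) assumed for the example. It thresholds each pixel’s contrast against the grey background and counts the connected regions that remain, so once blurring fuses neighbouring dots, the count drops.

```python
# Illustrative sketch only (assumed parameters throughout; not the actual
# model from Morgan's paper): estimate the number of dots purely from
# contrast against the grey background, then watch the estimate fail
# once the image is blurred.
import numpy as np
from scipy.ndimage import gaussian_filter, label

rng = np.random.default_rng(seed=1)

def make_stimulus(n_dots, size=256, radius=3, background=0.5):
    """White and black dots placed at random on a mid-grey background."""
    img = np.full((size, size), background)
    ys, xs = np.mgrid[0:size, 0:size]
    for _ in range(n_dots):
        cy, cx = rng.integers(radius, size - radius, size=2)
        shade = rng.choice([0.0, 1.0])  # black or white dot
        img[(ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2] = shade
    return img

def contrast_count(img, background=0.5, threshold=0.2):
    """Count connected regions whose contrast with the background is high."""
    _, n_regions = label(np.abs(img - background) > threshold)
    return n_regions

stim = make_stimulus(n_dots=50)
print("sharp:  ", contrast_count(stim))                      # close to 50
print("blurred:", contrast_count(gaussian_filter(stim, 2)))  # undercounts
```

On a sharp image the region count tracks the number of dots reasonably well; after blurring, nearby dots merge (and adjacent black and white dots partly cancel), so the contrast-only estimate falls below the true number.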
Morgan argues in the paper that number sensing is essentially the same as any other visual perception, like seeing a pattern. Marc Tibber, a collaborator on the study and a researcher at University College London, said, “Our model doesn’t even try to identify objects, it works on much more basic cues like contrast and density.” He argues that number sensing may be a much simpler process and not a “higher cognitive ability”.
But Burr was not very impressed with the findings. “It underestimates what the human visual cortex can do. In order to gauge the number it almost certainly has to first isolate elements as being objects.” So using a ratio between high and low contrast, as the model observer does, doesn’t seem to him like an effective approach to understanding number sensing.
The field of number sensing still seems to be divided over this issue – is gauging the number of jelly beans in a jar a process of identifying objects and counting them, or one of identifying contrasts and patterns? Settling this basic question will be crucial to taking number sensing to the next step and applying it to build technologies such as counting cells in a dish or estimating the number of people in a crowd.
This article was originally published on The Conversation. Read the original article.