It is an amazing fact that both mathematics and physics are distinguished by being ‘negative’ sciences. Out of all the information provided to us by our senses, arithmetic rejects pretty well the whole lot, some 99%. Numbering is an informational minimum. We have something in front of us: living, dead, black, green, beautiful, manufactured, organic; numerically speaking it is all the same. There is not much further we can go except perhaps by making the most basic distinction of all, between ‘something’ and ‘nothing’. As Piaget and Inhelder put it, “Number results primarily from an ignoring of differential qualities” (P & I, 105).¹
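The point can be put concretely: a count, whether in the head or in code, discards every quality of the things counted. A minimal Python sketch (the list contents are of course arbitrary):

```python
# Numbering ignores every differential quality: a daisy, a gull,
# a statue and a number are, numerically speaking, all the same.
things = ["daisy", "gull", "statue", 3.14]
print(len(things))    # → 4: only 'how many' survives the abstraction

# The one distinction that remains: 'something' versus 'nothing'.
print(len([]))        # → 0
```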
To become numerical, entities must first of all undergo two negative transformations: they must be depersonalised, i.e. all distinguishing features such as shape, size, substance, origin &c. must be wiped out (in imagination if not in fact). Secondly, if we are dealing with a group, this group must be disordered, i.e. all relative positioning, left, right, on top, under, and so on, must be eradicated as well. This is, incidentally, why Georg Cantor, the brilliant though misguided infinity-lover, wrote two bars, symbolising a double negation, on top of the letter he chose to represent the cardinal number of a set, M.²
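Cantor’s notation can be set in LaTeX; on his usage each bar records one act of abstraction, the first from the nature of the elements, the second from their ordering, matching the two negations described above:

```latex
% Cantor's bars, each symbolising one act of abstraction (negation):
% \overline{M}            : abstract from the nature of the elements (the order type)
% \overline{\overline{M}} : abstract also from their ordering (the cardinal number)
\[
  M \;\longrightarrow\; \overline{M} \;\longrightarrow\; \overline{\overline{M}}
\]
```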
Those who adhere to a strictly Darwinist position, and do not hesitate to apply it to human society, find themselves in something of a predicament when they attempt to explain the origin of numbers and arithmetic. For not only does a knowledge of numbering confer no immediate evolutionary advantage, it actually runs in the opposite direction to what one would expect. Numerical status can only exist when type status is abolished, at any rate for the time being. But for animals the important distinction to be made in practically all circumstances is one of type, not number. Friend or foe, comestible or poisonous, known or unknown: such distinctions are all-important for animal species, and mistakes spell disaster and possibly extinction. As to numbering, provided one has a vague idea of quantity, enough to allow one to decide when it is, for example, better to flee rather than fight, there is little need to bother with it. ‘What sort of thing is this?’ is the question to be asked, not ‘How many?’ Imagine someone saying to you, “There’s three out there”. The natural reply is, “Three what?”
Much the same applies to mankind when living in a hunting/food-gathering society. Any effort made in the direction of increased numeracy would not only confer no obvious advantage but might even be counter-productive, interfering with mental procedures that were of proven use. Missionaries, explorers, colonists and the like found not only that hunting peoples had very rudimentary number systems but that they strongly resisted the introduction of more complex ones. This is usually dismissed as ‘cultural resistance’ to anything new, but it may also have been motivated by fear of losing their own, quite different, skills. Whether this was the underlying reason or not, there can be no doubt that hunting peoples, along with herdsmen and early agriculturalists, had very little interest in numbers. Even such an advanced though still semi-nomadic people as the Jews, whose wealth lay in their flocks, had a grotesquely rudimentary mathematics when one considers their epoch-making achievements in religious thought and literature, law, hygiene and so much else.
It is rather paradoxical that the species which has developed mathematics and mathematical physics so very far is not, naturally, very talented numerically speaking! Even today, in our ultra-educated, ultra-urban society, human beings perform very poorly numerically, which is precisely why we rely so cravenly on the ubiquitous calculator. Guess how many daisies there are in the lawn, or the number of gulls sitting on the beach, and then go and count them if you can. You will find that your guess was not merely out but wildly out. Compare this with the absolutely phenomenal ability we have for recognising faces after decades, or recognising places we have perhaps visited only once in childhood. We retain this ability to make fine qualitative distinctions even though we no longer need it to the same extent as the nomad or goatherd of the distant past. It is, ironically, exactly those abilities that come most easily to us (pattern recognition) that we find it most difficult to ‘teach’ to computers.
Why did early man not develop much number sense? Clearly, because he did not need it, and any expense of effort in that direction ran the risk of warping his judgment in more important matters. Most really effective memorising was, and to some degree still is, essentially visual. The calculating prodigies themselves mostly relied on visualisation techniques, turning numbers back into ‘things’ or even into ‘people’ to make them more memorable. One method is to visualise a street until you know it off by heart and then store items of numerical data in each of the houses, as if they were inhabitants. Advocates of high-speed mental arithmetic, now hardly practised, recommend visually associating certain signs with others so that, when you are asked, for example, to add or take away, you ‘see’ the result without performing any sort of calculation. Thus the juxtaposition of 7 and 5 will at once ‘call to mind’ ‘12’, or alternatively ‘2’. In effect such techniques reduce arithmetic, which we are not naturally good at, to visual association and pattern recognition, which we are. In a mathematical era which concentrates on deduction, a strictly non-observational process, pattern recognition is not prized and is even distrusted in mathematics as likely to lead to false judgment and rash hypotheses. The ‘classic’ mathematicians, up to the mid-nineteenth century and the advent of analysis, on the contrary prized observation and developed it, especially the incomparable Euler.
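The association technique described above can be caricatured in a few lines of Python (a sketch only; the precomputed table of single-digit pairings stands in for the practitioner’s memorised visual associations):

```python
# Build the 'memorised' associations once: every single-digit pairing
# is stored in advance, so no calculation happens at recall time.
recall = {}
for a in range(10):
    for b in range(10):
        recall[(a, b, '+')] = a + b        # 7 next to 5 'calls to mind' 12
        recall[(a, b, '-')] = abs(a - b)   # or, alternatively, 2

def see(a, b, op):
    """Recall the stored result rather than compute it."""
    return recall[(a, b, op)]

print(see(7, 5, '+'))   # → 12
print(see(7, 5, '-'))   # → 2
```

The design mirrors the essay’s point: arithmetic is replaced wholesale by retrieval, exactly as the mental arithmetician replaces calculation with recognition.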