# MULTIPLICATION

Multiplication, interpreted literally, is a nonsense. ‘Increase and multiply’ ― but if you multiply something once it stays the same! And one could very reasonably conclude that if you get no change when you carry out an operation once, you will get no change when you repeat it: no matter how many times you ‘multiply’ something, it does not get any bigger. Abraham would never have had as many descendants as there are stars in the sky going about things this way.

Why is it, then, that we have this strange rule that **N × 1 = N**? The number system we work with today is formalised in terms of the Axioms of Fields: we have two basic operations ‘+’ and ‘×’ and both have a so-called ‘identity element’ which keeps things as they are, **0** for addition and **1** for multiplication. Why not have zero as the null operation for multiplication as well? On the face of it this makes a lot more sense since abstaining from increasing (or decreasing) something means it stays the same. In such a system

**1 × 0 = 1** and more generally

**N × 0 = N** where **N** is a positive integer.

The most natural sense of ‘multiplication’ is doubling what you already have, so

**1 × 1 = 2** and more generally

**N × 1 = 2N**

It is not clear how we should proceed from here. What happens when you have **N × 2**? We could interpret this as an instruction to double again, in which case multiplication becomes the initial quantity ‘times’ the appropriate power of **2**. But this lacks generality. Better to fall back on defining multiplication as repeated addition and interpret the ‘**×**’ as an instruction to “add on another **N**”, or

**N × 2 = 3N** and more generally **N × m = (m + 1)N**
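The rule just stated can be sketched in a few lines of Python. (The name `maverick_mul` is invented here purely for illustration; it is nothing standard.)

```python
# The 'maverick' multiplication described above: N × m is read as
# "add on another N, m times over", i.e. (m + 1) copies of N in all.

def maverick_mul(n, m):
    """Return n 'multiplied' m times: the original n plus m further copies."""
    return (m + 1) * n

# N × 0 is the null operation: the quantity stays as it is.
assert maverick_mul(7, 0) == 7

# N × 1 doubles, the 'most natural sense' of multiplying.
assert maverick_mul(7, 1) == 14

# Commutativity is lost: 3 × 1 gives 6 but 1 × 3 gives 4.
assert maverick_mul(3, 1) == 6
assert maverick_mul(1, 3) == 4
```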

This sort of multiplication does not, as far as I can see, lead to contradiction and I have even attempted to use it. But it is extremely inconvenient because we lose the so-called commutativity of ‘**×**’ as an operation: the result is not generally the same if we swap the two numbers involved. **3 × 1** in this maverick system gives **6** but **1 × 3** gives **4**. What we *do* get in lieu is the peculiar

**N × m = m × N** if and only if **N = m**, since equating **(m + 1)N** with **(N + 1)m** forces **N = m**. So, for instance,

**(4 × 5) = 24** while **(5 × 4) = 25**, and the two sides agree only in trivial cases such as **4 × 4 = 20** (which is the ordinary value of **4 × 5**, since maverick **N × m** equals ordinary **N × (m + 1)**).

This type of multiplication would also cause problems when combined with ‘normal’ division, since **(N × m)/m ≠ N** in general. It thus requires a re-definition of division. And so on.
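Since ordinary division no longer undoes the operation, the redefined division has to strip off the extra copy instead. A minimal sketch, again with invented names, assuming division is defined as the exact inverse of **N × m = (m + 1)N**:

```python
# Maverick multiplication, and a division redefined so as to invert it.

def maverick_mul(n, m):
    return (m + 1) * n

def maverick_div(p, m):
    """Recover n from p = maverick_mul(n, m) by dividing by (m + 1)."""
    assert p % (m + 1) == 0, "p is not a maverick multiple for this m"
    return p // (m + 1)

# Ordinary division fails to invert: (6 × 2) / 2 = 18 / 2 = 9, not 6.
assert maverick_mul(6, 2) // 2 == 9

# The redefined division recovers the original quantity.
assert maverick_div(maverick_mul(6, 2), 2) == 6
```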

It would seem unlikely that these formal issues were the original reason for the rule that a single multiplication leaves the quantity unchanged. Arithmetic has only been formalised during the last 150 years or so, while people have been handling numbers for thousands of years. There are two plausible explanations for the ‘null multiplication rule’.

It is generally accepted today that the earliest type of arithmetic was done using the fingers, sometimes the toes as well. This is shown by the abundance of number words that are related to the fingers: ‘digit’, for example, comes from the Latin *digitus* meaning finger. And the widespread use of base *10* throughout the world rather than the much more convenient *12* is doubtless due to the anatomical accident whereby mammals have ten fingers and thumbs rather than twelve. Finger counting and, more generally, ‘finger arithmetic’ was once widespread since most of the world was illiterate and was allegedly still used within living memory by pearl traders in the Middle East. The Venerable Bede wrote a treatise on finger counting and “the reader will be surprised to find that underlying these finger gestures is a positional or place-value system” (Menninger, *Number Words*).

Now I have actually experimented with a simple finger arithmetical system. Numbers, abstract, gestural or concrete, were not originally invented for ‘doing equations’ but in order to assess, and perhaps subsequently record, quantities of objects by representing them in a standard symbolic form ― even today numbers are used primarily for the recording of data. If you are walking along and want to assess how many trees there are in a clump you cannot operate with the trees since they are fixed. What you *can *do is to match each sub-clump of trees with the fingers of one hand and then use the other hand to record the number of handfuls if there aren’t too many. The eventual quantity can then be memorised and, if required, be subsequently recorded in a more permanent manner by way of charcoal marks on a wall, scratches on a bone, knots in a rope and so on — a Roman would have had a household slave with him holding a portable marble abacus.

Now such a procedure involves both division and multiplication. The collection of real objects is first of all ‘divided up’, at least in imagination, into so many fives or tens and each batch is ‘multiplied’ by repeatedly showing two open hands, with the remainder neglected or shown with a single hand. There is, however, a significant difference between the two operations: it is the collection itself (the trees or beans or warriors) that is first ‘divided up’ into so many tens, but it is the copy, the handful, that is ‘multiplied’. This seems to be the true meaning of ‘multiplication’, namely ‘replication’ or ‘identical copying’ (cloning) and this, of course, is how messenger RNA comes about: part of a strand of DNA is copied in the nucleus, while the actual assembly of amino acids to form proteins takes place later *outside* the nucleus.

In the context of finger assessment or DNA replication, the rules for multiplication and division make perfect sense. The first copying is, as it were, an *inert *operation while ‘copying the copy’ by repetition is creative. The basic point is that the original collection being represented, the clump of trees or the group of men or the bases making up a gene, is not part of the arithmetic operation proper: it functions as a sort of template. And when we pass on to abstract operations where there is not necessarily a real object or collection in view, we retain the same mental picture to guide our operations. There is an original numerical quantity which is ‘out there’: we represent it by some written or verbal symbol and from then on operate on that.

Primitive commercial practice would have reinforced this schema. Arithmetic only got going with the rise of the large Middle Eastern empires, Assyria, Babylon and the like, when trade was extensive and a large bureaucracy was in place. It has been suggested that the development of writing, in the form originally of some sort of ideogram or recognizable picture, came about because of trade. Merchandise, say combs or olives or pins, was apparently often transported in sealed containers which could be checked on arrival to see if they had been tampered with. But how to know what was inside without breaking the seal? A simple stratagem would be to have a clay model of the object attached to the outside of the container indicating the contents. Later, a picture of the object replaced the clay model, later still a stylised representation, and eventually a ‘word’. This symbol was then ‘multiplied’ so many times to indicate the sum total of the contents. Again, we have the strict separation between the representation and the actual object or objects, without which the system would not work.

*SH 8/04/15*