We use them every day - pounds and ounces, feet and inches, or even grammes and kilogrammes, metres and kilometres. They are a fact of life, and we couldn't do without them. But why are they that particular length or weight? The width of a man's thumb, the length of a girdle, the weight of a wheat seed, the volume of a handful of flour - all these measures are over a thousand years old. And, although we may not realise it, they are still used by many of us today.

The inch is based on the width of a man's thumb and, as a rough measure, is surprisingly accurate. The Romans introduced the inch (they called it the uncia), and had 12 of them to their foot, which was almost the same as the imperial foot still used in Britain and the United States today.

The Anglo-Saxons took the length of a man's girdle to be a yard. The system obviously had disadvantages - you only have to look around you at the male population to see why. Early in the 12th century, King Henry I is said to have standardised the yard by stretching out his arm and measuring the distance from his nose to the end of his thumb.

The 'grain' is still part of the imperial system of weights and measures, and once represented the weight of a single seed. When you consider the weights of different seeds, such as the Busy Lizzie and the runner bean (or even the coconut), you realise that here was another source of confusion. But 7000 grains were held to be equivalent to one pound, a proportion still used today.

The 'handful' seems to be an indeterminate amount, but it is still with us, and in more than one way. In ancient Greece, six tiny iron weights called 'obols' were considered to be a 'handful'. The Greek word for a handful is 'drachma'. This was both a coin - the drachma remains the unit of Greek currency to this day - and a weight, which still survives in the imperial system as the drachm, which equals one sixteenth of an ounce.

You must have heard of the metre, even if you have never used it.
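Those old imperial ratios still hang together arithmetically. As a small aside, here is a sketch of the relationships mentioned above - the figures (7000 grains to the pound, a drachm as one sixteenth of an ounce) come from the text, and I have assumed the usual 16 ounces to the pound; the variable names are my own:

```python
# Imperial weight relationships, as described in the article.
GRAINS_PER_POUND = 7000     # from the text
OUNCES_PER_POUND = 16       # assumed: the usual avoirdupois pound
DRACHMS_PER_OUNCE = 16      # drachm = one sixteenth of an ounce

# Work out how many grains make up a single drachm.
grains_per_drachm = GRAINS_PER_POUND / (OUNCES_PER_POUND * DRACHMS_PER_OUNCE)
print(grains_per_drachm)  # 27.34375 grains to the drachm
```

So even the humble 'handful' reduces to an exact, if odd-looking, number of wheat seeds.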
It took the French a long time to decide on the length of a metre, and even then they got it wrong. In the tumultuous years following the 1789 revolution, the French Academy of Sciences was at last allowed to carry through a reform that it had been urging for many years - namely the establishment of a simple, logical system of measures. This was essential to simplify international trade - for even within France itself measures varied widely from one locality to another.

A committee agreed that, to make calculations easier, the new system should be based on the number ten - and that the basic unit would be one ten-millionth of the length of the line running through Paris from the North Pole to the Equator. This new measure would be known as the metre, while the distance itself would be reckoned as 10,000 kilometres.

It took them eight years to complete the surveys and calculations involved, and in 1799 the young Republic of France formally adopted its new metric system of weights and measures. Every unit of measurement was based on the length of the metre. The unit of weight, the gramme, was the weight of a cubic centimetre of distilled water at 4°C - the temperature at which water is at its densest. The unit of volume, the litre, was 1000 cubic centimetres.

We rarely hear of some of the original basic measures today. The 'are', for example, which is an area 10 metres square (100 square metres), has been superseded by the hectare, which is 100 times larger.

As the revolution in France gave way to the era of Napoleon's conquests, the metric system spread throughout Europe with the French Army. Its simplicity and logic made it the international language of science during the 19th century. But it was based on a miscalculation. By 1960, data from artificial satellites had confirmed suspicions that the original calculation of the distance between the North Pole and the Equator had been incorrect.
The actual distance is not 10,000 km but 10,002 km - wrong by only one part in 5000, but an error nonetheless. The length of the metre couldn't be altered, of course, as this would have affected every other unit in the metric system. What they did was redefine it. Since 1983, the metre has been officially described as the distance travelled by light in a vacuum during a time interval of 1/299,792,458 of a second. This makes little difference to anybody who simply wants to measure a piece of wood or fabric, but it is as accurate as science can make it.

Despite the use of the metric system in most countries, the old measures live on. The height of a horse is still measured in 'hands', a hand being 4 inches long. Printers' type is measured in points, each equal to 1/72 of an inch.

Comparatively few people feel absolutely at home with both the Fahrenheit and Celsius (otherwise known as 'centigrade') temperature scales. Most of us, hearing a temperature expressed in the scale that is less familiar to us, will mentally convert the figure into the one we know better. But imagine how confusing things must have been early in the 18th century, when at least 35 different temperature scales were being used.

It was not until 1714, when the German-Dutch instrument maker Gabriel Daniel Fahrenheit made the first really efficient thermometer using mercury in a sealed tube and created his own scale, that a single measure of temperature came into common use. Fahrenheit started with the coldest thing he knew of - a mixture of ice and salt. This he marked on his thermometer as 0 degrees. Next, he measured the temperature of the healthy human body. He had originally intended to divide the scale between his zero and this point into only 12 degrees, but the mercury in his thermometer moved much further up the tube than he had expected. To avoid such large, unwieldy units he decided his scale needed eight times as many divisions.
So, instead of assigning a value of 12° to body temperature, as he had first intended, Fahrenheit called body temperature 96°. The precise figure is 98.6° on Fahrenheit's scale and 37°C on the Celsius scale - but small variations in the bore of the tube caused his thermometer to show a lower reading.

Fahrenheit next measured the freezing and boiling points of pure water, which came to 32° and 212° respectively. A scale based on the freezing and boiling temperatures of water had been proposed as long ago as the 2nd century AD, by Galen, a Greek physician. Fahrenheit realised that these two temperatures are ideal reference points, because they are constant at a given pressure.

His scale of temperature quickly became popular, particularly in English-speaking countries. But it was soon followed by a rival. In 1742, the Swedish astronomer Anders Celsius proposed a scale on which water boiled at 0° and froze at 100°. Why he chose this way round is a mystery, but after his death in 1744 the two figures were changed round. When, at the end of the 18th century, France introduced the decimal metric system of measures, Celsius's 'centigrade' scale found a natural home within it. It soon became the standard temperature scale for all scientific work, and is used in countries that have adopted the metric system. Fahrenheit, however, is still used in many English-speaking countries.

But I can't finish without one of my "Did You Know"s. Did you know that, in Old Testament times, every unit of measurement in the Middle East had two values - the 'common measure' and the 'royal measure'? The royal measure was the larger. When a king demanded taxes to be paid in kind, the royal measure had to be used. When a king paid out, however, he always used the common measure. And so the royal coffers always made a profit.

Bill Hutchings
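For anyone who prefers to let a machine do the mental conversion mentioned above, here is a minimal sketch of the Fahrenheit/Celsius relationship. The fixed points (32°F = 0°C for freezing, 212°F = 100°C for boiling, and 98.6°F = 37°C for body temperature) come from the text; the function names are my own:

```python
# The 180 Fahrenheit degrees between freezing (32) and boiling (212)
# span the same range as the 100 Celsius degrees between 0 and 100.

def fahrenheit_to_celsius(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 100 / 180

def celsius_to_fahrenheit(c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit (the inverse)."""
    return c * 180 / 100 + 32

print(fahrenheit_to_celsius(98.6))   # body temperature: roughly 37 degrees C
print(celsius_to_fahrenheit(100))    # boiling point: 212 degrees F
```

The 100/180 ratio simplifies to the familiar 5/9 of the schoolroom formula.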
Return to the July 2006 Features page