From: email@example.com (Lewin A.R.W. Edwards)
Subject: Re: Decimial has set progress back -> Help spread the use of Hex!
Date: 7 Jan 2003 09:16:58 -0800
NNTP-Posting-Date: 7 Jan 2003 17:16:58 GMT

> Why don't people use hex in everyday math? Logically it makes more
> sense. It's far more efficent, requires less concentration to use and

Please explain how hexadecimal is more efficient for general
mathematical purposes than, say, base 314 or base 5?

You may be about to argue that it facilitates conversion to binary,
and makes certain computer programming tasks easier to visualize, but
in fact the vast majority of numbers worked with daily have nothing to
do with computers; they are heights, weights, lengths, densities,
temperatures. However, there are much more important issues in your
message, which I think you fail to realize:

> reasons. Sure, the majority of people have ten fingers / toes, but

Which ought to make us able to count to at least 1 048 575 (in bare
feet), and I for one bitterly regret the retrograde evolutionary step
that denied homo sapiens a tail - clearly, nature's sign bit. The
human's vestigial tail is an obvious signal that the small primates
who have been studying us in our laboratories are soon to take over.

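For the record, the figure above does check out: treating each of twenty digits (ten fingers plus ten toes) as one binary place gives a maximum count of 2^20 - 1. A minimal sketch, purely illustrative:

```python
# Counting in binary on fingers and toes: each digit is one bit,
# so twenty digits give 2**20 - 1 as the largest representable count.
digits = 20  # ten fingers plus ten toes
max_count = 2**digits - 1
print(max_count)  # 1048575

# A tail as sign bit would extend this to sign-magnitude form,
# covering -1048575 .. +1048575 with the same twenty data bits.
```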
You will observe that nature has specialized the mathematical
abilities of its creations by providing each with specific calculating
organs. A chicken, for instance, has only four discernible toes per
foot, and two wings. However, each foot has a separate toe facing
backwards, and three toes facing forwards. The erudite
chicken-observer will note that the rearward claw is a sign bit, the
three forward-facing claws are data bits, and the wing positions are
used to signal arithmetical operations. To wit, both wings
outstretched horizontally denotes a subtraction operation, both wings
held at a 45 degree downward angle denotes addition, and wings furled
by the side of the chicken indicate no-op (simple data storage for use
in a future calculation).

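The encoding described above is, for what it's worth, internally consistent: a rear sign claw plus three forward data claws gives sign-magnitude values from -7 to +7 per foot, with the wing position selecting the operation. A purely illustrative decoder (all names hypothetical):

```python
# Illustrative decoder for the avian arithmetic unit described above.
# One foot: rear claw = sign bit, three forward claws = magnitude bits
# (sign-magnitude, so each foot encodes -7 .. +7).
def foot_value(rear_claw, forward_claws):
    magnitude = sum(bit << i for i, bit in enumerate(reversed(forward_claws)))
    return -magnitude if rear_claw else magnitude

# Wing positions select the arithmetical operation.
WING_OPS = {
    "horizontal": lambda a, b: a - b,  # both wings outstretched: subtraction
    "45_down":    lambda a, b: a + b,  # both wings angled down: addition
    "furled":     lambda a, b: a,      # wings furled: no-op (data storage)
}

def chicken_compute(wings, left_foot, right_foot):
    a = foot_value(*left_foot)
    b = foot_value(*right_foot)
    return WING_OPS[wings](a, b)

# Left foot (0, [1, 0, 1]) encodes +5; right foot (1, [0, 1, 1]) encodes -3.
print(chicken_compute("45_down", (0, [1, 0, 1]), (1, [0, 1, 1])))  # 5 + (-3) = 2
```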
Clearly, the lone chicken is not intended to perform any vast
calculation. This beast was designed to operate in large arithmetical
flocks, roaming and calculating across the countryside, and
communicating with a simple, audible, staccato binary code. The
chicken is unquestionably to mathematics what the ant is to civil
engineering.

> decmial, just think how easier life would be! No having to change the
> default radix on out Zilog assembler. No trouble converting to binary
> and back. No problems memorising ASCII codes.

ASCII, like Unicode, is an entirely arbitrary assignment of numbers to
glyphs. You might just as well say that choosing to use the Roman
alphabet makes it "no problem" to memorize the Latin names of every
species of insect found in the Greater New York area.

> I for one vote we take the final step and move towards a true standard
> numbers system. Combine the waring factions of roman and arabic and
> use hex and the worldwide standard. This would be a big step towards

I would point out that Arabic has a different set of glyphs from the
"Arabic" numerals we use. Take a look at an Arabic-numeral calculator
or clock sometime.

> instance. Really though, The Roman empire may never have fallen if
> they had joined the trojans and adopted base 16 as a symbol of peace.

The Roman empire may never have fallen if they had known about the
damaging physiological effects of lead salts.

> I strongly believe that us, readers of technical newsgroups have to
> intelligence and resourcenessfullness to band together in germany as a

Ah! While my intelligence and resourcenessfullness are beyond
question, I find myself lacking the necessary travel documents to
reunite myself with the cause celebre in Germany. Perhaps you could
organize these for me. A simple first-class ticket on Lufthansa would