by Brett Riester, Senior Engineer, The Auri Group
Personally, I wanted to write about this on November 11, because then it would be "binary" day, 11/11/11. Of course, there have been other binary days in the 21st century so far (in 2011, as well as 2010, 2001, and 2000); e.g. on 01/01, 01/10, 01/11, 10/01, 10/10, 10/11, and on 11/01 and 11/10 (coming soon to a theater near you!). But 11/11/11 is the real symmetrical one!!
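Just for fun, those "binary" month/day combinations can be enumerated with a short script (a sketch in Python; the variable and function names here are my own invention, not anything from an existing library):

```python
# Find all month/day pairs whose two-digit forms use only the digits 0 and 1.
# Months run 01-12, days 01-31; a "binary day" like 10/11 is one whose MM and
# DD strings are both made of 0s and 1s.

def is_binary(digits: str) -> bool:
    """True if every character is '0' or '1'."""
    return set(digits) <= {"0", "1"}

binary_dates = [
    f"{m:02d}/{d:02d}"
    for m in range(1, 13)
    for d in range(1, 32)
    if is_binary(f"{m:02d}") and is_binary(f"{d:02d}")
]
print(binary_dates)
# Only months 01, 10, 11 and days 01, 10, 11 qualify -> 9 dates in all,
# the eight listed above plus 11/11 itself.
```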
[BTW, 10/28/11 can also kind of/sort of be thought of as 11/11/11, because 1+0+2+8 = 11, which turns 10/28 into 11; but that’s another story.]
Numbers have long been a cornerstone of computing, and thus hold a special significance in the discipline of computing technology and, specifically, in software development. Whether rigorous or not, in one form or another, they have played both central and supporting roles across the whole gamut of this ever-growing field: from transportation, housing, business, industrial, government, healthcare, and commerce software to digital/interactive games and entertainment, web- and mobile-based "apps", and, nowadays, even the software running your kitchen refrigerator or toaster(-oven)!
My personal interest in numbers mainly focuses on how we represent and manipulate them in computers via software; however, my educational background was supplemented with a math degree, so I am interested in them from a purely mathematical perspective as well.
Wikipedia describes some number formats used in computing at http://bit.ly/rAyzRh. There, they describe numbers represented as "bits and bytes" (as well as "nibbles"), number systems such as binary, octal, and hexadecimal, and decimal/fractional formats, e.g. floating-point numbers.
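To make those formats concrete, here is the same value written in each of the bases just mentioned (a quick Python sketch; the value 202 is just an arbitrary example of mine):

```python
# One value, four notations: Python lets you write the same integer as a
# binary, octal, or hexadecimal literal.
n = 202  # decimal
assert n == 0b11001010 == 0o312 == 0xCA

# Formatting it back out in each base:
print(format(n, "b"))  # '11001010'
print(format(n, "o"))  # '312'
print(format(n, "x"))  # 'ca'

# A "nibble" is 4 bits -- exactly one hexadecimal digit:
high, low = n >> 4, n & 0xF
print(high, low)       # 12 10  (i.e. 0xC and 0xA)

# And floating-point has quirks of its own: many decimal fractions
# have no exact binary representation.
print(0.1 + 0.2 == 0.3)  # False
```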
Wikipedia (as well as many other places, both online and off) can give you the "mechanics" of numbers/systems/formats, but I’m going to try to relate a bit more of their "philosophical" aspect, and some of the "bigger picture" of their role in (modern, digital) computing (and perhaps society). First, in order to begin to realize just how important these little things that we call numbers are, try to imagine a world without them. Such a world is difficult to really imagine at all (at least for me). For, if you think about it, numbers form the basis of so many, many ways that people relate to one another: quantifying/valuing things (not just money) and differentiating/distinguishing all sorts of things, from size (how big is that bread-box, anyway?) to area (how much land does my neighbor have on the other side of the fence?) to volume, weight, rates-and-conversions, etc., etc., and even time itself (at least how it is measured)! And would scientific/engineering disciplines even exist without numbers? Without numbers, we would still be in the Dark Ages (actually, really the Stone Age!!)
That’s society, though; let’s try to focus more on computing, specifically how numbers are used in the modern, digital age. Most people would probably agree that we are certainly living in a (very fast-paced) Technological (or "Digital") Age. So "digits" (i.e. numbers) really are the basis of this Age. Without numbers, we wouldn’t have any of our precious "modern-day-miracle" devices, gadgets, or "toys": desktop/laptop computers, (smart) cell phones, (personal) electronics, game consoles, even smart cars and smart houses! (the list goes on and on)
I think that we all know, at one level or another, the importance of numbers in our modern lives. But what about digital computing, and software development? Without numbers, would we be programming on an abacus or something like that??
Numbers (in the context of digital information and its processing/storage) are an extremely valuable tool in how we humans communicate and relate to each other in so many aspects. However, before continuing, I would like to say that it’s at least very interesting to note that the human brain, so far as we understand it, is not digital at all. It is analog, and it is relatively very, very slow compared to modern-day digital computing devices!
But, as tools, technological/computing devices (and thus, effectively, their digital aspects/basis) are a cornerstone of the modern lifestyles of people around the world.
So numbers are important; they matter. But how are they used, specifically in software engineering/development, that makes them so "powerful"? I’m thinking particularly of the Binary System, upon which most (if not all) "mainstream" (non-research-only) software is based. Well, nowadays, most modern software developers only infrequently need to think about binary numbers or be concerned with how to manipulate them in the programs that they write. That is in large part thanks to modern "3/4/5GL" computer languages (and, of course, their supporting operating systems).
However, Binary still underlies all that "modern" software that they build. Computing, particularly in regards to software, is very much like an onion, in that it has layers on top of layers on top of layers, such that, even if you only work at a relatively "high" level, the "lower" layers get more and more Binary. After all, the lowest level, the "raw metal" so to speak, is all just 0’s and 1’s!
Binary is so important (and powerful) that it’s hard for me to imagine computers/software really based on any other number system (even though it’s easier for us people to work with hexadecimal, or octal, or even decimal; BTW, Douglas Adams’ supercomputer named Deep Thought came up with 42 as the answer to the Ultimate Question of Life, the Universe, and Everything, and 42 = 6 times 9, in base 13, that is!! http://bit.ly/sF5d9k)
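You can check the base-13 joke yourself in a couple of lines (a Python sketch):

```python
# "Six by nine" is 54 in decimal, and 54 written in base 13 is the
# digit string "42", since 4*13 + 2 = 54.
assert 6 * 9 == 54
assert int("42", 13) == 54  # parse "42" as a base-13 numeral
print(int("42", 13))        # 54
```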
Perhaps one reason is that I know the (hardware) circuit gates are either open or closed ("on" or "off", 1 or 0; i.e. Binary). But it is the (innovative) combination of these little 0 and 1 bits that makes them so powerful. It’s a little like the old saying, "two heads are better than one", albeit billions of times over!
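That "combination" really is the whole trick: each extra bit doubles the number of distinct states a group of gates can represent. A tiny Python sketch makes the doubling visible:

```python
# Each additional bit doubles the count of representable states:
# n bits can distinguish 2**n different values.
for n_bits in (1, 4, 8, 16, 32, 64):
    print(n_bits, "bits ->", 2 ** n_bits, "states")
# e.g. 8 bits give 256 states (one byte), while 64 bits give
# 18446744073709551616 -- billions of times over, indeed!
```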
People in today’s society have been so preconditioned to just expect extreme speed and such high precision/accuracy from modern computing devices (at least most of the time, barring inconvenient, albeit usually infrequent, malfunctions, not to mention the all-too-common PICNIC phenomenon: "Problem In Chair, Not In Computer"). But how many of us take a step back, stop, and think to realize the "true" basis of all these modern-day "dream machines", which really is numbers?
So, the next time that your life is impacted by modern technology, pause for a "binary" second and gain a little bit of perspective on, and a deeper appreciation for, numbers, upon which all this computing and digital "stuff" is really based!!