The year 1993 was characterized by the digitization of the fixed phone network (471 optimally compressed petabytes). We estimate the year 1990 to be the turning point from analog to digital supremacy. The Internet revolution began shortly after the year 2000. In only 7 years, the introduction of broadband Internet effectively multiplied the world's telecommunication capacity by a factor of 29, from 2.2 optimally compressed exabytes in 2000 to 65 in 2007. The most widespread telecommunication technology was the mobile phone, with 3.4 billion devices in 2007 (versus 1.2 billion fixed-line phones and 0.6 billion Internet subscriptions). Nevertheless, the fixed-line phone is still the solution of choice for voice communication (1.5% of the total). The mobile phone network became increasingly dominated by data traffic in 2007 (1.1% for mobile data versus 0.8% for mobile voice).

When compared with broadcasting, telecommunications makes up a modest but rapidly growing part of the global communications landscape (3.3% of the sum in 2007, up from 0.07% in 1986). Although there are only 8% more broadcast devices in the world than telecommunication equipment (6.66 billion versus 6.15 billion in 2007), the average broadcasting device communicates 27 times more information per day than the average telecommunications gadget. This result might be unexpected, especially considering the omnipresence of the Internet, but it can be understood when considering that an average Internet subscription effectively uses its full bandwidth for only around 9 min per day (during an average daily session of 1 hour and 36 min).

Computation. From a theoretical standpoint, a "computation" is the repeated transmission of information through space (communication) and time (storage), guided by an algorithmic procedure (31). The problem is that the applied algorithmic procedure influences the overall performance of a computer, both in terms of hardware design and in terms of the contributions of software.
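The telecommunication figures quoted above can be sanity-checked with simple arithmetic. The following sketch (illustrative only, not part of the original analysis; all input values are taken directly from the text) reproduces the broadband growth factor, the broadcast-versus-telecom device surplus, and the effective bandwidth utilization of an Internet subscription:

```python
# Sanity check of the figures quoted in the text (illustrative, not from the study).

broadband_2000_eb = 2.2   # optimally compressed exabytes, 2000
broadband_2007_eb = 65.0  # optimally compressed exabytes, 2007
factor = broadband_2007_eb / broadband_2000_eb  # ~29.5, reported as "a factor of 29"

broadcast_devices_b = 6.66  # billions of broadcast devices, 2007
telecom_devices_b = 6.15    # billions of telecom devices, 2007
device_surplus = broadcast_devices_b / telecom_devices_b - 1  # ~0.083, i.e. ~8% more

full_bandwidth_min = 9  # minutes per day at full bandwidth
session_min = 96        # average daily session: 1 h 36 min
utilization = full_bandwidth_min / session_min  # ~0.094, under 10% of session time

print(f"growth factor: {factor:.1f}")
print(f"device surplus: {100 * device_surplus:.1f}%")
print(f"bandwidth utilization: {100 * utilization:.1f}%")
```

The low utilization figure is what reconciles the near-parity in device counts with the 27-fold gap in per-device information flow.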
As a result, the theoretical, methodological, and statistical bases for our estimates for computation are less solid than those for storage and communication. In contrast to Shannon's bit (29, 30), there is no generally accepted theory that provides an ultimate performance measure for computers. There are several ways to measure computational hardware performance; we chose MIPS as our metric, a choice imposed on us by the reality of available statistics. Regarding the contributions of software, it would theoretically be possible to normalize the resulting hardware capacity for algorithmic efficiency (such as measured with O-notation) (32). This would recognize the constant progress of algorithms, which continuously make more efficient use of existing hardware. However, weighting the contribution of each algorithm would require statistics on the respective execution intensities of diverse algorithms on different computational devices, and we are not aware of such statistics. As a result of these limitations, our estimates refer to the installed hardware capacity of computers.
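To see why normalizing for algorithmic efficiency would matter, consider a toy example (our own illustration, not from the study, with hypothetical function names): the same hardware performs vastly more useful work when running an O(n log n) algorithm such as merge sort than an O(n^2) algorithm such as insertion sort, so installed MIPS alone understates the growth in effective computational capacity.

```python
import math

# Toy illustration: effective speedup from algorithmic progress alone,
# holding hardware (operations per second) constant.

def ops_quadratic(n: int) -> float:
    """Rough operation count for an O(n^2) algorithm, e.g. insertion sort."""
    return float(n) * n

def ops_linearithmic(n: int) -> float:
    """Rough operation count for an O(n log n) algorithm, e.g. merge sort."""
    return n * math.log2(n)

n = 1_000_000  # sorting one million items
speedup = ops_quadratic(n) / ops_linearithmic(n)
print(f"Effective speedup from the better algorithm: ~{speedup:,.0f}x")  # ~50,000x
```

Folding such factors into a global estimate would require knowing how often each algorithm actually runs on each class of device, which is precisely the statistic the text notes is unavailable.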