The Pulse of Technology – Keeping Pace With Continuous Change – November 1998
Gordon Moore, the co-founder of Intel Corporation, first postulated the now-famous Moore's Law in the nineteen-seventies. Moore's Law states that the processing power of silicon chips will double every twenty-four months, while the price of those chips will halve over the same period. The law held remarkably steady for more than twenty years, but we are now approaching a time when this seemingly immutable law is being outrun. New silicon chips are doubling in power every twelve to eighteen months, while prices are halving in even less time. What has happened to the underlying technology that drives these silicon chips, and what market forces have dictated such rapidly declining prices?
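As a rough illustration of the compounding described above, the following sketch projects relative processing power and price over time for a given doubling period. The function name and parameters are illustrative, not drawn from any published model:

```python
def moores_law_projection(years, doubling_months=24):
    """Project relative chip power and price under the doubling/halving
    cycle described above. Illustrative only: assumes power doubles and
    price halves once per cycle, compounding smoothly."""
    cycles = (years * 12) / doubling_months
    power = 2 ** cycles    # processing power doubles each cycle
    price = 0.5 ** cycles  # price halves each cycle
    return power, price

# Over ten years at the classic 24-month cycle: five doublings,
# so 32x the power at about 3% of the original price.
power, price = moores_law_projection(10)
print(power, price)  # 32.0 0.03125
```

Shortening the cycle to the twelve-to-eighteen months the article observes makes the curve dramatically steeper: at an 18-month cycle, ten years yields roughly 100x the power rather than 32x.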
Several factors drive this inexorable increase in processing power, and the same factors exert downward pressure on prices. Let us examine them in the context of hardware developments, software developments, and the rise of the Internet – the ubiquitous network that many people predicted would be necessary to make computers universally accepted in daily life.
When Intel was founded by ex-Fairchild engineers, the mid-range computer – exemplified by the DEC PDP series, Data General machines, the IBM System/32 and System/34, and the first HP boxes – was the emerging standard in the computer industry. Machines of this period were often viewed as departmental systems, used for quick, hands-on computing applications free from the control of the centralized (i.e., mainframe) I.T. staffs of the time.
The idea of a small, nimble machine that could be programmed and maintained by local departments was extremely appealing at the time. Because of the diversity of manufacturers and proprietary operating systems, standards were largely undeveloped, and competing platforms jockeyed for position. Migration from one machine to another was largely unheard of, owing to the high cost of converting data and application programs, not to mention the cost of retraining I.T. staff.
The acceptance of UNIX as an open standard marks a watershed in the history of computing. For the first time, applications could be written to be cross-platform – that is, capable of running on multiple hardware platforms. This newfound freedom allowed programmers to write a single application that ran on many machines. The importance to hardware developers was simple: they could spend more time refining the underlying silicon and less time developing proprietary hardware systems. It is this process of refinement that has driven the decline in the cost of silicon that we know today.
The advent of the personal computer in the late nineteen-seventies and early nineteen-eighties marked another watershed in the development of hardware. Where mid-range computers allowed entire departments to break free of the constraints of mainframe computing, the PC brought computing to the thousands of business users who wanted to perform analysis and data gathering at their own convenience, not the I.T. department's. For the first time, individuals could