What law states that computer chip performance per dollar doubles every 18 months?

Moore's Law, named for Intel co-founder Gordon Moore, describes expected advances in computing power and data storage over time. In reality, it covers much more than data storage alone. Read this chapter and attempt the exercises to gain a broader understanding of the importance and costs associated with Information Systems.

Introduction

Some Definitions

This phenomenon of "faster, cheaper" computing is often referred to as Moore's Law, after Intel cofounder Gordon Moore. Moore didn't show up one day, stance wide, hands on hips, and declare "behold my law," but he did write a four-page paper for Electronics Magazine in which he described how the chip-making process enabled more powerful chips to be manufactured at cheaper prices.

Moore's friend, legendary chip entrepreneur and CalTech professor Carver Mead, later coined the "Moore's Law" moniker. That name sounded snappy, plus as one of the founders of Intel, Moore had enough geek cred for the name to stick. Moore's original paper offered language only a chip designer would love, so we'll rely on the more popular definition: chip performance per dollar doubles every eighteen months. (Moore's original paper stated that transistors per chip, a proxy for power, would double every two years, but many sources today cite the eighteen-month figure, so we'll stick with that; either way, we're still talking about ridiculously accelerating power and plummeting costs.)
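The popular definition above is easy to turn into arithmetic: improvement compounds, doubling once per period. A minimal Python sketch of that compounding (the function name and the 18-month period are just illustrations of the definition, not anything from Moore's paper):

```python
def performance_multiplier(months, doubling_period=18):
    """How many times better chip performance per dollar gets after
    `months`, if it doubles every `doubling_period` months."""
    return 2 ** (months / doubling_period)

# After eighteen months, chips are twice as fast for the same price:
print(performance_multiplier(18))      # 2.0
# After three years, four times as fast:
print(performance_multiplier(36))      # 4.0
# Equivalently, a chip as fast as today's should cost half as much:
print(1 / performance_multiplier(18))  # 0.5
```

The exponent is what makes the trend so dramatic: each period multiplies, rather than adds to, everything that came before.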

Moore's Law applies to chips: broadly speaking, to processors, or the electronic components made out of silicon (although other materials are increasingly being used). The microprocessor is the brain of a computing device. It's the part of the computer that executes the instructions of a computer program, allowing it to run a Web browser, word processor, video game, or virus. For processors, Moore's Law means that next-generation chips should be twice as fast in eighteen months but cost the same as today's models (or, from another perspective, in a year and a half, chips that are the same speed as today's models should be available for half the price).

Random-access memory (RAM) is chip-based memory. The RAM inside your personal computer is volatile memory, meaning that when the power goes out, anything that wasn't saved to nonvolatile memory (i.e., a more permanent storage medium like a hard disk or flash memory) is lost. Think of RAM as temporary storage that provides fast access for executing computer programs and files. When you "load" or "launch" a program, it usually moves from your hard drive to those RAM chips, where it can be more quickly executed by the processor.

Cameras, MP3 players, USB drives, and mobile phones often use flash memory (sometimes called flash RAM). It's not as fast as the RAM used in most traditional PCs, but it holds data even when the power is off (so flash memory is also nonvolatile memory). You can think of flash memory as the chip-based equivalent of a hard drive. In fact, flash memory prices are falling so rapidly that several manufacturers, including Apple and the One Laptop per Child initiative (see the "Tech for the Poor" sidebar later in this section), have begun offering chip-based, nonvolatile memory as an alternative to laptop hard drives. The big advantage? Chips are solid state electronics (meaning no moving parts), so they're less likely to fail, and they draw less power. The solid state advantage also means that chip-based MP3 players like the iPod nano make better jogging companions than hard drive players, which can skip if jostled. For RAM chips and flash memory, Moore's Law means that in eighteen months you'll pay the same price as today for twice as much storage.

Computer chips are sometimes also referred to as semiconductors, so if someone refers to the semiconductor industry, they're talking about the chip business. Semiconductor materials, like the silicon used inside most computer chips, are capable of enabling as well as inhibiting the flow of electricity. These properties enable chips to perform math or store data.

Strictly speaking, Moore's Law does not apply to other technology components, but other computing components are also seeing exponential improvement in their price/performance curves. Data storage doubles roughly every twelve months. Networking speed is on a tear, too: with an equipment change at the ends of the cables, the amount of data that can be squirted over an optical fiber line can double every nine months. Fiber-optic lines are glass or plastic data transmission cables that carry light; they offer higher transmission speeds over longer distances than copper cables that carry electricity. These numbers are rough approximations and shouldn't be expected to hold precisely over time, but they are useful as rough guides regarding future computing price/performance trends. Despite any fluctuation, it's clear that the price/performance curve for many technologies is exponential, offering astonishing improvement over time.


Is Moore's Law still true?

The simple answer is that Moore's Law is not dead. While it's true that chip densities are no longer doubling every two years (so Moore's Law no longer holds by its strictest definition), it is still delivering exponential improvements, albeit at a slower pace.

What is Moore's Law in simple terms?

Moore's Law refers to the observation made by Gordon Moore in 1965 that the number of transistors in a dense integrated circuit (IC) doubles about every two years. It isn't really a law in the legal sense, or even a proven theory in the scientific sense (such as E = mc²).

What happens after Moore's Law ends?

Transistors on CPUs have become so small that they are now just a few atoms in size. Challenges of power and heat have made recent performance gains marginal, while shrinking transistors any further will take heroic efforts that are increasingly complex and audaciously expensive.

What is Moore's Law and why is it important?

In the 1990s, Moore's Law became widely associated with the claim that computing power at fixed cost is doubling every 18 months. Moore's Law has mainly been used to highlight the rapid change in information processing technologies.