The Future Is Discontinuous

Technology advances through engineering and scientific effort, but not in the continuous fashion that simple scaling laws, such as Moore's Law, suggest. Even when work to develop a technology proceeds continuously, an advanced technology is actually introduced only when it is mature enough and the need for it is great enough.

Since 2000, Coughlin Associates has been tracking the development of hard disk drive (HDD) technology on a quarterly basis. The chart below shows quarter-by-quarter announcements of areal density in HDD products over time, compared with annual areal density percentage increases (on a vertical log scale). As can be seen, the introduction of higher-areal-density hard disk drives is very discontinuous. For instance, in the early 2000s there were significant increases in HDD areal density in roughly a couple of quarters each year.

Starting in 2011, the rate of areal density increase was very slight until major increases began in Q2 2015 and continued through Q3 2015, for an overall increase of over 60% in announced areal density. Although this chart describes the development of HDD technology, the pattern probably holds for most engineering and scientific developments: progress is basically discontinuous.
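To make the arithmetic behind such figures concrete, here is a minimal sketch in Python. The areal density values are invented for illustration, not Coughlin Associates' actual data, though they are chosen so the overall jump echoes the over-60% figure above.

```python
# Minimal sketch of the growth arithmetic; the areal densities (Gbit per
# square inch) below are invented for illustration, not real announcements.
quarterly_density = [
    ("2015-Q1", 850),   # hypothetical plateau value before the jumps
    ("2015-Q2", 1100),  # hypothetical first jump
    ("2015-Q3", 1380),  # hypothetical second jump
]

first, last = quarterly_density[0][1], quarterly_density[-1][1]
print(f"Overall increase: {(last / first - 1) * 100:.0f}%")  # ~62% with these numbers

# Quarter-over-quarter changes show where the discontinuous steps land:
for (q0, d0), (q1, d1) in zip(quarterly_density, quarterly_density[1:]):
    print(f"{q0} to {q1}: {(d1 / d0 - 1) * 100:+.0f}%")
```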

In a sense, scaling expectations such as those set by Moore's Law mislead the planners of technology introductions, since technology mostly proceeds in fits and starts. The engineering effort required to deliver the expected scaling is getting harder. These difficulties are leading developers of processing, communication, memory and digital storage technology to look for new ways to increase our ability to create, move and store our growing knowledge of the world around us and improve the human condition.
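For contrast, the smooth expectation that can mislead planners amounts to a fixed doubling cadence. A sketch, assuming the commonly quoted two-year doubling period and a hypothetical starting density:

```python
# Smooth scaling expectation: density doubles every `doubling_years` years.
# Two years is the commonly quoted Moore's-law cadence; the starting density
# is hypothetical. Real announcements, per the chart above, arrive in
# irregular steps rather than along this smooth curve.
def projected_density(d0: float, years: float, doubling_years: float = 2.0) -> float:
    """Density a planner would project from a smooth exponential scaling law."""
    return d0 * 2 ** (years / doubling_years)

for year in range(0, 7, 2):
    print(f"year {year}: {projected_density(850.0, year):,.0f} Gbit/in^2")
```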

At the 2016 IEEE Rebooting Computing Conference, as well as the IEEE Technology Time Machine Conference that followed, experts on digital processing and memory talked about how to continue increasing the capability of our digital devices without the photolithographic line-width scaling that Moore's Law assumes.

Five basic ways to create new types of processing devices were discussed at the Rebooting Computing Conference. The first is "more Moore," that is, continuing today's scaling approaches, including more parallel processing. Another approach, neuromorphic computing, uses phenomena that emulate the processes of the brain to do computation. There are also methods that give better and faster approximations to problems rather than exact solutions (often called adiabatic computing). Quantum computing is another path that may transform the way computer systems are designed. Finally, participants discussed approaches that don't follow the von Neumann computer architecture.

As pointed out by many conference participants, especially Rebooting Computing co-chair Tom Conte of Georgia Tech, we are at a turning point in computer design, where new approaches can be explored and the next major paradigm, one that will guide the next few decades of data processing, movement and storage, will be created.

Topics discussed included the use of optical signals to move data and superconducting Josephson junction devices for quantum computing or neuromorphic processing. There was also talk about the autonomous creation of systems as needed from fundamental components, as well as computer technologies that bring processing to the stored content rather than moving content to processing units.

As an example of the sort of imaginative material presented at these conferences, Jeff Shainline and colleagues from NIST talked about using superconductors and light to create large-scale neuromorphic computing. In their presentation they described superconducting nodes that generate spiked pulses, similar to those created in the nerves of the brain, to process data.

All of these superconducting nodes are then connected by photonic signals that allow them to communicate with each other and carry out functions. Low-power superconducting detectors are used to detect these spiked pulses. The paper proposed increasing the number of connected nodes to emulate the connectivity between synapses in the human brain. The result would be a neuromorphic computing device about the size of a Volkswagen Beetle that uses about 20 W of power (similar to the power requirements of the human brain).
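The spiking behavior such nodes emulate can be illustrated with a textbook leaky integrate-and-fire model. This is a generic software sketch with illustrative constants, not a model of NIST's superconducting hardware:

```python
# A textbook leaky integrate-and-fire neuron, as a generic illustration of
# spiking computation. All constants are illustrative; this is not a model
# of the NIST superconducting design.
def lif_spikes(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Integrate input over time; emit a spike (1) whenever the membrane
    potential crosses the threshold, then reset, like a nerve impulse."""
    potential, spikes = 0.0, []
    for i in input_current:
        potential = leak * potential + i  # leaky integration of the input
        if potential >= threshold:
            spikes.append(1)
            potential = reset             # fire, then reset the potential
        else:
            spikes.append(0)
    return spikes

# A steady drive produces a regular spike train; information is carried in
# the timing and rate of the spikes rather than in continuous voltages.
print(lif_spikes([0.3] * 12))  # -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1]
```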

The future of technology is discontinuous. It will be driven by major changes in computer architectures over the next few years. These changes will impact processing and memory, as well as the basic way computers, and the electronic devices that rely on them, operate. Perhaps it is time to say: "Death to scaling; throw off the chains of Moore's Law and embrace the new paradigms of computing."
