According to "Moore's Law," a statement made by Intel's co-founder Gordon E. Moore in a 1965 paper, the number of components that can be put onto an integrated circuit inexpensively should double every year. He believed that his statement would hold up for at least a decade but it has held true till today. This "law" can also be applied to other realms of the digital world not just processing speed but also memory capacity and screen and photo resolution. The "law," which was originally intended to be an observation and a forecast has become a goal for chip producers.
Recently, futurist Ray Kurzweil speculated that if transistor miniaturization continues at its current pace, transistors will be only a few atoms wide by 2019. Moore himself stated that the fundamental barrier to his observation comes when transistor size closes in on the size of atoms, but he also believed that as we run into seemingly insurmountable problems, solutions will appear. Some scientists have proposed using photons instead of electrons to carry information on circuits (optical computing), while others have proposed using qubits, which can hold more information than a single classical bit (quantum computing). These are the most popular theoretical answers to the miniaturization problem facing today's transistor circuits, and both are still in the theoretical stage of development. What are some other possible solutions to this problem using today's technology? Will we be stuck at a maximum processor speed until a new technology is developed? If so, how would that affect the world as we know it?
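As a rough illustration of the scaling claim above, here is a back-of-the-envelope sketch; the 45 nm starting node, the 0.2 nm atomic spacing, and the halving periods are assumptions made for this sketch, not figures from the post or from Kurzweil:

import math

# Estimate how long a steady feature-size halving schedule would take to reach
# "a few atoms wide." All numeric inputs below are illustrative assumptions.
def years_until_atomic_scale(feature_nm, halving_period_years, atom_nm=0.2, atoms_wide=3):
    """Years until features shrink to roughly a few atoms wide, given steady halving."""
    target_nm = atom_nm * atoms_wide              # "a few atoms wide" ~ 0.6 nm
    halvings = math.log2(feature_nm / target_nm)  # number of halvings needed
    return halvings * halving_period_years

# Starting from a hypothetical 2008-era 45 nm process, under two assumed shrink rates:
for period in (2, 4):
    print(f"halving every {period} years: ~{years_until_atomic_scale(45, period):.0f} years left")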
Sunday, July 27, 2008