Monday will mark the 50th anniversary of the patent for a silicon integrated circuit, awarded to Robert Noyce, an electronics engineer who was later nicknamed "the mayor of Silicon Valley".

Decades later we are still living in the revolution that began almost immediately, but there are growing signs that Moore's Law – the famously accurate prediction that the number of transistors on a chip will double every two years – is ready for retirement.

"The rate of progress is finally beginning to show signs of slowing down," says Professor Steve Furber of the University of Manchester.

A British microchip legend, Prof. Furber designed the first ARM processor for Acorn Computers in the mid-1980s. The silicon successors to his invention, developed by ARM Holdings in Cambridge, can now be found in most smartphones and in the onrushing wave of tablet computers.

But while for 50 years the microchip industry has strived to build faster silicon containing more transistors, it is increasingly coming up against a seemingly insurmountable obstacle: the laws of physics.

The most advanced processors now contain transistors that are just 100 to 150 atoms across. Designing a controllable microchip with such precision takes the equivalent of hundreds of man-years, and many firms are finding that the financial risks of pushing further are not worth the return. "The economics are beginning to bite and the great majority of people are pretty keen now not to be on the cutting edge," explains Prof. Furber.

"There are about 10 years to go before we reach the absolute limit. People have been saying that for 30 years, but this time I think it's probably right."

This is not to say the end of Moore's Law will mean there will not be more powerful computers. Prof.
Furber, along with the likes of Apple and Intel, sees the future in parallel computing, where problems are broken up into smaller ones which are solved simultaneously by microchips with multiple processing cores.

Though most home computers and consoles are now capable of parallel processing, software developers have been slow to adapt because coding in parallel is much more difficult. Google's search engine relies heavily on parallel processing by many thousands of networked computers to deliver relevant information from across the entire web almost instantly.

Among the few home PC programs to really take advantage is Crysis 2, a first-person shooter game that can run on up to eight cores. Its incredible graphics have made it a benchmark of PC performance.

But Prof. Furber says that developers will have to tackle the difficulties of parallel programming as hardware firms come to rely increasingly on multicore processors, rather than smaller transistors, to improve performance. The iPad 2 contains a dual-core ARM A5, which is also rumoured to power the next iPhone.

The increased emphasis on software does not mean that there will be no more innovation in silicon, even after 50 years. Plessey, a British engineering firm that built the first production televisions, developed the world's first model of an integrated circuit in 1957, before Robert Noyce or his co-inventor Jack Kilby. Thanks to Britain's complacent attitude to patenting at the time, the innovation went unrewarded and the Plessey name eventually disappeared in 2000 following a typical story of foreign ownership and decline.

The firm is now on the comeback trail, however, after a management buyout, and is working on applications for silicon microchips beyond number-crunching. It specialises in sensors, such as the silicon that captures images in digital cameras, and is developing an electric potential sensor with the University of Sussex that can detect a heartbeat up to a metre away.
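The parallel-computing idea described above – breaking a big problem into smaller pieces that separate processor cores solve at the same time, then combining the partial answers – can be sketched in a few lines. This is a generic illustration, not code from the article; the function names, the choice of summing a range of integers, and the worker count are all hypothetical.

```python
# Minimal sketch of split / solve-in-parallel / combine, assuming a
# machine with multiple cores. Each chunk of the range is summed by a
# separate worker process; the partial sums are combined at the end.
from multiprocessing import Pool

def partial_sum(bounds):
    """One worker's share of the job: sum the integers in [start, stop)."""
    start, stop = bounds
    return sum(range(start, stop))

def parallel_sum(n, workers=4):
    """Sum 0..n-1 by splitting the range into `workers` chunks."""
    step = n // workers
    # Last chunk absorbs any remainder so the chunks cover 0..n exactly.
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:           # one process per chunk
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(1_000_000))
```

The hard part Prof. Furber alludes to is not shown here: this example works only because the chunks are completely independent. Real programs usually have pieces that share data, and coordinating that sharing correctly is what makes parallel programming so much more difficult than sequential coding.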