Gordon Moore, the co-founder of Fairchild Semiconductor and Intel, observed in 1965 that the number of transistors in a dense integrated circuit was roughly doubling every year, a forecast he revised in 1975 to a doubling about every two years (the often-quoted 18-month figure is a later popularization). It is not a physical law but a remarkable observation of a historical trend projected forward. It remained accurate for several decades and was the basis for the expression 'exponential growth' in technology businesses.
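To make the compounding concrete, here is a minimal sketch (not from the original article) of the doubling arithmetic. It assumes the two-year doubling period from Moore's 1975 revision and uses the Intel 4004's roughly 2,300 transistors (1971) as an illustrative starting point.

```python
def transistors(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under exponential doubling.

    Assumes a fixed doubling period (two years by default, per Moore's
    1975 revision); real scaling has of course been lumpier than this.
    """
    return initial * 2 ** (years / doubling_period)

# Starting from ~2,300 transistors (Intel 4004, 1971), 50 years of
# doubling every two years projects a count in the tens of billions,
# roughly the scale of today's largest chips.
print(f"{transistors(2_300, 50):,.0f}")
```

Twenty-five doublings over fifty years multiply the count by 2^25, about 33 million times, which is the essence of why the trend dwarfs ordinary linear improvement.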
This scaling down of transistor size has spawned revolutionary growth in computing, electronics, and society at large over the last 50 to 60 years. It has guided not only microprocessor trends but also memory capacities, sensors, and the pixel counts of digital cameras. Even the current wave of artificial intelligence and machine learning has ridden on this device-capacity curve.
Looking at it from the other side, if we subtract the effect of Moore's law from the past half-century, humankind has relatively little to show for its progress. That is to say, there would have been significantly less progress in chemistry, biology, bio-engineering, physics, mechanics, architecture, and the like if we discounted the outcomes that benefit from shrinking transistors and the correspondingly faster computers. In my opinion, this reduction in transistor size has been the equivalent of winning a lottery, one that has painted a rosy picture of overall human progress so far. Scientists will have to answer much more interesting questions when the growth slows and we reach the saturation point.