For the last four decades, the electronics industry has evolved according to what is known as Moore's Law. During his time at Intel, co-founder Gordon Moore observed that the number of transistors the industry could fit on a processor chip doubles roughly every 24 months, which is often restated as computing power doubling every two years. And something like that really has been happening: computers and phones that seemed powerful a few years ago already look obsolete next to the latest models, while manufacturers keep introducing microprocessors capable of performing ever more operations per unit of time.
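The doubling described above compounds quickly. A minimal sketch, using hypothetical round numbers rather than real Intel figures:

```python
# Toy illustration of Moore's law: transistor count doubling every 24 months.
# The starting count and years below are hypothetical, not historical data.

def transistor_count(start_count: int, start_year: int, year: int) -> int:
    """Project a transistor count assuming one doubling every 2 years."""
    doublings = (year - start_year) // 2
    return start_count * 2 ** doublings

# Two decades means ten doublings, i.e. a factor of 1024.
print(transistor_count(2_000, 1971, 1991))  # -> 2048000
```

The point of the sketch is simply that exponential growth turns a modest chip into a thousand-fold denser one within twenty years.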
Transistors, the tiny semiconductor switches at the heart of all modern technology, get smaller and more energy-efficient every year. But there has to be a limit to how far they can shrink, right? Yes, and we are getting very close to it.
A modern transistor of the kind described here (a floating-gate design, like that used in flash memory) consists of two semiconductor regions with a surplus of electrons and, between them, a region with a deficiency of electrons. Above them sit a control gate and a floating gate, insulated by a dielectric. When a voltage is applied to the control gate, some electrons move onto the floating gate via the tunnel effect. The floating gate, negatively charged by the electrons "planted" on it, impedes the current flow through the transistor; in this state the transistor reads as "1". The size of the control gate plays a big role here: if it is smaller than about 5 nm (nanometers), the same tunnel effect lets electrons leak back off the floating gate, and the transistor stops working reliably.
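The behaviour above can be caricatured in a few lines of code. This is a toy logic model, not device physics; only the ~5 nm threshold comes from the text:

```python
# Toy model of the floating-gate behaviour described above. The 5 nm figure
# comes from the text; the boolean logic is a deliberate simplification.

def reads_as_one(gate_length_nm: float, charged: bool) -> bool:
    """A charged floating gate blocks current (a logical '1') only if the
    control gate is large enough to prevent tunnelling leakage."""
    if gate_length_nm < 5:  # below ~5 nm the stored charge leaks away
        return False
    return charged

print(reads_as_one(20, charged=True))  # -> True: charge held, reads '1'
print(reads_as_one(4, charged=True))   # -> False: leakage, the state is lost
```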
Today's processors use transistors with control gates of about 20 nm, and scientists are looking for ways to shrink them toward 5 nm.
Perhaps, with more efficient materials and dielectrics, scientists will be able to keep increasing processor performance for some time, but sooner or later electronic computing systems will hit their ceiling. Even now, specialists know how performance could be raised further: by computing not with a flow of electrons but with light, a flow of photons. The main advantage of photons is that the very high frequencies of optical waves allow a high degree of parallelism in data transfer and processing. In addition, photons propagate at nearly the speed of light, faster than signals in a conventional wire, where the resistance of the material dissipates energy along the way. And, of course, such a system would not be hindered by strong electromagnetic fields.
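To put the "high frequency of optical waves" in perspective, one can compare the carrier frequency of light with a typical electronic clock rate. The wavelength and clock figures below are illustrative assumptions, not values from the text:

```python
# Rough comparison of an optical carrier frequency with an electronic clock.
# The 1550 nm wavelength (telecom band) and ~5 GHz clock are assumptions.

C = 299_792_458  # speed of light in vacuum, m/s

def carrier_frequency_thz(wavelength_nm: float) -> float:
    """Frequency of a light wave of the given wavelength, in THz."""
    return C / (wavelength_nm * 1e-9) / 1e12

optical = carrier_frequency_thz(1550)  # about 193 THz
print(f"{optical:.0f} THz optical carrier vs a ~5 GHz electronic clock")
```

A ~193 THz carrier is tens of thousands of times faster than a gigahertz-range clock, which is why optics promises so much headroom for parallel signalling.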
The idea of an "optical" processor was born quite a long time ago, in the 1980s, when the industry did not yet suspect that electronic processors would approach their performance limits within a couple of decades. At the time, the optical processor was seen as a curiosity, merely an interesting alternative to the familiar electronic one. Now, however, it is becoming clear that optical computers may well be the future.
In such a computer, calculations would be performed with photons generated by miniature lasers and routed across the chip by a system of reflectors. To preserve the familiar logic of computation in an optical processor, scientists need to recreate its fundamental element: an optical transistor. Research groups in various countries have proposed their own versions of optical transistors that change their properties when exposed to light. So far, however, many of them require too high an intensity of the incoming light signal, which drives up energy costs. In addition, the components of an optical processor have not yet been made small enough to match the compactness of silicon chips. The trouble is that any elementary particle is also a wave and travels through space as one: the minimum size of a waveguide for a photon is about 600 nm, which is, of course, a lot. Scientists propose to solve this problem with so-called plasmonic signals, coupling the oscillation of a light wave to the oscillation of electrons on a metal surface, which allows the optical system to be made dozens of times smaller while keeping its advantages. The search for the best material to realize this idea is now underway.
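The ~600 nm waveguide figure comes from the diffraction limit: a dielectric waveguide cannot confine light much below half its wavelength in the material. A back-of-the-envelope sketch, where the refractive index is an assumption (silica-like glass) and not a value from the text:

```python
# Back-of-the-envelope diffraction limit for a dielectric waveguide:
# roughly half the wavelength in the material, lambda / (2 * n).
# The refractive index below is an assumption (silica-like), not from the text.

def min_waveguide_nm(wavelength_nm: float, refractive_index: float) -> float:
    """Approximate minimum transverse size of a waveguide, in nm."""
    return wavelength_nm / (2 * refractive_index)

# For 1550 nm telecom light in a silica-like medium (n ~ 1.45) this gives
# several hundred nanometres, the same order as the figure quoted above.
print(round(min_waveguide_nm(1550, 1.45)))  # -> 534
```

Plasmonic structures sidestep this limit precisely because the signal is carried by electron oscillations on a metal surface rather than by a freely propagating light wave.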
So the creation of an optical computer is clearly an extremely difficult and expensive task. Nevertheless, humanity will certainly need computing systems capable of processing huge amounts of information in a short time, and the optical computer is the leading candidate.