Ever since Charles Babbage began designing his Difference Engine in 1822, people worldwide have been fascinated by computing machines. It was not until 1991, a massive gap of 169 years, that London's Science Museum built a complete, working Difference Engine from Babbage's original plans. And now, in today's ultra-digital age, we might just have made the most advanced leap yet, involving light signals and computers!
The first programmable, electronic digital computer, the Colossus, was built in 1943 by engineer Tommy Flowers's team at Bletchley Park, where Alan Turing carried out his pioneering codebreaking and theoretical work. Even after so long, computers are too complicated for most people to understand. While using computers has never been easier than today, the technology behind them has grown far more complex.
Still, can you imagine a world without computers? Thanks to pioneers like Babbage and Turing, who invented and refined computing machines, our lives are so much easier today.
Essential parts of the computer that we know and appreciate today include the motherboard, processor, memory and storage. Imagine sorting through heaps of paperwork to retrieve one tiny piece of information back in the day.
The complexity of computers astonishes us. The simple switching operation of transistors enables a computer to complete enormous tasks, like downloading a movie or playing a game. Each transistor on a chip switches between two binary states, 0 and 1.
Using light in computers to take the load off traditional ones - Optical Computers
In 1965, Intel co-founder Gordon Moore predicted that the number of transistors in a dense integrated circuit (IC) would double regularly; in 1975 he revised the rate to a doubling every two years. Although this was just a prediction, it has held remarkably well ever since and became known as 'Moore's Law'. Even so, traditional electronic computers have struggled to keep up with modern computing needs.
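To get a feel for what a doubling every two years implies, here is a small Python sketch. The 1975 baseline count of 6,000 transistors is an illustrative assumption, not a historical figure:

```python
# Toy illustration of Moore's Law: transistor counts doubling every two years.
# The 1975 baseline of 6,000 transistors is an assumed, illustrative value.

def transistor_count(year, base_year=1975, base_count=6_000):
    """Project a transistor count assuming one doubling every two years."""
    doublings = (year - base_year) // 2
    return base_count * 2 ** doublings

for year in (1975, 1985, 1995, 2005):
    print(year, transistor_count(year))
```

Exponential growth like this is exactly why chip complexity has exploded: fifteen doublings turn 6,000 transistors into nearly 200 million.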
Researchers found a way to use light in computers to take the load off traditional machines. These computers came to be known as optical computers. An optical computer uses light rather than electricity to transmit, store, and process data. Lasers or light-emitting diodes produce the photons, which have shown higher bandwidth than electrons.
Revolutionising analogue computing by reducing the number of light signals
A new study by researchers from the University of Cambridge and the Skolkovo Institute of Science and Technology proposes revolutionising analogue computing by drastically reducing the number of light signals needed, while speeding up the search for the best mathematical solutions on ultra-fast optical computers.
They note that because photons have no mass, they travel much faster than electrons, enabling a super-fast and energy-efficient optical computer.
What actually is an Optical Computer?
An optical computer uses continuous light signals: it computes by adding two light waves coming from different sources and projecting the result onto a 0 or 1 state. Statistical workloads, such as those used in artificial-intelligence algorithms, are perfectly suited to optical computing.
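The add-then-project idea can be sketched in a few lines of Python. This is my own minimal illustration of the general principle, not the authors' scheme: two coherent waves are represented as complex amplitudes, superposed, and the combined phase is read out as a bit.

```python
import numpy as np

# Minimal sketch of analogue optical computing: superpose two coherent
# light waves (complex amplitudes), then project the result onto a
# binary 0/1 state by reading out the combined phase.

def add_waves(phase_a, phase_b, amplitude=1.0):
    """Superpose two unit-amplitude waves given their phases."""
    return amplitude * np.exp(1j * phase_a) + amplitude * np.exp(1j * phase_b)

def project_to_bit(wave):
    """Phase near 0 reads out as bit 0; phase near pi reads out as bit 1."""
    return 0 if np.cos(np.angle(wave)) >= 0 else 1

in_phase = add_waves(0.0, 0.0)           # constructive interference near phase 0
near_pi = add_waves(np.pi, 0.9 * np.pi)  # both phases near pi
print(project_to_bit(in_phase), project_to_bit(near_pi))  # -> 0 1
```

The projection step is the crucial (and, as the study argues, limiting) part: the analogue wave must be forced back onto discrete values.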
It is also said that this form of computing can solve complex network-optimisation problems that would take traditional computers centuries. However, in real life, continually changing circumstances keep altering the algorithms' values while they interact multiplicatively. Here the conventional approach to optical computing fails.
Multiplying the light waves for Optical Computers
Professor Natalia Berloff of the University of Cambridge and PhD student Nikita Stroev of the Skolkovo Institute of Science and Technology found that multiplying the light waves, instead of adding them, establishes a different kind of connection between them.
For this study, they used quasi-particles called polaritons, which are half light and half matter. In a broader class of optical systems, such as light pulses in a fibre, tiny pulses or blobs of super-fast-moving polaritons can be created and overlapped in a nonlinear way.
Stroev said that the critical ingredient was how to couple the pulses with each other. When the coupling and light intensity are just right, the light multiplies, affecting the phases of the individual pulses. This is how they can use light to solve nonlinear problems.
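A simple way to see why multiplication differs from addition is that multiplying two complex wave amplitudes adds their phases, whereas adding amplitudes makes them interfere. The sketch below is my own illustration of this basic wave arithmetic, not the authors' polariton model:

```python
import numpy as np

# Multiplying two complex wave amplitudes ADDS their phases
# (pi/3 + pi/6 = pi/2), while adding amplitudes interferes them.
# This phase-additive, multiplicative coupling is the kind of
# nonlinear interaction between pulses described in the study.

a = np.exp(1j * np.pi / 3)   # wave with phase pi/3
b = np.exp(1j * np.pi / 6)   # wave with phase pi/6

product = a * b              # phase of product: pi/3 + pi/6 = pi/2
summed = a + b               # superposition: phases interfere instead

print(round(np.angle(product) / np.pi, 3))  # -> 0.5, i.e. phase pi/2
```

Because the phases combine multiplicatively, each pulse's phase directly reshapes the others, which is what lets the system represent nonlinear interactions.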
They were surprised to find that there was no longer any need to project the light phases onto 0 and 1 states to solve binary problems. Instead, thanks to the multiplication of light signals, the system naturally drives these states toward the minimum-energy configuration. By contrast, previous optical machines required resonant excitation that fixed the phases to binary values from outside.
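The idea of phases settling into a minimum-energy configuration on their own can be illustrated with a toy model. This is my own sketch, not the paper's polariton dynamics: two coupled phases relax by gradient descent on an Ising-like energy, and the stable minimum sits at a binary phase difference of pi, with no external projection onto 0/1 required.

```python
import numpy as np

# Toy sketch: two phases theta_i relaxing by gradient descent on an
# Ising-like energy E = -(1/2) * sum_ij J_ij * cos(theta_i - theta_j).
# With antiferromagnetic coupling (J = -1), the minimum-energy state
# has the phases exactly pi apart, i.e. opposite binary values.

rng = np.random.default_rng(0)
J = np.array([[0.0, -1.0],
              [-1.0, 0.0]])            # couple the two phases to prefer opposition
theta = rng.uniform(0, 2 * np.pi, 2)   # random starting phases

for _ in range(2000):
    # dE/dtheta_i = sum_j J_ij * sin(theta_i - theta_j)
    grad = np.sum(J * np.sin(theta[:, None] - theta[None, :]), axis=1)
    theta -= 0.05 * grad

diff = (theta[0] - theta[1]) % (2 * np.pi)
print(round(diff / np.pi, 2))  # -> 1.0, the phases settle pi apart
```

The dynamics find the binary answer by themselves, which mirrors (in a very simplified way) why the multiplicative system no longer needs phases fixed from outside.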