Lasers, filters, and fiber add up to 26Tbps data transfers
The modern world revolves around fast, voluminous, and reliable communications. To accommodate the brave but not-so-new world of the interconnected, engineers have been working hard to cram ever larger numbers of bits down the pipes. Of course, for really high data rate connections, this means fiber optic cables and optically encoded data streams. Unfortunately, current encoding methods are very inefficient, so those big fat pipes are actually nearly empty.
More efficient encoding methods exist, of course, but they are difficult to apply in a light-based system. A recent Nature Photonics paper seems to point the way forward with a relatively simple encoding and decoding scheme.
We'll get to the new stuff in a minute, but let's begin with a look at why current optical systems are hitting the wall. Current systems use basic amplitude modulation, where data is transmitted by varying the intensity of the light from near zero to some maximum value. This is simple and effective, but it runs into several interrelated problems.
Bandwidth hurdles
First, you need to be able to blink the light fast enough. The switching process is electronic, so the modulation rate is limited by the speed of the electronics. The way around this is to run several modulators in parallel and then combine their data streams. You can do this in serial, where each data stream gets a time slot into which to drop a bit, or in parallel, with each data stream encoded onto a different frequency of light. It's also possible to do a combination of the two.
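To make the serial version concrete, here's a rough sketch (in Python, with made-up bit streams and an arbitrary choice of four tributaries) of how several slow streams share the time slots of one fast stream and get picked back apart at the far end.

```python
# Illustrative time-interleaving of four slow bit streams into one fast stream.
# The streams and their lengths are made up for the example.
import numpy as np

tributaries = [np.random.randint(0, 2, 8) for _ in range(4)]  # four slow streams

# Serial combining: tributary i owns every 4th time slot, starting at slot i.
fast_stream = np.empty(4 * 8, dtype=int)
for i, bits in enumerate(tributaries):
    fast_stream[i::4] = bits

# The receiver de-interleaves by picking its slots back out.
recovered = [fast_stream[i::4] for i in range(4)]
assert all(np.array_equal(a, b) for a, b in zip(tributaries, recovered))
```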
Let's start with a serial data stream. The higher the bit rate of our combined data stream, the shorter the light pulses have to be. With a fair bit of effort, it's possible to create pulses of light that are just five femtoseconds in duration. Being optimistic, that might give a data rate of up to (and I'm being ridiculous here) 100Tb/s. The gotcha is that short pulses of light misbehave in multiple ways.
An example: a short pulse of light is made up of a broad range of wavelengths, and once they are in the optical fiber, these different wavelengths travel at different speeds. So, over a certain distance, a pulse spreads out, and in the case of data, the ones and zeros bleed into each other. For a 100Tb/s data stream, the distance over which this bleeding erases the information is about 30cm. And that is just one of the problems facing engineers who try to increase the serial bit rate.
Multiplexing the data
In current systems, the way to increase bandwidth has been to run multiple data streams in parallel, a technique called wavelength division multiplexing. This takes advantage of the fact that light fields of different colors don't affect each other, provided the intensity of the light is low. So you keep the data rate of each stream low enough that its pulses survive over long distances and, at the end of the fiber, a set of filters separates the different data streams.
In principle, this system will work over the entire transparency range of the optical fiber. In practice, things are a bit different. First, you need lasers and amplifiers that can emit and amplify all the different wavelengths stuffed into your fiber. Second, because the fiber's losses are higher at shorter wavelengths, you have to amplify more often. Third... well, let's just say the list of problems is nearly endless. What this boils down to is that there is a limited amount of usable spectrum, and each channel needs a certain width (which grows as its data rate goes up), so the total data rate is capped.
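To get a feel for the numbers, here's a back-of-the-envelope sketch with illustrative figures (not the ones from the paper): a few terahertz of usable spectrum, a common 50GHz channel grid, and a modest per-channel rate.

```python
# Rough capacity estimate for a wavelength-division-multiplexed link.
# All numbers are illustrative assumptions, not taken from the paper.
usable_spectrum_hz = 4.4e12    # roughly the C-band of a standard telecom fiber
channel_spacing_hz = 50e9      # a common grid spacing
rate_per_channel_bps = 40e9    # an assumed per-channel rate for simple encoding

n_channels = int(usable_spectrum_hz // channel_spacing_hz)   # 88 channels
total_rate_tbps = n_channels * rate_per_channel_bps / 1e12   # ~3.5 Tb/s
print(n_channels, total_rate_tbps)
```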
The point here is that current encoding schemes are inefficient and use much more bandwidth than necessary, meaning that we waste a lot of the available spectrum. To overcome this limitation, researchers took technology developed by microwave communications engineers and applied it to optical systems.
Leveraging microwave techniques
What was just described is basically the equivalent of AM radio stations; microwave communications links are much more sophisticated. Remember that light has an amplitude, a frequency, and a phase, and all three can be used to help encode information. As an example, let's create four amplitude settings and four phase settings. There are 16 combinations of these amplitude and phase settings, meaning that four bits of information are transmitted in each time slot. We can still use a combination of different frequencies and encode separate data streams onto different frequencies, but now, instead of trying to keep them separate, we merge them to create something that looks like a randomly fluctuating light field.
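Here's a minimal sketch of that idea, assuming four evenly spaced amplitude levels and four phases (the actual levels used in the experiment will differ): 16 distinct combinations means each time slot carries log2(16) = 4 bits.

```python
# A 4-amplitude x 4-phase constellation: 16 symbols, so 4 bits per time slot.
# The specific levels below are assumptions for illustration only.
import numpy as np

amplitudes = [0.25, 0.5, 0.75, 1.0]
phases = [0, np.pi / 2, np.pi, 3 * np.pi / 2]

constellation = np.array([a * np.exp(1j * p) for a in amplitudes for p in phases])

def bits_to_symbol(bits):
    """Map four bits to one complex field value (an amplitude and a phase)."""
    index = int("".join(str(b) for b in bits), 2)
    return constellation[index]

print(len(constellation))            # 16
print(bits_to_symbol([1, 0, 1, 1]))  # one of the 16 possible field values
```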
What I have just described is called an inverse Fourier transform, which is performed on each time slot for the data. To get the information back, we need to perform the reverse operation, a Fourier transform, on each time slot. In the field of microwave engineering, this is not such a hard task, because everything is dealt with electronically, and you can use a bunch of tricks that involve mixing the received signal with local microwave sources.
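A minimal numerical sketch of that round trip, with a handful of made-up frequency channels (the real system performs this step optically rather than in software):

```python
# Merge per-frequency symbols into one time-slot waveform (inverse transform),
# then recover them with the forward transform, as a receiver would.
import numpy as np

n_channels = 8
rng = np.random.default_rng(0)

# One complex symbol (amplitude and phase) per frequency channel.
amplitudes = rng.choice([0.5, 1.0], n_channels)
phases = rng.choice([0, np.pi / 2, np.pi, 3 * np.pi / 2], n_channels)
symbols = amplitudes * np.exp(1j * phases)

waveform = np.fft.ifft(symbols)   # transmitter: merge channels into one waveform
recovered = np.fft.fft(waveform)  # receiver: separate the channels again

assert np.allclose(recovered, symbols)
```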
But in the optical regime, using local light sources has historically been incredibly difficult. You also don't want to perform the Fourier transform electronically, since that limits your total bit rate to the speed of your electronics. Instead, the goal of this research was to do everything in the optical domain, so that high bit rates are achievable. The researchers also took advantage of the fact that telecom laser sources are now quite stable, so it is not so difficult to have a local light source to mix with the received radiation.
So a huge team of scientists from many different institutions has put this all together. At the heart of the system is a laser that emits a broad frequency comb. The comb is split up, and each frequency has its phase and amplitude individually set to encode data. The frequencies are then recombined and sent off. At the other end, the light passes through a series of filters that splits it back into its original comb frequencies, which are then directed to a set of detectors that decode the amplitude and phase.
When operating correctly, this thing can put through a massive 26Tbps, which is very impressive. What makes it more impressive is that this whole system is four commercial devices strung together with a custom filter in the middle. I have no doubt that the filter will be commercialized shortly.
Even with that, I am not sure that we will see the system in the wild any time soon. This is because, even though the laser source is commercial, it is a rather special type of laser. Although the telecoms industry has a habit of taking rather specialized devices and making them commodities, no one has yet succeeded in doing that with this particular type of laser, despite there being huge incentives to do so.