Friday, June 12, 2020

Scientists propose data encoding method for the 6G standard


Researchers around the world are working on methods to transfer data in the terahertz (THz) range, which would make it possible to send and receive information much more quickly than today's technology allows. But it is much more difficult to encode data in the THz range than in the GHz range currently used by 5G technology. A group of scientists from ITMO University has demonstrated the possibility of modifying terahertz pulses in order to use them for data transmission. They have published their results in Scientific Reports.


Telecommunications companies are beginning to adopt the new 5G standard, which will provide previously impossible wireless data transfer speeds. Meanwhile, as companies roll out this new generation of data networks, scientists are already at work on its successor. "We're talking about 6G technologies," says Egor Oparin, a staff member of ITMO University's Laboratory of Femtosecond Optics and Femtotechnologies. "They will increase data transfer speeds by anywhere from 100 to 1,000 times, but implementing them will require us to switch to the terahertz range."

Today, a technology for simultaneous transfer of multiple data channels over a single physical channel has been successfully implemented in the infrared (IR) range. This technology is based on the interaction between two broadband IR pulses with a bandwidth measured in tens of nanometers. In the terahertz range, the bandwidth of such pulses would be much larger—and so, in turn, would be their capacity for data transfer.

But scientists and engineers will need to find solutions to numerous crucial issues. One such issue has to do with ensuring the interference of two pulses, which would result in a so-called pulse train, or frequency comb, that could be used for encoding data. "In the terahertz range, pulses tend to contain a small number of field oscillations, literally one or two per pulse," says Egor Oparin. "They are very short and look like thin peaks on a graph. It is quite challenging to achieve interference between such pulses, as they are difficult to overlap."

A team of scientists at ITMO University has suggested extending the pulse in time so that it would last several times longer but still be measured in picoseconds. In this case, the frequencies within a pulse would not occur simultaneously, but follow one another in succession. In scientific terms, this is referred to as chirping, or linear-frequency modulation. However, this presents another challenge: Although chirping technologies are quite well developed in the infrared range, there is a lack of research on the technique's use in the terahertz range.
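As a rough illustration of what chirping means in practice, the sketch below builds a linearly frequency-modulated pulse in Python; the carrier frequency, chirp rate, and duration are illustrative assumptions, not values from the paper.

```python
import numpy as np

# A linearly chirped ("frequency-swept") pulse: the instantaneous
# frequency f(t) = f0 + b*t rises through the pulse, so the pulse's
# frequencies arrive in succession rather than all at once.
# All numbers below are illustrative assumptions.
f0 = 1.0e12                  # carrier near 1 THz (assumed)
b = 2.0e23                   # chirp rate in Hz/s (assumed)
tau = 3.5e-12                # envelope duration of a few picoseconds

t = np.linspace(-15e-12, 15e-12, 8192)   # time axis, +/- 15 ps
envelope = np.exp(-(t / tau) ** 2)
# The phase f0*t + 0.5*b*t**2 differentiates to f0 + b*t:
field = envelope * np.cos(2 * np.pi * (f0 * t + 0.5 * b * t ** 2))
```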


"We've turned to the technologies used in the microwave range," says Egor Oparin, who is a co-author of the paper.

"They actively employ metal waveguides, which tend to have high dispersion, meaning that different emission frequencies propagate at different speeds there. But in the microwave range, these waveguides are used in single mode, or, to put it differently, the field is distributed in one configuration, a specific, narrow frequency band, and as a rule, in one wavelength. We took a similar waveguide of a size suitable for the terahertz range and passed a broadband signal through it so that it would propagate in different configurations. Because of this, the pulse became longer in duration, changing from two to about seven picoseconds, which is three and a half times more. This became our solution."

By using a waveguide, the researchers were able to stretch the pulses to the duration required by theory. This made it possible to achieve interference between two chirped pulses that together create a pulse train. "What's great about this pulse train is that it exhibits a dependence between a pulse's structure in time and its spectrum," says Oparin. "So we have the temporal form, or simply put, the field oscillations in time, and the spectral form, which represents those oscillations in the frequency domain. Let's say we've got three peaks, three substructures in the temporal form, and three corresponding substructures in the spectral form. By using a special filter to remove parts of the spectral form, we can make the corresponding substructures 'blink' in the temporal form, and the other way around. This could be the basis for data encoding in the terahertz band."
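The correspondence Oparin describes can be reproduced in a minimal numerical sketch: two delayed copies of a chirped pulse interfere to form a beat (the pulse train), the spectrum of the sum is a frequency comb, and because frequency maps onto time in a chirped field, notching out a spectral region suppresses the matching temporal substructure. All parameters, including the notch filter, are illustrative assumptions rather than the paper's values.

```python
import numpy as np

# Two delayed copies of a chirped pulse: at any instant their
# instantaneous frequencies differ by b*dt, so the sum beats at
# f_beat = b*dt, forming a pulse train whose spectrum is a comb.
b, dt = 2.0e23, 2.0e-12      # chirp rate (Hz/s) and delay (s), assumed

def chirped(t, f0=1.0e12, tau=7e-12):
    return np.exp(-(t / tau) ** 2) * np.cos(2 * np.pi * (f0 * t + 0.5 * b * t ** 2))

t = np.linspace(-30e-12, 30e-12, 1 << 15)
train = chirped(t) + chirped(t - dt)
print(f"expected beat: {b * dt / 1e9:.0f} GHz")   # 400 GHz here

# Time <-> spectrum correspondence: in a chirped field, frequency f
# maps to time t = (f - f0) / b, so removing a spectral substructure
# suppresses the corresponding temporal one (and vice versa).
spec = np.fft.rfft(train)
freqs = np.fft.rfftfreq(t.size, t[1] - t[0])
notch = np.abs(freqs - 1.0e12) > 50e9             # hypothetical notch filter
filtered = np.fft.irfft(spec * notch, n=t.size)   # dip appears near t = 0
```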

More information: Xinrui Liu et al, Formation of gigahertz pulse train by chirped terahertz pulses interference, Scientific Reports (2020). DOI: 10.1038/s41598-020-66437-4

Journal information: Scientific Reports

Provided by ITMO University

Wednesday, February 19, 2020

Artificial Intelligence and Data Science

Innovations in artificial intelligence (AI) and a paradigm shift toward data-driven approaches, spurred by the rapid growth of data, are opening new research opportunities in a variety of areas, such as social networks, bioinformatics, healthcare, manufacturing, beyond-5G (6G) communications, the Internet of Things (IoT), and so forth.

In these areas, system-generated information from smart devices, sensors, agents, and meters, as well as human-generated information such as texts, photos, and videos, produces tremendous amounts of data, while new levels of security, performance, and reliability are required.

In this context, equipping the relevant functionality with AI or data-mining algorithms, including regression models, Bayesian learning, clustering, neural networks, decision trees, information retrieval, decision processes, multi-armed bandits, reinforcement learning, generative models, and graphical models, has received substantial attention in both academia and industry.

Recently developed AI and data-mining approaches promise solutions to many challenging problems, using learning and decision making to significantly improve user experience and service quality.
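To make one item from the list above concrete, here is a minimal sketch of an epsilon-greedy multi-armed bandit, one of the simplest learning-and-decision-making loops mentioned; the arm reward probabilities are made up for illustration.

```python
import random

# Epsilon-greedy multi-armed bandit: explore a random arm with
# probability eps, otherwise exploit the best running estimate.
probs = [0.3, 0.5, 0.7]          # hidden per-arm reward rates (made up)
counts = [0] * len(probs)
values = [0.0] * len(probs)      # running mean reward per arm
eps = 0.1

for step in range(10_000):
    if random.random() < eps:    # explore
        arm = random.randrange(len(probs))
    else:                        # exploit
        arm = max(range(len(probs)), key=lambda a: values[a])
    reward = 1.0 if random.random() < probs[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean

print(values)  # estimates approach [0.3, 0.5, 0.7]; arm 2 is pulled most
```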

 

“Artificial Intelligence and Data Science” brings this perspective to AI and data science, focusing on the latest research, algorithm design, analysis, and implementation for various applications. It will provide a comprehensive overview of how to enable autonomous and intelligent services and applications through collecting, processing, learning from, and controlling vast amounts of information across various domains. Topics of interest include, but are not limited to:

 

• Network mining and graph mining

• Deep learning and neural network-based approach

• Social network analysis

• Reinforcement learning and multi-armed bandits 

• Knowledge representation and reasoning

• Anomaly and fake content detection

• Information retrieval

• Recommendation and ranking engines

• Machine learning in medicine and healthcare informatics 

• Big data analytics for beyond 5G or 6G

• Edge/fog computing using machine learning

• IoT data analytics

• Data-driven services and applications

Sunday, February 16, 2020

6G Wireless Systems

While 5G is currently being deployed around the globe, research on 6G is under way, aiming to address the coming challenges of drastically increasing wireless data traffic and to support new usage scenarios. 6G is expected to extend 5G capabilities even further. Higher bitrates (up to Tbps) and lower latency (less than 1 ms) will allow the introduction of new services, such as pervasive edge intelligence, ultra-massive machine-type communications, extremely reliable low-latency communications, holographic rendering, and high-precision communications, and will meet more stringent requirements, especially in the following dimensions: energy efficiency; intelligence; spectral efficiency; security, secrecy and privacy; affordability; and customization.

Artificial intelligence approaches and techniques, such as machine learning (of which deep learning and reinforcement learning are specific examples) and machine reasoning (which includes planning, scheduling, knowledge representation and reasoning, and search and optimization), are the new fundamental enablers for operating networks more efficiently, enhancing the overall end-user experience, and providing innovative service applications. Quantum Optics Computing (QOC) and Quantum Key Distribution (QKD) are almost ready for industrial applications. In particular, massive Internet of Things (mIoT), Industrial IoT (IIoT), fully automated robotic platforms (which include control, perception, sensors and actuators, as well as the integration of other techniques into cyber-physical systems), vehicles, and multisensory extended reality are examples of new data-demanding applications that will impose new performance targets and motivate 6G design and deployment.
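As a concrete illustration of one enabler mentioned above, the following is a minimal, idealized simulation of the BB84 quantum key distribution protocol, with no eavesdropper and a noiseless channel; it is a classical toy model, not production QKD code.

```python
import random

# BB84 in miniature: Alice sends random bits in random bases; Bob
# measures in random bases; they keep only positions where the
# bases matched ("sifting").  Idealized: no Eve, no channel noise.
n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # rectilinear / diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# If bases match, Bob reads Alice's bit; otherwise his outcome is
# random, which models the quantum measurement statistics.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifted key: roughly half the positions survive on average.
key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
       if ab == bb]
print(len(key), key)
```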

This paper aims to provide the scientific community with a comprehensive overview of the most challenging aspects of 6G mobile networks and to identify the latest research on promising techniques for the evolution towards 6G, on topics including, but not limited to, the following:


- Vision, key drivers, new services and requirements for 6G

- System and network architectures for 6G 

- Wireless backhaul and fronthaul solutions

- Spectrum and channel modeling for 5G and towards 6G

- Energy efficiency and harvesting technologies

- Multi-level machine learning pipelines in 5G and towards 6G

- 5G and beyond towards 6G testbeds and experimentation

- Security, secrecy and privacy schemes for 5G and towards 6G

- Distributed computing for 5G and towards 6G

- Optical quantum computing and QKD in 6G