
Algorithms to Antennas: Spectrum Sensing Using Deep-Learning Techniques

Oct. 20, 2021
This edition investigates spectrum sensing that leverages deep-learning techniques to identify 5G NR and LTE signals.

This blog is part of the Algorithms to Antennas series.

What you'll learn:

  • Identifying 5G NR and LTE signals via spectrum sensing using deep-learning techniques.
  • How to characterize spectrum occupancy by training a neural network.
  • Testing the network's signal-identification performance.

We have covered a range of specific topics that relate to wireless applications using deep learning to perform a function or improve overall system performance, including several recent posts in this series.

In this blog, we look at spectrum sensing employing deep-learning techniques to identify 5G NR and LTE signals. Spectrum sensing offers an important way to understand spectrum usage in crowded RF bands. We show how this technique can identify the type of signal being received in addition to the signal’s occupied bandwidth. While we focus on 5G NR and LTE signals, this type of workflow can be extended to other RF signals, including radar. For example, think of airport radar signals transmitting in the vicinity of a 5G base station. 

We follow a recipe similar to the one used in some of our other deep-learning-related posts, starting with synthesizing labeled datasets of 5G NR and LTE signals. We use this data to train a deep-learning network that performs semantic segmentation.
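As a rough sketch of this synthesis step, the 5G Toolbox and LTE Toolbox waveform generators can produce standards-compliant baseband frames. The configurations below are illustrative placeholders rather than the exact training setup:

% Sketch of the waveform-synthesis step; parameter choices are examples.
% Requires 5G Toolbox and LTE Toolbox.
fs = 61.44e6;                        % common sampling rate used in this post

% 5G NR downlink carrier (default FR1 configuration)
nrCfg = nrDLCarrierConfig;
[nrWave,nrInfo] = nrWaveformGenerator(nrCfg);
nrWave = resample(nrWave,fs,nrInfo.ResourceGrids(1).Info.SampleRate);

% LTE reference measurement channel (R.9 shown as one example)
rmc = lteRMCDL('R.9');
[lteWave,~,~] = lteRMCDLTool(rmc,[1;0;0;1]);
ofdmInfo = lteOFDMInfo(rmc);
lteWave  = resample(lteWave,fs,ofdmInfo.SamplingRate);

Each synthesized frame is then labeled with its signal type before training.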

Our first goal is to characterize spectrum occupancy, so we train a neural network to identify 5G NR and LTE signals in a wideband spectrogram. As in previous posts, we always try to test the examples with a radio or radar whenever it's practical. Keeping with this theme, we tested our network, which was trained using synthesized data, with over-the-air (OTA) data collected from a software-defined radio (SDR).

Many techniques can be used to input data from radios and radars to a deep network. In some of our past blogs, we showed how this can be done with baseband in-phase and quadrature (I/Q) samples. Here, we borrow semantic-segmentation techniques used in computer-vision applications to identify objects and their locations within images generated from our training data. For wireless-signal-processing applications, the “objects” of interest are wireless signals, and their locations are defined by the frequency bands and time intervals the signals occupy.
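To make the analogy concrete, here is a minimal sketch of how a label mask for one training frame might be built. Because the frequency shift and bandwidth of each synthesized signal are known, the occupied band maps directly to pixel columns of the time-frequency image; the image size and signal parameters below are assumed example values.

% Sketch of the labeling step; the image size, frequency shift, and
% bandwidth are assumed example values.
imgSize = [256 256];                 % spectrogram image size (assumed)
fs      = 61.44e6;
classes = ["Noise" "LTE" "NR"];

labels = repmat(categorical("Noise",classes),imgSize);   % background class

% Example: a 10-MHz LTE signal shifted +3 MHz from band center
fc = 3e6;  bw = 10e6;
f2col = @(f) round((f/fs + 0.5)*imgSize(2));   % frequency -> column index
cols  = max(1,f2col(fc - bw/2)):min(imgSize(2),f2col(fc + bw/2));
labels(:,cols) = "LTE";   % signal occupies this band for the whole frame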

We synthesize our training signals and use channel and RF impairment models to ensure the data matches what the trained deep network will encounter when tested with OTA signals. The network is trained on frames that each contain either a 5G NR or an LTE signal, randomly shifted in frequency within the band of interest. Each frame is 40 ms long, which corresponds to 40 subframes. For this example, the network assumes that the 5G NR or LTE signal occupies the same band for the whole frame duration.

We used a sampling rate of 61.44 MHz, which is high enough to process most of the latest standards-based signals. Several commercially available, low-cost SDR systems can also sample at this rate, which made it practical to test the network with a radio.
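A minimal sketch of this impairment stage, assuming a single-antenna waveform txWave from the synthesis sketch above; the delay profile, Doppler shift, frequency-shift range, and SNR range are all illustrative:

% Sketch of the impairment models; parameter values are illustrative.
% Requires Communications Toolbox and 5G Toolbox.
fs = 61.44e6;

% Random frequency shift within the band of interest
foffset = (2*rand - 1)*10e6;
pfo = comm.PhaseFrequencyOffset('SampleRate',fs,'FrequencyOffset',foffset);
rxWave = pfo(txWave);

% CDL fading channel for the 5G NR frames (single-antenna sketch)
cdl = nrCDLChannel('DelayProfile','CDL-C','SampleRate',fs, ...
    'MaximumDopplerShift',50);
cdl.TransmitAntennaArray.Size = [1 1 1 1 1];
cdl.ReceiveAntennaArray.Size  = [1 1 1 1 1];
rxWave = cdl(rxWave);

% Additive noise at a randomly drawn SNR
rxWave = awgn(rxWave,randi([0 40]),'measured');

For the LTE frames, an LTE fading-channel model (such as lteFadingChannel) plays the analogous role to the CDL channel shown here.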

Table 1 lists the variable 5G NR signal parameters (multiple bandwidth and subcarrier-spacing settings) and the variable LTE signal parameters (different reference channels and bandwidths) we used to synthesize our training data.

Table 2 shows a summary of the impairments employed in our 5G CDL and LTE fading channel models.

To convert the signals into images that represent the time-frequency domain, we generated spectrogram images from our synthesized complex baseband signals. Figure 1 shows a random subset of time-frequency “tiles” from the training frames. You can see that the frames cover a variety of SNR values, bandwidths, and band occupancies.
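A minimal sketch of this conversion, with assumed window, overlap, and FFT lengths and a fixed 256-by-256 output size:

% Sketch of the spectrogram-image step; window/FFT sizes are assumed.
% Requires Signal Processing Toolbox and Image Processing Toolbox.
fs = 61.44e6;
[~,~,~,P] = spectrogram(rxWave,hann(256),10,4096,fs,'centered');
P = 10*log10(abs(P) + eps);                   % power in dB

% Rescale to a fixed-size, 3-channel image for the network
img = imresize(rescale(P),[256 256]);         % values in [0,1]
rgb = ind2rgb(im2uint8(img),parula(256));     % apply a colormap
imwrite(rgb,'frame_0001.png');                % one training image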

We used 80% of the single-signal time-frequency images from the dataset for training and 20% for validation. A semantic-segmentation neural network was created based on ResNet-50, a common network architecture.
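A sketch of how such a network might be assembled and trained with Computer Vision Toolbox; the datastore folder names, label IDs, and training options below are assumptions rather than the exact setup:

% Sketch of network creation and training; folder names and options
% are assumed. Requires Deep Learning Toolbox and Computer Vision Toolbox.
classes = ["Noise" "LTE" "NR"];
lgraph  = deeplabv3plusLayers([256 256 3],numel(classes),"resnet50");

imds = imageDatastore("trainingFrames");             % spectrogram images
pxds = pixelLabelDatastore("trainingLabels",classes,[0 1 2]);
cds  = combine(imds,pxds);                           % the 80% training split

opts = trainingOptions('sgdm', ...
    'MiniBatchSize',40,'MaxEpochs',20, ...
    'InitialLearnRate',0.02,'Shuffle','every-epoch');
net = trainNetwork(cds,lgraph,opts);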

Our next step is to test the network's signal-identification performance using synthesized frames that contain both 5G NR and LTE signals. Figure 2 shows the normalized confusion matrix for all test frames as a heat map. The results are positive, as most of the network's predictions match the ground truth.
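As a sketch of this evaluation step, the held-out images can be scored with the trained network and summarized in a confusion chart; imdsTest and pxdsTest are hypothetical names for the validation images and their ground-truth labels:

% Sketch of the evaluation step; imdsTest/pxdsTest are hypothetical
% names for the held-out images and their ground-truth labels.
pxdsResults = semanticseg(imdsTest,net,'WriteLocation',tempdir);
metrics = evaluateSemanticSegmentation(pxdsResults,pxdsTest);

% Row-normalized confusion matrix, as in Figure 2
confusionchart(metrics.ConfusionMatrix.Variables, ...
    ["Noise" "LTE" "NR"],'Normalization','row-normalized');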

Figure 3 shows the received spectrum, true labels, and predicted labels for one of the resulting images.

As noted earlier, the plots above were generated from testing with synthesized data. To see how the trained network performed with OTA signals, we used an ADALM-PLUTO radio to capture signals from a nearby base station. Figure 4 shows the results when LTE signals are sent through the network, and Figure 5 shows the results for 5G NR signals.
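A sketch of such an OTA capture; the center frequency is a placeholder and should match a locally received downlink carrier. This requires the Communications Toolbox Support Package for Analog Devices ADALM-PLUTO Radio:

% Sketch of an OTA capture with an ADALM-PLUTO radio; the center
% frequency is a placeholder for a locally received downlink carrier.
rx = sdrrx('Pluto', ...
    'CenterFrequency',2.14e9, ...
    'BasebandSampleRate',61.44e6, ...
    'OutputDataType','double');
rxWave = capture(rx,40e-3,'Seconds');   % one 40-ms frame
release(rx);

% The capture is converted to a spectrogram image exactly as in
% training, then passed through the network:
img = imread('capture_0001.png');       % image produced from rxWave
segMap = semanticseg(img,net);          % per-pixel Noise/LTE/NR labels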

The trained network can distinguish 5G NR and LTE signals, including in the two example captures from real base stations. The network may not identify every captured signal correctly. However, it's straightforward to enhance the training data, either by generating more representative synthetic signals or by capturing OTA signals and adding them to the training set.

To learn more about the topics covered in this blog and explore your own designs, see the examples below or email me at [email protected].

See additional 5G, radar, and EW resources, including those referenced in previous blog posts.

Rick Gentile is Product Manager, Ethem Sozer is Principal Engineer, Jameerali Mujavarshaik is Senior Engineer, and Honglei Chen is Principal Engineer at MathWorks.

About the Author

Rick Gentile | Product Manager, Phased Array System Toolbox and Signal Processing Toolbox

Rick Gentile is the product manager for Phased Array System Toolbox and Signal Processing Toolbox at MathWorks. Prior to joining MathWorks, Rick was a radar systems engineer at MITRE and MIT Lincoln Laboratory, where he worked on the development of several large radar systems. Rick also was a DSP applications engineer at Analog Devices, where he led embedded-processor and system-level architecture definitions for high-performance signal-processing systems used in a wide range of applications.

He received a BS in electrical and computer engineering from the University of Massachusetts, Amherst, and an MS in electrical and computer engineering from Northeastern University, where his focus areas of study included microwave engineering, communications, and signal processing.
