Video length: 3:33

Signal Processing Using Deep Learning

*Chinese subtitles are available for this video.

Deep learning is increasingly being applied to signal and time-series data in areas such as voice assistants, digital health, radar, and wireless communications.

In this video, you will learn how to combine techniques such as time-frequency transforms and wavelet scattering networks with convolutional and recurrent neural networks to build predictive models for signals.

You will also learn the four common steps in which MATLAB helps you build this kind of application:

  1. Access and manage signal data from hardware devices and a variety of other sources
  2. Perform deep learning on signals via time-frequency representations or directly with deep networks
  3. Train deep networks on single or multiple NVIDIA® GPUs, on local machines or cloud-based systems
  4. Generate optimized CUDA® code for your signal preprocessing algorithms and deep networks

Published: September 27, 2019

Deep learning continues to gain popularity in signal processing with applications like voice assistants, digital health, radar and wireless communications. With MATLAB, you can easily develop deep learning models and build real-world smart signal processing systems. Let’s take a closer look at the four steps involved.

The first step in building a deep learning model is to access and manage your data. Using MATLAB, you can acquire signals from hardware devices and a variety of other sources.
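As an illustration, here is a minimal sketch of one acquisition path, recording a few seconds of audio with the base MATLAB audiorecorder object; the sample rate and duration are assumptions, and other hardware would typically be accessed through its own support package or toolbox.

    % Minimal sketch: record 5 seconds of audio from the default input device
    % using the base MATLAB audiorecorder object (one of many acquisition paths).
    fs  = 16000;                         % sample rate in Hz (assumed)
    rec = audiorecorder(fs, 16, 1);      % 16-bit, single channel
    recordblocking(rec, 5);              % block until 5 seconds are captured
    x = getaudiodata(rec);               % numeric column vector of samples
    plot((0:numel(x)-1)/fs, x); xlabel('Time (s)'); ylabel('Amplitude');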

You can also generate synthetic signal data via simulation or use data augmentation techniques if you don’t have enough data to begin with.
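For example, a minimal augmentation sketch using only base MATLAB might add noise at an assumed SNR and apply a random time shift to an existing signal x:

    % Minimal augmentation sketch: add white noise at an assumed SNR and apply
    % a random circular time shift to produce extra training variants of x.
    snrdB    = 20;                                        % assumed target SNR
    noise    = randn(size(x)) * rms(x) / 10^(snrdB/20);
    xNoisy   = x + noise;                                 % noisy copy
    xShifted = circshift(x, randi(round(0.1*numel(x))));  % shift up to 10% of length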

MATLAB simplifies the process of accessing and working with signal data that is too large to fit in memory, as well as large collections of signal files.
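A minimal sketch of that workflow, assuming a folder of signal files and the signalDatastore object from the Signal Processing Toolbox (available in newer MATLAB releases; the folder name is hypothetical), might look like this:

    % Minimal sketch: iterate over a large folder of signal files without
    % loading everything into memory at once.
    sds = signalDatastore('signalData');     % hypothetical folder of signal files
    while hasdata(sds)
        [x, info] = read(sds);               % one signal (and its metadata) at a time
        % ... preprocess or extract features from x here ...
    end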

Once the data is collected and ready, it’s now time to interpret the signal data and label it. You can quickly visualize and analyze your signals using the Signal Analyzer app as a starting point.

You can label signals with attributes, regions, and points of interest, and use domain-specific tools to label audio signals to prepare your data for training.
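As a hedged sketch, labeling can also be done programmatically with signalLabelDefinition and labeledSignalSet from the Signal Processing Toolbox; the label name, categories, and signals below are illustrative assumptions.

    % Minimal labeling sketch: define an attribute label and attach it to a
    % small set of signals. Names and categories are illustrative assumptions.
    lbl = signalLabelDefinition('Condition', ...
        'LabelType', 'attribute', ...
        'LabelDataType', 'categorical', ...
        'Categories', ["normal" "faulty"]);
    lss = labeledSignalSet({x1, x2}, lbl);          % x1, x2: previously loaded signals
    setLabelValue(lss, 1, 'Condition', 'normal');   % label the first member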

Moving on to the next step.

There are two approaches for performing deep learning on signals.

The first approach involves converting signals into time-frequency representations and training custom convolutional neural networks to extract patterns directly from those representations. A time-frequency representation describes how spectral components in signals evolve as a function of time.

This approach highlights patterns that may not be visible in the original signal.

There are a variety of techniques for generating time-frequency representations from signals and saving them as images, including spectrograms, continuous wavelet transforms or scalograms, and constant-Q transforms.
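A minimal sketch of this first approach, assuming a signal x with sample rate fs and the Wavelet and Image Processing Toolboxes, might compute a scalogram, save it as an image, and define a small CNN; the image size, filter counts, and number of classes are assumptions, not a prescribed architecture.

    % Minimal sketch: convert signal x (sample rate fs) into a scalogram image
    % with the continuous wavelet transform, then define a small CNN for
    % classifying such images.
    [cfs, ~] = cwt(x, fs);                               % continuous wavelet transform
    im = ind2rgb(im2uint8(rescale(abs(cfs))), jet(128)); % magnitudes -> RGB image
    im = imresize(im, [227 227]);                        % fixed input size for the CNN
    imwrite(im, 'scalogram_001.png');

    numClasses = 3;                                      % assumed number of classes
    layers = [
        imageInputLayer([227 227 3])
        convolution2dLayer(3, 16, 'Padding', 'same')
        batchNormalizationLayer
        reluLayer
        maxPooling2dLayer(2, 'Stride', 2)
        convolution2dLayer(3, 32, 'Padding', 'same')
        batchNormalizationLayer
        reluLayer
        fullyConnectedLayer(numClasses)
        softmaxLayer
        classificationLayer];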

The second approach involves feeding signals directly into deep networks such as LSTM networks. To make the deep network learn the patterns more quickly, you may need to reduce the signal dimensionality and variability. To do this, you have two options in MATLAB:

You can manually identify and extract features from signals, or

You can automatically extract features using invariant scattering convolutional networks, which provide low-variance representations without losing critical information.
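A minimal sketch of this second approach, assuming a signal x with sample rate fs and the Wavelet Toolbox, might extract scattering features and feed them to a small LSTM network; the hidden-unit count and number of classes are assumptions.

    % Minimal sketch: automatic feature extraction with a wavelet scattering
    % network, followed by a small LSTM classifier on the feature sequences.
    sn   = waveletScattering('SignalLength', numel(x), 'SamplingFrequency', fs);
    feat = featureMatrix(sn, x);        % rows: scattering paths, columns: time windows

    numFeatures = size(feat, 1);
    numClasses  = 3;                    % assumed number of classes
    layers = [
        sequenceInputLayer(numFeatures)
        lstmLayer(100, 'OutputMode', 'last')
        fullyConnectedLayer(numClasses)
        softmaxLayer
        classificationLayer];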

Once you select the right approach for your signals, the next step is to train the deep networks, which can be computationally intensive and take anywhere from hours to days. To help speed this up, MATLAB supports training on single or multiple NVIDIA GPUs on local machines or cloud-based systems. You can also visualize the training process to get a sense of the progress long before it finishes.
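A hedged sketch of this step, assuming training data XTrain/YTrain and a layer array like the ones above, might configure GPU training and the progress plot as follows; the solver, epoch count, and batch size are assumptions.

    % Minimal training sketch: train on one or more GPUs and visualize progress.
    options = trainingOptions('adam', ...
        'MaxEpochs', 30, ...
        'MiniBatchSize', 128, ...
        'ExecutionEnvironment', 'multi-gpu', ...   % or 'gpu' for a single GPU
        'Plots', 'training-progress');
    net = trainNetwork(XTrain, YTrain, layers, options);

Setting 'ExecutionEnvironment' to 'gpu', 'multi-gpu', or 'parallel' is how you move the same training code between a single GPU, several local GPUs, or a cluster or cloud resource.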

Finally, you can automatically generate optimized CUDA code for your signal preprocessing algorithms and deep networks to perform inference on embedded GPUs.
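As a sketch, assuming a hypothetical entry-point function classifySignal that wraps your preprocessing and network prediction, CUDA MEX generation with GPU Coder might look like this; the input size is an assumption.

    % Minimal GPU Coder sketch: generate CUDA MEX code for the hypothetical
    % entry-point function classifySignal(x).
    cfg = coder.gpuConfig('mex');    % CUDA MEX target
    codegen -config cfg classifySignal -args {coder.typeof(single(0), [1 8192])}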

To learn more about our deep learning capabilities, check out mathworks.com. We have a large collection of examples to help get you started with using deep learning for signals.