Video length is 3:56

GPU Computing in MATLAB

Speed up your MATLAB® applications using NVIDIA® GPUs without needing any CUDA® programming experience.

Parallel Computing Toolbox™ supports more than 700 functions that let you use GPU computing. Any GPU-supported function automatically runs using your GPU if you provide inputs as GPU arrays, making it easy to convert and evaluate GPU compute performance for your application.
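
As a minimal sketch of this workflow (the matrix size and the choice of fft are illustrative, not from the video), a supported function runs on the GPU as soon as its input is a GPU array:

    % Move data to the GPU, run a supported function there, and bring the result back.
    A = gpuArray(rand(4096));   % transfer a 4096-by-4096 matrix to the GPU
    B = fft(A);                 % fft executes on the GPU because its input is a gpuArray
    C = gather(B);              % copy the result back to host memory when needed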

In this video, watch a brief overview, including code examples and benchmarks. In addition, discover options for getting access to a GPU if you do not have one in your desktop computing environment. Also, learn about deploying GPU-enabled applications directly as CUDA code generated by GPU Coder™.

Learn more about Parallel Computing Toolbox

GPU Computing in MATLAB

Published: 11 Mar 2021

GPU computing is a widely adopted technology that uses the power of GPUs to accelerate computationally intensive workflows. Since 2010, Parallel Computing Toolbox has provided GPU computing support for MATLAB. Although GPUs were originally developed for graphics rendering, they are now used more generally to accelerate applications in fields such as scientific computing, engineering, artificial intelligence, and financial analysis.

Using Parallel Computing Toolbox, you can leverage NVIDIA GPUs to accelerate your application directly from MATLAB. MATLAB provides a direct interface for accelerating computationally intensive workflows on GPUs for over 500 functions. Using these supported functions, you can execute your code on a GPU without needing any CUDA programming experience.

For computationally intensive problems, it's possible to achieve significant speedup by making only a few changes to your existing code. With GPU support in Parallel Computing Toolbox, it's easy to determine whether a GPU can speed up your application. If your code includes GPU-supported functions, converting your inputs to GPU arrays automatically executes those functions on your GPU.
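
One quick way to evaluate a potential speedup is to compare CPU and GPU execution times with timeit and gputimeit; the specific operation benchmarked below is an illustrative assumption:

    % Compare CPU and GPU timings for the same GPU-supported operation.
    X    = rand(4000);                     % CPU input
    Xgpu = gpuArray(X);                    % the same data as a GPU array
    tCPU = timeit(@() X * X');             % matrix multiply on the CPU
    tGPU = gputimeit(@() Xgpu * Xgpu');    % same operation on the GPU
    fprintf('Speedup: %.1fx\n', tCPU/tGPU);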

MATLAB automatically handles GPU resource allocation, so you can focus on your application without having to learn any low-level GPU computing tools. MATLAB takes advantage of the hundreds of specialized cores in a GPU to accelerate the performance of applications that can be largely parallelized. You achieve the most effective results with a GPU when executing workflows that process sizable data and contain heavily vectorized operations.
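
For reference, you can inspect the GPU that MATLAB has selected, and its available resources, with gpuDevice; this short sketch simply queries a few of its properties:

    % Query the currently selected GPU and a few of its properties.
    d = gpuDevice;                                        % select/inspect the default GPU
    fprintf('GPU: %s\n', d.Name);
    fprintf('Available memory: %.1f GB\n', d.AvailableMemory/1e9);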

You can use GPUBench, from MathWorks File Exchange. To compare performance of supported GPUs, using standard numerical benchmarks in MATLAB. Many MATLAB functions, such as the trained network function, use any compatible GPUs by default. To train your model on multiple GPUs, you can simply change a training option directly in MATLAB.
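
As a sketch of that single-option change (XTrain, YTrain, and layers are placeholders assumed to be defined elsewhere), setting the execution environment to multi-gpu trains across all available local GPUs:

    % Assumes XTrain, YTrain, and layers are already defined (placeholders here).
    options = trainingOptions('sgdm', ...
        'ExecutionEnvironment', 'multi-gpu');   % use all available local GPUs
    net = trainNetwork(XTrain, YTrain, layers, options);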

If you don't have access to a GPU on your laptop or workstation, you can leverage a MATLAB reference architecture to use one or more GPUs within a MATLAB desktop in the cloud. You can also leverage the MATLAB Deep Learning Container from NVIDIA GPU Cloud, which supports NVIDIA DGX and other platforms that support Docker.

If you have many GPU applications to run, or need to scale beyond a single machine with GPUs, you can use MATLAB Parallel Server to extend your workflow to a cluster with GPUs. If you don't already have access to a GPU cluster, you can leverage MathWorks Cloud Center or MATLAB Parallel Server Reference Architecture.
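
One common pattern, sketched here under the assumption that each worker has its own GPU (MATLAB assigns distinct devices when the pool is no larger than the GPU count), is to open a pool with one worker per GPU and let each parfor iteration run on its worker's device:

    % Open one worker per available GPU; workers are assigned distinct devices.
    pool = parpool(gpuDeviceCount);
    results = zeros(1, 8);
    parfor k = 1:8
        x = rand(2048, 'gpuArray');         % data created on the worker's GPU
        results(k) = gather(sum(x(:).^2));  % illustrative GPU computation
    end
    delete(pool);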

Parallel Computing Toolbox provides additional features for working directly with CUDA code. The mexcuda function compiles CUDA code into a MEX file that can be called directly in MATLAB as a function. Conversely, after writing your MATLAB code, you can generate and deploy ready-to-use CUDA code with GPU Coder.
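
Two hedged sketches of those workflows (the file and function names here are hypothetical): mexcuda builds a CUDA source file into a MEX function you can call like any other MATLAB function, and GPU Coder's codegen command generates CUDA code from a MATLAB function:

    % Compile a CUDA source file into a MEX function (hypothetical file name).
    mexcuda myGpuKernel.cu

    % Generate CUDA MEX code from a MATLAB function with GPU Coder (hypothetical function name).
    cfg = coder.gpuConfig('mex');
    codegen -config cfg myMatlabFcn -args {ones(1, 4096, 'single')}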

The generated code is optimized to call standard CUDA libraries and can be integrated and deployed directly onto NVIDIA GPUs. To learn more about how to take full advantage of your GPU in MATLAB, explore the GPU computing solutions page. You can also explore the MathWorks documentation for a complete list of functions with GPU support and more examples.