Hi David,
MATLAB's memory usage during training depends on several factors: the size of the input data, the number of learnable parameters, the mini-batch size, the memory needed to hold forward- and backward-pass activations and gradients, and the specific operations performed by the network.
Every function differs in how much working memory it needs to run. There is no reliable way to estimate the requirement in advance; the practical approach is to run the function and monitor GPU memory usage from a separate process.
The following MATLAB Answer illustrates a crude way to estimate the GPU memory required to train a deep learning model. Following that example should help you estimate the memory used by "dlconv"!
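As a rough illustration of the run-and-monitor approach (this is a sketch, not an exact measurement: it assumes Parallel Computing Toolbox and Deep Learning Toolbox are available, and the input and filter sizes below are made up for the example), you can bracket a "dlconv" call with readings of the GPU's available memory:

```matlab
% Query the GPU and record available memory before the operation
g = gpuDevice;
memBefore = g.AvailableMemory;

% Hypothetical sizes: a 224x224x3 image batch of 16, with 32 3x3 filters
X = dlarray(rand(224,224,3,16,'single','gpuArray'),'SSCB');
weights = dlarray(rand(3,3,3,32,'single','gpuArray'));
bias = dlarray(zeros(1,1,32,'single','gpuArray'));

% Run the convolution and wait for the GPU to finish
Y = dlconv(X,weights,bias,'Padding','same');
wait(g);

% Re-query and compare; the difference is a crude upper bound
g = gpuDevice;
memAfter = g.AvailableMemory;
fprintf('Approximate GPU memory consumed: %.1f MB\n', ...
    (memBefore - memAfter)/1e6);
```

Keep in mind this is only a crude estimate: MATLAB's GPU memory manager may cache freed blocks, so the before/after difference can overstate what "dlconv" itself actually needs.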