How to reduce the time required for training a logistic regression classifier

Hi everyone,
I am trying to train and use a logistic regression classifier with the stepwiseglm function. The regression model is allowed to contain polynomial terms up to fourth degree in each predictor, including their interactions. The AIC criterion is used to decide whether each term should be added or removed.
The training set is a 100K-by-4 matrix. The problem is that the training process takes a very long time. Is there any way to speed it up, for instance with GPU parallel processing? Since this is an image processing task, could a convolutional neural network (CNN) be trained in a shorter time? (I am not familiar with CNNs.)
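Roughly, the training call is a sketch like the following (X is the 100K-by-4 predictor matrix and y the binary response; the variable names are placeholders):

% Stepwise logistic regression: start from a constant model, allow terms up to
% degree 4 in each of the 4 predictors (including interactions), select by AIC
mdl = stepwiseglm(X, y, 'constant', ...
    'Upper', 'poly4444', ...          % largest model the stepwise search may reach
    'Distribution', 'binomial', ...   % logistic regression (logit link)
    'Criterion', 'aic');              % add/remove terms based on AIC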
Many thanks in advance.

Accepted Answer

Prateek Rai 2021-7-30
To my understanding, you want to speed up your image classification task. You can use a CNN for your classification process.
You can refer to the introduction-to-convolutional-neural-networks MathWorks documentation page to learn more about how to use CNNs for image classification in MATLAB.
You can also refer to the deep-learning-with-big-data-on-gpus-and-in-parallel MathWorks documentation page to learn more about speeding up your task using GPU parallel processing.
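For example, here is a minimal sketch of training a small CNN on a GPU, assuming the images are arranged in one folder per class (the folder name, image size, and layer sizes are placeholders):

% Load labelled images; 'imageFolder' is a placeholder path
imds = imageDatastore('imageFolder', 'IncludeSubfolders', true, 'LabelSource', 'foldernames');

% A small example network for binary image classification
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];

% Train on the GPU (requires Parallel Computing Toolbox and a supported GPU)
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'gpu', ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 128);

net = trainNetwork(imds, layers, options);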

More Answers (0)
