GPU for HMM training

5 views (last 30 days)
Mate 2u 2011-4-29
Hi, I have a large dataset, and when I use a hidden Markov model to train on the data I have to wait a long time. Would GPU parallel computing speed this up? It seems the HMM functionality (in the Statistics Toolbox) is not listed among the GPU-supported functions.
Any experience or ideas?

Accepted Answer

sgmustadio 2011-5-2
This is exactly the problem I'm working on for my thesis. Unfortunately, the main HMM training algorithm (Baum-Welch) is recursive and cannot be parallelized (at least not easily). I'm looking into some work that Turin did on parallelizing the Baum-Welch algorithm.
Additionally, the Segmental K-Means algorithm is another way to train an HMM. My understanding is that it is computationally more efficient than Baum-Welch, but it still suffers from the same recursion problem.
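To make the recursion problem concrete, here is a minimal sketch (mine, not part of the original answer) of the forward pass that Baum-Welch is built on. Each alpha depends on the previous one, so the loop over time is inherently serial, even though the matrix-vector product inside each step can be parallelized. All sizes and the random model below are placeholders.
% Forward recursion sketch: alpha at time t depends on alpha at time t-1,
% so the time loop cannot be split across workers. Sizes are made up.
N = 5; M = 4; T = 200;                              % states, symbols, sequence length
A = rand(N); A = bsxfun(@rdivide, A, sum(A, 2));    % transition matrix, rows sum to 1
B = rand(N, M); B = bsxfun(@rdivide, B, sum(B, 2)); % emission matrix, rows sum to 1
obs = randi(M, 1, T);                               % placeholder observation sequence
alpha = (1/N) * ones(N, 1) .* B(:, obs(1));         % uniform initial state distribution
for t = 2:T
    alpha = (A' * alpha) .* B(:, obs(t));           % each step needs the previous alpha
    alpha = alpha / sum(alpha);                     % rescale to avoid underflow
end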
Luckily, GPUs will help you if you are training many models or training a model with many states, as these are trivially parallel problems. From my research, you only really see a benefit once you get to HMMs with thousands of states; below that, the GPU overhead is just too great.
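For the many-models case, here is a minimal sketch (again mine, not from the original answer) of running independent hmmtrain calls from the Statistics Toolbox on separate workers with parfor from the Parallel Computing Toolbox. The model sizes and the random observation sequences are placeholders.
% Independent models have no data dependencies, so each Baum-Welch run
% can go to its own worker. All sizes and data below are placeholders.
numModels = 8; numStates = 4; numSymbols = 6;
est = cell(numModels, 2);
parfor m = 1:numModels
    trGuess = rand(numStates, numStates);
    trGuess = bsxfun(@rdivide, trGuess, sum(trGuess, 2));  % rows sum to 1
    emGuess = rand(numStates, numSymbols);
    emGuess = bsxfun(@rdivide, emGuess, sum(emGuess, 2));  % rows sum to 1
    seq = randi(numSymbols, 1, 1000);                      % placeholder sequence
    [estTR, estEM] = hmmtrain(seq, trGuess, emGuess);      % Baum-Welch training
    est(m, :) = {estTR, estEM};
end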
If you're interested in using GPUs in MATLAB, check out this free toolbox:
Shoot me an email or a message if you need additional details; I'm glad to help :)
1 Comment
shahab anjum 2020-3-4
Dear, I have a 1000x286 power spectral density matrix, and I want to use an HMM on that matrix. Please can you help me with how to do this?


More Answers (2)

John Melonakos 2011-6-7
Jacket has been used heavily for HMMs on the GPU in MATLAB. K-Means is also really fast. Here is some sample code:
% initial points: n 2-D samples spread over a few shifted blobs
n = 1e6; % number of points (alternatively 3e6)
X = [randn(n,1) randn(n,1)] * 2;
ind = ceil(6*rand(n,1));
X(ind<3,1) = X(ind<3,1)/2 - 7;
X(ind==4,:) = X(ind==4,:)/3 + 4;
X = X'; % 2 x n, one column per point
% cluster
k = 20; % number of clusters
X = gsingle(X); % push the data to the GPU (Jacket single-precision array)
[d, n] = size(X);
assert(d == 2); % this demo is for the 2-D case only
% initial state: pick k evenly spaced columns as the starting centroids
Z = X(:, ceil(1:n/k:n));
Z = reshape(Z, [2 1 k]); % 2 x 1 x k, so bsxfun can expand against 2 x n
% kmeans phase 1, 10 iterations
tic
for i = 1:10, i % echo the iteration number
  % squared distances from every point to every centroid
  D = bsxfun(@minus, X, Z); % 2 x n x k differences
  D = sum(D .* D);          % 1 x n x k squared distances
  D = permute(D, [3 2 1]);  % k x n
  % minimum distances: I(j) is the index of the nearest centroid to point j
  [Y, I] = min(D);
  % re-assign clusters: each centroid moves to the mean of its points
  for j = 1:k
    ii = find(I == j); if isempty(ii), error('empty cluster'); end
    Z(:,1,j) = mean(X(:,ii), 2);
  end
end
toc
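For readers without Jacket, a similar kernel can be written with gpuArray from the Parallel Computing Toolbox. The sketch below is my own adaptation of the code above, with smaller placeholder data, not an official Jacket or AccelerEyes example.
% k-means update on a gpuArray instead of Jacket's gsingle. Data sizes
% here are placeholders; the loop mirrors the Jacket code above.
n = 1e5; k = 20;
X = gpuArray(single(randn(2, n) * 2));     % 2 x n points on the GPU
Z = reshape(X(:, ceil(1:n/k:n)), [2 1 k]); % k initial centroids, 2 x 1 x k
for iter = 1:10
    D = bsxfun(@minus, X, Z);              % 2 x n x k differences
    D = permute(sum(D .* D, 1), [3 2 1]);  % k x n squared distances
    [~, I] = min(D, [], 1);                % nearest centroid for each point
    for j = 1:k
        Z(:,1,j) = mean(X(:, I == j), 2);  % move centroid j to the mean of its points
    end
end
Z = gather(Z);                             % bring the centroids back to the CPU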

engstudent 2012-12-7
Hi everyone, I am working on isolated word recognition and I need the steps to use an HMM in MATLAB 2012. If anyone can help me, please send help to my email: sajasaad_eng@yahoo.com.
