What algorithm does learn_dhmm.m in Kevin Murphy's HMM toolbox use?
I'm going to rewrite a MATLAB script that uses Kevin Murphy's HMM toolbox in Python.
I know there are Python implementations of the usual HMM algorithms (Viterbi, Baum-Welch, forward-backward), so I think I have everything I need to do the MATLAB --> Python port.
My MATLAB script uses the procedure defined in learn_dhmm.m:
function [LL, prior, transmat, obsmat, gamma] = learn_dhmm(data, prior, transmat, obsmat, max_iter, thresh, verbose, act, adj_prior, adj_trans, adj_obs, dirichlet)
% LEARN_HMM Find the ML parameters of an HMM with discrete outputs using EM.
%
% [LL, PRIOR, TRANSMAT, OBSMAT] = LEARN_HMM(DATA, PRIOR0, TRANSMAT0, OBSMAT0)
% computes maximum likelihood estimates of the following parameters,
% where, for each time t, Q(t) is the hidden state, and
% Y(t) is the observation
% prior(i) = Pr(Q(1) = i)
% transmat(i,j) = Pr(Q(t+1)=j | Q(t)=i)
% obsmat(i,o) = Pr(Y(t)=o | Q(t)=i)
% It uses PRIOR0 as the initial estimate of PRIOR, etc.
I don't understand what this procedure actually does.
Sorry, I'm just getting started with machine learning.
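My rough idea for the Python side is to keep the parameters as NumPy arrays and let an existing library do the fitting. A minimal sketch, assuming the third-party hmmlearn library and its CategoricalHMM class (called MultinomialHMM in older hmmlearn releases; the exact API may differ between versions, so this is only a starting point):

import numpy as np
from hmmlearn import hmm

# One discrete observation sequence; hmmlearn expects integer symbols of shape (T, 1).
obs = np.array([[0], [2], [1], [1], [3], [0]])

model = hmm.CategoricalHMM(n_components=3, n_iter=50, tol=1e-4)
model.fit(obs)

prior = model.startprob_        # prior(i)      = Pr(Q(1)=i)
transmat = model.transmat_      # transmat(i,j) = Pr(Q(t+1)=j | Q(t)=i)
obsmat = model.emissionprob_    # obsmat(i,o)   = Pr(Y(t)=o | Q(t)=i)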
Comments (1)
I think the comment is self-explanatory:
Find the ML parameters of an HMM with discrete outputs using EM.
You can read this classic paper to understand HMMs: A tutorial on Hidden Markov Models and selected applications in speech recognition, L. Rabiner, 1989, Proc. IEEE 77(2):257--286.
The above function solves Problem 3 (page 264) of the paper.
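To make the connection concrete, here is a minimal NumPy sketch of the EM (Baum-Welch) re-estimation that learn_dhmm.m performs, written for a single observation sequence. This is illustrative code of my own under those assumptions, not the toolbox's actual implementation:

import numpy as np

def baum_welch(obs, prior, transmat, obsmat, max_iter=100, thresh=1e-4):
    # obs      : 1-D int array of observation symbols in [0, O)
    # prior    : (K,)   initial state distribution  Pr(Q(1)=i)
    # transmat : (K, K) transition matrix           Pr(Q(t+1)=j | Q(t)=i)
    # obsmat   : (K, O) emission matrix             Pr(Y(t)=o | Q(t)=i)
    T, K = len(obs), len(prior)
    LL = []
    for _ in range(max_iter):
        # E-step: scaled forward-backward pass
        alpha = np.zeros((T, K))
        beta = np.zeros((T, K))
        scale = np.zeros(T)

        alpha[0] = prior * obsmat[:, obs[0]]
        scale[0] = alpha[0].sum()
        alpha[0] /= scale[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ transmat) * obsmat[:, obs[t]]
            scale[t] = alpha[t].sum()
            alpha[t] /= scale[t]

        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = transmat @ (obsmat[:, obs[t + 1]] * beta[t + 1])
            beta[t] /= scale[t + 1]

        # gamma[t, i] = Pr(Q(t)=i | data)
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)

        # xi[t, i, j] = Pr(Q(t)=i, Q(t+1)=j | data)
        xi = (alpha[:-1, :, None] * transmat[None, :, :]
              * (obsmat[:, obs[1:]].T * beta[1:])[:, None, :])
        xi /= xi.sum(axis=(1, 2), keepdims=True)

        # M-step: re-estimate prior, transmat, obsmat from expected counts
        prior = gamma[0]
        transmat = xi.sum(axis=0)
        transmat /= transmat.sum(axis=1, keepdims=True)
        obsmat = np.zeros_like(obsmat)
        for o in range(obsmat.shape[1]):
            obsmat[:, o] = gamma[obs == o].sum(axis=0)
        obsmat /= obsmat.sum(axis=1, keepdims=True)

        LL.append(np.log(scale).sum())
        if len(LL) > 1 and abs(LL[-1] - LL[-2]) < thresh:
            break
    return LL, prior, transmat, obsmat, gamma

Each iteration runs the forward-backward pass to get the expected state occupancies (the E-step), then renormalizes those expected counts into new prior, transition, and emission estimates (the M-step), stopping when the log-likelihood improvement falls below thresh. The real learn_dhmm.m additionally supports multiple sequences, the adj_* flags, input actions (act), and Dirichlet smoothing, which this sketch omits.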