Matlab debugging - beginner level

Published on 2024-12-05 05:09:59


I am a total beginner in Matlab and am trying to write some machine learning algorithms in it. I would really appreciate it if someone could help me debug this code.

function y = KNNpredict(trX,trY,K,X)
   % trX is NxD, trY is Nx1, K is 1x1 and X is 1xD
   % we return a single value 'y' which is the predicted class

% TODO: write this function
% int[] distance = new int[N];
distances = zeroes(N, 1);
examples = zeroes(K, D+2);
i = 0;
% for(every row in trX) { // taking ONE example
for row=1:N, 
 examples(row,:) = trX(row,:);
 %sum = 0.0;
 %for(every col in this example) { // taking every feature of this example
 for col=1:D, 
    % diff = compute squared difference between these points - (trX[row][col]-X[col])^2
    diff =(trX(row,col)-X(col))^2;
    sum += diff;
 end % for
 distances(row) = sqrt(sum);
 examples(i:D+1) = distances(row);
 examples(i:D+2) = trY(row:1);
end % for

% sort the examples based on their distances thus calculated
sortrows(examples, D+1);
% for(int i = 0; i < K; K++) {
% These are the nearest neighbors
pos = 0;
neg = 0;
res = 0;
for row=1:K,
    if(examples(row,D+2 == -1))
        neg = neg + 1;
    else
        pos = pos + 1;
    %disp(distances(row));
    end
end % for

if(pos > neg)
    y = 1;
    return;
else
    y = -1;
    return;
end
end
end

Thanks so much


Comments (1)

伴梦长久 2024-12-12 05:09:59


When working with matrices in MATLAB, it is usually better to avoid excessive loops and instead use vectorized operations whenever possible. This will usually produce faster and shorter code.

In your case, the k-nearest-neighbors algorithm is simple enough to vectorize nicely. Consider the following implementation:

function y = KNNpredict(trX, trY, K, x)
    %# euclidean distance between instance x and every training instance
    dist = sqrt( sum( bsxfun(@minus, trX, x).^2 , 2) );

    %# sorting indices from smaller to larger distances
    [~,ord] = sort(dist, 'ascend');

    %# get the labels of the K nearest neighbors
    kTrY = trY( ord(1:min(K,end)) );

    %# majority class vote
    y = mode(kTrY);
end
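For comparison, the same vectorized idea can be sketched in NumPy (an illustrative translation, not part of the original answer; `knn_predict` and its argument names are my own). NumPy broadcasting plays the role of `bsxfun`, and the `unique`-based vote mirrors `mode`, including its tie-break toward the smallest label:

```python
import numpy as np

def knn_predict(trX, trY, K, x):
    """Predict the class of row vector x from training data trX (N x D) and labels trY (N,)."""
    # Euclidean distance between x and every training instance;
    # broadcasting stands in for MATLAB's bsxfun(@minus, trX, x)
    dist = np.sqrt(((trX - x) ** 2).sum(axis=1))
    # indices of the K smallest distances (clipped to N, like ord(1:min(K,end)))
    nearest = np.argsort(dist)[:min(K, len(dist))]
    # majority vote among the K nearest labels; unique() returns labels sorted
    # ascending, so argmax breaks ties toward the smallest label, as mode does
    labels, counts = np.unique(trY[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

For instance, with training points `[[0,0],[0,1]]` labeled `1` and `[[5,5],[5,6]]` labeled `2`, a query near the origin gets class `1` for `K = 3`.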

Here is an example to test it using the Fisher-Iris dataset:

%# load dataset (data + labels)
load fisheriris
X = meas;
Y = grp2idx(species);

%# partition the data into training/testing
c = cvpartition(Y, 'holdout',1/3);
trX = X(c.training,:);
trY = Y(c.training);
tsX = X(c.test,:);
tsY = Y(c.test);

%# prediction
K = 10;
pred = zeros(c.TestSize,1);
for i=1:c.TestSize
    pred(i) = KNNpredict(trX, trY, K, tsX(i,:));
end

%# validation
C = confusionmat(tsY, pred)

The confusion matrix of the kNN prediction with K=10:

C =
    17     0     0
     0    16     0
     0     1    16
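Since `fisheriris` and `cvpartition` belong to MATLAB's Statistics Toolbox, a self-contained NumPy analogue of this sanity check can use synthetic Gaussian clusters in place of the iris data (the data, seed, and split below are my own stand-ins, not the answer's iris results):

```python
import numpy as np

def knn_predict(trX, trY, K, x):
    # vectorized kNN, mirroring the MATLAB answer above
    dist = np.sqrt(((trX - x) ** 2).sum(axis=1))
    nearest = np.argsort(dist)[:min(K, len(dist))]
    labels, counts = np.unique(trY[nearest], return_counts=True)
    return labels[np.argmax(counts)]

rng = np.random.default_rng(0)
# three well-separated 2-D Gaussian classes standing in for the iris species
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(50, 2))
               for c in ([0, 0], [3, 0], [0, 3])])
Y = np.repeat([0, 1, 2], 50)

# hold out one third of the data, as in the cvpartition example
perm = rng.permutation(150)
tr, ts = perm[:100], perm[100:]

K = 10
pred = np.array([knn_predict(X[tr], Y[tr], K, x) for x in X[ts]])

# confusion matrix: rows are true classes, columns are predictions
C = np.zeros((3, 3), dtype=int)
for t, p in zip(Y[ts], pred):
    C[t, p] += 1
print(C)
```

With clusters this well separated, the confusion matrix comes out (near-)diagonal, just as the iris run above is dominated by its diagonal.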