Accessing the RGB values of a specific pixel in OpenCV

Posted 2024-12-27 17:19:58


I have searched the internet and Stack Overflow thoroughly, but I haven't found an answer to my question:

How can I get and set the RGB values of a certain pixel (given by its x,y coordinates) in OpenCV? Importantly, I'm writing in C++ and the image is stored in a cv::Mat variable. I know there is an IplImage() operator, but IplImage is not very comfortable to use; as far as I know, it comes from the C API.

Yes, I'm aware of the existing "Pixel access in OpenCV 2.2" thread, but it was only about black-and-white bitmaps.

EDIT:

Thank you very much for all your answers. I see there are many ways to get/set the RGB values of a pixel. I got one more idea from a close friend (thanks, Benny!). It's very simple and effective. I think which one you choose is a matter of taste.

Mat image;

(...)

Point3_<uchar>* p = image.ptr<Point3_<uchar> >(y,x);

And then you can read/write RGB values with:

p->x //B
p->y //G
p->z //R
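
For reference, here is a minimal, self-contained sketch of this Point3_ approach; the file name and coordinates are made up for illustration, and the image is assumed to be an 8-bit, 3-channel (BGR) one:

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // Hypothetical input file; cv::imread loads color images in BGR order.
    cv::Mat image = cv::imread("input.png", cv::IMREAD_COLOR);
    if (image.empty())
        return 1;

    int x = 10, y = 20;  // example coordinates (column x, row y)

    // View the pixel at row y, column x as a 3-field struct: x=B, y=G, z=R.
    cv::Point3_<uchar>* p = image.ptr<cv::Point3_<uchar> >(y, x);

    std::cout << "B=" << (int)p->x << " G=" << (int)p->y << " R=" << (int)p->z << std::endl;

    // Write back: make this pixel pure red (B=0, G=0, R=255).
    p->x = 0;
    p->y = 0;
    p->z = 255;

    cv::imwrite("output.png", image);  // hypothetical output path
    return 0;
}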


Comments (6)

妥活 2025-01-03 17:19:58


Try the following:

cv::Mat image = ...do some stuff...;

image.at<cv::Vec3b>(y,x) gives you the pixel as a vector of type cv::Vec3b (for color images loaded by OpenCV it is ordered as BGR):

image.at<cv::Vec3b>(y,x)[0] = newval[0];
image.at<cv::Vec3b>(y,x)[1] = newval[1];
image.at<cv::Vec3b>(y,x)[2] = newval[2];
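
As a complementary sketch (assuming image is CV_8UC3, i.e. an 8-bit BGR image, with made-up example coordinates), reading and then writing one pixel with cv::Mat::at looks like this:

#include <opencv2/core.hpp>

void touchPixel(cv::Mat& image)   // hypothetical helper; expects a CV_8UC3 (BGR) image
{
    int x = 5, y = 7;                             // example coordinates
    cv::Vec3b pixel = image.at<cv::Vec3b>(y, x);  // note the (row, col) = (y, x) order
    uchar blue  = pixel[0];
    uchar green = pixel[1];
    uchar red   = pixel[2];

    // Write back: keep blue and green, invert the red channel.
    image.at<cv::Vec3b>(y, x) = cv::Vec3b(blue, green, static_cast<uchar>(255 - red));
}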
对不⑦ 2025-01-03 17:19:58


The low-level way is to access the matrix data directly. For a color image (which OpenCV typically stores in BGR order), and assuming your cv::Mat variable is called frame, you can get the blue value at location (x, y) (counting from the top left) like this:

frame.data[frame.channels()*(frame.cols*y + x)];

Likewise, to get B, G, and R:

uchar b = frame.data[frame.channels()*(frame.cols*y + x) + 0];    
uchar g = frame.data[frame.channels()*(frame.cols*y + x) + 1];
uchar r = frame.data[frame.channels()*(frame.cols*y + x) + 2];

Note that this code assumes there is no row padding, i.e. that frame.isContinuous() holds (for an 8-bit image, the row stride must equal frame.cols * frame.channels() bytes).
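
If the matrix might contain row padding (for example when frame is a ROI of a larger image), a stride-safe variant is to take a row pointer with cv::Mat::ptr, which accounts for frame.step automatically; a minimal sketch under the same CV_8UC3 assumption:

#include <opencv2/core.hpp>

// Hypothetical helper: stride-aware read of the B, G, R values at (x, y) of a CV_8UC3 image.
static cv::Vec3b readPixel(const cv::Mat& frame, int x, int y)
{
    const uchar* row = frame.ptr<uchar>(y);   // start of row y; respects frame.step
    int c = frame.channels();                 // 3 for a BGR image
    return cv::Vec3b(row[c*x + 0],            // B
                     row[c*x + 1],            // G
                     row[c*x + 2]);           // R
}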

绮烟 2025-01-03 17:19:58


A piece of code is easier for people who have this kind of problem, so I am sharing mine; you can use it directly. Please note that OpenCV stores pixels as BGR.

// src_, vHeight_, vWidth_ and windowName_ are assumed to be defined elsewhere;
// src_ iterates over the source pixels, three 8-bit channel values per pixel.
cv::Mat vImage_;

if(src_)
{
    vImage_.create(vHeight_, vWidth_, CV_32FC3);   // must be allocated before writing with .at

    cv::Vec3f vec_;

    for(int i = 0; i < vHeight_; i++)
        for(int j = 0; j < vWidth_; j++)
        {
            // Scale each channel to [0,1]. Please note that OpenCV stores pixels as BGR.
            vec_ = cv::Vec3f((*src_)[0]/255.0, (*src_)[1]/255.0, (*src_)[2]/255.0);

            vImage_.at<cv::Vec3f>(vHeight_-1-i, j) = vec_;   // copy with a vertical flip

            ++src_;
        }
}

if(! vImage_.data ) // Check for invalid input
    printf("failed to read image by OpenCV.\n");
else
{
    cv::namedWindow( windowName_, cv::WINDOW_AUTOSIZE);   // cv::WINDOW_AUTOSIZE replaces the legacy CV_WINDOW_AUTOSIZE
    cv::imshow( windowName_, vImage_);                    // Show the image.
    cv::waitKey(0);                                       // keep the window open until a key is pressed
}
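
As a side note, when the source data is already in a cv::Mat, the per-pixel scaling and vertical flip above can usually be expressed with built-in calls; a sketch, assuming a hypothetical 8-bit BGR source src8:

#include <opencv2/core.hpp>

// Hypothetical helper: scale a CV_8UC3 image to [0,1] floats and flip it vertically.
cv::Mat toFloatFlipped(const cv::Mat& src8)
{
    cv::Mat dst;
    src8.convertTo(dst, CV_32FC3, 1.0 / 255.0);  // per-element scaling to 32-bit float
    cv::flip(dst, dst, 0);                       // flipCode 0 = flip around the x-axis (vertical flip)
    return dst;
}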
愛上了 2025-01-03 17:19:58


The current version allows the cv::Mat::at function to take three indices. So for a three-dimensional Mat object m, m.at<uchar>(0,0,0) should work.
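
To avoid confusion: the three-index form is meant for a genuinely three-dimensional cv::Mat; for an ordinary 2D BGR image the usual form remains image.at<cv::Vec3b>(y,x)[channel]. A minimal sketch of the 3D case, with sizes chosen arbitrarily for illustration:

#include <opencv2/core.hpp>

int main()
{
    int sizes[] = {480, 640, 3};                 // dimensions of a 3-D, single-channel 8-bit array
    cv::Mat m(3, sizes, CV_8U, cv::Scalar(0));

    m.at<uchar>(0, 0, 2) = 255;                  // write element (0, 0, 2)
    return m.at<uchar>(0, 0, 2) == 255 ? 0 : 1;  // read it back
}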

—━☆沉默づ 2025-01-03 17:19:58
uchar* value = img2.data;   // pointer to the first byte of pixel data (channels interleaved as B, G, R)
int r = 2;                  // channel counter, cycled 0..2 across the interleaved bytes

// Walk over every byte of the (assumed continuous) image buffer.
// Starting r at 2 produces the repeating byte pattern 255, 0, 0, so every
// pixel of a BGR image ends up as (B=255, G=0, R=0).
for (size_t i = 0; i < (size_t)(img2.cols * img2.rows * img2.channels()); i++)
{
        if (r > 2) r = 0;

        if (r == 0) value[i] = 0;
        if (r == 1) value[i] = 0;
        if (r == 2) value[i] = 255;

        r++;
}
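
For comparison, the same fill can be expressed per pixel rather than per byte; a minimal sketch using cv::Mat::forEach (OpenCV 3+), assuming img2 is a CV_8UC3 (BGR) image:

#include <opencv2/core.hpp>

// Hypothetical helper: set every pixel of an 8-bit BGR image to (B=255, G=0, R=0).
static void fillSolid(cv::Mat& img2)
{
    img2.forEach<cv::Vec3b>([](cv::Vec3b& px, const int* /*position*/) {
        px = cv::Vec3b(255, 0, 0);   // B=255, G=0, R=0
    });
}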
心碎的声音 2025-01-03 17:19:58
#include <cmath>
#include <opencv2/core.hpp>
#include <boost/math/constants/constants.hpp>

const double pi = boost::math::constants::pi<double>();

// Set the green channel to 255 for every pixel that lies inside the given rotated
// ellipse. Assumes `image` is an 8-bit, 3-channel (BGR) image.
cv::Mat distance2ellipse(cv::Mat image, cv::RotatedRect ellipse){
    float angle = ellipse.angle;                 // rotation in degrees
    cv::Point ellipse_center = ellipse.center;
    float major_axis = ellipse.size.width/2;
    float minor_axis = ellipse.size.height/2;

    for(int x = 0; x < image.cols; x++)
    {
        for(int y = 0; y < image.rows; y++)
        {
            // Transform (x, y) into the ellipse's local (unrotated) coordinate frame.
            auto u =  std::cos(angle*pi/180)*(x-ellipse_center.x) + std::sin(angle*pi/180)*(y-ellipse_center.y);
            auto v = -std::sin(angle*pi/180)*(x-ellipse_center.x) + std::cos(angle*pi/180)*(y-ellipse_center.y);

            // Normalized ellipse equation: a value <= 1 means the point is inside.
            float distance = (u/major_axis)*(u/major_axis) + (v/minor_axis)*(v/minor_axis);

            if(distance <= 1)
            {
                image.at<cv::Vec3b>(y,x)[1] = 255;   // channel 1 is green in BGR order
            }
        }
    }
    return image;
}
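
A hypothetical call of this helper could look like the following (the center, axes and angle are made up for illustration):

cv::Mat img = cv::imread("input.png", cv::IMREAD_COLOR);                     // hypothetical input
cv::RotatedRect e(cv::Point2f(100.f, 100.f), cv::Size2f(80.f, 40.f), 30.f);  // center, size, angle in degrees
img = distance2ellipse(img, e);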