Converting YUY2 IMFMediaBuffer data to RGB24 or RGB32

Published on 2025-01-11 07:19:20


I am reading frames from my web cam using the Media Foundation APIs.

IMFMediaType mediatype = null;
HResult hr = mSourceReaderAsync.GetNativeMediaType((int)MF_SOURCE_READER.FirstVideoStream, i, out mediatype);

returns only YUY2 media types, so the output of ReadSample gives YUY2 frames. I need to convert YUY2 to RGB24 (or a BitmapSource) to show in a WPF window.
This is my OnRead callback method:

 public HResult OnReadSample(HResult hrStatus, int dwStreamIndex, MF_SOURCE_READER_FLAG dwStreamFlags, long llTimestamp, IMFSample pSample)
{
    HResult hr = hrStatus;
    IMFMediaBuffer pBuffer = null;
    Stream s = null;
    JpegBitmapDecoder jpgdecoder = null; 
    BitmapSource cameraframe = null; 
    lock (this)
    {
        try
        {
            if (Succeeded(hr))
            {
                if (pSample != null)
                {
                    // Get the video frame buffer from the sample.
                    hr = pSample.GetBufferByIndex(0, out pBuffer);
                }
            }
            if (pBuffer != null)
            {
                pBuffer.Lock(out IntPtr ptr, out int maxlen, out int curlen);
                var arr = new byte[curlen];
                System.Runtime.InteropServices.Marshal.Copy(ptr, arr, 0, curlen);
                pBuffer.Unlock();
                if (s == null)
                    s = new MemoryStream(arr, writable: false);


                s.Flush();
                s.Seek(0, SeekOrigin.Begin);
                if (jpgdecoder == null)
                    jpgdecoder = new JpegBitmapDecoder(s, BitmapCreateOptions.None, BitmapCacheOption.OnLoad);

                var frame = jpgdecoder.Frames[0];
                cameraframe = frame;
            }
            dispatcher.Invoke(() =>
            {
                OnCapture.Invoke(this, cameraframe);
            });
            // Request the next frame.
            if (Succeeded(hr))
            {
                // Ask for the first sample.
                
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
        finally
        {
            SafeRelease(pBuffer);
            SafeRelease(pSample);
            dispatcher.Invoke(() =>
            {
                hr = mSourceReaderAsync.ReadSample((int)MF_SOURCE_READER.FirstVideoStream, 0, IntPtr.Zero, IntPtr.Zero, IntPtr.Zero, IntPtr.Zero);
            });
        }
    }

    return hr;
}

Now it raises an exception: {"No imaging component suitable to complete this operation was found."}


∞梦里开花 2025-01-18 07:19:20


I think you can use a Media Foundation Transform (MFT) to convert between types. You can refer to the samples from MF.Net by snarfle:
enter link description here
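As a rough illustration of the MFT route this answer suggests, here is a sketch that configures the stock Video Processor MFT through the MF.Net bindings to accept YUY2 input and produce RGB24 output. The CLSID and the MF.Net member names below are assumptions, not taken from the thread; verify them against mfidl.h and your MF.Net version before relying on them.

```csharp
using System;
using MediaFoundation;
using MediaFoundation.Transform;

static class Yuy2ToRgbMft
{
    // Assumed CLSID_VideoProcessorMFT -- verify against mfidl.h.
    static readonly Guid VideoProcessor = new Guid("88753B26-5B24-49BD-B2E7-0C445C78C982");

    public static IMFTransform Create(int width, int height)
    {
        // Instantiate the transform as a COM object.
        var xform = (IMFTransform)Activator.CreateInstance(
            Type.GetTypeFromCLSID(VideoProcessor));

        // Input: YUY2 at the camera's frame size.
        IMFMediaType inType;
        MFExtern.MFCreateMediaType(out inType);
        inType.SetGUID(MFAttributesClsid.MF_MT_MAJOR_TYPE, MFMediaType.Video);
        inType.SetGUID(MFAttributesClsid.MF_MT_SUBTYPE, MFMediaType.YUY2);
        inType.SetUINT64(MFAttributesClsid.MF_MT_FRAME_SIZE,
            ((long)width << 32) | (uint)height);   // packed width|height
        xform.SetInputType(0, inType, MFTSetTypeFlags.None);

        // Output: RGB24 at the same size.
        IMFMediaType outType;
        MFExtern.MFCreateMediaType(out outType);
        outType.SetGUID(MFAttributesClsid.MF_MT_MAJOR_TYPE, MFMediaType.Video);
        outType.SetGUID(MFAttributesClsid.MF_MT_SUBTYPE, MFMediaType.RGB24);
        outType.SetUINT64(MFAttributesClsid.MF_MT_FRAME_SIZE,
            ((long)width << 32) | (uint)height);
        xform.SetOutputType(0, outType, MFTSetTypeFlags.None);
        return xform;
    }
    // Per frame: call xform.ProcessInput(0, pSample, 0), then drain the
    // converted frame with xform.ProcessOutput(...) into a caller-allocated sample.
}
```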

z祗昰~ 2025-01-18 07:19:20


You have several options for converting frames.

  1. On your CPU. I do not recommend this for obvious reasons, but if the image preview is not your main feature, you can get away with this. Extract the buffer like you are doing and convert the data array manually, preferably with array operations.
  2. On hardware. Two options come to mind. The first is OpenGL. Assign your pixel buffer to a buffer in OpenGL and, with a shader, you can convert the pixels from YUY2 to RGB. To do so, use the YUY2 spec.
  3. The other option on hardware is using MFTs. I do not really know why you do not want to do this; it is fast. I see that you are implementing this in C#, which is quite the challenge, but I have done this before, proving it is possible. I can add a small code sample if you prefer this, but I suggest figuring this out with the docs. The key is to configure your transforms, then start the transform (set the pipeline to take in frames), and then pipe buffers from the samples you read into the transform, after which you extract the result from the transform.

I would recommend MFTs in your scenario. Maybe try to set up DirectX or OpenGL for displaying; your solution with BitmapSources can be slow for higher resolutions.
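For option 1 (CPU conversion), recall that YUY2 packs two pixels into four bytes as Y0 U Y1 V, with the two pixels sharing one U/V pair. A minimal sketch of the conversion using BT.601 integer coefficients follows; the helper names are mine, not from the thread.

```csharp
using System;

// Hypothetical helper: converts a packed YUY2 buffer to RGB24
// using BT.601 studio-swing integer math.
static class Yuy2
{
    static byte Clip(int x) => (byte)(x < 0 ? 0 : x > 255 ? 255 : x);

    // Converts one Y/U/V triple to an R, G, B triple.
    public static (byte R, byte G, byte B) PixelToRgb(byte y, byte u, byte v)
    {
        int c = y - 16, d = u - 128, e = v - 128;
        return (Clip((298 * c + 409 * e + 128) >> 8),
                Clip((298 * c - 100 * d - 208 * e + 128) >> 8),
                Clip((298 * c + 516 * d + 128) >> 8));
    }

    // src holds width*height*2 bytes of YUY2; returns width*height*3 bytes of RGB24.
    public static byte[] ToRgb24(byte[] src, int width, int height)
    {
        var dst = new byte[width * height * 3];
        for (int i = 0, o = 0; i < width * height * 2; i += 4)
        {
            byte u = src[i + 1], v = src[i + 3];
            foreach (var y in new[] { src[i], src[i + 2] })  // two pixels share U/V
            {
                var (r, g, b) = PixelToRgb(y, u, v);
                dst[o++] = r; dst[o++] = g; dst[o++] = b;
            }
        }
        return dst;
    }
}
```

The resulting RGB24 array can be handed to `BitmapSource.Create` with a stride of `width * 3` for display, which replaces the JpegBitmapDecoder step entirely (the raw YUY2 buffer is not JPEG data, which is why the decoder reports that no suitable imaging component was found).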
