Is there a way to fake a DirectShow filter in a program?


I have an IP camera from which I receive a char buffer containing an image over the network. I can't access the data until I set up a connection to the camera in a program. I am trying to dissect the Windows source filter sample code, but I'm not making much progress, so I thought I'd ask whether it is possible to take a buffer like that and wrap it in something that could then connect a pin to an AVI Splitter (or similar) in DirectShow/.NET.

(video buffer from IP Cam) -> (???) -> (AVI Splitter) -> (Profit)

Update

I have my program capturing video in one namespace, and I have the code from the GSSF (Generic Sample Source Filter) in its own namespace. I pass a pointer with an image from the camera namespace to the GSSF namespace. This happens only once, so the graph streams from that single image while the camera keeps streaming from the network. Is there a way to continually pass the buffer from the camera to the GSSF, or should I combine the namespaces somehow? I tried handing the main camera pointer to the GSSF, but it crashed because the GSSF reads from the pointer while the camera is writing to it. Maybe I should grab an image, pass the pointer, and wait before grabbing a new one?
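
For illustration, the kind of hand-off described above could look like the following sketch: a lock-protected front buffer that the camera thread publishes into and the GSSF provider copies out of. The FrameExchange class and all of its names are invented for the example, not taken from my project.

// Hypothetical helper: a lock-protected hand-off between the camera thread
// (producer) and the GSSF bitmap provider (consumer).
// Requires: using System; using System.Runtime.InteropServices;
public sealed class FrameExchange
{
    private readonly object gate = new object();
    private readonly byte[] front;   // last complete frame, read by the graph
    private bool hasNewFrame;

    public FrameExchange(int frameSize)
    {
        front = new byte[frameSize];
    }

    // Camera thread: copy the SDK buffer out while it is still stable.
    public void Publish(IntPtr src, int length)
    {
        lock (gate)
        {
            Marshal.Copy(src, front, 0, Math.Min(length, front.Length));
            hasNewFrame = true;
        }
    }

    // Source filter's GetImage callback: take a private copy to render from.
    public bool TryRead(byte[] dest)
    {
        lock (gate)
        {
            if (!hasNewFrame)
                return false;
            Buffer.BlockCopy(front, 0, dest, 0, Math.Min(dest.Length, front.Length));
            hasNewFrame = false;
            return true;
        }
    }
}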

*Update*

I shrunk my code, and now that I look at it I don't believe I'm handling the namespaces correctly either.

namespace Cam_Controller
{
    static byte[] mainbyte = new byte[1280*720*2];
    static IntPtr main_ptr = new IntPtr();

    //(this function is threaded)
    static void Trial(NPvBuffer mBuffer, NPvDisplayWnd mDisplayWnd, VideoCompression compressor)
    {
        Functions function = new Functions();
        Defines define = new Defines();
        NPvResult operationalResult = new NPvResult();
        VideoCompression mcompressor = new VideoCompression();

        int framecount = 0;
        while (!Stopping && AcquiringImages)
        {
            Mutex lock_video = new Mutex();
            NPvResult result = mDevice.RetrieveNextBuffer(mBuffer, operationalResult);

            if(result.isOK())
            {
                framecount++;
                wer = (int)mDisplayWnd.Display(mBuffer, wer);

                main_ptr = (IntPtr)mBuffer.GetMarshalledBuffer();
                Marshal.Copy(main_ptr, mainbyte, 0, 720 * 2560);
            }
        }
    }
    private void button7_Click(object sender, EventArgs e)
    {
        IntPtr dd = (IntPtr)mBuffer.GetMarshalledBuffer();
        Marshal.Copy(dd, main_byte1, 0, 720 * 2560);
        play = new VisiCam_Controller.DxPlay.DxPlay("", panel9, main_byte1);
        play.Start();
    }


    namespace DxPlay
    {
        public class DxPlay
        {
            public DxPlay(string sPath, Control hWin, byte[] color)
            {
                try
                {
                    // pick one of our image providers
                    //m_ImageHandler = new ImageFromFiles(sPath, 24);
                    m_ImageHandler = new ImageFromPixels(20, color);
                    //m_ImageHandler = new ImageFromMpg(@"c:\c1.mpg");
                    //m_ImageHandler = new ImageFromMpg(sPath);
                    //m_ImageHandler = new ImageFromMP3(@"c:\vss\media\track3.mp3");

                // Set up the graph
                    SetupGraph(hWin);
                }
                catch
                {
                    Dispose();
                    throw;
                }
            }
        }
        abstract internal class imagehandler
        internal class imagefrompixels
        {
            private int[] mainint = new int[720 * 1280];
            unsafe public ImageFromPixels(long FPS, byte[] x)
            {
                long fff = 720 * 1280 * 3;
                mainptr = new IntPtr(fff);
                for (int p = 0; p < 720 * 640; p++)
                {
                    U = (x[ p * 4 + 0]);

                    Y = (x[p * 4 + 1]);
                    V = (x[p * 4 + 2]);
                    Y2 = (x[p * 4 + 3]);

                    int one = V << 16 | Y << 8 | U;
                    int two = V << 16 | Y2 << 8 | U;
                    mainint[p * 2 + 0] = one;
                    mainint[p * 2 + 1] = two;

                }

                m_FPS = UNIT / FPS;
                m_b = 211;
                m_g = 197;
            }
        }
    }
}

There's also GetImage, but that is essentially the same: copy the buffer into the pointer. What happens is that I grab a buffer holding the image and send it to the DxPlay class. DxPlay can process it and put it on the DirectShow graph with no problems, but the picture never updates, because it is just a single buffer. If I instead send DxPlay an IntPtr holding the address of the image buffer, the code crashes on a memory access. I assume the ImageFromPixels code (not shown above; it changes

(x[p * 4 + #]) 

to

(IntPtr)((x passed as an IntPtr).ToInt64() + p * 4 + #)

)
is reading the pointer's memory while the Cam_Controller class is writing to it. I make and pass copies of the IntPtrs, and new IntPtrs, but they fail halfway through the conversion.
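
For illustration, one way to avoid that race, assuming the frame really is 1280x720 YUY2 as the 1280*720*2 buffer size suggests, is to snapshot the unmanaged buffer into a managed array in one bulk copy and then run the per-pixel loop over the stable copy. FrameRepacker is an invented name; the packing matches the ImageFromPixels loop above.

// Sketch: snapshot the camera's unmanaged buffer once, then repack from the copy.
// Requires: using System; using System.Runtime.InteropServices;
class FrameRepacker
{
    // 1280x720 YUY2 is 2 bytes per pixel, matching the 1280*720*2 buffer above.
    private readonly byte[] snapshot = new byte[1280 * 720 * 2];
    private readonly int[] packed = new int[1280 * 720];

    public int[] Repack(IntPtr camBuffer)
    {
        // One bulk copy; after this the camera thread may overwrite its buffer.
        Marshal.Copy(camBuffer, snapshot, 0, snapshot.Length);

        // U Y V Y2 per 4-byte macropixel, same indexing as ImageFromPixels above.
        for (int p = 0; p < snapshot.Length / 4; p++)
        {
            byte u  = snapshot[p * 4 + 0];
            byte y  = snapshot[p * 4 + 1];
            byte v  = snapshot[p * 4 + 2];
            byte y2 = snapshot[p * 4 + 3];

            packed[p * 2 + 0] = v << 16 | y  << 8 | u;
            packed[p * 2 + 1] = v << 16 | y2 << 8 | u;
        }
        return packed;
    }
}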


Answer (层林尽染, 2024-12-06 02:42:04)


If you want to do this in .NET, the following steps are needed:

  1. Use the DirectShow.NET Generic Sample Source Filter (GSSF.AX) from the Misc/GSSF directory within the sample package. A source filter is always a COM module, so you need to register it too using "RegSvr32 GSSF.ax".

  2. Implement a bitmap provider in .NET

  3. Set up a graph, and connect the pin from the GSSF to the bitmap provider implementation.

  4. Pray.

I am using the following within a project, and have made it reusable for future use.

The code (not the best, and not finished, but a working start; this takes an IVideoSource, which is below):

public class VideoSourceToVideo : IDisposable
{
    object locker = new object();

    public event EventHandler<EventArgs> Starting;
    public event EventHandler<EventArgs> Stopping;
    public event EventHandler<EventArgs> Completed;

    /// <summary> graph builder interface. </summary>
    private DirectShowLib.ICaptureGraphBuilder2 captureGraphBuilder = null;
    DirectShowLib.IMediaControl mediaCtrl = null;
    IMediaEvent mediaEvent = null;
    bool stopMediaEventLoop = false;
    Thread mediaEventThread;

    /// <summary> Dimensions of the image, calculated once in constructor. </summary>
    private readonly VideoInfoHeader videoInfoHeader;

    IVideoSource source;

    public VideoSourceToVideo(IVideoSource source, string destFilename, string encoderName)
    {
        try
        {
            this.source = source;

            // Set up the capture graph
            SetupGraph(destFilename, encoderName);
        }
        catch
        {
            Dispose();
            throw;
        }
    }


    /// <summary> release everything. </summary>
    public void Dispose()
    {
        StopMediaEventLoop();
        CloseInterfaces();
    }

    /// <summary> build the capture graph for grabber. </summary>
    private void SetupGraph(string destFilename, string encoderName)
    {
        int hr;

        // Get the graphbuilder object
        captureGraphBuilder = new DirectShowLib.CaptureGraphBuilder2() as DirectShowLib.ICaptureGraphBuilder2;

        IFilterGraph2 filterGraph = new DirectShowLib.FilterGraph() as DirectShowLib.IFilterGraph2;

        mediaCtrl = filterGraph as DirectShowLib.IMediaControl;
        IMediaFilter mediaFilt = filterGraph as IMediaFilter;
        mediaEvent = filterGraph as IMediaEvent;



        captureGraphBuilder.SetFiltergraph(filterGraph);

        IBaseFilter aviMux;
        IFileSinkFilter fileSink = null;
        hr = captureGraphBuilder.SetOutputFileName(MediaSubType.Avi, destFilename, out aviMux, out fileSink);
        DsError.ThrowExceptionForHR(hr);

        DirectShowLib.IBaseFilter compressor = DirectShowUtils.GetVideoCompressor(encoderName);

        if (compressor == null)
        {
            throw new InvalidCodecException(encoderName);
        }


        hr = filterGraph.AddFilter(compressor, "compressor");
        DsError.ThrowExceptionForHR(hr);


        // Our data source
        IBaseFilter source = (IBaseFilter)new GenericSampleSourceFilter();

        // Get the pin from the filter so we can configure it
        IPin ipin = DsFindPin.ByDirection(source, PinDirection.Output, 0);

        try
        {
            // Configure the pin using the provided BitmapInfo
            ConfigurePusher((IGenericSampleConfig)ipin);
        }
        finally
        {
            Marshal.ReleaseComObject(ipin);
        }

        // Add the source filter to the graph (once)
        hr = filterGraph.AddFilter(source, "GenericSampleSourceFilter");
        DsError.ThrowExceptionForHR(hr);

        hr = captureGraphBuilder.RenderStream(null, null, source, compressor, aviMux);
        DsError.ThrowExceptionForHR(hr);

        IMediaPosition mediaPos = filterGraph as IMediaPosition;

        hr = mediaCtrl.Run();
        DsError.ThrowExceptionForHR(hr);
    }

    private void ConfigurePusher(IGenericSampleConfig ips)
    {
        int hr;

        source.SetMediaType(ips);

        // Specify the callback routine to call with each sample
        hr = ips.SetBitmapCB(source);
        DsError.ThrowExceptionForHR(hr);
    }


    private void StartMediaEventLoop()
    {
        mediaEventThread = new Thread(MediaEventLoop)
        {
            Name = "Offscreen Vid Player Medialoop",
            IsBackground = false
        };

        mediaEventThread.Start();
    }

    private void StopMediaEventLoop()
    {
        stopMediaEventLoop = true;

        if (mediaEventThread != null)
        {
            mediaEventThread.Join();
        }
    }

    public void MediaEventLoop()
    {
        MediaEventLoop(x => PercentageCompleted = x);
    }

    public double PercentageCompleted
    {
        get;
        private set;
    }

    // FIXME this needs some work, to be completely in-tune with needs.
    public void MediaEventLoop(Action<double> UpdateProgress)
    {
        mediaEvent.CancelDefaultHandling(EventCode.StateChange);
        //mediaEvent.CancelDefaultHandling(EventCode.Starvation);

        while (stopMediaEventLoop == false)
        {
            try
            {
                EventCode ev;

                IntPtr p1, p2;
                if (mediaEvent.GetEvent(out ev, out p1, out p2, 0) == 0)
                {
                    switch (ev)
                    {
                        case EventCode.Complete:
                            Stopping.Fire(this, null);
                            if (UpdateProgress != null)
                            {
                                UpdateProgress(source.PercentageCompleted);
                            }
                            return;


                        case EventCode.StateChange:
                            FilterState state = (FilterState)p1.ToInt32();

                            if (state == FilterState.Stopped || state == FilterState.Paused)
                            {
                                Stopping.Fire(this, null);
                            }
                            else if (state == FilterState.Running)
                            {
                                Starting.Fire(this, null);
                            }

                            break;

                        // FIXME add abort and stuff, and propagate this.
                    }

                    //                        Trace.WriteLine(ev.ToString() + " " + p1.ToInt32());

                    mediaEvent.FreeEventParams(ev, p1, p2);
                }
                else
                {
                    if (UpdateProgress != null)
                    {
                        UpdateProgress(source.PercentageCompleted);
                    }
                    // FiXME use AutoResetEvent
                    Thread.Sleep(100);
                }
            }
            catch (Exception e)
            {
                Trace.WriteLine("MediaEventLoop: " + e);
            }
        }
    }

    /// <summary> Shut down capture </summary>
    private void CloseInterfaces()
    {
        int hr;

        try
        {
            if (mediaCtrl != null)
            {
                // Stop the graph
                hr = mediaCtrl.Stop();
                mediaCtrl = null;
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex);
        }

        if (captureGraphBuilder != null)
        {
            Marshal.ReleaseComObject(captureGraphBuilder);
            captureGraphBuilder = null;
        }

        GC.Collect();
    }

    public void Start()
    {
        StartMediaEventLoop();
    }
}
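
A minimal usage sketch (not from the original project: the file paths and the "MJPEG Compressor" encoder name are placeholders, and RawVideoSource is defined further down):

// Hypothetical usage: encode a raw capture file to AVI through the GSSF graph.
IVideoSource src = new RawVideoSource(@"c:\capture.raw");   // concrete source, see below
using (VideoSourceToVideo conv = new VideoSourceToVideo(src, @"c:\out.avi", "MJPEG Compressor"))
{
    conv.Start();   // the graph is already running; this starts the event loop
    while (conv.PercentageCompleted < 100.0)
        Thread.Sleep(250);
}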

IVideoSource:

public interface IVideoSource : IGenericSampleCB
{
    double PercentageCompleted { get; }
    int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead);
    void SetMediaType(global::IPerform.Video.Conversion.Interops.IGenericSampleConfig psc);
    int SetTimeStamps(global::DirectShowLib.IMediaSample pSample, int iFrameNumber);
}

ImageVideoSource (mostly taken from DirectShow.NET examples):

// A generic class to support easily changing between my different sources of data.
// Note: You DON'T have to use this class, or anything like it.  The key is the SampleCallback
// routine.  How/where you get your bitmaps is ENTIRELY up to you.  Having SampleCallback call
// members of this class was just the approach I used to isolate the data handling.
public abstract class ImageVideoSource : IDisposable, IVideoSource
{
    #region Definitions

    /// <summary>
    /// 100 ns - used by a number of DS methods
    /// </summary>
    private const long UNIT = 10000000;

    #endregion

    /// <summary>
    /// Number of callbacks that returned a positive result
    /// </summary>
    private int m_iFrameNumber = 0;

    virtual public void Dispose()
    {
    }

    public abstract double PercentageCompleted { get; protected set; }

    abstract public void SetMediaType(IGenericSampleConfig psc);
    abstract public int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead);
    virtual public int SetTimeStamps(IMediaSample pSample, int iFrameNumber)
    {
        return 0;
    }

    /// <summary>
    /// Called by the GenericSampleSourceFilter.  This routine populates the MediaSample.
    /// </summary>
    /// <param name="pSample">Pointer to a sample</param>
    /// <returns>0 = success, 1 = end of stream, negative values for errors</returns>
    virtual public int SampleCallback(IMediaSample pSample)
    {
        int hr;
        IntPtr pData;

        try
        {
            // Get the buffer into which we will copy the data
            hr = pSample.GetPointer(out pData);
            if (hr >= 0)
            {
                // Set TRUE on every sample for uncompressed frames
                hr = pSample.SetSyncPoint(true);
                if (hr >= 0)
                {
                    // Find out the amount of space in the buffer
                    int cbData = pSample.GetSize();

                    hr = SetTimeStamps(pSample, m_iFrameNumber);
                    if (hr >= 0)
                    {
                        int iRead;

                        // Copy the data into the sample
                        hr = GetImage(m_iFrameNumber, pData, cbData, out iRead);
                        if (hr == 0) // 1 == End of stream
                        {
                            pSample.SetActualDataLength(iRead);

                            // increment the frame number for next time
                            m_iFrameNumber++;
                        }
                    }
                }
            }
        }
        finally
        {
            // Release our pointer to the media sample.  THIS IS ESSENTIAL!  If
            // you don't do this, the graph will stop after about 2 samples.
            Marshal.ReleaseComObject(pSample);
        }

        return hr;
    }
}

RawVideoSource (an example of a concrete managed source generator for a DirectShow pipeline):

internal class RawVideoSource : ImageVideoSource
{
    private byte[] buffer;
    private byte[] demosaicBuffer;
    private RawVideoReader reader;

    public override double PercentageCompleted
    {
        get;
        protected set;
    }

    public RawVideoSource(string sourceFile)
    {
        reader = new RawVideoReader(sourceFile);
    }

    override public void SetMediaType(IGenericSampleConfig psc)
    {
        BitmapInfoHeader bmi = new BitmapInfoHeader();

        bmi.Size = Marshal.SizeOf(typeof(BitmapInfoHeader));
        bmi.Width = reader.Header.VideoSize.Width;
        bmi.Height = reader.Header.VideoSize.Height;
        bmi.Planes = 1;
        bmi.BitCount = 24;
        bmi.Compression = 0;
        bmi.ImageSize = (bmi.BitCount / 8) * bmi.Width * bmi.Height;
        bmi.XPelsPerMeter = 0;
        bmi.YPelsPerMeter = 0;
        bmi.ClrUsed = 0;
        bmi.ClrImportant = 0;

        int hr = psc.SetMediaTypeFromBitmap(bmi, 0);

        buffer = new byte[reader.Header.FrameSize];
        demosaicBuffer = new byte[reader.Header.FrameSize * 3];

        DsError.ThrowExceptionForHR(hr);
    }

    long startFrameTime;
    long endFrameTime;
    unsafe override public int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead)
    {
        int hr = 0;

        if (iFrameNumber < reader.Header.NumberOfFrames)
        {
            reader.ReadFrame(buffer, iFrameNumber, out startFrameTime, out endFrameTime);

            Demosaic.DemosaicGBGR24Bilinear(buffer, demosaicBuffer, reader.Header.VideoSize);

            Marshal.Copy(demosaicBuffer, 0, ip, reader.Header.FrameSize * 3);

            PercentageCompleted = ((double)iFrameNumber / reader.Header.NumberOfFrames) * 100.0;
        }
        else
        {
            PercentageCompleted = 100;

            hr = 1; // End of stream
        }

        iRead = iSize;

        return hr;
    }

    override public int SetTimeStamps(IMediaSample pSample, int iFrameNumber)
    {
        reader.ReadTimeStamps(iFrameNumber, out startFrameTime, out endFrameTime);

        DsLong rtStart = new DsLong(startFrameTime);
        DsLong rtStop = new DsLong(endFrameTime);

        int hr = pSample.SetTime(rtStart, rtStop);

        return hr;
    }
}
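
Tying this back to the question: for a live IP-camera feed the IVideoSource does not need a file at all; it can serve whatever frame the capture thread last published. The sketch below is an assumption-laden illustration (1280x720 24-bit RGB and roughly 30 fps are made up; the second argument to SetMediaTypeFromBitmap follows the UNIT / FPS frame-duration convention the GSSF samples use):

// Hypothetical live source: serves the frame the camera thread last published.
// Format (1280x720, 24-bit RGB) and frame rate are assumptions for the sketch.
internal class LiveCameraSource : ImageVideoSource
{
    private const long UNIT = 10000000;  // 100 ns units, as in ImageVideoSource
    private readonly object gate = new object();
    private readonly byte[] frame = new byte[1280 * 720 * 3];

    public override double PercentageCompleted { get; protected set; }

    // Called from the camera thread whenever a converted RGB frame is ready.
    public void PublishFrame(byte[] rgb)
    {
        lock (gate)
            Buffer.BlockCopy(rgb, 0, frame, 0, Math.Min(rgb.Length, frame.Length));
    }

    public override void SetMediaType(IGenericSampleConfig psc)
    {
        BitmapInfoHeader bmi = new BitmapInfoHeader();
        bmi.Size = Marshal.SizeOf(typeof(BitmapInfoHeader));
        bmi.Width = 1280;
        bmi.Height = 720;
        bmi.Planes = 1;
        bmi.BitCount = 24;
        bmi.ImageSize = (bmi.BitCount / 8) * bmi.Width * bmi.Height;

        // Frame duration in 100 ns units, following the UNIT / FPS convention.
        int hr = psc.SetMediaTypeFromBitmap(bmi, UNIT / 30);
        DsError.ThrowExceptionForHR(hr);
    }

    public override int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead)
    {
        // Copy the latest published frame into the media sample's buffer.
        lock (gate)
            Marshal.Copy(frame, 0, ip, Math.Min(iSize, frame.Length));
        iRead = Math.Min(iSize, frame.Length);
        return 0;  // a live stream never reports end-of-stream
    }
}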

And the interop definitions for the GSSF.AX COM module:

namespace IPerform.Video.Conversion.Interops
{
    [ComImport, Guid("6F7BCF72-D0C2-4449-BE0E-B12F580D056D")]
    public class GenericSampleSourceFilter
    {
    }

    [InterfaceType(ComInterfaceType.InterfaceIsIUnknown),
    Guid("33B9EE57-1067-45fa-B12D-C37517F09FC0")]
    public interface IGenericSampleCB
    {
        [PreserveSig]
        int SampleCallback(IMediaSample pSample);
    }

    [Guid("CE50FFF9-1BA8-4788-8131-BDE7D4FFC27F"),
    InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
    public interface IGenericSampleConfig
    {
        [PreserveSig]
        int SetMediaTypeFromBitmap(BitmapInfoHeader bmi, long lFPS);

        [PreserveSig]
        int SetMediaType([MarshalAs(UnmanagedType.LPStruct)] AMMediaType amt);

        [PreserveSig]
        int SetMediaTypeEx([MarshalAs(UnmanagedType.LPStruct)] AMMediaType amt, int lBufferSize);

        [PreserveSig]
        int SetBitmapCB(IGenericSampleCB pfn);
    }
}

Good luck, try to get it working using this. Or comment with further questions so we can iron out other issues.
