I have an IP camera that receives a char buffer containing an image over the network. I can't access it until I set up the connection to it in a program. I am trying to dissect the Windows source filter code, but I'm not making much progress, so I thought I'd ask whether it is possible to just take a buffer like that and cast it into something whose pin can then be connected to an AVI Splitter (or similar) in DirectShow/.NET.
(video buffer from IP Cam) -> (???) -> (AVI Splitter) -> (Profit)
Update
I have my program capturing video in one namespace, and I have the code from the GSSF in its own namespace. I pass a pointer with an image from the cam namespace to the GSSF namespace. This only happens once, but the graph streams from that one image, while the camera keeps streaming from the network. Is there a way to continually pass the buffer from the cam to the GSSF, or should I combine the namespaces somehow? I tried sending the main camera pointer to the GSSF, but it crashed, because the GSSF was accessing the pointer while it was being written to. Maybe if I grabbed an image, passed the pointer, and waited before grabbing a new one?
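Is something like this the right idea? (Only a sketch of what I mean; frame_lock, shared_frame and the two methods are made-up names, not code I actually have working:)

// Sketch: the camera thread copies each new frame into a shared buffer under a
// lock, and the GSSF-side provider copies it back out under the same lock.
// Requires System.Runtime.InteropServices for Marshal.
static readonly object frame_lock = new object();
static byte[] shared_frame = new byte[1280 * 720 * 2];

// Camera thread (Cam_Controller side): called once per frame from the network.
static void OnNewCameraFrame(IntPtr camBuffer)
{
    lock (frame_lock)
    {
        Marshal.Copy(camBuffer, shared_frame, 0, shared_frame.Length);
    }
}

// Graph side (GSSF provider): called once per output frame.
static void CopyLatestFrame(byte[] dest)
{
    lock (frame_lock)
    {
        Buffer.BlockCopy(shared_frame, 0, dest, 0, dest.Length);
    }
}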
Update
I shrank my code, and now that I look at it, I don't believe I'm handling the namespaces correctly either.
namespace Cam_Controller
{
    static byte[] mainbyte = new byte[1280 * 720 * 2];
    static IntPtr main_ptr = new IntPtr();

    // (this function is threaded)
    static void Trial(NPvBuffer mBuffer, NPvDisplayWnd mDisplayWnd, VideoCompression compressor)
    {
        Functions function = new Functions();
        Defines define = new Defines();
        NPvResult operationalResult = new NPvResult();
        VideoCompression mcompressor = new VideoCompression();
        int framecount = 0;
        while (!Stopping && AcquiringImages)
        {
            Mutex lock_video = new Mutex();
            NPvResult result = mDevice.RetrieveNextBuffer(mBuffer, operationalResult);
            if (result.isOK())
            {
                framecount++;
                wer = (int)mDisplayWnd.Display(mBuffer, wer);
                main_ptr = (IntPtr)mBuffer.GetMarshalledBuffer();
                Marshal.Copy(main_ptr, mainbyte, 0, 720 * 2560);
            }
        }
    }

    private void button7_Click(object sender, EventArgs e)
    {
        IntPtr dd = (IntPtr)mBuffer.GetMarshalledBuffer();
        Marshal.Copy(dd, main_byte1, 0, 720 * 2560);
        play = new VisiCam_Controller.DxPlay.DxPlay("", panel9, main_byte1);
        play.Start();
    }

    namespace DxPlay
    {
        public class DxPlay
        {
            public DxPlay(string sPath, Control hWin, byte[] color)
            {
                try
                {
                    // pick one of our image providers
                    //m_ImageHandler = new ImageFromFiles(sPath, 24);
                    m_ImageHandler = new ImageFromPixels(20, color);
                    //m_ImageHandler = new ImageFromMpg(@"c:\c1.mpg");
                    //m_ImageHandler = new ImageFromMpg(sPath);
                    //m_ImageHandler = new ImageFromMP3(@"c:\vss\media\track3.mp3");

                    // Set up the graph
                    SetupGraph(hWin);
                }
                catch
                {
                    Dispose();
                    throw;
                }
            }
        }

        abstract internal class ImageHandler
        internal class ImageFromPixels
        {
            private int[] mainint = new int[720 * 1280];

            unsafe public ImageFromPixels(long FPS, byte[] x)
            {
                long fff = 720 * 1280 * 3;
                mainptr = new IntPtr(fff);
                for (int p = 0; p < 720 * 640; p++)
                {
                    U  = (x[p * 4 + 0]);
                    Y  = (x[p * 4 + 1]);
                    V  = (x[p * 4 + 2]);
                    Y2 = (x[p * 4 + 3]);
                    int one = V << 16 | Y << 8 | U;
                    int two = V << 16 | Y2 << 8 | U;
                    mainint[p * 2 + 0] = one;
                    mainint[p * 2 + 1] = two;
                }
                m_FPS = UNIT / FPS;
                m_b = 211;
                m_g = 197;
            }
        }
    }
}
There's also GetImage, but that's much the same: copy the buffer into the pointer. What happens is that I grab one buffer of the image and send it to the DxPlay class. DxPlay can process it and put it on the DirectShow pipeline with no problems, but the picture never updates, because it's just a single buffer. If I instead send DxPlay an IntPtr holding the address of the image buffer, the code crashes on a memory access. I assume this is because the ImageFromPixels code (not shown above; it changes

(x[p * 4 + #])

to

(IntPtr)((x passed as an IntPtr).ToInt64() + p * 4 + #)

) is reading the pointer's memory while the Cam_Controller class is writing to it. I make and pass copies of the IntPtrs, and new IntPtrs, but they fail halfway through the conversion.
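For clarity, this is roughly what that change looks like inside ImageFromPixels when x arrives as an IntPtr instead of a byte[] (just a sketch; ptr is a made-up name for the passed-in pointer):

// Sketch of the IntPtr variant (requires System.Runtime.InteropServices):
// read each byte through the pointer instead of indexing the managed array.
U  = Marshal.ReadByte(ptr, p * 4 + 0);
Y  = Marshal.ReadByte(ptr, p * 4 + 1);
V  = Marshal.ReadByte(ptr, p * 4 + 2);
Y2 = Marshal.ReadByte(ptr, p * 4 + 3);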
If you want to do this in .NET, the following steps are needed:
Use the DirectShow.NET Generic Sample Source Filter (GSSF.AX) from the Misc/GSSF directory within the sample package. A source filter is always a COM module, so you need to register it too using "RegSvr32 GSSF.ax".
Implement a bitmap provider in .NET
Set up a graph, and connect the pin from the GSSF to the implementation of the bitmap provider.
Pray.
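To sketch how steps 1 and 3 fit together in code (this is only an outline: the IGenericSampleConfig interface and its methods come from the GSSF sample's interop and should be verified against your copy of the sample package, and the CLSID below is a placeholder to be copied from GSSF.idl):

// Outline only - requires DirectShowLib plus the GSSF interop declarations.
const long UNIT = 10000000;                        // 100 ns units per second

IGraphBuilder graph = (IGraphBuilder)new FilterGraph();

// Create the GSSF source filter by its CLSID (registered with "RegSvr32 GSSF.ax").
Guid gssfClsid = new Guid("00000000-0000-0000-0000-000000000000"); // placeholder: copy the real CLSID from GSSF.idl
IBaseFilter source = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(gssfClsid));
graph.AddFilter(source, "GSSF");

// Describe the frames and register the provider that will fill each sample.
IGenericSampleConfig config = (IGenericSampleConfig)source;
config.SetMediaTypeFromBitmap(720, 1280, UNIT / 25);  // height, width, frame duration in 100 ns units
config.SetBitmapCB(provider);                         // provider: your IGenericSampleCB implementation

// Connect the GSSF output pin onward (AVI Splitter, renderer, ...).
IPin outPin = DsFindPin.ByDirection(source, PinDirection.Output, 0);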
I am using the following within a project and have made it reusable for future use.
The code (not the best, and not finished, but a working start); this takes an IVideoSource, which is below:
IVideoSource:
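Roughly, an interface along these lines will do; the exact members are a sketch of the idea, so adapt them as needed:

// Sketch of a minimal video-source abstraction.
public interface IVideoSource
{
    int Width { get; }
    int Height { get; }
    long FrameDuration { get; }                   // per-frame duration in 100 ns units

    // Copy the next frame into the sample buffer; return the number of bytes
    // written, or 0 to signal the end of the stream.
    int GetNextFrame(IntPtr buffer, int bufferSize);
}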
ImageVideoSource (mostly taken from DirectShow.NET examples):
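A stand-in sketch of the idea, modelled on the ImageHandler class from the DirectShow.NET DxPlay example (check IGenericSampleCB and the IMediaSample calls against the sample; IVideoSource is the sketch above):

// Sketch: the object registered with SetBitmapCB implements IGenericSampleCB,
// and DirectShow calls SampleCallback for every sample the GSSF pin delivers.
internal class ImageVideoSource : IGenericSampleCB
{
    private readonly IVideoSource m_source;

    public ImageVideoSource(IVideoSource source)
    {
        m_source = source;
    }

    public int SampleCallback(IMediaSample pSample)
    {
        try
        {
            IntPtr pBuffer;
            pSample.GetPointer(out pBuffer);

            int written = m_source.GetNextFrame(pBuffer, pSample.GetSize());
            if (written <= 0)
                return 1;                          // S_FALSE: no more frames

            pSample.SetActualDataLength(written);
            pSample.SetSyncPoint(true);
            return 0;                              // S_OK
        }
        finally
        {
            Marshal.ReleaseComObject(pSample);
        }
    }
}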
RawVideoSource (an example of a concrete managed source generator for a DirectShow pipeline):
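Again only a stand-in sketch: a concrete source that serves whatever raw frames get pushed into it from elsewhere (for the question above, the IP-camera thread would call PushFrame):

// Sketch: latest-frame buffer shared between the capture thread and the graph.
public class RawVideoSource : IVideoSource
{
    private readonly object m_lock = new object();
    private readonly byte[] m_frame;

    public RawVideoSource(int width, int height, int bytesPerPixel, long frameDuration)
    {
        Width = width;
        Height = height;
        FrameDuration = frameDuration;
        m_frame = new byte[width * height * bytesPerPixel];
    }

    public int Width { get; private set; }
    public int Height { get; private set; }
    public long FrameDuration { get; private set; }

    // Called by the capture thread whenever a new frame arrives.
    public void PushFrame(byte[] data)
    {
        lock (m_lock)
        {
            Buffer.BlockCopy(data, 0, m_frame, 0, m_frame.Length);
        }
    }

    // Called by the DirectShow side for every output sample.
    public int GetNextFrame(IntPtr buffer, int bufferSize)
    {
        lock (m_lock)
        {
            int count = Math.Min(bufferSize, m_frame.Length);
            Marshal.Copy(m_frame, 0, buffer, count);
            return count;
        }
    }
}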
And the interops to the GSSF.AX COM:
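A sketch of what those interop declarations look like; the real IIDs, the full method list, and the method order (COM vtable order matters) must be copied from GSSF.idl in the sample package. The GUIDs below are placeholders:

// Sketch of the GSSF interop - copy the real declarations from GSSF.idl.
[ComImport, Guid("00000000-0000-0000-0000-000000000000")]          // placeholder IID
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
public interface IGenericSampleCB
{
    [PreserveSig]
    int SampleCallback(IMediaSample pSample);
}

[ComImport, Guid("00000000-0000-0000-0000-000000000000")]          // placeholder IID
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
public interface IGenericSampleConfig
{
    // Only the members used above are sketched; the real interface has more.
    [PreserveSig]
    int SetMediaTypeFromBitmap(int lHeight, int lWidth, long FPS);

    [PreserveSig]
    int SetBitmapCB(IGenericSampleCB pfn);
}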
Good luck; try to get it working using this, or comment with further questions so we can iron out other issues.