Storing and converting Android screen captures into a video stream

Posted on 2024-12-21 10:22:11


I use DDMS to get screenshots from my Android phone, and I need an efficient way of converting them into a video and streaming that video over the network.

I have a RawImage which is filled with the data of the screenshot:

RawImage mRawImage;

Until now I have used SWT to create the image and save it:

import org.eclipse.swt.SWT;
import org.eclipse.swt.graphics.ImageData;
import org.eclipse.swt.graphics.ImageLoader;
import org.eclipse.swt.graphics.PaletteData;

// Build an SWT palette from the channel masks reported by the RawImage
PaletteData paletteData = new PaletteData(
    mRawImage.getRedMask(),
    mRawImage.getGreenMask(),
    mRawImage.getBlueMask());

// Wrap the raw framebuffer bytes in an SWT ImageData (scanline pad of 1)
ImageData imageData = new ImageData(
    mRawImage.width,
    mRawImage.height,
    mRawImage.bpp,
    paletteData,
    1,
    mRawImage.data);

// Encode the frame as a JPEG file on disk
ImageLoader loader = new ImageLoader();
loader.data = new ImageData[] {imageData};
loader.save("temp.jpg", SWT.IMAGE_JPEG);
  • Can you propose a way to convert this sequence of images into a video and then stream it?

I found NanoHTTPD, which could handle the streaming part, but how can I convert and compress the images into a video?

  • Do you believe that I can do that using ffmpeg?

I found a good tutorial on streaming a webcam using FFmpeg and video4linux2.

Is it possible to send the bytes from the RawImage to FFmpeg to be converted into a live video stream?

The actual command from the tutorial:

$ ffmpeg -f video4linux2 -i /dev/video0 http://78.47.18.19:8090/cam1.ffm

Replace it with something similar to:

$ ffmpeg -f video4linux2 -i **<add here java stream>** http://78.47.18.19:8090/cam1.ffm
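Roughly, what I have in mind on the Java side is something like the untested sketch below: spawn ffmpeg with the image2pipe demuxer reading JPEGs from standard input, and write every captured frame into that pipe. The feed URL is the one from the tutorial, and FfmpegPipeSketch, isCapturing() and grabFrame() are just hypothetical placeholders for my capture loop:

import java.io.IOException;
import java.io.OutputStream;

import org.eclipse.swt.SWT;
import org.eclipse.swt.graphics.ImageData;
import org.eclipse.swt.graphics.ImageLoader;

public class FfmpegPipeSketch {

    // Untested sketch: feed JPEG frames to ffmpeg over a pipe instead of /dev/video0.
    void streamFrames() throws IOException {
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-f", "image2pipe",   // read a sequence of images from stdin
                "-vcodec", "mjpeg",   // each image in the pipe is a JPEG
                "-i", "-",            // "-" means standard input
                "http://78.47.18.19:8090/cam1.ffm");
        pb.redirectErrorStream(true);
        Process ffmpeg = pb.start();

        try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
            while (isCapturing()) {                     // hypothetical loop condition
                ImageData frame = grabFrame();          // hypothetical: RawImage -> ImageData as above
                ImageLoader loader = new ImageLoader();
                loader.data = new ImageData[] { frame };
                loader.save(toFfmpeg, SWT.IMAGE_JPEG);  // SWT writes the JPEG straight into the pipe
                toFfmpeg.flush();
            }
        }
    }

    // Hypothetical placeholders for the DDMS capture loop.
    private boolean isCapturing() { return false; }
    private ImageData grabFrame() { return null; }
}

I picked MJPEG over the pipe because the frames are already JPEGs, so no re-encoding should be needed on the Java side.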

Any suggestions?

Thanks

PS: I am looking for a solution that will help me convert the images to a compressed video and then stream that video over the network so that it can be played with either HTML5 or a Flash player.


Comments (1)

月下凄凉 2024-12-28 10:22:11


There are two ways to approach this:

1) Capture, Encode and Stream on and from the device itself

or

2) Capture on the device, Encode and Stream from a server

I don't know all of your requirements, but I would presume that option 2 is the route to go: you will have better performance and a wider array of tools to accomplish your end goal.

Capture
Capture the JPEG images almost exactly the way you describe, except add an index to each one so that you have FILE1.JPG, FILE2.JPG, FILE3.JPG, FILE4.JPG, etc.
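A minimal sketch of that indexed capture, reusing the SWT classes from your question (the FrameWriter name, the counter, and the output-directory parameter are just assumptions):

import org.eclipse.swt.SWT;
import org.eclipse.swt.graphics.ImageData;
import org.eclipse.swt.graphics.ImageLoader;

public class FrameWriter {

    private int frameIndex = 0;   // assumption: simple incrementing counter per capture session

    /** Saves one captured frame as FILE1.JPG, FILE2.JPG, ... inside the given directory. */
    public void saveFrame(ImageData imageData, String outputDir) {
        frameIndex++;
        ImageLoader loader = new ImageLoader();
        loader.data = new ImageData[] { imageData };
        loader.save(outputDir + "/FILE" + frameIndex + ".JPG", SWT.IMAGE_JPEG);
    }
}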

Now, at some interval, depending on your requirements, upload these images to a server.

Encode
Use mencoder to set the frame rate and the lossy-compression bitrate like so:

mencoder "mf://*.jpg" -mf fps=5 -o yourvideo.avi -ovc lavc -lavcopts vcodec=msmpeg4v2:vbitrate=800

Once that is done, you can use ffmpeg to create an MP4 like so:

ffmpeg -r 5 -b 1800 -i %01d.jpg yourvideo.mp4
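If you want to drive the encode step from server-side Java rather than a shell script, a sketch like the following might do it; it simply shells out to the mencoder command above (the EncodeJob name and the upload-directory handling are assumptions):

import java.io.File;
import java.io.IOException;

public class EncodeJob {

    /** Runs the mencoder command shown above inside the directory holding the uploaded frames. */
    public int encode(File frameDir) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "mencoder", "mf://*.jpg",
                "-mf", "fps=5",
                "-o", "yourvideo.avi",
                "-ovc", "lavc",
                "-lavcopts", "vcodec=msmpeg4v2:vbitrate=800");
        pb.directory(frameDir);       // run inside the upload directory so mf://*.jpg finds the frames
        pb.inheritIO();               // surface mencoder's console output for debugging
        return pb.start().waitFor();  // exit code 0 means the encode succeeded
    }
}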

Streaming
Now, in order to stream the MP4 over the network, I would set up a web page like:

http://myserver/androidStream

which loads an M3U playlist that is written dynamically so that it always points at the most up-to-date video to stream. Depending on the player you're using, you may be able to point the M3U playlist at the 'next' M3U playlist, which will load the next video. You may also want to look at alternative playlist formats such as ASX or PLS, depending on your requirements.
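The dynamically written playlist itself is just a small text file; here is a sketch of regenerating it whenever a new video has been encoded (the PlaylistWriter name, the web-root path and the example URL are assumptions):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class PlaylistWriter {

    /** Rewrites the playlist so that it always points at the newest encoded video. */
    public void pointAtLatest(String latestVideoUrl) throws IOException {
        String playlist = "#EXTM3U\n"
                + "#EXTINF:-1,Android screen capture\n"
                + latestVideoUrl + "\n";   // e.g. http://myserver/videos/yourvideo.mp4
        Files.write(Paths.get("/var/www/androidStream.m3u"),   // assumed web-root location
                playlist.getBytes(StandardCharsets.UTF_8));
    }
}

The player keeps requesting the same playlist URL, so only the file on disk needs to change when a newer video becomes available.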

See: http://en.wikipedia.org/wiki/Advanced_Stream_Redirector and
http://en.wikipedia.org/wiki/PLS_(file_format) as examples of other non-M3U playlist formats.

Using the general steps above, you will have a system where the device captures images at a rate of n images per minute and then uploads them to a server for encoding. Once those images are encoded, you can either stream the movie directly by referencing the file, or you can set up an M3U-type playlist that lets the player move from the 'current' video to the 'next' video when it becomes available.
