Android VideoView GStreamer streaming (MediaController not working)
I have a small project that streams video to an Android device. The streaming itself works, but I have a problem controlling the video: when I press pause on the MediaController nothing happens, and calling VideoView.pause() has no effect either. The streaming server is based on GStreamer (a friend of mine wrote it), and I am using Android 2.2 (CyanogenMod).
This is the server code:
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int
main (int argc, char *argv[])
{
  GMainLoop *loop;
  GstRTSPServer *server;
  GstRTSPMediaMapping *mapping;
  GstRTSPMediaFactory *factory;
  gchar *str;

  gst_init (&argc, &argv);

  if (argc < 2) {
    g_message ("usage: %s <filename>", argv[0]);
    return -1;
  }

  loop = g_main_loop_new (NULL, FALSE);

  /* create a server instance */
  server = gst_rtsp_server_new ();

  /* get the mapping for this server, every server has a default mapper object
   * that is used to map uri mount points to media factories */
  mapping = gst_rtsp_server_get_media_mapping (server);

  str = g_strdup_printf ("( "
      "filesrc location=\"%s\" ! decodebin2 name=d "
      "d. ! queue ! videoscale ! video/x-raw-yuv, width=500, height=300 "
      "! ffenc_mpeg4 ! rtpmp4vpay name=pay0 "
      "d. ! queue ! audioconvert ! faac ! rtpmp4apay name=pay1"
      " )", argv[1]);

  /* make a media factory for a test stream. The default media factory can use
   * gst-launch syntax to create pipelines.
   * any launch line works as long as it contains elements named pay%d. Each
   * element with a pay%d name will be a stream */
  factory = gst_rtsp_media_factory_new ();
  gst_rtsp_media_factory_set_launch (factory, str);
  g_free (str);

  /* attach the test factory to the /test url */
  gst_rtsp_media_mapping_add_factory (mapping, "/test", factory);

  /* don't need the ref to the mapper anymore */
  g_object_unref (mapping);

  /* attach the server to the default maincontext */
  gst_rtsp_server_attach (server, NULL);

  /* start serving */
  g_main_loop_run (loop);

  return 0;
}
1 Answer
From what I have gathered, the VideoView in Android only accepts H.264 feeds, so you need to be encoding in H.264.
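If you want to try that, a minimal change to the server's launch string could look like the sketch below (untested; it assumes the x264enc element from gst-plugins-ugly and the rtph264pay element from gst-plugins-good are installed on the server). Only the video branch changes; the rest of the program stays as posted:

  /* sketch: encode the video branch as H.264 instead of MPEG-4 part 2 */
  str = g_strdup_printf ("( "
      "filesrc location=\"%s\" ! decodebin2 name=d "
      "d. ! queue ! videoscale ! video/x-raw-yuv, width=500, height=300 "
      "! x264enc ! rtph264pay name=pay0 "
      "d. ! queue ! audioconvert ! faac ! rtpmp4apay name=pay1"
      " )", argv[1]);

Since only the encoder and payloader in the video branch are swapped, the client would still point at the same mount point, rtsp://<server-ip>:8554/test (8554 being the library's default port).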