RTP on Android MediaPlayer

I've implemented RTSP playback on the Android MediaPlayer, using VLC as the RTSP
server with this command:

# vlc -vvv /home/marco/Videos/pippo.mp4 --sout '#rtp{dst=192.168.100.246,port=6024-6025,sdp=rtsp://192.168.100.243:8080/test.sdp}'

and on the Android project:


Uri videoUri = Uri.parse("rtsp://192.168.100.242:8080/test.sdp"); 
videoView.setVideoURI(videoUri); 
videoView.start(); 

This works fine, but I'd also like to play a live RTP stream, so I copied the
SDP file onto the SD card (/mnt/sdcard/test.sdp) and set up VLC like this:

# vlc -vvv /home/marco/Videos/pippo.mp4 --sout '#rtp{dst=192.168.100.249,port=6024-6025}'
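
For reference, test.sdp now has to be written by hand to describe this session. A minimal sketch, assuming the output is muxed as MPEG-TS over RTP (for example with mux=ts in the #rtp chain) and reusing the destination address and port from the command above:

v=0
o=- 0 0 IN IP4 192.168.100.243
s=pippo
c=IN IP4 192.168.100.249
t=0 0
m=video 6024 RTP/AVP 33
a=rtpmap:33 MP2T/90000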

I tried to play the RTP stream by pointing at the local SDP file:


Uri videoUri = Uri.parse("/mnt/sdcard/test.sdp");
videoView.setVideoURI(videoUri); 
videoView.start(); 

But I got an error:


D/MediaPlayer( 9616): Couldn't open file on client side, trying server side 
W/MediaPlayer( 9616): info/warning (1, 26) 
I/MediaPlayer( 9616): Info (1,26) 
E/PlayerDriver(   76): Command PLAYER_INIT completed with an error or info PVMFFailure 
E/MediaPlayer( 9616): error (1, -1)
E/MediaPlayer( 9616): Error (1,-1) 
D/VideoView( 9616): Error: 1,-1 

Does anyone know where the problem is? Am I doing something wrong, or is it simply not possible
to play RTP with MediaPlayer?
Cheers
Giorgio

4 Answers

百善笑为先 2024-11-06 13:17:37

I have a partial solution for you.

I'm currently working on an R&D project involving RTP streaming of media from a server to Android clients.

Through this work I contribute to my own library, called smpte2022lib, which you can find here:
http://sourceforge.net/projects/smpte-2022lib/

With the help of such a library (the Java implementation is currently the best one), you may be able to parse RTP multicast streams coming from professional streaming equipment, VLC RTP sessions, and so on.

I have already tested it successfully with captured professional RTP streams carrying SMPTE 2022 2D-FEC, as well as with simple streams generated by VLC.

Unfortunately I cannot put a code snippet here, as the project using the library is under copyright, but I assure you that you can use it simply by parsing the UDP stream with the help of the RtpPacket constructor.

If the packets are valid RTP packets (as raw bytes), they will be decoded as such.

At the moment, I wrap the call to RtpPacket's constructor in a thread that stores the decoded payload as a media file; then I call the VideoView with that file as the parameter.
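
The general shape of that thread is roughly the sketch below (not the copyrighted code; the RtpPacket parsing step is replaced by a naive header strip, and the class and field names are illustrative only):

    import java.io.FileOutputStream;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    // Rough sketch of the receive-and-dump thread described above.
    class RtpDumpThread extends Thread {

        private final int port;       // local UDP port the RTP stream arrives on
        private final String outPath; // media file later handed to the VideoView

        RtpDumpThread(int port, String outPath) {
            this.port = port;
            this.outPath = outPath;
        }

        @Override
        public void run() {
            DatagramSocket socket = null;
            FileOutputStream out = null;
            try {
                socket = new DatagramSocket(port);
                out = new FileOutputStream(outPath);
                byte[] buf = new byte[2048]; // larger than a typical RTP datagram
                while (!isInterrupted()) {
                    DatagramPacket dgram = new DatagramPacket(buf, buf.length);
                    socket.receive(dgram);
                    // With smpte2022lib you would build an RtpPacket from dgram here and
                    // write its decoded payload; as a self-contained stand-in this simply
                    // skips the fixed 12-byte RTP header (no CSRC/extension/FEC handling).
                    int headerLen = 12;
                    if (dgram.getLength() > headerLen) {
                        out.write(dgram.getData(), headerLen, dgram.getLength() - headerLen);
                    }
                }
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                if (out != null) { try { out.close(); } catch (Exception ignored) {} }
                if (socket != null) { socket.close(); }
            }
        }
    }

Once enough payload has been written, the resulting file can be passed to the VideoView as in the question.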

Crossing fingers ;-)

Kind Regards,

David Fischer

雨落□心尘 2024-11-06 13:17:37

It's possible on Android (not with MediaPlayer, but with other components further down the stack), but do you really want to pursue RTSP/RTP when the rest of the media ecosystem doesn't?

IMO there are far better media/streaming approaches under the umbrella of HTML5/WebRTC. Look at what 'Ondello' is doing with streams, for example.

That said, here is some old project code for Android/RTSP/SDP/RTP using 'netty' and 'efflux'. It will negotiate some portions of the 'Session' with SDP file providers. I can't remember whether it would actually play the audio portion of YouTube/RTSP streams, but that was my goal at the time. (I think it worked with the AMR-NB codec, but there were tons of issues and I dropped RTSP on Android like a bad habit!)

on Git....

        @Override
        public void mediaDescriptor(Client client, String descriptor)
        {
            // searches for control: session and media arguments.
            final String target = "control:";
            Log.d(TAG, "Session Descriptor\n" + descriptor);
            int position = -1;
            while((position = descriptor.indexOf(target)) > -1)
            {
                descriptor = descriptor.substring(position + target.length());
                resourceList.add(descriptor.substring(0, descriptor.indexOf('\r')));
            }
        }
        private int nextPort()
        {
            return (port += 2) - 2;
        }       


        private void getRTPStream(TransportHeader transport){

            String[] words;
            // only want 2000 part of 'client_port=2000-2001' in the Transport header in the response

            words = transport.getParameter("client_port").substring(transport.getParameter("client_port").indexOf("=") +1).split("-");
            port_lc = Integer.parseInt(words[0]);

            words = transport.getParameter("server_port").substring(transport.getParameter("server_port").indexOf("=") +1).split("-");
            port_rm = Integer.parseInt(words[0]);

            source = transport.getParameter("source").substring(transport.getParameter("source").indexOf("=") +1);          
            ssrc = transport.getParameter("ssrc").substring(transport.getParameter("ssrc").indexOf("=") +1);
            // assume dynamic Packet type = RTP , 99
            getRTPStream(session, source, port_lc, port_rm, 99);
            //getRTPStream("sessiona", source, port_lc, port_rm, 99);
            Log.d(TAG, "raw parms " +port_lc +" " +port_rm +" " +source );
//          String[] words = session.split(";");
        Log.d(TAG, "session: " +session);   
        Log.d(TAG, "transport: " +transport.getParameter("client_port") 
                +" "  +transport.getParameter("server_port") +" "  +transport.getParameter("source") 
                +" "  +transport.getParameter("ssrc"));

        }

        private void getRTPStream(String session, String source, int portl, int portr, int payloadFormat ){
            // what do u do with ssrc?
            InetAddress addr;
            try {
                addr = InetAddress.getLocalHost();
                // Get IP Address
 //             LAN_IP_ADDR = addr.getHostAddress();
                LAN_IP_ADDR = "192.168.1.125";
                Log.d(TAG, "using client IP addr " +LAN_IP_ADDR);

            } catch (UnknownHostException e1) {
                // TODO Auto-generated catch block
                e1.printStackTrace();
            }


            final CountDownLatch latch = new CountDownLatch(2);

            RtpParticipant local1 = RtpParticipant.createReceiver(new RtpParticipantInfo(1), LAN_IP_ADDR, portl, portl+=1);
     //       RtpParticipant local1 = RtpParticipant.createReceiver(new RtpParticipantInfo(1), "127.0.0.1", portl, portl+=1);
            RtpParticipant remote1 = RtpParticipant.createReceiver(new RtpParticipantInfo(2), source, portr, portr+=1);


            remote1.getInfo().setSsrc( Long.parseLong(ssrc, 16));
            session1 = new SingleParticipantSession(session, payloadFormat, local1, remote1);

           Log.d(TAG, "remote ssrc " +session1.getRemoteParticipant().getInfo().getSsrc());

            session1.init();

            session1.addDataListener(new RtpSessionDataListener() {
                @Override
                public void dataPacketReceived(RtpSession session, RtpParticipantInfo participant, DataPacket packet) {
     //               System.err.println("Session 1 received packet: " + packet + "(session: " + session.getId() + ")");
                    //TODO close the file, flush the buffer
//                  if (_sink != null) _sink.getPackByte(packet);
                    getPackByte(packet);

     //             System.err.println("Ssn 1  packet seqn: typ: datasz "  +packet.getSequenceNumber()  + " " +packet.getPayloadType() +" " +packet.getDataSize());
     //             System.err.println("Ssn 1  packet sessn: typ: datasz "  + session.getId() + " " +packet.getPayloadType() +" " +packet.getDataSize());
 //                   latch.countDown();
                }

            });
     //       DataPacket packet = new DataPacket();
      //      packet.setData(new byte[]{0x45, 0x45, 0x45, 0x45});
     //       packet.setSequenceNumber(1);
     //       session1.sendDataPacket(packet);


//        try {
       //       latch.await(2000, TimeUnit.MILLISECONDS);
     //     } catch (Exception e) {
   //         fail("Exception caught: " + e.getClass().getSimpleName() + " - " + e.getMessage());

 //      }
        }
 //TODO  below should collaborate with the audioTrack object and should write to the AT buffr
        // audioTrack write was blocking forever 

    public void getPackByte(DataPacket packet) {
            //TODO this is getting called but not sure why only one time 
            // or whether it is stalling in mid-exec??

            //TODO on firstPacket write bytes and start audioTrack
            // AMR-nb frames at 12.2 KB or format type 7 frames are handled . 
            // after the normal header, the getDataArray contains extra 10 bytes of dynamic header that are bypassed by 'limit'


            // real value for the frame separator comes in the input stream at position 1 in the data array
            // returned by 

//          int newFrameSep = 0x3c;
            // bytes avail = packet.getDataSize() - limit;

//          byte[] lbuf = new byte[packet.getDataSize()];
//          if ( packet.getDataSize() > 0)
//              lbuf = packet.getDataAsArray();
            //first frame includes the 1 byte frame header whose value should be used 
            // to write subsequent frame separators 
            Log.d(TAG, "getPackByt start and play");

            if(!started){
                Log.d(TAG, " PLAY  audioTrak");
                track.play();
                started = true;
            }

//          track.write(packet.getDataAsArray(), limit, (packet.getDataSize() - limit));
            track.write(packet.getDataAsArray(), 0, packet.getDataSize() );
            Log.d(TAG, "getPackByt aft write");

//          if(!started && nBytesRead > minBufferSize){
    //          Log.d(TAG, " PLAY  audioTrak");
        //      track.play();
        //  started = true;}
            nBytesRead += packet.getDataSize(); 
            if (nBytesRead % 500 < 375) Log.d(TAG, " getPackByte plus 5K received");
        }       
    }
怪我闹别瞎闹 2024-11-06 13:17:37

Actually, it is possible to play RTSP/RTP streams on Android by using a modified version of ExoPlayer. ExoPlayer officially doesn't support RTSP/RTP (issue 55); however, there is an active pull request, #3854, to add this support.

In the meantime, you can clone the original author's ExoPlayer fork, which does support RTSP (branch dev-v2-rtsp):

git clone -b dev-v2-rtsp https://github.com/tresvecesseis/ExoPlayer.git

I've tested it and it works perfectly. The authors are working actively to fix the issues reported by many users, and I hope that RTSP support becomes part of the official ExoPlayer at some point.

泛滥成性 2024-11-06 13:17:37

Unfortunately it is not possible to play an RTP stream with the Android MediaPlayer.

Solutions to this problem include decoding the RTP stream with ffmpeg; tutorials on how to compile ffmpeg for Android can be found on the Web.
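
For example, an ffmpeg build can read the SDP file describing the session and remux the RTP stream into a container MediaPlayer handles; roughly (the -protocol_whitelist option is only required by newer ffmpeg builds, and the paths are illustrative):

ffmpeg -protocol_whitelist file,udp,rtp -i test.sdp -c copy dump.ts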
