HTTP Live Streaming

Posted 2024-11-19 01:23:46

Ok, I have been trying to wrap my head around this HTTP Live Streaming thing. I just do not understand it, and yes, I have read all the Apple docs and watched the WWDC videos, but I am still super confused, so please help a wannabe programmer out!

The code you write goes on the server, not in Xcode?
If I am right, how do I set this up?
Do I need to set up something special on my server, like PHP or something?
How do I use the tools supplied by Apple, the segmenter and such?

Please help me,
Thanks

Comments (3)

淡忘如思 2024-11-26 01:23:46

HTTP Live Streaming

HTTP Live Streaming is a streaming standard proposed by Apple. See the latest draft of the specification (it has since been published as RFC 8216).

The files involved are:

  • .m4a for audio (if you want an audio-only stream).
  • .ts for video. This is an MPEG-2 transport stream, usually with an H.264/AAC payload. Each segment contains about 10 seconds of video and is created by splitting your original video file or by converting live video.
  • .m3u8 for the playlist. This is a UTF-8 version of the WinAmp playlist format (see the sample playlist below).
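
A minimal media playlist written by the segmenter looks roughly like this (the segment names and durations here are illustrative, not taken from any real output):

 #EXTM3U
 #EXT-X-VERSION:3
 #EXT-X-TARGETDURATION:10
 #EXT-X-MEDIA-SEQUENCE:0
 #EXTINF:10.0,
 fileSequence0.ts
 #EXTINF:10.0,
 fileSequence1.ts
 #EXTINF:10.0,
 fileSequence2.ts
 #EXT-X-ENDLIST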

Even though it's called live streaming, there is usually a delay of a minute or so during which the video is converted, the .ts and .m3u8 files are written, and the client refreshes the .m3u8 file.

All of these files are static files on your server. During a live event, however, more .ts files are added and the .m3u8 file is updated.

Since you tagged this question iOS, it is worth mentioning the related App Store rules:

  • You can only use progressive download for videos shorter than 10 minutes or smaller than 5 MB per 5 minutes. Otherwise you must use HTTP Live Streaming.
  • If you use HTTP Live Streaming you must provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream may be audio-only or audio with a still image).

Example

Get the streaming tools

Download the HTTP Live Streaming Tools package from the Apple developer downloads site (an Apple Developer account is required). The installer places these command-line tools:

 /usr/bin/mediastreamsegmenter
 /usr/bin/mediafilesegmenter
 /usr/bin/variantplaylistcreator
 /usr/bin/mediastreamvalidator
 /usr/bin/id3taggenerator

Descriptions from the man page:

  • Media Stream Segmenter: Create segments from MPEG-2 Transport streams for HTTP Live Streaming.
  • Media File Segmenter: Create segments for HTTP Live Streaming from media files.
  • Variant Playlist Creator: Create a playlist for stream switching from HTTP Live Streaming segments created by mediafilesegmenter.
  • Media Stream Validator: Validates HTTP Live Streaming streams and servers (see the example below).
  • ID3 Tag Generator: Create ID3 tags.
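
For instance, once your stream is published you would point the validator at the playlist URL. A minimal sketch (the URL here is a placeholder, not from the original answer):

 mediastreamvalidator http://example.com/stream/prog_index.m3u8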

Create the video

Install MacPorts, go to the terminal and run sudo port install ffmpeg. Then convert the video to a transport stream (.ts) using this FFmpeg script:

# bitrate, width, and height, you may want to change this
BR=512k
WIDTH=432
HEIGHT=240
input=${1}
 
# strip off the file extension
output=$(echo ${input} | sed 's/\..*//' )
 
# works for most videos
ffmpeg -y -i ${input} -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s ${WIDTH}x${HEIGHT} -vcodec libx264 -b ${BR} -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 0 -refs 0 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate ${BR} -bufsize ${BR} -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 30 -qmax 51 -qdiff 4 -level 30 -aspect ${WIDTH}:${HEIGHT} -g 30 -async 2 ${output}-iphone.ts

This will generate one .ts file. Now we need to split that file into segments and create a playlist containing all of them. We can use Apple's mediafilesegmenter for this:

mediafilesegmenter -t 10 myvideo-iphone.ts

This will generate one .ts file for each 10 seconds of the video plus a .m3u8 file pointing to all of them.
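
Note that the FFmpeg script above uses option names from older builds (for example -b, -ab and -acodec, which newer builds spell -b:v, -b:a and -c:a). If Apple's tools aren't at hand, a reasonably recent FFmpeg can also encode, segment and write the playlist in one step with its built-in HLS muxer; a rough sketch (the input file name is hypothetical):

 # encode to H.264/AAC and segment into 10-second .ts files plus a playlist
 ffmpeg -y -i myvideo.mp4 \
     -c:v libx264 -b:v 512k -s 432x240 \
     -c:a aac -b:a 64k -ar 48000 \
     -f hls -hls_time 10 -hls_list_size 0 \
     -hls_segment_filename 'myvideo-%03d.ts' \
     myvideo.m3u8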

Set up a web server

To play an .m3u8 file on iOS, we point Mobile Safari at it.
Of course, first we need to put the files on a web server. For Safari (or another player) to recognize the .ts files, we need to add their MIME types. In Apache:

 AddType application/x-mpegURL m3u8
 AddType video/MP2T ts

In lighttpd:

 mimetype.assign = ( ".m3u8" => "application/x-mpegURL", ".ts" => "video/MP2T" )
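
A quick way to check that the server is sending the right types (example.com is a placeholder for your own host):

 curl -I http://example.com/stream.m3u8
 # the response headers should include: Content-Type: application/x-mpegURL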

To link this from a web page:

<html><head>
    <meta name="viewport" content="width=320, initial-scale=1.0, maximum-scale=1.0, user-scalable=0"/>
</head><body>
    <video width="320" height="240" controls src="stream.m3u8"></video>
</body></html>

To detect the device orientation, see the article "Detect and Set the iPhone & iPad's Viewport Orientation Using JavaScript, CSS and Meta Tags".

Other things you can do include creating different bitrate versions of the video (see the variant playlist sketch below), embedding metadata to read as notifications during playback, and of course having fun programming with MPMoviePlayerController and AVPlayer.
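
For the multi-bitrate case, a variant (master) playlist like the one variantplaylistcreator builds ties the streams together. A hand-written sketch with made-up paths and bandwidths; note the 64 Kbps audio-only entry that satisfies the App Store rule mentioned above:

 #EXTM3U
 #EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
 audio/prog_index.m3u8
 #EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=432x240
 mid/prog_index.m3u8
 #EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=864x480
 high/prog_index.m3u8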

鲜血染红嫁衣 2024-11-26 01:23:46

This might help in Swift:

    import UIKit
    import MediaPlayer

    class ViewController: UIViewController {

        var streamPlayer: MPMoviePlayerController = MPMoviePlayerController(
            contentURL: NSURL(string: "http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"))

        override func viewDidLoad() {
            super.viewDidLoad()
            streamPlayer.view.frame = self.view.bounds
            self.view.addSubview(streamPlayer.view)

            streamPlayer.fullscreen = true
            // Play the movie!
            streamPlayer.play()
        }
    }

MPMoviePlayerController is deprecated from iOS 9 onwards. We can use AVPlayerViewController or AVPlayer instead. Have a look:

import AVKit
import AVFoundation
import UIKit

AVPlayerViewController:

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    let videoURL = URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
    let player = AVPlayer(url: videoURL!)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player
    present(playerViewController, animated: true) {
        playerViewController.player!.play()
    }
}

AVPlayer:

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    let videoURL = URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
    let player = AVPlayer(url: videoURL!)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.bounds
    self.view.layer.addSublayer(playerLayer)
    player.play()
}
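
One caveat if you try the plain http:// sample stream above on iOS 9 or later: App Transport Security blocks non-HTTPS loads by default, so you would either serve the stream over HTTPS or add an exception to the app's Info.plist, for example:

 <key>NSAppTransportSecurity</key>
 <dict>
     <key>NSAllowsArbitraryLoads</key>
     <true/>
 </dict>
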
作死小能手 2024-11-26 01:23:46

Another explanation from Cloudinary http://cloudinary.com/documentation/video_manipulation_and_delivery#http_live_streaming_hls

HTTP Live Streaming (also known as HLS) is an HTTP-based media streaming communications protocol that provides mechanisms that are scalable and adaptable to different networks. HLS works by breaking down a video file into a sequence of small HTTP-based file downloads, with each download loading one short chunk of a video file.

As the video stream is played, the client player can select from a number of alternate video streams containing the same material encoded at a variety of data rates, allowing the streaming session to adapt to the available data rate: high-quality playback on high-bandwidth networks, lower-quality playback on networks where bandwidth is reduced.

At the start of the streaming session, the client software downloads a master M3U8 playlist file containing the metadata for the various sub-streams which are available. The client software then decides what to download from the media files available, based on predefined factors such as device type, resolution, data rate, size, etc.
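
To see such a master playlist in practice, you can fetch the Apple sample stream used in the earlier answer (assuming it is still online) and look at the #EXT-X-STREAM-INF entries, each of which points to a sub-stream playlist:

 curl -s http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8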
