How do I seek within an audio track using AVAssetReader?
I'm familiar with how to stream audio data from the iPod library using AVAssetReader, but I'm at a loss as to how to seek within the track, e.g. start playback at the halfway point. Starting from the beginning and then sequentially getting successive samples is easy, but surely there must be a way to have random access?
2 Answers
AVAssetReader has a timeRange property, which determines the time range of the asset from which media data will be read.

The intersection of the value of this property with CMTimeRangeMake(kCMTimeZero, asset.duration) determines the time range of the asset from which media data will actually be read. The default value is CMTimeRangeMake(kCMTimeZero, kCMTimePositiveInfinity), and you cannot change the value of this property after reading has started.

So, if you want to seek to the middle of the track, you'd create a CMTimeRange from asset.duration/2 to asset.duration and set that as the timeRange on the AVAssetReader before you start reading.
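A rough sketch of that approach, assuming a plain `assetURL` (e.g. from an MPMediaItem) and linear-PCM output settings; both are placeholders, not part of the original answer:

```swift
import AVFoundation

// Sketch: read only the second half of a track with AVAssetReader.
// `assetURL` is a placeholder (e.g. an MPMediaItem's assetURL or a file URL).
func readSecondHalf(of assetURL: URL) {
    let asset = AVURLAsset(url: assetURL)

    do {
        let reader = try AVAssetReader(asset: asset)
        guard let audioTrack = asset.tracks(withMediaType: .audio).first else { return }

        // Decode to linear PCM; these output settings are illustrative.
        let output = AVAssetReaderTrackOutput(
            track: audioTrack,
            outputSettings: [AVFormatIDKey: kAudioFormatLinearPCM]
        )
        reader.add(output)

        // Restrict the reader to asset.duration/2 ... asset.duration.
        // This must be set before startReading(); it cannot be changed afterwards.
        let halfway = CMTime(seconds: asset.duration.seconds / 2.0,
                             preferredTimescale: asset.duration.timescale)
        reader.timeRange = CMTimeRange(start: halfway, end: asset.duration)

        guard reader.startReading() else { return }
        while let sampleBuffer = output.copyNextSampleBuffer() {
            // The first buffer delivered here starts at the midpoint of the track.
            print("read \(CMSampleBufferGetNumSamples(sampleBuffer)) samples")
        }
    } catch {
        print("could not create AVAssetReader: \(error)")
    }
}
```

Note that "seeking" this way means creating a new reader for each new start position, which matters for the performance caveat in the next answer.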
AVAssetReader is amazingly slow when seeking. If you try to recreate an AVAssetReader to seek while the user is dragging a slider, your app will bring iOS to its knees. Instead, you should use an AVAssetReader for fast, forward-only access to video frames, and then also use an AVPlayerItem and AVPlayerItemVideoOutput when the user wants to seek with a slider. It would be nice if Apple combined AVAssetReader and AVPlayerItem/AVPlayerItemVideoOutput into a new class that was performant and able to seek quickly.

Be aware that AVPlayerItemVideoOutput will not give back pixel buffers unless there is an AVPlayer attached to the AVPlayerItem. This is obviously a strange implementation detail, but it is what it is.

If you are using AVPlayer and AVPlayerLayer, then you can simply use the seek methods on AVPlayer itself. The above details are only important if you are doing custom rendering with the pixel buffers and/or need to send the pixel buffers to an AVAssetWriter.
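A hedged sketch of that AVPlayerItem + AVPlayerItemVideoOutput path (the URL, pixel-format attributes, and timescale below are placeholders; a production version would keep the player in a property and typically poll for buffers from a CADisplayLink callback):

```swift
import AVFoundation
import CoreVideo

// Sketch: seek with AVPlayer and pull a pixel buffer from AVPlayerItemVideoOutput.
// `videoURL` and `seconds` are placeholders.
func seekAndCopyFrame(from videoURL: URL, at seconds: Double) {
    let item = AVPlayerItem(url: videoURL)
    let attributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
    item.add(videoOutput)

    // The output only vends pixel buffers while an AVPlayer is attached to the item,
    // even if you never call play().
    let player = AVPlayer(playerItem: item)

    let target = CMTime(seconds: seconds, preferredTimescale: 600)
    // Zero tolerances request an exact (frame-accurate) but slower seek.
    player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
        guard finished else { return }
        // The buffer may not be ready the instant the seek completes;
        // real code usually polls hasNewPixelBuffer(forItemTime:) from a display link.
        if videoOutput.hasNewPixelBuffer(forItemTime: target),
           let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: target,
                                                         itemTimeForDisplay: nil) {
            // Render `pixelBuffer` yourself, or append it through an
            // AVAssetWriterInputPixelBufferAdaptor.
            print("frame: \(CVPixelBufferGetWidth(pixelBuffer)) x \(CVPixelBufferGetHeight(pixelBuffer))")
        }
    }

    // NOTE: in a real app, keep `player` (and `videoOutput`) in properties so they
    // outlive this call; otherwise the asynchronous seek is torn down early.
    _ = player
}
```

If you only need on-screen playback with AVPlayerLayer, the `player.seek(to:...)` call above is all you need; the video-output parts matter only when you render the pixel buffers yourself or feed them to an AVAssetWriter, as the answer notes.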