Is it possible to read metadata from an HTTP Live Stream in the iPhone SDK?

Posted 2024-10-11 17:22:44


When playing a live stream using the HTTP Live Streaming method, is it possible to read the current metadata (e.g. Title and Artist)? This is for an iPhone radio app.


Comments (4)

往事随风而去 2024-10-18 17:22:44


Not sure whether this question is still relevant for its author, but maybe it will help someone. After two days of pain I found that it's actually quite simple. Here is the code that works for me:

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:<here your http stream url>]];

[playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];

AVPlayer *player = [[AVPlayer playerWithPlayerItem:playerItem] retain]; // drop the -retain under ARC
[player play];

and then:

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {

   if ([keyPath isEqualToString:@"timedMetadata"])
   {
      AVPlayerItem *playerItem = (AVPlayerItem *)object;

      for (AVMetadataItem *metadata in playerItem.timedMetadata)
      {
         NSLog(@"\nkey: %@\nkeySpace: %@\ncommonKey: %@\nvalue: %@", [metadata.key description], metadata.keySpace, metadata.commonKey, metadata.stringValue);
      }
   }
}

That's it. I don't know why Apple didn't include this sample in the AVPlayerItem docs for accessing the stream's title, which is a key feature for real-world streaming audio. The "AV Foundation Framework Reference" mentions "timedMetadata" nowhere it is actually needed. And Matt's sample does not work with all streams (but AVPlayer does).
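For reference, the same KVO approach can be sketched in modern Swift using block-based key-path observation. This is a sketch only, with a placeholder URL; note that `timedMetadata` has since been deprecated in favor of AVPlayerItemMetadataOutput, which another answer below demonstrates.

```swift
import AVFoundation

// Sketch: block-based KVO on timedMetadata (deprecated on recent SDKs).
// The URL is a placeholder, not a real stream.
let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)
let observation = item.observe(\.timedMetadata, options: [.new]) { item, _ in
    for metadata in item.timedMetadata ?? [] {
        print(metadata.commonKey?.rawValue ?? "-", metadata.stringValue ?? "-")
    }
}
let player = AVPlayer(playerItem: item)
player.play()
// Keep `observation` alive for as long as you want callbacks.
```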

初心未许 2024-10-18 17:22:44


In Swift 2.0, getting metadata info for streamed music:

PlayerItem.addObserver(self, forKeyPath: "timedMetadata", options: NSKeyValueObservingOptions.New, context: nil)

add this method:

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {

    // Update the song name
    if keyPath == "timedMetadata" {
        if let meta = PlayerItem.timedMetadata {
            print("New metadata \(meta)")
            for metadata in meta {
                if let nomemusica = metadata.valueForKey("value") as? String {
                    LB_NomeMusica.text = nomemusica
                    if NSClassFromString("MPNowPlayingInfoCenter") != nil {
                        let image: UIImage = UIImage(named: "logo.gif")!
                        let albumArt = MPMediaItemArtwork(image: image)
                        let songInfo: [String: AnyObject] = [
                            MPMediaItemPropertyTitle: nomemusica,
                            MPMediaItemPropertyArtist: "Ao Vivo",
                            MPMediaItemPropertyArtwork: albumArt
                        ]
                        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = songInfo
                    }
                }
            }
        }
    }
}
2024-10-18 17:22:44


Swift solution. This is a sample of a simple streaming audio player. You can read the metadata in the AVPlayerItemMetadataOutputPushDelegate delegate method.

import UIKit
import AVFoundation

class PlayerViewController: UIViewController {
    var player = AVPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        configurePlayer()
        player.play()
    }

    private func configurePlayer() {
        guard let url = URL(string: "Your stream URL") else { return }
        let asset = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
        metadataOutput.setDelegate(self, queue: DispatchQueue.main)
        playerItem.add(metadataOutput)
        player = AVPlayer(playerItem: playerItem)
    }
}

extension PlayerViewController: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) {
        // Avoid force-unwrapping: a stream may deliver empty groups.
        if let item = groups.first?.items.first, let value = item.value(forKeyPath: "value") {
            print(value)
        }
    }
}
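Since the original question asks specifically for Title and Artist, a small helper can pull them out of a timed metadata group by common key instead of taking the first item blindly. This is a sketch only; `titleAndArtist` is our own name, not an Apple API.

```swift
import AVFoundation

// Sketch: extract title/artist from an AVTimedMetadataGroup by common key.
func titleAndArtist(from group: AVTimedMetadataGroup) -> (title: String?, artist: String?) {
    let title = group.items.first { $0.commonKey == .commonKeyTitle }?.stringValue
    let artist = group.items.first { $0.commonKey == .commonKeyArtist }?.stringValue
    return (title, artist)
}
```

Call it from the delegate method above with each group in `groups`.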
陌伤ぢ 2024-10-18 17:22:44


It is, but it's not easy. Matt Gallagher has a nice post on his blog about streaming audio. To quote him on the subject:

The easiest source of metadata comes from the HTTP headers. Inside the handleReadFromStream:eventType: method, use CFReadStreamCopyProperty to copy the kCFStreamPropertyHTTPResponseHeader property from the CFReadStreamRef, then you can use CFHTTPMessageCopyAllHeaderFields to copy the header fields out of the response. For many streaming audio servers, the stream name is one of these fields.

The considerably harder source of metadata are the ID3 tags. ID3v1 is always at the end of the file (so is useless when streaming). ID3v2 is located at the start so may be more accessible.

I've never read the ID3 tags but I suspect that if you cache the first few hundred kilobytes of the file somewhere as it loads, open that cache with AudioFileOpenWithCallbacks and then read the kAudioFilePropertyID3Tag with AudioFileGetProperty you may be able to read the ID3 data (if it exists). Like I said though: I've never actually done this so I don't know for certain that it would work.
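On the header-based route the quote describes: many Shoutcast/Icecast servers deliver the current track inline as an ICY metadata block shaped like `StreamTitle='Artist - Title';`. As a hedged illustration (the function name `parseStreamTitle` is our own, and the block format can vary between servers), extracting the quoted value is simple string parsing:

```swift
import Foundation

// Sketch: pull the quoted value out of an ICY metadata block such as
// "StreamTitle='Artist - Title';". Returns nil if the field is absent.
func parseStreamTitle(_ icyBlock: String) -> String? {
    guard let start = icyBlock.range(of: "StreamTitle='") else { return nil }
    let rest = icyBlock[start.upperBound...]
    guard let end = rest.range(of: "';") else { return nil }
    return String(rest[..<end.lowerBound])
}
```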
