Is there a way to record the audio stream using Matt Gallagher's audio streamer?

Posted 2024-10-09 04:32:58


I use Matt Gallagher's audio streamer for streaming radio stations. But how to record the audio? Is there a way to get the downloaded packets into NSData and save it in an audio file in the documents folder on the iPhone?

Thanks


Answers (3)

你不是我要的菜∠ 2024-10-16 04:32:58


Yes, there is and I have done it. My problem is being able to play it back IN the same streamer (asked elsewhere). It will play back with the standard AVAudioPlayer in iOS. However, this will save the data to a file by writing it out in the streamer code.

This example is missing some error checks, but will give you the main idea.

First, a call from the main thread to start and stop recording. This is in my viewController when someone presses record:

//---------------------------------------------------------
// Record button was pressed (toggle on/off)
// writes a file to the documents directory using date and time for the name
//---------------------------------------------------------

-(IBAction)recordButton:(id)sender {
    // only start if the streamer is playing (self.streamer is my streamer instance)
    if ([self.streamer isPlaying]) {

        NSDate *currentDateTime = [NSDate date];      // get current date and time
        NSDateFormatter *dateFormatter = [[[NSDateFormatter alloc] init] autorelease];
        [dateFormatter setDateFormat:@"EEEE MMMM d yyyy 'at' HH:mm:ss"];   // yyyy, not YYYY (YYYY is the week-based year)
        NSString *dateString = [dateFormatter stringFromDate:currentDateTime];

        self.isRecording = !self.isRecording;       // toggle recording state BOOL
        if (self.isRecording)
        {
            // start recording here
            // change the record button to show it is recording - this is an IBOutlet
            [self.recordButtonImage setImage:[UIImage imageNamed:@"Record2.png"] forState:UIControlStateNormal];
            // call AudioStreamer to start recording. It returns the file path back
            self.recordFilePath = [self.streamer recordStream:TRUE fileName:dateString];    // start file stream and get file path
        } else
        {
            // stop recording here
            // change the button back
            [self.recordButtonImage setImage:[UIImage imageNamed:@"Record.png"] forState:UIControlStateNormal];
            // call streamer code, stop the recording. Also returns the file path again.
            self.recordFilePath = [self.streamer recordStream:FALSE fileName:nil];     // stop stream and get file path
            // add to "recorded files" for selecting a recorded file later.
            // first, add channel, date, time
            dateString = [NSString stringWithFormat:@"%@ Recorded on %@", self.model.stationName, dateString];  // used to identify the item in a list later
            // the dictionary will be used to hold the data on this recording for display elsewhere
            NSDictionary *row1 = [[[NSDictionary alloc] initWithObjectsAndKeys:self.recordFilePath, @"path", dateString, @"dateTime", nil] autorelease];
            // save the stream info in an array of recorded streams
            if (self.model.recordedStreamsArray == nil) {
                self.model.recordedStreamsArray = [[[NSMutableArray alloc] init] autorelease];   // init the array
            }
            [self.model.recordedStreamsArray addObject:row1];          // dict for this recording
        }
    }
}

Now, in AudioStreamer.m, we need to handle the record setup call made above:

- (NSString *)recordStream:(BOOL)record fileName:(NSString *)fileName
{
    // this will start/stop recording, and return the file path
    if (record) {
        if (state == AS_PLAYING)
        {
            // open a file to save the data into
            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString *documentsDirectory = [paths objectAtIndex:0];
            // will call this an mp3 file for now (this may need to change)
            NSMutableString *temp = [NSMutableString stringWithString:[documentsDirectory stringByAppendingFormat:@"/%@.mp3", fileName]];
            // remove the ':' in the time string, and create a file name w/ time & date
            [temp replaceOccurrencesOfString:@":" withString:@"" options:NSLiteralSearch range:NSMakeRange(0, [temp length])];

            self.filePath = temp;               // file name is date/time generated
            NSLog(@"Stream Save File Open = %@", self.filePath);
            // open the recording file stream output
            self.fileStream = [NSOutputStream outputStreamToFileAtPath:self.filePath append:NO];
            [self.fileStream open];
            NSLog(@"recording to %@", self.fileStream);
            self.isRecording = TRUE;
            return self.filePath;               // if started, send back the file path
        }
        return nil;                             // if not started, return nil for error checking
    } else {
        // we are done recording; close the stream
        if (self.fileStream != nil) {
            [self.fileStream close];
            self.fileStream = nil;
        }
        NSLog(@"stop recording");
        self.isRecording = FALSE;
        return self.filePath;                   // when stopping, return the file path again
    }
}

LASTLY, we need to modify the data portion of the streamer to actually save the bytes. You need to modify the stream code in the method: -(void)handleReadFromStream:(CFReadStreamRef)aStream eventType:(CFStreamEventType)eventType
Scroll down in that method until you find:

@synchronized(self)
{
    if ([self isFinishing] || !CFReadStreamHasBytesAvailable(stream))
    {
        return;
    }

    //
    // Read the bytes from the stream
    //
    length = CFReadStreamRead(stream, bytes, kAQDefaultBufSize);

    if (length == -1)
    {
        [self failWithErrorCode:AS_AUDIO_DATA_NOT_FOUND];
        return;
    }

Right after the length == -1 error check shown above, add the following code (adding it before that check would attempt a write with a negative length):

            //
            // if recording, save the raw data to a file
            //
            if(self.isRecording && length != 0){
                //
                // write the data to a file
                //
                NSInteger       bytesWritten;
                NSInteger       bytesWrittenSoFar;
                bytesWrittenSoFar = 0;
                do {
                    bytesWritten = [self.fileStream write:&bytes[bytesWrittenSoFar] maxLength:length - bytesWrittenSoFar];
                    NSLog(@"bytesWritten = %ld", (long)bytesWritten);   // %ld, not %i, for NSInteger
                    if (bytesWritten == -1) {
                        [self.fileStream close];
                        self.fileStream = nil;
                        NSLog(@"File write error");
                        break;
                    } else {
                        bytesWrittenSoFar += bytesWritten;
                    }
                } while (bytesWrittenSoFar != length);
            }

Here are the .h declarations:

Added to the interface for AudioStreamer.h

// for recording and saving a stream
NSString*        filePath;
NSOutputStream*  fileStream;
BOOL isRecording;
BOOL isPlayingFile;

In your view controller you will need:

@property(nonatomic, assign) IBOutlet UIButton* recordButtonImage;
@property(nonatomic, assign) BOOL isRecording;
@property (nonatomic, copy)   NSString* recordFilePath;

Hope this helps someone. Let me know if you have questions, and I'm always happy to hear from anyone who can improve this.

Also, someone asked about self.model.xxx. Model is a data object I created so I can easily pass around data that is used, and modified, by more than one object. I know global data is bad form, but there are times it just makes things easier to access. I pass the data model to each new object when it is created. I keep an array of channels, song names, artist names, and other stream-related data inside the model. I also put any data I want to persist across launches here, like settings, and write the data model to a file each time a piece of persistent data changes. In this example, you can keep the data locally. If you need help with the model passing, let me know.

游魂 2024-10-16 04:32:58


OK, here is how I play back the recorded file. When playing a file, the station URL contains the path to the file. self.model.playRecordedSong contains a time value for how many seconds into the stream I want to start playing. I keep a dictionary of song names and time indexes, so I can jump into the recorded stream at the start of any song. Use 0 to start from the beginning.

NSError *error;
NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:self.model.stationURL, [[NSBundle mainBundle] resourcePath]]];
// get the file URL and then create an audio player if we don't already have one.
if (audioPlayer == nil) {
    // set the seconds count to the proper start point (0, or some time into the stream)
    // this will be 0 for start of stream, or some value passed back if they picked a song.
    self.recordPlaySecondsCount = self.model.playRecordedSong;
    // create a new player
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    // set self as delegate so we can catch the end of file.
    [audioPlayer setDelegate:self];
    // the audio player needs an NSTimeInterval. Get it from the seconds start point.
    NSTimeInterval interval = self.model.playRecordedSong;
    // seek to the proper place in the file.
    audioPlayer.currentTime = interval;
}
audioPlayer.numberOfLoops = 0;          // do not repeat

if (audioPlayer == nil)
    NSLog(@"AVAudioPlayer error: %@", error);
    // I need to do more on the error of no player
else {
    [audioPlayer play];
}

I hope this helps you play back the recorded file.

Answered by 勿挽旧人 on 2024-10-16 04:32:58


Try this class; it has a complete solution for radio stream recording and playback. You can find it on GitHub, and it is very easy to use.
