How to link/synchronize an animation in real time to a video recording

Posted 2024-11-18 05:00:10

App Description: Speedometer. It has a needle dial with an animated needle as an overlay on the video. I output the animation of the needle onto the video via post-processing. I use AVAssetExportSession and construct an AVComposition containing my animated layers along with the video and audio tracks from the recording. This works fine: the video shows, and the needle animates.

Currently, to replay the animation during post-processing, I have saved off each change in speed along with the time since "recording" of the video began. During post-processing, I then fire off timers based on the saved time/speed data to animate the needle to the next speed.
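
(For concreteness, the bookkeeping described above amounts to something like the following sketch; `speedDidChange:`, `recordingStartTime`, and `speedSamples` are hypothetical names, not code from the app.)

// Hypothetical sketch: record each speed change together with its offset
// from the moment video "recording" started.
- (void)speedDidChange:(float)newSpeed {
    NSTimeInterval offset =
        [NSDate timeIntervalSinceReferenceDate] - self.recordingStartTime;
    [self.speedSamples addObject:@{ @"time" : @(offset), @"speed" : @(newSpeed) }];
}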

Problem: The resulting video/animation pair is not completely accurate, and there is often a mismatch between the speed displayed when the video was taken and the speed displayed when it is played back and composited (usually the needle is ahead of the video), because the compositing/compression during export is not necessarily real-time.

Question: Is there a way I can embed speed information into the recorded video stream and then get access to it when it is exported, so that the video and speedometer are temporally matched up?

It would be nice to get a callback at specific times during export that contains my speed data.

As always...thanks!

Comments (5)

瑾兮 2024-11-25 05:00:10

Instead of using timers to animate your needle, create a keyframe animation based on the speed data you recorded.

Timers and CA don't generally mix well, at least not in the way I infer from your description.
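
For illustration, a minimal sketch of that idea, assuming parallel arrays `sampleTimes`/`speedSamples` (NSNumber values, with samples spanning from 0 to the video's duration), a `needleLayer` to rotate, a hypothetical `angleForSpeed:` helper, and AVFoundation imported for the begin-time constant:

// Build one CAKeyframeAnimation that drives the needle's rotation across the
// whole video, instead of firing timers.
CAKeyframeAnimation *needle =
    [CAKeyframeAnimation animationWithKeyPath:@"transform.rotation.z"];

NSMutableArray *values = [NSMutableArray array];
NSMutableArray *keyTimes = [NSMutableArray array];
for (NSUInteger i = 0; i < speedSamples.count; i++) {
    // angleForSpeed: maps a recorded speed to a needle angle in radians.
    [values addObject:@([self angleForSpeed:[speedSamples[i] floatValue]])];
    // keyTimes are normalized to 0..1 across the video's duration.
    [keyTimes addObject:@([sampleTimes[i] doubleValue] / videoDuration)];
}
needle.values = values;
needle.keyTimes = keyTimes;
needle.duration = videoDuration;
// In a video composition, a beginTime of 0.0 means "now"; AVFoundation
// provides this constant for "start of the timeline".
needle.beginTime = AVCoreAnimationBeginTimeAtZero;
needle.removedOnCompletion = NO;
needle.fillMode = kCAFillModeBoth;
[needleLayer addAnimation:needle forKey:@"needle"];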

π浅易 2024-11-25 05:00:10

If you need to embed the metadata while the app is running on the iPhone, I don't know how to do it. If you can do the embedding beforehand, use HTTP Live Streaming and the HTTP Live Streaming Tools.

The metadata is generated in a file by id3taggenerator and embedded into the video using mediafilesegmenter. Example:

id3taggenerator -o camera1.id3 -text "Dolly camera"
id3taggenerator -o camera2.id3 -text "Tracking camera"

There are several kinds of metadata you can embed, including binary objects; refer to the man page for details. Next, we need to reference the generated files from a "meta macro-file". This is a plain text file with the following format:

60 id3 camera1.id3
120 id3 camera2.id3

The first number is the number of seconds elapsed since the beginning of the video at which you want to insert the notification. I don't remember the mediafilesegmenter command exactly, sorry; you have to pass at least the macro file, the index, and the video file.

The resulting video contains metadata that is posted by the MPMoviePlayerController as notifications. See this page for details: http://jmacmullin.wordpress.com/2010/11/03/adding-meta-data-to-video-in-ios/
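
For the playback side, a hedged sketch of receiving those tags (MPMoviePlayerController and MPTimedMetadata are the old MediaPlayer-framework APIs the linked post relies on; `player` is assumed to be an MPMoviePlayerController playing the segmented stream):

// Observe timed ID3 metadata as the player reaches each embedded tag.
[[NSNotificationCenter defaultCenter]
    addObserverForName:MPMoviePlayerTimedMetadataUpdatedNotification
                object:player
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
        for (MPTimedMetadata *tag in player.timedMetadata) {
            // tag.timestamp is the media time at which the tag was embedded.
            NSLog(@"metadata %@ = %@ at %.2fs", tag.key, tag.value, tag.timestamp);
        }
    }];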

阿楠 2024-11-25 05:00:10

You should use CAAnimations and the beginTime property to set up your animations ahead of time, then use AVVideoComposition + AVVideoCompositionCoreAnimationTool to add them to the video when exporting. Note its documentation states:

Any animations will be interpreted on the video's timeline, not real-time...

So your animations will line up exactly where you specify in the resulting movie.
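
A minimal sketch of that wiring (assuming an existing `composition` built from the source tracks, a `needleLayer` whose CAAnimations use media-timeline beginTimes, and a known `videoSize`; the method name is illustrative):

// Attach the animated layers so they are rendered on the video's timeline
// during export, not in real time.
- (AVAssetExportSession *)exportSessionForComposition:(AVComposition *)composition
                                          needleLayer:(CALayer *)needleLayer
                                            videoSize:(CGSize)videoSize
{
    CALayer *videoLayer = [CALayer layer];
    CALayer *parentLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    videoLayer.frame = parentLayer.frame;
    [parentLayer addSublayer:videoLayer];  // the video frames render here
    [parentLayer addSublayer:needleLayer]; // overlay drawn on top

    AVMutableVideoComposition *videoComposition =
        [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:composition];
    videoComposition.animationTool =
        [AVVideoCompositionCoreAnimationTool
            videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                    inLayer:parentLayer];

    AVAssetExportSession *session =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetHighestQuality];
    session.videoComposition = videoComposition;
    return session;
}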

十年不长 2024-11-25 05:00:10

It's been a while since this question was asked, but after looking everywhere I managed to come up with something similar by sampling data in real time during recording (at 1/30 sec, with a timer, for a video recorded at 30 fps) and storing it in an array. Then, in post-processing, I create multiple CALayers in a loop, one for each data element in the array, and draw the visualisation of that data on each layer.

Each layer has a CAAnimation that fades in its opacity at the correct point on the media timeline via the beginTime attribute, which is simply 1/30 sec multiplied by the array index. That time is so short that the layer appears immediately over the preceding layer. If the layer background is opaque, it obscures the needle rendered in the previous layer, and so appears to animate the needle in pretty good sync with the original video capture. You may have to tweak the timing a little, but I am no more than one frame out.

/******** this has not been compiled but you should get the idea ************/

// Before starting the AVAssetExportSession and after the AVMutableComposition routine

CFTimeInterval animationDuration = 1.0 / 30.0; // one sample per frame at 30 fps

CALayer *speedoBackground = [[CALayer alloc] init]; // background layer for needle layers
[speedoBackground setFrame:CGRectMake(x, y, width, height)]; // size and location
[speedoBackground setBackgroundColor:[[UIColor grayColor] CGColor]];
[speedoBackground setOpacity:0.5]; // partially see-through on video

// loop through the data
for (int index = 0; index < [dataArray count]; index++) {

  CALayer *speedoNeedle = [[CALayer alloc] init]; // layer for needle drawing
  [speedoNeedle setFrame:CGRectMake(x, y, width, height)]; // size and location
  [speedoNeedle setBackgroundColor:[[UIColor redColor] CGColor]];
  [speedoNeedle setOpacity:1.0]; // probably not needed

  // your needle drawing routine for each data point ... e.g.
  [self drawNeedleOnLayer:speedoNeedle
                    angle:[self calculateNeedleAngle:[dataArray objectAtIndex:index]]];

  CABasicAnimation *needleAnimation = [CABasicAnimation animationWithKeyPath:@"opacity"];
  needleAnimation.fromValue = [NSNumber numberWithFloat:0.0f];
  needleAnimation.toValue = [NSNumber numberWithFloat:1.0f]; // fade in
  needleAnimation.additive = NO;
  needleAnimation.removedOnCompletion = NO; // it obscures previous layers
  // NB: in a video composition a beginTime of 0.0 means "now", so map the
  // first sample to AVCoreAnimationBeginTimeAtZero
  needleAnimation.beginTime =
      (index == 0) ? AVCoreAnimationBeginTimeAtZero : index * animationDuration;
  needleAnimation.duration = animationDuration - 0.03; // it will not animate at this speed, but the layer will appear immediately over the previous layer at the correct media time
  needleAnimation.fillMode = kCAFillModeBoth;
  [speedoNeedle addAnimation:needleAnimation forKey:nil];
  [speedoBackground addSublayer:speedoNeedle];
}

[parentLayer addSublayer:speedoBackground];

.
.
.
// when the AVAssetExportSession has finished, make sure you clear all the layers
parentLayer.sublayers = nil;

It is processor and memory intensive, so it's not great for long videos or complex drawing. I am sure there are more elegant methods, but this works and I hope it helps.

星星的軌跡 2024-11-25 05:00:10

There is a session from this year's WWDC that might provide a different approach to what you're doing. You can see the videos here: http://developer.apple.com/videos/wwdc/2011/. Look for the one called "Working with Media in AVFoundation". The interesting bits are around minute 26 or so. I'm not completely sure I understand the problem, but when I read it, that session occurred to me.

Best regards.
