So I've been wanting to make a timelapse while using my resin printer.
There are some known ways to do this:
https://www.hackster.io/ryanjgill2/msla-smooth-time-lapse-199d87
or by using a DSLR camera and a photoresistor via a 2.5mm jack as a remote trigger.
My wife is very protective of her DSLR, so that is out of the question.
I am by no means a coder; I know how to set up configs for printers, and I'd like to believe I can understand a few lines, but I'll stick to what I do best: mechanical design engineering and modelling!
So back to the Hackster model.
I have a Pi already connected to the printer as I run it off OctoPrint.
The timelapse included in OctoPrint won't work for this, though, as it can only capture on a fixed time interval, so the result is lacking.
The Hackster build runs off a PiCamera, which I don't have, and I think 90 € is a high price for what it does, so I must rely on my C920HD USB camera.
Looking at the code made by Ryan:
from gpiozero import Button
from picamera import PiCamera
from signal import pause
import time

camera = PiCamera()
camera.resolution = '3280x2464'

currentTime = int

def capture():
    currentTime = int(round(time.time() * 1000))
    image_path = '/mnt/usb/photos/image_%s.jpg' % currentTime
    camera.capture(image_path)
    print('Image captured: %d' % currentTime)

button = Button(14)
button.when_pressed = capture
pause()
I can conclude that I need to edit some of it.
I already know that my C920 can be used via fswebcam or ffmpeg, and I know the resolution, though I don't know what actually needs to be defined here.
Either way, the full command to take a picture with fswebcam could look like this:
fswebcam -r 1920x1080 --no-banner /images/image1.jpg
I also know that I can use os.system to handle this.
So with the little to no knowledge I actually hold, I came up with this, but I'm obviously here because it didn't work.
import time
import os
from gpiozero import Button
from signal import pause

currentTime = int

timelapse = os.system('fswebcam -r 1920x1080 -S 3 --jpeg 50 --save /mnt/usb/photos/image_%s.jpg' % currentTime)

def capture():
    currentTime = int(round(time.time() * 1000))
    image_path = '/mnt/usb/photos/image_%s.jpg' % currentTime
    camera.capture(image_path)
    print('Image captured: %d' % currentTime)
    while True:
        os.system('fswebcam -r 1920x1080 -S 3 --jpeg 50 --save /mnt/usb/photos/image_%s.jpg' % currentTime)

button = Button(14)
button.when_pressed = timelapse
pause()
And I would like to create a folder for every new timelapse I run, just for ease of organising.
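Something like this is what I have in mind (just a sketch - new_session_dir is a name I made up, and the base path is the one from my script above):

```python
import os
import time

def new_session_dir(base='/mnt/usb/photos'):
    # One folder per timelapse run, named after its start time
    session = time.strftime('%Y%m%d-%H%M%S')
    path = os.path.join(base, session)
    os.makedirs(path, exist_ok=True)  # creates parents too; no error if it exists
    return path
```

Each capture would then save into that session folder instead of one shared photos folder.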
After shooting the pictures I would use:
ffmpeg -framerate 120 -pattern_type glob -i "photos/*.jpg" -s:v 1920x1080 -c:v libx264 -crf 20 -pix_fmt yuv420p timelapse.mp4
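If the whole thing ends up in Python anyway, I guess that ffmpeg call could be built there too (again just a sketch - assemble_cmd is a name I made up; it simply mirrors the command above):

```python
def assemble_cmd(photos_dir, out='timelapse.mp4', fps=120):
    # Same ffmpeg invocation as above, pointed at one session folder
    return ['ffmpeg', '-framerate', str(fps),
            '-pattern_type', 'glob', '-i', '%s/*.jpg' % photos_dir,
            '-s:v', '1920x1080', '-c:v', 'libx264',
            '-crf', '20', '-pix_fmt', 'yuv420p', out]

# Later, once the pictures exist:
#     subprocess.run(assemble_cmd('photos'), check=True)
```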
But as of writing this, it just occurred to me that I'm trying to use both fswebcam and ffmpeg combined.
Have I screwed up completely?
I can clarify the GPIO etc. if needed, but I'm assuming folks here have an idea of what's going on.
There's certainly nothing wrong with using the appropriate tool for each part of the job - fswebcam is great for grabbing images from a USB webcam, and ffmpeg is great at converting media from one sort to another. There are some other options for accessing a USB webcam through Python, but those are somewhat more involved.
If you're happy stringing commands together as you've done there, then the following should work:
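Roughly along these lines (a sketch, not tested on your hardware - it shells out to fswebcam from the button callback, keeping pin 14 and the paths from your question; the gpiozero import sits inside the main block so the capture part can be tried without a Pi):

```python
import os
import time

def capture():
    # Millisecond timestamp keeps each filename unique
    current_time = int(round(time.time() * 1000))
    image_path = '/mnt/usb/photos/image_%s.jpg' % current_time
    # fswebcam does the actual grab; -S 3 skips frames while the C920 settles
    os.system('fswebcam -r 1920x1080 -S 3 --no-banner --jpeg 50 --save %s'
              % image_path)
    print('Image captured: %d' % current_time)
    return image_path

if __name__ == '__main__':
    from gpiozero import Button  # Pi-only bits kept out of the importable part
    from signal import pause
    button = Button(14)
    button.when_pressed = capture  # the function itself, not capture()
    pause()
```

The key fixes over your attempt: the fswebcam call goes inside capture() (so a fresh timestamp is used each press), the function itself is assigned to when_pressed, and there's no while True loop blocking everything.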