Using warpPerspective() on a sequence of points given by HoughCircles(), OpenCV
I'm trying to detect the positions of billiard balls on a table from an image taken at a perspective angle. I'm using the getPerspectiveTransform() method to find the transformation matrix, and I want to apply it only to the circles I detect using HoughCircles. I'm trying to go from a rather large trapezoidal shape to a smaller rectangular shape. I don't want to do the transformation on the image first and then find the HoughCircles, because the image gets too warped for HoughCircles to provide useful results.
Here's my code:
CvMat mmat = cvCreateMat(3,3,CV_32FC1);
double srcX1 = 462;
double srcX2 = 978;
double srcX3 = 1440;
double srcX4 = 0;
double srcY = 241;
double srcHeight = 772;
double dstX = 56.8;
double dstY = 33.5;
double dstWidth = 262.4;
double dstHeight = 447.3;
CvSeq seq = cvHoughCircles(newGray, circles, CV_HOUGH_GRADIENT, 2.1d, (double)newGray.height()/40, 85d, 65d, 5, 50);
JavaCV.getPerspectiveTransform(
        new double[]{srcX1, srcY, srcX2, srcY, srcX3, srcHeight, srcX4, srcHeight},
        new double[]{dstX, dstY, dstWidth, dstY, dstWidth, dstHeight, dstX, dstHeight}, mmat);
cvWarpPerspective(seq, seq, mmat);
for (int j = 0; j < seq.total(); j++) {
    CvPoint3D32f point = new CvPoint3D32f(cvGetSeqElem(seq, j));
    float xyr[] = {point.x(), point.y(), point.z()};
    CvPoint center = new CvPoint(Math.round(xyr[0]), Math.round(xyr[1]));
    int radius = Math.round(xyr[2]);
    cvCircle(gray, center, 3, CvScalar.GREEN, -1, 8, 0);
    cvCircle(gray, center, radius, CvScalar.BLUE, 3, 8, 0);
}
The problem is I get this error on the warpPerspective() method:
error: (-215) seq->total > 0 && CV_ELEM_SIZE(seq->flags) == seq->elem_size in function cv::Mat cv::cvarrToMat(const CvArr*, bool, bool, int)
Also I guess it's worth mentioning that I'm using JavaCV, in case the method calls look a bit different than what you're used to. Thanks for any help.
Answer:
The problem with what you want to do (besides the obvious: OpenCV won't let you) is that the radius can't really be warped correctly. AFAIK the x,y coordinates are easy enough to calculate:

x' = (m00*x + m01*y + m02) / (m20*x + m21*y + m22)
y' = (m10*x + m11*y + m12) / (m20*x + m21*y + m22)

where m is the transformation matrix. The radius you can hack by transforming all the points of the original circle and then finding the maximum distance between (x', y') and those transformed points (at least if the radius in the warped image is expected to cover all those points).
BTW, mij here denotes the matrix entry m(i,j), just to clarify.
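A minimal sketch of that idea in plain Java (no OpenCV calls, just the two formulas above); the class name, the assumption that m is the 3x3 perspective matrix flattened row by row, and the choice of 16 boundary samples are all my own, not from the original post:

class CircleWarp {
    // Apply the homography m (row-major 3x3) to a single point.
    static double[] warpPoint(double[] m, double x, double y) {
        double w  = m[6] * x + m[7] * y + m[8];        // m20*x + m21*y + m22
        double xp = (m[0] * x + m[1] * y + m[2]) / w;  // x'
        double yp = (m[3] * x + m[4] * y + m[5]) / w;  // y'
        return new double[]{xp, yp};
    }

    // Warp the center, then estimate the radius as the max distance from the
    // warped center to a few warped points sampled on the original circle.
    static double[] warpCircle(double[] m, double cx, double cy, double r) {
        double[] c = warpPoint(m, cx, cy);
        double maxDist = 0;
        for (int i = 0; i < 16; i++) {
            double a = 2 * Math.PI * i / 16;
            double[] p = warpPoint(m, cx + r * Math.cos(a), cy + r * Math.sin(a));
            maxDist = Math.max(maxDist, Math.hypot(p[0] - c[0], p[1] - c[1]));
        }
        return new double[]{c[0], c[1], maxDist};      // x', y', estimated radius
    }
}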
End Answer.
Everything I write is according to the C++ version; I've never used JavaCV, but from what I can see it's just a wrapper that calls the native C++ library.
CvSeq is a sequence data structure that behaves like a linked list.
The assertion your application crashes at is

seq->total > 0 && CV_ELEM_SIZE(seq->flags) == seq->elem_size

which means that either your seq instance is empty (total is the number of elements in the sequence) or somehow the inner seq flags are corrupted.
I'd recommend that you check the total member of your CvSeq, and the cvHoughCircles call.
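For example, a quick sanity check right after the detection (just a hypothetical debug snippet, reusing the variable names from the question):

// If this prints 0, cvHoughCircles found nothing and there is nothing to warp.
System.out.println("circles found: " + seq.total());
for (int j = 0; j < seq.total(); j++) {
    CvPoint3D32f c = new CvPoint3D32f(cvGetSeqElem(seq, j));
    System.out.println("circle " + j + ": x=" + c.x() + " y=" + c.y() + " r=" + c.z());
}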
All of this happens before the actual work of cvWarpPerspective (the conversion of your CvSeq to cv::Mat is the first line of the implementation), so it's not the warping but what you're doing before that.
Anyway, to understand what's wrong with cvHoughCircles we'll need more info about how newGray and circles are created.
Here is an example I've found on the JavaCV page (link).
From what I've seen in the implementation of cvHoughCircles, the result is saved into the circles buffer, and at the end the CvSeq to return is created from it, so if you've allocated the circles buffer wrong, it won't work.
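If the buffer is the problem, the usual pattern with the C-style API is to back the call with a CvMemStorage; a sketch, with the parameter values copied from the question:

// Sketch: let cvHoughCircles write its results into a CvMemStorage-backed CvSeq.
CvMemStorage storage = cvCreateMemStorage(0);
CvSeq seq = cvHoughCircles(newGray, storage, CV_HOUGH_GRADIENT, 2.1,
        (double) newGray.height() / 40, 85, 65, 5, 50);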
EDIT:
As you can see, the CvSeq instance returned from cvHoughCircles is a list of point values, which is probably why the assertion failed. You cannot convert this CvSeq into a cv::Mat, because it simply is not a cv::Mat. To get only the circles returned from cvHoughCircles into a cv::Mat instance, you'll need to create a new cv::Mat instance and then draw onto it all the circles in the CvSeq, as seen in the example provided above.
Then the warping will work (you'll have a cv::Mat instance, which is what the function expects, instead of a CvSeq of circle values).
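A rough sketch of that approach using the same C-style calls as in the question (cvCreateImage, cvGetSize, cvSetZero and cvScalarAll are standard OpenCV C API; drawing into a mask the size of newGray, and the variable names, are my assumptions):

// Sketch: render the detected circles into a fresh single-channel image,
// then warp that image (not the CvSeq) with the perspective matrix.
IplImage circleMask = cvCreateImage(cvGetSize(newGray), IPL_DEPTH_8U, 1);
cvSetZero(circleMask);
for (int j = 0; j < seq.total(); j++) {
    CvPoint3D32f c = new CvPoint3D32f(cvGetSeqElem(seq, j));
    cvCircle(circleMask, new CvPoint(Math.round(c.x()), Math.round(c.y())),
            Math.round(c.z()), cvScalarAll(255), -1, 8, 0);
}
IplImage warped = cvCreateImage(cvGetSize(newGray), IPL_DEPTH_8U, 1);
cvWarpPerspective(circleMask, warped, mmat);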
END EDIT
Here is the C++ reference for CvSeq, and the source code if you want to fiddle with it.
I hope that helps.
BTW, your next error will probably be:
cv::warpPerspective in the C++ OpenCV asserts that

dst.data != src.data

thus

cvWarpPerspective(seq, seq, mmat);

won't work, because your source mat and destination mat reference the same data.
Not all functions in OpenCV (and in image processing in general) work in-situ, either because there is no in-situ algorithm or because it is slower than the other version (e.g. the transpose of an n*n mat will work in-situ, but an n*m mat where n != m will be harder to do in-situ and might be slower).
You can't assume that using the src matrix as the dst will work.
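In other words, something along these lines rather than warping seq into itself (a sketch; the destination size here is just a placeholder for whatever your target rectangle is):

// Sketch: give cvWarpPerspective its own destination, so dst.data != src.data holds.
IplImage warpedTable = cvCreateImage(cvSize(320, 480), IPL_DEPTH_8U, 1);  // placeholder size
cvWarpPerspective(newGray, warpedTable, mmat);  // src and dst are different images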