Marker-based AR Android application using the Qualcomm SDK

Published 2024-12-01 08:22:52


I am developing an Android application in which a specific video is played when the poster of a specific movie is shown in front of the camera.
Earlier I was using the AndAR project for this. I made some changes in the draw() function of CustomObject and got it working. But now my client wants me to use the Qualcomm SDK.
I was playing with the ImageTargets sample application but couldn't find anything to get my head around.


Answer by 街角迷惘, 2024-12-08 08:22:52


OK, I got it working. The following is the solution, which I found at this link:

https://ar.qualcomm.at/arforums/showthread.php?t=32

The ImageTargets.cpp I had already contained a renderFrame method,
so I had to modify it a little:

    JNIEXPORT void JNICALL
    Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_renderFrame(JNIEnv* env, jobject obj)
    {
        // Clear the color and depth buffers
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // Render the video background:
        QCAR::State state = QCAR::Renderer::getInstance().begin();

    #ifdef USE_OPENGL_ES_1_1
        // Set GL11 flags:
        glEnableClientState(GL_VERTEX_ARRAY);
        glEnableClientState(GL_NORMAL_ARRAY);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);

        glEnable(GL_TEXTURE_2D);
        glDisable(GL_LIGHTING);
    #endif

        glEnable(GL_DEPTH_TEST);
        glEnable(GL_CULL_FACE);

        // Did we find any trackables this frame?
        for (int tIdx = 0; tIdx < state.getNumActiveTrackables(); tIdx++)
        {
            // Get the trackable:
            const QCAR::Trackable* trackable = state.getActiveTrackable(tIdx);
            QCAR::Matrix44F modelViewMatrix =
                QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

            // Choose the texture based on the target name:
            int textureIndex = (!strcmp(trackable->getName(), "stones")) ? 0 : 1;
            const Texture* const thisTexture = textures[textureIndex];

            // Call back into Java with the name of the detected target.
            // displayMessage returns void, so CallVoidMethod is the correct
            // JNI call here (CallObjectMethod would be for methods that
            // return an object).
            jstring js = env->NewStringUTF(trackable->getName());
            jclass javaClass = env->GetObjectClass(obj);
            jmethodID method = env->GetMethodID(javaClass, "displayMessage",
                                                "(Ljava/lang/String;)V");
            env->CallVoidMethod(obj, method, js);
            env->DeleteLocalRef(js);
        }

        glDisable(GL_DEPTH_TEST);

    #ifdef USE_OPENGL_ES_1_1
        glDisable(GL_TEXTURE_2D);
        glDisableClientState(GL_VERTEX_ARRAY);
        glDisableClientState(GL_NORMAL_ARRAY);
        glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    #else
        glDisableVertexAttribArray(vertexHandle);
        glDisableVertexAttribArray(normalHandle);
        glDisableVertexAttribArray(textureCoordHandle);
    #endif

        QCAR::Renderer::getInstance().end();
    }
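The native code above calls a `displayMessage(String)` method on the renderer object via JNI, but that Java method is not shown in the answer. A minimal sketch of what it might look like in `ImageTargetsRenderer`, assuming it simply forwards the target name to the static `mainActivityHandler` field that the activity installs in onResume() (the class, field, and method names come from the answer; the message-passing details are an assumption):

```java
import android.os.Handler;
import android.os.Message;

public class ImageTargetsRenderer {
    // Set by the ImageTargets activity in onResume(); because the Handler is
    // created on the UI thread, handleMessage() also runs on the UI thread.
    public static Handler mainActivityHandler;

    // Called from native renderFrame() via JNI whenever a target is tracked.
    public void displayMessage(String targetName) {
        if (mainActivityHandler != null) {
            Message msg = Message.obtain();
            msg.obj = targetName; // pass the target name along, if needed
            mainActivityHandler.sendMessage(msg);
        }
    }
}
```

Posting a `Message` instead of starting the activity directly from here matters because renderFrame() runs on the GL thread, while `startActivity()` must be called from the UI thread.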

After making those changes in the ImageTargetsRenderer class, I added the following code in onResume():

    ImageTargetsRenderer.mainActivityHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            Intent intent = new Intent(Intent.ACTION_VIEW);
            intent.setData(Uri.parse("http://www.youtube.com/watch?v=DyDA2Abnssg"));
            startActivity(intent);
            ImageTargets.this.finish();
        }
    };