Problems using Wavefront .obj texture coordinates in Android OpenGL ES



I'm writing an Android app using OpenGL ES. I followed some online tutorials and managed to load a textured cube using hard-coded vertices/indices/texture coordinates.

As a next step I wrote a parser for Wavefront .obj files. I made a mock file using the vertices etc. from the tutorial, which loads fine.

However, when I use a file made with a 3D modelling package, all the textures get messed up.

Below is how I'm currently getting the texture coordinates:

First I load all the texture coordinates (the vt lines) into a big vector.

Next I find the first two texture coordinates for each f triangle (so f 1/2/3 2/5/2 3/4/1 means I take the 2nd and 5th texture coordinates). Since .obj starts counting from 1 rather than 0, I subtract 1 from the index and then multiply by 2 to get the position of the x coordinate in my vt array; the y coordinate sits at that position + 1.
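
In other words, for the second vertex entry 2/5/2 above, the vt index is 5 and the lookup works out like this (the same arithmetic the parser below uses):

String[] vvtvn = "2/5/2".split("/");                        // v=2, vt=5, vn=2, all 1-based
int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2; // (5 - 1) * 2 = 8
float xcoord = textureCoordinates.get(texturePosition);     // element 8: u of the 5th vt
float ycoord = textureCoordinates.get(texturePosition + 1); // element 9: v of the 5th vt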

I take those texture coordinates that I just found and add them to another vector.

Once I've gone through all the vertices, I turn the vector into a FloatBuffer and pass that to glTexCoordPointer in my draw method.
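
GraphicsUtil.getFloatBuffer isn't shown below; it's assumed to be the usual conversion along these lines (the GL10 client-side pointer calls need a direct, native-order buffer rather than a plain float array):

// Assumed shape of GraphicsUtil.getFloatBuffer (uses java.nio.ByteBuffer, ByteOrder, FloatBuffer).
public static FloatBuffer getFloatBuffer(Vector<Float> values){
    ByteBuffer bb = ByteBuffer.allocateDirect(values.size() * 4); // 4 bytes per float
    bb.order(ByteOrder.nativeOrder());
    FloatBuffer fb = bb.asFloatBuffer();
    for(int i = 0; i < values.size(); i++){
        fb.put(values.get(i));
    }
    fb.position(0);
    return fb;
}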

Here is the code for parsing the file:

private void openObjFile(String filename, Context context, GL10 gl){

    Vector<String> lines = openFile(filename, context); // opens the file

    Vector<String[]> tokens = new Vector<String[]>();

    Vector<Float> vertices = new Vector<Float>();
    Vector<Float> textureCoordinates = new Vector<Float>();
    Vector<Float> vertexNormals = new Vector<Float>();

    // tokenise
    for(int i = 0;i<lines.size();i++){
        String line = lines.get(i);
        tokens.add(line.split(" "));
    }

    for(int j = 0;j<tokens.size();j++){
        String[] linetokens = tokens.get(j);

        // get rid of comments
        //if(linetokens[0].equalsIgnoreCase("#")){
            //tokens.remove(j);
        //}


        // get texture from .mtl file
        if(linetokens[0].equalsIgnoreCase("mtllib")){
            parseMaterials(linetokens[1],context, gl);

        }

        // vertices
        if(linetokens[0].equalsIgnoreCase("v")){
            vertices.add(Float.valueOf(linetokens[1]));
            vertices.add(Float.valueOf(linetokens[2]));
            vertices.add(Float.valueOf(linetokens[3]));
        }


        // texture coordinates
        if(linetokens[0].equalsIgnoreCase("vt")){

            textureCoordinates.add(Float.valueOf(linetokens[1]));
            textureCoordinates.add(Float.valueOf(linetokens[2]));

        }

        // vertex normals
        if(linetokens[0].equalsIgnoreCase("vn")){

            vertexNormals.add(Float.valueOf(linetokens[1]));
            vertexNormals.add(Float.valueOf(linetokens[2]));
            vertexNormals.add(Float.valueOf(linetokens[3]));
        }

    }

    // vertices     
    this.vertices = GraphicsUtil.getFloatBuffer(vertices);


    Mesh mesh = null;

    Vector<Short> indices = null;
    Vector<Float> textureCoordinatesMesh = null;
    Vector<Float> vertexNormalsMesh = null;

    for(int j = 0;j<tokens.size();j++){



        String[] linetokens = tokens.get(j);

        if(linetokens[0].equalsIgnoreCase("g")){

            if(mesh!=null){

                mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
                mesh.setNumindices(indices.size());
                mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
                mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));

                meshes.add(mesh);

            }

            mesh = new Mesh();
            indices = new Vector<Short>();
            textureCoordinatesMesh = new Vector<Float>();
            vertexNormalsMesh = new Vector<Float>();


        } else if(linetokens[0].equalsIgnoreCase("usemtl")){

            String material_name = linetokens[1];

            for(int mn = 0;mn<materials.size();mn++){

                if(materials.get(mn).getName().equalsIgnoreCase(material_name)){
                    mesh.setTextureID(materials.get(mn).getTextureID());
                    mn = materials.size();
                }

            }

        } else if(linetokens[0].equalsIgnoreCase("f")){

            for(int v = 1;v<linetokens.length;v++){

                String[] vvtvn = linetokens[v].split("/");

                short index = Short.parseShort(vvtvn[0]);
                index -= 1;                 
                indices.add(index);

                if(v!=3){
                    int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
                    float xcoord = (textureCoordinates.get(texturePosition));
                    float ycoord = (textureCoordinates.get(texturePosition+1));


                    // normalise
                    if(xcoord>1 || ycoord>1){
                        xcoord = xcoord / Math.max(xcoord, ycoord);
                        ycoord = ycoord / Math.max(xcoord, ycoord);
                    }

                    textureCoordinatesMesh.add(xcoord);
                    textureCoordinatesMesh.add(ycoord);

                }                   

                int normalPosition = (Integer.parseInt(vvtvn[2]) - 1) *3;

                vertexNormalsMesh.add(vertexNormals.get(normalPosition));
                vertexNormalsMesh.add(vertexNormals.get(normalPosition)+1);
                vertexNormalsMesh.add(vertexNormals.get(normalPosition)+2);

            }

        }

    }

    if(mesh!=null){             

        mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
        mesh.setNumindices(indices.size());
        mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
        mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));

        meshes.add(mesh);
    }// Adding the final mesh
}

And here is the code for drawing:

public void draw(GL10 gl){

    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);

    // Counter-clockwise winding.
    gl.glFrontFace(GL10.GL_CCW);
    gl.glEnable(GL10.GL_CULL_FACE);
    gl.glCullFace(GL10.GL_BACK);

    // Pass the vertex buffer in
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0,
                             vertices);

    for(int i=0;i<meshes.size();i++){
        meshes.get(i).draw(gl);
    }

    // Disable the buffers

    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);

}

public void draw(GL10 gl){



    if(textureID>=0){

        // Enable Textures
        gl.glEnable(GL10.GL_TEXTURE_2D);

        // Get specific texture.
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textureID);

        // Use UV coordinates.
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

        // Pass in texture coordinates
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureCoordinates);

    } 

    // Pass in the vertex normals
    gl.glNormalPointer(GL10.GL_FLOAT, 0, normals);

    gl.glEnableClientState(GL10.GL_NORMAL_ARRAY);

    gl.glDrawElements(GL10.GL_TRIANGLES, numindices, GL10.GL_UNSIGNED_SHORT, indices);


    if(textureID>=0){
        // Disable buffers
        gl.glDisableClientState(GL10.GL_NORMAL_ARRAY);
        gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    }

}

I'd really appreciate any help with this. It's frustrating to be not quite able to load the model from a file, and I'm really not sure what I'm doing wrong or missing here.


梦里寻她 2024-11-07 05:01:38


I have to admit to being a little confused by the framing of your code. Specific things I would expect to be issues:

  • you decline to copy a texture coordinate to the final mesh list for the third vertex associated with any face; this should put all of your coordinates out of sync after the first two
  • your texture coordinate normalisation step is unnecessary — to the extent that I'm not sure why it's in there — and probably broken (what if xcoord is larger than ycoord on the first line, then smaller on the second?)
  • OBJ considers (0, 0) to be the top left of a texture, OpenGL considers it to be the bottom left, so unless you've set the texture matrix stack to invert texture coordinates in code not shown, you need to invert them yourself, e.g. textureCoordinatesMesh.add(1.0f - ycoord); (a sketch combining these fixes follows this list)
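
Put together, the inner loop over a face's vertex entries might look something like the sketch below. It's only a sketch against the vectors your parser already builds (textureCoordinates, textureCoordinatesMesh, indices); it copies a UV pair for every vertex of the face, drops the normalisation step, and flips the V coordinate (normal handling omitted):

for(int v = 1; v < linetokens.length; v++){

    String[] vvtvn = linetokens[v].split("/");

    // .obj indices are 1-based, so shift down by one.
    short index = (short)(Short.parseShort(vvtvn[0]) - 1);
    indices.add(index);

    // Copy a UV pair for *every* vertex of the face, not just the first two.
    int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
    float xcoord = textureCoordinates.get(texturePosition);
    float ycoord = textureCoordinates.get(texturePosition + 1);

    textureCoordinatesMesh.add(xcoord);
    textureCoordinatesMesh.add(1.0f - ycoord); // flip V for OpenGL's bottom-left origin
}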

Besides that, some generic OBJ comments that I'm sure you're already well aware of and that don't relate to the problem here: you should expect to handle files that don't supply normals, and files that supply neither normals nor texture coordinates (you currently assume both are present), and OBJ can hold faces with an arbitrary number of vertices, not just triangles. They're always planar and convex, though, so you can draw them as a fan or break them into triangles as though they were a fan.
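
If you do want to handle larger faces, the fan split is just a loop over the face's entries. This is a sketch only; faceEntries stands for the "v/vt/vn" strings of one f line in order (your linetokens[1..]), and emitVertex is a hypothetical helper standing in for the per-vertex work your f branch already does (index, UV and normal lookups):

// Triangulate an n-vertex face as a fan: (v0, v1, v2), (v0, v2, v3), ...
for(int i = 1; i + 1 < faceEntries.length; i++){
    emitVertex(faceEntries[0]);     // fan centre, shared by every triangle
    emitVertex(faceEntries[i]);
    emitVertex(faceEntries[i + 1]);
}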
