How do Minecraft shaders know about objects when only pixels are the input?

Posted 2025-01-30 13:44:34


Comments (1)

不气馁 2025-02-06 13:44:34


Describing a shader as something that "gets a set of pixels as an input and returns a bunch of new pixels" is a bit of an oversimplification. I don't know how familiar you are with shaders, so I'll walk through several concepts relevant to your question.

The rendering process, called the rendering pipeline, is executed by the GPU and is split into several stages. For some of these stages you can implement a custom shader program. Thus each programmable rendering stage has its own shader, whose inputs and outputs depend on the stage: https://www.khronos.org/opengl/wiki/Rendering_Pipeline_Overview

For Minecraft, you can find some information about the shader layout on the wiki: https://minecraft.fandom.com/wiki/Shaders

Here is also a page listing detailed resources about shader programming and Optifine's pipeline: https://wiki.shaderlabs.org/wiki/Getting_Started
Be sure to check it, as you can probably find most of the information you are looking for there.

In any case, I'll try to give a brief summary of shaders and how they can "know" details about the scene (positions, entities...) beyond just pixel colors.

Vertex and fragment shaders

As stated previously, there are several types of shaders. The two "main" shaders executed during rendering are the vertex shader and the fragment shader (sometimes called the pixel shader). In Minecraft shaderpacks, they have the extensions .vsh and .fsh, respectively.

The vertex shader is the first shader executed during rendering: it takes the scene's vertices and their attributes as inputs, and outputs new vertices. This is where you can implement transformations and displace the vertices of a mesh for certain effects.

The fragment shader is the last shader executed during rendering, and is the one that generates the image to be put on the screen. It takes pixel coordinates, as well as data calculated in the previous stages (the vertex shader), and returns a color for each pixel.
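For illustration, here is a minimal sketch of such a pair, in the legacy GLSL 1.20 style used by Optifine shaderpacks (the sine-wave displacement is just an arbitrary example effect; frameTimeCounter is a time uniform that Optifine provides):

// Vertex shader (.vsh): displace each vertex with a sine wave.
#version 120

uniform float frameTimeCounter; // time in seconds, provided by Optifine

void main() {
    vec4 pos = gl_Vertex;
    // Offset the vertex vertically to create a wobble effect.
    pos.y += 0.1 * sin(pos.x * 2.0 + frameTimeCounter);
    gl_Position = gl_ModelViewProjectionMatrix * pos;
    gl_FrontColor = gl_Color;
}

// Fragment shader (.fsh): output a color for each covered pixel.
#version 120

void main() {
    gl_FragColor = gl_Color; // interpolated vertex color
}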

Passing data from shader to shader

Attributes are special per-vertex data passed to the GPU that you can use within a shader. For example, you can have, for each vertex in the scene, an integer variable indicating which object or entity the vertex belongs to, and use it in your vertex shader code.
You can also generate new data that is then passed from the vertex shader to the fragment shader's input; these special variables are called varyings.

In GLSL, attributes and varyings are special variables you declare in your scripts. A vertex shader may contain:

// A variable that receives a value passed by the actual program
// on the CPU side.
attribute float someAttribute;

// A variable that you can fill with a value from this shader,
// and its content will be passed to the fragment shader.
varying vec3 someOutput;

Then in the fragment shader:

// Value of the variable with the same name from the vertex shader.
varying vec3 someOutput;

Each vertex has its own someAttribute and someOutput variable.

(You may be wondering how a value assigned per vertex can then be passed to each pixel. I won't go into details, but what happens is that the value of someOutput that the fragment shader receives for a pixel is an interpolation between the values of the someOutput variables at the vertices of the triangle that has been projected onto that pixel.)
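As a rough sketch, the two snippets above could be completed like this, forwarding a per-vertex value through the varying (the ID encoding is just an illustrative choice):

// Vertex shader
attribute float someAttribute;
varying vec3 someOutput;

void main() {
    // Pack the per-vertex value (e.g. an entity ID) into the varying.
    someOutput = vec3(someAttribute, 0.0, 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// Fragment shader
varying vec3 someOutput;

void main() {
    // someOutput arrives here interpolated across the triangle.
    gl_FragColor = vec4(someOutput, 1.0);
}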

Buffers

Another way a shader can access more information than just vertices and pixels (and the one probably more relevant to your question) is by using buffers. Often, rendering a scene to the screen is done in more than one pass. You may render shadows into one texture, then the geometry into another one, and finally combine these textures with another shader to create the final image output to the screen.

In Minecraft Optifine shaderpacks, for example, you can write a specific shader to render an image into a buffer texture, where the color of each pixel represents some specific data you want to process independently. You can, for example, use a buffer where you render only the entity you are aiming at, coloring pixels that belong to the entity white and all other pixels black, and store the resulting image in a texture. You can then access this mask texture from another shader to add some post-process effects, for example highlighting the area of the entity you are aiming at. Accessing a texture in a shader is simply done with special functions such as texture(), which fetch the pixel color at some coordinates from a texture referenced by a variable (of type sampler2D).
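A hedged sketch of that post-process idea (the sampler names sceneColor and entityMask are made up for illustration; in an Optifine pack they would be buffers such as colortex0, and in GLSL 1.20 the sampling function is texture2D rather than texture):

#version 120

uniform sampler2D sceneColor; // the rendered scene (hypothetical name)
uniform sampler2D entityMask; // white where the targeted entity was drawn

varying vec2 texCoord; // passed from a trivial vertex shader

void main() {
    vec3 color = texture2D(sceneColor, texCoord).rgb;
    float mask = texture2D(entityMask, texCoord).r;

    // Blend toward yellow wherever the mask is white,
    // highlighting the targeted entity.
    vec3 highlighted = mix(color, vec3(1.0, 1.0, 0.0), 0.5);
    gl_FragColor = vec4(mix(color, highlighted, mask), 1.0);
}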

As for how a shader chooses which objects to render or not: this is decided on the CPU side. The program (here Minecraft) basically tells the GPU, via the OpenGL API, which 3D models to render, with which shader scripts, and whether the generated image must be sent directly to the screen or stored in a texture. The program can ask the GPU to switch quickly between different preloaded rendering programs with different shaders and settings.

In Optifine, the shader files for these buffers correspond to the files starting with gbuffers_. Taken directly from the doc (https://pastebin.com/aB5MJ7aN):

These files are used to render terrain, entities,
the sky, and almost everything else in the game.
The specific name of the file tells you
a bit more about what it's used to render.
Skybasic runs first, and handles the main sky color.
This is followed by skytextured, which handles the sun and moon.
Up next comes terrain, which handles all opaque blocks.

On that same link, they give an example of how this can be used:

Create 2 buffers. one is a material buffer,
the other is the translucent buffer.
Make all transparent objects output their
color to the translucent buffer, and a
number representing their ID (passed in
with varyings) to the material buffer.
Composite [a type of shader defined by Optifine] can read the material buffer,
and mix the translucent buffer with the opaque
color buffer differently depending on the ID.
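Here is a sketch of what such a composite shader could look like (the buffer names and the ID encoding are illustrative assumptions, not Optifine's actual assignments):

#version 120

uniform sampler2D opaqueColor;      // opaque color buffer
uniform sampler2D translucentColor; // translucent buffer
uniform sampler2D materialBuffer;   // per-pixel material IDs

varying vec2 texCoord;

void main() {
    vec4 opaque      = texture2D(opaqueColor, texCoord);
    vec4 translucent = texture2D(translucentColor, texCoord);
    float id = texture2D(materialBuffer, texCoord).r * 255.0; // decode the ID

    // Mix differently depending on the ID: e.g. blend water (here ID 1)
    // more strongly than other translucent materials.
    float strength = (abs(id - 1.0) < 0.5) ? 0.9 : 0.6;
    gl_FragColor = mix(opaque, translucent, translucent.a * strength);
}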

I hope this helps answer your question. I also recommend looking directly into the source files of some shaderpacks to see how these are implemented.
