Translation preface
Three.js is a great open source WebGL library that lets JavaScript drive the GPU and brings true 3D to the browser. The technology is still young, however, and documentation is extremely scarce; enthusiasts mostly have to learn from demo source code and from the library's own source.
0. Introduction
I have already written a "Getting Started" article. If you haven't read it yet, you probably should: this article builds on that tutorial.
I want to talk about shaders. Three.js is excellent at sparing you a lot of tedious work, but sometimes you want to achieve a specific effect, or to understand more deeply what is actually being drawn on your screen, and then shaders will inevitably enter your field of view. If you are like me, you also want to build something more interesting than the basics in the previous tutorial. In this tutorial I will cover the fundamentals of shaders, which in fact do a lot of dull work for us.
Before I start, let me say that this tutorial spends a lot of space explaining shader code; a follow-up tutorial will then build on it and use shaders to actually do something. That is because shaders are not easy to understand at first glance and need some explanation.
1. Two shaders
WebGL has no fixed rendering pipeline, so you cannot use a ready-made black-box shader (translator's note: graphics cards of the last century supported only fixed pipelines); instead, WebGL offers a programmable pipeline, which is more powerful but also harder to understand and use. Long story short, a programmable rendering pipeline means the programmer is responsible for taking a vertex all the way to the screen. Shaders are part of this pipeline, and there are two kinds:
1. Vertex Shader
2. Fragment Shader
What you should know is that both shaders run entirely on the graphics card's GPU. We offload the data they process from the CPU onto the GPU, reducing the CPU's burden. Modern GPUs are heavily optimized for exactly the kinds of operations shaders perform, which makes this worthwhile.
2. Vertex Shader
A primitive shape, such as a sphere, is made up of vertices, right? The vertex shader is invoked once for each of these vertices in turn. How each vertex is handled is freely customizable, but the vertex shader has one obligation: assign a value to the variable gl_Position, a 4-component vector (vec4) that determines where the vertex ends up on screen. This is an interesting process in itself, because we are really talking about how to convert, or project, a three-dimensional coordinate (a vertex with x, y and z values) onto a two-dimensional screen. Thankfully, with a library like Three.js, gl_Position is easy to get right.
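To make the projection step concrete, here is a small JavaScript sketch (illustrative only, not Three.js or shader code) of what multiplying a vertex by a 4×4 matrix means. The matrix is stored column-major, the way GLSL stores it:

```javascript
// Multiply a 4x4 column-major matrix by a 4-component vector,
// the way "matrix * vec4(...)" works in GLSL.
function transform(m, v) {
  var out = [0, 0, 0, 0];
  for (var row = 0; row < 4; row++) {
    for (var col = 0; col < 4; col++) {
      out[row] += m[col * 4 + row] * v[col];
    }
  }
  return out;
}

// A translation matrix that moves points by (1, 2, 3).
var translate = [
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  1, 2, 3, 1
];

var vertex = [2, 3, -5, 1];               // vec4(position, 1.0)
var moved = transform(translate, vertex); // [3, 5, -2, 1]
```

In a real vertex shader this happens twice in a row: once with the model-view matrix and once with the projection matrix.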
3. Fragment shader
Now we have a three-dimensional object made of vertices, projected onto the two-dimensional screen. But where does the color come from? What about textures and lighting? That is exactly what the fragment shader deals with.
Similar to the vertex shader, the fragment shader also has one mandatory task: set (or discard) the variable gl_FragColor, another four-component floating-point vector, which becomes the fragment's final color. But what is a fragment? Imagine a triangle with three vertices: the fragments are all the points inside the triangle once those three vertices have been processed. A fragment's values are therefore generated by interpolating the vertices' values. If one vertex is red and an adjacent vertex is blue, we see the color grade from red near the red vertex, through purple, to blue near the blue vertex.
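That interpolation is just a weighted blend. Here is a tiny JavaScript sketch (illustrative, not shader code) of the fragment color partway between a red vertex and a blue one:

```javascript
// Linearly interpolate two RGB colors; t = 0 gives a, t = 1 gives b.
function lerpColor(a, b, t) {
  return [
    a[0] + (b[0] - a[0]) * t,
    a[1] + (b[1] - a[1]) * t,
    a[2] + (b[2] - a[2]) * t
  ];
}

var red  = [1.0, 0.0, 0.0];
var blue = [0.0, 0.0, 1.0];

var midpoint = lerpColor(red, blue, 0.5); // [0.5, 0, 0.5] -- purple
```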
4. Shader variables
Shader variables come in three kinds: uniforms, attributes, and varyings. When I first heard these three words I was confused, because they did not match anything I had used before. But you can understand them like this:
Uniforms variables can be passed to both the vertex shader and the fragment shader; they hold values that stay the same across the whole render, such as the position of a point light.
Attributes variables correspond to individual vertices and can only be passed into the vertex shader, for example a color for each vertex. Attributes map one-to-one onto vertices.
Varyings variables are declared in the vertex shader and handed on to the fragment shader. To make this work, the variable must be declared with exactly the same type and name in both shaders. A classic use is the normal vector, since normals are needed when computing lighting.
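As a sketch of how a varying travels between the two shaders (illustrative GLSL; the name vNormal is chosen here, not taken from this tutorial), the vertex shader writes the value and the fragment shader, declaring it with the same type and name, reads the interpolated result:

```glsl
// --- vertex shader: write the varying ---
varying vec3 vNormal;

void main() {
  vNormal = normal; // per-vertex attribute provided by Three.js
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// --- fragment shader: declare it identically, read the interpolated value ---
varying vec3 vNormal;

void main() {
  gl_FragColor = vec4(vNormal * 0.5 + 0.5, 1.0); // visualize normals as color
}
```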
In the next tutorial I will use all three kinds of variables, and you will see how to apply them in practice.
Now that we have covered the vertex shader, the fragment shader, and the three kinds of shader variables, it's time to look at the simplest shaders we can create.
5. Bonjour, World (translator grumbles: must you show off your French?)
Here is the simplest vertex shader:
/**
 * Multiply each vertex coordinate by the model-view matrix
 * and then by the projection matrix to get its coordinates
 * on the 2D screen.
 */
void main() {
  gl_Position = projectionMatrix *
                modelViewMatrix *
                vec4(position, 1.0);
}
And here is the simplest fragment shader:
/**
 * Set the color of every fragment to pink.
 */
void main() {
  gl_FragColor = vec4(1.0,  // R
                      0.0,  // G
                      1.0,  // B
                      1.0); // A
}
That's all there is to it. If you run this now, you will see a flat, unlit pink shape on screen. Not very complicated, is it?
In the vertex shader we use a couple of uniforms that are passed in for us: two 4×4 matrices, the model-view matrix and the projection matrix. You don't need to know exactly how these matrices work; in short, they describe how a three-dimensional coordinate is projected onto the two-dimensional screen.
Actually, I glossed over something in those two snippets. Three.js prepends a block of code to your own shader code, so you don't have to worry about declaring these matrices yourself. In fact it prepends quite a lot of other things too, such as lighting data, vertex colors and vertex normals. Without it, you would have to create and set up all of these yourself. Really.
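For reference, the prepended block includes declarations along these lines (an abridged sketch; the exact list depends on the Three.js version):

```glsl
// A few of the declarations Three.js prepends to every vertex shader:
uniform mat4 modelViewMatrix;   // camera transform * object transform
uniform mat4 projectionMatrix;  // 3D -> 2D projection
uniform mat3 normalMatrix;      // for transforming normals
attribute vec3 position;        // per-vertex position
attribute vec3 normal;          // per-vertex normal
attribute vec2 uv;              // per-vertex texture coordinate
```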
6. Using the shader material
/**
 * Assuming jQuery is available,
 * pull the shader source text out of the DOM.
 */
var vShader = $('#vertexshader');
var fShader = $('#fragmentshader');

var shaderMaterial =
  new THREE.ShaderMaterial({
    vertexShader:   vShader.text(),
    fragmentShader: fShader.text()
  });
From here on, your shaders will be compiled and run, wired up to the material you created, which in turn is attached to your mesh. It couldn't really be much easier than that. Well, perhaps it could, but this is 3D programming in the browser, and I think you should expect a certain amount of complexity.
We can also give the shader material two more properties: uniforms and attributes. Their values can be vectors, integers or floats, but as I said before, uniforms stay the same for every vertex during a render pass, so they tend to be single values, while attributes are per-vertex and should therefore be arrays. In a mesh, attributes and vertices must correspond one-to-one.
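As a sketch of those shapes (with hypothetical names uTime and aDisplacement, and plain numbers standing in for THREE types):

```javascript
// A uniform is one value for the whole mesh; an attribute is an
// array with exactly one entry per vertex.
var uniforms = {
  uTime: { type: 'f', value: 0.0 }         // single float, same for every vertex
};

var attributes = {
  aDisplacement: { type: 'f', value: [] }  // filled below, one entry per vertex
};

var vertexCount = 4; // e.g. a small mesh with 4 vertices
for (var i = 0; i < vertexCount; i++) {
  attributes.aDisplacement.value.push(0.1 * i);
}
```

These objects would then be passed alongside vertexShader and fragmentShader when creating the material.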
7. Summary
That's all for this tutorial. I've covered quite a lot, but in many places I've only skimmed the surface. In the next tutorial I will walk through a more complex shader, passing in some attributes and uniforms to do some simulated lighting.
I've packed up the source code for this tutorial; you can download it for reference.