What is needsUpdate in three.js

2025-04-05 Update From: SLTechnology News&Howtos

Shulou (Shulou.com) 06/03 Report

This article explains what `needsUpdate` is in three.js. It is concise and easy to follow, and should give you a clear picture of how the attribute works. I hope you get something out of the detailed walkthrough.

Many objects in three.js have a `needsUpdate` attribute. It is rarely mentioned in the documentation (the three.js docs are thin in general; many questions are only answered in GitHub issues), and most tutorials on the Internet skip it as well, because a beginner writing simple scenes never needs it.

So what exactly is this attribute for? In one sentence: it tells the renderer to update its cache on this frame. Using it is as simple as setting a flag, but it is worth knowing why the flag exists and which cache it updates.

Why do you need needsUpdate

First, why is caching needed at all? Caches generally exist to reduce the number of data transfers, and thus the time a program spends moving data around. Getting an object (a Mesh) onto the screen is no small feat: its data has to be transferred three times.

First, all vertex data and texture data are read from local disk (or, in the browser, over the network) into memory.

Then, after some processing in memory, the program transfers the vertex and texture data of the objects to be drawn into video memory.

Finally, on every rendered frame, the vertex and texture data in video memory are flushed to the GPU for assembly and drawing.

According to the pyramid model of data transfer, the first step is clearly the slowest, and slower still when the data comes over the network, as it does in a WebGL environment. Next comes the memory-to-video-memory transfer, for which a simple benchmark follows.

Next, consider how often each of the three steps happens. For small scenes the first step is one-off: all scene data is loaded into memory when the program initializes. Large scenes may load assets asynchronously, but that is outside our scope here. The frequency of the second step is the main topic of this article, so let us start with a small program that measures the cost of that transfer.

The code is as follows:

```javascript
var canvas = document.createElement('canvas');
var _gl = canvas.getContext('experimental-webgl');

// 1000 vertices, 3 coordinates each
var vertices = [];
for (var i = 0; i < 1000 * 3; i++) {
    vertices.push(i * Math.random());
}

var buffer = _gl.createBuffer();

console.profile('buffer_test');
bindBuffer();
console.profileEnd('buffer_test');

function bindBuffer() {
    // simulate 1000 objects being uploaded
    for (var i = 0; i < 1000; i++) {
        _gl.bindBuffer(_gl.ARRAY_BUFFER, buffer);
        _gl.bufferData(_gl.ARRAY_BUFFER, new Float32Array(vertices), _gl.STATIC_DRAW);
    }
}
```

A brief explanation of this program: `vertices` is an array of vertex data. Here 1000 vertices are generated randomly, and since each vertex has three coordinates, the array holds 3000 numbers. `_gl.createBuffer` allocates a buffer in video memory for vertex data, and `_gl.bufferData` then copies the vertex data from memory into that buffer. Suppose a scene contains 1000 objects of 1000 vertices each, with each vertex holding three 32-bit (4-byte) floats; that works out to about 1000 × 1000 × 12 bytes ≈ 11 MB, and the profile shows the transfer costs roughly 15 ms. That may not sound like much, but a real-time program targeting 30 fps has only about 33 ms per frame, so this one transfer would eat half the budget. The bulk of each frame should go to GPU drawing and CPU-side processing, so every step of the rendering loop has to be budgeted stingily.

So we should minimize the number of such transfers. In fact, we can transfer all vertex and texture data from memory to video memory as soon as it is loaded, and that is exactly what three.js does: the first time an object's Geometry is drawn, its vertex data is transferred to video memory and the buffer is cached on `geometry.__webglVertexBuffer`. On every subsequent draw, the renderer checks the Geometry's `verticesNeedUpdate` attribute. If no update is needed, it uses the current cache directly; if `verticesNeedUpdate` is true, it re-transfers the vertex data in the Geometry to `geometry.__webglVertexBuffer`. Static objects never need this step, but objects whose vertices change every frame, such as a particle system that uses vertices as particles, or a Mesh driven by skeletal animation, must set `verticesNeedUpdate` to true on each frame to tell the renderer: I need to retransmit the data!
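As a sketch of that pattern (the helper name is made up, and the legacy three.js Geometry API with a `vertices` array is assumed), a per-frame vertex update might look like:

```javascript
// Sketch (hypothetical names, legacy three.js Geometry assumed):
// move every vertex each frame, then raise the flag so the renderer
// re-uploads the position buffer on this frame.
function animateVertices(geometry, time) {
  for (var i = 0; i < geometry.vertices.length; i++) {
    geometry.vertices[i].y = Math.sin(time + i); // any per-frame motion
  }
  geometry.verticesNeedUpdate = true; // re-transfer positions this frame
}
```

Without the last line, the renderer would keep drawing the stale buffer cached in video memory.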

In practice, WebGL programs more often implement particle effects and skeletal animation by moving the vertices in the vertex shader. Doing it on the CPU side is easier to extend, but because of JavaScript's limited computing power, these computation-heavy operations tend to be pushed onto the GPU. In that case the vertex data never needs retransmitting, so the scenario above is not very common in real programs; updating texture and material caches is far more common.

The example above covered transferring vertex data. The other big consumer is textures: a single 1024 × 1024 R8G8B8A8 texture occupies 4 MB of memory. Consider the following example.

The code is as follows:

```javascript
var canvas = document.createElement('canvas');
var _gl = canvas.getContext('experimental-webgl');

var texture = _gl.createTexture();
var img = new Image();

img.onload = function () {
    console.profile('texture_test');
    bindTexture();
    console.profileEnd('texture_test');
};
img.src = 'test_tex.jpg';

function bindTexture() {
    _gl.bindTexture(_gl.TEXTURE_2D, texture);
    _gl.texImage2D(_gl.TEXTURE_2D, 0, _gl.RGBA, _gl.RGBA, _gl.UNSIGNED_BYTE, img);
}
```

There is no need to repeat the transfer 1000 times this time: a single upload of a 1024 × 1024 texture already costs about 30 ms, and a 256 × 256 one about 2 ms. So three.js uploads a texture only once, at the start, whenever possible; unless `texture.needsUpdate` is manually set to true, the copy already in video memory is used directly.

Which caches need to be updated

The two examples above explain why three.js has a `needsUpdate` attribute at all. The following scenarios show when these caches need to be updated manually.

Asynchronous loading of textures

This is a small pitfall, because images load asynchronously in the browser. If you write `texture.needsUpdate = true` immediately after creating the `img`, the three.js renderer will call `_gl.texImage2D` on that frame to upload still-empty texture data to video memory and then reset the flag to false; when the image finishes loading afterwards, the data in video memory is never refreshed. So `texture.needsUpdate = true` must be set in the `onload` event, after the whole image has loaded.
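A minimal sketch of the correct ordering (the helper name is made up; `texture` and `img` stand in for a `THREE.Texture` and its backing `Image`):

```javascript
// Sketch: set needsUpdate only inside onload, never right after creating
// the image -- before onload fires there is no pixel data to upload yet.
function loadTexture(texture, img, src) {
  img.onload = function () {
    texture.needsUpdate = true; // image is in memory; safe to upload now
  };
  img.src = src; // setting the flag here instead would upload an empty texture
}
```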

Video texture

Most textures are loaded and uploaded once, like the case above, but not video textures. A video is a stream of images, showing a different picture every frame, so `needsUpdate` must be set to true on every frame to refresh the texture data on the graphics card.
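A sketch of a per-frame video update (the helper name is made up; `readyState >= 2` means the `<video>` element has at least the current frame decoded):

```javascript
// Sketch: in the render loop, re-upload the video texture whenever the
// <video> element actually has a current frame to show.
function updateVideoTexture(videoTexture, video) {
  if (video.readyState >= 2) {       // HAVE_CURRENT_DATA or better
    videoTexture.needsUpdate = true; // re-transfer this frame's pixels
  }
}
```

Called once per frame, this keeps the texture in video memory in step with the playing video.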

Use render buffer

A render buffer is a special object. Ordinarily a program flushes straight to the screen after drawing the whole scene, but post-processing and screen-space effects (such as screen-space ambient occlusion) require drawing the scene into a render buffer first. That buffer is really a texture, except that it is produced by the previous drawing pass rather than loaded from disk. three.js has a dedicated texture object, `WebGLRenderTarget`, to initialize and hold the render buffer, and this texture too needs `needsUpdate` set to true on each frame.

NeedsUpdate of Material

Materials in three.js are described by `THREE.Material`. A material has no data of its own to transfer, so why does it need a `needsUpdate`? This brings us to shaders. A shader is a program that makes vertex and pixel processing on the GPU programmable. The term comes from shading in painting, and shading on the GPU is similar: computing light and dark to convey an object's material. Since a shader is a program running on the GPU, it must be compiled and linked like any other program. WebGL compiles shaders at runtime, which takes time, so compilation should happen as rarely as possible. three.js therefore compiles and links the shader when the material is initialized, and caches the resulting program object. Ordinarily a material never needs a recompile: adjusting a material only changes the shader's uniform parameters. But if you replace the whole shading model, say swapping the original phong shader for a lambert one, you must set `material.needsUpdate` to true to trigger a fresh compile. That situation is rare; the cases below are far more common.
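For example, a sketch of swapping a mesh's material wholesale (the helper name is made up):

```javascript
// Sketch: replacing the whole material (e.g. phong -> lambert) changes
// the shader, so a recompile must be requested via the flag.
function swapMaterial(mesh, newMaterial) {
  mesh.material = newMaterial;
  mesh.material.needsUpdate = true; // force compile + link of the new shader
}
```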

Add and remove lights

This one should be quite common. Many people who have just started using three.js fall into this pit: they dynamically add a light to the scene and cannot work out why it has no effect. When using three.js's built-in shaders, such as phong and lambert, a look at the renderer source shows that three.js uses `#define` in the built-in shader code to fix the number of lights in the scene; the `#define` values are produced by string concatenation each time the material's shader is rebuilt.

The code is as follows:

```javascript
"#define MAX_DIR_LIGHTS " + parameters.maxDirLights,
"#define MAX_POINT_LIGHTS " + parameters.maxPointLights,
"#define MAX_SPOT_LIGHTS " + parameters.maxSpotLights,
"#define MAX_HEMI_LIGHTS " + parameters.maxHemiLights,
```

This approach does effectively reduce GPU register usage: with only one light, only one uniform variable needs to be declared. But every time the number of lights changes, especially when one is added, the shader has to be re-concatenated, recompiled, and relinked, and in that case `material.needsUpdate` must be set to true on all materials.
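A sketch of that chore (the helper name is made up, and the traversal is written against plain `children` arrays rather than `THREE.Object3D`'s own traverse method):

```javascript
// Sketch: after adding or removing a light, walk the scene graph and
// mark every material so its MAX_*_LIGHTS #defines get rebuilt.
function markMaterialsNeedUpdate(object) {
  if (object.material) {
    object.material.needsUpdate = true; // recompile with the new light count
  }
  (object.children || []).forEach(markMaterialsNeedUpdate);
}
```

Call it on the scene root right after `scene.add(light)` or `scene.remove(light)`.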

Change texture

Changing the texture here does not mean updating the texture data. It means the material used a texture before and no longer does, or did not use one before and now does. If you do not manually force the material to update, the final result will differ from what you expect. The cause is similar to the lights case above: a macro in the shader determines whether a texture is used at all.

The code is as follows:

```javascript
parameters.map ? "#define USE_MAP" : "",
parameters.envMap ? "#define USE_ENVMAP" : "",
parameters.lightMap ? "#define USE_LIGHTMAP" : "",
parameters.bumpMap ? "#define USE_BUMPMAP" : "",
parameters.normalMap ? "#define USE_NORMALMAP" : "",
parameters.specularMap ? "#define USE_SPECULARMAP" : "",
```

So the material needs to be updated whenever the truthiness of `map`, `envMap`, `lightMap`, and so on changes.

Changes in other vertex data

The texture change above actually raises a further problem. If there was no texture at initialization and one is added dynamically later, setting `material.needsUpdate` to true is not enough; `geometry.uvsNeedUpdate` must also be set to true. Why? Again because of a three.js optimization: when the renderer first initializes a geometry and material, if it sees no texture, it does not copy the per-vertex uv data to video memory even though that data exists in memory, the original intent presumably being to save precious video memory. But after a texture is added, the geometry is not smart enough to retransmit the uv data on its own; `uvsNeedUpdate` has to be set manually to tell it that it is time to update the uvs. This problem genuinely tripped me up for a long time at first.
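Putting both flags together, here is a sketch of adding a map to a previously untextured mesh (the helper name is made up; `uvsNeedUpdate` follows the legacy Geometry flags described above):

```javascript
// Sketch: attaching a texture to a material that had none at init time.
function attachMap(mesh, texture) {
  mesh.material.map = texture;
  mesh.material.needsUpdate = true;   // recompile the shader with USE_MAP
  mesh.geometry.uvsNeedUpdate = true; // upload the uv buffer that was skipped
}
```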

For the `needsUpdate` attributes of the various kinds of vertex data, see this wiki page:

https://github.com/mrdoob/three.js/wiki/Updates

Finally

three.js is well optimized, but beneath each optimization lurks a possible pitfall. The best remedy is to read the source code, or to search and file issues on GitHub.

That is what `needsUpdate` in three.js is. I hope you have picked up some knowledge or skills from it.
