Simulating Rainy Days with Water Effects in Unity3D


This article introduces how to use water effects in Unity3D to simulate rainy days. Many people have questions about this topic, so the following walks through a simple, practical approach; I hope it helps clear up any doubts.

Realistic games often need rain scenes. Since rain soaks almost every object in the environment, there are many aspects to consider. Below we analyze how to render a rainy scene in terms of particles, materials, and script control.

Material highlights:

In Unity's Standard lighting model, GGX is used as the specular term of the BRDF. GGX has a long, trailing falloff, which makes it well suited to simulating the reflections of wet objects. First, take a look at this photo of the Imperial Palace in the rain (image from the web, will be removed on request):

To an ordinary tourist the ground has simply become "shiny", but to a rendering engineer this "shiny" look breaks down into four PBR adjustments: higher Smoothness, a brighter Specular Color, weaker GI/Occlusion, and a lower normal map weight. To make the surface look smooth we obviously have to reduce roughness first, but that alone is not enough. When water sits on a surface, it is the water rather than the object itself that reflects the light, so the object's own diffuse contribution drops and a drenched object looks darker. Physically the object's own color does not change, so with the overall proportion of reflected light unchanged the specular contribution must rise, which shows up as stronger highlights. Likewise, because standing water sits on the surface, the influence of ambient light increases, so we should reduce the contribution of the Occlusion Map and lower the strength of the normal map. All of these controls are exposed by Unity's Standard shader, so no hand-written shader is required. Of course, if you want to handle a large scene, drive everything with global variables, or optimize precisely, a hand-written shader is the way to go; I hope readers can write PBR shaders on their own without relying on helpers such as Shader Forge or Unity's Surface Shaders. As for rendering the reflected image itself, we generally use Reflection Probes, Screen Space Reflection, Planar Reflection and similar techniques; those are beyond the scope of this article and will be covered in other articles in this column.
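For reference only, here is a minimal C# sketch (not from the article) of driving those four adjustments from a single wetness value on a Standard (Specular setup) material. The property names (_Glossiness, _SpecColor, _Color, _OcclusionStrength, _BumpScale) are the Standard shader's own; the lerp ranges are illustrative guesses:

using UnityEngine;

// Hedged sketch: fade a Standard (Specular setup) material between dry and wet.
public class WetnessController : MonoBehaviour
{
    [Range(0f, 1f)] public float wetness = 1f;
    public Renderer targetRenderer;

    Color drySpecular;
    Color dryAlbedo;

    void Start()
    {
        var m = targetRenderer.material;
        drySpecular = m.GetColor("_SpecColor");
        dryAlbedo = m.GetColor("_Color");
    }

    void Update()
    {
        var m = targetRenderer.material;
        // 1. raise Smoothness
        m.SetFloat("_Glossiness", Mathf.Lerp(0.4f, 0.95f, wetness));
        // 2. brighten the specular colour; the diffuse darkens because the water film,
        //    not the surface itself, does the reflecting
        m.SetColor("_SpecColor", Color.Lerp(drySpecular, drySpecular * 1.5f, wetness));
        m.SetColor("_Color", Color.Lerp(dryAlbedo, dryAlbedo * 0.7f, wetness));
        // 3. weaken the Occlusion Map's influence
        m.SetFloat("_OcclusionStrength", Mathf.Lerp(1f, 0.4f, wetness));
        // 4. weaken the normal map
        m.SetFloat("_BumpScale", Mathf.Lerp(1f, 0.2f, wetness));
    }
}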

Materials with different reflectivity

Next comes the rain itself. The rain's particle effect is fairly basic particle work, and there are plenty of ready-made resources on the Asset Store, although it does demand some art skill. If, like me, your art skills are terrible, you can simply buy an effect, then bind a script to the particle emitter so that it always hovers above the camera, as in the example below; particle collisions keep the raindrops from passing through geometry:
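A minimal sketch of such a follow script (the class and field names are my own, not the article's) could be:

using UnityEngine;

// Hedged sketch: keep the rain particle emitter hovering a fixed height above the camera.
// The piercing prevention itself is just the particle system's Collision module set to World.
public class RainFollowCamera : MonoBehaviour
{
    public Transform cameraTransform;   // camera the emitter should track
    public float height = 15f;          // how far above the camera the emitter sits

    void LateUpdate()
    {
        transform.position = cameraTransform.position + Vector3.up * height;
    }
}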

From the camera's first-person view, the effect looks like this:

Rain Water falls on the plane.

At this point a simple rain render is done, yet the whole picture still looks stiff, because we are not showing raindrops striking the ground. We therefore need a dynamic normal map that makes the ground normals "move". There are many ways to do this; the simplest is a frame sequence: render the ripple animation in a DCC tool (Houdini, Substance Painter, etc.) and play the frames back in Unity. That is easy, but it can never be truly real-time or random. Here we instead use Unity's CommandBuffer to draw at a lower level and produce random raindrop ripples.

Anyone who has studied the basics of the rendering pipeline knows that Unity's camera is not the only way to draw into a RenderTexture; it is just a convenient high-level wrapper. A camera's work is essentially three stages: culling, mesh drawing, and post-processing. Whether you use the Forward path, the Deferred Shading path, or the HDRP and LWRP introduced with Unity 2018, it always comes down to these three stages; the only difference is that deferred shading treats lighting as a post-processing step, while the forward path passes the light information straight into the shader and outputs the lit color directly. Here we do not need any dynamic culling: we simply use a CommandBuffer to GPU-instance a large number of quads with a given material onto a specified Render Target. A friend asked why I did not use the CustomRenderTexture introduced in Unity 2017. In my view CustomRenderTexture is just a higher-level wrapper for people who do not want to touch low-level rendering, and it is actually less capable than drawing directly with the Graphics class or a CommandBuffer; the latter has a higher barrier to entry but is far more powerful, roughly the relationship between Meitu and Photoshop (just a personal opinion).

First, we need to generate a square Mesh manually and set its index buffer to quad topology. The implementation is very simple:
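The article's own listing is not reproduced here; a minimal sketch, assuming NDC-space vertices and MeshTopology.Quads, might look like this:

using UnityEngine;

// Hedged sketch: one square mesh in NDC space, indexed as a single quad.
public static class RippleQuad
{
    public static Mesh Create()
    {
        var mesh = new Mesh();
        mesh.vertices = new[]
        {
            new Vector3(-1f, -1f, 0f),
            new Vector3( 1f, -1f, 0f),
            new Vector3( 1f,  1f, 0f),
            new Vector3(-1f,  1f, 0f),
        };
        mesh.uv = new[]
        {
            new Vector2(0f, 0f),
            new Vector2(1f, 0f),
            new Vector2(1f, 1f),
            new Vector2(0f, 1f),
        };
        // index buffer set to quad topology, as described above
        mesh.SetIndices(new[] { 0, 1, 2, 3 }, MeshTopology.Quads, 0);
        return mesh;
    }
}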

Because we draw directly to the screen, there is no need to worry about a ViewProjection matrix; we simply draw with NDC coordinates in (-1, 1). If you draw this mesh directly to the RenderTarget, it fills the screen.

Next, we want to shrink the mesh and scatter it randomly across the RenderTarget to get the effect of randomly distributed raindrops, which means transforming each quad with a matrix. There are a lot of raindrops, however (here we draw 1023 of them), so iterating on the CPU is out of the question: both the computation and the draw calls would be unacceptably expensive. Instead we use a Compute Shader together with GPU instancing, which greatly improves efficiency.

First, the Compute Shader. I will not dwell on how Compute Shaders work in general, only on the goal and the process. The goal: generate 1023 randomly placed transformation matrices and run 1023 timers. Why timers? The reason is simple: when a raindrop hits the ground, its ripple should fade until it disappears, and when it disappears the position is refreshed so the quad is drawn somewhere else. The implementation is as follows:
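The original listing is not shown here; a hedged HLSL reconstruction of the kernel, based on the description that follows, might look like this (the hash constants, thread-group size, and quad scale are placeholders):

// RainRipple.compute -- a sketch, not the article's original code
#pragma kernel CSMain

#define COUNT 1023

RWStructuredBuffer<float4x4> MatrixBuffer;    // placement matrix per ripple quad
RWStructuredBuffer<float2> timeSliceBuffer;   // x = timer value, y = timer speed
float _DeltaFlashSpeed;                       // per-frame step passed in from the script

// two magic-number pseudo-random helpers
float2 LocalRand(float2 seed)   // float2 in (-1, 1), used for the quad position
{
    float2 r = frac(sin(float2(dot(seed, float2(127.1, 311.7)),
                               dot(seed, float2(269.5, 183.3)))) * 43758.5453);
    return r * 2.0 - 1.0;
}

float LocalRand(float seed)     // float in (0, 1), used for the timer speed
{
    return frac(sin(seed * 12.9898) * 43758.5453);
}

[numthreads(64, 1, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    if (id.x >= COUNT) return;

    float2 t = timeSliceBuffer[id.x];
    t.x += _DeltaFlashSpeed * t.y;          // advance this ripple's timer
    if (t.x > 1.0)                          // ripple has faded out: restart it elsewhere
    {
        t.x = 0.0;
        t.y = LocalRand((float)id.x + t.y) * 0.9 + 0.1;      // new random speed
        float2 pos = LocalRand(float2(id.x, t.y * 117.0));   // new random position

        float4x4 m = MatrixBuffer[id.x];
        m._m03 = pos.x;     // M03 / M13: position on the xy axes
        m._m13 = pos.y;
        m._m00 = 0.05;      // M00 / M11: scale on the xy axes (fixed size)
        m._m11 = 0.05;
        MatrixBuffer[id.x] = m;
    }
    timeSliceBuffer[id.x] = t;
}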

To explain this code: MatrixBuffer holds the 1023 placement matrices we need, and timeSliceBuffer holds the timers, where the x component of each float2 is the timer value and the y component is the timer speed. _DeltaFlashSpeed is the per-frame step passed in from the script, namely Time.deltaTime * X. Then there are two LocalRand functions that use magic-number hashing to produce pseudo-random numbers: the first returns a float2 in the (-1, 1) range, used to generate a random position on the plane, and the second returns a float in the (0, 1) range, used to generate a random timer speed.

The CSMain function that follows is straightforward: when a timer exceeds 1, reset it to 0 and regenerate a random speed and position. From basic linear algebra, the M03 and M13 entries of the matrix set the position on the x and y axes, while M00 and M11 set the scale on the x and y axes. To keep things simple we skip randomizing the raindrop size and draw quads of a single size.

Once the Compute Shader has run, the script can fetch the result and draw with it. Before that, of course, everything needs to be initialized:
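For reference, a hedged C# initialization sketch (class, field, and asset names are my own assumptions) might look like this:

using UnityEngine;
using UnityEngine.Rendering;

// Hedged sketch: set up the compute shader, the two compute buffers, the ripple
// render target, and the GPU-instancing / Gaussian-blur materials used later.
public class RainRippleRenderer : MonoBehaviour
{
    const int Count = 1023;

    public ComputeShader rippleCompute;   // the kernel sketched above
    public Shader waveShader;             // the "Unlit/Wave" shader shown below
    public Shader blurShader;             // a separable Gaussian blur shader (assumed)

    ComputeBuffer matrixBuffer;
    ComputeBuffer timeSliceBuffer;
    Material waveMaterial;
    Material blurMaterial;
    RenderTexture rippleTarget;
    RenderTexture blurTemp;
    CommandBuffer commandBuffer;
    Mesh quadMesh;
    Matrix4x4[] matrices = new Matrix4x4[Count];
    int kernel;

    void OnEnable()
    {
        kernel = rippleCompute.FindKernel("CSMain");

        matrixBuffer = new ComputeBuffer(Count, sizeof(float) * 16);    // float4x4
        timeSliceBuffer = new ComputeBuffer(Count, sizeof(float) * 2);  // float2

        // start every ripple as an identity matrix with an expired timer,
        // so the compute shader immediately re-seeds its position and speed
        var initialMatrices = new Matrix4x4[Count];
        var initialTimes = new Vector2[Count];
        for (int i = 0; i < Count; i++)
        {
            initialMatrices[i] = Matrix4x4.identity;
            initialTimes[i] = new Vector2(2f, 1f);
        }
        matrixBuffer.SetData(initialMatrices);
        timeSliceBuffer.SetData(initialTimes);
        rippleCompute.SetBuffer(kernel, "MatrixBuffer", matrixBuffer);
        rippleCompute.SetBuffer(kernel, "timeSliceBuffer", timeSliceBuffer);

        waveMaterial = new Material(waveShader);
        waveMaterial.enableInstancing = true;
        blurMaterial = new Material(blurShader);
        rippleTarget = new RenderTexture(512, 512, 0, RenderTextureFormat.ARGBHalf);
        blurTemp = new RenderTexture(512, 512, 0, RenderTextureFormat.ARGBHalf);
        quadMesh = RippleQuad.Create();                 // the quad helper sketched earlier
        commandBuffer = new CommandBuffer { name = "Rain Ripples" };
    }

    void OnDisable()
    {
        matrixBuffer.Release();
        timeSliceBuffer.Release();
        commandBuffer.Release();
    }
}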

This initializes the Compute Shader, the Compute Buffers, and the GPU-instancing and Gaussian-blur materials that will be used later.

The next step is to dispatch the Compute Shader and draw with the CommandBuffer:
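Continuing the sketch above (still an assumed reconstruction, not the article's code): the simplest route is to read the matrices back and submit them with CommandBuffer.DrawMeshInstanced, which is limited to 1023 instances per call; the readback is a sync point, and DrawMeshInstancedIndirect would keep everything on the GPU.

    // added to the RainRippleRenderer sketch above
    void Update()
    {
        // advance ripple timers and positions on the GPU
        rippleCompute.SetFloat("_DeltaFlashSpeed", Time.deltaTime * 0.5f);   // speed factor is a guess
        rippleCompute.Dispatch(kernel, Mathf.CeilToInt(Count / 64f), 1, 1);

        // read the matrices back for DrawMeshInstanced
        matrixBuffer.GetData(matrices);

        commandBuffer.Clear();
        commandBuffer.SetRenderTarget(rippleTarget);
        // clear to a flat normal, i.e. (0.5, 0.5) in the red/green channels
        commandBuffer.ClearRenderTarget(false, true, new Color(0.5f, 0.5f, 1f, 1f));
        waveMaterial.SetBuffer("timeSliceBuffer", timeSliceBuffer);
        commandBuffer.DrawMeshInstanced(quadMesh, 0, waveMaterial, 0, matrices, Count);

        // smooth the result with a two-pass Gaussian blur (blur shader passes are assumed)
        commandBuffer.Blit(rippleTarget, blurTemp, blurMaterial, 0);
        commandBuffer.Blit(blurTemp, rippleTarget, blurMaterial, 1);

        Graphics.ExecuteCommandBuffer(commandBuffer);

        // hand the ripple normal map to the ground material (property name is an assumption)
        Shader.SetGlobalTexture("_RippleNormalMap", rippleTarget);
    }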

First specify the render target and clear it to (0.5, 0.5), the neutral value of an encoded normal map; then GPU-instance the quads using the matrices output by the Compute Shader; finally run a Gaussian blur so the result looks smoother.

With the timers and matrices as input, we can start drawing the ripples. The ripple drawing itself is very simple: alpha blending handles the fade-out and a trigonometric function produces the wave. Straight to the code:

Shader "Unlit/Wave"

{

SubShader

{

Tags {"RenderType" = "Opaque"}

ZWrite Off

ZTest Always

Cull Off

Blend oneMinusSrcAlpha srcAlpha

Pass

{

CGPROGRAM

# pragma vertex vert

# pragma fragment frag

# pragma multi_compile_instancing

# include "UnityCG.cginc"

# pragma target 5.0

# define MAXCOUNT 1023

StructuredBuffer timeSliceBuffer

Struct appdata

{

Float4 vertex: POSITION

Float4 uv: TEXCOORD0

UNITY_VERTEX_INPUT_INSTANCE_ID

}

Struct v2f

{

Float4 vertex: SV_POSITION

Float timeSlice: TEXCOORD0

Float2 uv: TEXCOORD1

}

V2f vert (appdata v, uint instanceID: SV_InstanceID)

{

V2f o

UNITY_SETUP_INSTANCE_ID (v)

O.vertex = mul (unity_ObjectToWorld, v.vertex)

O.timeSlice = timeSliceBuffer [instanceID] .x

O.uv = v.uv

Return o

}

# define PI 18.84955592153876

Float4 frag (v2f I): SV_Target

{

Float4 c = 1

Float2 dir = i.uv-0.5

Float len = length (dir)

Bool ignore = len > 0.5

Dir / = max (len, 1e-5)

C.xy = (dir * sin (- i.timeSlice * PI + len * 20)) * 0.5 + 0.5

C. A = ignore? 1: i.timeSlice

Return c

}

ENDCG

}

}

}

The shader is very simple and only paints a rough effect; the resulting normal map looks like this:

As you can see, this image (not exactly friendly to anyone with trypophobia) already shows ripples of different strengths, ugly as they are. Applying this render target to the ground gives the following result:

You can see the ground now has rippling normals. I was visiting relatives over the holiday and could only write this article on the ancient laptop at home, yet the whole effect, from the particles to the ripple drawing, takes only about 4 ms of computation on that antique, and thanks to GPU instancing it adds essentially no extra draw calls. The performance is quite satisfactory.

That concludes this look at simulating rainy days with water effects in Unity3D. I hope it has cleared up some doubts; pairing the theory with practice is the best way to learn, so go and try it!
