ShaderGraph Energy Shield (II): UV Perturbation and Space Warp

A talk from the Unity Japan meetup 2019 appeared on YouTube, and its ShaderGraph session is excellent. Follow along with me to learn.

Preface

Previously: ShaderGraph energy shield (I): contact surface detection and edge illumination

The shader consists of the following parts:

  • Contact surface detection
  • Edge glow
  • UV perturbation and map blending
  • Space warp (or space distortion?)

This article covers UV perturbation with map blending, and space warping.

Effect

Building on the previous article, diamond texture blending and space warping are added:

Previous effect:

Implementation

UV perturbation and map blending

Apply Add, Multiply, Sine, and other math nodes to the UV's X and Y components, driven by the Time node.

On top of this, a noise map (Gradient Noise, etc.) is sampled and multiplied with the RGBA color sampled at the new UV, overlaying the noise map's alpha. The noise map's offset is likewise driven by Time and further math operations.

In short, a pile of math nodes are stacked on top of each other until the desired effect emerges; the resulting RGBA color is then multiplied with the RGBA color computed in the previous article to complete the blend.
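If the node graph above were written out as fragment-shader code, it might look roughly like this HLSL sketch (names such as _MainTex, _NoiseTex, _PerturbSpeed, and _PerturbStrength are illustrative, not from the talk):

```hlsl
// Sketch of the node graph as HLSL. Illustrative property names:
// _MainTex (diamond map), _NoiseTex, _PerturbSpeed, _PerturbStrength.
float2 uv = i.uv;

// Time-driven Sine/Add/Multiply on the UV, as in the graph:
uv.x += sin(_Time.y * _PerturbSpeed + uv.y * 10.0) * _PerturbStrength;
uv.y += _Time.y * 0.1; // simple scrolling offset

// The noise map's offset is also driven by Time:
float2 noiseUV = i.uv + float2(_Time.y * 0.05, 0);
float4 noise = tex2D(_NoiseTex, noiseUV);

// Sample the diamond map at the perturbed UV and overlay the noise alpha:
float4 col = tex2D(_MainTex, uv);
col.a *= noise.a;

// Finally multiply with the edge-glow color from the previous article:
float4 result = col * edgeGlowColor;
```

The exact operations and constants are a matter of taste; the point is the chain of Time-based offsets feeding both the main sample and the noise sample.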

Personally, I think this perturbation step mostly comes down to experience: a seasoned TA (technical artist) will pick suitable operations for the need at hand, so the node setup above has no particularly high reference value.

Space warp

GrabPass in the built-in pipeline

With the built-in render pipeline, we can add a GrabPass to a shader to capture the currently rendered buffer as a texture, then use that texture for Gaussian blur, space warping, and other effects.
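As a reminder of what that looks like, here is a minimal built-in-pipeline sketch using GrabPass:

```shaderlab
// Built-in pipeline: GrabPass captures what has been rendered so far
// into _GrabTexture, which the next pass can sample.
SubShader
{
    Tags { "Queue" = "Transparent" }

    GrabPass { "_GrabTexture" }

    Pass
    {
        CGPROGRAM
        #pragma vertex vert
        #pragma fragment frag
        #include "UnityCG.cginc"

        sampler2D _GrabTexture;

        struct v2f
        {
            float4 pos     : SV_POSITION;
            float4 grabPos : TEXCOORD0;
        };

        v2f vert (appdata_base v)
        {
            v2f o;
            o.pos = UnityObjectToClipPos(v.vertex);
            // Screen-space UV for sampling the grab texture:
            o.grabPos = ComputeGrabScreenPos(o.pos);
            return o;
        }

        fixed4 frag (v2f i) : SV_Target
        {
            // Offsetting grabPos before sampling gives blur / warp effects.
            return tex2Dproj(_GrabTexture, i.grabPos);
        }
        ENDCG
    }
}
```

This is the mechanism URP removed, which is why the workarounds below are needed.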

URP hard-codes a fixed set of passes, so a legacy shader's GrabPass is simply never executed. Even if you invoke your pass through a RendererFeature, URP will not feed it a grab texture the way the built-in pipeline did.

URP

At one time, the default URP pipeline code exposed some intermediate or temporary textures we could use, such as _CameraColorTexture, _CameraOpaqueTexture, _AfterPostProcessTexture, etc.

However, at present it seems only _CameraOpaqueTexture remains reliably available.

Generally, _CameraOpaqueTexture is sufficient, but its drawback is obvious: it is the render result of the opaque queue only, and does not include translucent objects.

If you want to obtain the rendering results of translucent objects, there are two ways:

  • Add a URP RendererFeature (Render Objects), add your shader's pass, and adjust the Queue and related settings so that your pass is called at the right time to do its work (for example, enable the Camera Stack to obtain _AfterPostProcessTexture).
  • Modify the URP pipeline source code and hard-code extra passes and related handling (Unity itself does this; see the URP source for details).

Of course, that is overkill here; this article just uses _CameraOpaqueTexture.

Moving on

Getting _CameraOpaqueTexture in ShaderGraph is very simple: add a Texture2D property whose reference name is _CameraOpaqueTexture (and uncheck Exposed, so the global texture is picked up).

Sample _CameraOpaqueTexture with a screen UV perturbed by the noise map; as the UV shifts over time, the space-warp effect appears:
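Outside ShaderGraph, the equivalent raw URP shader code would look roughly like this sketch (requires "Opaque Texture" enabled on the URP asset; _NoiseTex and _WarpStrength are illustrative names):

```hlsl
// URP sketch: sample the opaque texture with a noise-perturbed screen UV.
TEXTURE2D_X(_CameraOpaqueTexture);
SAMPLER(sampler_CameraOpaqueTexture);

float4 frag (Varyings i) : SV_Target
{
    // Screen-space UV of the current fragment:
    float2 screenUV = i.positionCS.xy / _ScaledScreenParams.xy;

    // Perturb it with a scrolling noise sample (illustrative names):
    float2 noise = SAMPLE_TEXTURE2D(_NoiseTex, sampler_NoiseTex,
                                    i.uv + _Time.y * 0.1).rg;
    screenUV += (noise - 0.5) * _WarpStrength;

    // The warped "background" behind the shield:
    return SAMPLE_TEXTURE2D_X(_CameraOpaqueTexture,
                              sampler_CameraOpaqueTexture, screenUV);
}
```

In ShaderGraph the same thing is just the Screen Position node, an Add with the noise offset, and a Sample Texture 2D on the _CameraOpaqueTexture property.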

The warped color should really be alpha-blended with the effect color above, but the talk takes a simple, crude approach: it uses two Materials/Shaders, renders the space warp first, then the effect color, and lets Unity composite them.

Of course, the same shader could do both, but that requires a few more nodes (bluntly: written as code, it would only take a few lines).
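Those "few lines" would be an ordinary alpha blend along these lines (a sketch; warpColor is the perturbed opaque-texture sample, effectColor the shield's RGBA from the blending section above):

```hlsl
// Composite the warped background and the shield effect in one pass.
// The background is opaque, so the final alpha can be 1.
float3 rgb = lerp(warpColor.rgb, effectColor.rgb, effectColor.a);
return float4(rgb, 1.0);
```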

Final result

Zimiao haunting blog (azimiao.com), all rights reserved. Please credit with a link when reprinting: https://www.azimiao.com/9999.html
Welcome to the Zimiao haunting blog exchange group: 313732000
