CommandBuffer Learning: RenderTexture

Contents:
1, What is RenderTexture
2, Uses of RenderTexture
3, How to create a RenderTexture
4, Properties of RenderTexture
5, Summary

1, What is RenderTexture

  1. RenderTexture is a special texture type in Unity; it inherits from Texture

  2. Texture objects fall into two categories, depending on who accesses them:
    • Server-side textures are created in video memory (VRAM) and accessed by the GPU

    • Client-side textures are created in main memory and accessed by the CPU

    • When the CPU submits a rendering request, the texture object in main memory is copied to GPU-accessible video memory

    • Conversely, a texture in GPU video memory can also be copied back to CPU-accessible memory when needed (see the read-back sketch at the end of this list)

  3. A RenderTexture is a server-side texture object on the GPU. It is directly associated with a block of video memory called an FBO (Frame Buffer Object)

  4. An FBO is the destination of all GPU rendering

  5. The GPU can manage multiple FBOs. The default FBO is directly associated with the display device, so whatever is drawn into it appears on screen

  6. Because a RenderTexture is a server-side texture, the GPU can render with it directly; no copy is needed when the CPU submits a rendering request
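
  A minimal sketch of the GPU-to-CPU copy mentioned above, using Texture2D.ReadPixels to read a RenderTexture back into CPU memory (the sizes and variable names are illustrative, not from the article):

    // Make the render texture the active one so ReadPixels reads from it
    RenderTexture rt = RenderTexture.GetTemporary(256, 256, 0, RenderTextureFormat.ARGB32);
    RenderTexture previous = RenderTexture.active;
    RenderTexture.active = rt;
    // Allocate a CPU-side Texture2D and copy the pixels across
    var cpuCopy = new Texture2D(256, 256, TextureFormat.ARGB32, false);
    cpuCopy.ReadPixels(new Rect(0, 0, 256, 256), 0, 0);
    cpuCopy.Apply();   // Apply() re-uploads the pixels to the GPU copy; optional if only CPU access is needed
    // Restore state and return the temporary render texture
    RenderTexture.active = previous;
    RenderTexture.ReleaseTemporary(rt);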

2, Uses of RenderTexture

  1. Screen post-processing: after all objects in the scene have been rendered, apply various image-processing operations to the output RenderTexture, such as blur, distortion, desaturation, color correction, etc.; these are similar to the post-processing filters in Photoshop (see the sketch after this list)

  2. Shadow processing: use a second camera to render the depth of the scene into a RenderTexture; when the main camera renders an object, compare against that depth to decide whether the object is occluded and therefore whether it receives a shadow

  3. Deferred rendering: render the lighting and model attributes into several separate RenderTextures, then combine all of them at the end to draw the final result to the display device

  4. These are just a few common examples. Almost all advanced real-time rendering effects now rely on RenderTexture; it shows up everywhere
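
  As an illustration of the post-processing use case, here is a minimal sketch of a camera image-effect script for the built-in render pipeline; the class name, the material field and the idea of a blur material are assumptions for illustration, not something taken from the article:

    using UnityEngine;

    // Hypothetical post-processing component: Unity passes the camera's rendered image
    // in 'source' (a RenderTexture); we write the processed result into 'destination'.
    public class SimpleBlurEffect : MonoBehaviour
    {
        public Material blurMaterial;   // assumed material whose shader performs the blur

        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            if (blurMaterial != null)
                Graphics.Blit(source, destination, blurMaterial);   // run the material's shader over the screen image
            else
                Graphics.Blit(source, destination);                 // pass the image through unchanged
        }
    }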

3, How to create a RenderTexture

  1. Use the constructor of RenderTexture
    • A RenderTexture created this way must be managed by you: creation, release, checking validity, and so on

    • Example:

    // 1. Create a RenderTexture directly by calling the constructor
    var RT = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    // 2. Release the RenderTexture
    RT.Release();
    GameObject.Destroy(RT);
    RT = null;
  2. Use the GetTemporary static function of RenderTexture
    • This method requests a qualifying RenderTexture from Unity's RenderTexture cache system

    • A RenderTexture obtained this way is created and released by Unity's cache system; we only receive the right to use it.

    • Example:

    // Call the static function GetTemporary to obtain the use of a RenderTexture from the cache system
    RT = RenderTexture.GetTemporary(1024, 1024, 16, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    // Return the RenderTexture to Unity's cache system
    RenderTexture.ReleaseTemporary(RT);
    RT = null;
  3. Use the GetTemporaryRT function of CommandBuffer
    • This method is essentially the same as method (2): both obtain the right to use a RenderTexture from Unity's cache system.

    • The difference from method (2) is that the first parameter is an integer ID that identifies the render texture and can be used much like a handle (pointer) to it.

    • Example:

    // Create a CommandBuffer
    var cmd = new CommandBuffer();
    // Define an ID that identifies the render texture
    var texID = Shader.PropertyToID("_RTTex");
    // Request a RenderTexture from Unity's cache system through the GetTemporaryRT function of CommandBuffer
    cmd.GetTemporaryRT(texID, 1024, 1024, 16, FilterMode.Bilinear, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    // Return the right of use
    cmd.ReleaseTemporaryRT(texID);
    // Attach to the camera
    _currentCamera.AddCommandBuffer(CameraEvent.BeforeImageEffects, cmd);
    // Detach from the camera
    _currentCamera.RemoveCommandBuffer(CameraEvent.BeforeImageEffects, cmd);
    cmd.Release();
    cmd = null;

4, Properties of RenderTexture

  1. Format (color format):

    • ARGB32 is the format supported by default

    • Be sure to understand that TextureFormat and RenderTextureFormat are not the same thing.

    • If you use a format other than ARGB32, you must first query whether the GPU supports it via SystemInfo.SupportsRenderTextureFormat (a fallback sketch follows the example below)

    bool isSupport = SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGB32);
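
    A hedged sketch of a common pattern built on this query (the alternative format chosen here, ARGBHalf, is just an illustration): prefer a higher-precision format and fall back to ARGB32 when it is not supported.

    var desiredFormat = RenderTextureFormat.ARGBHalf;
    if (!SystemInfo.SupportsRenderTextureFormat(desiredFormat))
        desiredFormat = RenderTextureFormat.ARGB32;   // ARGB32 as the safe fallback
    var rt = new RenderTexture(1024, 1024, 16, desiredFormat);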
  2. Size (width and height):

    • The type is integer

    • There is generally no hard limit on the size; in practice the maximum is usually the screen size in pixels, except in special cases.

    • A special usage: with the GetTemporaryRT function of CommandBuffer, passing a negative number means the size is derived from the pixel width and height of the camera the CommandBuffer is attached to.
      • When the value is less than 0:

      • -1: the camera's pixel width and height

      • -2: 1/2 of the camera's pixel width and height

      • -3: 1/3 of the camera's pixel width and height

      • and so on.

    • Example:

    // 1. Create a render texture directly by calling the RenderTexture constructor, specifying the size explicitly
    var RT = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    RT.Release();
    GameObject.Destroy(RT);
    RT = null;

    // 2. Call the static function GetTemporary of RenderTexture, specifying the size explicitly
    RT = RenderTexture.GetTemporary(1024, 1024, 16, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    RenderTexture.ReleaseTemporary(RT);
    RT = null;

    var cmd = new CommandBuffer();
    var texID = Shader.PropertyToID("_RTTex");

    // 3. Create a render texture through CommandBuffer.GetTemporaryRT, specifying the size explicitly
    cmd.GetTemporaryRT(texID, 1024, 1024, 16, FilterMode.Bilinear, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    cmd.ReleaseTemporaryRT(texID);

    // 4. Pass -1: the render texture size equals the pixelWidth and pixelHeight of _currentCamera, the camera the CommandBuffer is attached to
    cmd.GetTemporaryRT(texID, -1, -1, 16, FilterMode.Bilinear, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    cmd.ReleaseTemporaryRT(texID);

    // 5. Pass -2: the render texture size is 1/2 of _currentCamera.pixelWidth and _currentCamera.pixelHeight
    cmd.GetTemporaryRT(texID, -2, -2, 16, FilterMode.Bilinear, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    cmd.ReleaseTemporaryRT(texID);

    // 6. Pass -2, -3: the width is 1/2 of _currentCamera.pixelWidth and the height is 1/3 of _currentCamera.pixelHeight
    cmd.GetTemporaryRT(texID, -2, -3, 16, FilterMode.Bilinear, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    cmd.ReleaseTemporaryRT(texID);

    // Attach to the camera
    _currentCamera.AddCommandBuffer(CameraEvent.BeforeImageEffects, cmd);
  3. ZBuffer (depth and stencil)

    • This value sets the size of the Z buffer, which holds two kinds of data: depth and stencil

    • There are three valid parameter values: 0, 16 and 24:
      • 0: do not save depth and stencil

      • 16: Only depth is saved, not stencil

      • 24: both depth and stencil are saved

    • From these parameter values we can infer the following:
      • With 16, the depth buffer is 16 bits and there is no stencil; with 24, you get a 24-bit depth buffer plus an 8-bit stencil buffer

      • The stencil value has only 8 bits, which is why the maximum stencil reference value is 255; incrementing past 255 overflows and wraps back to 0

    • Example:

    // 1. Create a texture with no depth or stencil
    var RT = new RenderTexture(1024, 1024, 0);
    RT.Release();
    // 2. Create a texture that stores only depth information
    RT = new RenderTexture(1024, 1024, 16);
    RT.Release();
    // 3. Create a texture that stores depth and stencil information
    RT = new RenderTexture(1024, 1024, 24);
    RT.Release();
    GameObject.Destroy(RT);
    RT = null;
  4. Read / write mode of RenderTexture

    • This actually defines how color-space conversion is applied when the render texture is read or written.

    • There are three enumeration values (a short sketch follows after this list):
      • Default: directly use the settings in Project Settings.

      • Linear: marks that the texture stores linear values; no color conversion is performed on read or write

      • sRGB: marks that the texture stores sRGB values; color conversion is performed on read and write
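
    A brief sketch of how this mode is typically chosen when creating a render texture (the use cases in the comments are common conventions, not taken from the article):

    // Color output: store sRGB values so conversion happens on read/write
    var colorRT = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    // Non-color data (normals, masks, lookup tables): store linear values, no conversion
    var dataRT = new RenderTexture(1024, 1024, 0, RenderTextureFormat.ARGB32, RenderTextureReadWrite.Linear);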

  5. Filter mode (FilterMode):

    • Defines the sampling filter mode used when this texture is sampled

    • There are three enumeration values: Point, Bilinear and Trilinear

    • Below, each mode is described together with some simulation pseudocode

      • Point: read the color value of a single texel directly from the texture and use it as the final output of the current sample

      // Point sampling simulation (pseudocode)
      Color Sampling(Texture2D tex, float2 uv)
      {
          // 1. Calculate the mipmap level from the partial derivatives
          float2 p = _Tex_Size.xy * float2(ddx(uv), ddy(uv));
          float lod = 0.5 * log2(max(dot(p.x, p.x), dot(p.y, p.y)));
          float level = Mathf.Floor(lod);
          // 2. Get the mipmap to sample
          sampleTex mpTex = tex.GetMipmap(level);
          // 3. Calculate the texel position
          float2 pos = Mathf.Floor(_Tex_Size.xy * uv / pow(2, level));
          // 4. Return the value of that single texel
          return mpTex.GetPixel(pos.x, pos.y);
      }
      • Bilinear: read the four neighbouring texels and average them to get the final output of the current sample

      // Bilinear sampling simulation (pseudocode)
      Color Sampling(Texture2D tex, float2 uv)
      {
          // 1. Calculate the mipmap level from the partial derivatives
          float2 p = _Tex_Size.xy * float2(ddx(uv), ddy(uv));
          float lod = 0.5 * log2(max(dot(p.x, p.x), dot(p.y, p.y)));
          float level = Mathf.Floor(lod);
          // 2. Get the mipmap to sample
          sampleTex mpTex = tex.GetMipmap(level);
          // 3. Calculate the texel position
          float2 pos = Mathf.Floor(_Tex_Size.xy * uv / pow(2, level));
          // 4. Read the four neighbouring texels
          Color p0 = mpTex.GetPixel(pos.x, pos.y);
          Color p1 = mpTex.GetPixel(pos.x - 1, pos.y);
          Color p2 = mpTex.GetPixel(pos.x, pos.y - 1);
          Color p3 = mpTex.GetPixel(pos.x - 1, pos.y - 1);
          // 5. Average and output
          return (p0 + p1 + p2 + p3) / 4;
      }
      • Trilinear: read color values from two adjacent mipmap levels, then blend them by interpolation to get the final output of the current sample

      // Trilinear sampling simulation (pseudocode)
      Color Sampling(Texture2D tex, float2 uv)
      {
          // 1. Calculate the mipmap level and blend factor from the partial derivatives
          float2 p = _Tex_Size.xy * float2(ddx(uv), ddy(uv));
          float lod = 0.5 * log2(max(p.x * p.x, p.y * p.y));
          float level = Mathf.Floor(lod);
          float lerpValue = Mathf.Frac(lod);
          // 2. Get the two adjacent mipmaps
          sampleTex mpTex1 = tex.GetMipmap(level);
          sampleTex mpTex2 = tex.GetMipmap(level + 1);
          // 3. Bilinear sample from the first mipmap
          float2 pos = Mathf.Floor(_Tex_Size.xy * uv / pow(2, level));
          Color p0 = mpTex1.GetPixel(pos.x, pos.y);
          Color p1 = mpTex1.GetPixel(pos.x - 1, pos.y);
          Color p2 = mpTex1.GetPixel(pos.x, pos.y - 1);
          Color p3 = mpTex1.GetPixel(pos.x - 1, pos.y - 1);
          Color r1 = (p0 + p1 + p2 + p3) / 4;
          // 4. Bilinear sample from the second mipmap
          float2 pos1 = Mathf.Floor(_Tex_Size.xy * uv / pow(2, level + 1));
          Color p4 = mpTex2.GetPixel(pos1.x, pos1.y);
          Color p5 = mpTex2.GetPixel(pos1.x - 1, pos1.y);
          Color p6 = mpTex2.GetPixel(pos1.x, pos1.y - 1);
          Color p7 = mpTex2.GetPixel(pos1.x - 1, pos1.y - 1);
          Color r2 = (p4 + p5 + p6 + p7) / 4;
          // 5. Interpolate between the two levels and output
          return lerp(r1, r2, lerpValue);
      }
  6. Anti-aliasing (multisampling)

    • The idea is to increase the sampling frequency so that less information is lost during sampling

    • The main visible effect is the elimination of jagged edges (aliasing)

    • The valid parameter values are the four integers 1, 2, 4 and 8, which specify how many samples contribute to each target pixel:
      • 1: one sample per target pixel, i.e. the sampling frequency is unchanged

      • 2: two samples per target pixel; for example, a target pixel at position (x, y) is sampled at the two positions (x, y) and (x + 0.5, y)

      • And so on; the count grows in powers of two, and Unity uses at most 8 samples

    • Setting this parameter increases the memory used by the RenderTexture; after all, a higher sampling frequency means more data to store (a small sketch follows below)
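
    A minimal sketch of requesting multisampled render textures (the sizes and sample counts are illustrative):

    // Via the constructor: set antiAliasing before the texture is created on the GPU
    var msaaRT = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
    msaaRT.antiAliasing = 4;    // valid values: 1, 2, 4, 8

    // Via a descriptor together with the cache system
    var desc = new RenderTextureDescriptor(1024, 1024, RenderTextureFormat.ARGB32, 16);
    desc.msaaSamples = 8;
    var tempRT = RenderTexture.GetTemporary(desc);
    RenderTexture.ReleaseTemporary(tempRT);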

  7. Aniso level (anisotropic filtering):

    • It addresses the case where a single screen pixel corresponds to different mipmap levels in the x and y directions, for example a texture viewed at a steep, grazing angle

    • Regular mipmaps are generated by scaling width and height down at the same time

    • For each anisotropic level, additional mipmap-like images are generated in which width and height are not reduced at the same rate

    • Quality Settings -> Anisotropic Textures: Disabled / Per Texture / Forced On

    • (Figure omitted: in the original illustration, the red diagonal is the regular mipmap chain, and the yellow and green entries are the anisotropic variants. A short code sketch follows.)
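
    A hedged sketch of setting the anisotropic level in code (the values are chosen for illustration):

    // Per-texture control: anisoLevel ranges from 1 (off) up to 16
    var rt = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
    rt.anisoLevel = 4;
    // Project-wide control, matching Quality Settings -> Anisotropic Textures
    QualitySettings.anisotropicFiltering = AnisotropicFiltering.ForceEnable;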

  8. Wrap (extension) mode:

    • Determines what gets sampled when the input UV coordinate falls outside the [0, 1] range

    • Common parameter values: Repeat, Clamp (simulated below, followed by how to select the mode from C#)

      // 1. Repeat, simulated in code: take only the fractional part of the input UV
      fixed4 color = tex2D(_MainTex, frac(uv));
      // 2. Clamp, simulated in code: clamp the input UV to [0, 1] with min/max
      fixed4 color = tex2D(_MainTex, min(1, max(0, uv)));
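
      On the C# side, the wrap behaviour of a render texture is selected through the wrapMode property; a minimal sketch (values for illustration):

      var rt = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
      rt.wrapMode = TextureWrapMode.Repeat;   // or TextureWrapMode.Clamp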

5, Summary

  • Render textures are the cornerstone of all kinds of rendering effects; only with a clear understanding of RenderTexture's properties can they be used with ease
