Unity Shader screen post-processing effect - edge detection

For details about the control class used by screen post-processing effects, see an earlier blog post:


This article builds on that control class to implement another common screen post-processing effect: edge detection.


Concepts and principles:


First of all, we need to know that convolution is an operation frequently applied to pixels in graphics.

The essence of a convolution operation is to blend each pixel with its surrounding pixels in order to obtain different processing effects, such as sharpening an image, blurring it, or detecting edges.
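The idea above can be sketched in a few lines of Python (illustrative only, not the shader code; the image values and the sharpening kernel are made-up examples):

```python
# Apply a 3x3 convolution kernel at one pixel of a grayscale image.
def convolve_at(img, x, y, kernel):
    """Weighted sum of the pixel at (x, y) and its 8 neighbours.

    Following the article's convention, the kernel is indexed with the
    origin at the top-left, +x to the right and +y downward.
    """
    acc = 0.0
    for ky in range(3):
        for kx in range(3):
            acc += kernel[ky][kx] * img[y + ky - 1][x + kx - 1]
    return acc

# A simple sharpening kernel, as one example of a "pixel fusion algorithm".
sharpen = [[ 0, -1,  0],
           [-1,  5, -1],
           [ 0, -1,  0]]

img = [[0.2, 0.2, 0.2],
       [0.2, 0.5, 0.2],
       [0.2, 0.2, 0.2]]

# The centre pixel is pushed further away from its neighbours (sharpened).
print(convolve_at(img, 1, 1, sharpen))
```

Swapping in a different kernel (a blur, an edge detector) changes the effect without changing the loop, which is why the kernel determines the result.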


A convolution produces different results depending on how the pixels are blended, which is determined by the convolution kernel.

The convolution kernel can be regarded as a square matrix of n rows and n columns, with the original pixel located at the center of the matrix.


The convolution kernel used for edge detection is also called an edge detection operator. Taking the Sobel operator as an example, its two kernels look like this (restated here in text; they match the Gx and Gy arrays in the shader below):

Gx:             Gy:
-1  0  1        -1 -2 -1
-2  0  2         0  0  0
-1  0  1         1  2  1

Note that the Sobel operator here is written against a coordinate system whose origin is at the top-left of the screen, with +x to the right and +y downward, rather than the uv coordinate system whose origin is at the bottom-left, with +x to the right and +y upward. This needs special attention, otherwise the code later is easy to get wrong.

Here Gx and Gy detect edges in the vertical and horizontal directions respectively, which you can see by ignoring the zero elements in each matrix, since zero elements contribute nothing to the result. In other words, Gx computes the horizontal gradient and Gy computes the vertical gradient.

The horizontal gradient detects vertical edge lines, and the vertical gradient detects horizontal edge lines. This is easy to mix up and deserves special attention.
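A small Python sketch makes this concrete (the two 3x3 patches are made-up examples): Gx responds strongly to a vertical edge and not at all to a horizontal one, and Gy does the opposite.

```python
# Sobel kernels, indexed with the origin at the top-left, +y downward.
Gx = [[-1, 0, 1],
      [-2, 0, 2],
      [-1, 0, 1]]
Gy = [[-1, -2, -1],
      [ 0,  0,  0],
      [ 1,  2,  1]]

def gradient(patch, kernel):
    # Multiply corresponding elements and sum (not matrix multiplication).
    return sum(kernel[r][c] * patch[r][c] for r in range(3) for c in range(3))

vertical_edge = [[0, 0, 1],    # dark on the left, bright on the right
                 [0, 0, 1],
                 [0, 0, 1]]
horizontal_edge = [[0, 0, 0],  # dark on top, bright below
                   [0, 0, 0],
                   [1, 1, 1]]

print(abs(gradient(vertical_edge, Gx)), abs(gradient(vertical_edge, Gy)))      # 4 0
print(abs(gradient(horizontal_edge, Gx)), abs(gradient(horizontal_edge, Gy)))  # 0 4
```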


Rather than simply blending pixels, the edge detection operator is mainly used to compute each pixel's gradient value.

A high gradient between a pixel and its neighbours means it differs sharply from them; you can picture the pixel as sitting on a steep step relative to its surroundings. Such a pixel can then be treated as part of a boundary.

By processing every pixel in the image this way, we obtain the edges of the image. This is the essence of edge detection.


Calculation method:


1. For each pixel, obtain the texture coordinates of the 8 surrounding pixels so they can be used with the Sobel operator, arranged like this (the arrangement should be consistent with the Sobel operator's coordinate axes):

uv[0] uv[1] uv[2]
uv[3] uv[4] (original pixel point) uv[5]
uv[6] uv[7] uv[8]





However, since the origin of uv coordinates is at the bottom-left corner, when computing uv[0] to uv[8] with uv[4] as the original pixel, their offsets can be expressed as follows:

(-1,1)uv[0] (0,1)uv[1] (1,1)uv[2]
(-1,0)uv[3] (0,0)uv[4] (1,0)uv[5]
(-1,-1)uv[6] (0,-1)uv[7] (1,-1)uv[8]
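Step 1 can be sketched in Python (illustrative only; the texture size is a made-up example, and in the real shader the texel size comes from _MainTex_TexelSize):

```python
# Build the nine sampling coordinates from one uv and the texel size
# (1/width, 1/height), using the offset table above.
def neighbour_uvs(uv, texel):
    offsets = [(-1, 1),  (0, 1),  (1, 1),
               (-1, 0),  (0, 0),  (1, 0),
               (-1, -1), (0, -1), (1, -1)]
    return [(uv[0] + dx * texel[0], uv[1] + dy * texel[1]) for dx, dy in offsets]

texel = (1 / 512, 1 / 512)   # a hypothetical 512x512 render texture
uvs = neighbour_uvs((0.5, 0.5), texel)
print(uvs[4])  # index 4 is the original pixel: (0.5, 0.5)
```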






2. With these offset values, the coordinates of the pixels surrounding the target pixel can be computed quickly; the horizontal and vertical gradients are then calculated with the corresponding elements of Gx and Gy, i.e. vertical and horizontal edge detection respectively:

Strictly speaking, a convolution first rotates the kernel by 180 degrees to get a new matrix, then multiplies and sums the corresponding elements; be careful not to confuse this with ordinary matrix multiplication.

However, rotating a Sobel kernel by 180 degrees merely negates it, and since we take absolute values below, the rotation has no effect on the result and can be omitted.

To keep the GPU computation simple, instead of the exact magnitude sqrt(Gx^2 + Gy^2) we directly add the absolute values of the two gradients to get the final gradient value G.
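A quick numeric sketch of the approximation (the gradient values 0.3 and 0.4 are made-up examples):

```python
import math

# Exact gradient magnitude vs. the cheaper |Gx| + |Gy| approximation
# used in the shader to save GPU work.
gx, gy = 0.3, 0.4
exact = math.sqrt(gx * gx + gy * gy)   # 0.5
approx = abs(gx) + abs(gy)             # 0.7
print(exact, approx)
```

The approximation always overestimates (or equals) the true magnitude, which for edge detection only makes edges slightly stronger, so it is an acceptable trade.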


3. After the gradient value is computed, the original sample is interpolated against the edge and background colors using G to get the final image.
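Step 3 mirrors the fragment shader's three lerps; here is a Python sketch with made-up colors, using (r, g, b) tuples:

```python
def lerp(a, b, t):
    # Componentwise linear interpolation, like the shader's lerp().
    return tuple(x + (y - x) * t for x, y in zip(a, b))

col        = (0.6, 0.6, 0.6)   # original sample from the source image
edge_color = (0.0, 0.0, 0.0)   # black edge lines
bg_color   = (1.0, 1.0, 1.0)   # white background
g          = 1.0               # gradient value: a strong edge
edge_only  = 1.0               # show only the edge lines

with_edge = lerp(col, edge_color, g)        # edges over the original image
only_edge = lerp(bg_color, edge_color, g)   # edges over the background color
final = lerp(with_edge, only_edge, edge_only)
print(final)  # a strong edge with edgeOnly=1 comes out as the edge color
```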


Program implementation:


First is the parameter-control script:

using UnityEngine;

public class EdgeDetectionCtrl : ScreenEffectBase
{
    // Shader property names
    private const string _EdgeOnly = "_EdgeOnly";
    private const string _EdgeColor = "_EdgeColor";
    private const string _BackgroundColor = "_BackgroundColor";

    [Range(0, 1)]
    public float edgeOnly = 0.0f;

    public Color edgeColor = Color.black;

    public Color backgroundColor = Color.white;

    private void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (Material != null)
        {
            // Pass the parameters to the shader, then blit through the material
            Material.SetFloat(_EdgeOnly, edgeOnly);
            Material.SetColor(_EdgeColor, edgeColor);
            Material.SetColor(_BackgroundColor, backgroundColor);
            Graphics.Blit(source, destination, Material);
        }
        else
            Graphics.Blit(source, destination);
    }
}

This class also inherits from the ScreenEffectBase base class. The meanings of its three parameters are as follows:

edgeOnly (_EdgeOnly in the shader): the blend amount of the edge lines; 0 overlays the edges on the original image, 1 shows only the edge lines without the original image
edgeColor (_EdgeColor in the shader): the color of the edge lines
backgroundColor (_BackgroundColor in the shader): the background color shown when only the edge lines are displayed

For the base class script, see:


Here is the shader script:

Shader "MyUnlit/EdgeDetection"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        Pass
        {
            // Standard render state for screen post-processing effects
            ZTest Always
            Cull Off
            ZWrite Off

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                // Nine texture coordinates: the pixel itself and its 8 neighbours
                half2 uv[9] : TEXCOORD0;
                float4 pos : SV_POSITION;
            };

            sampler2D _MainTex;
            // Texel size of _MainTex in [0,1] uv space, used to compute the texture coordinates of adjacent pixels
            half4 _MainTex_TexelSize;
            // Parameters matching the control script
            fixed _EdgeOnly;
            fixed4 _EdgeColor;
            fixed4 _BackgroundColor;

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);

                half2 uv = v.uv;
                half2 size = _MainTex_TexelSize.xy;
                // Compute the texture coordinates of the surrounding pixels, with index 4 as the original pixel.
                // The factor on the right is the offset in texel units; the axes match the uv coordinate system,
                // with the origin at the bottom-left corner and +x,+y toward the top-right.
                o.uv[0] = uv + size * half2(-1, 1);
                o.uv[1] = uv + size * half2(0, 1);
                o.uv[2] = uv + size * half2(1, 1);
                o.uv[3] = uv + size * half2(-1, 0);
                o.uv[4] = uv + size * half2(0, 0);
                o.uv[5] = uv + size * half2(1, 0);
                o.uv[6] = uv + size * half2(-1, -1);
                o.uv[7] = uv + size * half2(0, -1);
                o.uv[8] = uv + size * half2(1, -1);

                return o;
            }

            // Sample the pixel at the given index and return its grayscale (luminance) value
            fixed minGrayCompute(v2f i, int idx)
            {
                return Luminance(tex2D(_MainTex, i.uv[idx]));
            }

            // Use the Sobel operator to compute the final gradient value
            half sobel(v2f i)
            {
                const half Gx[9] = {
                    -1, 0, 1,
                    -2, 0, 2,
                    -1, 0, 1
                };
                const half Gy[9] = {
                    -1, -2, -1,
                     0,  0,  0,
                     1,  2,  1
                };
                // Multiply and sum the corresponding elements to get the horizontal and vertical gradients
                half graX = 0;
                half graY = 0;

                for (int it = 0; it < 9; it++)
                {
                    graX += Gx[it] * minGrayCompute(i, it);
                    graY += Gy[it] * minGrayCompute(i, it);
                }
                // Add the absolute values to approximate the final gradient value
                return abs(graX) + abs(graY);
            }

            fixed4 frag (v2f i) : SV_Target
            {
                half gra = sobel(i);
                fixed4 col = tex2D(_MainTex, i.uv[4]);
                // Interpolate with the gradient value: the larger the gradient, the closer to the edge color
                fixed4 withEdgeColor = lerp(col, _EdgeColor, gra);
                fixed4 onlyEdgeColor = lerp(_BackgroundColor, _EdgeColor, gra);
                fixed4 color = lerp(withEdgeColor, onlyEdgeColor, _EdgeOnly);

                return color;
            }
            ENDCG
        }
    }
}

The effect is as follows:

Tags: Fragment

Posted on Mon, 08 Jun 2020 23:56:03 -0400 by jsims