This tutorial is about supporting triplanar texture mapping. It uses the FXAA tutorial project as its foundation. This tutorial is made with Unity 2017.4.1f1.

Neither vertex UV coordinates nor tangent vectors required.

The usual way to perform texture mapping is to use the UV coordinates stored per vertex in a mesh. Sometimes, however, there are no UV coordinates available, for example when working with procedural geometry of arbitrary shapes. When creating a terrain or cave system at run time, it usually isn't feasible to generate UV coordinates for an appropriate texture unwrap. In those cases, we have to use an alternative way to map textures onto our surfaces.

Up to this point, we've always assumed that UV coordinates are available. Our My Lighting Input and My Lighting shader include files depend on them. While we could create alternatives that do not depend on vertex UV, it would be more convenient if our current files could be made to work both with and without UV. We keep the current approach as the default, but switch to working without UV when NO_DEFAULT_UV is defined.

When the mesh data doesn't contain UV, we don't have any UV to pass from the vertex to the fragment program. So make the existence of the UV interpolator in My Lighting Input dependent on NO_DEFAULT_UV.

```
struct InterpolatorsVertex {
	…
	#if !defined(NO_DEFAULT_UV)
		float4 uv : TEXCOORD0;
	#endif
	…
};
```

Back in My Lighting, adjust MyFragmentProgram so it uses a different way to set up the surface data when a SURFACE_FUNCTION is defined. When that is the case, fill surface with the normal vector and set all other values to their defaults. Then create the surface parameters and invoke the custom surface function. Its arguments are the surface, as an inout parameter, and the parameters struct.

```
	#if defined(SURFACE_FUNCTION)
		surface.normal = i.normal;
		surface.albedo = 1;
		surface.alpha = 1;
		surface.emission = 0;
		surface.metallic = 0;
		surface.occlusion = 1;
		surface.smoothness = 0.5;

		SurfaceParameters sp;
		sp.normal = i.normal;
		sp.position = i.worldPos.xyz;
		sp.uv = UV_FUNCTION(i);

		SURFACE_FUNCTION(surface, sp);
	#else
		…
	#endif
```

As it is possible that SURFACE_FUNCTION changes the surface normal, assign it back to i.normal afterwards. That way we don't need to change all the code that uses i.normal.

```
	i.normal = surface.normal;
```

No Tangent Space

Note that unlike Unity's surface shader approach, we're working with a normal vector in world space, not tangent space. If we want to use tangent-space normal mapping in SURFACE_FUNCTION, then we have to perform that conversion explicitly ourselves.
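As a sketch of what converting a tangent-space normal to world space involves, here is the standard TBN-basis transform in plain Python. This is illustrative, not code from the tutorial; the function names are made up, vectors are plain `[x, y, z]` lists, and `binormal_sign` mirrors Unity's `tangent.w` convention.

```python
def normalize(v):
    # Scale a 3-component vector to unit length.
    length = sum(c * c for c in v) ** 0.5
    return [c / length for c in v]

def tangent_to_world(normal_ts, tangent_ws, normal_ws, binormal_sign=1.0):
    """Transform a tangent-space normal sample into world space.

    tangent_ws and normal_ws are the mesh's world-space tangent and
    normal vectors; binormal_sign plays the role of tangent.w.
    """
    t = normalize(tangent_ws)
    n = normalize(normal_ws)
    # The binormal completes the orthonormal TBN basis: B = sign * (N x T).
    b = [binormal_sign * (n[1] * t[2] - n[2] * t[1]),
         binormal_sign * (n[2] * t[0] - n[0] * t[2]),
         binormal_sign * (n[0] * t[1] - n[1] * t[0])]
    # World-space normal = x * T + y * B + z * N.
    return normalize([
        normal_ts[0] * t[i] + normal_ts[1] * b[i] + normal_ts[2] * n[i]
        for i in range(3)
    ])

# A "flat" tangent-space normal (0, 0, 1) maps straight onto the
# surface's own world-space normal.
print(tangent_to_world([0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))
```

In a shader this is a single matrix multiply, but the arithmetic is the same: the sampled normal's components weight the tangent, binormal, and normal axes.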
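Since the end goal is triplanar mapping, here is a minimal sketch of its core idea: the world-space normal determines how heavily to blend three planar texture projections (along X, Y, and Z). This is illustrative Python, not the tutorial's shader code; the function name and the optional sharpening exponent are assumptions.

```python
def triplanar_weights(normal, exponent=1.0):
    """Blend weights for the X, Y, and Z projections, summing to 1.

    Larger exponents sharpen the transition between projections.
    """
    w = [abs(c) ** exponent for c in normal]
    total = sum(w)
    return [c / total for c in w]

# A normal pointing straight up uses only the top-down (Y) projection.
print(triplanar_weights([0.0, 1.0, 0.0]))  # [0.0, 1.0, 0.0]

# A tilted normal blends all three projections; the weights still sum to 1.
print(triplanar_weights([0.3, 0.9, 0.4]))
```

Because the weights come from the surface normal and the projections use world position, no per-vertex UV or tangent data is needed, which is exactly what the rest of the tutorial builds toward.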