Merged
Changes from 4 commits
3 changes: 3 additions & 0 deletions com.unity.render-pipelines.core/Runtime/Textures/RTHandle.cs
Original file line number Diff line number Diff line change
@@ -141,6 +141,9 @@ public void Release()
/// <returns>Input size scaled by the RTHandle scale factor.</returns>
public Vector2Int GetScaledSize(Vector2Int refSize)
{
if (!useScaling)
return refSize;

if (scaleFunc != null)
{
return scaleFunc(refSize);
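The early-out added above can be sketched as a small Python model. This is an illustrative assumption of RTHandle's behavior, not its actual internals; the function name, the `scale_factor` fallback, and the tuple representation are hypothetical:

```python
def get_scaled_size(ref_size, use_scaling, scale_func=None, scale_factor=(1.0, 1.0)):
    """Model of RTHandle.GetScaledSize with the fix applied."""
    # Fixed-size handles (e.g. allocated from an existing RenderTexture)
    # ignore reference-size scaling entirely -- the early return this PR adds.
    if not use_scaling:
        return ref_size
    # A custom scale function, when present, takes precedence.
    if scale_func is not None:
        return scale_func(ref_size)
    # Otherwise apply the constant scale factor.
    w, h = ref_size
    sx, sy = scale_factor
    return (int(w * sx), int(h * sy))
```

Without the `use_scaling` guard, a fixed-size handle would incorrectly have the scale applied, which is the CustomPassUtils scaling issue listed in the changelog.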
@@ -21,7 +21,8 @@ real3 ComputeEdgeFactor(real3 V1, real3 V2)
if (V1oV2 < 0)
{
// Undo range reduction.
y = PI * rsqrt(saturate(1 - V1oV2 * V1oV2)) - y;
const float epsilon = 1e-5f;
y = PI * rsqrt(max(epsilon, saturate(1 - V1oV2 * V1oV2))) - y;
}

return V1xV2 * y;
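The epsilon clamp above guards the degenerate case: when V1 and V2 are antiparallel, `1 - V1oV2 * V1oV2` saturates to exactly 0, `rsqrt(0)` returns +inf on the GPU, and the NaN propagates through the area-light evaluation. A minimal Python sketch of the fixed expression (function name is illustrative; `1/sqrt` stands in for HLSL `rsqrt`):

```python
import math

EPSILON = 1e-5

def undo_range_reduction(v1_dot_v2, y):
    # saturate(1 - x*x): clamp to [0, 1] as the HLSL intrinsic does.
    s = min(max(1.0 - v1_dot_v2 * v1_dot_v2, 0.0), 1.0)
    # The fix: clamp the rsqrt operand away from zero so the result
    # stays finite even for antiparallel vectors (v1_dot_v2 == -1).
    s = max(EPSILON, s)
    return math.pi * (1.0 / math.sqrt(s)) - y
```

With `v1_dot_v2 = -1` the unguarded version divides by zero; the clamped version returns a large but finite value, which is enough to avoid poisoning downstream lighting math.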
5 changes: 5 additions & 0 deletions com.unity.render-pipelines.high-definition/CHANGELOG.md
@@ -37,6 +37,10 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
- Fixed issue with the color space of AOVs (case 1324759)
- Fixed issue with history buffers when using multiple AOVs (case 1323684).
- Fixed camera preview with multi selection (case 1324126).
- Fixed GBuffer clear option in FrameSettings not working.
- Fixed usage of Panini Projection with floating point HDRP and Post Processing color buffers.
- Fixed NaN generation in the Area light code.
- Fixed CustomPassUtils scaling issues when used with RTHandles allocated from a RenderTexture.

### Changed
- Reduced the maximal number of bounces for both RTGI and RTR (case 1318876).
@@ -49,6 +53,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
- Increased path tracing max samples from 4K to 16K (case 1327729).
- Changed ray tracing acceleration structure build, so that only meshes with HDRP materials are included (case 1322365).
- Default black texture XR is now opaque (alpha = 1).
- Changed default sidedness to double, when a mesh with a mix of single and double-sided materials is added to the ray tracing acceleration structure (case 1323451).

## [10.4.0] - 2021-03-11

@@ -1,8 +1,6 @@
# Getting started with ray tracing

The High Definition Render Pipeline (HDRP) includes preview ray tracing support from Unity 2019.3. Ray tracing is a feature that allows you to access data that is not on screen. For example, you can use it to request position data, normal data, or lighting data, and then use this data to compute quantities that are hard to approximate using classic rasterization techniques.

While film production uses ray tracing extensively, its resource intensity has limited its use to offline rendering for a long time. Now, with recent advances in GPU hardware, you can make use of ray tracing effect in real time.
The High Definition Render Pipeline (HDRP) includes preview ray tracing support from Unity 2019.3. Ray tracing allows you to access data that is not on screen. For example, you can use it to request position data, normal data, or lighting data, and then use this data to compute quantities that are hard to approximate using classic rasterization techniques.

This document covers:

@@ -161,7 +159,7 @@ Now that your HDRP Project supports ray tracing, there are a few steps you must

#### Frame Settings

To make HDRP calculates ray tracing effects for [Cameras](HDRP-Camera.md) in your Scene, make sure your Cameras use [Frame Settings](Frame-Settings.md) that have ray tracing enabled. You can enable ray tracing for all Cameras by default, or you can enable ray tracing for specific Cameras in your Scene.
To make HDRP calculate ray tracing effects for [Cameras](HDRP-Camera.md) in your Scene, make sure your Cameras use [Frame Settings](Frame-Settings.md) that have ray tracing enabled. You can enable ray tracing for all Cameras by default, or you can enable ray tracing for specific Cameras in your Scene.

To enable ray tracing by default:

@@ -188,6 +186,20 @@ To check whether it is possible to use ray tracing in a Scene, HDRP includes a m
1. Click **Edit > Rendering > Check Scene Content for HDRP Ray Tracing**.
2. In the Console window (menu: **Window > General > Console**), check if there are any warnings.

<a name="RayTracingMeshes"></a>

## Ray tracing and Meshes

HDRP changes how it handles Meshes in your scene when you integrate a ray traced effect into your project.

When you enable ray tracing, HDRP automatically creates a ray tracing acceleration structure. This structure allows Unity to calculate ray tracing for Meshes in your scene efficiently in real time.

As a result, ray tracing can change how some Meshes appear in your scene in the following ways:

- If your Mesh has a Material assigned that does not have the HDRenderPipeline tag, HDRP does not add it to the acceleration structure and, as a result, does not apply any ray-traced effects to the Mesh.
- If your Mesh has a Decal Material assigned, HDRP does not add it to the acceleration structure and the Mesh does not appear in your scene.
- If a Mesh has a combination of Materials that are single and double-sided, HDRP flags all Materials you have assigned to this Mesh as double-sided.
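The three rules above can be sketched as a small decision function. This is a hypothetical model for illustration only — the function name, the dictionary representation of Materials, and the return shape are assumptions, not HDRP's actual API:

```python
def acceleration_structure_entry(materials):
    """Model of the Mesh rules: returns (included, force_double_sided)."""
    # Rule 1: any Material without the HDRenderPipeline tag -> Mesh skipped,
    # no ray-traced effects apply to it.
    if any(m.get("tag") != "HDRenderPipeline" for m in materials):
        return (False, False)
    # Rule 2: Decal Materials are never added, so the Mesh disappears
    # from ray-traced views.
    if any(m.get("decal", False) for m in materials):
        return (False, False)
    # Rule 3: a mix of single- and double-sided Materials -> all Materials
    # on the Mesh are flagged double-sided (case 1323451).
    sidedness = {m.get("double_sided", False) for m in materials}
    return (True, len(sidedness) > 1)
```

The point of rule 3 is consistency: the acceleration structure stores one sidedness decision per Mesh entry, so a mixed Mesh is promoted to double-sided rather than split.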

<a name="RayTracingEffectsOverview"></a>

## Ray tracing effects overview
@@ -209,7 +221,7 @@ HDRP includes two ray tracing modes that define how it evaluates certain ray-tra
* **Performance**: This mode targets real-time applications. If you select this mode, ray-traced effects include presets that you can change to balance performance with quality.
* **Quality**: This mode targets technical demos and applications that want the best quality results.

Depending on which ray tracing mode you select, HDRP may expose difference properties for some ray-traced effects.
Depending on which ray tracing mode you select, HDRP may expose different properties for some ray-traced effects.

You can change which ray tracing mode HDRP uses on either a Project level or effect level. To change it for your entire Project:

@@ -224,7 +236,7 @@ If you select **Both**, you can change the ray tracing mode for each ray-traced

## Ray tracing project

You can find a small ray tracing project that contains all the effects mention above here:
You can find a small ray tracing project that contains all the effects mentioned above here:
https://github.com/Unity-Technologies/SmallOfficeRayTracing
This Project is already set up with ray tracing support.

@@ -238,7 +250,7 @@ There is no support for ray tracing on platforms other than DX12 for now.

HDRP ray tracing in Unity 2020.2 has the following limitations:
- Does not support vertex animation.
- Does not supports decals.
- Does not support decals.
- Does not support tessellation.
- Does not support per pixel displacement (parallax occlusion mapping, height map, depth offset).
- Does not support VFX and Terrain.
@@ -247,6 +259,7 @@ HDRP ray tracing in Unity 2020.2 has the following limitations:
- For renderers that have [LODs](https://docs.unity3d.com/2019.3/Documentation/Manual/LevelOfDetail.html), the ray tracing acceleration structure only includes the highest level LOD and ignores the lower LODs.
- Does not support [Graphics.DrawMesh](https://docs.unity3d.com/ScriptReference/Graphics.DrawMesh.html).
- Ray tracing is not supported when rendering [Reflection Probes](Reflection-Probe.md).
- HDRP does not support [orthographic projection](HDRP-Camera.md). If you enable orthographic projection mode, you might experience rendering problems for Transparent Materials, volumetrics and planar reflections.

### Unsupported shader graph nodes for ray tracing

@@ -1,6 +1,6 @@
# Path tracing

Path tracing is a ray tracing algorithm that sends rays from the Camera and, when a ray hits a reflective or refractive surface, recurses the process until it reaches a light source. The series of rays from the Camera to the Light forms a "path".
Path tracing is a ray tracing algorithm that sends rays from the Camera and, when a ray hits a reflective or refractive surface, recurses the process until it reaches a light source. The series of rays from the Camera to the Light form a "path".

It enables HDRP to compute many different effects (such as hard or soft shadows, mirror or glossy reflections and refractions, and indirect illumination) in one single unified process.

@@ -14,7 +14,7 @@ Noisy image with **Maximum Samples** set to 1

Clean image with **Maximum Samples** set to 256

The current implementation for path tracing in the High Definition Render Pipeline (HDRP) accumulates paths for every pixel up to a maximum count, unless the Camera moves. If the Camera moves, HDRP restarts the path accumulation. Path tracing supports Lit, LayeredLit and Unlit materials, and area, point, directional and environment lights.
The current implementation for path tracing in the High Definition Render Pipeline (HDRP) accumulates paths for every pixel up to a maximum count unless the Camera moves. If the Camera moves, HDRP restarts the path accumulation. Path tracing supports Lit, LayeredLit, and Unlit materials, and area, point, directional, and environment lights.
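The accumulation scheme described above — average samples per pixel up to a maximum count, restarting whenever the Camera moves — can be sketched for a single pixel. This is an illustrative model under stated assumptions (class and method names are hypothetical, and radiance is reduced to a scalar):

```python
class PixelAccumulator:
    """Model of per-pixel path accumulation with restart on camera motion."""

    def __init__(self, max_samples):
        self.max_samples = max_samples
        self.count = 0
        self.mean = 0.0  # running average of this pixel's radiance

    def add_sample(self, radiance, camera_moved=False):
        if camera_moved:
            # Camera motion invalidates the history: restart accumulation.
            self.count, self.mean = 0, 0.0
        if self.count < self.max_samples:
            self.count += 1
            # Incremental running mean: no need to store individual samples.
            self.mean += (radiance - self.mean) / self.count
        return self.mean
```

Once `count` reaches `max_samples`, the image stops changing (it has converged for that viewpoint); any camera movement throws the history away, which is why the image appears noisy again right after moving.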

## Set up path tracing

@@ -62,11 +62,12 @@ There is no support for path tracing on platforms other than DX12 for now.

HDRP path tracing in Unity 2020.2 has the following limitations:

- If a Mesh in your scene has a Material assigned that does not have the `HDRenderPipeline` tag, the mesh will not appear in your scene. For more information, see [Ray tracing and Meshes](Ray-Tracing-Getting-Started.md#RayTracingMeshes).
- Does not support 3D Text and TextMeshPro.
- Does not support Shader Graph nodes that use derivatives (ex : normal from textures).
- Does not support Shader Graph nodes that use derivatives (for example, a normal map that derives from a texture).
- Does not support decals.
- Does not support tessellation.
- Does not support Tube and Disc shaped Area Light.
- Does not support Tube and Disc-shaped Area Lights.
- Does not support Translucent Opaque Materials.
- Does not support several of HDRP's Materials. This includes Fabric, Eye, StackLit, Hair, Decal.
- Does not support per-pixel displacement (parallax occlusion mapping, height map, depth offset).
@@ -15,10 +15,10 @@ static partial class HDCameraUI

static readonly GUIContent clearModeContent = EditorGUIUtility.TrTextContent("Background Type", "Specifies the type of background the Camera applies when it clears the screen before rendering a frame. Be aware that when setting this to None, the background is never cleared and since HDRP shares render texture between cameras, you may end up with garbage from previous rendering.");
static readonly GUIContent backgroundColorContent = EditorGUIUtility.TrTextContent("Background Color", "The Background Color used to clear the screen when selecting Background Color before rendering.");
static readonly GUIContent cullingMaskContent = EditorGUIUtility.TrTextContent("Culling Mask");
static readonly GUIContent cullingMaskContent = EditorGUIUtility.TrTextContent("Culling Mask", "Specifies the list of layers the camera should render.");
static readonly GUIContent volumeLayerMaskContent = EditorGUIUtility.TrTextContent("Volume Layer Mask");
static readonly GUIContent volumeAnchorOverrideContent = EditorGUIUtility.TrTextContent("Volume Anchor Override");
static readonly GUIContent occlusionCullingContent = EditorGUIUtility.TrTextContent("Occlusion Culling");
static readonly GUIContent occlusionCullingContent = EditorGUIUtility.TrTextContent("Occlusion Culling", "When enabled, the camera does not render objects that are being obscured by other geometry.");

static readonly GUIContent exposureTargetContent = EditorGUIUtility.TrTextContent("Exposure Target", "The object used as a target for centering the Exposure's Procedural Mask metering mode when target object option is set (See Exposure Volume Component).");

@@ -3053,6 +3053,9 @@ PaniniProjectionParameters PreparePaniniProjectionParameters(HDCamera camera)
parameters.paniniProjectionCS.EnableKeyword("UNITDISTANCE");
}

if (m_EnableAlpha)
parameters.paniniProjectionCS.EnableKeyword("ENABLE_ALPHA");

parameters.paniniParams = new Vector4(viewExtents.x, viewExtents.y, paniniD, paniniS);
parameters.paniniProjectionKernel = parameters.paniniProjectionCS.FindKernel("KMain");

@@ -1,15 +1,17 @@
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/PostProcessing/Shaders/PostProcessDefines.hlsl"

#pragma only_renderers d3d11 playstation xboxone xboxseries vulkan metal switch

#pragma kernel KMain

#pragma multi_compile GENERIC UNITDISTANCE
#pragma multi_compile _ ENABLE_ALPHA

TEXTURE2D_X(_InputTexture);

RW_TEXTURE2D_X(float3, _OutputTexture);
RW_TEXTURE2D_X(CTYPE, _OutputTexture);

SAMPLER(sampler_LinearClamp);

@@ -125,7 +127,7 @@ void KMain(uint3 dispatchThreadId : SV_DispatchThreadID)
}
else
{
float3 smp = SAMPLE_TEXTURE2D_X_LOD(_InputTexture, sampler_LinearClamp, ClampAndScaleUVForBilinear(coords), 0.0).xyz;
CTYPE smp = SAMPLE_TEXTURE2D_X_LOD(_InputTexture, sampler_LinearClamp, ClampAndScaleUVForBilinear(coords), 0.0).CTYPE_SWIZZLE;
_OutputTexture[COORD_TEXTURE2D_X(posInputs.positionSS)] = smp;
}
}
@@ -45,10 +45,9 @@ void CleanupPrepass()
CoreUtils.Destroy(m_DepthResolveMaterial);
}

bool NeedClearGBuffer()
bool NeedClearGBuffer(HDCamera hdCamera)
{
// TODO: Add an option to force clear
return m_CurrentDebugDisplaySettings.IsDebugDisplayEnabled();
return m_CurrentDebugDisplaySettings.IsDebugDisplayEnabled() || hdCamera.frameSettings.IsEnabled(FrameSettingsField.ClearGBuffers);
}

HDUtils.PackedMipChainInfo GetDepthBufferMipChainInfo()
@@ -108,7 +107,7 @@ TextureHandle CreateDepthBuffer(RenderGraph renderGraph, bool clear, bool msaa)
return renderGraph.CreateTexture(depthDesc);
}

TextureHandle CreateNormalBuffer(RenderGraph renderGraph, bool msaa)
TextureHandle CreateNormalBuffer(RenderGraph renderGraph, HDCamera hdCamera, bool msaa)
{
#if UNITY_2020_2_OR_NEWER
FastMemoryDesc fastMemDesc;
@@ -118,7 +117,7 @@ TextureHandle CreateNormalBuffer(RenderGraph renderGraph, bool msaa)
#endif

TextureDesc normalDesc = new TextureDesc(Vector2.one, true, true)
{ colorFormat = GraphicsFormat.R8G8B8A8_UNorm, clearBuffer = NeedClearGBuffer(), clearColor = Color.black, bindTextureMS = msaa, enableMSAA = msaa, enableRandomWrite = !msaa, name = msaa ? "NormalBufferMSAA" : "NormalBuffer"
{ colorFormat = GraphicsFormat.R8G8B8A8_UNorm, clearBuffer = NeedClearGBuffer(hdCamera), clearColor = Color.black, bindTextureMS = msaa, enableMSAA = msaa, enableRandomWrite = !msaa, name = msaa ? "NormalBufferMSAA" : "NormalBuffer"
#if UNITY_2020_2_OR_NEWER
, fastMemoryDesc = fastMemDesc
#endif
@@ -327,7 +326,7 @@ bool RenderDepthPrepass(RenderGraph renderGraph, CullingResults cull, HDCamera h
decalBuffer = renderGraph.defaultResources.blackTextureXR;
output.depthAsColor = renderGraph.CreateTexture(new TextureDesc(Vector2.one, true, true)
{ colorFormat = GraphicsFormat.R32_SFloat, clearBuffer = true, clearColor = Color.black, bindTextureMS = true, enableMSAA = true, name = "DepthAsColorMSAA" });
output.normalBuffer = CreateNormalBuffer(renderGraph, msaa);
output.normalBuffer = CreateNormalBuffer(renderGraph, hdCamera, msaa);
return false;
}

@@ -339,7 +338,7 @@ bool RenderDepthPrepass(RenderGraph renderGraph, CullingResults cull, HDCamera h
passData.hasDepthDeferredPass = depthPrepassParameters.hasDepthDeferredPass;

passData.depthBuffer = builder.UseDepthBuffer(output.depthBuffer, DepthAccess.ReadWrite);
passData.normalBuffer = builder.WriteTexture(CreateNormalBuffer(renderGraph, msaa));
passData.normalBuffer = builder.WriteTexture(CreateNormalBuffer(renderGraph, hdCamera, msaa));
if (decalLayersEnabled)
passData.decalBuffer = builder.WriteTexture(CreateDecalPrepassBuffer(renderGraph, msaa));
// This texture must be used because reading directly from an MSAA Depth buffer is way too expensive.
@@ -464,7 +463,7 @@ struct GBufferOutput

void SetupGBufferTargets(RenderGraph renderGraph, HDCamera hdCamera, GBufferPassData passData, TextureHandle sssBuffer, TextureHandle vtFeedbackBuffer, ref PrepassOutput prepassOutput, FrameSettings frameSettings, RenderGraphBuilder builder)
{
bool clearGBuffer = NeedClearGBuffer();
bool clearGBuffer = NeedClearGBuffer(hdCamera);
bool lightLayers = frameSettings.IsEnabled(FrameSettingsField.LightLayers);
bool shadowMasks = frameSettings.IsEnabled(FrameSettingsField.Shadowmask);

@@ -630,7 +629,7 @@ void ResolvePrepassBuffers(RenderGraph renderGraph, HDCamera hdCamera, ref Prepa

passData.depthBuffer = builder.UseDepthBuffer(CreateDepthBuffer(renderGraph, true, false), DepthAccess.Write);
passData.depthValuesBuffer = builder.UseColorBuffer(depthValuesBuffer, 0);
passData.normalBuffer = builder.UseColorBuffer(CreateNormalBuffer(renderGraph, false), 1);
passData.normalBuffer = builder.UseColorBuffer(CreateNormalBuffer(renderGraph, hdCamera, false), 1);
if (passData.needMotionVectors)
passData.motionVectorsBuffer = builder.UseColorBuffer(CreateMotionVectorBuffer(renderGraph, false, false), 2);
else
@@ -58,7 +58,7 @@ void ExecuteWithRenderGraph(RenderRequest renderRequest,

LightingBuffers lightingBuffers = new LightingBuffers();
lightingBuffers.diffuseLightingBuffer = CreateDiffuseLightingBuffer(m_RenderGraph, msaa);
lightingBuffers.sssBuffer = CreateSSSBuffer(m_RenderGraph, msaa);
lightingBuffers.sssBuffer = CreateSSSBuffer(m_RenderGraph, hdCamera, msaa);

var prepassOutput = RenderPrepass(m_RenderGraph, colorBuffer, lightingBuffers.sssBuffer, vtFeedbackBuffer, cullingResults, customPassCullingResults, hdCamera, aovRequest, aovBuffers);

@@ -5,7 +5,7 @@ namespace UnityEngine.Rendering.HighDefinition
{
public partial class HDRenderPipeline
{
TextureHandle CreateSSSBuffer(RenderGraph renderGraph, bool msaa)
TextureHandle CreateSSSBuffer(RenderGraph renderGraph, HDCamera hdCamera, bool msaa)
{
#if UNITY_2020_2_OR_NEWER
FastMemoryDesc fastMemDesc;
@@ -20,7 +20,7 @@ TextureHandle CreateSSSBuffer(RenderGraph renderGraph, bool msaa)
enableRandomWrite = !msaa,
bindTextureMS = msaa,
enableMSAA = msaa,
clearBuffer = NeedClearGBuffer(),
clearBuffer = NeedClearGBuffer(hdCamera),
clearColor = Color.clear,
name = msaa ? "SSSBufferMSAA" : "SSSBuffer"
#if UNITY_2020_2_OR_NEWER