Raymarcher Technical & API Documentation
Raymarcher Camera Filters
Namespace: Raymarcher.CameraFilters
Raymarcher utilizes camera filters to display the currently processed frame.

There are three types of camera filters:

- Built-In Render Pipeline camera filter
- Universal Render Pipeline camera filter
- High Definition Render Pipeline camera filter


Each camera filter is designed for a specific Unity render pipeline but shares the same purpose - displaying the processed frame to the camera.
Every camera filter creates a quad mesh in front of the target camera with proper frustum parameters.
Once the entire Raymarcher pipeline has been processed, the final frame is 'blitted' into the quad's material.

Setting up individual render pipelines in Unity differs slightly:

Built-In RP
To display a frame processed by Raymarcher in the Built-In RP, each target camera needs the 'RMCamFilterBuiltInRP' component.
This component handles rendering the final frame in the Scene view, 'blitting' the current scene color (useful for features such as 'scene refraction' in Raymarcher materials),
disabling the depth texture when unused, and adjusting the projector size (resizing the generated quad for VR cameras).
You can access the filter's fields/properties by getting a reference to the component directly.
Properties & Fields
// Use the downsample feature? Raymarcher will be rendered into a downsampled RT, which can significantly improve performance
public bool useDownsampleFeature;
// Downsample factor applied to the render texture
public float downsample;
// Sharpening applied to the downsampled result
public float sharpness;

// Current target render master the camera is rendering with
public RMRenderMaster TargetRenderMaster { get; }
// Is the current camera the editor camera?
public bool SceneViewEditorCamera { get; }
// Blit the current scene color before rendering (useful for features such as 'scene refraction')
public bool BlitSceneColor { get; set; }
// Currently captured scene color (if blitSceneColor is enabled)
public RenderTexture CurrentSceneColor { get; }
// Disable the camera's depth texture when it is not used
public bool DisableDepthIfUnused { get; set; }
// Size of the generated quad in front of the camera
public float ProjectorSize { get; set; }
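A minimal sketch of configuring the Built-In RP filter at runtime via the fields and properties listed above. The `BuiltInFilterSetup` class name and the serialized `mainCamera` reference are hypothetical wiring for illustration; the assigned values are examples, not recommended defaults:

```csharp
using UnityEngine;
using Raymarcher.CameraFilters;

public class BuiltInFilterSetup : MonoBehaviour
{
    // Assign the target camera in the Inspector (hypothetical wiring)
    [SerializeField] private Camera mainCamera;

    private void Start()
    {
        // Grab the filter component attached to the target camera
        var camFilter = mainCamera.GetComponent<RMCamFilterBuiltInRP>();
        if (camFilter == null)
            return;

        // Render Raymarcher into a downsampled RT to improve performance
        camFilter.useDownsampleFeature = true;
        camFilter.downsample = 0.5f;

        // Capture the scene color for refraction-style effects
        camFilter.BlitSceneColor = true;

        // Enlarge the generated quad (e.g., for VR cameras)
        camFilter.ProjectorSize = 1.5f;
    }
}
```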


URP
In URP, Raymarcher is displayed through a renderer feature added in the URP settings.
Choose your target URP renderer asset and add the 'RMCamFilterURP' renderer feature.
This renderer feature enables rendering the final frame in the Scene view, 'blitting' the current scene color, and adjusting the projector size.
The Raymarcher Render Master automatically dispatches the required session material to the renderer feature.
You can access the filter's fields/properties by getting a reference to the renderer feature on the URP asset where RMCamFilterURP is located.
Properties & Fields
// Access the settings struct in the renderer feature
public CamFilterSettings settings;

// Content of the settings:

// Injection point of the render pass in the URP frame
public RenderPassEvent renderPassEvent;
// Blit the current scene color (useful for features such as 'scene refraction')
public bool blitSceneColor;
// Render the final frame in the Scene view
public bool renderInSceneView;

// Size of the generated quad in front of the camera
public float projectorSize;

// Use the downsample feature? Raymarcher will be rendered into a downsampled RT, which can significantly improve performance
public bool useDownsampleFeature;
// Downsample factor applied to the render texture
public float downsample;
// Sharpening applied to the downsampled result
public float sharpness;

// Current render master driving this session
public RMRenderMaster RaymarcherSession { get; set; }
If you change the target renderer asset, you also need to refresh Raymarcher's URP camera filter. You can do that by calling:
Raymarcher.CameraFilters.RMCamFilterURPHandler.SetupURPCam(RMRenderMaster targetRenderMaster)
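A short sketch of that refresh call in context. The `UrpFilterRefresh` helper class and its method name are hypothetical; only `RMCamFilterURPHandler.SetupURPCam` and `RMRenderMaster` come from this documentation:

```csharp
using Raymarcher;
using Raymarcher.CameraFilters;

public static class UrpFilterRefresh
{
    // Call this after switching the active URP renderer asset
    public static void RefreshAfterAssetSwitch(RMRenderMaster renderMaster)
    {
        // Re-registers the camera filter and re-dispatches the session
        // material to the RMCamFilterURP renderer feature
        RMCamFilterURPHandler.SetupURPCam(renderMaster);
    }
}
```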


HDRP
In HDRP, Raymarcher is displayed through HDRP's Custom Pass feature, which can be created via Create/CustomPass in the hierarchy.
Select the created custom pass and add 'RMCamFilterHDRP'.
In the custom pass, assign the created Raymarcher session material; you can also adjust whether the final frame renders in the Scene view and change the projector size (the size of the generated quad).
You can access the filter's fields/properties by getting a reference to the HDRP custom pass component where RMCamFilterHDRP is located.
Properties & Fields
// Assign the created Raymarcher session material here
public Material rmSessionMaterial;
// Render the final frame in the Scene view
public bool renderInSceneView = true;
// Size of the generated quad in front of the camera
public float projectorSize = 0;
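A sketch of configuring the HDRP filter from script, assuming you hold a reference to the RMCamFilterHDRP custom pass instance (how that reference is obtained, and the `HdrpFilterSetup` class and its serialized fields, are hypothetical wiring for illustration):

```csharp
using UnityEngine;
using Raymarcher.CameraFilters;

public class HdrpFilterSetup : MonoBehaviour
{
    // Assign these in the Inspector (hypothetical wiring)
    [SerializeField] private RMCamFilterHDRP camFilter;
    [SerializeField] private Material sessionMaterial;

    private void Start()
    {
        // Assign the Raymarcher session material so the pass can display the final frame
        camFilter.rmSessionMaterial = sessionMaterial;

        // Skip rendering the final frame in the Scene view
        camFilter.renderInSceneView = false;

        // Resize the generated quad in front of the camera
        camFilter.projectorSize = 1f;
    }
}
```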


Ensure that you unpack the camera filter matching your render pipeline before integrating Raymarcher into your project.