This page contains a list of all the small shaders I've coded up over time. All of them are public and available on GitHub: LINK!

Fractal shader

This shader generates the effect of a colored Julia set, which can optionally be made emissive. The fractal is computed live rather than sampled from a texture, so various animated effects are possible, as every aspect of the calculation is parametrized.

An Audio-Link-reactive version of the shader also exists, as well as a Luma Glow-enabled variant, both of which modulate the fractal generation parameters based on sound.

The shader computes the distance of each point to the set and maps it to a value between 0 and 1, which is then used to interpolate between colors for the final output. The scale of this distance value depends on the Max Iterations setting, so that setting also influences the visual appearance rather than just the image detail.
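The idea can be sketched in a few lines of Python. This is only an illustration of the technique, not the shader's actual code; the function names, the escape-count-based "distance", and the color ramp are my own choices here.

```python
# Illustrative sketch of the Julia set evaluation described above.

def julia_value(zx, zy, cx=-0.8, cy=0.156, max_iterations=64):
    """Iterate z -> z^2 + c and return the escape count normalized
    to [0, 1]. Because we divide by max_iterations, changing that
    setting rescales the value, which is why it affects the coloring
    and not just the level of detail."""
    for i in range(max_iterations):
        if zx * zx + zy * zy > 4.0:  # escape radius of 2
            return i / max_iterations
        zx, zy = zx * zx - zy * zy + cx, 2.0 * zx * zy + cy
    return 1.0  # point is (probably) inside the set

def lerp(a, b, t):
    """Linear interpolation between two RGB tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

# Interpolate between two colors using the normalized value.
inner, outer = (0.0, 0.0, 0.2), (1.0, 0.6, 0.0)
color = lerp(inner, outer, julia_value(0.3, 0.5))
```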

Video Demo:

CRT Input layer

This shader is not intended to be used standalone. Instead, it is a solution I came up with for forwarding avatar parameters to a shader running in a CustomRenderTexture. A material used to shade a CRT cannot be affected by animations, making it seemingly impossible to pass parameters to it (e.g. from the expressions menu).

However, such materials can access GrabPass textures, which is what the CRT input layer takes advantage of. It is actually two shaders: the first solid-colors a massive sphere surrounding the avatar, and the second renders right after it, simply taking a GrabPass that will then be filled with that solid color. All of this happens before the skybox in the render queue, so no screen-space pixels are actually affected. Finally, the GrabPass texture can be accessed from any CRT shader.

Although this should technically be capable of forwarding up to 3 bytes of data to the CRT shader (one per color channel), it appears that world effects still affect the input layer, leading to slight data corruption. The most I ever managed to reliably transfer was 2 bits of information per color channel.
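To make the 2-bits-per-channel scheme concrete, here is a hypothetical sketch of the encoding (all names are mine, and the corruption tolerance shown is illustrative, not measured). Using only four widely spaced levels per channel leaves enough margin that small corruption can be rounded away when decoding:

```python
# Hypothetical sketch of packing 2 bits per color channel.

def encode(bits_r, bits_g, bits_b):
    """Map three 2-bit values (0..3) to a normalized RGB color,
    using the four levels 0, 1/3, 2/3, 1 per channel."""
    return tuple(v / 3.0 for v in (bits_r, bits_g, bits_b))

def decode(color):
    """Recover the 2-bit values by rounding to the nearest level,
    which absorbs small per-channel corruption."""
    return tuple(int(round(c * 3.0)) for c in color)

# A slightly corrupted color still decodes correctly as long as each
# channel stays within about +/- 1/6 of its intended level.
sent = encode(2, 0, 3)
corrupted = tuple(c + 0.05 for c in sent)
assert decode(corrupted) == (2, 0, 3)
```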

Several of my other shaders depend on this input layer, so it is sometimes included with those shaders.

Procedural Maze Shader

This shader procedurally generates a maze. I created it to see whether it was possible to create and manage a stack inside a CRT, as the algorithm used (randomized depth-first search) requires a large amount of stack memory to function.

The maze is initialized as a grid of cells, each with 4 walls and marked as "unvisited". The algorithm then starts at the lower-left corner of the grid. Every frame, it tries to step into a random adjacent unvisited cell, pushing the location of the previous cell onto the stack.

If it can do so, it "breaks" the wall between the previous and next cells, extending the passage.

If no unvisited cells are available, the algorithm pops the last visited location off the stack and jumps to it, taking a step back. This repeats until it once again has an unvisited neighboring cell to branch into. The algorithm completes once it has returned to the very beginning (i.e. the stack is empty).
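The steps above can be sketched as ordinary sequential Python (rather than a CRT shader that does one step per frame); the explicit stack here mirrors the one the shader keeps. The names and data layout are illustrative only:

```python
# Sketch of randomized depth-first search maze generation.
import random

def generate_maze(width, height, seed=0):
    rng = random.Random(seed)
    # passages[cell] is the set of open directions; all walls start intact.
    passages = {(x, y): set() for x in range(width) for y in range(height)}
    visited = {(0, 0)}          # start in the lower-left corner
    stack = []
    current = (0, 0)

    while True:
        x, y = current
        options = [(dx, dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                   if (x + dx, y + dy) in passages
                   and (x + dx, y + dy) not in visited]
        if options:
            dx, dy = rng.choice(options)
            nxt = (x + dx, y + dy)
            # "Break" the wall between the two cells, in both directions.
            passages[current].add((dx, dy))
            passages[nxt].add((-dx, -dy))
            visited.add(nxt)
            stack.append(current)   # push the previous cell
            current = nxt
        elif stack:
            current = stack.pop()   # dead end: take a step back
        else:
            return passages         # back at the start: done

maze = generate_maze(8, 8, seed=42)
```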

The stack and stack pointer are stored in a separate color channel of the CRT from the cell grid. This shader also uses the CRT input layer to allow inputting an RNG seed and resetting the algorithm to generate a new maze.

Video Demo:

Plasma Shader

This shader is a plasma effect using distorted Perlin noise: the values of two Perlin noise maps are used to offset the sample location of a third noise map. The effect is animated by shifting the input coordinates of the noise function over time.

The colors also occasionally "pulse". This is not achieved by simply brightening the colors, but by offsetting the output of the Perlin noise function by a positive amount. Because the noise function outputs values between -0.5 and 0.5, this pushes some negative values into the positive range, making them visible and giving the appearance that the plasma trails are briefly expanding.
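The pulse trick itself is tiny; a minimal sketch (function name and clamping are my own illustration, assuming raw noise in [-0.5, 0.5]):

```python
# Instead of scaling brightness, add a positive offset to the raw
# noise value before clamping, so regions that were negative
# (invisible) become positive (visible).

def shade(noise_value, pulse=0.0):
    """Clamp the (possibly offset) noise to a displayable [0, 1] value."""
    return max(0.0, min(1.0, noise_value + pulse))

assert shade(-0.2) == 0.0          # normally invisible
assert shade(-0.2, pulse=0.3) > 0  # becomes visible during a pulse
```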

Video demos:

Particle life

A CRT shader implementation of the simulation shown in this video. To see it in action, you can check it out in this VRC world.
The idea of the simulation is that complex emergent behaviour can arise from simulating particles that simply attract or repel each other (where repulsion is just attraction with a negative strength).
Every particle has one of four colors, and for each pair of colors, an interaction force is defined. On every step of the simulation, the velocities of the particles are updated by computing the attraction forces of each particle to all others, and then their positions are updated using their velocities.
The actual simulation my shader runs is slightly different from the one in the video, though. In the video, each particle updates its position immediately after its new velocity is computed. Because this change affects the velocities of the particles computed afterwards, all particle velocity updates must run sequentially, which is very difficult to do in a shader.
As a result, I went the easier route of first computing the new velocities for all particles, and then updating their positions in a separate pass (this is how you're supposed to implement gravitational simulations anyway, so really, the code shown in the YouTube video is the one that's wrong). This works very well in GPU code, and the implementation was almost trivial.
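The two-pass update can be sketched as follows. This is an illustrative CPU-side version, not the shader itself; the interaction matrix, force law, and constants are placeholders of my own:

```python
# Pass 1 computes every particle's new velocity from the *old*
# positions; pass 2 moves the particles. This avoids the sequential
# dependency of updating positions mid-loop.
import math

# attraction[a][b]: force color a feels toward color b (negative repels)
attraction = [[ 0.3, -0.2,  0.1,  0.0],
              [ 0.1,  0.3, -0.2,  0.0],
              [ 0.0,  0.1,  0.3, -0.2],
              [-0.2,  0.0,  0.1,  0.3]]

def step(particles, dt=0.01, friction=0.9, radius=1.0):
    """particles: list of dicts with 'pos', 'vel' (2-tuples), 'color'."""
    # Pass 1: new velocities from the old positions of all particles.
    new_vels = []
    for p in particles:
        fx = fy = 0.0
        for q in particles:
            if q is p:
                continue
            dx = q['pos'][0] - p['pos'][0]
            dy = q['pos'][1] - p['pos'][1]
            dist = math.hypot(dx, dy)
            if 0 < dist < radius:
                f = attraction[p['color']][q['color']] / dist
                fx += f * dx
                fy += f * dy
        new_vels.append((p['vel'][0] * friction + fx * dt,
                         p['vel'][1] * friction + fy * dt))
    # Pass 2: apply the velocities to the positions.
    for p, v in zip(particles, new_vels):
        p['vel'] = v
        p['pos'] = (p['pos'][0] + v[0] * dt, p['pos'][1] + v[1] * dt)

particles = [{'pos': (0.0, 0.0), 'vel': (0.0, 0.0), 'color': 0},
             {'pos': (0.5, 0.0), 'vel': (0.0, 0.0), 'color': 1}]
step(particles)
```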
A CRT is used to contain the particle positions and velocities, and the velocity update is spread over four frames to reduce lag. For rendering the particles, I used the GPU Particles shader from here, modified to read the data from my simulation CRT, render the particles as quads, and apply a texture to them to make it look nicer.

Video of the simulation running, taken in the demo VRC world: