Released as open source under the MIT license: https://forum.unity.com/threads/ethans-graphics-kit-realtime-gi-sky-cloud-rendering-and-procedural-mesh-gen-open-source.497734/
I started creating a custom deferred renderer (you can toggle between deferred and forward at run-time) in Unity 5.3.1 around two years ago, and it's evolved into a whole collection of effects.
Ethans Graphics Kit includes:
-Lighting System (requires shader level 5): a real-time deferred path tracer with a forward-rendered fallback.
-Mobile Lighting System (requires shader level 3.1): a forward-rendered light propagation volume (like a grid of light probes).
-Sky Renderer (requires shader level 3.0): renders the sky (clouds, atmospheric scattering, godrays, sun, and stars/custom background) to a cubemap.
-Procedural Generation (CPU-based): easily generate triangle meshes from distance functions, plus procedural textures.
-Image Effects: an automatic shader generator (powered by neural networks) and helper classes for using image-effect shaders.
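To give a feel for the distance-function side of the Procedural Generation feature, here's a minimal NumPy sketch of the general idea (my own toy construction, not the kit's C# implementation; `sphere_sdf` and `mesh_from_sdf` are illustrative names, not APIs from the kit). For a star-shaped SDF you can place one vertex per direction by root-finding along rays from the origin, then triangulate a latitude/longitude grid of directions:

```python
import numpy as np

def sphere_sdf(p):
    # Signed distance to a unit sphere at the origin (negative inside).
    return np.linalg.norm(p, axis=-1) - 1.0

def mesh_from_sdf(sdf, lat_steps=16, lon_steps=32, t_max=4.0):
    # For a star-shaped SDF (every ray from the origin crosses the surface
    # exactly once), bisect sdf along each ray to find the surface point,
    # then triangulate the latitude/longitude grid of directions.
    verts = []
    for i in range(lat_steps + 1):
        theta = np.pi * i / lat_steps
        for j in range(lon_steps):
            phi = 2.0 * np.pi * j / lon_steps
            d = np.array([np.sin(theta) * np.cos(phi),
                          np.sin(theta) * np.sin(phi),
                          np.cos(theta)])
            lo, hi = 0.0, t_max          # assumes sdf(0) < 0 < sdf(t_max * d)
            for _ in range(40):          # bisection: interval halves each step
                mid = 0.5 * (lo + hi)
                if sdf(mid * d) < 0.0:
                    lo = mid
                else:
                    hi = mid
            verts.append(0.5 * (lo + hi) * d)
    verts = np.array(verts)
    tris = []
    for i in range(lat_steps):           # two triangles per grid cell
        for j in range(lon_steps):
            a = i * lon_steps + j
            b = i * lon_steps + (j + 1) % lon_steps
            c = (i + 1) * lon_steps + j
            e = (i + 1) * lon_steps + (j + 1) % lon_steps
            tris.append((a, b, c))
            tris.append((b, e, c))
    return verts, np.array(tris)

verts, tris = mesh_from_sdf(sphere_sdf)  # every vertex lies on the unit sphere
```

The same loop works for any star-shaped distance function (e.g. a sphere displaced by noise); non-star-shaped surfaces need a volumetric method like marching cubes instead.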
Here you can download some of the demos seen in the video.
Here's a Shadertoy of a tropical-bird style-transfer shader: Shader - Shadertoy BETA.
Here's the outline shader from the video: Shader - Shadertoy BETA.
There are two neural-network shader generators in Ethans Graphics Kit: Style Transfer (the tropical-bird example) and Transformation (the outline example). For both, you just select images in an Editor window, click one or two buttons, and you have a usable shader in your Project folder. The Style Transfer window also has an 'Alter' option so you can test out the neural-network styles before generating a shader.
How they work:
Style Transfer:
A neural network is trained to classify between the style images you provide. Once trained, it can be run forward and then backward to transfer, or 'dream', the target style onto whatever pixels you feed into the network. For example, if the first style is 'tropical bird' and you transfer that style to an image, the network alters the image toward whatever it identifies as a 'tropical bird'.
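The forward-then-backward 'dreaming' step can be shown in miniature. Below is a toy NumPy sketch under my own assumptions (a one-layer logistic classifier over raw RGB pixels and two synthetic colour clouds standing in for style images; the kit's real network and training setup are not shown in this post): train a classifier to separate the styles, then push its gradient back into the pixels so the score for the target style rises.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy "styles" as RGB pixel clouds (stand-ins for real style images):
# style 0 is bluish, style 1 (the "tropical bird") is reddish.
style0 = rng.normal([0.2, 0.3, 0.8], 0.1, size=(500, 3))
style1 = rng.normal([0.9, 0.4, 0.1], 0.1, size=(500, 3))
X = np.vstack([style0, style1])
y = np.concatenate([np.zeros(500), np.ones(500)])

# 1) Train: logistic regression by gradient descent on cross-entropy.
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # forward pass
    w -= 0.5 * X.T @ (p - y) / len(y)        # backward pass into the weights
    b -= 0.5 * float(np.mean(p - y))

# 2) "Dream": run forward, then push the gradient of the style-1 score
#    back into the *pixels* of the image we want to restyle.
pixels = rng.uniform(0.0, 1.0, size=(64, 3))
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(pixels @ w + b)))
    pixels += 0.05 * (1.0 - p)[:, None] * w  # d/dpixels of log p(style 1)
    pixels = np.clip(pixels, 0.0, 1.0)
# pixels have drifted toward what the classifier considers style 1 (reddish)
```

The key point is that in step 2 the weights are frozen and the *image* is the thing being optimized, which is what lets a trained classifier act as a style generator.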
Transformation:
A neural network is trained to transform the pixels from a ‘source’ image to a ‘target’ image.
Outline Transformation example:
In the outline example you can see I drew some black outlines on the source image; the bottom shows the original versus the result with the generated shader. The generated shader applies a black outline.
Note: the added blue tint appears because the example image was tinted yellow, so the neural net added blue to compensate. You can avoid this by not using a yellow-tinted example :p.
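The blue-compensation effect from the note is easy to reproduce in a toy setting. This is a minimal NumPy sketch under my own assumptions (a single linear layer fit to per-pixel RGB pairs by gradient descent, nothing like the kit's actual Transformation network): train on a (source, target) pair where the source carries a yellow tint, and the learned mapping boosts blue to undo it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy (source, target) training pair: the source is the target with a
# yellow tint (boosted red/green, reduced blue), as in the note above.
target = rng.uniform(0.0, 1.0, size=(1000, 3))       # desired output pixels
source = np.clip(target * np.array([1.1, 1.1, 0.7]), 0.0, 1.0)

# Train a one-layer "network" (3x3 matrix + bias) mapping source pixels
# to target pixels, by gradient descent on mean-squared error.
W, b = np.eye(3), np.zeros(3)
for _ in range(3000):
    err = source @ W + b - target
    W -= 0.2 * source.T @ err / len(source)
    b -= 0.2 * err.mean(axis=0)

# The learned map boosts the blue channel (W[2, 2] near 1/0.7) to undo
# the tint -- the same compensation the note describes.
```

Because the network only sees the pixel pairs, any systematic colour cast in the example pair gets baked into the mapping, which is exactly why an untinted example avoids the problem.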
The included 'Fullscreen Image Effect' and 'Temporal Image Effect' components let you use the generated shaders right away; just drop the generated shader into the 'Effect' field.