Ethans Graphics Kit (Lighting, Sky, Procedural Mesh Generation and Neural Network powered Post-FX)

Released open source under MIT license https://forum.unity.com/threads/ethans-graphics-kit-realtime-gi-sky-cloud-rendering-and-procedural-mesh-gen-open-source.497734/

I started creating a custom deferred renderer (you can toggle between deferred and forward at run-time) in Unity 5.3.1 around two years ago, and it’s evolved into a whole collection of effects.

Ethans Graphics Kit includes:

- Lighting System (requires shader level 5): a real-time deferred path tracer with a forward-rendered fallback.

- Mobile Lighting System (requires shader level 3.1): a forward-rendered light propagation volume (like a grid of light probes).

- Sky Renderer (requires shader level 3.0): renders the sky (clouds, atmospheric scattering, godrays, sun, and stars/custom background) to a cubemap.

- Procedural Generation (CPU based): easily generate triangle meshes from distance functions and procedural textures.

- Image Effects: an automatic shader generator (powered by neural networks) and helper classes for using image effect shaders.
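To give a feel for the distance-function approach, here is a minimal conceptual sketch (plain Python, not the kit’s actual code): sample a signed distance function on a grid and emit two triangles for every voxel face that separates "inside" from "outside". The kit generates smoother meshes; this blocky version just shows the core idea.

```python
def sphere_sdf(x, y, z, r=4.0):
    # Signed distance to a sphere of radius r centred on the origin.
    return (x * x + y * y + z * z) ** 0.5 - r

def mesh_from_sdf(sdf, n=10):
    """Sample sdf on an n^3 grid centred on the origin and emit two
    triangles for every voxel face between inside and outside."""
    def inside(i, j, k):
        # Out-of-range samples count as outside so the mesh is closed.
        if not (0 <= i < n and 0 <= j < n and 0 <= k < n):
            return False
        h = n / 2.0
        return sdf(i - h, j - h, k - h) < 0.0

    # For each neighbour direction: (delta, four corners of the shared face).
    faces = [
        ((1, 0, 0),  [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 1)]),
        ((-1, 0, 0), [(0, 0, 0), (0, 0, 1), (0, 1, 1), (0, 1, 0)]),
        ((0, 1, 0),  [(0, 1, 0), (0, 1, 1), (1, 1, 1), (1, 1, 0)]),
        ((0, -1, 0), [(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)]),
        ((0, 0, 1),  [(0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]),
        ((0, 0, -1), [(0, 0, 0), (0, 1, 0), (1, 1, 0), (1, 0, 0)]),
    ]
    tris = []
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if not inside(i, j, k):
                    continue
                for (di, dj, dk), corners in faces:
                    if inside(i + di, j + dj, k + dk):
                        continue  # neighbour is solid too, face is hidden
                    a, b, c, d = [(i + ci, j + cj, k + ck)
                                  for ci, cj, ck in corners]
                    tris.append((a, b, c))  # split the quad into
                    tris.append((a, c, d))  # two triangles
    return tris

tris = mesh_from_sdf(sphere_sdf, n=10)
```

Swap in any distance function (boxes, unions, noise-displaced surfaces) and the same loop produces a mesh for it, which is what makes SDF-based generation convenient.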

Here you can download some of the demos seen in the video.

Here’s a Shadertoy of a tropical bird style transfer shader: https://www.shadertoy.com/view/lsfyDM.
Here’s the outline shader from the video: https://www.shadertoy.com/view/MssyRj.

There are two neural network shader generators in Ethans Graphics Kit: Style Transfer (the tropical bird example) and Transformation (the outline example). For both, you just select images in an Editor window, click one or two buttons, and you have a usable shader in your Project folder. The Style Transfer window also has an ‘Alter’ option so you can test out the neural network styles before generating a shader.

How Style Transfer works:
A neural network is trained to classify between the style images you provide. Once trained, it can be run forward and then backward to transfer, or ‘dream’, the target style onto whatever pixels you feed into the network. For example, if the first style is ‘tropical bird’ and you transfer that style to an image, the network alters the image toward what it identifies as a ‘tropical bird’.
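The forward/backward idea can be sketched in a few lines (a toy NumPy example, not the kit’s network): a tiny classifier scores how style-like an image is, and gradient ascent on the *pixels* nudges the image toward whatever the classifier calls that style.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=12)   # stand-in for trained classifier weights
x = rng.normal(size=12)   # stand-in for the input image's pixels

def style_score(x):
    # Forward pass: sigmoid of a linear score, "how style-like is x?"
    return 1.0 / (1.0 + np.exp(-(w @ x)))

before = style_score(x)
for _ in range(50):
    s = style_score(x)
    grad = s * (1.0 - s) * w  # backward pass: d(score)/d(pixels)
    x += 0.5 * grad           # gradient ascent on the pixels themselves
after = style_score(x)        # score rises: the image became more "style"
```

A real style-transfer network uses many convolutional layers rather than one linear one, but the loop is the same shape: score the image, backpropagate to the pixels, step.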

How Transformation works:
A neural network is trained to transform the pixels from a ‘source’ image to a ‘target’ image.
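A stripped-down stand-in for that idea (NumPy least squares, not the kit’s network): learn a per-pixel affine colour map from source to target. A real network also looks at neighbouring pixels (which outlines require), but even this colour-only version shows why a yellow-tinted example teaches the model to add blue everywhere.

```python
import numpy as np

rng = np.random.default_rng(1)
src = rng.uniform(size=(500, 3))          # source pixels, RGB in [0, 1]
tint = np.array([0.1, 0.1, -0.1])         # pretend target was "de-yellowed"
tgt = np.clip(src + tint, 0.0, 1.0)       # target pixels

# Fit an affine colour transform: [R G B 1] @ M -> RGB
A = np.hstack([src, np.ones((500, 1))])
M, *_ = np.linalg.lstsq(A, tgt, rcond=None)

def apply_transform(pixels):
    # Apply the learned source -> target colour map to new pixels.
    ones = np.ones((len(pixels), 1))
    return np.hstack([pixels, ones]) @ M

pred = apply_transform(src)
err = np.abs(pred - tgt).mean()           # small: the map was learned
```

The learned transform applies the tint shift to *any* input, which is exactly the behaviour described in the blue-tint note below.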

Outline Transformation example:

In the outline example you can see I drew some black outlines on the source image; the bottom shows the original versus the result with the generated shader, which applies a black outline.
Note: the added blue tint is there because the example image was tinted yellow, so the neural net added blue to compensate. You can avoid this by not using a yellow-tinted example :p.

The included ‘Fullscreen Image Effect’ and ‘Temporal Image Effect’ components let you use the generated shaders right away; just drop the generated shader into the ‘Effect’ field.

That path tracer looks nice; how come there is no noise on it like there usually is with path tracing? Are you planning to release it as an asset that would work with Unity 5.6?

There is noise, but very little; it shows up more when you move the camera. It uses an unsigned distance field to do the path tracing, which means it’s limited by the field’s resolution: the larger the area you cover with the light volume, the less detail you get. To help cover large view distances it also has a cascaded lightmap mode, similar to shadow map cascades: it splits the light volume into one near volume for high-resolution lighting and one far volume for low-resolution lighting.

I may release it on the asset store depending on the feedback I get here. I don’t plan on updating it past 5.3.1f.

Watched the path tracing, I like it =)
How complex can a scene be and still render at a stable 30 FPS? For example, the Sponza scene.
Can you make a download demo for it?

Here you can download some of the demos seen in the video.

It really depends on the graphics card; the high-end GPUs released in the past few years can definitely handle a small scene like ‘Crytek Sponza’ in real-time (60 FPS). Admittedly I haven’t used the real-time GI a lot; I normally use the ‘Delayed’ setting that renders at 5 FPS instead of 60 FPS.

For benchmarks, both of these were done on Windows 8/10 with an NVIDIA GT 750M:
Small test scene, 20k triangles, lighting updating at 5 FPS (‘Delayed’ setting) with a 32x32 volume (‘Very Low’).
Sibenik Cathedral, 80k triangles, lighting baked on start with a 128x128 volume (‘Medium’); this only renders at 30 FPS instead of 60 FPS on the GT 750M (it can be fixed by making the volume smaller than 128x128 at the sacrifice of quality).
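A back-of-the-envelope check on those settings (assumptions on my part, not numbers measured from the kit): if the light volume is cubic and stores RGBA16F at 8 bytes per voxel, the ‘Medium’ 128 volume has 64 times the voxels of the ‘Very Low’ 32 volume, which is consistent with the frame-rate gap between the two benchmarks.

```python
def volume_bytes(res, bytes_per_voxel=8):
    # res^3 voxels, each RGBA16F (4 channels x 2 bytes) -- an assumption.
    return res ** 3 * bytes_per_voxel

very_low = volume_bytes(32)   # 32^3 voxels  -> 256 KiB
medium = volume_bytes(128)    # 128^3 voxels -> 16 MiB
ratio = medium / very_low     # 64x more voxels to light and sample
```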

Tested on my GTX 980. Strange, but it feels like I have low-end hardware.
Anyway, it all works fine.

And here’s other questions I answered:

This looks pretty cool.


I saw the video and I didn’t quite understand what this was about; is it like doing style transfer from an image to a shader (example)? Could you expand on this feature? It looks interesting.

Yes, I think the core method is the same as in the video, but it is less detailed because it has to run in real-time. Here’s a Shadertoy of a tropical bird style transfer shader: https://www.shadertoy.com/view/lsfyDM.
Here’s the outline shader from the video: https://www.shadertoy.com/view/MssyRj.



Thank you for explaining and sharing those shaders. Now I see it as a powerful feature that could facilitate the shader workflow better than Substances, but it has a long way to go…

Hoping to see more improvements in the future, keep it up :smile:
