Changing shader in native code

Hello!

To do some color tracking I do an offscreen render to a downsampled buffer with special shaders and read back the results. I'm not familiar enough with Unity to know if this is possible with cameras/shaders all in Unity.

If I were to make a native call to my routines and do my rendering with my shaders, is this going to totally kill Unity? Or is there some form of an "I've mucked with the current frame buffer and current shader, so reset yourself" call for Unity?

Just to update this and ask another question…

Apparently setting my vert/frag shaders with glUseProgram() does not hurt Unity, at least not at the point in the flow where I happened to make my call. However, what I render is not what I wanted; after looking at a GL snapshot in Xcode for some time, I can't see why my rendering isn't doing what it should, but that's for me to debug.

One of my shaders I figured I could do with RenderWithShader, as it's basically just a "screen effect". However, if I do that and then do a blit in OnRenderImage (à la some other post-process effects), I get a constant spew of GL errors coming out of gles_support. Any ideas?

I "think" what I'm trying to do could be done entirely with RenderWithShader calls, or with a disabled camera that uses my custom shader (for the offscreen render and read-back), but for some reason I can't wrap my head around the "how". If anyone has an example of, or a link to, rendering to an offscreen buffer and reading it back, I'd take it.
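For what it's worth, a minimal sketch of that approach (render the scene through a replacement shader into a small offscreen RenderTexture, then read it back on the CPU). The shader reference and the 64x64 size are assumptions, and this depends on the Unity engine APIs Camera.RenderWithShader and Texture2D.ReadPixels:

```csharp
using UnityEngine;

// Sketch: render the scene with a custom replacement shader into a
// downsampled offscreen buffer, then read the pixels back to the CPU.
public class OffscreenReadback : MonoBehaviour
{
    public Camera sourceCamera;     // the (possibly disabled) camera to render with
    public Shader isolationShader;  // hypothetical color-isolation shader

    RenderTexture rt;
    Texture2D readback;

    void Start()
    {
        rt = new RenderTexture(64, 64, 16);   // small buffer: read-back is slow
        readback = new Texture2D(64, 64, TextureFormat.RGB24, false);
    }

    void LateUpdate()
    {
        // Redirect the camera into the offscreen buffer and render the
        // whole scene with the replacement shader.
        RenderTexture prev = sourceCamera.targetTexture;
        sourceCamera.targetTexture = rt;
        sourceCamera.RenderWithShader(isolationShader, "");
        sourceCamera.targetTexture = prev;

        // Read the result back. This stalls the GPU, which is why the
        // buffer is kept tiny.
        RenderTexture.active = rt;
        readback.ReadPixels(new Rect(0, 0, 64, 64), 0, 0);
        readback.Apply();
        RenderTexture.active = null;
    }
}
```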

What do you want to achieve exactly?

You likely get a state clash, as Unity assumes singular existence… Are you sure you can't achieve it with replacement shaders, multiple-camera overlays, post FX, and similar approaches?

@Aubergine: I need to downsample the background image to an offscreen buffer using custom shaders that isolate a color, then read back that buffer, finding the isolated color.
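The CPU side of that read-back might look like the following sketch. It assumes (hypothetically) that the isolation shader writes white wherever the target color was found, and computes the centroid of the matching pixels:

```csharp
using UnityEngine;

// Sketch: scan a read-back texture for the isolated color.
// Assumes the shader marked matches as (near-)white pixels.
public static class ColorFinder
{
    // Returns the centroid of matching pixels, or null if none matched.
    public static Vector2? FindIsolatedColor(Texture2D readback, float threshold = 0.9f)
    {
        Color32[] pixels = readback.GetPixels32();
        int w = readback.width, h = readback.height;
        long sumX = 0, sumY = 0, count = 0;
        byte cutoff = (byte)(threshold * 255f);

        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                Color32 c = pixels[y * w + x];
                if (c.r >= cutoff && c.g >= cutoff && c.b >= cutoff)
                {
                    sumX += x;
                    sumY += y;
                    count++;
                }
            }

        if (count == 0) return null;
        return new Vector2(sumX / (float)count, sumY / (float)count);
    }
}
```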

@Dreamora: I think it can be done with replacement shaders; I just can't get my head around what all I need to do, and how to do it. The GL error I see comes NOT from when I use my plugin and glUseProgram(), but from when I use RenderWithShader and the subsequent OnRenderImage(). OnRenderImage() is what is causing the GL error, not my plugin.

Moved this to its own thread in the iOS area. Sorry for the trouble.

http://forum.unity3d.com/threads/121956-Native-iOS-Shader-Integration

@pweeks, did you have any further success with your project?

So, briefly: you want to make a black-and-white mask of your screen from a selection of color(s).

You can do it in a two-pass image effect, or you can do it with two Graphics.Blit operations using two separate shaders.
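A two-Blit image effect along those lines might be sketched as follows; both materials wrap hypothetical custom shaders, one that isolates the color into a mask and one that applies it:

```csharp
using UnityEngine;

// Sketch of a two-pass image effect: the first Blit isolates the
// color into a temporary buffer, the second applies the mask.
[RequireComponent(typeof(Camera))]
public class TwoPassMask : MonoBehaviour
{
    public Material isolatePass;  // pass 1: color -> black & white mask
    public Material maskPass;     // pass 2: apply the mask to the image

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        RenderTexture tmp = RenderTexture.GetTemporary(src.width, src.height);
        Graphics.Blit(src, tmp, isolatePass);  // first pass
        Graphics.Blit(tmp, dst, maskPass);     // second pass
        RenderTexture.ReleaseTemporary(tmp);
    }
}
```

Using GetTemporary/ReleaseTemporary avoids allocating a fresh RenderTexture every frame.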

Check the replacement shaders example in the download section; the custom depth buffer example does this.

Or you can check out my per-object glow in the Asset Store, which does this.