Controlling the Blur Filter

I’m doing a simulation of optics, showing how the distance of a lens affects the formation of an image. As the user drags a lens back and forth, I need to vary the blur on an image (texture) in real time to show it going in and out of focus. I’ve figured out how to get the blur to affect only a single texture (using a dedicated camera), but I can’t seem to adjust the blur with fine enough resolution. There’s either no blur or way too much, and the steps between are too big. It looks like the iteration parameter only takes integers, and the spread parameter doesn’t seem to do much of anything. Even with both set to zero, there is still a fair amount of blur.

Is there anything I can do to get smaller steps with the blur? Any tweaking of the code possibly? Maybe some way to use a float instead of an integer? Or alternatively, another way to blur a texture in real time?
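One idea I’ve been toying with, since the iteration count is an integer: run one strong blur pass and then crossfade between the sharp and blurred images with a float. This is just a rough sketch, assuming a hypothetical blend shader with `_BlurTex` and `_Blend` properties (those names are mine, not Unity’s):

```csharp
using UnityEngine;

// Sketch only: crossfade between the sharp frame and a fully blurred
// copy to get continuous "focus" control from a single float.
[RequireComponent(typeof(Camera))]
public class SoftBlur : MonoBehaviour
{
    public Material blurMaterial;   // the standard blur shader material
    public Material blendMaterial;  // assumed shader: lerp(src, _BlurTex, _Blend)
    [Range(0f, 1f)] public float focus = 0f;  // 0 = sharp, 1 = fully blurred

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        RenderTexture blurred = RenderTexture.GetTemporary(src.width, src.height);
        Graphics.Blit(src, blurred, blurMaterial);   // one strong blur pass
        blendMaterial.SetTexture("_BlurTex", blurred);
        blendMaterial.SetFloat("_Blend", focus);     // continuous control
        Graphics.Blit(src, dst, blendMaterial);      // mix sharp and blurred
        RenderTexture.ReleaseTemporary(blurred);
    }
}
```

Dragging the lens would then just set `focus` anywhere between 0 and 1 instead of stepping the iteration count.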

Looks like I have another problem. When I build for the web player, the blurred texture is not drawn correctly the way it is in Unity. The top half shows the bottom half of the image blurred, but the bottom half looks like it took a screen capture of a random portion of the screen and blurred it (see the attached image). It’s different each time depending on where the browser window is on my screen. Any ideas why it would look OK in Unity but not in the web player? Yes, I do have Unity Pro.

795208--29148--$Blur-Sample.jpg

Are both windows the same size?

If you mean in the screen captures, yes both are the same area of the window. In both Unity and the web player the Unity frame is the same size.

The image is an upside down lamp on a table. The bright spot is the lamp shade. In the web player the lamp shade is shifted up and the lower part of the image is filled in with some random part of the screen.

Attached is a larger screen capture that shows the context. In this one I can tell that the random portion of the screen is a section of my dock drawn upside down.

795993--29183--$Blur-Sample-2.jpg

I can’t seem to get this blur filter to work. It doesn’t matter whether the camera is enabled or disabled at launch, or whether the filter is enabled or disabled. I always get the random garbage.
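My guess is that the garbage is an uninitialized buffer that never gets cleared, so the camera picks up whatever was left in video memory (which would explain my dock showing up in it). One thing I plan to try is forcing the blur camera to clear explicitly; something like this:

```csharp
// Guess at a fix: make sure the dedicated blur camera actually clears
// its buffer instead of inheriting leftover video memory.
Camera blurCam = GetComponent<Camera>();
blurCam.clearFlags = CameraClearFlags.SolidColor;
blurCam.backgroundColor = Color.black;
```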

If I apply the blur filter to the main camera, it works correctly – the whole scene except for the 2D interface is blurred and there is no garbage drawn. Could it be that this filter can only be applied to the main camera? If I apply the filter to my image camera and don’t set its viewport, it blurs the main camera and the 2D interface camera too. But if I apply it to the main camera, the 2D and image cameras are unaffected.

So maybe I just don’t understand how these image effects work. Are they full screen or nothing? Maybe I should be asking how do I use the blur filter to blur only one texture in one small part of the screen? Or do I need to write my own blur filter that works on a texture rather than a camera? (Which would negate part of the reason for getting Unity Pro.)
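If the image effects really are full-screen or nothing, maybe I can skip the camera filter entirely: render the image camera into a RenderTexture and blur that texture directly with Graphics.Blit. A sketch of what I have in mind, assuming Unity Pro’s RenderTexture support (the camera and material names here are mine):

```csharp
using UnityEngine;

// Sketch only: blur a single off-screen texture instead of a camera,
// so the rest of the screen is never touched by the blur.
public class TextureBlur : MonoBehaviour
{
    public Camera imageCamera;     // renders only the lens-image layer
    public Material blurMaterial;  // the standard blur shader material
    public RenderTexture output;   // assigned to the on-screen quad

    void Update()
    {
        RenderTexture sharp = RenderTexture.GetTemporary(output.width, output.height);
        imageCamera.targetTexture = sharp;
        imageCamera.Render();                        // render the image off-screen
        Graphics.Blit(sharp, output, blurMaterial);  // blur only this texture
        imageCamera.targetTexture = null;
        RenderTexture.ReleaseTemporary(sharp);
    }
}
```

The blurred `output` texture could then be mapped onto a quad in just that small part of the screen, leaving the main and 2D cameras alone.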

Just checked on two Macs, both OS 10.6.8 and Safari 5.1.2, and had the same problem. Likewise in Firefox 9.0.1 on those machines. However for a colleague running Safari 5.0.5 on Mac OS it works correctly. It also works correctly on Win XP in FF and IE. Not sure what to make of this.