Does anyone have a snippet showing how I can apply my textures using screen coordinates (I think), so it's basically as if the texture has been wallpapered onto my objects, and when they move the texture stays static?
You’d have to make a vertex shader that produces UV coordinates based on camera coordinates rather than transforming world coordinates.
I'm very much a noob with this shader stuff, so if anyone has a sample it would be great. I will keep trying in the meantime. Thanks.
I needed something similar in my last project; a friend helped me put this together:
struct v2f {
    float4 pos : POSITION;
    float2 uv : TEXCOORD0;
};

v2f vert (appdata_base v) {
    v2f o;
    o.pos = mul( glstate.matrix.mvp, v.vertex );
    // Use the projected (clip-space) position as the UV instead of the mesh UVs.
    o.uv = o.pos.xy / o.pos.w;
    return o;
}
You will have some distortion though, depending on the FOV of your camera.
Could I trouble you for a little more context with that code snippet? Is it part of the shader, and if so, what would the rest of the shader look like?
I tried it like this and it works fine on a sphere (the Unity built-in one), but on a cube or plane the texture distorts along the triangles (this is on the PC version, 2.5 Pro).
I was also trying TexGen EyeLinear, but I am on a PC and it seems to behave the same as ObjectLinear, not as I would expect. Any ideas?
Shader "camera project" {
Properties {
_Color ("Main Color", Color) = (1,1,1,0.5)
_MainTex ("Base (RGB)", 2D) = "white" {}
}
Category {
Blend AppSrcAdd AppDstAdd
Fog { Color [_AddFog] }
// ------------------------------------------------------------------
// ARB fragment program
SubShader {
// Pixel lights
Pass {
Name "PPL"
Tags { "LightMode" = "Pixel" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct v2f {
float4 pos:POSITION;
float2 uv:TEXCOORD0;
};
float4 _MainTex_ST;
v2f vert (appdata_base v) {
v2f o;
o.pos = mul( glstate.matrix.mvp, v.vertex );
o.uv = v.texcoord;
o.uv = o.pos.xy / o.pos.w;
return o;
}
uniform sampler2D _MainTex;
float4 frag (v2f i) : COLOR
{
half4 color = tex2D(_MainTex,i.uv);
return color;
}
ENDCG
}
}
}
FallBack "VertexLit"
}
Using EyeLinear is nice and short, but it doesn't seem to work right for me either:
// EyeLinear texgen mode example
Shader "Texgen/Eye Linear" {
    Properties {
        _MainTex ("Base", 2D) = "white" { TexGen EyeLinear }
    }
    SubShader {
        Pass {
            SetTexture [_MainTex] { combine texture }
        }
    }
}
I made a slight modification to your shader (I moved the w divide into the fragment program, as I suspect the UV interpolation is causing the distortion you mention).
Also, as an addition to this, you might want to offset/scale the UVs so you sample in the 0-1 range rather than -1 to 1.
Shader "camera project" {
Properties {
_Color ("Main Color", Color) = (1,1,1,0.5)
_MainTex ("Base (RGB)", 2D) = "white" {}
}
Category {
Blend AppSrcAdd AppDstAdd
Fog { Color [_AddFog] }
// ------------------------------------------------------------------
// ARB fragment program
SubShader {
Pass {
Tags { "LightMode" = "Always" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct v2f {
float4 pos:POSITION;
float4 uvproj;
};
float4 _MainTex_ST;
v2f vert (appdata_base v) {
v2f o;
o.pos = mul( glstate.matrix.mvp, v.vertex );
o.uvproj = o.pos;
return o;
}
uniform sampler2D _MainTex;
float4 frag (v2f i) : COLOR
{
i.uvproj /= i.uvproj.w;
half4 color = tex2D(_MainTex,i.uvproj.xy);
return color;
}
ENDCG
}
}
}
FallBack "VertexLit"
}
Edit: Replaced lightmode with “always” so the object will be visible even if it is not hit by a pixel light.
Thanks ToreTank, the w divide fixed the UV interpolation distortion.
And yes, the texture was being repeated twice across the width and height of the screen. I added this after your w divide line:
i.uvproj = (i.uvproj + 1) * 0.5;
And now it places the texture at the single width by height.
ap
Hey, I like this. I’ve added priceap’s line to ToreTank’s code, and put in support for Unity’s scaling/tiling values, as well as the main colour. Both were declared before, but unused.
I’ve also taken the liberty of giving it a proper name, and cleaning it up because I can’t help myself.
Shader "ViewportTexture" {
Properties {
_Color ("Main Color", Color) = (1,1,1,0.5)
_MainTex ("Base (RGB)", 2D) = "white" {}
}
Category {
Blend AppSrcAdd AppDstAdd
Fog { Color [_AddFog] }
SubShader {
Pass {
Tags { "LightMode" = "Always" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct v2f {
float4 pos : POSITION;
float4 uvproj;
};
float4 _MainTex_ST;
v2f vert (appdata_base v) {
v2f o;
o.pos = mul( glstate.matrix.mvp, v.vertex );
o.uvproj.xy = TRANSFORM_TEX(o.pos, _MainTex);
o.uvproj.zw = o.pos.zw;
return o;
}
uniform sampler2D _MainTex;
uniform float4 _Color;
float4 frag (v2f i) : COLOR {
i.uvproj /= i.uvproj.w;
i.uvproj = (i.uvproj + 1) * 0.5;
half4 color = tex2D(_MainTex,i.uvproj.xy);
return color*_Color;
}
ENDCG
}
}
}
}
Looks good Daniel! So now we should wait to hear from snicholls whether he finds it useful before we add anything more to it. Without direction, we might end up with a water shader or something.
Totally forgot about this post, lol. Erk, thanks a lot for this, I will be trying it in a few hours when I get back!
This community is awesome…
Thank you so much for this guys, I was just looking for an example of this myself. Looks like the work has been done for me already.
I think this shader could be used for a classic cartoon hand-painted background look.
Shouldn’t this be done in the vertex shader instead?
It looks painful to do all these calculations for each pixel instead of for each vertex!
You can (try it out!), but you’ll get distortion as the graphics card interpolates the values non-linearly. Vertex data interpolators do perspective correction, which is exactly what you want for 3D data, but works against us in this case. Check out this article for details.
Doing the perspective division in the vertex shader will work if you use an orthographic camera, or if your object’s polygons are parallel with the camera’s frustum, but otherwise you’ll get wobbliness.
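For reference, here's roughly what that vertex-only variant would look like, assuming an orthographic camera or a screen-aligned quad. It's an untested sketch of the shader above with the divide and the 0-1 remap moved into the vertex program, and the shader name is just a placeholder:

Shader "ViewportTexture (Vertex Divide)" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Pass {
            Tags { "LightMode" = "Always" }
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f {
                float4 pos : POSITION;
                float2 uv : TEXCOORD0;
            };

            uniform sampler2D _MainTex;

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul( glstate.matrix.mvp, v.vertex );
                // Divide and remap per vertex. The interpolator perspective-corrects
                // these values across each triangle, which is why this only looks
                // right with an orthographic camera or screen-parallel polygons.
                o.uv = (o.pos.xy / o.pos.w + 1) * 0.5;
                return o;
            }

            float4 frag (v2f i) : COLOR {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}

It's cheaper, but with a perspective camera you'll see the triangle-edge wobble that started this thread.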
:o I got served!
I wasn’t thinking of using this on anything else besides a plane facing the cam. Aka a billboard.
If this is pixel-perfect, it opens some pretty shiny doors!
Is there a way of getting this to work on the iPhone at all?
With Unity 3, and only on the 3GS or later, if you build for armv7 + OpenGL ES 2.0. Then there is at least a chance it will work.
EDIT: Nevermind, the shader does compile!
Although an additional question: Can anyone think of a way to scale the projected texture to the objects bounding box, instead of the screen?