Hi all. I need to use MaterialPropertyBlock for UI Images. How can I handle that?
I have a lot of square UI images that each have different properties (a Color property, for example, or another one). Thanks.
You can’t use material property blocks with UI. What is it you want to change? Color is better changed through the UI components than an MPB; the color is placed into the mesh as vertex colors. If it’s just lots of squares, then changing the sprite color is perfectly fine. You could also write a custom component that puts the color into the vertices yourself if it needs to be more complex.
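For what it’s worth, here is a minimal sketch of such a component, assuming Unity’s standard BaseMeshEffect/VertexHelper API (the class name and field are made up for the example):

using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: stamps a per-element color into the vertex colors,
// so a shader can read it without needing a per-instance material.
public class VertexColorWriter : BaseMeshEffect
{
    [SerializeField] Color m_color = Color.white;

    public override void ModifyMesh(VertexHelper vh)
    {
        if (!IsActive())
            return;

        UIVertex vert = new UIVertex();
        for (int i = 0; i < vh.currentVertCount; i++)
        {
            vh.PopulateUIVertex(ref vert, i);
            vert.color = m_color; // overwrite the tint baked in by the Graphic
            vh.SetUIVertex(vert, i);
        }
    }
}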
Thanks. I have written a custom shader; the property is not color. I know a UI Image has a Color property.
It’s four floats (a Vector4: x and y are the bottom-left corner, plus width and height). I need to mask a texture onto these squares and overlay them.
I have almost 200 squares and can’t afford 200 draw calls for that.
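(For context, the naive alternative that produces those draw calls looks something like this sketch; the field names are made up, and the point is that each cloned material breaks batching:)

using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of the naive approach: cloning the material per Image
// breaks batching, so ~200 squares cost ~200 draw calls.
public class PerInstanceMaterialExample : MonoBehaviour
{
    [SerializeField] Material m_sharedMaterial; // material using the custom shader
    [SerializeField] Image[] m_squares;         // the ~200 square images

    void Start()
    {
        foreach (var image in m_squares)
        {
            var mat = new Material(m_sharedMaterial);        // per-instance copy
            mat.SetVector("_Rect", new Vector4(0, 0, 1, 1)); // per-square data
            image.material = mat; // each unique material becomes its own draw call
        }
    }
}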
What about putting the data into a different channel such as UV1 or UV2? The Canvas allows adding some extra channels. I have done something similar in the past, using UV1 and a custom sprite to write into the channel.
Perfect, I understand it now.
Can you point me to a link about that?
So I set up a uv2 channel as a float4 (float4 uv2 : TEXCOORD2). My data are between 0 and 1, but I don’t think an RGBA texture would have enough precision.
So the idea is to put the data into a vertex channel instead of sending it to the shader with SetVector.
You can create a new BaseMeshEffect subclass and put the data into the verts, like the example code posted further down the thread.
Then make sure you have enabled the channel on the Canvas, under Additional Shader Channels.
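If it helps, the same flags can also be set from script instead of the inspector; a minimal sketch using Canvas.additionalShaderChannels:

using UnityEngine;

// Enables the extra UV channels on a Canvas from code, equivalent to
// ticking TexCoord1/TexCoord2 under Additional Shader Channels.
public class EnableExtraChannels : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.additionalShaderChannels |=
            AdditionalCanvasShaderChannels.TexCoord1 |
            AdditionalCanvasShaderChannels.TexCoord2;
    }
}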
Some questions:
1. Is the overridden ModifyMesh function executed only once?
2. Can we add different channels for different UI elements (in other words, can we add texcoords to specific UI elements rather than to the whole Canvas)?
It’s called whenever the canvas needs to generate a mesh. https://docs.unity3d.com/ScriptReference/UI.IMeshModifier.html
What do you mean by UI elements? You can modify selected elements; you don’t have to attach your mesh modifier to every element.
No. You said I must enable Additional Shader Channels on the Canvas, so I can only change that for the Canvas as a whole. If I want different channels for different images, for example, I need to enable every channel that any of them uses.
Yes, you need to make sure that the responsible Canvas has the additional channels enabled. Then you can modify its child images.
Again, thank you.
using UnityEngine;
using UnityEngine.UI;

// Writes this square's overlay rect (UV offset in uv1, UV scale in uv2)
// into every vertex, so the custom shader can sample the shared overlay
// texture without needing a per-instance material.
[AddComponentMenu("UI/Effects/Position As UV1", 16)]
public class PositionAsUV1 : BaseMeshEffect
{
    [SerializeField] RectTransform m_maskImageRectTr;
    [SerializeField] RectTransform m_imageRectTr;
    [SerializeField] int m_rowIdx;
    [SerializeField] int m_colIdx;
    [SerializeField] Vector2 m_margin;

    Vector4 vec;

    protected PositionAsUV1() { }

    protected override void Start()
    {
        base.Start();

        Vector2 size = m_imageRectTr.sizeDelta;
        Vector2 maskSize = m_maskImageRectTr.sizeDelta;

        // Component-wise ratio of this square's size to the mask texture's
        // size (Vector2 has no built-in Divide, so do it per component).
        Vector2 vv = new Vector2(size.x / maskSize.x, size.y / maskSize.y);

        // UV offset of this square inside the mask, derived from its grid cell.
        vec.x = ((m_colIdx - 1) * (m_margin.x + size.x) + size.x) / maskSize.x;
        vec.y = ((m_rowIdx - 1) * (m_margin.y + size.y) + size.y) / maskSize.y;
        vec.z = vv.x;
        vec.w = vv.y;

        // Force a rebuild so ModifyMesh runs again with the computed data.
        graphic.SetVerticesDirty();
    }

    public void SetRowCol(int _row, int _col)
    {
        m_rowIdx = _row;
        m_colIdx = _col;
    }

    public override void ModifyMesh(VertexHelper vh)
    {
        if (!IsActive())
            return;

        UIVertex vert = new UIVertex();
        for (int i = 0; i < vh.currentVertCount; i++)
        {
            vh.PopulateUIVertex(ref vert, i);
            vert.uv1 = new Vector2(vec.x, vec.y); // offset into the overlay
            vert.uv2 = new Vector2(vec.z, vec.w); // scale within the overlay
            vh.SetUIVertex(vert, i);
        }
    }
}
struct appdata
{
    float4 vertex : POSITION;
    float2 uv     : TEXCOORD0;
    float2 uv1    : TEXCOORD1; // overlay UV offset (written by PositionAsUV1)
    float2 uv2    : TEXCOORD2; // overlay UV scale  (written by PositionAsUV1)
};

struct v2f
{
    float2 uv     : TEXCOORD0;
    float2 uv1    : TEXCOORD1;
    float2 uv2    : TEXCOORD2;
    float4 vertex : SV_POSITION;
};

sampler2D _MainTex;
float4 _MainTex_ST;
sampler2D _OverlayTex;
float4 _Rect;
fixed4 _Color;
float _Factor;
float _Factor2;

v2f vert (appdata v)
{
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
    // Pass the per-element offset/scale through untouched.
    o.uv1 = v.uv1;
    o.uv2 = v.uv2;
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    // Sample the element's own texture.
    fixed4 col = tex2D(_MainTex, i.uv);

    // Remap this square's UVs into its sub-rect of the shared overlay texture.
    float2 overlayUV = i.uv1 + i.uv * i.uv2;
    fixed4 col2 = tex2D(_OverlayTex, overlayUV);

    // Photoshop-style "overlay" blend, per channel.
    fixed4 finalCol = fixed4(
        col.r <= 0.5 ? 2 * col.r * col2.r : 1 - 2 * (1 - col.r) * (1 - col2.r),
        col.g <= 0.5 ? 2 * col.g * col2.g : 1 - 2 * (1 - col.g) * (1 - col2.g),
        col.b <= 0.5 ? 2 * col.b * col2.b : 1 - 2 * (1 - col.b) * (1 - col2.b),
        1);

    // Blend between overlay and straight multiply, then fade against the base.
    finalCol = lerp(finalCol, col2 * col, _Factor);
    finalCol = lerp(col, finalCol, _Factor2) * _Color;
    finalCol.a = col2.a;
    return finalCol;
}
Looks great. Happy to help
What is the justification for this obtuse restriction? Why doesn’t CanvasRenderer have material property blocks?
It arbitrarily limits UI elements, and it’s super unintuitive. I’m trying to write a custom UI component with a fairly complex shader that needs quite a few bits of data passed in, and that’s in addition to vertex colors and UVs.
I originally wanted to pass in several properties, but I’ve been banging my head on this for hours only to find out there is no way to pass in even a single color property on a per-instance basis. Encoding a uniform property value into additional UV channels is just silly and inefficient. And what if we wanted to pass in a second texture?
I need many instances of this component with differing data values, so I don’t want draw calls to scale with instance count.
+1 for supporting MaterialPropertyBlock with CanvasRenderer. MaterialPropertyBlock is an essential Unity workflow. Why should it be any different for UI components with custom shaders/materials?
Unity really should consider adding MaterialPropertyBlock support to CanvasRenderer. My game consists mostly of UI Text and Image elements that use CanvasRenderer. This would have a huge impact on my workflow.
Same problem here. This makes no sense at all; they could simply allow it to work. I have thousands of objects in a UI whose parameters I need to set dynamically, but there’s no performant way to do it by assigning parameters manually.
Just came across this limitation for the umpteenth time.
There might be good reasons for this, but with no explanation of why this is the case, it seems like such an arbitrary limitation. UI is such a great candidate for MPBs: 99% of the time, if I need a custom shader for a UI element, it’s going to be broadly the same with a few unique parameters. But nope. The “best” workaround I’ve found is squeezing things into the RGB of the element’s color tint (only good for floats), as that’s at least per-element.
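To illustrate that workaround (the class and method names here are made up for the example): write the parameters into Graphic.color from C#, and have the shader interpret the incoming vertex color as data rather than as a tint.

using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of the color-tint workaround: abuse Graphic.color to
// carry up to four 0-1 floats per element. The shader must then treat the
// incoming vertex color as parameters, not as a tint (and note the values
// are quantized to 8 bits per channel on the way through the vertex buffer).
public class ColorChannelParams : MonoBehaviour
{
    public void SetParams(float a, float b, float c, float d)
    {
        GetComponent<Graphic>().color = new Color(a, b, c, d);
    }
}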
Having MaterialPropertyBlock support for UI would resolve a pile of headaches for many of us.
It would be really nice to have this in a future update.
What if I want to use a dissolve shader effect and change the dissolution level on one specific renderer? As it stands, that would require controlling the renderers canvas by canvas.
Just gonna pitch in my support for MaterialPropertyBlocks for UI. Unity’s UI has basically not changed since 4.6 in 2015, and yet UI is such a major part of any game or application. The kinds of projects we build at the company I work at are pretty much UI-only. Having to use a new material instance for each UI element that you want to do more with than just the basics is a real performance bottleneck. Hacking around this with additional shader channels is not a scalable or user-friendly solution.