Hey guys,

I want to achieve the same transformation as the Effect in Photoshop “Polar Coordinates > Rectangular to Polar”. For example, a free HDRI looks like this when processed with the filter.

Before:

Target:

So I created a Shader Graph in URP and just plugged a UV Polar Coordinates node into the texture node. The output goes somewhat in the right direction, but is not quite correct.

Can you please teach me what a correct node setup would look like? Optionally, the output should be an even square rather than a rectangle.

Well, equirectangular coordinates are essentially polar coordinates. This is how our GPS coordinates work as well. So the x and y in the first image directly translate to the longitude / latitude angles.

The second image is an azimuthal projection: a radial projection where you use the x angle as the input for a direction vector (using sine and cosine of that angle) and use the y angle as the radius.
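To make that concrete, here is a rough Python sketch of the forward direction (equirect texel → position in the azimuthal image). This is just the math, not Unity code, and the function name `equirect_to_azimuthal` is mine:

```python
import math

def equirect_to_azimuthal(u, v):
    """Map a normalized equirectangular coordinate (u, v in 0..1) to a
    point in the square azimuthal image, centered at (0.5, 0.5).
    u becomes the angle around the center, v becomes the radius."""
    angle = u * 2.0 * math.pi   # x sweeps the full 360 degrees
    radius = v * 0.5            # y maps to the radius (0.5 = image edge)
    # direction vector built from sine / cosine of the angle
    x = 0.5 + math.cos(angle) * radius
    y = 0.5 + math.sin(angle) * radius
    return x, y
```

So every vertical line of the source image becomes a spoke of the circle, and every horizontal line becomes a ring.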

Of course you can’t fit a round projection into a rectangle. So you either have some leftover gaps or you need to zoom in and clip the result. Your image looks like the leftover corners are filled with a solid color. So the image is actually a perfect circle stretched over a rectangle, with the 4 corners filled with a constant color, maybe extrapolated from the border of the image.

To convert the first image into the second, all you need to do is set up a mapping / projection from one space into the other. In shaders you generally want to calculate, for each pixel / texel in the target space, the source coordinate. So you take the UV (x / y position) and offset it to the center (subtract 0.5 from both). Now you calculate the “atan2” of this new vector, which gives you the angle (the source x coordinate after proper scaling), and also calculate the magnitude / length of that vector, which gives you the radius (the source y coordinate).

In pseudo code it would be something like:

```
v = UV - 0.5;
v *= 2;
x = (PI + atan2(v.y, v.x)) / (2*PI);
y = clamp01(1 - length(v));
```

This should be all you need. Note that length(v) would be between 0 and 1 as long as the position is within the circle. However, since we also have to draw the corners, the length would be larger there (up to about 1.414). That’s why we simply clamp the result, so coordinates outside the projection just return the bottom edge of the source image. Instead of using clamp, you could use an if statement: if y is smaller than 0, just return a constant color instead.
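The pseudo code above, written out as a quick Python sanity check (plain math only, no Unity; the function name is mine):

```python
import math

def polar_to_equirect_uv(u, v):
    """Inverse mapping: for a texel (u, v in 0..1) of the square target
    image, return the source UV in the equirectangular image."""
    # offset to the center and scale so the circle spans -1..1
    vx = (u - 0.5) * 2.0
    vy = (v - 0.5) * 2.0
    # angle around the center -> source x, shifted into the 0..1 range
    x = (math.pi + math.atan2(vy, vx)) / (2.0 * math.pi)
    # distance from the center -> source y, clamped for the corners
    y = min(max(1.0 - math.hypot(vx, vy), 0.0), 1.0)
    return x, y
```

The center of the target maps to y = 1, the edge of the circle to y = 0, and the corners (length > 1) get clamped to 0 exactly as described above.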

Of course you would use the new x and y coordinates as the UV for the texture lookup. Depending on the wrap mode of the texture, the result may be shifted or cut off, though the x / y values my code calculates should be in the 0 to 1 range. If you need an offset somewhere (like the azimuth angle), that could be added in easily. You just have to make sure it wraps around correctly.
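One way to wrap such an offset correctly (a small Python sketch; in HLSL the built-in `frac` does the same thing, and the helper name `wrap01` is mine):

```python
import math

def wrap01(x):
    """Wrap a coordinate back into the 0..1 range, like HLSL's frac()."""
    return x - math.floor(x)

# rotating the azimuth by 90 degrees is an offset of 0.25 in UV space
u = 0.9
shifted = wrap01(u + 0.25)  # 1.15 wraps around to 0.15
```

Negative offsets wrap the same way, so the seam of the equirect image just moves instead of clamping.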

@Bunny83

Hey, thanks for your answer. Sadly I’m kind of a dork when it comes to shaders. I’ve only worked with node-based shader setups before.

I’m using this equirectangular shader at the moment, but how do I incorporate your pseudo code into it?

I think I have to alter the `fixed4 frag` function and change the `return texCUBE(_MainTex, unit);` line. But how would I do that? The third axis Z is throwing me off. If you don’t want to answer it directly, I would gladly appreciate it if you could point me to related tutorials or documentation.

Big thanks in advance

**EDIT:**

I tried editing it, but I don’t quite get the right result. Is it because I have a third dimension Z as well? I marked the section in the shader where I added your code snippet.

```
// Upgrade NOTE: replaced 'mul(UNITY_MATRIX_MVP,*)' with 'UnityObjectToClipPos(*)'
Shader "Conversion/CubemapToEquirectangular" {
    Properties {
        _MainTex("Cubemap (RGB)", CUBE) = "" {}
    }
    Subshader {
        Pass {
            ZTest Always Cull Off ZWrite Off
            Fog { Mode off }

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma fragmentoption ARB_precision_hint_fastest
            //#pragma fragmentoption ARB_precision_hint_nicest
            #include "UnityCG.cginc"

            #define PI 3.141592653589793
            #define TWOPI 6.283185307179587

            struct v2f {
                float4 pos : POSITION;
                float2 uv : TEXCOORD0;
            };

            samplerCUBE _MainTex;

            v2f vert(appdata_img v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord.xy * float2(TWOPI, PI);
                return o;
            }

            fixed4 frag(v2f i) : COLOR
            {
                float theta = i.uv.y;
                float phi = i.uv.x;
                float3 unit = float3(0, 0, 0);
                unit.x = sin(phi) * sin(theta) * -1;
                unit.y = cos(theta) * -1;
                unit.z = cos(phi) * sin(theta) * -1;
                // Edit added
                unit = unit - 0.5;
                unit *= 2;
                unit.x = (PI + atan2(unit.y, unit.x)) / (2 * PI);
                unit.y = clamp(1 - length(unit), 0, 1);
                // Edit added end
                return texCUBE(_MainTex, unit);
            }
            ENDCG
        }
    }
    Fallback Off
}
```