Hi there,

I’m sorry if what I’m about to ask is a newbie mistake, but I’m currently working on a 360° virtual tour. It works by mapping an equirectangular panorama image onto a sphere object and turning it inside out, so that the image faces the inside of the sphere. To do that, I found a shader that takes the sphere’s surface normals as input and projects them onto the equirectangular image. The formulas used in that shader are shown below:

```
float3 a_coords_n = normalize(a_coords);
float lon = atan2(a_coords_n.z, a_coords_n.x);
float lat = acos(a_coords_n.y);
float2 sphereCoords = float2(lon, lat) * (1.0 / PI);
return float2(1 - (sphereCoords.x * 0.5 + 0.5), 1 - sphereCoords.y);
```

From that code I know the formulas for longitude and latitude. What confuses me is why the longitude is normalized to the (0 to 1) range while the latitude is normalized to the (-0.5 to 0.5) range. Why are the ranges different? Why not normalize them both to (0 to 1)?

Thanks!!!

Well, both coordinates are in the range 0 to 1. Why do you think otherwise? They use acos for the latitude, which maps the -1 to 1 input range to 180° to 0° (or “PI to 0”). Since they divide by PI you get a “1 to 0” range, and the final “1 -” flips it to 0 to 1.

For the longitude they use atan2, which returns a value between -PI and PI. They also divide by PI to get -1 to 1. Then they multiply by 0.5 to get -0.5 to 0.5, shift by 0.5 to get 0 to 1, and finally flip that with “1 -” as well.
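If it helps to see the arithmetic, here is a small Python sketch of just the longitude half of that mapping (the function name is mine, not from the shader):

```python
import math

def lon_to_u(x, z):
    # atan2 returns a value in (-pi, pi]
    lon = math.atan2(z, x)
    s = lon / math.pi            # now in (-1, 1]
    u = 1 - (s * 0.5 + 0.5)      # scale/shift to (0, 1], then flip
    return u

# direction along +x: lon = 0, so u lands in the middle of the image
print(lon_to_u(1.0, 0.0))  # 0.5
```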

So it does convert them as expected. Do you have any issues with that code?

Note that the “lat” calculation is not really the latitude, as it doesn’t go from -90° to 90° (or -PI/2 to PI/2); that would require “asin”, but they used “acos”. As you may know, cosine is shifted by 90° compared to sine, so acos directly yields a 0 to PI range, which after the divide by PI becomes the 0 to 1 range they needed.
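The acos and asin forms give the same v coordinate in the end, just via a shifted range. A quick Python check of that equivalence (function names are mine):

```python
import math

def v_acos(y):
    # the shader's form: acos(y) is in [0, pi], so acos/pi is in [0, 1]
    return 1 - math.acos(y) / math.pi

def v_asin(y):
    # equivalent form using the true latitude asin(y) in [-pi/2, pi/2]
    return math.asin(y) / math.pi + 0.5

for y in (-1.0, -0.5, 0.0, 0.5, 1.0):
    assert abs(v_acos(y) - v_asin(y)) < 1e-12
```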

Hi Bunny83,

Thank you for your explanation. I now understand that I mixed up the range acos returns with the range of the latitude (-90° to 90°). But now that I know “lat” ends up in the 0 to 1 range, I got a bit lost on “lat” not being the latitude.

From what I understand, this shader is using the equirectangular projection: it takes each normal on the sphere’s surface, converts it to a UV coordinate, and samples the color at that UV coordinate from the equirectangular image.

Now, correct me if I’m wrong, but what I know is that to get the UV coordinate with the equirectangular projection, u corresponds to the longitude and v to the latitude.

If the “lat” in that shader isn’t the latitude, is the formula in that shader wrong? Also, is the reason the shader maps everything to the 0 to 1 range that UV coordinates also run from 0 to 1?
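To convince myself that both coordinates really land in 0 to 1, I tried a quick Python port of the shader math (the function name and test directions are mine):

```python
import math

def dir_to_uv(x, y, z):
    # Python port of the shader's mapping, for a direction vector
    n = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / n, y / n, z / n
    lon = math.atan2(z, x)            # (-pi, pi]
    lat = math.acos(y)                # [0, pi] -- a "colatitude", not latitude
    u = 1 - (lon / math.pi * 0.5 + 0.5)
    v = 1 - lat / math.pi
    return u, v

# straight up maps to the top row of the image (v = 1);
# u is arbitrary at the pole
print(dir_to_uv(0.0, 1.0, 0.0))
```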

To give you a little more detail, I’ll put the complete code below:

```
Shader "Custom/Equirectangular" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Diffuse (RGB) Alpha (A)", 2D) = "gray" {}
    }

    Pass {
        Tags {"LightMode" = "Always"}
        Cull Front

        CGPROGRAM
        #pragma vertex vert
        #pragma fragment frag
        #pragma fragmentoption ARB_precision_hint_fastest
        #pragma glsl
        #pragma target 3.0

        #include "UnityCG.cginc"

        struct appdata {
            float4 vertex : POSITION;
            float3 normal : NORMAL;
        };

        struct v2f
        {
            float4 pos : SV_POSITION;
            float3 normal : TEXCOORD0;
        };

        v2f vert (appdata v)
        {
            v2f o;
            o.pos = UnityObjectToClipPos(v.vertex);
            o.normal = v.normal;
            return o;
        }

        sampler2D _MainTex;

        #define PI 3.141592653589793

        // note: the function signature got lost in my paste; restored here
        float2 ToRadialCoords(float3 a_coords)
        {
            float3 a_coords_n = normalize(a_coords);
            float lon = atan2(a_coords_n.z, a_coords_n.x);
            float lat = acos(a_coords_n.y);
            float2 sphereCoords = float2(lon, lat) * (1.0 / PI);
            return float2(1 - (sphereCoords.x * 0.5 + 0.5), 1 - sphereCoords.y);
        }

        float4 frag(v2f IN) : COLOR
        {