Hello! I am trying to make a space game, and I have successfully implemented a Level Of Detail system where I subdivide an icosphere on a separate thread as the camera gets closer to the surface. However, when I scale the sphere up to real-scale Earth size (6,378 km) and position it at (0, -6378000, 0), the surface of the planet jitters as I move the camera around. What's weird is that when I turn on wireframe mode in the editor, I can see two wireframes: one that is moving, and one that is not. Stranger still, this surface movement only occurs when the camera's movement has a y component. Basically, the terrain seems to be fixed relative to the camera, and once the camera passes a certain threshold, the terrain "snaps" back to where it is supposed to be. I believe this is due to HDRP's camera-relative rendering using floating-point coordinates, but I am not entirely sure. I am planning on implementing a floating origin system of my own, so will disabling camera-relative rendering fix my problem?
It is to do with the limits of floating-point numbers, in this case single-precision (32-bit) floats, which is what Unity uses internally for transforms and mesh data. A 32-bit float carries only about 7 significant decimal digits, so at a coordinate magnitude of 6,378,000 the spacing between representable values is 0.5 m: every position near the surface gets quantized to half-metre steps, which is exactly the jitter and snapping you are describing.
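You can see the precision loss directly. Here is a small, self-contained Python sketch that round-trips values through IEEE-754 binary32 (the same format as Unity's `float`) to show what happens at Earth-radius coordinates; the helper names are mine, not anything from Unity:

```python
import struct

def to_f32(x: float) -> float:
    # Round-trip through IEEE-754 binary32 to emulate single-precision storage.
    return struct.unpack('f', struct.pack('f', x))[0]

def next_f32(x: float) -> float:
    # Next representable binary32 value above a positive finite x,
    # found by incrementing the raw bit pattern.
    bits = struct.unpack('I', struct.pack('f', x))[0]
    return struct.unpack('f', struct.pack('I', bits + 1))[0]

surface_y = 6_378_000.0  # camera distance from the world origin, in metres

# The gap between adjacent float32 values at this magnitude is half a metre:
print(next_f32(surface_y) - surface_y)       # 0.5

# So a 10 cm camera move is swallowed entirely when stored as float32:
print(to_f32(surface_y + 0.1) == surface_y)  # True
```

That half-metre quantization is why the surface appears locked to the camera until the accumulated movement crosses a representable step, at which point it snaps.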
Look into a floating origin; that's one approach to solving this issue. The idea is to keep the camera near (0, 0, 0) and periodically translate the entire world instead, so that every coordinate the renderer sees stays small enough for single-precision floats to resolve.