Continuation of 'Any Plans for Supporting Large Worlds'

From Any plans for supporting Large Worlds? - Unity Forum

@alsharefeeee

I came across your post and it’s definitely a worthwhile topic of discussion, as many users are interested in large world coordinate support. Unreal’s support for large coordinates is now enabled by default.

Having used it briefly, my understanding is that it uses native 64-bit values on the CPU, which are then passed to the GPU as two 32-bit Vector3 values. The first vector is a grid position, and the second is a position within that grid. In terms of performance, the overhead appears to be minimal, and it works in a way that is mostly transparent to the end user.
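If it helps to picture it, here is roughly the idea in C#. This is purely illustrative; the names and cell size are mine, not Unreal's actual API:

```csharp
// Illustrative sketch only (not Unreal's actual API): one way to split a
// double-precision world position into a 32-bit cell index plus a small
// float offset inside that cell, which is what gets handed to the GPU.
public static class LargeCoords
{
    public const double CellSize = 2048.0; // assumed tile size

    public static void Split(double x, out int cell, out float local)
    {
        cell  = (int)System.Math.Floor(x / CellSize);
        // The remainder is always in [0, CellSize), so a 32-bit float keeps
        // sub-millimetre precision no matter how far from the origin we are.
        local = (float)(x - cell * CellSize);
    }
}
```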

I believe it would be a great addition to Unity, especially for procedurally generated worlds.

While the floating camera design works, it is substantially more difficult to implement than Unreal’s solution.

Honestly, your post received undue criticism. Thanks for sharing your thoughts on the importance of this capability.

1 Like

Oh thank you @stonstad. For some reason some of Unity’s hardcore fans hate the Large Worlds support idea, and I guess this says a lot about the older generation of Unity developers whose main focus was 2D and mobile (I am one of them, my focus was mobile).

By the way, since my post the Godot engine seems to have somewhat implemented LW support. That means almost all major game engines now support LW to some degree. I also discovered that DOTS does indeed support Large Worlds (with a lot of bugs, of course), and we all know that DOTS is hard to use compared to normal Unity. Though, it does seem to be becoming easier. Hopefully in the near future @Joachim_Ante_1 will consider adding it, maybe as a package for GameObjects!

For now I just decided to work with the smaller areas of my project until the day when me :slight_smile: + DOTS + Unity are ready for LW.

What I didn’t like is the mentality of some: “If I am not going to use LW, then I don’t want any other Unity developers to use it either.” They are afraid that it will be too heavy on the current Unity, completely forgetting that it could be a package for those who want it.

Oh, and I did as @hippocoder advised and tried out Unreal Engine 5, and I hated it, probably because I have been a Unity guy for more than 10 years. But if I were younger and had very little experience in Unity, I would definitely have jumped ship, just like I did with UDK and CryEngine many years ago.

https://discussions.unity.com/t/902530

That’s not it. Having double precision and all that jazz requires deep engine changes and duplicate code paths inside the engine, because many of the platforms Unity supports can never support huge worlds. The implementation is long and messy, and all those man-hours could be spent on features more people would use, often more basic features, like a decent animation system and whatnot.

3 Likes

The 50 km stated officially by Unity is way above what is actually functional. I would say it starts to jitter at around 2 km, though that is in first-person VR.

I thought Unity already did it O.O This is pretty standard.

1 Like

I initially thought the same. But Unreal’s approach of passing two Vectors instead of one … doesn’t seem like a deal breaker for most platforms. I understand what you are saying about Unity’s delivery cadence and prioritization of features.

The needs vary based on the game. I am currently working on an open world, 3rd person HDRP project in Unity, and I can get pretty far from the origin without any jitter. I have not tested at 50 km, but it would not surprise me if that mostly worked. My map is 8x8 km.

By contrast, I worked on a URP first-person VR game in Unity a few years ago, and the jitter was a serious problem past about 1-2 km. I resolved that issue by stacking the camera and setting up a VR cockpit camera that stayed at the origin (0,0,0) and simply rotated to match the rotation of the spaceship as it flew around.
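In code it was essentially this (a trimmed-down sketch, the names are made up), with a second, stacked camera rendering the distant world behind the cockpit layer:

```csharp
using UnityEngine;

// Minimal sketch of the stacked-camera trick described above: the cockpit rig
// stays at the scene origin and only copies the ship's rotation, so everything
// the VR player sees up close is rendered with tiny, stable coordinates.
public class CockpitCameraRig : MonoBehaviour
{
    public Transform ship; // the real ship, possibly thousands of units away

    void LateUpdate()
    {
        transform.position = Vector3.zero;   // cockpit geometry never leaves the origin
        transform.rotation = ship.rotation;  // but it banks and turns with the ship
    }
}
```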

One thing to note is that a lot of 3rd party assets for building open worlds in Unity already support a floating origin. Terrain building, world streaming, and even vegetation rendering systems all support floating origin. Game developers can build large open worlds in Unity right now without waiting for a 64-bit Vector3.
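The core idea behind those assets is simple enough to sketch; the real packages handle far more edge cases than this:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Bare-bones floating origin sketch. The assets mentioned above ship far more
// robust versions of this; it is only here to show the core idea.
public class FloatingOrigin : MonoBehaviour
{
    public Transform player;
    public float threshold = 1000f;   // how far the player may drift before we shift

    void LateUpdate()
    {
        Vector3 offset = player.position;
        if (offset.magnitude < threshold)
            return;

        // Move every root object back so the player is near (0,0,0) again.
        foreach (GameObject root in SceneManager.GetActiveScene().GetRootGameObjects())
            root.transform.position -= offset;

        // Real implementations also notify particles, trails, physics velocities,
        // and streaming/vegetation systems about the shift.
    }
}
```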

3 Likes

Did you run into precision problems with either Allspace or Disputed Space? In a space sim, I’m seeing precision issues at around 5000 units, when a player walks across the bridge of a starship in flight. I’m just curious to understand what your experience was.

I see problems already at 2 km. It’s a multiplayer game, so I don’t think there is an easy fix. We are on the Built-in Render Pipeline, but we tried a POC on HDRP that mitigated some of the jitter, though not all of it. I guess it’s because of single precision on transforms and PhysX.

1 Like

And if you want to add multiplayer to it?

2 Likes

Yepp, we need native origin shifting. I wonder if Rust has solved it. Maybe they have smaller worlds than 2 km?

Another issue is that most third party assets use different, or poor, systems for performing origin shifting. For instance, Vegetation Studio uses a reference Transform object to measure the origin offset, which will eventually feed large floating point numbers into its calculations. I’m not positive, but I would imagine this introduces errors at some point.

Without 64-bit numbers, in my opinion the best approach is to use a grid based system where every object in the scene has a coordinate (representing the cell on the grid that it falls within) and a relative location in reference to that same cell. The coordinate positions can be stored using 64-bit values. Then you can origin shift by changing the “origin” cell on the grid. There are basically three different types of position values in this type of system:

  • Unity Scene Relative position used to position objects.
  • Grid Relative position (in reference to whatever cell the object falls within on the grid).
  • Raw position of objects in reference to the total size of the game map.

Converting between the three types of positions is relatively easy.
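A rough sketch of what I mean (the names and cell size are just examples):

```csharp
using UnityEngine;

// Sketch of the grid scheme described above (names and cell size are illustrative).
public struct GridPosition
{
    public const double CellSize = 1000.0;

    public long cellX, cellZ;   // which cell of the whole map (64-bit, never degrades)
    public Vector3 local;       // grid-relative position inside that cell (small floats)

    // Raw map position, for saving, AI queries, long-range distances, etc.
    public double RawX => cellX * CellSize + local.x;
    public double RawZ => cellZ * CellSize + local.z;

    // Unity scene-relative position, measured from whichever cell is the current origin.
    public Vector3 ToScene(long originCellX, long originCellZ)
    {
        return new Vector3(
            (float)((cellX - originCellX) * CellSize) + local.x,
            local.y,
            (float)((cellZ - originCellZ) * CellSize) + local.z);
    }
}
```

Origin shifting is then just a matter of picking a different origin cell and re-running the conversion for everything in the scene.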

2 Likes

Yeah, we’ve got maps of similar size, also with no noticeable physics jitter. Our scale is pretty big, though, by which I mean the player is controlling and interacting with objects bigger than a person. I imagine if we dropped a VR rig into the edge of one of our worlds it might not appear so stable any more!

Our world overall is closer to 50x50km, but is designed such that we can just origin shift when transitioning to a new area. Of course that solution wouldn’t work for all game types, though.

Calculate the relative difference between the two players’ respective origins and add that offset when you transfer a position across the network, to translate it from the world in one game instance to the other.
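Something along these lines (a sketch, assuming each client keeps its scene origin in doubles; the names are made up):

```csharp
using UnityEngine;

// Sketch of the idea above. Assumes each client tracks its own scene origin in
// absolute (double) world coordinates; the method and parameter names are made up.
public static class NetworkOrigin
{
    // Convert a position received from another client into this client's scene space.
    public static Vector3 ToLocalScene(Vector3 remoteScenePos,
                                       double remoteOriginX, double remoteOriginZ,
                                       double localOriginX, double localOriginZ)
    {
        // The difference between the two origins stays small as long as the players
        // are anywhere near each other, so it fits safely in floats.
        float dx = (float)(remoteOriginX - localOriginX);
        float dz = (float)(remoteOriginZ - localOriginZ);
        return remoteScenePos + new Vector3(dx, 0f, dz);
    }
}
```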

By the way, for a part of the whole issue - the part that cannot be worked around so “easily” with the origin-shift trick - Unity has introduced a feature to HDRP: https://docs.unity3d.com/Packages/com.unity.render-pipelines.high-definition@7.1/manual/Camera-Relative-Rendering.html

1 Like

Both of those needed a stacked camera to eliminate the flicker in the cockpit. Allspace included a VR option, and the VR cockpit jittered a lot without the stacked camera solution. With a stacked camera and the VR cockpit at the origin, there was no jitter at all.

With Disputed Space, I ran into a movement problem at one point caused by precision. I wrote my own movement controller for all of the ships, so I could get the kind of movement I wanted for that game. When I ran into a precision issue with that movement system, I rewrote some of my movement code. Basically, I re-ordered the math operations so less precision would be lost. After I did that, the movement was fine even at really large distances.
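As a toy example of that kind of re-ordering (not my actual code):

```csharp
using UnityEngine;

// Toy illustration of re-ordering math to lose less precision.
public class ForwardMover : MonoBehaviour
{
    public float speed = 5f;
    float startX;
    float elapsed;

    void Start() => startX = transform.position.x;

    void Update()
    {
        // Lossy ordering: transform.position += Vector3.right * (speed * Time.deltaTime);
        // At x around 150,000 a per-frame step of a few millimetres rounds away
        // entirely and motion stalls.

        // Re-ordered: keep the running total small, then apply it to the big value once.
        elapsed += Time.deltaTime;
        Vector3 p = transform.position;
        p.x = startX + speed * elapsed;   // one rounding, instead of thousands of tiny ones
        transform.position = p;
    }
}
```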

Neither of my space games has a bridge for players to walk around on. Animation systems have problems doing math on skeletons at large distances from the origin. Even a simple idle animation on a typical humanoid can get jittery after some distance. If you need to see humans walking around inside your spaceship, then I would suggest placing the ship at the origin and moving everything else around relative to that ship. That way, your humanoid animations run near the scene origin.

2 Likes

You can write your own code to do whatever you want on the client and server of any game you develop. For a huge open world multiplayer game, you will likely need to build your own custom server solution to handle everything. I don’t know if any off-the-shelf multiplayer solution would handle it. Even if some existing off-the-shelf multiplayer code could do it, who knows if it would be performant in the exact ways your game needs. Each multiplayer game has unique needs, so off-the-shelf multiplayer code is usually not the best idea.

1 Like

One thing to keep in mind when talking about camera-relative rendering in HDRP is that it helps with static objects but not with animated things. HDRP can do a great job with mountains, buildings, rocks, and so on. But when animated people and creatures are running around, the animation system itself can still jitter. What this means is that origin shifting is usually still needed, even with HDRP, if you want animated characters on a 50x50 km map. On an 8x8 km map, origin shifting is not really needed.

2 Likes

Have you actually seen any problems using Vegetation Studio with a floating origin via Gaia or World Streamer? I use Vegetation Studio Pro. I have not seen any problem like that yet. But I have not intentionally tried to cause the problem and then carefully measured to see if anything moved. I am interested to hear if anybody has seen that issue.

I am currently doing an 8x8km island surrounded by water, and then the player character is normal sized. I have messed around with and without origin shifting, and it was not a benefit for my 8x8km map. On a 50x50km map, I can definitely see a huge benefit for origin shifting.

1 Like

After 999 meters you only get about 3 decimal digits of precision. Is your game third person? Even if you use HDRP with camera-relative rendering, I can’t see how physics and transforms can be smooth at 8 km.
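For a rough sense of the numbers (a back-of-the-envelope sketch; the jitter you actually see also depends on the physics solver and animation):

```csharp
using UnityEngine;

// Back-of-the-envelope float step size at a given distance from the origin.
// A 32-bit float has a 23-bit mantissa, so the representable step is roughly
// distance / 2^23: about 0.06-0.12 mm at 1 km, 0.5-1 mm at 8 km, roughly 4 mm at 50 km.
// Fine for distant terrain, but visible as jitter on a close-up first-person or VR camera.
public static class FloatPrecision
{
    public static float ApproxStep(float distance)
    {
        int exponent = Mathf.FloorToInt(Mathf.Log(distance, 2f));
        return Mathf.Pow(2f, exponent - 23);
    }
}
```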