Unity WebGL build runs fine on localhost and local servers, but not on Unity Play where it gets stuck at loading

Hi all, I am currently trying to upload my game to Unity Play, but I am having trouble getting it to load. The game loads fine with Unity's Build And Run, and running a live server via VS Code works as well.

However, after publishing with Unity's WebGL publish tool, loading gets stuck at around 90% and maxes out the browser's CPU.

These are my current build settings:



My current build is found here if you want to test it:
https://play.unity.com/en/games/925c9904-ad81-4e65-ae70-6c8957cecf34/pinball-battles

Hope someone is able to help :pray:

Same for me, and there's nothing notable in the console. Try making a development build and uploading that to see if it prints anything in the console. You may also want to add additional logging to narrow down where it gets stuck.

The way it behaves, it feels like the app enters an infinite loop the moment it starts up. So you may want to check what the first things your code does are, and whether any of that might behave differently on the web.

A common cause is the use of Task.Delay(), which is unsupported on the web. More generally, async/await code may behave differently in WebGL builds.
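As a sketch of what that kind of issue can look like (class and method names here are just illustrative, not from the poster's project), a Task.Delay-based startup delay can be replaced with a coroutine, which does not rely on threading and works on WebGL:

```csharp
using System.Collections;
using UnityEngine;

public class StartupSequence : MonoBehaviour
{
    // Risky on WebGL: Task.Delay depends on threading, which the
    // single-threaded web runtime does not support, so the await may
    // never complete (and a blocking .Wait() will freeze the tab).
    //
    // private async void Start()
    // {
    //     await System.Threading.Tasks.Task.Delay(1000);
    //     Init();
    // }

    // Safer: a coroutine with WaitForSeconds behaves the same on all
    // platforms, including WebGL.
    private IEnumerator Start()
    {
        yield return new WaitForSeconds(1f);
        Init();
    }

    private void Init()
    {
        Debug.Log("Initialized");
    }
}
```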

Thanks for the reply, appreciate your help with testing. Unfortunately, I did a development build earlier and it printed nothing different from a normal build. I thought it might be an infinite-loop issue, but the same thing happens even when I build just an empty scene, and the mystery is that the game runs normally on localhost but not on any server.

I am assuming it's something in my build settings, or a bug in this Unity version? I ran a test with an empty project and published it to Unity Play, and that scene was able to load after about a second of freezing.

I tried a few things, such as overriding the texture size to a max of 256 and enabling/disabling compression, but got the same result.

I think the most suspicious logs in the console are:

Found NO interfaces on host .
Hidden/CoreSRP/CoreCopy shader is not supported on this GPU (none of subshaders/fallbacks are suitable)
Hidden/Universal/HDRDebugView shader is not supported on this GPU (none of subshaders/fallbacks are suitable)

You should try stripping down the build (make a copy of the project) to see where the issue comes from. If you create an empty scene in your existing project and that doesn't work, it would indicate that something causing the issue still makes it into the build.

To be sure, create a new empty project and upload a build from that to see if it works. If it does, it's a project issue; if not, it's more likely an issue with Unity itself. Be sure to use the latest patch version (31f1), since patch releases mostly add bugfixes.


Hmm, thanks for giving me a direction, I will try it!

Followed your suggestion for debugging and managed to find the issue! Apparently there was an editor script from an SDK that added quite a bit of unexpected content to the build when building, causing the game to run into an infinite loop.

For future game developers who run into the same problem I faced: remember to check your project's scripts that are tied to the editor!
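For anyone hunting for such scripts: editor code can hook into the build pipeline via build callbacks, so searching your project (including imported SDK folders) for these interfaces is a good starting point. This is a hypothetical sketch of what such an SDK hook can look like, not the actual script from the project above:

```csharp
using UnityEditor.Build;
using UnityEditor.Build.Reporting;
using UnityEngine;

// Hypothetical example of the kind of SDK editor script to look for:
// classes implementing build callbacks run automatically during every
// build and can inject assets, scenes, or settings without warning.
public class ExampleSdkBuildHook : IPreprocessBuildWithReport
{
    public int callbackOrder => 0;

    public void OnPreprocessBuild(BuildReport report)
    {
        // Anything here executes at build time. Search your project for
        // IPreprocessBuildWithReport, IPostprocessBuildWithReport, and
        // [PostProcessBuild] to find hooks like this.
        Debug.Log("Build preprocess hook running for " + report.summary.platform);
    }
}
```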

Thank you CodeSmile for the help! :smiley: