Today I found an article about how NASA used Blend4Web for their WebGL Curiosity Experience. I was quite surprised by how well it ran, and even more so that it works on so many platforms, even on mobile browsers. Given the current state of Unity's WebGL solution, I'm a bit disappointed. Building to WebGL takes a lot of time, on IE I get all sorts of warnings and the audio won't work, and not to mention each browser gives me different performance. Even the load times are amazingly smooth compared to Unity's, which I really don't get when loading a standard simple cube scene with Unity.
Don't get me wrong, I'm really happy Unity made WebGL possible, but I think Unity is way behind what they're doing here. I know Blend4Web might not support every kind of feature, but I don't care. They have a smooth, worry-free WebGL solution that works on so many levels, and wait… even audio, and no annoying warnings.
Not to mention… it's open source, and it does a better job at delivering a smooth experience than a company the size of Unity Technologies. What's happening, guys, where is the progress? What Unity does in a year, they ship in updates almost weekly. I don't blame anyone, but the pace worries me a lot after seeing this NASA application.
Well, it's a rather crucial issue, because it's almost September, when Google Chrome will switch off the Web Player. Something like 60–70% of Web Player users are on Chrome. I do understand that there is collaborative work going on to set up new WebGL standards, but…
I agree with that. At the last Unity conference in Amsterdam they showed the engine demo running in WebGL. I understand people can achieve a nice graphics level with it at the moment, but what's the use when only two or three browsers support it? WebGL should be fine on a lot of platforms, but the current state only lets us run Unity demos on Chrome and Firefox, while IE works… full of warnings. And when I check this NASA demo, it runs on so much more than we get with Unity. I'd rather go with fewer features and more platform support / fewer warnings than use it in its current state.
It definitely needs an update before the poo hits the fan.
Yes, people can get further with WebGL reach (i.e., get it to run on more browsers/devices) by using a smaller engine, and thus having a smaller codebase for the browser to cope with. But saying that Unity should cut features to get there is really oversimplifying it. Yes, we could make a small WebGL engine which would run in many places. But it would not be compatible with, or resemble, "Unity" in any way you know it. At which point you may just as well use any of the already existing non-Unity WebGL engines out there.
We are investing in modularizing Unity more, so that [WebGL] builds can ship with only the pieces of the engine they need. But that is still very different from engines which are really just "glorified model renderers" (I don't mean this to sound condescending; there are a lot of good use cases for those).
Today I read an article someone wrote about why NASA started to use the Blend4Web WebGL solution instead of continuing with Unity WebGL or the Web Player.
Of course the guy who wrote it has his own clear view on why he's not using the Unity WebGL solution, and he may be a little upset about why it is as slow as it currently is. A lot of in-depth information is missing, and it definitely isn't NASA's opinion on why they use Blend4Web. Maybe the person just had a bad day and decided to write about it.
But what I’m really curious about is NASA’s real opinion.
Blend4Web is WRITTEN in JavaScript and uses many of its specific features (especially asynchronous calls and Web Workers), while Unity is just generated asm.js code, and for now can't even use multithreading (I hope this will change really soon, because right now skinning/physics/audio suffer a lot from this).
As an experienced game programmer (15+ years of making games), I personally think that using JavaScript is not the best idea for browser games from the PERFORMANCE point of view. But since the security point of view wins over the performance point of view, we must accept that something that used to run at 60 FPS on an old Intel Atom machine with 1 GB of RAM is now impossible to hold at a steady 30 FPS on a way faster multicore i5 with 8 GB of RAM. From my perspective we have two choices now: 1) live with it and hope that over time JS becomes more performant, or 2) drop browser gaming and switch to native (for many, this is probably not an acceptable solution from a marketing point of view).
Current "multithreading" in JS is rather limited, because Web Workers cannot share memory. This makes existing multi-threaded code impossible to map to JS, and makes many multithreaded algorithms impossible to implement. This will change in the future when browsers implement Shared Array Buffers, which will bring multi-threading with shared memory to JS. Mozilla has benchmarked Unity in internal Firefox test builds with this feature enabled, and got very significant performance boosts on multi-core machines in areas like physics or AI, which benefit from multithreading.
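To make the shared-memory point concrete, here is a minimal sketch in plain JavaScript (not Unity's generated code, and deliberately single-threaded for brevity) of what a SharedArrayBuffer gives you: multiple typed-array views aliasing the same bytes, with Atomics for safe concurrent access. In a real setup the buffer would be posted to a Web Worker, which would then see the same memory instead of a copy.

```javascript
// Minimal sketch: a SharedArrayBuffer is one block of memory that multiple
// views (and, via postMessage, multiple workers) can read and write directly.
const sab = new SharedArrayBuffer(8);   // 8 bytes of shared memory
const viewA = new Int32Array(sab);      // two views over the SAME bytes
const viewB = new Int32Array(sab);

Atomics.store(viewA, 0, 42);            // atomic write through one view
console.log(Atomics.load(viewB, 0));    // 42 — the other view sees the write

// With a plain ArrayBuffer, worker.postMessage(buf) copies (or transfers)
// the memory; with a SharedArrayBuffer, the worker aliases the same bytes,
// which is exactly what shared-memory multithreading in JS relies on.
```

This is why asm.js code compiled from multi-threaded C++ (like Unity's physics) needs SharedArrayBuffer: pthreads assume all threads address one heap, which plain Web Workers cannot provide.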