NoesisGUI: the definitive UI middleware, coming soon to Unity

At Noesis Technologies we are proud to present our recently announced NoesisGUI middleware. NoesisGUI is a vector-based user interface framework built on XAML, allowing the creation of powerful interfaces with advanced tools like Expression Blend or any other editor that can export to the XAML format (even your favorite text editor!).

The visual power that NoesisGUI delivers is astounding: every item can be animated, projected in 3D, and skinned, all using the flexible XAML language. UI elements can be laid out on top of your scene or integrated into the 3D world as a dynamic texture.
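For example, a button that fades and scales in when it loads can be described entirely in markup, in the same style that Blend produces. This is only a minimal sketch in standard WPF-style XAML; element coverage in NoesisGUI should be confirmed against its documentation:

<Grid xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation">
    <!-- The button starts transparent and slightly shrunk -->
    <Button Content="Start Game" Width="160" Height="48"
            Opacity="0" RenderTransformOrigin="0.5,0.5">
        <Button.RenderTransform>
            <ScaleTransform ScaleX="0.8" ScaleY="0.8"/>
        </Button.RenderTransform>
        <Button.Triggers>
            <!-- When the button is loaded, fade it in and scale it up -->
            <EventTrigger RoutedEvent="Button.Loaded">
                <BeginStoryboard>
                    <Storyboard>
                        <DoubleAnimation Storyboard.TargetProperty="Opacity"
                                         To="1" Duration="0:0:0.5"/>
                        <DoubleAnimation Storyboard.TargetProperty="RenderTransform.ScaleX"
                                         To="1" Duration="0:0:0.5"/>
                        <DoubleAnimation Storyboard.TargetProperty="RenderTransform.ScaleY"
                                         To="1" Duration="0:0:0.5"/>
                    </Storyboard>
                </BeginStoryboard>
            </EventTrigger>
        </Button.Triggers>
    </Button>
</Grid>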

Our solution is written entirely in C++, GPU accelerated, and designed to take advantage of multi-core processors, so NoesisGUI is FAST. But you don’t need to be a C++ expert: we are currently integrating it with Unity, so you will be able to use it easily from Unity scripts.

Please visit our website at noesisengine.com for detailed information about NoesisGUI and the presentation video.

Stay tuned to this thread and to our official Twitter account: @noesisengine

Can it be used in the web player?

Same question as above, but for iOS and Android as well.

As it’s C++, it sounds like Pro only.

I’ll take another look at this once 3.5 is out and we have seen the new GUI system in Unity.

Hi,
Many thanks for your comments.

Our first objective is to target Windows desktop for now, but we will work to reach other platforms as well.
The web player is currently not on our roadmap because plugins are not allowed there.

I’ll keep you informed of our progress.

The clean, well-established, and well-tooled XAML markup with vector-based rendering, done with a C++ back-end and a C# API? I am so there. Loving these Scaleform-like technologies :) Although I am concerned that the web player isn’t going to be targeted. I thought the web player did support plugins?

What price do you have in mind for this?

If the price is reasonable and it can be used in the web player, this would definitely replace any potential GUI solution (even a probable Unity one?) for me.
Waiting for the price.

Ooooooo, very smart! But as has been mentioned, it all depends on what Unity 3.5 does with GUI…

Any update on when this will be available?

Hi!
We are still working on porting our rendering backend to Unity. Once this is done we’ll open a beta testing program, so it will be possible to evaluate our GUI system very soon.

Regarding pricing, we have not made a final decision yet, but we are thinking about giving it to the people involved in our beta testing program for free.

As soon as I have updated information I will post it here.

Looks great… Looking forward to trying it.

I’m interested. ;)

This is looking very sexy; you definitely have my interest here. It would be nice to get a better idea of the workflow and how to implement it in our own game.

Hi,
The workflow is really simple.

On one side there are the XAML files created by the artist team. The XAML defines the visual appearance of the UI elements, the animations, the connections between them, and even the visual reactions to certain events. These files are passed to our XAML compiler, which creates an optimized resource that can be loaded and used inside the game.

On the other side there is the “code-behind” (as it is called in Microsoft terminology): the code that is executed in response to an event from a UI element. For example, when a button is pressed, a Button_Click function could be executed. This function is written in the native language of the engine (for example, C# in Unity), and the only connection with the XAML file is that the function keeps the same name.

For example, our button in the XAML file could be:

<Button Content="Click me!" HorizontalAlignment="Left" VerticalAlignment="Top" Width="75" Click="Button_Click"/>

And the code-behind file could have this method:

private void Button_Click(object sender, System.Windows.RoutedEventArgs e)
{
    // TODO: Add event handler implementation here.
}

With this approach, artists and programmers can work independently. Programmers can modify the code inside the Button_Click method and implement any logic in it, while the artist team can modify the XAML file, creating a visually complex button (using the skinning abilities of XAML, adding animations…). They only need to recompile the source XAML file to update the resource when it is done.
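To illustrate that skinning step, the artists could restyle the same button from the example above by overriding its control template directly in XAML, while the Button_Click handler stays untouched. This is only a rough sketch in standard WPF-style XAML; the colors and layout are invented for illustration:

<Button Content="Click me!" HorizontalAlignment="Left" VerticalAlignment="Top"
        Width="75" Click="Button_Click">
    <Button.Template>
        <!-- Replace the default look with a rounded, flat-colored border -->
        <ControlTemplate TargetType="Button">
            <Border Name="border" CornerRadius="8" Background="#FF2D89EF" Padding="10,4">
                <ContentPresenter HorizontalAlignment="Center" VerticalAlignment="Center"/>
            </Border>
            <ControlTemplate.Triggers>
                <!-- Darken the background while the mouse is over the button -->
                <Trigger Property="IsMouseOver" Value="True">
                    <Setter TargetName="border" Property="Background" Value="#FF1B5FA8"/>
                </Trigger>
            </ControlTemplate.Triggers>
        </ControlTemplate>
    </Button.Template>
</Button>

The Click attribute and the handler name are the only contract between the two files, so this kind of restyling never requires a code change.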

I can’t wait ;))) When, when? I want to try it :)

That’s interesting. A wrapped-up solution where object definition/design and in-game execution are separated like this would clean up a lot of stuff in game code… so it’s a real advantage.

Waiting for the price, and for performance benchmarks on mobile devices compared to other solutions. I understand mobile couldn’t afford vector-based triangulation, but I guess there should be an option to turn it on/off for mobile optimization?

P.S.: Isn’t there any way to integrate the objects without relying on C++ plugins? Like, I don’t know, an automatic conversion of [mesh + textures + C# scripts] into a prefab, as most custom Unity GUI solutions currently do?

This looks really good, I have to say. When will you guys go into beta (roughly), and will it be a public or closed beta?

Hi!
We’ve been very busy these days preparing the launch of the beta for the native C++ version of our UI middleware before the end of the year. We still have some work to do on the Unity version, but until it’s out you will be able to test the upcoming native release and experience the power of XAML. Stay tuned!