Unity 3D within a Windows Application Environment

Good evening everyone,

I am a new developer and I am currently working on a Windows application. I have to integrate a window within the application that will show a 3D model and is controlled by the Windows application. I am a huge fan of the Unity engine and have worked with it as a hobby. I am just wondering if it is possible to have an instance of a compiled Unity executable within my Windows application environment, so that when I run the program, the Unity project is loaded within the software. Here is a diagram to show what I mean. I don’t need to know how to integrate it, just whether such integration is possible. Thank you.

[Attachment: Windows Application Example.jpg]

You’re talking about embedding the Unity player’s window inside another application. Sadly, Unity doesn’t expose such functionality, though it is technically possible.

The only thing I can suggest is to vote for this feature: http://feedback.unity3d.com/suggestions/runtime-embedding-to-another-wi :slight_smile:

Thanks. I know in theory it’s possible, but I was hoping it would work with Unity, simply because I’ve worked with it before and I’m quite fond of the engine.

When you say it’s possible, do you mean there is currently some way to do it, like a workaround? I don’t need a deep explanation, just a general idea would be useful. Thanks.

Could you embed the Web Player HTML instead (using the WebBrowser component, or whatever it’s called these days)?
Then maybe you could even interact with it using JavaScript.

Sorry for not being clear. Sadly, there’s no workaround; it would require some tweaking of internal engine code.

Lame. Guess I’ll have to use another engine then. :confused:

Have you tried this?
http://forum.unity3d.com/threads/10855-Unity-in-a-Window/page3?p=199899&viewfull=1#post199899

Or how about just embedding the exe (via WindowsFormsHost) and using ports or the file system to transfer messages?

That seems pretty viable. I’ll test some stuff out then, thanks.

Dear all,

We are new to Unity, and for a customer project we need to control a Unity object with touch from a WPF C# touch application.

The object in Unity is a small test 3D cube, provided by our customer, that we should be able to rotate from a WPF container.

How can we control that cube from our application?

In WPF we have the possibility of accessing the UnityWeb container as an ActiveX control. Do you think that if we run Unity in that web browser, we will be able to manipulate it directly with touch?

Thanks for your prompt help, which will help us move forward, as we have been stuck here for days now.

We have a trial version of Unity for testing this and can create a small object with your help, in case something needs to be set up in the Unity object to allow manipulation.

Your help is really appreciated.
Regards,
Serge

It is possible to embed, or rather reparent, a Unity executable’s window into another window, just as with any other Windows application.
Just obtain the window handle and pass it to the SetParent WinAPI call.
(In that case, it is a good idea to launch the Unity executable with the -popupwindow parameter.)

Note that I’ve done this with the whole parent window area, so I’m not completely sure whether it’s possible to occupy only a portion of the window (by passing, say, the HWND of a group box / panel to SetParent) as depicted in the topic.
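A minimal WinForms-style sketch of that reparenting approach, assuming a placeholder player path and a host control handle (the polling loop is just one simple way to wait for Unity’s window to appear; your timing logic may need to differ):

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Threading;

class UnityEmbedder
{
    // WinAPI call that moves an existing top-level window under a new parent.
    [DllImport("user32.dll")]
    static extern IntPtr SetParent(IntPtr hWndChild, IntPtr hWndNewParent);

    // hostHandle: the HWND of your form or a panel (e.g. panel.Handle in WinForms).
    public static Process EmbedUnity(string unityExePath, IntPtr hostHandle)
    {
        // -popupwindow creates a borderless window, which blends in
        // more cleanly once it has been reparented.
        var process = Process.Start(unityExePath, "-popupwindow");

        // Wait until Unity has created its window so MainWindowHandle is valid.
        process.WaitForInputIdle();
        while (process.MainWindowHandle == IntPtr.Zero)
        {
            Thread.Sleep(100);
            process.Refresh();
        }

        SetParent(process.MainWindowHandle, hostHandle);
        return process;
    }
}
```

After this call the Unity window is a child of the host control, so it moves with the host application; as noted above, sizing and focus behaviour still need care.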

We’ve done this, and although it’s somewhat cumbersome, the Unity exe then runs completely embedded in the parent surface/window.
(Although now that I think about it, we should probably have launched the exe and implemented e.g. the fullscreen/resolution logic in the exe itself…)

The Unity exe then has focus, so any changes have to be communicated, if required, to the host WinForms/WPF process, and the only way of doing so is via sockets; everything else is safely buried deep inside this custom Mono.
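A minimal sketch of the host side of such a socket channel, assuming an arbitrary loopback port and a made-up line-based message format (the Unity side would open a matching TcpClient from a script and write lines to it):

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;

// Host (WinForms/WPF) side: listen on a local port for messages
// sent by the embedded Unity player. Port and message format are
// illustrative only.
class MessageChannel
{
    public static void Listen(int port, Action<string> onMessage)
    {
        var listener = new TcpListener(IPAddress.Loopback, port);
        listener.Start();
        using (var client = listener.AcceptTcpClient())
        using (var reader = new StreamReader(client.GetStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                onMessage(line); // e.g. "rotate 15 0 0"
        }
        listener.Stop();
    }
}
```

The same channel works in the other direction (host to Unity) by having the Unity script listen instead; either way, the two processes stay decoupled.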


No, at least not with released versions.
We’ve tried that, but I think it needed some modifications inside Unity. I don’t know what state that work is in.

Erm, what exactly “needs” some modifications?
The Unity executable’s window has an HWND like any other Windows application, so it can be reparented / assigned to another window.
(We run the Unity executable windowed; maybe fullscreen could pose a problem.)
If it weren’t possible, I wouldn’t see our .NET/WinForms app running, calling the external Unity program and assigning it into a prepared WinForm, which certainly is not the case :slight_smile:

Btw, I’m talking about ‘classic’ WinAPI, not Metro and such, which I don’t know; maybe that’s what you meant, or maybe I’m describing a slightly different use case.

I meant that for now Unity assumes it’s running in its own window. Reparenting it might work, but things like input, joysticks etc. might not function properly. You’ll have to test it pretty thoroughly.

The scene in our setup is driven by external network data, but keyboard input definitely works (we use it to adjust parameters at runtime), and IIRC the mouse does too.
But you may be right that this setup might not support all possible controller configurations.

Btw, for all concerned: the window in which Unity runs is still the same one; only its parent is changed.
It’s not completely ‘clean’, however: the parent window can still resize/minimize/maximize the content (the Unity exe), although we don’t do that at runtime, but it does not have focus, for example.
The startup is a bit quirky too, but it has been sufficiently reliable so far.
(It takes some time for Windows to adjust all sizes/contexts etc., and the timing has to be right, with proper process Wait/Refresh calls.)

I’m not writing hypothetically; we use this ‘in production’, so to speak, as part of a bigger desktop application which now runs at several clients.
But the approach is not ideal; as I wrote earlier, we should probably have done all the resolution/fullscreen logic in the client Unity exe all along instead of depending on the WinForm.
Nevertheless, it works like this for now.

hey!

Patch 4.5.5p1:
“Windows Standalone: You can now embed windows standalone player into another application, simply pass -parentHWND and windows standalone application’s window will be created with specified parent. See Command line arguments documentation for more information.”
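Based on that patch note, usage might look like the following WinForms sketch; the player path and host control are placeholders, and you should check the Command line arguments documentation for the exact syntax in your Unity version:

```csharp
using System;
using System.Diagnostics;
using System.Windows.Forms;

// Sketch only: launch a standalone player so Unity creates its window
// as a child of an existing WinForms control, per the -parentHWND
// patch note quoted above. "playerPath" is a placeholder.
class ParentHwndLauncher
{
    public static Process Launch(string playerPath, Control host)
    {
        // Pass the host control's HWND as a number.
        string args = "-parentHWND " + host.Handle.ToInt64();
        return Process.Start(playerPath, args);
    }
}
```

Unlike the SetParent workaround earlier in the thread, here Unity creates its window under the given parent from the start, so no reparenting or startup polling is needed.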

So much for the “modifications inside Unity”, I guess.
I haven’t tested it yet, but good job :)

Hi, how do we make the container app pass touch input into the embedded Unity player? Keyboard input works great.

It would be pointless, because in 4.6 the Unity Windows Standalone player isn’t capable of processing touch input. That will only be available in 5.0.

We are using TouchScript from the Asset Store to handle input from Windows for us. So for now there is no way to forward touch to this plugin or the Unity app, and no workaround for this?