I am a new developer and I am currently working on a Windows application. I have to integrate a window within the application that will show a 3D model and be controlled by the Windows application. I am a huge fan of the Unity engine and have worked with it as a hobby. I am just wondering if it is possible to have an instance of a compiled Unity executable within my Windows application environment, so that when I run the program, the Unity project is loaded within the software. Here is a diagram to show what I mean. I don't need to know how to integrate it, just whether such integration is possible. Thank you.
You're talking about embedding the Unity player's window inside another application. Sadly, Unity doesn't expose such functionality, though it is possible.
Thanks. I know in theory it's possible, but I was hoping it would work with Unity, simply because I've worked with it before and I'm quite fond of the engine.
When you say it's possible, do you mean there is currently a way, like a workaround? I don't need a deep explanation; just a general idea would be useful. Thanks.
Could you embed the webplayer HTML instead (using the WebBrowser component, or whatever it's called these days)? Then maybe you could even interact with it using JavaScript.
We are new to Unity, and for a customer project we need to control a Unity object via touch from a WPF C# touch application.
The object in Unity is a small test 3D cube, given by our customer, that we should be able to rotate from a WPF container.
How can we control that cube from our application?
In WPF we have the possibility of accessing the UnityWeb container as an ActiveX control. Do you think that if we run Unity in that web browser, we will be able to manipulate it directly with touch?
Thanks for your prompt help, which will help us to move forward, as we have been stuck here for days now.
We have a trial version of Unity for testing this and can create a small object with your help, in case something needs to be set up in the Unity object for manipulation.
It is possible to embed, or rather reparent, the Unity executable's window into another window, as is possible with any other Windows application.
Just obtain the window handle and pass it to the SetParent WinAPI call.
(It is a good idea in that case to launch the Unity executable with the -popupwindow parameter.)
Note that I've done this with the whole parent window area, so I'm not completely sure whether it's possible to occupy only a portion of the window (by passing, say, the HWND of a groupbox/panel to SetParent) as depicted in the topic.
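The launch-and-reparent approach described above can be sketched roughly as follows. This is a minimal sketch, not a tested recipe: the executable path is a placeholder, and the polling loop and timing are assumptions (as noted later in the thread, startup timing needs proper Wait/Refresh handling).

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Threading;

static class UnityEmbedder
{
    // WinAPI call that moves a top-level window under a new parent window.
    [DllImport("user32.dll")]
    static extern IntPtr SetParent(IntPtr hWndChild, IntPtr hWndNewParent);

    // Pure helper so the launch arguments are visible in one place;
    // -popupwindow makes the player window borderless, as suggested above.
    public static ProcessStartInfo BuildStartInfo(string unityExePath) =>
        new ProcessStartInfo(unityExePath, "-popupwindow");

    // Launch the player and reparent its window into 'host' (e.g. the Handle
    // of a WinForms Panel). The polling loop and delay are assumptions.
    public static Process Embed(string unityExePath, IntPtr host)
    {
        var proc = Process.Start(BuildStartInfo(unityExePath));
        while (proc.MainWindowHandle == IntPtr.Zero)
        {
            Thread.Sleep(100);   // wait until the player has created its window
            proc.Refresh();      // re-query cached process information
        }
        SetParent(proc.MainWindowHandle, host);
        return proc;
    }
}
```

You would call something like `UnityEmbedder.Embed(@"C:\path\to\Player.exe", myPanel.Handle)` from the host form; whether a panel handle (rather than the top-level form) works is exactly the open question raised above.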
We've done this, and although it's somewhat cumbersome, the Unity exe then runs completely embedded in the parent surface/window.
(Although, now that I am thinking about it, we should probably have launched the exe and implemented e.g. the fullscreen/resolution logic in the exe itself…)
The Unity exe then has focus, so any changes have to be communicated, if required, to the WinForms/WPF process, and the only way of doing so is via sockets; everything else is safely buried deep inside this custom Mono.
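The socket channel mentioned above could look roughly like this. This is only a sketch of the idea: the port and the line-based command format are assumptions, and the receiving side here is a stand-in for what would be a listener script inside the Unity project.

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Host side sends one-line text commands over TCP; the Unity side would run
// a matching listener in a MonoBehaviour. Protocol details are assumptions.
static class CommandChannel
{
    public static void Send(string host, int port, string command)
    {
        using var client = new TcpClient(host, port);
        byte[] payload = Encoding.UTF8.GetBytes(command + "\n");
        client.GetStream().Write(payload, 0, payload.Length);
    }

    // Stand-in for the receiving side: accept one connection, read one line.
    public static string ReceiveOne(TcpListener listener)
    {
        using var conn = listener.AcceptTcpClient();
        using var reader = new StreamReader(conn.GetStream(), Encoding.UTF8);
        return reader.ReadLine();
    }
}
```

The earlier WPF cube question could then be handled by sending something like `"ROTATE_CUBE 15"` from the touch handler and parsing it on the Unity side.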
No, at least not with released versions.
We've tried that, but I think it needed some modifications inside Unity. I don't know what state that is in.
Erm, what exactly "needs" some modifications?
The Unity executable window has an HWND like any other Windows application => it can be reparented / assigned to another window.
(We run the Unity executable windowed; maybe fullscreen could pose a problem.)
If it weren't possible, I wouldn't see our .NET/WinForms app running, calling the external Unity program and assigning it into a prepared WinForm, which is certainly not the case.
Btw, I'm talking about "classic" WinAPI, not Metro and such, which I don't know, if that's what you meant.
Or maybe I'm describing a slightly different use case.
I meant that, for now, Unity assumes it's running in its own window. Reparenting it might work, but things like input, joysticks, etc. might not function properly. You'll have to test it pretty thoroughly.
The scene in our setup is driven by external network data, but keyboard input definitely works (we use it to adjust parameters at runtime), and IIRC mouse input does too.
But you may be right that this setup might not support all possible controller configurations.
Btw, for all things concerned, the window in which Unity runs is still the same; only its parent is changed.
It's not completely "clean", however: the parent window can still resize/minimize/maximize the content (the Unity exe), although we don't do that at runtime, but it does not have focus, for example.
The startup is a bit quirky too, but it has been sufficiently reliable so far.
(It takes some time for Windows to adjust all the sizes/contexts etc., and the timing has to be right, with proper process Wait/Refresh calls.)
I'm not writing hypothetically; we use this "in production", so to speak, as part of a bigger desktop application which now runs at several clients.
But the approach is not ideal. As I wrote earlier, we should probably have done all the resolution/fullscreen handling in the client Unity exe all along and not be dependent on the WinForm.
Nevertheless, it works like this for now.
Patch 4.5.5p1:
"Windows Standalone: You can now embed windows standalone player into another application, simply pass -parentHWND and windows standalone application's window will be created with specified parent. See Command line arguments documentation for more information."
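With that patch, the manual SetParent step is no longer needed: the player creates its window directly under the given parent. A minimal sketch of launching it this way (the executable path is a placeholder, and the exact argument format beyond `-parentHWND <number>` should be checked against the command line arguments documentation):

```csharp
using System;
using System.Diagnostics;
using System.Globalization;

static class ParentHwndLauncher
{
    // -parentHWND expects the parent window handle as a plain number.
    public static string BuildArgs(IntPtr parentHwnd) =>
        "-parentHWND " + parentHwnd.ToInt64().ToString(CultureInfo.InvariantCulture);

    // e.g. Launch(@"C:\path\to\Player.exe", myPanel.Handle) from a WinForms app,
    // so the player window is created inside that panel from the start.
    public static Process Launch(string unityExePath, IntPtr parentHwnd) =>
        Process.Start(unityExePath, BuildArgs(parentHwnd));
}
```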
We are using TouchScript from the Asset Store to handle Windows input for us. So for now, is there no option to forward touch input to this plugin or the Unity app, or some workaround for this?