First of all, I've read a couple of threads that say this cannot be done. So, I don't think I'm going to even bother asking if this can be done.
I want to create a C# app that is used to design GUIs for Unity games. It would be nice to be able to use Unity itself to render the UI, so the user gets a 100% accurate representation of what the UI will look like in the game. If I can't do this, then I'll have to create a custom renderer that imitates, as closely as possible, how the Unity GUI renderer works (alignment, spacing, etc.).
I've heard that the Unity Web Player can be embedded as a control in a WinForms application. How can I go about doing this? I've seen screenshots where the web player shows up in the "Toolbox" pane in Visual Studio. I figure that, at the very least, I can use the web player to provide a "preview" window in my application so the user can easily test their UI changes, but I haven't found any instructions for getting the player integrated.
This sounds like a really cool idea. We looked at doing something like this at one point. Here is a link that might help: http://stackoverflow.com/questions/478611/how-do-i-embed-gecko-using-gecko-sharp-on-mono-windows . Google around and you will find a few other places where this is being done with Mono. Another option is to spawn a WinForms application from within Unity (or a standalone build) to host your tool's GUI, so you have a "native" GUI and Unity running side by side; see the sketch below. We did this for a domain-specific editing tool, and you can control pretty much anything in Unity this way. We were picking things off of the scene graph and manipulating them in a restricted way suited to non-developers.
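One way to read the "spawn a WinForms app from within Unity" idea is to simply launch the designer as a separate process from a Unity script and let the two talk however you like. This is only a minimal sketch under that assumption; the executable path and class name are placeholders, not anything Unity ships:

```csharp
// Sketch: launch a companion WinForms tool alongside the Unity player.
// "Tools\UiDesignerTool.exe" is a hypothetical path to your own editor.
using System.Diagnostics;
using UnityEngine;

public class ToolLauncher : MonoBehaviour
{
    // Path to the external WinForms editor; adjust for your project layout.
    public string toolPath = @"Tools\UiDesignerTool.exe";

    private Process toolProcess;

    void Start()
    {
        // Spawn the WinForms app as a separate OS process next to Unity.
        toolProcess = Process.Start(toolPath);
    }

    void OnApplicationQuit()
    {
        // Close the companion tool when the Unity side shuts down.
        if (toolProcess != null && !toolProcess.HasExited)
            toolProcess.CloseMainWindow();
    }
}
```

From there you can wire up whatever communication you need (files, sockets, etc.) between the tool and the running Unity scene.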
The problem you will likely run into is that the Unity GUI is not at all component-based like you are probably used to in AS3, Java, or .NET. It's a (mostly) stateless, API-based GUI, so to create and place controls you will have to write your own component wrapper.
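To make that concrete, here is a rough sketch of the kind of component wrapper I mean around the classic OnGUI calls. Only GUI.Label, GUI.Button, and OnGUI are real Unity API; the wrapper class names are invented for illustration:

```csharp
// Rough sketch of a "component wrapper" over Unity's stateless GUI calls.
using System.Collections.Generic;
using UnityEngine;

public abstract class UiControl
{
    public Rect Bounds;               // position/size assigned by the designer
    public abstract void Draw();      // issues the immediate-mode GUI call
}

public class UiLabel : UiControl
{
    public string Text;
    public override void Draw() { GUI.Label(Bounds, Text); }
}

public class UiButton : UiControl
{
    public string Text;
    public System.Action Clicked;
    public override void Draw()
    {
        // GUI.Button returns true on the frame it was clicked.
        if (GUI.Button(Bounds, Text) && Clicked != null)
            Clicked();
    }
}

public class UiPreview : MonoBehaviour
{
    public List<UiControl> Controls = new List<UiControl>();

    void OnGUI()
    {
        // Everything must be re-issued on every OnGUI pass; the wrapper
        // objects hold the state (bounds, text, handlers) that the API
        // itself does not keep for you.
        foreach (var control in Controls)
            control.Draw();
    }
}
```

The point of the wrapper layer is that your designer app can create, move, and serialize these objects like ordinary components, while the preview side just replays them through OnGUI each frame.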