
Unity in a Window

Discussion in 'Wish List' started by zumwalt, May 17, 2008.

  1. sloopidoopi


    Jan 2, 2010
    I'm trying to integrate the Unity Web Player ActiveX control into a WPF application. Has anybody tried to integrate the Web Player with the WindowsFormsHost control?
    Microsoft has some info on how to integrate ActiveX controls in WPF: http://msdn.microsoft.com/en-us/library/ms742735.aspx but I had no success. When I create an instance of AxUnityWebPlayerAXLib (this is the Unity ActiveX control) in code and add it to a WindowsFormsHost, I see the Unity logo from the player when running the application, but nothing happens after that. When I set the "src" property via code, nothing is displayed (not even the logo). I know other members reported that you have to set "src" in the designer, so I created a WinForms class, embedded the Web Player there, and set "src" via the designer; but when I create an instance of this class, I also see nothing in the running application.
    Is there a way to use WindowsFormsHost, or is the .NET web browser component the only way if I want to mix Unity with WPF?
  2. Christian


    Dec 2, 2008
  3. KEMBL


    Apr 16, 2009
    Is there some way to get the HWND of the current web player window from a script? I need this to apply GDI/WPF to the web player window and build a nice-looking GUI interface :)

    I understand that the legitimate method (GetForegroundWindow) is blocked for security reasons, but all I need is the HWND of my own current web player window, nothing more!

    At the very least, maybe you could add a new system value, myHWND, holding this integer in a new Unity version?

    P.S. I tried taking the HWND of the web player window (which I got with the showin.exe program) and typing it into an edit box in the web player, so it would know its own HWND. The web player still fails to operate on the HWND; it throws a security exception, grrrr :( The standalone version successfully draws GDI over its own window.

    It would be better if a program could operate on its own HWND only :) Please take this idea into the new Unity release, please please please! I can't see any security risk there!
    Last edited: May 26, 2011
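From the script side this is blocked, but an embedding host process can locate the player's window with plain Win32 calls. A minimal sketch using Python's ctypes, as one possible workaround; the window-title filter text is an assumption (the actual player window title and class vary by version), and the Win32 calls only run on Windows:

```python
import ctypes
import sys


def title_matches(title, needle):
    """Case-insensitive substring filter used when scanning window titles."""
    return needle.lower() in title.lower()


def find_player_hwnd(needle="Unity Web Player"):
    """Enumerate top-level windows and return the first HWND whose title
    contains `needle`. Windows-only; returns None if nothing matches."""
    if sys.platform != "win32":
        raise OSError("Win32 only")
    user32 = ctypes.windll.user32
    EnumWindowsProc = ctypes.WINFUNCTYPE(
        ctypes.c_bool, ctypes.c_void_p, ctypes.c_void_p)
    found = []

    def callback(hwnd, lparam):
        length = user32.GetWindowTextLengthW(hwnd)
        buf = ctypes.create_unicode_buffer(length + 1)
        user32.GetWindowTextW(hwnd, buf, length + 1)
        if title_matches(buf.value, needle):
            found.append(hwnd)
            return False  # stop enumerating once we have a match
        return True

    user32.EnumWindows(EnumWindowsProc(callback), 0)
    return found[0] if found else None
```

The host can then pass the HWND back to the plugin (for example via an edit box, as described above), though as noted the web player's sandbox may still refuse to act on it.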
  4. McStevenF


    Jun 2, 2011
    I am creating a 3D electromagnetic simulation tool that includes 3D CAD to create the objects that interact with light or radio waves. I would like to use Unity as a 3D rendering engine and graphical geometry editor inside a native (Windows/Mac/iOS) application. The native application is used to select and manage the objects in a tree-view control, and to edit all of the objects' properties as text in a property editor.

    First question: Am I crazy???

    Second question: Is there a decent way to do this without a native application?

  5. PhilG


    Nov 9, 2009
    Another one here wanting to embed Unity 3D in an application.
    We have a product that embeds a vitools player into a WinForms app and would like to do the same with Unity.
  6. noontz


    Nov 7, 2009
    +1 ;) .. two-way communication out'o'da'box, and a nice, enjoyable tutorial for dessert..
  7. justinlloyd


    Aug 5, 2010
    You could embed the Unity rendering window inside a Windows Forms control by hooking the creation of the DirectX device (the relevant vtable function) and passing in your own HWND, having started the Unity application in a suspended state. Of course you still have the issue of inter-process communication, so either sockets or retrieving the .NET state object for Unity would be the next step. Detour EndScene and you can pass your values back and forth between your app and Unity without fear of a race condition.

    Alternatively, you could easily inject your WinForms app directly into the Unity process, hook it as a detour into EndScene, and then just make use of Unity's own functions through reflection. No inter-process communication is required at that point, as you are directly inside Unity's process address space and have the run of the place.

    It should be a reasonably trivial exercise to create a wrapper around the Unity player that turns it into an encapsulated custom control that can communicate with a host process. Maybe... two days of work. At the outside.
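The reparenting half of this can be sketched without any hooking at all: launch the player, wait for its top-level window to appear, convert it to a child style, and SetParent it into the host. A rough ctypes sketch of that idea (the device-creation hook and Detours work described above are not shown; window discovery by process id is simplified, and the player path is the caller's):

```python
import ctypes
import subprocess
import sys
import time

WS_CHILD = 0x40000000
WS_POPUP = 0x80000000
GWL_STYLE = -16


def child_style(style):
    """Convert a top-level window style to a child style (pure bit math)."""
    return (style & ~WS_POPUP) | WS_CHILD


def _find_window_for_pid(user32, pid):
    """Return the first visible top-level HWND owned by `pid`, or None."""
    EnumWindowsProc = ctypes.WINFUNCTYPE(
        ctypes.c_bool, ctypes.c_void_p, ctypes.c_void_p)
    result = []

    def callback(hwnd, lparam):
        owner = ctypes.c_ulong()
        user32.GetWindowThreadProcessId(hwnd, ctypes.byref(owner))
        if owner.value == pid and user32.IsWindowVisible(hwnd):
            result.append(hwnd)
            return False
        return True

    user32.EnumWindows(EnumWindowsProc(callback), 0)
    return result[0] if result else None


def embed_player(player_exe, host_hwnd, timeout=10.0):
    """Launch the standalone player and reparent its window into host_hwnd.
    Windows-only sketch; returns the process handle and the player HWND."""
    if sys.platform != "win32":
        raise OSError("Win32 only")
    user32 = ctypes.windll.user32
    proc = subprocess.Popen([player_exe])
    hwnd = None
    deadline = time.time() + timeout
    while hwnd is None and time.time() < deadline:
        hwnd = _find_window_for_pid(user32, proc.pid)
        time.sleep(0.1)
    if hwnd is None:
        raise RuntimeError("player window never appeared")
    style = user32.GetWindowLongW(hwnd, GWL_STYLE)
    user32.SetWindowLongW(hwnd, GWL_STYLE, child_style(style))
    user32.SetParent(hwnd, host_hwnd)
    return proc, hwnd
```

This gets the pixels into the host window; the inter-process communication problem remains exactly as described above.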
  8. npsf3000


    Sep 19, 2010
    Do it, stick it on the market, it'll be the easiest money you'll ever make :p
  9. justinlloyd


    Aug 5, 2010
    It has a limited appeal and limited lifespan once Unity3D adds the proper functionality to their engine. I'll be lucky if I sell a dozen licenses to the solution. What should I charge for that? Fifty bucks? A hundred? A thousand? Anything less than five thousand bucks return on the development and I might as well just give it away for free. That way I don't have to deal with the support costs either.

    On a completely separate topic: piracy is a huge problem with anything Unity library/add-on related, e.g. rl-forum, cgpersia, etc. Selling individual licenses to sub-$1,000 programming libraries is not worth my time these days.

    If you want to code up your own solution, all of the details are covered on MSDN.
  10. unisip


    Sep 15, 2010
    Hey, we tried to develop something like what Justin describes:
    - run a WPF EXE application and a Unity EXE side by side
    - Unity gets rendered in WPF, and communication is done through .NET Remoting

    Now we're having one problem that we couldn't solve:
    - if we attach the Unity rendering as a direct child of the main WPF window, events like mouse clicks and moves are received fine by Unity. For some reason, though, scroll wheel events aren't received.

    - if we use an HwndHost in WPF to render Unity (which is useful if we want to have other WPF controls on top of Unity), then Unity never receives any mouse events. We tried the same embedding approach with other programs like Notepad, and they all receive mouse events.

    Does anyone have any idea why Unity would behave differently when it comes to mouse events?
  11. justinlloyd


    Aug 5, 2010
    Not sure why that would be the case with the mouse scroll wheel events. I was not using those events in my solution, and I don't have time to look into it today -- product release on 1st November, so things are a little stressful!

    I know that some Windows events do not get passed through to many applications; scroll wheel events have always been weird in this way too.

    Just as an off-the-cuff idea, you might look into creating a system-wide hook to snag the missing events and forward them on to your application.

    And as a question: does your WPF host application correctly receive the scroll events?
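If the host does receive the wheel events, the forwarding half of the hook idea is straightforward: repack the wheel delta and cursor position and re-post the message to the embedded child. A hedged ctypes sketch of that repacking (the wParam/lParam layout follows the documented WM_MOUSEWHEEL format; wiring up the actual WH_MOUSE_LL hook is not shown):

```python
import ctypes
import sys

WM_MOUSEWHEEL = 0x020A


def make_wheel_wparam(delta, key_state=0):
    """Pack a signed wheel delta into WM_MOUSEWHEEL's wParam: the delta
    goes in the high word, modifier-key flags in the low word."""
    return ((delta & 0xFFFF) << 16) | (key_state & 0xFFFF)


def make_lparam(x, y):
    """Pack screen coordinates into lParam (x in the low word, y high)."""
    return ((y & 0xFFFF) << 16) | (x & 0xFFFF)


def forward_wheel(child_hwnd, delta, x, y):
    """Re-post a wheel event to the embedded player window. Windows-only."""
    if sys.platform != "win32":
        raise OSError("Win32 only")
    user32 = ctypes.windll.user32
    user32.PostMessageW(child_hwnd, WM_MOUSEWHEEL,
                        make_wheel_wparam(delta), make_lparam(x, y))
```

One wheel notch is a delta of 120 (positive away from the user, negative toward), so the host's own wheel handler can pass its delta through unchanged.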
  12. chasmash


    Jul 16, 2010
    Justin or unisip,

    Were either of you able to get this working? Like many others in the forum, I am looking to embed Unity within a WPF application. The ideal would be for WPF to be layered on top of Unity, as unisip mentioned.

    Just wondering if anyone has accomplished this.

  13. marsMayflower


    May 23, 2012
    I have a WinForms application that uses a WebBrowser control to display the Web Player plugin. I then use JavaScript to communicate directly with the host application. It works great.
  14. rocketfoe


    Jun 27, 2012
    Hi Unisip,

    Do you have any more info on how to implement this? We want to render Unity inside a WPF (or WinForms) app and communicate with it. Do you have any tips or examples you can share?
    Thanks a lot!