@HiggyB: I have PM'd you.
Originally Posted by dreamora: how many chinese things do you know that are legal when it comes to multi hundred USD software?
AS FAR AS I KNOW, ALL UNITY3D USERS IN CHINA ARE USING LEGAL UNITY3D NOW AND I AM ONE AMONG THEM. SO PLEASE PAY ATTENTION TO WHAT YOU ARE TALKING ABOUT.
PS: Well, as you know, we lent all of our money to Uncle Sam.
Please stop shouting; there's no need for all caps. I'll look for your PM now, but I'm no longer worried about what you're doing (it was in passing; a quick check showed all seemed to be OK, so I moved on).
Has someone managed to find a good workflow to embed Unity into Windows Forms? My current project is of the industrial-visualization type. I know there are few of us out there.
In one of our last projects we had the same problem: we wanted to embed a Unity3D window in a C# application to visualize some 3D data.
After some research we decided that the best way to integrate Unity in a C# application is to use the web browser component of .NET and display a Unity webplayer in that component. To handle the communication between the external C# app and the webplayer, Unity opens a TCP server and listens for incoming connections from localhost. After the web browser content (in this case the Unity app) has loaded, the C# app connects to the Unity server, so that both apps can exchange serialized data over a TCP connection.
With some tricks it should be possible to implement a custom SendMessage function in both Unity and the C# app that can call any method of the other component, like the one implemented for webplayer-to-browser communication.
Here are some screenshots of our Unity3D/C# hybrid application:
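The localhost TCP bridge described above can be sketched as a single self-contained program; the port number and the message format (`SendMessage|object|method|value`) are illustrative choices, not from the original post, and in a real Unity scene the listener would run on a background thread inside a MonoBehaviour.

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Threading.Tasks;

class TcpBridgeDemo
{
    static void Main()
    {
        // "Unity side": listen on the loopback interface only, so no
        // external machines can connect.
        var listener = new TcpListener(IPAddress.Loopback, 8085);
        listener.Start();
        Task<TcpClient> unityAccept = listener.AcceptTcpClientAsync();

        // "Host app side": connect to the Unity server on localhost.
        var hostApp = new TcpClient();
        hostApp.Connect(IPAddress.Loopback, 8085);
        TcpClient unitySide = unityAccept.Result;

        // Exchange one newline-delimited message (a stand-in for serialized data).
        using (var toUnity = new StreamWriter(hostApp.GetStream()) { AutoFlush = true })
        using (var fromHost = new StreamReader(unitySide.GetStream()))
        {
            toUnity.WriteLine("SendMessage|Player|SetSpeed|3.5");
            string received = fromHost.ReadLine();
            // The Unity side would parse this and dispatch it to the named object.
            Console.WriteLine(received);
        }

        listener.Stop();
    }
}
```

The same framing can carry messages in the other direction, which is all a custom SendMessage needs on top of this.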
On Windows, I simply use C# to build a basic form with a WebBrowser component. It uses IE as the browser, but it works fine, although I have some problems entering fullscreen mode.
It is very easy to load stuff into it, and you can also use other content in it if you need to (e.g. Flash, Java, PHP, etc.).
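A minimal sketch of this setup, assuming a local `player/WebPlayer.html` page (an illustrative path) that hosts the webplayer embed code:

```csharp
using System;
using System.Windows.Forms;

// Minimal sketch: a WinForms window whose whole client area is a
// WebBrowser control pointed at the HTML page hosting the Unity webplayer.
class WebPlayerHostForm : Form
{
    WebPlayerHostForm()
    {
        Text = "Unity WebPlayer host";
        var browser = new WebBrowser
        {
            Dock = DockStyle.Fill,
            ScriptErrorsSuppressed = true  // hide IE script-error dialogs
        };
        Controls.Add(browser);
        // Navigate to the page that embeds the webplayer plugin.
        browser.Navigate(new Uri(System.IO.Path.GetFullPath("player/WebPlayer.html")));
    }

    [STAThread]
    static void Main()
    {
        Application.EnableVisualStyles();
        Application.Run(new WebPlayerHostForm());
    }
}
```

Note that the embedded WebBrowser control uses the installed IE engine, so the webplayer plugin must be installed for IE for this to work.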
I am trying to load an HTML demo of Unity into a WebBrowser component in C#, but it seems it doesn't load the Unity webplayer, and the demo doesn't work.
I have tried deactivating all security options in IE, but in vain.
Could you tell me more about how you did it?
Also, I agree that being able to use Unity in an external window would be great, like most other engines: just a simple HWND to pass in, or something like that.
I don't want to be rude, but the GUI system in the editor is really, really limited; apart from some scripts/simple buttons there is not much you can do.
It would be a lot better if we could use a Unity "sdk" in pure code instead of being forced to use the editor.
OK, just found why it wasn't working!
I was compiling for x64; the webplayer ActiveX control is 32-bit, so the host app has to target x86.
Hi Christian, could you please upload a small example of your method for opening a TCP connection and communicating between the Unity webplayer and a C# program?
Thank you very much!
Not sure if this is relevant, but is there a way to do this on Mac for a regular desktop app?
I'm trying to integrate the Unity Webplayer ActiveX control into a WPF application. Has anybody tried to integrate the webplayer with the WindowsFormsHost control?
Microsoft has some info on how to integrate ActiveX controls in WPF: http://msdn.microsoft.com/en-us/library/ms742735.aspx, but I had no success. When I create an instance of the AXUnityWebPlayerAXLib control (this is the Unity ActiveX control) in code and add it to a WindowsFormsHost, I see the Unity logo from the player when running the application, and nothing happens any further. When I set the "src" property via code, nothing is displayed (not even the logo). I know other members reported that you have to set "src" in the editor, so I created a WinForms class, embedded the webplayer there, and set "src" via the editor; but when I create an instance of this class, I also see nothing in the running application.
Is there a way to use WindowsFormsHost, or is the web browser component of .NET the only way to mix Unity with WPF?
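For comparison, the attempt described above roughly corresponds to the following sketch. The wrapper class name `AxUnityWebPlayerAXLib.AxUnityWebPlayer` is assumed from the control name reported in the post (such wrappers are normally generated by `aximp` or Visual Studio), and one common ActiveX gotcha is setting properties like `src` before the control's window handle exists; this sketch may still hit the same limitation the poster saw.

```csharp
using System;
using System.Windows;
using System.Windows.Forms.Integration;

// Sketch only: host the webplayer ActiveX wrapper inside WPF via
// WindowsFormsHost, placing it on a WinForms UserControl first so it
// gets a real Win32 handle.
public class PlayerWindow : Window
{
    public PlayerWindow()
    {
        var host = new WindowsFormsHost();

        // Hypothetical generated wrapper class for the Unity ActiveX control.
        var player = new AxUnityWebPlayerAXLib.AxUnityWebPlayer();
        var panel = new System.Windows.Forms.UserControl();
        panel.Controls.Add(player);
        player.Dock = System.Windows.Forms.DockStyle.Fill;

        host.Child = panel;
        Content = host;

        // Set "src" only once the window (and thus the handle) exists.
        Loaded += (s, e) => player.src = "player/MyGame.unity3d";
    }
}
```

Even when this works, WPF's airspace rules apply: the hosted Win32 surface is always drawn on top, so WPF controls cannot overlap it.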
During my web research into solving this problem, I found this one:
Embedding a Unity Standalone EXE in a .NET App. Maybe it works...
Is there some approach to get the HWND of the current webplayer window in which the script runs? I need it to draw a nice-looking GDI/WPF GUI over the webplayer window.
I understand that the usual method (GetForegroundWindow) is blocked for security reasons, but all I need is the HWND of my own webplayer window, nothing more!
At the very least, maybe you could just add a new system value, myHWND, holding that integer in a new Unity version?
P.S. I tried using the HWND of the WebPlayer window, which I obtained with the showin.exe program and typed into an edit box in the WebPlayer, so it would know its own HWND. The webplayer fails to operate on the HWND because of a security exception, grrrr... The standalone version successfully draws GDI over its own window.
It would be better if a program could operate on its own HWND only. Take this idea into the new Unity release, please please please please! I can't see any security risks there!
Last edited by KEMBL; 05-26-2011 at 01:58 PM.
Frag them all
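For standalone builds (the webplayer sandbox blocks this, as the post above notes), a common workaround is to P/Invoke `GetActiveWindow` from a script while the player window has focus; the script name here is illustrative.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Standalone-player sketch only: ask Win32 which window is currently
// active while our own window has focus -- that is the player's HWND.
public class OwnHwnd : MonoBehaviour
{
    [DllImport("user32.dll")]
    static extern IntPtr GetActiveWindow();

    void Start()
    {
        // Valid only while the player window is the active window,
        // e.g. right after startup in a focused windowed/fullscreen player.
        IntPtr hwnd = GetActiveWindow();
        Debug.Log("Player HWND: " + hwnd);
    }
}
```

With that handle, GDI/WPF overlays can target the player's own window, matching the standalone behavior the poster observed.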
I am creating a 3D electromagnetic simulation tool that includes 3D CAD to create the objects that interact with light or radio waves. I would like to use Unity as a 3D rendering engine and graphical geometry editor inside a native (Windows/Mac/iOS) application. The native application is used to select and manage the objects in a tree-view control, and to edit all of the properties of the objects as text in a property editor.
First question: Am I crazy???
Second question: Is there a decent way to do this without a native application?
The ability to use a Unity-rendered area in a window of a regular form-based application. Here is what I am talking about. I brought this subject up in the help area, but unfortunately no one understood what an HWND or a target.hwnd was; instead they thought I was talking about freezing pixels on a screen (never could figure out why). Anyway: take the end result of the Unity build and tell it to render into a target window object. In the example image here, I am rendering to a PictureBox: I tell my 3D application to use the PictureBox's HWND as the 3D render window handle. I know Unity can't do this at the moment, but it would come in very handy.
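Since Unity offers no such target-HWND option here, one unofficial workaround is to launch a standalone build and re-parent its top-level window into the form with Win32 calls. Everything in this sketch except the `user32.dll` functions is an assumption: the executable path is a placeholder, and in practice you may need to poll `MainWindowHandle` until the player window actually exists.

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Windows.Forms;

// Sketch: launch a standalone Unity player and re-parent its window into
// a PictureBox, using the PictureBox's HWND as the new parent.
class EmbedUnityForm : Form
{
    [DllImport("user32.dll")]
    static extern IntPtr SetParent(IntPtr child, IntPtr newParent);
    [DllImport("user32.dll")]
    static extern bool MoveWindow(IntPtr hWnd, int x, int y, int w, int h, bool repaint);

    EmbedUnityForm()
    {
        var target = new PictureBox { Dock = DockStyle.Fill };
        Controls.Add(target);

        // Placeholder path to your standalone Unity build.
        var unity = Process.Start("MyUnityApp.exe");
        unity.WaitForInputIdle();  // crude wait; polling MainWindowHandle is more robust

        SetParent(unity.MainWindowHandle, target.Handle);
        MoveWindow(unity.MainWindowHandle, 0, 0, target.Width, target.Height, true);
    }

    [STAThread]
    static void Main() => Application.Run(new EmbedUnityForm());
}
```

Input focus and resizing need extra handling with this approach, but it does put the Unity output inside an ordinary form control.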
Another one here wanting to embed Unity3D in an application.
We have a product that embeds a vitools player into a WinForms app and would like to do the same with Unity.
+1 .. two-way communication out'o'da'box & a nice, enjoyable tutorial for dessert..
You could embed the Unity rendering window inside a Windows form by hooking the DirectX device-creation vfunc and passing in your own HWND when you start the Unity application in a suspended state. Of course, you still have the issue of inter-process communication, so either sockets or retrieving the .NET state object for Unity would be the next step. Detour EndScene and you can pass your values back and forth between your app and Unity without fear of a race condition.
Alternatively, you could easily inject your WinForms app directly into the Unity process, hook it in as a detour on EndScene, and then just make use of Unity's own functions through reflection. No inter-process communication is required at that point, since you are directly inside Unity's process address space and have the run of the place.
It should be a reasonably trivial exercise to create a wrapper around the Unity Player that makes it an encapsulated custom control that can communicate with a host process. Maybe... two days of work. At the outside.
It has a limited appeal and limited lifespan once Unity3D adds the proper functionality to their engine. I'll be lucky if I sell a dozen licenses to the solution. What should I charge for that? Fifty bucks? A hundred? A thousand? Anything less than five thousand bucks return on the development and I might as well just give it away for free. That way I don't have to deal with the support costs either.
On a completely separate topic: piracy is a huge problem with anything Unity library/add-on related (e.g. rl-forum, cgpersia, etc.). Selling individual licenses to sub-$1,000 programming libraries is not worth my time these days.
If you want to code up your own solution, all of the details are covered on MSDN.
Hey, we tried to develop something like what Justin describes:
- run a WPF EXE application and a Unity EXE side by side
- Unity gets rendered in WPF and communication is done through .NET Remoting
Now, we're having one problem that we couldn't solve:
- if we attach the Unity rendering window to WPF as a direct child of the main WPF window, events like mouse clicks and moves are received OK by Unity. For some reason, the scroll wheel event isn't received.
- if we use an HwndHost in WPF to render Unity (which is useful if we want to have other WPF controls on top of Unity), then Unity never receives any mouse events. We tried the same embedding approach with other programs like Notepad, and they all receive mouse events.
Does anyone have any idea why Unity would behave differently when it comes to mouse events?
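For reference, the HwndHost route usually means subclassing HwndHost and adopting the foreign window in BuildWindowCore. A minimal sketch of that pattern follows; the class name is illustrative, and the WS_CHILD conversion is one detail worth checking, since a window that keeps its top-level style after re-parenting often misroutes input.

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Interop;

// Sketch of an HwndHost that adopts an already-created external window
// (e.g. the Unity player's) as its hosted child window.
class ExternalWindowHost : HwndHost
{
    const int GWL_STYLE = -16;
    const int WS_CHILD = 0x40000000;

    [DllImport("user32.dll")]
    static extern IntPtr SetParent(IntPtr child, IntPtr parent);
    [DllImport("user32.dll")]
    static extern int GetWindowLong(IntPtr hWnd, int index);
    [DllImport("user32.dll")]
    static extern int SetWindowLong(IntPtr hWnd, int index, int value);

    readonly IntPtr _external;
    public ExternalWindowHost(IntPtr externalHwnd) { _external = externalHwnd; }

    protected override HandleRef BuildWindowCore(HandleRef hwndParent)
    {
        // Convert the external top-level window into a child of the host,
        // so Win32 input routing treats it as part of this window tree.
        SetWindowLong(_external, GWL_STYLE, GetWindowLong(_external, GWL_STYLE) | WS_CHILD);
        SetParent(_external, hwndParent.Handle);
        return new HandleRef(this, _external);
    }

    protected override void DestroyWindowCore(HandleRef hwnd)
    {
        // Detach on teardown so the external process can reclaim its window.
        SetParent(_external, IntPtr.Zero);
    }
}
```

Note also that HwndHost surfaces are subject to WPF's airspace rules, so WPF controls cannot actually be drawn on top of the hosted Unity window.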