Hello all. I am trying to integrate FFmpeg into Unity using platform invoke from C#. Here are the steps I followed: 1) downloaded the latest version of FFmpeg, 2) configured it with --enable-shared and --enable-static, 3) copied all the DLLs from FFmpeg to Assets/Plugins. I then use DllImport("avcodec") to import the functions. However, this is where I run into problems: I keep getting a DllNotFoundException. In fact, I see two different logs. If I try to load a DLL that doesn't exist in the Plugins directory (for example mydll.dll), I get "DllNotFoundException mydll". With avcodec, however, I get "DllNotFoundException <full path of the dll>". This leads me to believe the problem is not with finding the DLL but with something going wrong while it loads. I wrote my own little DLL for experimentation and compiled it with the cc -shared flag; the function it exports is invoked without any hiccups. Here are some things I am not sure about: Is there a problem with DLLs loading other DLLs? Am I compiling FFmpeg wrong? Are there dependencies that are not being loaded? I checked this in Dependency Walker and it shows ieshims.dll as not loading, which doesn't make sense because ffmpeg.exe loads properly and doesn't throw a tantrum. I'm not sure if anyone else has tried something like this; I would love to hear some opinions and suggestions.
There is no problem with DLLs loading other DLLs, but the other DLLs must be somewhere on %PATH% or in the right place (there are some threads on this). They will not be found just because they sit next to your wrapper inside Plugins, since the relative path is incorrect.
Just for closure: the problem was that my DLL did not load because it could not load other DLLs. I had to place these dependency DLLs in the root directory of the project, not inside the Assets/Plugins directory, because the current working directory switches to the root directory from which the application is invoked. Pushing the DLLs into the system directory also worked, but that is not an option for me.
Hi cyrax, did you ever get ffmpeg to work? I'm trying to do the same in order to decode mp3s, but I'm not quite getting there. Any chance you could give me a little help here?
Ahoy there Jack Rabbit. Yes, I got it working. I wrote a DLL that exposes the FFmpeg API as an interface and placed it in the Assets/Plugins directory. The dependency DLLs from FFmpeg (which ones depends on the APIs you are using, but place all of them to start off) go in the root directory of your project. If you do not place the dependency DLLs there, you will get DllNotFoundException thrown. Also make sure you are using 32-bit DLLs all around, including the FFmpeg ones; I've been bitten by this once and lost a whole lot of time. If you end up with 64-bit DLLs you will see DllNotFoundException again. And do use Dependency Walker to verify that you are not missing any DLLs. Sorry, but I can't post any code here as I do not own the copyright to it. Do shoot over any questions you have.
I have a question! You wrote the DLL yourself? I'm trying to use http://code.google.com/p/ffmpeg-sharp/ but it keeps throwing DllNotFoundException. I've put the DLLs in the root folder as you did, but nothing... Maybe it's because they are nested inside one another? Do you have any clue? I asked about this in http://forum.unity3d.com/threads/150769-ffmpegSharp-DllNotFoundException before finding this thread; I hope you're still receiving notifications for it!
I know this is an ancient thread, but I am also trying to load video files into Unity. Can anyone elaborate on the steps to get FFmpeg working with Unity? Specifically, how to decode a video file and copy the contents to a Unity texture. The OP made it sound like he just downloaded pre-built DLLs, but don't all Unity plugins need to be .NET? And what did he mean by "2) configured using --enable-shared and --enable-static"?
What he meant by that are the options passed to FFmpeg's configure script when compiling it from source on Linux; I've entered similar ones before when building source code. (The last time I did that was when I tried to build an LFS system!) And he wrote an interface DLL to call FFmpeg from there, from what I've read. I'm actually looking for an FFmpeg solution myself, although I wish to have it work on multiple platforms for a game I'm making.
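For reference, a build of FFmpeg's shared libraries typically looks roughly like this. This is a sketch, not the OP's exact invocation; the flags you need depend on your FFmpeg version, platform, and licensing situation (note the OP also passed --enable-static, but --enable-shared is what produces the DLLs that P/Invoke needs):

```shell
# From an unpacked FFmpeg source directory:
./configure --enable-shared --disable-static
make
make install
# The resulting shared libraries (avcodec-*.dll / libavcodec.so etc.)
# are what you copy into your Unity project, along with their
# dependency libraries.
```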
I have a working prototype of Unity/FFmpeg integration for a video player, but unfortunately I cannot share details and/or source right now. I think directly using a C# wrapper around FFmpeg's API is not the best way; it is far too complicated and involves a LOT of marshalling, dealing with structs, and so on.

What I did instead: I created my video player framework in C++. It uses FFmpeg and is able to decode video/audio, manage timing, and blit frame data to texture memory. It can deal with DX11, DX9 and OpenGL textures, since Unity can use any of these three APIs and each one has a different way of giving you access to texture data. Then I wrote a C# wrapper for this player framework, and I use it in Unity, passing as parameters the Unity plane and texture I want to use as a projection screen. "Billboarding" of the plane and some aspect-ratio fixing are easily done on the C#/Unity side as well.

By the way, did anyone notice that, if you don't recompile it with the right flags, FFmpeg is GPL? That means that if you use it, your whole project must be GPL'ed, which may not be a viable solution for many projects, particularly commercial ones. If you get the FFmpeg sources, you can recompile under the LGPL license (this process excludes some codecs, of course), which is not as viral as GPL, but you still have some requirements to fulfill in order to be 100% compliant with the license. And pay particular attention if you intend to static-link FFmpeg (as opposed to dynamic linking with DLLs).
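The "one player framework, three graphics APIs" split described above might look something like this on the interface level. This is purely illustrative; none of these names come from Gufino2's code, and the actual API calls are stubbed out:

```cpp
#include <cstdint>

// Unity may be running on DX9, DX11 or OpenGL, and each API updates
// texture memory differently, so the player core talks to one interface:
struct TextureBlitter {
    virtual ~TextureBlitter() = default;
    virtual void Blit(const uint8_t* rgb24, int width, int height) = 0;
};

// One implementation per API. The OpenGL one, for instance, would call
// glTexSubImage2D on the texture id Unity handed over; stubbed here to
// just count calls so the wiring can be demonstrated.
struct GLBlitter : TextureBlitter {
    int blitCount = 0;
    void Blit(const uint8_t*, int, int) override { ++blitCount; }
};

// The decode/present loop never needs to know which API is active.
void PresentFrame(TextureBlitter& blitter, const uint8_t* rgb24, int w, int h) {
    blitter.Blit(rgb24, w, h);
}
```

The payoff of this design is that the C# wrapper and the decode logic stay identical across renderers; only the concrete blitter changes.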
Hi Gufino2, with your advice I finally managed to transfer frames into the texture of a Unity object. Thank you. My next problem is how to deal with sound. I searched a little, but it doesn't seem that I can play a specific audio frame in Unity.
Ok, now I can play audio frames by using OnAudioFilterRead(), but the frame speed is still wrong. I guess my next task will be the synchronization between audio and video.
@steunity2 Are you able to share any part of your implementation? I need to do something similar, but I was hoping not to have to reinvent the wheel.
In conclusion, I actually quit implementing this because I was running out of time before our product's release date. It's too complicated to explain every step I took, but I can give you the basic idea.

You need to develop a C++ native DLL that uses the FFmpeg libraries to implement your player's back end (I call mine ffunity, lol), with the Unity side as the front end. Unfortunately I can't share ffunity.cpp at this point, but it borrows a lot from ffplay's implementation. For how to use the FFmpeg libraries, this site gives a good tutorial: http://dranger.com/ffmpeg/tutorial01.html

First, I struggled with Unity's limitation that the GUI can only be updated from the main thread. What I ended up doing to solve this was to have FrameCallback(...) receive frames from the native code (the FFmpeg side) and GetFrame(...) act as the trigger. For video, you can continuously paste the frame texture onto an Image component to play the movie. Then you have to deal with tons of issues like frame rate, audio frames, buffering, synchronization between video and audio... To me the audio side was a nightmare, and that's where I gave up.

Sorry, this may not help you much, but I hope you can complete this and teach me how. Ask me any questions and I will try my best to answer.
Native code, say "ffunity.h":

Code (CSharp):
#ifndef FFUNITY_H
#define FFUNITY_H

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <libavutil/avutil.h>
}

#define UNITY_PLUGIN extern "C" __declspec(dllexport)

namespace FFUnity {
    using VideoFrameCallback = void(*)(int id, uint8_t* data, int width, int height);
    VideoFrameCallback mVideoFrameCallback = NULL;

    UNITY_PLUGIN void SetVideoFrameCallback(VideoFrameCallback callback);
    UNITY_PLUGIN bool GetVideoFrame(int id);
}

#endif //FFUNITY_H

Unity side C# (note: RawImage needs UnityEngine.UI):

Code (CSharp):
using UnityEngine;
using UnityEngine.UI;
using System.Collections;
using System.Runtime.InteropServices;
using System.IO;
using System;

public delegate void VideoFrameCallback(int id, IntPtr data, int width, int height);

public struct VideoFrame {
    public int id;
    public byte[] data;
    public int width;
    public int height;
    public bool resized;
}

public VideoFrameCallback videoFrameHandler;
public RawImage videoProjector;
private VideoFrame videoFrame;
private Texture2D frameTexture;

[DllImport("ffunity")]
public static extern void SetVideoFrameCallback(VideoFrameCallback fn);
[DllImport("ffunity")]
public static extern bool GetVideoFrame(int id);

void Start() {
    // *** Video ***
    videoFrame.resized = false;
    // Create a texture in RGB24 format
    frameTexture = new Texture2D(10, 10, TextureFormat.RGB24, false);
    // Set the texture on the video projector
    videoProjector.texture = frameTexture;
    // Flip the projector
    videoProjector.rectTransform.rotation = Quaternion.Euler(0, 180, 180);
    // Register the frame callback
    videoFrameHandler = new VideoFrameCallback(VideoFrameHandler);
    SetVideoFrameCallback(videoFrameHandler);
}

void Update() {
    // Update video frame
    if (isPlaying) {
        int id = Time.frameCount;
        bool frameReceived = GetVideoFrame(id);
        if (frameReceived && videoFrame.id == id) {
            if (!videoFrame.resized) {
                frameTexture.Resize(videoFrame.width, videoFrame.height);
                videoFrame.resized = true;
            }
            frameTexture.LoadRawTextureData(videoFrame.data);
            frameTexture.Apply();
        }
    }
}

void VideoFrameHandler(int id, IntPtr dataPtr, int width, int height) {
    videoFrame.id = id;
    videoFrame.data = new byte[height * 3 * width]; // RGB24
    videoFrame.width = width;
    videoFrame.height = height;
    Marshal.Copy(dataPtr, videoFrame.data, 0, height * 3 * width);
}
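The thread never shows the implementation side of that header (ffunity.cpp). Below is a minimal guess at what it might look like, with the actual FFmpeg decoding replaced by a stub that synthesizes a gray RGB24 frame; everything here is hypothetical, not steunity2's code, and the __declspec export macro is omitted so it compiles anywhere:

```cpp
#include <cstdint>
#include <vector>

// Mirrors the callback signature declared in ffunity.h.
using VideoFrameCallback = void(*)(int id, uint8_t* data, int width, int height);

static VideoFrameCallback mVideoFrameCallback = nullptr;
static std::vector<uint8_t> gFrameBuffer;  // RGB24 scratch buffer
static const int kWidth = 4, kHeight = 2;  // stand-in for the decoded size

void SetVideoFrameCallback(VideoFrameCallback callback) {
    mVideoFrameCallback = callback;
}

// In the real plugin this would pull the next decoded frame (converted
// to RGB24 with sws_scale) from the FFmpeg decode thread; here a flat
// gray frame is synthesized instead so the call flow is visible.
bool GetVideoFrame(int id) {
    if (mVideoFrameCallback == nullptr)
        return false;
    gFrameBuffer.assign(kWidth * kHeight * 3, 128);  // fake "decoded" pixels
    mVideoFrameCallback(id, gFrameBuffer.data(), kWidth, kHeight);
    return true;
}
```

The key point of the pattern: Unity's Update() calls GetVideoFrame, which fires the managed callback synchronously on the main thread, so the C# handler can safely touch the Texture2D.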
Definitely a good reason to write a native plugin, I must say. It would work well on mobile too, if implemented right.
Hi @steunity2, could I have your email to discuss how to use FFmpeg in Unity? I decode an RTMP stream with FFmpeg and get an RGB image, but it's all black. I want to know how you managed it. Thank you.
Are you using the following functions on the Unity side? frameTexture.LoadRawTextureData(rgbData); frameTexture.Apply(); It sounds like you did not format your RGB data correctly. Regarding my email, message me privately.
Hi steunity2, happy to tell you that playing the RTMP stream works now (just video, no audio). I think the problem was that my GetFrame function in Update() failed to get the data from the DLL into Unity, and Unity froze. So I used the approach you described. Thank you for your idea!
Hi pool611, I am stuck on how to decode an RTMP stream with FFmpeg and get an RGB image. How did you implement that? Thank you.
Hello strmnati. First, you should get an example working (C++ is fine) that uses FFmpeg to decode an RTMP stream. Then, this page (http://docs.unity3d.com/Manual/NativePlugins.html) shows how to make a DLL for Unity; build your FFmpeg example into a DLL and put it in Assets/Plugins/x86 (or x86_64). Finally, I create a separate thread to run the get-frame function from the FFmpeg DLL, otherwise Unity will freeze. For how to get the image data into Unity, look at steunity2's idea; the code is easy to understand.
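The "run it on its own thread" step above can be sketched as a generic producer/consumer pattern. This is a made-up illustration, not pool611's code, with a stub frame source standing in for the FFmpeg decode loop:

```cpp
#include <atomic>
#include <cstdint>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

using Frame = std::vector<uint8_t>;

class FrameQueue {
public:
    // Called from the decode thread after each frame is produced.
    void Push(Frame f) {
        std::lock_guard<std::mutex> lock(mMutex);
        mFrames.push(std::move(f));
    }
    // Called from Unity's side (e.g. via GetFrame); non-blocking, so the
    // engine's main thread is never stalled waiting on the decoder.
    bool TryPop(Frame& out) {
        std::lock_guard<std::mutex> lock(mMutex);
        if (mFrames.empty()) return false;
        out = std::move(mFrames.front());
        mFrames.pop();
        return true;
    }
private:
    std::mutex mMutex;
    std::queue<Frame> mFrames;
};

// Stub decode loop: a real one would call av_read_frame /
// avcodec_receive_frame here and push RGB24-converted buffers.
void DecodeLoop(FrameQueue& q, std::atomic<bool>& running, int framesToProduce) {
    for (int i = 0; i < framesToProduce && running; ++i)
        q.Push(Frame(16 * 9 * 3, uint8_t(i)));  // fake 16x9 RGB24 frame
}
```

Unity's Update() then just calls the TryPop side each frame; if no new frame is ready it simply keeps showing the previous texture, and the editor never hangs on the network or the decoder.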
I found an issue. My C++ example runs fine, decoding and producing frames, but when I build it as a DLL and use it from Unity, the editor reports an error: "Failed to load 'Assets/Plugins/mydecoder.dll'". How did you include the FFmpeg libraries so that Unity can find them?
@strmnati I fixed a similar problem by adding ffmpeg's library binaries (libavcodec-57.dll, etc) in the same Plugins folder as the wrapper dll (mydecoder.dll in your case).
I have another question, mostly to @steunity2 and whoever used his frame-callback method. I actually managed to get FFmpeg working in Unity without the callback function. It works well when playing the video at 25fps, but it breaks down when I decode frames at a higher pace: I get intermittent gray frames. Basically I am just calling the decode function from the main Unity thread and uploading the resulting pointer as raw data to the texture. Something like this:

Code (CSharp):
[DllImport("ffmpegLib")]
private static extern IntPtr FFmpegLib_DecodeNextVideoFrame(IntPtr handle);

void Update() {
    if (needNewFrame) {
        IntPtr frameData = FFmpegLib_DecodeNextVideoFrame(videoHandle);
        videoTexture.LoadRawTextureData(frameData, videoWidth * videoHeight * 3);
        videoTexture.Apply();
    }
}

I am skipping the part of the code which raises the needNewFrame flag, either on a 25fps timer or on UI events (a time slider). Typically the gray frames appear when using a GUI slider, i.e. when decoding happens at a very high rate. I suspect there is something asynchronous in the process which causes the texture data to be overwritten before it is completely uploaded. So now I am wondering whether I should use the frame callback that @steunity2 proposed. But in your example code I'm missing ffunity.c, and particularly the location in the ffplay code from which you invoke the frame callback. If it is just after decoding a frame, then I don't see the difference from returning the frame data pointer as I do now. Thanks for any help.
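One way to test the "buffer overwritten mid-upload" theory is to double-buffer on the native side, so the pointer handed to Unity stays valid and untouched while the next frame is decoded into the other buffer. A rough sketch, with all names hypothetical:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Two RGB24 buffers: the decoder writes into one while Unity may still
// be reading from the pointer it was handed for the previous frame.
class DoubleBuffer {
public:
    explicit DoubleBuffer(size_t frameBytes)
        : mBuffers{std::vector<uint8_t>(frameBytes),
                   std::vector<uint8_t>(frameBytes)} {}

    // The decoder fills this buffer with the next frame.
    uint8_t* WriteTarget() { return mBuffers[mWriteIndex].data(); }

    // Swap after decoding: the freshly written buffer becomes the one
    // exposed to Unity, and the old read buffer is recycled for writing.
    const uint8_t* Publish() {
        mWriteIndex ^= 1;
        return mBuffers[mWriteIndex ^ 1].data();
    }

private:
    std::vector<uint8_t> mBuffers[2];
    int mWriteIndex = 0;
};
```

With this, FFmpegLib_DecodeNextVideoFrame would decode into WriteTarget() and return Publish(), so the pointer Unity uploads from is never the one being written; if the gray frames disappear, the overwrite race was indeed the cause.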
Where is needNewFrame declared? Also, I think the frame decoding should maybe be done in a coroutine (started in Start?) that checks whether it's time for a new frame, then decodes and uploads to the texture. Update() runs once before each rendered game frame, which could interfere with the video frame rate and with the timing needed to make 50/60fps video play properly. I seriously wish I could help with example code, but I'm not one for using native plugins... yet.
Any update on this? It would be really nice if somebody could provide a working example. I like the idea of using a C++ lib/DLL to do all the FFmpeg decoding and just blitting the texture to Unity, but what exactly does that look like?
"Would be really nice if somebody could provide an example of this working." Also, has anyone tried an FFmpeg integration on Android?