
Doing 2D animation with 3d package?

Discussion in 'iOS and tvOS' started by hexdump, Jan 17, 2011.

  1. hexdump

    hexdump

    Joined:
    Dec 29, 2008
    Posts:
    443
    Hi!,

I have come across an article talking about how Zombieville was done. I was surprised that the devs said the graphics were done in a 2D package (body parts, weapons, etc...) but the animation was done in a 3D package. Has anybody done this too? I have been looking for some insight on Google without luck. My main concern is whether the animation will look too "robotized".

    Thanks in advance,
    HexDump.
     
    Last edited: Jan 17, 2011
  2. pepworks

    pepworks

    Joined:
    Jan 18, 2010
    Posts:
    19
    Hi!

    Interesting, I have been wondering how they did it, because it looks somewhat "special".
    (Or you can say robotized ;) )

    If you want to go for smooth 2D animations, I would prefer to pre-render them to PNGs
    and use SpriteManager 2 to play the animations. Using SpriteManager 2 is very similar
    to what you do in Adobe Flash with keyframe animations - except you get good performance ;)

    Check out http://www.pepworks.com for the flash demo
    and how it looks 1:1 in Unity3D:
    http://itunes.apple.com/us/app/pep-the-dragon-lite/id396120544?mt=8

    best regards,

    pep
     
  3. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
It really just depends on the style you're going for, and what limitations you want to sign up for. Sprite sheets are simple and direct, and the only limits on how smooth your animations are are (a) how many frames you can make that will fit into memory and (b) your own skill as an artist. That said, making dozens or hundreds of frames of animation can be very time consuming, and prohibitively memory intensive if you have a lot of variety on screen or are developing for retina displays.

Personally, I prefer our cutout style of animation, where we construct a 2D-looking mesh in a 3D package and animate its body parts via translation, rotation, scale, and the occasional texture swap. The reason you won't find much help on Google is that, well... nobody else is really doing it. The benefits are pretty substantial: you can build a large suite of animations from a single piece of art by creatively manipulating its various parts through translation, scale, and rotation alone, saving enormous amounts of time and memory. You end up using far less artwork, and your animations can run at any framerate by virtue of being curve based, allowing you to procedurally change their speed at runtime. Further, you can leverage Unity's animation system, allowing for animation layers, crossfading, and additive animation for things like hit reactions and gun recoils that simply layer on top of whatever else is currently playing (such as a run cycle).

There are downsides, though - it takes a lot of planning to make your base artwork usable for as many animations as you can squeeze out of it. Making a cutout model that can look like it's attacking, running, swimming, jumping, etc... is sometimes very difficult or just plain impossible without completely swapping the texture, which complicates the content creation process a bit. In OMG Pirates, we ended up using about a dozen different base "poses" for the ninja. Still, a dozen different sprites isn't bad when you consider he has several hundred frames of animation. Creating him with a sprite sheet would have been impossible without making his animation much simpler.

    If you're worried about it looking "robotized" because the run cycle in Zombieville is a little... lame, then I don't blame you. I think I cranked that out in like 5 minutes. A better representation of what the style is capable of would be our more recent projects, like OMG Pirates! or our soon-to-be-released RPG, Battleheart. There's video of both on our website, www.mikamobile.com.
     
    Last edited: Jan 17, 2011
  4. hexdump

    hexdump

    Joined:
    Dec 29, 2008
    Posts:
    443
    Hi!,

    Thanks a lot for the information.

    OMG Pirates! looks incredible.

    MikaMobile, I have been reading some posts about how you set up models in Maya. I have read that you were using bones in order to get the model rendered in 1 draw call. Would you continue to use bones today, now that we have a batching system? Or would you just use a hierarchy of quads?

    Another thing: what do you mean by cutout-style? I guess you have your characters split into several parts (arms, head, etc...) and then you assemble the parts to make the character in Maya?


    Thanks in advance,
    HexDump.
     
    Last edited: Jan 17, 2011
  5. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
    I still use bones, primarily due to the shader animation I've been employing in Battleheart. Automatic batching is not "free" cpu-wise, nor is skinning, so there's overhead either way you slice it.

    As far as what I mean by "cutout", you basically just described it as I would have.
     
  6. hexdump

    hexdump

    Joined:
    Dec 29, 2008
    Posts:
    443
    Thanks again,

    By shader animation, what do you mean? :S Are you using vertex shaders on iPhone?

    And one last question (I swear!!!). I'm just a programmer, but I want to test this method a little to see if I can include it in my workflow (I don't have a modeler right now, but I can do things in Max). So, are you using skeleton definitions like the ones for 3D? I mean, do you have bones connected in a hierarchy, or do you just assign a bone to every quad and move them freely? I have been trying to set up a hierarchy and weight the bones so they only affect the quad vertices they are attached to (all set to 1), but I haven't got it right yet :).

    Thanks in advance,
    HexDump.
     
    Last edited: Jan 18, 2011
  7. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
Regarding shader animation: I animate the color values of my characters and effects at runtime, for things like smoke clouds or magical sparklies that have their alpha fade out, or for tinting characters, such as flashing red when taking damage, flashing green when they're healed, pulsing blue when they have a shield, etc. You can see it in the Battleheart trailer. If they weren't made from a single mesh, I'd have to edit their shared material, but then that would affect all instances of that enemy unless I made them create their own version of the shader every time they spawn and assign it to each batched piece, blah blah blah... it's just easier to use skinned meshes for what I'm doing.

Whether your skeleton is in a hierarchy or just free-floating in your 3D app really doesn't matter as long as you skin it properly. I'm not sure how it's done in Max, but in Maya I just set bone weights manually on a per-vertex basis.
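A minimal sketch of that kind of runtime tinting, assuming the character is a single skinned mesh with a shader exposing the standard _Color property (the class and field names here are illustrative, not MikaMobile's actual code):

```csharp
// Flashes a single-mesh character a given color, fading back to normal.
// Because the whole character shares one material instance, one assignment
// tints every body part at once.
using UnityEngine;

public class DamageFlash : MonoBehaviour
{
    public Color flashColor = Color.red;
    public float flashDuration = 0.25f;

    private Renderer characterRenderer;
    private float flashTimer;

    void Start()
    {
        characterRenderer = GetComponentInChildren<SkinnedMeshRenderer>();
    }

    public void Flash()
    {
        flashTimer = flashDuration;
    }

    void Update()
    {
        if (flashTimer <= 0f) return;
        flashTimer -= Time.deltaTime;
        // Fade from the flash color back to white as the timer runs out.
        float t = Mathf.Clamp01(flashTimer / flashDuration);
        characterRenderer.material.color = Color.Lerp(Color.white, flashColor, t);
    }
}
```

Note that accessing .material (rather than .sharedMaterial) creates a per-instance copy, which is exactly why the single-mesh setup keeps this cheap: one copy per character, not one per batched piece.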
     
  8. hexdump

    hexdump

    Joined:
    Dec 29, 2008
    Posts:
    443
    Hi!,

    Yes, this evening I have been playing more deeply with Max and I was able to do something acceptable. Thanks a lot for the advice, you have been really kind; not everybody shares knowledge in such an open way.

    On the other hand, something came to mind while animating my character: how do you handle things that can't be animated through position, rotation, and scale? For example, an enemy that is just a ball and, when it detects you, plays a texture animation (by texture animation I mean a sequence of frames - you know, good old sprite animation).

    One last question (I know I said that before, but... the more I read, the more interested I am). I have read in another post that you are using a bone for texture information. When you say you swap the texture, do you mean you swap to a different texture, or just show another frame of the one you are currently using?

    Thanks again for all the cool tips,
    HexDump.
     
  9. ndj23

    ndj23

    Joined:
    Nov 11, 2010
    Posts:
    19
    My input to this thread is just to say thanks to MikaMobile for openly sharing so much of your development process.

    After playing Zombieville/OMG, I was inspired to give it a go myself (with zero 3D/animation experience). Mostly thanks to the wealth of information in their posts, I've made huge strides, and I actually have my first game in early development now.

    Resolving the "robotized" issue seems to me a matter of mastering this technique, as well as swapping the textures as needed for different poses. Things actually end up looking smoother than almost any sprite-based game, in my opinion.
     
  10. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
    Yeah, I use empty nodes in my scene (such as a single bone with nothing attached to it) to store metadata sometimes, since we don't really have any other means of doing so with the FBX format. In OMG Pirates!, we had a dozen different textures for the ninja for different poses. In Maya, we could have made a custom attribute for swapping the texture, but alas that would not be usable by Unity at all, much the same way that constraints and IK have to be baked down to the bones because they can't be interpreted in Unity. So, knowing that all we could rely on was raw transform information, we set up a relationship where an extra bone floating in the scene somewhere would change the texture based on its X scale. So X scale of 1 = texture #1, X scale of 2 = texture #2, and so on. Then in Unity, we had a script that essentially did the same thing: it checked the transform.localScale.x value of the same bone and changed the renderer.material.mainTexture of the character's mesh in a LateUpdate loop.

    This wasn't a perfect solution, though: if you were crossfading or interpolating from texture 8 to texture 2, for example, it was possible for the textures in between to flicker briefly if you caught a glimpse of an in-between frame. We had to go to some extra lengths to ensure that we never interpolated across garbage frames. If we end up doing a lot of texture swapping in our next project, I think I'll figure out a more graceful solution, but it got the job done for OMG.

    More recently I've been using the same technique to drive alpha values, such as for fading a visual effect over time.
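A minimal sketch of that LateUpdate driver, assuming the textures sit in an array in the order they were keyed on the spare bone (names are illustrative, not the actual OMG Pirates code):

```csharp
// Reads the X scale of a "meta-data bone" keyframed in the 3D package
// and maps it to a texture index: x-scale 1 = textures[0], 2 = textures[1]...
using UnityEngine;

public class TextureSwapDriver : MonoBehaviour
{
    public Transform swapBone;        // the empty bone animated in Maya
    public Texture2D[] textures;      // texture #1 at index 0, and so on
    public Renderer characterRenderer;

    private int currentIndex = -1;

    void LateUpdate()
    {
        // Rounding (plus clamping) guards against interpolated
        // in-between values during crossfades.
        int index = Mathf.Clamp(
            Mathf.RoundToInt(swapBone.localScale.x) - 1, 0, textures.Length - 1);

        if (index != currentIndex)
        {
            currentIndex = index;
            characterRenderer.material.mainTexture = textures[index];
        }
    }
}
```

The rounding only hides small interpolation errors; as the post notes, crossfading from index 8 to index 2 can still sweep through the indices in between, which needs handling at the animation level.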
     
  11. MetaMythril

    MetaMythril

    Joined:
    May 5, 2010
    Posts:
    150
    First off, I want to say congratulations for producing some quality games! I've got all three of them and they are a treat! Loved the "Powered by Rum" technology in OMG Pirates! XD

I'm still a bit new to Unity and I'm more of a programmer (my brother is the artist, I'll leave most of that to him). I hate prying, but as you said, "nobody else is really doing it", and I'm still having trouble understanding how you assemble your mesh in Unity.

    Does the model make it into Unity as a single mesh or do you "assemble" your cutout mesh together in Unity?
    Someone on UnityAnswers described the method as "animated planes", I don't think that is accurate, or is it?
    Would you consider doing a basic tutorial for the community?

I understand you may be busy and time is money in the indie dev world. I just think the positives you've outlined for this method would help a lot of Unity developers who aspire to create quality 2D games more efficiently than with complicated sprite sheets, which severely limit the complexity of animation and the resources available on mobile devices. I would honestly like to put forward money if need be for the opportunity to see such a method made available for the Unity community to digest. Please PM me if you are interested in the monetary offer, and thanks again for the great support and love you have already shown the Unity community. I wish you continued success!
     
    Last edited: Feb 12, 2011
  12. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
The model is a single mesh before it comes into Unity - merged in Maya or equivalent software, and skinned to a hierarchy of bones so that the different parts can still be moved independently of each other.

    I've been thinking about putting together a demo scene of our "cutout" technique and making it available on the Unity Asset Store, since it seems like there's interest.
     
  13. ndj23

    ndj23

    Joined:
    Nov 11, 2010
    Posts:
    19
    As I stated earlier in this thread, I had no experience in 3D and very little in Maya. Yet through reading all of the interviews and posts on this technique, I have been able to figure it out, and create my own models. It took a couple of months of reading and trying stuff, but it's not too difficult to get it down. This coming from someone who didn't know what a mesh, bone, or plane was when he started.

    While a demo might be nice (I'm sure I'd learn from it as well), don't just wait around for a step by step on how to do it. MikaMobile has posted pretty extensively on the topic, and there is a thread where someone created an example with a billiard table using this technique. I'll find the thread and add it to this post.



    http://forum.unity3d.com/threads/22462-Suggestions-for-optimizing-draw-calls-in-billiard-game?highlight=billiard
     
    Last edited: Feb 12, 2011
  14. amacgregor

    amacgregor

    Joined:
    Jan 6, 2011
    Posts:
    4
    MikaMobile,

    That would be amazing, I'm really interested in learning more about the 'cut-out' technique.
     
  15. MetaMythril

    MetaMythril

    Joined:
    May 5, 2010
    Posts:
    150
    @MikaMobile Thanks for the response! Sign me up for being interested in a demo on the Asset Store.

    @ndj23 That is an awesome thread, thanks for linking it. MikaMobile's comments in that thread definitely got me thinking in the right direction.

    I'll look at that billiard table demo as well.
     
  16. bigdaddio

    bigdaddio

    Joined:
    May 18, 2009
    Posts:
    220
    Hmm, I thought the animations in Zombieville were really charming and fun, as opposed to overly produced like something from an AAA house. In no way would I have changed them. As a matter of fact, if you look at the 35-game pack, there is a substandard knock-off with a guy shooting dinosaurs. They even tried to mimic the animation style.

    I am going to have to try messing about more with creating 2D animations in 3D. Personally, I'd love to see a tutorial of some sort.

    BTW my grandson loves to sit with me and play Battleheart, his dad (my son) had to pick it up as well. Oh and he's two. Keep up the great work.
     
  17. Ferazel

    Ferazel

    Joined:
    Apr 18, 2010
    Posts:
    517
    I just wanted to say that I really would like to get my hands on that MikaMobile example tutorial (even willing to pay for it!). I'm just really impressed with the graphics that you are able to get out of the device. While browsing the forums I can get a general idea of the technique that you are using, I'd really love to see a working example of it!
     
  18. Daara

    Daara

    Joined:
    Apr 7, 2011
    Posts:
    3
I'm also interested in this demo. As a student, it would be a great way to learn something new that no one else is really doing.
     
  19. swiv

    swiv

    Joined:
    Mar 22, 2009
    Posts:
    30
    Are you still thinking of putting something up on the Asset Store MikaMobile?
     
  20. oquendo

    oquendo

    Joined:
    Jan 26, 2011
    Posts:
    8
    MikaMobile,

    First, Thanks for your open attitude and the tons of help you're providing.

    Now, I'm trying to implement your technique and I'm not too seasoned a dev yet. In fact, I'm teaching myself Maya in the process of putting this thing together, so excuse me if my question is trivial or silly. Anyway, here goes:

You say you're animating by means of creating quads and then combining them into a mesh. No problem there. But then you say you attach bones to every quad in order to animate them. Do you by any chance mean you attach joints? Because as far as I can find out in the Maya Help, bones are just visual cues for the connections between joints. Is that correct?

    Also, when you say you "set bone weights manually on a per vertex basis", could you expand a bit on that? Are those the skin weights you modify with the Component Editor? What would be the use of that weighting? I mean, wouldn't you achieve the same result by just parenting the quads to the skeleton?

    Finally, if all the quads are combined into one mesh, does that mean they all share the same texture? Do you then use just one image file with all possible poses for every body part?
     
    Last edited: Apr 9, 2011
  21. oquendo

    oquendo

    Joined:
    Jan 26, 2011
    Posts:
    8
    If MikaMobile is not available, I would appreciate anyone's help, really.

    I'm sort of stuck at this point (please see post above).
     
  22. Krobill

    Krobill

    Joined:
    Nov 10, 2008
    Posts:
    282
    I don't know about MikaMobile's exact technique or anything specific to Maya, since I use Max, but:

    - I think when he says he "set bone weights manually on a per vertex basis", he's talking about skin weights. All the quads are merged into a single mesh and skinned to a 'simple' skeleton. Each quad has its 4 vertices linked to one bone through skinning. I don't know how it works in Maya, but in Max you can assign skin weights either with envelopes or manually. Typically with skinning, each vertex is influenced by several bones (with different weights) to allow smooth mesh deformations, but in this particular case you assign only one bone per vertex.

    - You are right, you could definitely achieve the same result with multiple independent quads parented to the skeleton. If those quads used the same material, they could be dynamically batched and you would end up with one draw call as well. In fact, it would then be possible to batch multiple characters, which is impossible with skinned meshes.

    - If all the quads are combined into one mesh, it doesn't necessarily mean they use the same material and texture. Submeshes allow you to assign multiple materials to different parts of a mesh, but at a cost. It is highly recommended that all your quads share the same material, and hence the same texture. In fact, you should pack as much as possible onto a few texture atlases to minimize the total number of materials.
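As a rough illustration of the rigid skinning in the first point (every vertex of a quad weighted fully to a single bone), here is how those weights could be assigned in Unity script rather than a 3D package. This is a sketch under the assumption that the mesh's vertices are stored four per quad; it's not from any shipped project:

```csharp
// Gives quad i full influence from bone i: each of its 4 vertices gets a
// single bone weight of 1.0, so the bone moves the quad rigidly with no
// deformation - the "cutout" look.
using UnityEngine;

public static class CutoutSkinning
{
    public static void WeightQuadsToBones(Mesh mesh, int quadCount)
    {
        var weights = new BoneWeight[quadCount * 4];
        for (int quad = 0; quad < quadCount; quad++)
        {
            for (int v = 0; v < 4; v++)
            {
                weights[quad * 4 + v].boneIndex0 = quad; // one bone per quad
                weights[quad * 4 + v].weight0 = 1f;      // full influence
            }
        }
        mesh.boneWeights = weights;
    }
}
```

Leaving weight1..weight3 at zero is what distinguishes this from ordinary smooth skinning, where several bones share each vertex.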
     
  23. oquendo

    oquendo

    Joined:
    Jan 26, 2011
    Posts:
    8
    Thank you, your post set me on the right track. In the end, I decided to go MikaMobile's way, combining everything into one mesh, if only because he has used it with great results. I'll make sure to try the parenting method in the future, though.

The only problem I'm encountering right now is that I don't know how to arrange the quads so that the right ones appear on top. I have them all in the same plane, so of course I could move them back and forth, but I don't know if that's the idea.

    By the way, 'Transcripted' looks great.
     
  24. oquendo

    oquendo

    Joined:
    Jan 26, 2011
    Posts:
    8
    I just noticed something weird (to me, anyway). If I move the top quad backwards, its texture still shows on top of the other quads.

    Does anyone know the reason for this? How can I arrange the quads, preferably still leaving them all in the same plane?
     
  25. Krobill

    Krobill

    Joined:
    Nov 10, 2008
    Posts:
    282
    If you are using an alpha blended material, it's perfectly normal. Without going into details, alpha blended polygons need to be rendered from back to front to display properly.

    Unity3D internally sorts alpha blended objects based on the depth of their pivots. Distinct objects are rendered from back to front. But within a single mesh, Unity does not sort the individual polygons - the cost would be too high. So polygons are rendered in the order they are sent to the GPU, which is... hard to tell. I'm not sure exactly how it works, but I guess polygons are rendered in the order of the triangles array, which might be the order of creation of the polygons in your 3D package... or not.

    You must either find a way to control, or at least predict, the rendering order of your polygons... or use an alpha-testing based shader (or use multiple quad objects and let Unity handle the sorting, which is how we do it ^^ )

    and thanks for Transcripted ^^
     
  26. oquendo

    oquendo

    Joined:
    Jan 26, 2011
    Posts:
    8
    OK, and if I use several quads, how do I make sure they'll display in the correct order?
     
  27. Krobill

    Krobill

    Joined:
    Nov 10, 2008
    Posts:
    282
If those quads are separate objects, then like I said, Unity will render them in the correct order (provided the pivots of the quads lie on the quads themselves, not at a different depth relative to the camera).
     
  28. oquendo

    oquendo

    Joined:
    Jan 26, 2011
    Posts:
    8
I know you said that, but I'm not sure what determines the right order in Maya either, so that they're rendered properly when I export to Unity... :¬ <
     
  29. goodhustle

    goodhustle

    Joined:
    Jun 4, 2009
    Posts:
    310
    Wow, thanks for the technical tip! I've been meaning to ask you for a long time now whether you had a pipeline for texture swaps to get into Unity, and here it is. :) That's a smart way to do things for sure. Do you use MEL scripting to actually design the texture swapping in Maya and keep the extra empty nodes in sync with the texture indexes?
     
  30. ndj23

    ndj23

    Joined:
    Nov 11, 2010
    Posts:
    19
    I've experimented with this technique for a while now.

    I came across the same issue oquendo had, with quads that were physically behind others showing on top of the front quads. When importing the FBX from Maya into Unity, though, everything shows properly.

    As a precautionary note, I'm pretty inexperienced and really self-taught (I studied MikaMobile's and others' posts on these forums). What I did was reverse the normals on any quads that didn't "move to the back" after I put them behind another quad. That way it looks right while I animate. Then, before exporting, I fix the normals on any quads I reversed (as I said, Unity displays them as I intended, based on depth).

    This could be an entirely incorrect way to solve that problem, but it works for me.
     
  31. Daara

    Daara

    Joined:
    Apr 7, 2011
    Posts:
    3
I've been experimenting with it for a few days now and think I have the general idea of how it was done. But like others, I had problems once I combined the parts into a single mesh: they would improperly draw on top of other pieces even when they were located behind them. Would the rest of you care to share the steps you took in your reproduction of this effect? I have yet to get it imported into Unity properly, so I feel like I've done something wrong.

    1. Drew up my character in Photoshop.
2. Cut my character into arms, legs, chest, head, and waist, then saved each part for web and devices as a PNG with transparency.
    3. Create a polygon plane for each body part and rotate that plane 90 degrees.
    4. Add a lambert to each poly plane, change the lambert color to img, and import each body part as the img.
    5. Arrange each plane to assemble my character.
    6. Create joints for each plane and smooth bind planes onto the joints.
    7. Create animations by moving joints etc.
    8. Merge all planes into one mesh.
    9. Export/Import into Unity (however I have yet to do this properly)
     
  32. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I'm not sure how to display different frames at different parts of the animation. Are they simply hidden quads parented with the others but scaled to 0?
     
  33. oquendo

    oquendo

    Joined:
    Jan 26, 2011
    Posts:
    8
@ndj23 - When you say that in Unity quads are displayed based on depth, do you mean you arrange the quads in Maya along an axis? (As in, the quad you want shown on top is at z=0, then the following at z=-1, and so on?)

    Is there no way of doing it keeping all quads in the same plane?

    @Daara -

    I saved it as a PSD file. I guess it's not the best option overhead-wise, but I'm not concerned about that at the moment.
    Otherwise pretty much the same, only I changed to a PSD file.

    I first combined everything into a mesh, then assigned weights to the vertices using the Paint Skin Weights Tool plus the Component Editor (to set every unwanted weight to zero), then animated.

    That didn't work out, though (I had the problem with transparencies explained above). So now I'm trying to achieve the same result by simply creating a skeleton and parenting every quad to one or two joints, then animating, as advised by Krobill above.

    I hope it helps.
     
    Last edited: Apr 12, 2011
  34. Toad

    Toad

    Joined:
    Aug 14, 2010
    Posts:
    298
    Just wondered if you were going ahead with this as I'm definitely another potential buyer. :)
     
  35. Daara

    Daara

    Joined:
    Apr 7, 2011
    Posts:
    3
    If anyone could post a tutorial that would be great. I seem to be doing something wrong :(
     
  36. linusnorden

    linusnorden

    Joined:
    Sep 27, 2010
    Posts:
    123
Please do, I'll be waiting :)


     
  37. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
    I've solved the depth sorting issue in two different ways over the years.

    In Zombieville and OMG Pirates, I used cutout shaders in Unity. These have the benefit of sorting properly based on depth, on a per-polygon basis, no matter how the mesh is constructed. This is the simplest approach and will result in everything rendering in the right order relative to the camera. However, cutout shaders are quite a lot slower than alpha blended shaders, and they have the downside of producing a hard, pixelated edge on the border of your transparent elements (since pixels are either 100% opaque or 100% transparent, with no feathering in between).

    While making Battleheart, though, I discovered that if I merge the various quads of my character models in back-to-front order in Maya, the resulting mesh's rendering order is always the same, even with an alpha blended shader. So, for example, let's say you have a character made of three parts, and you want part #1 to always render in front of #2, which always renders in front of #3. If you select the individual quads in the order 3 > 2 > 1 and then combine the meshes (in Maya this is under Polygons > Mesh > Combine), the resulting mesh will always render in the desired order. The benefit is that you have very direct control over how the mesh will render, and you can freely use alpha blended shaders, which are faster and look better. The downside is, of course, that the order is static and not re-evaluated if one part or another needs to change its sorting order on the fly.

    Edit: Oh yeah, and the reason I use skinned meshes rather than relying on batching for my characters is that I like to animate scale, which unfortunately makes objects with differing scales unable to batch. If you don't animate scale, though, it's probably fine to just use a bunch of free-floating animated quads.
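The back-to-front combine can also be reproduced inside Unity with Mesh.CombineMeshes, which appends triangles in the order of the array you pass it. A hedged sketch (the helper and parameter names are illustrative, and this assumes all parts share one material):

```csharp
// Merges cutout quads into one mesh in painter's order: index 0 of the
// input array renders first (furthest back), the last index renders on top.
using UnityEngine;

public static class CutoutCombiner
{
    public static Mesh Combine(MeshFilter[] partsBackToFront)
    {
        var combine = new CombineInstance[partsBackToFront.Length];
        for (int i = 0; i < partsBackToFront.Length; i++)
        {
            combine[i].mesh = partsBackToFront[i].sharedMesh;
            combine[i].transform = partsBackToFront[i].transform.localToWorldMatrix;
        }

        var merged = new Mesh();
        merged.CombineMeshes(combine); // triangles keep the array order
        return merged;
    }
}
```

As with the Maya approach, the order is baked in: if a part ever needs to sort differently at runtime, this static merge can't re-evaluate it.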
     
    Last edited: May 8, 2011
  38. Halordain

    Halordain

    Joined:
    May 24, 2011
    Posts:
    1
    Can someone please expand on what it means to "skin" a "skeleton"? I got to the point of texturing 2D images onto each of my polygon planes, but then you lost me at the point of "joints," "weights," "skinning," ... This sounds like something done interactively, but Maya isn't exactly an intuitive click-and-drag interface. Which commands/features did you use to accomplish the animation step?

    Thank you very much for all of your help.
     
  39. rogerimp

    rogerimp

    Joined:
    Jun 21, 2010
    Posts:
    17
Hey, just discovered this thread. I have been working on an app using this technique (animating 2D illustrated characters in Cinema 4D), and using null objects to drive animations that don't cross over in FBX, like texture swaps, is quite an interesting workaround. I was pretty much convinced I'd have to animate these with script, or perhaps inside Unity. I've also been struggling with proper z-sorting: I've placed all the different pieces of my model ever so slightly in front of or behind each other along Z, and they're all flat, but I still get swapping depending on camera position, grr! I'm very much inspired by your idea of doing everything in one mesh, Mika, and I'm going to try this out too.

    I can't thank you enough for sharing your sizable experience with the community. It's sharing and active discussion like this that make the Unity community one of the best reasons to use Unity :)
     
  40. rogerimp

    rogerimp

    Joined:
    Jun 21, 2010
    Posts:
    17
Hey MikaMobile, as I try to implement your idea from Battleheart, I came upon this question: how would you animate texture swaps in this system? Let's say I have one discontinuous mesh for the entire body of my character, and I've mapped each piece to its proper image in one large texture atlas using the UV editing tools in my 3D package. If I'm going to swap out the :) mouth for a :-O mouth, I would need to change the UV mapping for the mouth part of my character mesh. I don't think I can get an animated change in UV mapping to make it through FBX into Unity... And if the whole thing is one single mesh, I can't very well swap out the texture on a material, or shift the material's UV map, without affecting the rest of the body.

    I'm really interested in trying out your single-mesh approach as it sounds awesome for reducing draw calls and solving all my z-sorting issues at the same time, plus making it pretty easy to atlas my character textures. So how the heck did you manage to swap textures with only one mesh?

    As always, thanks so much for sharing your knowledge :)
     
  41. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
    For something simple like a facial expression change, I'd just use a little polygon that covers the part you want to change. This is how I did the characters' blinking in Battleheart: I have a version of their face that just covers up the normal eyes with closed ones, and that poly is bound to a bone which is scaled down to zero until it's needed.

    In OMG Pirates, I made the polygons for each body part a little oversized, to accommodate different poses. So there's a portion of the texture atlas where the ninja's head goes, his left leg, his right leg, etc., and I tried to make each region big enough to handle whatever pose I'd eventually need to fit in there for each limb. When I swap to an entirely different texture, the same body parts occupy the same regions.

    In Zombieville 2, I've actually sometimes been doing a complete model swap, for things like enemies dying or being set on fire - they have multiple versions of themselves, all under one root node, and I just activate/deactivate their renderers as needed.

    There's lots of ways to accomplish these things, and what works for you might vary from project to project.
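The whole-model swap mentioned for Zombieville 2 could look something like this sketch (the class and field names are illustrative; the actual implementation isn't shown in this thread):

```csharp
// Keeps several versions of a character (e.g. normal, dying, on fire)
// under one root, and enables the renderers of only one version at a time.
using UnityEngine;

public class ModelSwapper : MonoBehaviour
{
    public GameObject[] variants; // the alternate models under the root node

    public void Show(int index)
    {
        for (int i = 0; i < variants.Length; i++)
        {
            // 'true' includes renderers on currently inactive children.
            foreach (var r in variants[i].GetComponentsInChildren<Renderer>(true))
                r.enabled = (i == index);
        }
    }
}
```

Toggling renderers rather than destroying and instantiating objects keeps the swap cheap and preserves each variant's animation state.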
     
    Last edited: Jul 14, 2011
  42. rogerimp

    rogerimp

    Joined:
    Jun 21, 2010
    Posts:
    17
    Okay, so you really couldn't do per-polygon texture/uv swaps in Battleheart where you have the entire model as one mesh, so you float little quads to cover-up underlying texture, presumably with a Transparent Cutout shader that you discovered will respect z-sorting better, as I understand. Cool! So far in my (somewhat naive) approach I've been floating everything above an empty face: my eyebrows, blink eyes, mouth shapes.

    But as you say, when you map one texture atlas to a single joined mesh like in Battleheart, you could easily swap out the entire atlas, even if some parts of it don't change. With a multi-mesh model like I've been using, I could just animate the UV offset of a single mesh to replace its texture with a different one from the same atlas, and indeed I'm trying this out right now with mouth and eye shapes, which the app should jump between rather than animate smoothly like eyebrow and pupil position. I think I understand correctly. I'm still going to give your single-mesh approach a go, even if I have to float just two quads on top of my character's face... it's definitely an improvement over trying to manage 13 independent depths.

    I'm just now trying to figure out how to tackle facial animation and as you say, I'm sure it's a matter of deciding what technique works best for me, so thanks for helping me puzzle through my options, and indeed sharing some really interesting ones I hadn't considered.

    (By the way, love your games! I have wasted many many hours on Zombieville USA.)
     
  43. spentak

    spentak

    Joined:
    Feb 11, 2010
    Posts:
    246
    This might still be relevant. My game Word Warrior http://itunes.apple.com/us/app/word-warrior/id400289406?mt=8 used a very similar method to MikaMobile's. We rigged a character full of planes and animated the skeleton; each plane was a body part. In our game you have an inventory of many objects that you can swap through (change weapon, helmet, etc.), so we mapped each plane to the first object in a sprite sheet (the top left). Every body part also had to be created within a bounding box, so the art didn't get cut off by the mesh. Then in Unity we simply moved the UV offset to change armor/graphics.

    Sprite Manager has a way of implementing this too, but we found doing it our way was the simplest. Though we had large sprite sheets on screen, our performance didn't really suffer because it was only 1 draw call.

    As for depth, on the characters we just set the z position of each plane a unit or two in front of or behind the others so everything looked the way it was supposed to. In a 2D game, "depth" is really a deception. Your character might not even be on the same z axis as the enemy when he shoots his gun to hit him, but it doesn't matter because the player can't tell the difference.
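    The UV-offset swap described above boils down to a little arithmetic: each cell of the sprite sheet maps to a normalized offset you assign to the material (in Unity, something like renderer.material.mainTextureOffset). Here's a rough sketch of that math, assuming an evenly divided cols x rows atlas with the first object at the top left (the function name is mine):

```python
def atlas_uv_offset(col, row, cols, rows):
    """Normalized UV offset for cell (col, row) in an evenly divided atlas.

    (0, 0) is the top-left cell, matching 'the first object in a sprite
    sheet'; UVs run bottom-to-top, so the row index is flipped."""
    return (col / cols, 1.0 - (row + 1) / rows)

# Swapping a plane from the first item to the one beside it in a 4x4 sheet:
atlas_uv_offset(0, 0, 4, 4)  # -> (0.0, 0.75)
atlas_uv_offset(1, 0, 4, 4)  # -> (0.25, 0.75)
```

    With the material's texture scale set to (1/cols, 1/rows), sliding this offset around swaps the art on a plane without touching the mesh or adding a draw call.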
     
  44. MetaMythril

    MetaMythril

    Joined:
    May 5, 2010
    Posts:
    150
    I think I'm starting to better understand this method now.

    Quick question: what do you scale your models to in Maya so they show up with minimal tweaking in Unity using an orthographic camera setup? Currently I'm having to scale everything up massively (to about 2k) just to get it to look right on screen, but if I scale it up in Maya I can't zoom out much before the camera clips it out of the scene. I'm sure I could just adjust the camera's clipping distance in Maya, but I wanted to make sure I was doing things right. I'm still a bit of a noob to Maya, but I'm getting the hang of it.
     
  45. LimeSplice

    LimeSplice

    Joined:
    Jul 15, 2011
    Posts:
    111
    Did you do this animation inside Unity or Maya (other 3D app)?
     
  46. spentak

    spentak

    Joined:
    Feb 11, 2010
    Posts:
    246
    The animations were done in Maya, on the rig with all the planes on it.
     
  47. Deleted User

    Deleted User

    Guest

    We are releasing Umotion 2D for Unity3D, which uses a similar 3D approach.

    Umotion 2D is coded in BlitzMax with the MiniB3D module, and it exports the 3D animation data along with an automatically generated texture atlas optimized for 2D projects in Unity (you can export to .3DS too).

    Umotion 2D is a sort of 3D tool like Maya, but limited and highly optimized for exporting 2D animation to Unity3D:

    http://forum.unity3d.com/threads/90341-Umotion-2D-Easily-Create-2D-Animation-for-Unity
     
    Last edited by a moderator: Aug 14, 2011
  48. rogerimp

    rogerimp

    Joined:
    Jun 21, 2010
    Posts:
    17
    Hey again all. I've finally gotten around to building my new set of characters using MikaMobile's approach, connecting all objects into a single mesh. I did a very simple test with three quads and it worked beautifully! All pieces of the mesh were placed at z=0, yet they rendered in Unity in the same order the polygons were modeled, bottom-to-top. This held up with soft alpha blending (the Unlit/Transparent shader) and no matter my camera rotation, without flickering. YAY!

    That's the good news. The bad news is that, though my simple test worked, my actual model (195 polys) did not. In areas that overlap, a number of polys that should be below others appear on top. I've spent the last day or two proving beyond a doubt that both in Cinema 4D and in the FBX it exported, the polygons are stored in the correct order. I proved this to my satisfaction by viewing the polygon data in C4D's Structure panel and its exported CSV data, and by comparing that with the polygon list I dug out of the exported FBX file using Autodesk's FBX Converter (http://usa.autodesk.com/adsk/servlet/pc/index?id=6837478&siteID=123112).

    I wrote a quick Unity script to export the vertices and triangle lists in the same CSV format for easy comparison. It handily shows that my character's mesh, straight from the FBX prefab, has a different polygon ordering inside Unity (though the vertices are in the same order). What the truck?! What do you think I should do? MikaMobile, did you run into this at all?
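    A quick way to pin down exactly which triangles moved is to diff the two index lists. Here's a rough sketch of that comparison (plain Python rather than a Unity script, the function name is mine, and it assumes no duplicate triangles): it keys each triangle on its sorted vertex triple, so a change of winding or starting vertex doesn't count as a move, and it reports only true reorderings.

```python
def triangle_order_diff(authored, imported):
    """Each argument is a flat index list (3 ints per triangle), as dumped
    from an FBX or from Unity's Mesh.triangles. Returns a list of
    (authored_position, imported_position) pairs for triangles whose
    position in the buffer changed."""
    def tris(flat):
        return [tuple(sorted(flat[i:i + 3])) for i in range(0, len(flat), 3)]
    a, b = tris(authored), tris(imported)
    pos_b = {t: i for i, t in enumerate(b)}
    return [(i, pos_b[t]) for i, t in enumerate(a) if pos_b.get(t) != i]

# Two quads (four tris total would be similar); the importer swapped them:
triangle_order_diff([0, 1, 2, 2, 3, 0], [2, 3, 0, 0, 1, 2])  # -> [(0, 1), (1, 0)]
```

    Feeding it the CSV dumps from both sides should show at a glance whether Unity merely rotated indices within triangles (harmless) or actually shuffled draw order (the z-sorting killer).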

    Thanks again for the awesome discussion, everyone. +1billion
     
  49. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    845
    Never ran into this, though I'm using Maya rather than Cinema 4D. In my experience, selecting a series of individual quads back to front and then merging them always produces the same predictable sorting order in Maya and in-game, regardless of poly count. I've tried this with some pretty huge meshes; not 195 individual quads, but close to it. Not sure what's up in your case.
     
  50. rogerimp

    rogerimp

    Joined:
    Jun 21, 2010
    Posts:
    17
    Argh, I've heard some people say that Unity optimizes meshes into tri-strips upon import, whether you like it or not. Some of my tris are a bit out of order inside the FBX, so I'm currently trying to solve this by stripping the tris in C4D in a way that still preserves overall body-part ordering before they hit Unity, hoping Unity will leave them alone. So far, though, Unity still likes to rearrange things, so it's a bit of trial and error right now. Ouch.

    MikaMobile, from what I've seen, your style does more scaling and translation of individual body parts that could be simple quads, so maybe your max run of contiguous tris is 2 (a quad) and Unity doesn't mess with things as much? Just a guess. I'm trying to do, like, bendy skeletons where the body parts have a couple dozen tris weighted smoothly between different joints, and this kind of jibes with what I'm seeing:

    In these screencaps, the polys are in the correct order in my 3D package, and I've even triangle-stripped them by hand. But in Unity, you'll see the right side of the leg is moved out of order. Interestingly, it looks like the order of contiguous tris has been respected, though.

    Anyway, my current plan is to write a few scripts to re-strip my model inside C4D in several different ways, in the hope that one of them will be exactly what Unity would have done by itself anyway... :) Perhaps I'll also write a script to reorganize my model inside C4D the way Unity mangled it, to try to understand its method.

    If I'm still unable to fool Unity into respecting my poly order, I could try setting it manually by messing with Mesh.triangles at runtime. I hope I've still got a trick or two up my sleeve.
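    That runtime fallback can be sketched in a few lines. This is not Unity code, just plain Python showing the idea (all names are mine): read the imported index buffer, sort its triangles back into the authored back-to-front order, and write the result back (in Unity, that last step would be assigning the flat list back to mesh.triangles).

```python
def restore_triangle_order(imported, authored_rank):
    """imported: flat index list (3 ints per triangle), as read at runtime;
    authored_rank: dict mapping each sorted vertex triple to its intended
    paint order (0 = drawn first, i.e. bottom-most). Returns a flat list
    with the triangles re-sorted back-to-front."""
    tris = [imported[i:i + 3] for i in range(0, len(imported), 3)]
    tris.sort(key=lambda t: authored_rank[tuple(sorted(t))])
    return [i for t in tris for i in t]

# Unity swapped two quads' tris; the authored ranking puts them back:
restore_triangle_order([2, 3, 0, 0, 1, 2], {(0, 1, 2): 0, (0, 2, 3): 1})
# -> [0, 1, 2, 2, 3, 0]
```

    The authored ranking could be exported once from C4D alongside the model, so the fix-up runs once at load time rather than per frame.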

    Definitely poke me if you have any other ideas. And dude, Mika, THANK YOU!