
Volumetric Smoke Shader

Discussion in 'Works In Progress - Archive' started by jack-riddell, Sep 30, 2015.

  1. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Redid the normals to make them a bit harder. Because I am using normal maps, the particles look a lot more detailed, so you need fewer of them to get the same effect. This effect has an emit rate of 5, which is quite low compared to the average effect you will get off the Asset Store, which is usually set to 25+ particles per second.


    Target effect. There are a lot of small things I am having a hard time replicating, like the soft fluffy effect or the deep dark cracks, but I think I am getting closer.


    The one thing I still don't like is the outline of the particle; it's too knobby for the shape of the smoke flow. It needs to be more like this:
     
    Seneral, Martin_H, yc960 and 2 others like this.
  2. Obsurveyor

    Obsurveyor

    Joined:
    Nov 22, 2012
    Posts:
    277
    Are you rendering with a real skybox or IBL? You're never going to get it with just an all-white ambient light. The tops of the black smoke are lit by directional light from the sky; that's why you get the white tops and the blackish-blue (sea reflecting light) sides/bottom.
     
  3. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Not sure I can afford IBL; this is a particle shader after all. So far it's all being done with a single texture lookup, a couple of dot products in the pixel shader, and some light math. I am trying to target mobile devices so I can't go too crazy. In the future I might add in the Light Probe Proxy Volume stuff from the latest Unity update, but I'm not 100% sure on its performance/implementation.
     
    Martin_H likes this.
  4. P4p3Rc1iP

    P4p3Rc1iP

    Joined:
    Feb 17, 2015
    Posts:
    13
    Hey first of all I want to say that I love your shader and am really looking forward to the next version!

    But I also have a problem/question/request. I'm using the RedLights2 plugin (https://www.assetstore.unity3d.com/en/#!/content/29575) and I have some trouble getting the light to work properly in your shader. The light is always applied fully to the particle effect, regardless of distance, location, or direction. Their documentation (http://data.redplant.de/pbal/redLights2.0_Manual.pdf pages 8-9) says I need to include a preprocessor define in custom shaders, somewhere before the shader's surface function.

    Now I don't understand much of shader code, but I presume the problem is that there is no surface function in your shader?

    Could you by any chance help with this, or point me in the right direction?

    Cheers
     
  5. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    I had a look on page 8 but couldn't see the info about preprocessors; I think they might have updated the docs?
    You will have trouble integrating third-party lighting plugins with my particle asset, as it will require reprogramming a lot of the shader code, and because it's a particle shader instead of a standard surface shader, its format is different from standard Unity shaders.
    In the shader I first calculate direction and intensity in the vertex shader, then pass that data to the frag shader, which combines it with the normal map to generate the final pixel colour. If you want to integrate RedLights2, you will need to calculate the intensity and direction of the red lights in the vertex shader and patch those results in with the other lights; then my shader should do the rest. Code that has "unity_LightPosition" or "unity_LightAtten" is where you need to start.
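    As a rough illustration of that split, here is the same idea in plain Python standing in for shader code. The function names and the linear attenuation formula are illustrative assumptions, not the asset's actual implementation:

    ```python
    import math

    # Sketch of per-vertex light calculation feeding a per-pixel combine.
    # All names and the attenuation model are illustrative, not the asset's code.

    def vertex_stage(vertex_pos, light_pos, light_range):
        # Per-vertex: direction to the light and a distance-based intensity,
        # analogous to what unity_LightPosition / unity_LightAtten feed.
        to_light = tuple(l - p for l, p in zip(light_pos, vertex_pos))
        dist = math.sqrt(sum(c * c for c in to_light))
        direction = tuple(c / dist for c in to_light)
        intensity = max(0.0, 1.0 - dist / light_range)  # crude linear falloff
        return direction, intensity

    def fragment_stage(light_dir, intensity, normal_from_map):
        # Per-pixel: combine interpolated light data with the normal map sample.
        length = math.sqrt(sum(c * c for c in normal_from_map))
        n = tuple(c / length for c in normal_from_map)
        n_dot_l = max(0.0, sum(a * b for a, b in zip(n, light_dir)))
        return n_dot_l * intensity

    # Light 2 units above a vertex, range 10: intensity 0.8, fully lit normal.
    d, i = vertex_stage((0.0, 0.0, 0.0), (0.0, 2.0, 0.0), 10.0)
    lit = fragment_stage(d, i, (0.0, 1.0, 0.0))  # 0.8
    ```

    A RedLights2 integration would compute its own `direction, intensity` pairs in the vertex stage and add them alongside the Unity lights before the fragment combine.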

    My suggestion would be to try working around this issue by putting Unity lights into a separate layer that only affects the particle effects, and fudging the RedLights.

    Sorry I couldn't be more help, but integrating a third-party lighting solution with a third-party rendering solution, when neither was built to work with the other, is an incredibly complex problem.
     
    Martin_H likes this.
  6. P4p3Rc1iP

    P4p3Rc1iP

    Joined:
    Feb 17, 2015
    Posts:
    13
    Ah yeah, it seems the documentation on their site is a little outdated compared to what's supplied with the plugin.

    But I figured it would be unlikely you could magically make it work. :)

    I already tried the workaround with putting the lights in different layers and that seems to do fine for most stuff, so it's not really a big problem and more a "would like to have".

    Thanks for taking a look though!
     
    Martin_H and jack-riddell like this.
  7. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Completed work on a new particle effect. My plan is to release with a big fire particle effect, a small fire effect, a couple of explosions, and hopefully a few non-fire smoke effects. I might also render out a few variants of each, with things like thicker or thinner smoke, or with and without sparks.


    Big Fire


    Medium Fire


    Small Fire


    If you can think up any other ideas for stuff to include in the release, please tell me.

    Also, how important is a Web Player demo to people? If you see a YouTube/Vimeo video, is that enough, or do you need to see it in real time?
     
    John-G likes this.
  8. yc960

    yc960

    Joined:
    Apr 30, 2015
    Posts:
    228
    For particles YouTube is usually OK. If you do make a demo I would go for a standalone download demo; the Web Player won't work on Chrome.
     
  9. John-G

    John-G

    Joined:
    Mar 21, 2013
    Posts:
    1,122
    Will they be part of the current system, or a separate asset? Looking great indeed.
     
  10. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    They will initially only be part of the mobile version of the asset that I am planning on releasing soon, but once the mobile version is done I will be adding them to the standard version, along with a bunch of other updates, for free.
     
    yc960, John-G, Martin_H and 1 other person like this.
  11. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Re-rendered some of the particle effects for the big explosion with a slightly different smoke sim and normal framework.
    This particle effect needs next to zero particles to work; what I'm showing here is emitting at a rate of 1.5. The only restricting factor is the start particle overlap/popping.

    When done, this will be perfect for RTS games or for games with long view distances. With an emit rate of 1.5 you can spam these effects with minimal impact on performance.
     
    yc960, John-G and hopeful like this.
  12. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    An interesting side effect of the normal capture setup I am currently using is that it also captures ambient occlusion.
    Unfortunately, when I normalize the normals, the ambient occlusion data is lost.
    My normal emissive shader needs the normals normalized because it packs the normal data into two colour channels and unpacks it in the fragment shader; the unpacking process only works with normalized normals. The non-emissive shader does not need normalized normals, though, so I get to keep the ambient occlusion data, which is encoded in the length of the normals.
    The ambient occlusion data makes a big difference as you can see in the following comparison.
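    For anyone curious why the two-channel packing forces unit-length normals, here is a generic Python sketch of the technique (an illustration of the general approach, not the shader's exact code): the z component is reconstructed on the assumption that the vector has length 1, so any occlusion stored in the length is necessarily thrown away.

    ```python
    import math

    def pack_normal(n):
        # Keep only x and y, remapped from [-1, 1] into the [0, 1] colour range.
        return ((n[0] + 1.0) * 0.5, (n[1] + 1.0) * 0.5)

    def unpack_normal(packed):
        x = packed[0] * 2.0 - 1.0
        y = packed[1] * 2.0 - 1.0
        # z is rebuilt assuming unit length; a shorter vector (normal + AO
        # encoded in its length) cannot survive this round trip.
        z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
        return (x, y, z)

    n = (0.6, 0.0, 0.8)  # unit length: 0.36 + 0.64 = 1
    roundtrip = unpack_normal(pack_normal(n))
    ```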

    Non-emissive shader with ambient occlusion on the left; emissive shader with no ambient occlusion on the right.



    Unfortunately it's too late for me to find a workaround that includes the ambient occlusion data in the emissive version of the shader, but it is something I will keep in mind when I do the next update to the main shader package that you have all bought.
     
    hopeful, yc960 and John-G like this.
  13. yc960

    yc960

    Joined:
    Apr 30, 2015
    Posts:
    228
    Absolutely impressive work. What is the ETA? And can I get access to the pre-release version now? Like email you my order number or something?
     
  14. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Working on the user docs and some of the final artwork; when that's done I will do a round of testing, then release the final product. I can include you in the testing round if you want.
     
    yc960 likes this.
  15. yc960

    yc960

    Joined:
    Apr 30, 2015
    Posts:
    228
    Yes please.
     
  16. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Rendering out the art is taking a lot longer than I thought, but progress is good.


    Going for an effect a bit like this, only with denser smoke.
     
    hopeful, yc960, John-G and 3 others like this.
  17. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    I have been working on a demo scene for the release while my computer renders out new textures. I made the following GIF; it's a bit big (60 MB) and probably should just be a short YouTube video, but it's late on Saturday here so the cbf level is high.
    https://media.giphy.com/media/l0MYz3HYrLMrY4Evu/giphy.gif
    The big smoke in the center is an old render with a low frame rate and lots of noise, but you can see the potential.
    The goal was to create the same feeling as this scene in Godzilla.
     
    yc960 likes this.
  18. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Did some tests to compare a non normal mapped particle effect to a normal mapped particle effect in a more game like environment. The normal mapped effect looks good but there is still something not right with it. I'm not sure what it needs to look better.
     
    yc960 and manpower13 like this.
  19. Haagndaaz

    Haagndaaz

    Joined:
    Feb 20, 2013
    Posts:
    232
    I think the normal mapping is just too strong, in the example it looks more akin to bubbles than smoke. Smoke is a lot softer and subtler methinks
     
    chiapet1021 likes this.
  20. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Yeah, I think it's mostly the really pronounced normal mapping at the base of the effect. It looks much more natural farther up, as the particles expand in size and distance from each other. Is there a way to have normals start off less intense, then increase in strength over the life of the particle? I'm not sure if that will look better, just a guess/suggestion on my part.
     
  21. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Maybe it's the number of bubbles / bubble size. At the end of the smoke the bubbles are quite large and vary greatly in size, whereas at the start they are all the same size, smaller, and more frequent.
     
    chiapet1021 likes this.
  22. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    OK, I think I might have worked out what the problem is, or at least part of the problem. One of the reasons the smoke looked wrong was creases in the normal maps being visible.

    (old example)


    If you look closely at the base of the smoke you can see where the "bubbles of smoke" meet. This is creating an effect similar to this:


    Yet if you look at actual smoke you never see those creases; it just gets too dark to see what's going on before the light hits the peak of the next "smoke bubble".



    By reducing how much the light wraps around to the dark side of the particle and increasing the ambient occlusion, I can reduce how visible the "creases" are, creating a more believable effect.
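    The idea of dialling back light wrap can be expressed with the standard wrap-lighting formula. This is a generic sketch of the technique, not the shader's actual math, and the `ao` multiplier is an assumed extra term:

    ```python
    def wrap_diffuse(n_dot_l, wrap, ao=1.0):
        # Standard wrap lighting: wrap = 0 is plain Lambert; larger wrap lets
        # light reach further around to the dark side of the particle.
        return max(0.0, (n_dot_l + wrap) / (1.0 + wrap)) * ao

    # A crease normal facing slightly away from the light (n_dot_l = -0.3):
    visible = wrap_diffuse(-0.3, 0.5)  # the crease still catches some light
    hidden = wrap_diffuse(-0.3, 0.1)   # 0.0: less wrap lets the crease go black
    ```

    Lowering `wrap` (and multiplying in stronger AO) pushes the crease regions to black before the seam between "smoke bubbles" becomes visible.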

    (tweaked lighting)

    (old example)


    The normal map still isn't perfect; the number of "smoke bubbles" is too high early on, and the bubbles are too uniform in size. That is something I plan on rectifying in my next pass at the smoke effect.
     
    chiapet1021 likes this.
  23. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Also just turning down the normal mapping makes it look a lot more natural.
     
  24. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436

    My main concern is that large scale smoke moves a lot slower and doesn't dissolve that fast. Check these references out:



    Also look at the small chimney on the left around the 1:20 mark.





    And watch this one at least for the first 2 minutes to catch the moment when the cam pans up to show how far the cloud extends:





    My understanding is that only white smoke from coolant towers dissolves roughly as quickly as in your demos, because it's only vaporized water that can fully dissolve into air. The smoke from fires on the other hand is dark from the particles it carries up into the air. Those are tiny pieces of solid matter, they can't dissolve. It seems like those types of smoke rather "dilute" veeery slowly till you can barely see the particles anymore, but until they get back down to the ground somehow (with rain maybe?), they're staying up.

    @BoredMormon: I'm sure you have a better understanding of such matters as a chemical engineer. Maybe you can chime in here, correct any mistakes I may have made in my assumptions and possibly even can think of some real world chemical reaction to look at for reference that could give a better compromise under the constraints of game particle systems.

    I very much understand why it's impractical to give the particle systems a very long lifetime. But still, I think this might be the key reason for any examples not yet looking realistic.

    Also look at the pulsating "mushrooms" of smoke in the last video. Those also seem to contribute a lot to the overall look of real large-scale smoke, but I have no idea yet how to recreate that effect.
     
    jack-riddell likes this.
  25. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    Most of my work is about stopping fires and explosions. If a fire starts it means I didn't do my job properly.

    The typical white 'smoke' you might see over an industrial facility is water vapour. Mostly from the coolant towers, but a little bit comes from direct steam losses. The height and density of the cloud depends on the amount of water released, and the ambient temperature and humidity conditions. The water eventually dissipates into the atmosphere and falls again as rain.

    Deliberate combustion, say from an incinerator, leaves no visible smoke. You can often see a plume of hot air that distorts the background. But many modern facilities actually recycle that heat.

    Deliberate combustion using a flare provides a large flame, but virtually no smoke.

    Finally you have incomplete combustion. This is what you get when something goes horribly wrong. This produces the black smoke typically associated with fires. The smoke is mostly made up of unburnt or partially burnt particles. As the column rises these get diluted down with regular air. Eventually they fall to the ground in regular rain. This can have quite nasty effects.

    Not sure if any of this helps. Let me know if you have more questions.
     
    jack-riddell and Martin_H like this.
  26. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Test with emissive fire added. Texture compression errors add a lot of pixelation to the flames, so I'm using true colour / RGBA32, which isn't perfect, but I'm going to try using texLOD to sample a higher mip level and see if that helps.

    The added detail of the flames helps hide the flaws in the normal mapping. I am working on adding some sparks, but I am not sure how helpful they will be to an effect of this size. Once this is done I will be making another version of the smoke with added turbulence, then hopefully packaging it all up and releasing it.
     
  27. StaffanEk

    StaffanEk

    Joined:
    Jul 13, 2012
    Posts:
    380
    Last edited: Jul 29, 2016
    yc960 likes this.
  28. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    The procedural voxel thing is cool, but a bit impractical memory-wise.

    I am looking into light probes, but they might not make the first release. The documentation on how the light probe system works is a bit sparse, especially when dealing with OpenGL/GLSL shaders.
     
  29. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436
    They only work for baked light anyway, right?
     
  30. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Baked light only, or whatever you can get out of Enlighten's semi-dynamic GI stuff.
     
    yc960 and Martin_H like this.
  31. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Update on the light probes. It seems I need to be in "LightMode"="ForwardBase" to get Unity to pass in the spherical harmonics I need; however, I also need to be in "LightMode"="Vertex" to get the point light data I need.
    There is less documentation here than a Mexican border crossing, so it might take me a while to solve.
     
    yc960 and Martin_H like this.
  32. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436
    Maybe you should ask in the slack chat. Afaik there are some Unity employees who might be able to give you a hint on the undocumented stuff. Also making a request thread in the documentation subforum might be worth a shot.
     
    hopeful likes this.
  33. Petter_G

    Petter_G

    Joined:
    Apr 19, 2016
    Posts:
    12
    @jack-riddell

    Absolutely great work, extremely useful.

    I took the liberty of tweaking your normal baking .blend file. Here is my take on light-based normal baking. Feel free to use this file and/or change it however you please.

    I added negative light for a more uniform capture. Now data is added from both negative and positive vectors. The scene has a 0.5 uniform background light that gives objects a baseline neutral direction. It is also possible to add this baseline normal direction to an object's emission shader, so that occluded areas still get a baseline normal direction. (The camera light path must be used with a mix shader in that case.)

    In this setup, the sphere in the middle will be rendered as though it was baked with a tangent space normal baker. Rendering volumetric smoke probably needs some fudging of values to look "good", but this scene acts like a sort of ground truth from where one can experiment until a good look is achieved. The middle pixel of the sphere has a vector that points directly to the camera.

    Since normal maps are vector data, they should be output linearly, so I changed Blender's color management to output RAW non-color values, so that the lights behave more like vector direction casters instead of light casters. (This is vital to get the middle vector value to equal 0.5 and to give more range to negative vectors.)

    I also added compositing nodes to normalize the vector values (same as Auto Contrast in Photoshop). No matter what the exposure or brightness, the biggest vector will always be 1, the middle 0.5, and the lowest 0. (The object's shape needs to be such that one pixel reaches the maximum value; the normalization is turned off if Use Nodes is unchecked in the compositing node editor.)

    Image renders should be output at 16-bit color depth and then converted to 8 bits in Photoshop for the best, least "artifacty" gradient values. (Photoshop applies dithering to gradients that can't fit into the 8-bit range.)
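    The compositing normalization described above is essentially a per-channel auto-contrast remap; a minimal Python sketch of the idea:

    ```python
    def normalize_channel(values):
        # Auto-contrast style remap: lowest value -> 0, highest -> 1,
        # everything else scaled linearly in between.
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) for v in values]

    # Regardless of exposure or brightness, the extremes land on 0 and 1:
    out = normalize_channel([0.25, 0.5, 0.75])  # [0.0, 0.5, 1.0]
    ```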


    Compositing normalization.

    Raw linear color values in the scene color management tab.

    A sphere where every pixel points towards the correct vector.


    I still have a big problem with your shader, and I wonder if you can help me.
    Is it possible to have the particle normal direction face the camera position? Currently the normals always face the camera view direction, which results in very drastic and ugly light changes whenever the view rotates.

    I noticed that in your videos you never actually rotate the camera, which explains why it isn't as noticeable.

    I tried setting the normal direction to 1 in Shuriken, which should make normals face the camera position, but it has no effect. I also checked the Billboards Face Camera option in the Unity quality settings, with no effect on normals.

    Is there a technical reason why this would be difficult to achieve? I think this would be very important to add since a constantly and arbitrarily changing normal direction makes any decal effect stand out like a sore thumb whenever the player looks around.

    I'm making some complex stationary particles and they absolutely break if they change when the player looks around. I would be extremely thankful if you could make particle normals face the camera position instead of view direction. I'm already very thankful for this great asset.
     
    Last edited: Aug 1, 2016
  34. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436

    Hello Petter and welcome to the forum! Thank you very much for trying to improve the setup and sharing your results! It taught me some new things!

    I made the first iteration of a blender setup to render smoke normal maps from it, so I feel partly responsible for its shortcomings, even though Jack came up with his own solution, and that's the one distributed with the Asset. I've never looked at his setup too closely so there might be some differences.

    I wanted to use negative lights originally, but I simply didn't know you could actually do that in Cycles 0_o. The strength slider stops at 0, and it seems you have to enter the negative values by typing them in if you want to get past 0. Blender never was the most intuitive software...
    Thank you so much for letting us know about this!

    I have downloaded your file and experimented a bit with it. I fear that your solution happens to produce a good result for a sphere, but that is rather by accident and not by design :-/. Please try rendering a flat surface that fills the entire camera view with your setup. In theory it should render as a flat blue color (and it does, until the compositing is applied). Also please lower the sample count a fair bit for that render; 1024 samples takes over a minute to render on my GTX 660 for just the simple sphere. That's rather impractical for rendering animations and iterating on them. We need solutions that are as fast to render out as possible.

    I think the method for normalizing the normal vectors that you implemented does not actually work as intended (or I misunderstood your intentions). If you apply your setup to the rendering of a flat plane, it simply boosts the noise from the render to fill the 0..1 range. If you lower the sample values this effect becomes even more pronounced, but you should be able to clearly see it either way. Imho a general-purpose normal map render setup should be able to render out the neutral blue normal map color from a flat plane as well.

    When you render a sphere with your setup, the individual channels happen to already be close to the 0..1 range, because the lighting setup is sound and at first glance seems superior to what Jack and I had done. I'll take what I've learned from your scene and see how I can improve the smoke render setup. The lack of normal detail in the shadow areas is something that has always bugged me.

    For normalizing the normal map in a vector normalization sense, I simply used the free xNormal plugin for Photoshop. I think chances are good that that will be a higher quality solution than anything we can do in blender directly.

    I'm still interested though, simply because I'm curious. Originally I tried to do something like that but didn't find a solution. I tried again today, using multiple math nodes to recreate the mathematical way of scaling the vectors to a length of 1 (had to look that up on the web: http://www.fundza.com/vectors/normalize/ ), but the result looks really wrong. I know too little about how normal map tech works; it might very well be that the blue channel mustn't be changed during the normalization, or at least not in the same way the other two components are scaled.
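    For what it's worth, the vector scaling from that link only behaves if the [0, 1] colour encoding is first decoded to [-1, 1] and re-encoded afterwards; my guess is that skipping that decode step is what makes a math-node attempt look wrong. A Python sketch of the full round trip (illustrative, not a drop-in for the compositing nodes):

    ```python
    import math

    def renormalize_pixel(rgb):
        # Decode [0, 1] channel values to a [-1, 1] vector, scale it to unit
        # length, then re-encode. Normalizing the raw colour values directly,
        # without this decode step, distorts the vectors.
        v = [c * 2.0 - 1.0 for c in rgb]
        length = math.sqrt(sum(c * c for c in v))
        v = [c / length for c in v]
        return [(c + 1.0) * 0.5 for c in v]

    # A too-short vector pointing straight "up" in tangent space:
    out = renormalize_pixel([0.5, 0.5, 0.75])  # -> [0.5, 0.5, 1.0]
    ```

    Note the blue channel is not special here: all three components are decoded and scaled together, which is why operating on the raw colours (where 0.5 means "zero") gives a different, wrong answer.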

    Anyway, I'm not even sure if the concept is applicable. After all, normal maps are fundamentally designed to describe the orientation of a solid surface, and smoke is something different entirely. I think it might be best to simply compare different setups and go with the one that looks best in a realistic use case. Whether or not all the vectors in the normal map have the same length shouldn't really be an issue as long as it looks good, right?

    I can't really say anything about the color management settings, because I know almost nothing about that aspect of blender. It sounds like a very reasonable thing to try out! But I wouldn't be surprised either if "eyeballing it" looks better in-game.

    About the 16-bit render and downsample to 8 bit in Photoshop, I think I'll have to disagree though, because with this kind of setup I'm rather sure you will never get a smoke sim rendered with so little noise that 8-bit depth is an issue. It would simply take too long. You might manage it if you are really patient or have access to a renderfarm, but even then I think there would not be any visible benefit in the final on-screen result. 8-bit precision in normal maps can very easily become a problem when you do bakes of hard-surface assets with very subtle curves in the shape, but in my personal experience you can get rid of even that kind of banding by simply overlaying it with a tiny bit of noise (at least the one time I encountered it; admittedly, baking 16 bit and downsampling with dithering might look ever so slightly cleaner in the end). I use noise to get rid of color banding from vignette effects too.

    Thanks again for sharing! I'll let you know if I manage to make any improvements over our old setup that works with smoke.
     
  35. Petter_G

    Petter_G

    Joined:
    Apr 19, 2016
    Posts:
    12
    @Martin_H

    Thank you very much.

    The sample count is irrelevant. Of course you can change it as appropriate. I simply attempted to create a baseline scene from which one can make appropriate changes.

    You misunderstood my intention with the normalization. I don't normalize the vectors, but the color range; i.e. if the render is too bright or too dark, it will still output the full range from darkest to brightest values, similar to the Auto Contrast filter in Photoshop. If you want to normalize the vector directions "mathematically" you still have to use a third-party plugin.

    You need to have a pixel with the lowest/highest values for this to work, obviously. The normalization of render colors won't make sense in every context, but it is easy to turn off. You can simply adjust your lights or exposure instead.

    No. No. No, and no. This is due to the color normalization; there are no peaks or valleys in the brightness. Get it? Believe me, my setup is an almost perfect light-normal renderer. For smoke, color normalization might not work if no correct minimum or maximum is found. You can simply set the film exposure to 2 and turn off the compositing nodes, and the result for the sphere is the same as with color normalization. But then you have to adjust if the object has a different color.

    I agree fully. That is why I said that my scene can be used as a sort of "ground truth" for further tweaking.

    Here you are completely wrong. xNormal does the same thing as my scene. The default Cycles render assumes that your end result is a gamma-corrected "photograph". It is very important to output values linearly, just like xNormal does. From there you can eyeball it until it looks good on a volumetric scene. If you don't use a RAW color profile, you will see that the middle of the sphere is not 0.5, and that the negative lights need to be weaker to compensate. It is extremely hard to get symmetrical results with positive/negative light.

    Rendering in Cycles happens in high dynamic range. It wouldn't matter if you save 16-bit or 8-bit images, unless you forced Blender to work in 8 bits, but then your render would look like an avant-garde music video. Unless you know of some Blender memory-saving technique that I am unaware of.

    Thanks, I will.

    Oh, and the hard object renderer needs to be in RAW too. That is why jack isn't using 0.5 as the middle value in the shader nodes.

    I wonder if jack-riddell can help me with my question.
     
    Last edited: Aug 2, 2016
    Martin_H likes this.
  36. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    Thanks for your input. I can't tell you exactly what's going wrong without looking at it, but it might be due to the way the normals have been baked out; a particle with too much Z will change lighting unrealistically when rotated.
    Due to technical limitations I am stuck with a view-based normal for the time being, and the particle normal setting on the particle effect only affects per-vertex normal lighting; the per-pixel normal map calculation is not connected to the vertex normals in any way.

    The GIF is a bit stuttery because I am rendering out a smoke sim at the same time.
    I have the shader transitioning the normal lighting from light to dark slightly quicker than you would get on a hard object, for artistic reasons.

    I never filmed the particles while moving the camera because I don't like the shakes and stutters you get, not because I wanted to scam people and hide flaws in my package.

    I had a quick look at your blender file and it looks close to what I am currently using. In my current blend file I am using a big sphere surrounding my render area, with a shader that emits both positive and negative light based off the sphere normals. Instead of trying to balance the light around 0.5, I am simply remapping the values in the scene colour management settings so 0 = 0.5. I have also written some custom normalizing tools in Unity to make the texture import process quicker and to remove some of the normal errors you get from light-based normal capture. I know I have been saying it for a while now, but I am getting close to releasing the mobile version, and when that's done all the advancements I made will be coming to ANMP.

    While not applicable to the ANMP package you currently have, non-normalized particles with "low light" spots can create better lighting in some circumstances, as they contain both normal and some occlusion data.

    I hope this has helped answer some of your questions.
     
    Martin_H likes this.
  37. Petter_G

    Petter_G

    Joined:
    Apr 19, 2016
    Posts:
    12
    I didn't mean that you tried to scam anyone, lol, just that the error isn't noticeable. I wasn't talking about rotating around an object, however; I meant mouselook. Place an FPS controller into the scene, then place a single puff of smoke and a light source. When you look around, you will see that the lighting is completely wonky. It might work in many scenarios, but in my setup it is catastrophic (an FPS game with complex single-quad particles). A big stack of smoke nearby will look fine and beautiful.

    I don't mean to criticize this package too harshly. What you are doing is absolutely great work, and it works in many scenarios. It's just that it doesn't work in mine.

    Would it be possible to make every quad particle have its normals face the camera position somehow?

    Also, have you tried adding a neutral emission Add shader in Blender to get the normal direction under control? In my setup you need to add 0.25 white emission and set the exposure to 2.0 to get a correct render.

    Are you using RAW color values?
     
  38. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436
    Sorry about that!

    Sounds good, thanks for sharing!

    The last GIF you posted looks great, except for the shadow side looking a bit flat. I think some AO (even if it was just baked into the albedo texture) could do wonders, but I'm sure you're already working on something ^^.

    Keep up the great work!
     
  39. yc960

    yc960

    Joined:
    Apr 30, 2015
    Posts:
    228
    All this talk is making me hungry for the update! Can't wait to put it to a field test in my project.
     
  40. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    My next release is going to combine everything into a single texture for both performance and productivity reasons; that's why there is not much detail on the dark side of the particles. I can't currently fit ambient occlusion, XYZ normals, emissive lighting and transparency into four colour values (but I do have a plan). The limit on how many details I can fit creates the flatness you see in my last post. I am working on improving the dark-side rendering by researching particle ambient light (I have a forum post detailing the issue/solution but I can't find it). By the time I update the package you bought I should have an answer to this problem.
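    One common way to ease that four-channel squeeze (a general trick, not necessarily the plan hinted at above) is to store only the x and y of the normal and reconstruct z from the unit-length constraint, which is valid for billboarded particles whose normals face the camera. A toy sketch, with made-up function names:

```python
import math

def pack(normal, ao, alpha):
    """Pack a camera-facing unit normal plus AO and alpha into RGBA.

    Only normal.xy is stored (remapped from [-1, 1] to [0, 1]);
    z is reconstructed at unpack time assuming z >= 0. That frees
    the blue channel for AO, but emissive would still need a channel
    of its own, which is exactly the squeeze described above.
    """
    nx, ny, _ = normal
    return (nx * 0.5 + 0.5, ny * 0.5 + 0.5, ao, alpha)

def unpack(rgba):
    r, g, b, a = rgba
    nx = r * 2.0 - 1.0
    ny = g * 2.0 - 1.0
    nz = math.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
    return (nx, ny, nz), b, a
```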

    My current Blender setup solves all these problems, including your crazy 0.25 emission problem. Take the following Blender file, render it out, normalize the resulting image in Photoshop and it should be good to go.

    As for your normal issue, I will see what I can do. I am currently occupied with my next release, but once that is done I will be focused on fixing ANMP. If you could send me a GIF, or better yet a Unity scene, documenting your problem, it would make it a lot easier to fix.
     
    Petter_G and Martin_H like this.
  41. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436
    Sounds good! Let me know if I can help.
     
    jack-riddell likes this.
  42. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
  43. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I've seen the underlying 'mushroom' phenomenon expressed properly in realtime rather a lot during my experiments in Unity in recent years. But it was using a lot of resources because an actual simulation was being run in realtime using DX11 compute shaders. These simulations are usually referred to as fluid simulations but are more often used to render smoke, fire, etc. The basic version of the technique is bound to a grid, which is one of the reasons it's only seen limited use in games to date. But there are more advanced versions which use flexible dynamic grid cells, so they are no longer trapped in a box or wasting a lot of processing power on empty cells. Nvidia are close to showing this off via an example implementation they call Nvidia Flow, but last time I checked neither their standalone beta nor their example implementation in UE4 was out, let alone anyone attempting to convert it to Unity.

    In any case there will still be quite an overhead from using such a technique. But there are other possibilities, ones that use the sort of data these simulations can provide, but pre-bake it. These simulations create vector fields representing velocity which can then be used to drive particles. There is an asset called MegaFlow which can create basic vector fields or import ones created in other apps, and then drive Unity particles with the data. I have no experience creating vector fields in other apps though, since I've had the luxury of pursuing the heavy realtime simulations for my niche uses of this sort of tech. A key question is how many different frames of vector velocity data you need to get a bit of the 'mushrooming' effect. Again, because I've not worked much with baked data of this sort I can't really advise, but I would expect it is possible to get an interesting, practical result given the right amount of experimentation.
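    The core idea of baked velocity data driving particles is simple to sketch. The following is a toy illustration of my own (nearest-neighbour sampling of a 2-D grid, nothing to do with MegaFlow's actual API):

```python
def advect(pos, vel_field, cell_size, dt):
    """One Euler step of a particle through a baked 2-D velocity grid.

    vel_field[j][i] holds the (vx, vy) sample for cell (i, j); the
    particle simply reads the velocity of the cell it is in. A real
    implementation would interpolate between neighbouring cells to
    avoid visible grid artifacts.
    """
    i = int(pos[0] / cell_size)
    j = int(pos[1] / cell_size)
    vx, vy = vel_field[j][i]
    return (pos[0] + vx * dt, pos[1] + vy * dt)
```

    Each particle update is just a field lookup and an add, which is why pre-baked fields are so much cheaper than running the fluid simulation live.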

    I actually have a reason to experiment with baking off some data from the realtime simulation in Unity soon, so I will let you know if I get anything that could be useful for the results you seek. Even if I don't get anywhere with that I will probably post again with at least one visual illustration of what I'm on about.
     
    Martin_H likes this.
  44. Petter_G

    Petter_G

    Joined:
    Apr 19, 2016
    Posts:
    12
    I checked it out. The light values make perfect sense. Thank you very much for this scene. Using motion blur to make the smoke animation smoother is genius.

    Is there a reason you use true normals in the Geometry node? True normals are only for showing the actual geometry surface normals without smoothing. I think using smooth shading on the Sphere Caster and using the "plain" normal value in the Geometry node would result in a smoother emission, since the hard faces of the sphere would be linearly interpolated.

    Thank you very much. I will post a gif of my problem. If you can help me with this I will be eternally grateful.
     
  45. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326

    I am not sure MegaFlow will produce the mushrooming you're talking about, because to form a decent mushroom you need a dynamic velocity field to form the mushroom and move it through the smoke flow. MegaFlow creates only a static flow field, which means even if it creates a mushroom it will be fixed in position and unable to move.
     
    Martin_H likes this.
  46. jack-riddell

    jack-riddell

    Joined:
    Feb 11, 2014
    Posts:
    326
    True normals are just a legacy issue. In more recent versions I switched to plain normals and rotated them into camera space, so I can rotate the camera without issue. Nine times out of ten the normal format is not a problem though; that's why I haven't updated it yet.
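    That camera-space rotation is just a change of basis. As a rough illustration of the math (my own sketch, not the package's shader code):

```python
def to_camera_space(normal, cam_basis):
    """Rotate a world-space normal into camera space.

    cam_basis is the camera's right/up/forward axes given as rows
    (i.e. the rotation part of a view matrix). Lighting computed in
    this space stays consistent however the camera turns, which is
    the fix described above for the mouselook problem.
    """
    return tuple(
        sum(axis[k] * normal[k] for k in range(3))
        for axis in cam_basis
    )
```

    With the identity basis the normal is unchanged; yaw the camera 90 degrees and a world-space +z normal lands on the camera's -x axis, as you'd expect.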
     
  47. Petter_G

    Petter_G

    Joined:
    Apr 19, 2016
    Posts:
    12
    @jack-riddell

    I noticed that your final emission value is 1.1. Is there a reason for this? I also don't quite understand why you have a multiply by 100. Sorry for asking so many questions, it's just that the fine details of Cycles rendering interest me a whole lot.

    EDIT: I get it, you use these values as the "fudging" coefficients. Instead of playing around with tonnes of values you simply change two, right?
     
    Last edited: Aug 2, 2016
  48. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436
    Try turning it off; do you see a difference? Afaik smoke sims can't use motion blur yet, and Google didn't turn up anything saying otherwise (in 2012 it was on the to-do list of a GSoC project, but that was about all I could find regarding this).

    Thanks a lot for sharing! Holy cow, those render times. Your patience must be legendary! It took over half an hour for me to bake the sim and then 22 minutes to render a single frame (frame #50)!

    This is the frame (22 minutes rendertime):


    With a few tweaks I have this (42 seconds rendertime):


    It is a fair bit more noisy (and ~30 times faster), but it also helps hide the artifacts from the smoke sim voxel resolution. It'd be interesting to see how it compares in a full animation. The noise might be more obtrusive there than in a still image.

    If you want to take a look, here is the blend file:
    http://www.keinebilder.de/temp/BigFlame2.blend

    If I were to further optimize the render quality, I'd try to re-invest some of those time savings into higher render resolutions, post-process filtering and then downscaling in Photoshop.


    It would be interesting to see, but I doubt that baked solutions will be practical for my use cases. I could see it work as a hero asset in a fixed setpiece, but in a dynamic world where ideally wind effectors are still able to have some effect on particle systems, I'd hope for a Shuriken-based solution to fake that kind of effect. Just seems more practical for my use case.
     
  49. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    There are some ways around positional limitations, but even then I wouldn't expect the effect to look good if only one frame of velocity field data is used. I wasn't touting MegaFlow as a complete solution; I was speculating that there may be fertile ground between the two extremes I was discussing, e.g. a bunch of vector field data that is interpolated between over time to achieve something approximating the desired effect.
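    A minimal sketch of that frame-interpolation idea (the data layout and names here are my own assumptions, not from any particular asset):

```python
def sample_field(frames, t, frame_rate, cell):
    """Linearly interpolate a velocity sample between baked frames.

    frames[f][j][i] holds the (vx, vy) sample of frame f for cell
    (i, j); t is in seconds. Blending adjacent frames (wrapping at
    the end for a looping effect) lets a handful of baked fields
    approximate a moving 'mushroom' front without a live simulation.
    """
    f = t * frame_rate
    f0 = int(f) % len(frames)
    f1 = (f0 + 1) % len(frames)
    blend = f - int(f)
    i, j = cell
    a = frames[f0][j][i]
    b = frames[f1][j][i]
    return tuple(a[k] + (b[k] - a[k]) * blend for k in range(2))
```

    How many baked frames you need before the blend stops looking like a crossfade and starts looking like motion is exactly the open question raised above.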
     
    Martin_H likes this.
  50. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    The ultimate destination I'm on about doesn't end up with the entire system being rigidly fixed in a manner that makes interaction and variation completely impossible. The idea is that the baked velocity data is one influence over the particles, but other standard particle influences can also remain in effect. Plus relatively crude manipulation of how the vector field is positioned and rotated can achieve surprisingly good results sometimes. But yeah, you still have to get the timing and blend of these factors right, and it might be more hassle than it's worth.
     
    jack-riddell and Martin_H like this.