Would it be possible to create a shader that would allow me to morph an object? The idea is to get vertex transformations to run through hardware instead of software. I'm looking for functionality similar to this: http://forum.unity3d.com/viewtopic.php?t=16425&highlight=morphtargets I guess the key question is whether or not I can transform vertices using a shader. From there we'd have to figure out how to create targets. Thanks!
Hey, you can easily transform vertices in the vertex shader; in fact, that's exactly what a vertex shader does. Normally the vertex shader just multiplies the vertex position by the model-view-projection matrix (excerpt from UnityCG.cginc):

Code (csharp):

    pos = mul( glstate.matrix.mvp, v );

Morphing is just changing the vertex position v before the multiplication by the MVP matrix happens. The problem, though, is how to create morph targets, i.e. how to know what additional transformation to apply per vertex. Unfortunately Unity does not support vertex streams, so you would have to pass the information to the vertex shader in any of the per-vertex data (tangents, bone weights, normals, UVs) you're not currently using. If your transformation can be described procedurally, then you can just code it in the vertex shader. A good starting point would be to see how it can be done with vertex streams, even though you can't do it that way in Unity: http://http.developer.nvidia.com/GPUGems/gpugems_ch04.html
Thanks for the help with direction. Unfortunately I don't think I can do it procedurally. I could, however, have the vertex positions beforehand. Would it be possible to code the target positions into the shader? I'm talking about thousands of vertices... I read through some of that link, and it looks like this method runs entirely on hardware and (if I get it working) would be significantly faster than the software morph script. Thanks again!
Let's start with the most important question: why do you need to do it on the hardware? Since Unity doesn't aid you in doing so, maybe calculating it on the CPU will be perfectly fine? You could just try the code from the post you referred to and profile it with your scenes. Anyway, how many morph targets do you need? Just one additional? If so, it can be done on the GPU with Unity. You would need one mesh: Mesh.vertices would store the vertices of the original mesh, and Mesh.tangents would store the vertices of the morph target. In the vertex shader you would then lerp between the two:

Code (csharp):

    float4 finalVertex = lerp(v, tangent, _BlendAmount);

where _BlendAmount is a property set on the material (Material.SetFloat).
Kuba, I'm working on this solution and I'm not sure exactly how I need to approach it (in terms of giving the shader the vertices and tangents). From what I see I can use the appdata_base and that will give me the information I need for the base shape (based on the object that I have the shader applied to). How do I go about sending the tangents to the shader? I don't have a lot of background with Cg (basically learning as I go). Is there a different way of sending the vertex and tangent information to the shader? I may not even be asking the right question... Thanks again.
Okay, so I think I have the basics of it working. The only problem I'm running into now is applying the tangents of the target mesh to the base mesh. I've tried doing this through a script, but I don't know how to overwrite the tangents that are already on the geometry. Sooooo close... here is what I have right now:

Code (csharp):

    Shader "Blend 2 Shapes2" {
        Properties {
            _Blend ("Blend", Range (-1, 1)) = 0.0
            _Color ("Main Color", Color) = (.5, .5, .5, 0)
            _MainTex ("Base (RGB) Alpha (A)", 2D) = "white" {}
        }
        SubShader {
            Pass {
                Lighting On
                CGPROGRAM
                #pragma vertex vert
                #include "UnityCG.cginc"

                // Standard Unity properties
                struct v2f {
                    float4 pos : POSITION;
                    float4 color : COLOR0;
                    float4 uv : TEXCOORD0;
                    float4 tangent;
                };

                uniform float _Blend; // blend value between both shapes

                v2f vert (appdata_tan v) {
                    v2f o;
                    v.vertex = lerp(v.tangent, v.vertex, _Blend);
                    o.pos = mul(glstate.matrix.mvp, v.vertex);
                    return o;
                }
                ENDCG
            }
        }
    }

Here's the C# code that I'm running on the object with the shader.

Code (csharp):

    using UnityEngine;
    using System.Collections;
    using System.Threading;

    public class Morpher : MonoBehaviour {
        Vector3[] vertices;
        Vector4[] tangents;
        Thread thread;
        public MeshFilter targetMeshFilter;
        Mesh targetMesh;
        Mesh mesh;
        MeshFilter meshFilter;
        public Shader shader;

        // Use this for initialization
        void Awake () {
            targetMesh = targetMeshFilter.mesh; // target mesh
            meshFilter = GetComponent(typeof(MeshFilter)) as MeshFilter; // mesh filter of current object
            mesh = meshFilter.mesh;
            vertices = mesh.vertices;
            for (int i = 0; i < vertices.Length; i++) {
                //mesh.tangents[i] = new Vector4(targetMesh.vertices[i].x, targetMesh.vertices[i].y, targetMesh.vertices[i].z, -1.0f);
                mesh.tangents[i] = targetMesh.vertices[i];
            }
        }

        void Start() {
        }

        // Update is called once per frame
        void Update () {
            // parse vertices in a separate thread
            thread = new Thread(ParseVertices);
            thread.Start();
        }

        void ParseVertices () {
            for (int i = 0; i < vertices.Length; i++) {
                // Assign tangents of target mesh to base tangents
                //mesh.tangents[i] = new Vector4(targetMesh.vertices[i].x, targetMesh.vertices[i].y, targetMesh.vertices[i].z, -1.0f);
                mesh.tangents[i] = targetMesh.vertices[i];
            }
        }
    }

Sorry about the double spacing... not sure why it's doing that.
Hey, sorry, I totally missed your response; the forum marked everything as read for me. The problem is on the C# side. mesh.tangents returns a copy of the array, so you should first assign it to a local variable, then loop over all elements of that local array and set them to the desired values, and then assign the whole array back to mesh.tangents:

Code (csharp):

    Vector4[] tangents = mesh.tangents;
    for (int i = 0; i < tangents.Length; i++)
        tangents[i] = calculated_value;
    mesh.tangents = tangents;
Thanks for the tip. I'm definitely getting something different; however, it's not morphing properly. I have a sphere, and then another sphere that just has the vertices pulled out in some places (like spikes). When I change the blend value, the whole sphere just seems to get larger (it scales) instead of only moving the vertices that have different positions (and some vertices seem to end up in random places). It's probably something simple that I'm overlooking. Thanks again!
Okay, I have it working... with several bugs. So, issue #1: the mesh SCALES when I change the blend value. It morphs the mesh as well, but it also scales it up considerably. How do I fix this?