Concern that Unite announced Asset Pipeline changes are not fixing the core issue.

Discussion in 'Editor & General Support' started by hoesterey, Nov 3, 2016.

  1. hoesterey

    hoesterey

    Joined:
    Mar 19, 2010
    Posts:
    659
    Hi,
    I just wanted to raise some concerns about the planned Asset pipeline changes. I have a lot of experience working in Unity with both small and large teams of over 100 people and the Asset Pipeline has always been a pain point.

    The updated pipeline as I understand it only imports "used" assets. While this will help speed import time, it doesn't really address the issue at the architectural level as I'd hope to see.

    I'd really like to see assets packaged at import time and unusable until the assets are imported. Here is why:
    • Currently, import and iteration time increases as team size goes up. E.g. if a set of textures for the player character takes 30 seconds to import, a 100-person team will spend 50 man-minutes importing that asset each time those textures change. Were this asset "packaged" so it did not require a reimport on every computer, the cost would be constant regardless of team size: one team member spending X seconds packaging the file before it is usable.
    • Non-matching meta files can take an entire team offline for hours. Imagine this scenario: an artist checks in a new environment asset without having opened it in the engine. Two level designers sync and open Unity, both wishing to use the new asset. The asset generates DIFFERENT GUIDs in the meta files on each of their computers. They begin using the asset, hooking it up to scripts, etc. They both check in. One designer's work is destroyed because the GUIDs no longer match up. Now multiply this one process mistake by a team of 100. Something is consistently broken because mistakes happen. To illustrate how bad this is, I've actually had to have two people who do nothing but police check-ins to ensure nothing is missed. We still miss things.
    • Bad GUIDs spread like a virus. When lots of files are being created, a few check-in errors can not only cause GUIDs not to match but actually cause one object to take on the GUID of another. I've seen this happen: Object A, checked in with a bad meta file, assigned itself to all the scripts Object B was previously assigned to. Tracking this kind of thing down is hard, and often takes a team down while you "freeze" check-ins to avoid spreading the issue further.
    What are the Unity teams thoughts around the Asset Pipeline? Can we hope to see larger changes?
     
  2. BFS-Kyle

    BFS-Kyle

    Joined:
    Jun 12, 2013
    Posts:
    883
    Regarding your first point, just to clarify, are you suggesting something like checking in both the texture files and the processed texture data (i.e. from the Library folder)? I like the idea in theory, but in practice I still see a few issues. Firstly, different platforms all import differently, so would the original package need to include every single platform type? Secondly, would this be committed to the project? That could potentially increase a project's size by a lot, especially with every platform involved (on cross-platform projects). They do have the Cache Server, which is meant to help with this sort of issue (i.e. one user imports fully for 30 seconds, everyone else just downloads that data from the cache server in 3 seconds, saving 90% of the import time).

    For the second and third points, I feel like these are issues with the workflow rather than something Unity can enforce or really fix. If I commit one script to my project when I have edited two that are linked together, that can cause compiler errors. That is not Unity's fault (or something it has any control over); it is my fault for committing only half of my work. In the same way, if an artist commits new art files but does not commit the .meta files, they are committing only half of their work and can cause the same errors you have noted.

    If the issue has happened multiple times and you now have people checking all commits to make sure they are correct, I would consider trying to automate it. E.g. with Git you should be able to set up a commit hook that checks that every new file being committed is accompanied by another file with the same name + ".meta", and otherwise rejects the commit. That could help reduce the chances of this popping up.

    Also, when committing lots of files from multiple people, I think you are referring to the chance of conflicting GUIDs. That is possible, although quite unlikely; you wouldn't often have multiple people adding huge quantities of files at the same time. I guess you might, but I just can't see it happening too often. As a solution, you could have them bring all their assets into Unity, make sure everything is importing correctly, and then commit before using any of it for anything. That way you resolve any conflicting GUIDs before you start using them.
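    The hook idea above can be sketched as a small check. This is a hypothetical example, not an official Unity or Git tool; the function name and error messages are made up, and it only encodes the convention that every file and folder under Assets/ carries a sibling "<name>.meta":

    ```python
    import os

    def missing_meta(staged):
        """Return error strings for staged additions that lack their .meta pair.

        `staged` is a list of newly added repo-relative paths, e.g. the output
        of `git diff --cached --name-only --diff-filter=A`. Unity pairs every
        file and folder under Assets/ with "<name>.meta" (which holds the
        GUID), so both halves must land in the same commit.
        """
        staged_set = set(staged)
        errors = []
        for path in staged_set:
            if path.endswith(".meta"):
                # A new .meta must come with its asset (file or folder).
                asset = path[: -len(".meta")]
                if asset not in staged_set and not os.path.exists(asset):
                    errors.append(f"{path}: committed without its asset {asset}")
            elif path.startswith("Assets/"):
                # A new asset must come with its .meta.
                meta = path + ".meta"
                if meta not in staged_set and not os.path.exists(meta):
                    errors.append(f"{path}: committed without {meta}")
        return errors
    ```

    A pre-commit hook script would feed this the staged additions and abort the commit (non-zero exit) whenever the returned list is non-empty.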
     
  3. hoesterey

    hoesterey

    Joined:
    Mar 19, 2010
    Posts:
    659
    Heya Kyle,
    -As far as packaging goes, other AAA engines do create an engine-consumable file when textures and FBX files are imported; e.g. Unreal creates .uasset files. I've found this workflow to work better for medium to large teams despite the increase in size. Yes, they would be checked in.

    -Even with the cache server, I worked on a project where needing to re-import meant you were done for the day, with a 12+ hour import time. I'm sure Unity's solution will help with this issue, but I'm not sure it will completely cure it. Maybe it will though, which is why I raise the concern: to start a discussion. :)

    -With the 2nd and 3rd points, I agree it's a workflow issue. I've worked with teams that implemented Perforce-side checks validating that meta files are checked in, and other safeguards. But when you have one or more teams working on the same project, often in different countries and time zones, with endless new hires, let's just say mistakes happen, and when they do they are costly.

    - Having the same GUIDs is rare. That said, I've experienced it several times. I'm not sure how it happens, but when GUIDs conflict it has been painful.
     
  4. ArachnidAnimal

    ArachnidAnimal

    Joined:
    Mar 3, 2015
    Posts:
    1,825
    This new pipeline seems like it might be beneficial if you're re-importing the project. So if I download a new version of Unity and the project needs to be re-imported, it would be nice to specify that only used assets should be re-imported. With the amount of unused assets (mainly textures) in my project, this could speed up import time by maybe 30-40%.
    However, during development, I fail to see how NOT importing the asset is beneficial. I'm putting it into the project, I'm going to be using it at least once during development.
     
    hoesterey likes this.
  5. Joachim_Ante

    Joachim_Ante

    Unity Technologies

    Joined:
    Mar 16, 2005
    Posts:
    5,203
    Cache server will work even better in the new asset pipeline. Basically we will have a per-machine asset cache. It is no longer tied to a project. All dependencies are fully declared and we will enforce fully deterministic import (binary output).
    Thus asset import can also be shared across multiple projects.

    Due to that, the cache server will work better because all asset imports can be cached through it. You can think of the per-machine cache as a cache server which Unity simply reads from directly.

    The cache server itself (the server tech) has some scalability issues right now when projects become huge and 50+ person teams access it; basically, the bottleneck becomes the cache file server.

    The protocol is simple, so several 50+ person teams have rewritten the Node.js server to scale better. The new cache server will be written in C++ and highly optimized, so it will scale to massive-scale usage.

    Multiple processes will speed up import times. Switching platforms becomes a non-event since it is just another dependency change. If everything was already imported previously, it will switch instantly.

    Additionally, several of the things that only happen when you have > 500k assets in your project, like the "has anything changed" refresh check, are being heavily optimized to scale massively better. We are writing perf tests from the start to keep these very important numbers below 1 second while we develop at massive project scale.


    So I think the combination of:
    1. importing / downloading from the cache server only what is used by Unity, as opposed to a full project import
    2. a better cache server
    3. multiprocess import
    4. platform switching being instant if everything is already imported
    5. faster refresh
    6. full support for dependencies

    Given that, I am pretty certain it will solve big-scale project issues very well and become the best pipeline for that scale out there.
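    The caching model described above can be sketched conceptually: because the import is deterministic and all dependencies are declared, the cache key can be derived purely from the inputs, so any machine (or a shared cache server) can reuse the binary output. This is a toy illustration under those assumptions; all names are made up and this is not Unity's actual implementation:

    ```python
    import hashlib

    def cache_key(source_bytes, import_settings, platform, importer_version):
        """Derive a key from everything the import result depends on.

        Deterministic import means identical inputs always produce identical
        output, so this key can address a content-addressed cache. (A real
        system would length-prefix each field to avoid ambiguity.)
        """
        h = hashlib.sha256()
        h.update(source_bytes)
        h.update(import_settings.encode())
        h.update(platform.encode())
        h.update(importer_version.encode())
        return h.hexdigest()

    cache = {}  # stands in for the per-machine cache / remote cache server

    def import_asset(source_bytes, settings, platform, version, do_import):
        """Run the expensive import once per unique input; reuse it after."""
        key = cache_key(source_bytes, settings, platform, version)
        if key not in cache:
            cache[key] = do_import(source_bytes)  # expensive import runs once
        return cache[key]  # everyone else reads the cached binary output
    ```

    Platform switches fall out of the same mechanism: the platform is just another key component, so switching back to an already-imported platform is a cache hit rather than a reimport.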

    Guid conflicts are not an asset import problem IMO. They are a version control integration problem.

    Collab has great tracking and UI for preventing artists from doing this.

    Many teams have written SVN / Perforce hooks to enforce Unity's rules. Maybe we should provide some documentation / sample version control hook scripts for users ourselves.

    But GUID problems are totally solvable in Unity today on your end, via version control hooks that prevent artists from pushing the wrong thing into version control.
     
    ArachnidAnimal likes this.
  6. Joachim_Ante

    Joachim_Ante

    Unity Technologies

    Joined:
    Mar 16, 2005
    Posts:
    5,203
    The benefits primarily come from multiple people working in a project. Usually different people work in different scenes,
    so each person uses a different set of assets.

    Currently all assets have to be downloaded by everyone, so everything scales with full project size as opposed to what you actually use.

    E.g. let's say you pull from version control, and some artists changed a section you are not working on. In the new system you don't need to wait to import those assets, and when you use the cache server you also don't have to wait to download them from it, until you start opening a scene that uses those assets.
     
    ArachnidAnimal likes this.
  7. hoesterey

    hoesterey

    Joined:
    Mar 19, 2010
    Posts:
    659
    Good to hear that the cache server is improving.

    Also, I have not used Unity Collab on a project yet; I believe my projects were started before Collab was up and running. I'm excited to set it up and test it out. Previously we wrote some custom Perforce and TFS plugins to try to enforce the check-in of meta files, but we ended up needing to disable some of the functionality due to issues we ran into.

    Thanks for the reply, really appreciate you taking the time to alleviate my concerns.
     
  8. hoesterey

    hoesterey

    Joined:
    Mar 19, 2010
    Posts:
    659
    Hi,
    Sorry to necro this, but I thought about it some more and am still concerned. Basically, we are still having tons of issues with Library folders in general.

    - We constantly have to delete them when they become corrupted. With a shared Library this means the entire team is down for a day.
    - We can't actually ever send Unity QA many of the issues we have. E.g. I had an issue with Oculus not rendering correctly that was fixed by deleting/rebuilding the Library. I can't easily send this in, as it's not part of my project.
    - We have odd issues pop up on one computer and don't have an easy way to see the differences, as the Library folders are not in source control. Thus we have no way to know if it's a Library issue (often it is).
    - It is creating workflow hiccups. An engineer wants to enter a scene he doesn't often visit to fix a bug, and bang, everything from that scene needs to import.

    In my experience, paying the cost to download a larger project once, and then a smaller cost to download new things at the beginning of every day, is a better workflow with large teams. I get the theory of "only download what you need," but in practice this isn't working for us, and honestly it could still be done with package files while also having a trackable object in source control.

    A single deletion of the Library now takes the entire team down for the day. The savings from not downloading assets you don't use (until you do) aren't proving to offset the disadvantages.

    I'm still hopeful Unity moves toward package files as I think they are a more stable solution.

    Thanks!