RhythmTool - Music Analysis for Unity

Discussion in 'Assets and Asset Store' started by HelloMeow, Sep 26, 2014.

  1. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280

    RhythmTool is a straightforward scripting package for Unity, with all the basic functionality for creating games that react to music.

    RhythmTool analyzes a song without needing to play it at the same time. It can analyze an entire song before playing it, or while it is being played.

    It provides several types of data:
    • Beats
    • Pitch
    • Onsets
    • Changes in overall intensity
    • Volume

    This data can be used in various ways and is provided through an easy-to-use asset and event system.

    RhythmTool is designed to analyze and sync songs with a known length. Unfortunately it is not possible to analyze a continuous stream of data, like a web stream or mic input.
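
    To give a rough idea of the workflow, here is a minimal sketch of reacting to beats. The event and type names below are illustrative only; check the documentation for the actual API of your version:

    Code (CSharp):
    using UnityEngine;

    // Illustrative sketch only: change the camera color on every beat.
    // The exact event name and signature have changed between versions,
    // so treat onBeat and Beat here as placeholders.
    public class BeatFlash : MonoBehaviour
    {
        public RhythmEventProvider eventProvider;

        void OnEnable()
        {
            eventProvider.onBeat += OnBeat;
        }

        void OnDisable()
        {
            eventProvider.onBeat -= OnBeat;
        }

        private void OnBeat(Beat beat)
        {
            Camera.main.backgroundColor = Random.ColorHSV();
        }
    }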

    Questions and feedback
    Any questions, feedback or features you would like to see? Post it here or send me an email at tim@hellomeow.net
     
    Last edited: Nov 19, 2021
    Gekigengar, woskyleo and daniFMdev like this.
  2. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Good news everyone,

    Over the last few weeks I have been able to make some big improvements to the tempo detection. Both the tempo detection and the synchronization with the song are much more accurate, and the tempo detection results will be more straightforward to use. This really needed improving.

    The next update will bring a lot of changes and improvements:
    • Improved tempo detection
    • Improved the way tempo detection data is stored and how it can be used
    • Added BPM calculation
    • Analysis results can now be saved and loaded
    • RhythmTool is now derived from MonoBehaviour
    • Added OnReadyToPlay and OnEndOfSong messages
    • Renamed variables and methods to make more sense
    I'm currently cleaning up the code and re-writing the documentation. ETA is 2-3 weeks.
     
    elbows likes this.
  3. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Version 1.6 has been released!
     
  4. -JohnMore-

    -JohnMore-

    Joined:
    Jun 16, 2013
    Posts:
    64
    Hi,

    Sorry, I already found the original thread and the documentation :)

    - deleted stupid questions-

    Thanks!
     
    Last edited: Feb 7, 2015
  5. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Just a heads up,

    RhythmTool appears to work perfectly fine with Unity 5, but the examples have a small issue where the music is too quiet or inaudible. This is because Unity 5 doesn't convert Unity 4's 2D sounds correctly, basically turning them into 3D sounds.

    To solve this, just turn the "Spatial Blend" slider all the way to 0 for the AudioSources in the affected example scenes.
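
    If you'd rather fix it from code, the standard Unity AudioSource API can do the same thing; a small sketch:

    Code (CSharp):
    using UnityEngine;

    // Forces 2D playback for every AudioSource in the scene, undoing
    // Unity 5's incorrect conversion of Unity 4's 2D sounds.
    public class Force2DAudio : MonoBehaviour
    {
        void Awake()
        {
            foreach (AudioSource source in FindObjectsOfType<AudioSource>())
                source.spatialBlend = 0f; // 0 = fully 2D, 1 = fully 3D
        }
    }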
     
  6. sinetwo

    sinetwo

    Joined:
    Mar 23, 2015
    Posts:
    5
    Hi HelloMeow, this may be what I'm looking for. I'm effectively trying to create a metronome with a looped beat, play samples on top of it at set times, and judge whether the player hit those times accurately or not.

    Would this be able to help with that workflow? It doesn't need to automatically calculate the BPM, as I would do that, but I would need it to 'start' detection after an offset (e.g. if the song has an intro which should not be counted).
     
  7. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    I don't think it will. This is mainly for analyzing songs on the fly.
     
  8. Demigiant

    Demigiant

    Joined:
    Jan 27, 2011
    Posts:
    3,242
    Hi @HelloMeow!

    I just discovered your package and am reeeeally interested in its BPM feature for a future project, so I'm gonna bother you with a couple of questions, if I may.
    1. If I understood correctly, BPM is calculated in realtime while the song plays. Is it possible for RhythmTool to parse bigger chunks of the song at startup (even if it requires some time) and return a more accurate BPM? This would also allow using RhythmTool in a "lighter" mode, without having to analyze anything except at the beginning.
    2. Out of curiosity, why no event system (or maybe I just missed it)? It would be nice to have an "OnBeat" (and so on) event that one could listen to, which would also prevent the "current beat for multiple frames" issue you mention in the docs.
    Cheers,
    Daniele
     
  9. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    It's possible to analyze the entire song at startup by just checking an option in the inspector. RhythmTool doesn't just calculate the BPM, it also tracks beats.

    Technically it would be possible to detect the BPM by analyzing a small part of a song, but not out of the box and not without making some changes. And if you do, you won't have the data for when the beats actually occur.

    It doesn't use events, mostly because I haven't needed them for my own games. It's on my list though.
     
  10. Demigiant

    Demigiant

    Joined:
    Jan 27, 2011
    Posts:
    3,242
    Thank you!
     
  11. joshvidamo

    joshvidamo

    Joined:
    Sep 12, 2013
    Posts:
    1
    Hi @HelloMeow!

    I've been looking for this for a looong long time now. This is worth the pay, thank you for this!

    I just want to know if you provide tutorials breaking down how to use RhythmTool, such as using beats to interact with the player's input, etc. (e.g. Guitar Hero).

    If you do, that would be so cool, 'cause I've been studying this tool for a while now and I think the documentation doesn't cover how to use the beats before, during, and after they arrive. Or if it does, I'm having a hard time understanding it. :(

    Best Regards,
    Josh
     
    Last edited: Feb 19, 2016
  12. bahk007

    bahk007

    Joined:
    Sep 24, 2016
    Posts:
    4
    @HelloMeow
    Fantastic asset. I've downloaded it and imported it into my Unity project.
    I'm wondering if there is a way to take the data the script picks up and apply it to the transforms of a set of objects. Kind of like an audio visualizer, but without instantiating objects.
     
  13. h_nob

    h_nob

    Joined:
    Oct 3, 2014
    Posts:
    2
    Hi @HelloMeow!
    Thanks for a good asset!

    I have a question.
    If a song is over 180 BPM, the detected tempo is halved, e.g. to 90. How can I solve this?
     
  14. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    Unfortunately that's a limitation of the method used to find the most likely beat length. It's limited to a certain range, between 80 and 160 BPM. I'm still trying to improve this.
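
    In the meantime, if you know beforehand that a song's real tempo is above that range, you can fold the detected value back out yourself; a minimal sketch (the flag is something you would have to supply):

    Code (CSharp):
    // Workaround sketch: tempos are reported in the 80-160 BPM range,
    // so a 180 BPM song comes back as 90. If you know the song is fast,
    // doubling the reported value restores the real tempo.
    float CorrectBpm(float detectedBpm, bool songIsFast)
    {
        if (songIsFast && detectedBpm < 160f)
            return detectedBpm * 2f;

        return detectedBpm;
    }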
     
  15. h_nob

    h_nob

    Joined:
    Oct 3, 2014
    Posts:
    2
    Thanks for the reply.
    I'm looking forward to the update!
     
  16. smnerat

    smnerat

    Joined:
    Sep 23, 2012
    Posts:
    35
    @HelloMeow I tried out the demo and it was pretty awesome; I was getting solid onsets and beats for the several songs that I tried. My only question before purchasing is about the number of bands. Am I limited to just the low, mid, and high bands plus beats? Or are those just the defaults that give the best detection?

    I need to use between 12 and 24 bands, with 16 probably being typical, so at 16 bands would I still be able to get solid detection?
     
  17. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi!

    There are 4 default analyses with different frequency ranges for onset detection. These ranges are somewhat arbitrary, but they appeared to give the most useful results. You can change these ranges, or add more analyses with different ranges, although I'm not sure how that will affect the results.
     
  18. smnerat

    smnerat

    Joined:
    Sep 23, 2012
    Posts:
    35
    As a heads up to anyone else, it's pretty easy to set up analyses for bands besides the defaults. The onsets I get line up pretty well with what I expect and hear. I haven't tested using more bands yet, but I doubt performance would drop very much. Solid purchase.

    Code (CSharp):
    // Map a frequency in Hz to an index in the 1024-bin spectrum.
    int FrequencyToSpectrumIndex (float f) {
        var i = Mathf.FloorToInt (f / AudioSettings.outputSampleRate * 2.0f * 1024);
        return Mathf.Clamp (i, 0, 1024);
    }

    // Add one analysis per band, centered on standard octave-band frequencies,
    // each spanning half an octave to either side of its center.
    void SetAnalyses() {
        float[] frequencies = new float[]{ 31.5f, 63, 125, 250, 500, 1000, 2000, 4000, 8000, 12500, 16000, 20000 };
        float bandwidth = 1.414f;

        for (var i = 0; i < frequencies.Length; i++) {
            int a = FrequencyToSpectrumIndex (frequencies[i] / bandwidth);
            int b = FrequencyToSpectrumIndex (frequencies[i] * bandwidth);
            rhythmTool.AddAnalysis (a, b, i.ToString());
        }
    }
    Last edited: May 30, 2017
    Akshara likes this.
  19. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    RhythmTool 2.0 has been released. This new version brings a number of improvements and big changes:
    • RhythmTool, RhythmEventProvider and the documentation have been rewritten from scratch
    • Removed RhythmTool.IsBeat and RhythmTool.IsChange. ContainsKey or TryGetValue on the beats and changes collections can be used instead
    • Replaced RhythmTool.NewSong() with RhythmTool.audioClip
    • Renamed RhythmTool.calculateTempo to RhythmTool.trackBeat
    • Renamed RhythmTool.preCalculate to RhythmTool.preAnalyze
    • Renamed RhythmTool.storeAnalyses to RhythmTool.cacheAnalysis
    • Added RhythmTool.Reset and RhythmEventProvider.Reset, an event that occurs when a song has restarted or a new song has been loaded
    • RhythmEventProvider now needs to be given a RhythmTool component instead of using all RhythmTool components in the scene
    • RhythmEventProvider no longer uses UnityEvents. Instead it uses C# events, for a more consistent API. In previous versions, C# events and UnityEvents were used in different cases. This means the events are no longer available in the editor
    If you're using a previous version and plan to use version 2.0, you will need to keep an eye on the following:
    • Use RhythmTool.audioClip instead of RhythmTool.NewSong()
    • Use the new C# events in RhythmEventProvider instead of the old UnityEvents
    • Give every RhythmEventProvider a RhythmTool component to use for its events
    All the examples and the optional AudioImporter have been updated as well. A minimal migration sketch follows below.
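
    For reference, a rough before/after sketch. The property and event names come from the list above (and the SongEnded fix later in this thread); everything else is illustrative:

    Code (CSharp):
    using UnityEngine;

    public class SongLoader : MonoBehaviour
    {
        public RhythmTool rhythmTool;
        public AudioClip clip;

        void Start()
        {
            // 1.x: rhythmTool.NewSong(clip);
            rhythmTool.audioClip = clip;          // 2.0 replacement

            // C# events instead of the old UnityEvents.
            rhythmTool.SongEnded += OnSongEnded;
            rhythmTool.Reset += OnReset;
        }

        private void OnSongEnded() { Debug.Log("Song finished."); }
        private void OnReset() { Debug.Log("Song restarted or a new song was loaded."); }
    }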
     
  20. yozzozo

    yozzozo

    Joined:
    Oct 7, 2013
    Posts:
    5
    Hi @HelloMeow,

    I've bought RhythmTool and integrated it into my game, but I'm running into a couple of issues. The game uses a short song that loops, as opposed to a single longer song that starts and ends. I'm detecting the end of the song and then simply calling Play again to loop.

    One issue is that the beat events from the RhythmEventProvider only seem to get sent until the song loops; then they are never sent again.

    Another thing is, I'm relying on BeatTime() to calculate normalized beat time (to synchronize GameObjects), and that seems to work fine even as the song loops. However, there's a delay right when it loops -- where I'm assuming the song is being re-analyzed by RhythmTool, despite me having the "Cache Analysis" option turned on. Eventually the tracking "catches up" to the playing song, and it works fine until it loops again. When it loops, shouldn't it just set its internal tracking indices to 0 and re-use the existing analysis if the song hasn't changed?

    Thanks!
     
  21. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi Yozzozo,

    The issue with the RhythmEventProvider is a known bug and has been fixed in version 2.0. Please contact me at tim@hellomeow.net if you want me to send you a fix for an older version.

    The other issue (with BeatTime) might be caused by the analysis itself. The synchronization of the beats can be off near the very start of the song. I'm trying to improve this for the next update. It's not likely that it is re-analyzing the song, because it doesn't do that when the song hasn't changed.
     
  22. yozzozo

    yozzozo

    Joined:
    Oct 7, 2013
    Posts:
    5
    @HelloMeow Thanks. Regarding the RhythmEventProvider issue, I'm actually using version 2.0, so this may be something new...
     
  23. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    I've found something that I hope might be causing the issue.

    Unity doesn't make a distinction between an AudioSource that has stopped playing and an AudioSource that is paused. RhythmTool did not handle this correctly. When a song reaches the end and stops playing, RhythmTool assumes it's paused. So when Play() is called and the song restarts, it does not reset the loop that checks for some of the events.

    Adding the following lines in RhythmTool.cs should fix it.

    In OnSongEnded() around line 500 add Stop();

    Code (csharp):
    private void OnSongEnded()
    {
        Stop();

        if (SongEnded != null)
            SongEnded();
    }
    And in OnReset() around line 520 add currentFrame = 0;

    Code (csharp):
    private void OnReset()
    {
        lastDataFrame = 0;
        currentFrame = 0;

        if (Reset != null)
            Reset();
    }
     
  24. yozzozo

    yozzozo

    Joined:
    Oct 7, 2013
    Posts:
    5
    @HelloMeow This fixed the issue for me! The song now sends beats reliably on every loop. Thanks!
     
  25. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    @yozzozo That's good to hear. A new update including this fix has been released today.
     
  26. larrytran

    larrytran

    Joined:
    Mar 4, 2015
    Posts:
    5
    Hi, can you explain this part of the code in the Visualizer example?
    Code (CSharp):
    foreach (Line line in lines)
    {
        Vector3 pos = line.transform.position;
        pos.x = cumulativeMagnitudeSmooth[line.index - rhythmTool.currentFrame] * .2f;
        pos.x -= magnitudeSmooth[rhythmTool.currentFrame] * .2f * rhythmTool.interpolation;
        line.transform.position = pos;
    }
    It seems that you are using the cumulative magnitude as a measure of distance for each Line. But why do you multiply by the interpolation of the past frame? It seems to work just fine without it.

    Also, is this the best way to move these Lines towards an endpoint? Why is using cumulative magnitude better than just counting the number of frames between them and interpolating between frame indexes?
     
  27. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Interpolation is used to move lines smoothly. Otherwise the lines would only appear to move when currentFrame changes, which is only roughly 30 times per second.

    In the example the lines move based on the loudness of the song, which is why cumulativeMagnitudeSmooth is used. The position of each line is based on the sum of all magnitudeSmooth from currentFrame up to the line's frame index.

    This is how to move the lines at a constant speed, based only on the line's frame index:

    Code (csharp):
    foreach (Line line in lines)
    {
        Vector3 pos = line.transform.position;
        pos.x = line.index - rhythmTool.currentFrame - rhythmTool.interpolation;
        line.transform.position = pos;
    }
     
  28. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Hi Tim, first of all: Awesome tool, thank you for sharing it via the Asset Store! We're currently using it to add procedural beatmap generation to Holodance, and the results so far are very promising!

    I ran into one thing I found a little odd and just wanted to check with you about this:

    Code (csharp):
    _low = new Analysis(0, 30, "low");      // 0hz - 645hz
    _mid = new Analysis(30, 350, "mid");    // 645hz - 7500hz
    _high = new Analysis(370, 900, "high"); // 7500hz - 20000hz
    _all = new Analysis(0, 350, "all");     // 0hz - 7500hz
    To me, this looks like there's a gap between 350 and 370, and, more importantly, _all ignores anything above 7500 hz. Is this done this way intentionally, or maybe just an oversight?

    Obviously, making a change here could throw off a lot of things relying on the current ranges, so I want to make sure this is correct as early as possible ;-)
     
  29. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    Yeah, that's intentional. The ranges are kind of arbitrary and happened to work the best for most songs I used while testing. A gap like that between the different ranges doesn't really affect the results, because most sounds are made up of a wide range of frequencies.

    Using different ranges like this really only gives you a very rough idea of the kinds of onsets that are detected. Recently I've been looking into alternative methods that could provide better data.
     
  30. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Thank you, that clarifies it. And can't wait for better data, even though the results are already a lot of fun to play and work perfectly in many cases.

    The greatest issue I see at the moment is tempo recognition and BPM. While it does work decently in some cases, we often get "half beat" (or maybe "double beat"): the song is actually fairly fast, like 150+ BPM, but RhythmTool thinks it's 75+ BPM. While with certain kinds of music this is a bit fuzzy (Dubstep would be an obvious example), I have seen this with four-to-the-floor techno/trance tracks.

    Another issue is that most current music actually never changes tempo but might start with little percussion, so at first the tempo detection is way off (simply because there are no onsets that would give you any reasonable way to guess the tempo) and then slowly adapts. In other words: I see a lot of tempo changes where there aren't any. The way I do this when I manually sync a song is by guessing the BPM, then comparing the bars in Cubase with the waveform and making sure the "ones" line up throughout the song.

    Would it be possible to add a checkbox for this assumption ("assume the song does not change tempo"), and then add a heuristic like that? Basically something that tells RhythmTool: "look, this song doesn't change tempo, so when you make the right guess, it will work throughout the whole song, which you can test by getting the 'dominant BPM' and then checking whether that matches most beats when you create them under that assumption".

    Closely related: for quite a few features we actually need to know where the bars start. So while it's great to know "here's a beat", what we often need is "this is the first beat of the bar". With a constant BPM and an offset (a song might actually start after a 0.378-second intro), this would be easy to calculate. We actually have that calculation already; in other words, we build our own ticks based on the offsets and tempi of the sections of songs where we know those for sure. In the case of procedural mapping, I'd be quite happy to have it work at least with constant-tempo songs. A sketch of that calculation follows below.
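
    (The sketch mentioned above; BeatTime, IsDownbeat, and the parameters are my own names, nothing from RhythmTool:)

    Code (CSharp):
    // The constant-tempo beat grid: bpm and offset (seconds before the
    // first beat) are the hand-measured inputs, e.g. from lining up
    // bars in Cubase.
    float BeatTime(int beatIndex, float bpm, float offset)
    {
        return offset + beatIndex * 60f / bpm;
    }

    // The first beat of each bar, assuming a fixed meter (beats per bar).
    bool IsDownbeat(int beatIndex, int beatsPerBar)
    {
        return beatIndex % beatsPerBar == 0;
    }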
     
  31. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Currently the beat tracking is limited to a range of 80-160 BPM. Anything outside of that range gets halved or doubled.
    This is an unfortunate side effect of the way competing BPMs are handled. Otherwise the detected BPM could fluctuate between half or double the actual value.

    Optionally looking for a static tempo is a great idea that I will add to the list. This would only be possible when pre-analyzing songs.

    Bar detection, like a lot of music feature extraction, is not easy; at least not in a way that works fairly consistently for most songs. It might seem easy if you know certain parameters, but these depend on the song and are not trivial to detect, even if you already have the beat locations. It is one of the features I want to add, along with the detection of pronounced held notes and better beat tracking.
     
    jashan likes this.
  32. Jos-Yule

    Jos-Yule

    Joined:
    Sep 17, 2012
    Posts:
    292
    Looking at the data that is returned by the tool, is it possible to subscribe to the Beat event and trigger other sounds to match/sync to that beat? I.e. I'd like to cut/crossfade 2 tracks with the same beat; can I use the beat event to start the other clip in sync?

    Thanks!
     
  33. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    That is possible, but I think that the current version has two problems that could make it more difficult.

    The first is that it can be tricky or messy to handle multiple songs. You might need 2 RhythmTool components. The second is that the beat tracking prioritizes smooth transitions when a different beat length and offset are detected. This is nice for gameplay and can mask incorrect beat tracking, but it will stand out when you sync a sound to it.

    I'm currently redesigning RhythmTool, and these problems are some of the reasons why.

    Instead of using the Beat event, I would recommend looking at the most common BPM and offset for the whole song and using those to sync the songs, along the lines of the sketch below.
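
    Something like this, using Unity's standard scheduling API (bpm and offset being whatever the analysis or your own measurement gives you; the method name is just an example):

    Code (CSharp):
    // Starts a second AudioSource on the next beat of the currently
    // playing track. Assumes playback is already past the offset.
    void StartOnNextBeat(AudioSource playing, AudioSource next, float bpm, float offset)
    {
        float beatLength = 60f / bpm;

        // How far into the beat grid the playing track currently is.
        float gridTime = playing.time - offset;
        float untilNextBeat = beatLength - (gridTime % beatLength);

        // Schedule on the DSP clock for sample-accurate playback.
        next.PlayScheduled(AudioSettings.dspTime + untilNextBeat);
    }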
     
  34. Jos-Yule

    Jos-Yule

    Joined:
    Sep 17, 2012
    Posts:
    292
    Yes, that was my "fall back" option. I'm pretty sure that the samples/tracks I'm going to be using will have a consistent BPM, so if I can detect the BPM (or know it beforehand), I can figure it out from there.

    Thanks, good luck on v2!
     
  35. RonTang

    RonTang

    Joined:
    May 20, 2017
    Posts:
    8
    Hello @HelloMeow,
    Thanks for your very good asset.
    I have a special case:
    I want to detect onsets more frequently, so I modified the variable "frameSpacing" in RhythmTool to 666...
    But I think the BPM is now broken; every song's BPM shows as 199. I don't believe this is an easy issue for me to solve,
    so I'm looking forward to your help.
    Thanks HelloMeow, and good luck on v2!
     
  36. RonTang

    RonTang

    Joined:
    May 20, 2017
    Posts:
    8
    OK, I fixed it by myself. Many other variables need to be modified when "frameSpacing" is changed. Thanks.
     
  37. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Yes. I wouldn't recommend changing frameSpacing, because most of the algorithms are tuned to its default value.

    In version 3 this shouldn't be an issue.
     
  38. ECHO-425

    ECHO-425

    Joined:
    Feb 2, 2018
    Posts:
    19
    Hi @HelloMeow

    Firstly - thanks for creating a great asset pack! I'm still looking through all the documentation in detail, but can already see how powerful & useful the tool is.

    I've seen this question pop up a few times in this thread, but have not seen a solution: how can I pre-calculate the BPM for a song, i.e. before playing it? I have been trying to figure out how to loop through all the frames or beats to calculate the average time between each, but to no avail. Do you know of or recommend a certain approach for pre-calculating a track's BPM using RhythmTool?

    As some insight into what I'm trying to accomplish: I'm essentially creating a DJ mixer where tracks are loaded in and analyzed in preparation for being played. I currently get the track waveform, but I'm looking to mark the beats on the waveform and calculate the BPM for use in mixing.

    Thank you in advance for your help!
     
  39. ECHO-425

    ECHO-425

    Joined:
    Feb 2, 2018
    Posts:
    19
    I figured it out :) I was accessing beat.bpm wrong... Thanks again for making this tool! Cheers
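
    For anyone else landing here, a sketch of the kind of thing that works once a song has been pre-analyzed. The Beat type and its bpm field are what I was using above; how you obtain the beat collection depends on your version, so treat the parameter as an assumption:

    Code (CSharp):
    using System.Collections.Generic;

    // Averages the per-beat BPM values of a pre-analyzed track to get
    // a single figure for the whole song.
    float AverageBpm(IEnumerable<Beat> beats)
    {
        float total = 0f;
        int count = 0;

        foreach (Beat beat in beats)
        {
            total += beat.bpm;
            count++;
        }

        return count > 0 ? total / count : 0f;
    }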
     
  40. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
  41. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Something else, but kind of related: I'm working on a system that lets players add meta-data to music, like tempo and meter, per section. With that information, you can trivially calculate not only where each beat is but also which one is the first beat of a bar. You could also use that information to quantize onsets. (A sketch of what I mean follows at the end of this post.)

    My understanding of how RhythmTool currently works is that it first tries to estimate the BPM and then already uses that information ... but I haven't really looked deep enough into the code.

    So, my question, or feature request, would be a way to put this human-created, highly reliable data into the system and use it in place of any algorithmic heuristics that figure out the same information. For our project, we will need both: if someone has already created the meta-data, we'd like to use that as the reference; otherwise, the heuristics (i.e. the standard approach) would be very helpful, too ;-)

    Is that something that's compatible with your current approach?
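
    (The sketch mentioned above; all names here are mine, nothing from RhythmTool:)

    Code (CSharp):
    using UnityEngine;

    // Hand-authored meta-data for one constant-tempo section of a song.
    [System.Serializable]
    public class SongSection
    {
        public float startTime;  // where the section begins, in seconds
        public float bpm;        // constant tempo within the section
        public int beatsPerBar;  // meter, e.g. 4 for 4/4

        // Time of the n-th beat in this section; beat 0 starts the first
        // bar, so downbeats are the beats where n % beatsPerBar == 0.
        public float BeatTime(int n)
        {
            return startTime + n * 60f / bpm;
        }

        // Snap an onset to the nearest beat of this section's grid
        // (the "quantize onsets" idea above).
        public float Quantize(float onsetTime)
        {
            float beatLength = 60f / bpm;
            float beats = Mathf.Round((onsetTime - startTime) / beatLength);
            return startTime + beats * beatLength;
        }
    }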
     
  42. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    The beats are detected by first finding the most likely beat length and then the most likely beat offset.

    In the current version it isn't really possible to supply manual data easily. I'm not happy with the way that works at all. That's one of the main reasons why I'm completely rewriting RhythmTool.

    In the version I'm working on, there is a RhythmData asset that can have several Tracks with different types of data. These tracks can be provided by Analyses, but it is also possible to edit them in the editor and hopefully eventually in-game as well.

    I'm currently trying to figure out if and how I should do an event system. I'm leaning towards just having the RhythmData asset, because it is very flexible. An event system would be nice, but it would also limit the ways in which it can be used. If anyone has any suggestions I would love to hear them.
     
    jashan likes this.
  43. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Awesome! This is what I was hoping for ;-)

    Personally, I'm not using the event-system at all. What we do is get all the onsets in a pre-processing step, generate the whole beatmap based on that data, using several iterations over the whole data that include finding decent positions for each "note", and then start playing the song. IIRC, I did cut a few places that are designed to handle streamed music or do the calculations on the fly ... which does create some limits in our approach, but I found that acceptable given the freedom it added ;-)

    Admittedly, I do have a kind of event-system on top of our beatmap player, so that's probably part of the reason I ignored all events-related stuff in RhythmTool (we added RhythmTool for creating procedural maps two years after the initial release, usually we work with hand-made maps).

    So ... for us, the most important thing about an event system coming with RhythmTool is that the tool can be used without the event system ;-)

    What is more important for us is being able to access those different "channels" in a convenient way. Like, one heuristic we use to generate our actual gameplay events is something like "if it's on a beat, and there is one additional onset on one channel, always make it a note, discarding other notes close to it if necessary", or "if there is one strong onset on one channel, or two onsets on two channels, make it a note", or "if there is only one onset, but we don't have any other onsets nearby, take that".

    Another thing that would be useful (and I believe it's actually already in there) is the general loudness. Among other things, we could probably use that to adapt our hitsounds to the loudness of the song, to avoid the hitsounds dominating or being drowned out by the music.

    I believe that if you want to have an event system, the two most important things are a) being able to get the events before they actually happen, with the time when they will happen, as well as getting the events exactly on time (so that's two different approaches that I believe will both be needed in most rhythm game scenarios). And ... b) being able to configure how the events are actually generated.

    b) seems like something that's very tricky to implement ... probably you could do it by supporting delegates or offering an interface, and one "default implementation".
     
    HelloMeow likes this.
  44. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    That should be pretty straightforward. You can find a RhythmData's Tracks by name and/or by Feature type. Feature is the base type for all data types. It has a timestamp and a length in seconds. Then you can find a Track's Features within a certain time range by passing a start and end time. This felt much more flexible and intuitive than using frames. It's trivial to build an event system on top of it.
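
    Roughly, using it would look something like this (the signatures are not final):

    Code (CSharp):
    // Query the features that fall inside the span of the song played
    // this frame; an event system is a thin layer on top of this.
    // previousTime and currentTime are the playback times of the last
    // and current frame.
    List<Beat> beats = new List<Beat>();
    rhythmData.GetFeatures<Beat>(beats, previousTime, currentTime);

    foreach (Beat beat in beats)
        OnBeat(beat);  // your own handler; fires once per beat, on time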

    Yes, the current version records volume for each Analysis, which is kind of unnecessary. In the new version there is a single Analysis called VolumeSampler. I want to give it some parameters like smoothing and sample rate.

    Thanks for the pointers. I've experimented a bit with different event systems. One uses a ScriptableObject asset as event provider, which is easy to configure and does not depend on scenes. Another prototype uses a component, which is similar, but is tied to a scene. And another uses a singleton as a global system.

    If I could pick your (or anyone else's) brain on another thing: I've got multiple prototypes for the Analyzer. One is a component that can be configured by adding Analysis components. This feels like it fits in with Unity's way of doing things, and it is easy to use. It supports multiple configurations, because you can have multiple GameObjects with different settings. The drawbacks are that it's tied to a scene and that I'm not sure how to make it support edit mode.



    Another is a regular class, with separate settings which can be configured in a settings window. This has the major advantage that you can analyze several AudioClips simultaneously and that you can analyze any AudioClip in edit mode. It's not tied to any scene. The drawback is that it has global settings and that it is a bit less intuitive. Which would you prefer?

     
    Last edited: Sep 26, 2018
  45. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Third option: you can make multiple SO profiles (see the CreateAssetMenu attribute), in a similar fashion to how e.g. the post-processing package uses them
    - each separately configurable and assignable to given analyzer instance/s
    (if I got your question right)
     
  46. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Thanks, this is the kind of input I'm looking for.

    This is one of the designs I've tested. It had drawbacks similar to the component-based version's. It's not clear how it should work in edit mode. For example, which profile is the editor going to use, and how? And how would the user give the profile to the Analyzer in play mode? Would the Analyzer be a component? It felt like a less intuitive take on the component-based version.
     
  47. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Analyzer is a component with a public member, e.g.
    Code (CSharp):
    public AnalyzerSettings analyzerSettings;
    where AnalyzerSettings is a ScriptableObject asset created in the Editor from the Create menu (CreateAssetMenu attribute).
    You can create as many of them as needed; one can be assigned either manually via the Inspector, or by whatever other means necessary (if e.g. the user has a list of settings, those can be assigned to the Analyzer at any time).
    An SO has advantages: it's not part of a scene (it's an .asset file in the project), its changes are saved in Play mode, and so on. A sketch of the settings asset itself is below.
    But I'm not *entirely* sure what you're exactly after right now, especially since it would probably need event handlers to be set up too; I'd probably go from there, i.e. make setting up events as easy as possible, then solve the component dependencies.
    (An SO asset file is rather comfortable to work with overall, but it's slightly different from normal scene stuff.)
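
    For completeness, the asset class would look something like this (the fields are just examples):

    Code (CSharp):
    using UnityEngine;

    // A settings profile as an asset: create one per configuration via
    // Assets > Create > RhythmTool > Analyzer Settings, then assign it
    // to an Analyzer. The fields are placeholders for real settings.
    [CreateAssetMenu(menuName = "RhythmTool/Analyzer Settings")]
    public class AnalyzerSettings : ScriptableObject
    {
        public int frameSpacing = 1024;
        public bool trackBeat = true;
        public bool preAnalyze = false;
    }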
     
  48. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Yes, this is more or less how it worked. The issue was that it was messy to get the Analyzer component to work in edit mode (right-click AudioClip > Analyze). While the settings are an asset, the analyzer is still a component in a scene somewhere. It wasn't all that different from the other component-based version.
     
  49. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Ah, so you wanted an editor script to do some processing on an asset (AudioClip). You'd need a (possibly separate) editor script for that, though, and then have both the editor script and the runtime analyzer use e.g. the same common, independent plain C# class.
    A good place for the analyzed data would then probably be a ScriptableObject asset (a component in the scene can reference it, and the analysis can use the common part if needed).
    I'm still not sure I'm getting it right, but hopefully you'll come up with something usable :)
     
  50. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    One more thing which might fit your intended usage:
    After creating the offline analysis asset via the right-click AudioClip editor script, you store a reference to the AudioClip itself alongside the analyzed data (you can e.g. make the analysis a separate step via a custom editor button).
    You can then reference this SO (with its AudioClip reference and data) in an AnalysisPlayer component, used solely for playing back preprocessed data, and have the (existing) Analyzer component process realtime audio (with an AudioClip referenced directly on it). Here a separation between offline/cached analysis playback and realtime, not-seen-before audio playback would probably make sense, too.