
Unpause animation if key value equals X

Jan 15, 2012 3:51 PM

Tags: #key #value #music #automation #midi #expressions #pitch

tl;dr: I need an expression that plays a sequence of images on a layer (and leaves it on the last image) whenever a keyframe on another layer in the same comp hits a specific value.

 

I am fairly new to AE and trying to port a project I made exclusively in Processing. Wrapping my head around how expressions work is proving difficult; I learn best by example, and I'm finding it hard to piece together how to do this from what I've found online.

 

The situation and what I need:

I have imported keyframes from a MIDI file (using this script) into a layer called "midi", and I have a separate layer that is a frame-by-frame animation. The idea is that when midi's effect ch_0_pitch has a value of, in this case, 36, the frames in the layer "ds_bass" animate each time playback crosses one of those keys.

 

This is basically a "bass drum" animating on a "bass note" pitch. The MIDI file contains everything the drum part needs (i.e. hi-hat, snare, bass), each with a different pitch value. I could render each individual pitch to its own MIDI file, import it as keys, and use a time-remap expression to pause and unpause the animation on markers (that has worked), but I need to be able to specify pitches for other instruments and run specific animations based on which pitch is being hit. Also, adding a marker for every MIDI note by hand is tedious and inaccurate (the visual difference between two keys with different values played simultaneously, e.g. a bass hit and a hi-hat hit, is very small).

 

I found an expression online that accomplished this at first, with markers, placed on Time Remap of the ds_bass layer. I would go note by note through the MIDI and add a marker to the ds_bass layer for each one, and it paused and ran the animation exactly when I needed it:

 

try {
    m = thisLayer.marker.nearestKey(time);
    linear(time, m.time, outPoint, 0, outPoint - m.time);
} catch (e) {
    value
}

 

But since I need to automate this process (this is a four-minute piece with a lot of instrumentation, and I'd like to reuse it as a template for another project), I looked at the reference and other resources. The best I could come up with is the following, which throws no errors but also does nothing (again on Time Remap of the ds_bass layer, in the same comp as midi):

 

try {
    if (thisComp.layer("midi").effect("ch_0_pitch").nearestKey(value) == 36) {
        m = thisComp.layer("midi").nearestKey(time);
        linear(time, m.time, outPoint, 0, outPoint - m.time);
    }
} catch (e) {
    value
}

 

In Processing this was all handled by a switch inside a method that received MIDI info on noteOn signals, checked the pitch value, and ran a method (an image sequence) for that pitch. Maybe I am going about this the wrong way, admittedly, as I am terribly new to AE, but it seems like it should be entirely possible to automate all of this, with solid timing driven directly by the MIDI data.
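To show what I mean, the Processing-style dispatch boils down to a pitch-to-animation lookup. This is just a plain-JavaScript sketch with invented layer names, not AE expression code:

```javascript
// Hypothetical pitch-to-animation lookup, mirroring the Processing switch.
var animations = {
  36: "ds_bass",   // bass drum
  38: "ds_snare",  // snare
  42: "ds_hat"     // closed hi-hat
};

// Return the animation (layer) name mapped to a pitch, or null if none.
function animationForPitch(pitch) {
  return animations.hasOwnProperty(pitch) ? animations[pitch] : null;
}
```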

 

Many of the other animations will also depend on running a sequence based on pitch: for example a puppet whose ArmL can animate to ArmLPlaySnare OR ArmLPlayHat, i.e. a single "resting" frame that launches different animations (sets of 1-5 .pngs) depending on MIDI pitch. Again, that is how it was managed in Processing.

 

If anyone can give some insight I would greatly appreciate it. I moved to AE because I need to render visuals in non-real time (from MIDI data), and AE's render capabilities seemed more in line with this project. At the moment I am stuck. Thanks for your time.

 

Edit - If I could simply copy the keyframes I need onto the ds_bass layer and then use the marker expression above with nearestKey, I figure it would work, but that is merely a convenience, not a proper fix: I would still have to render dozens of MIDI files and repeat the process for each. The best case would be one keyframed MIDI layer with all channels in it (in my "master comp"?) that the individual compositions could reference. At the moment I am putting instrument-specific MIDI-keyed layers into each instrument-specific comp.


 

Message was edited by: cheerbot: added references online to resources used

 
Replies
  • Jan 15, 2012 3:59 PM   in reply to cheerbot

    Interesting. So will your ch_0_pitch effect have keys for multiple drums? If so, the bass time-remapping expression needs to start at the current time and go backwards in time, key by key, until it finds a key that matches that drum. Then it will know how far along in the animation the time remapping should be. If there's only one instrument in the set of keys, the expression just needs to find the most recent previous key, which is pretty easy. Are there keys for both note-on and note-off events? With a little more info on the nature of the keyframe values, we can probably zero in on this.
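    That backward scan is easy to prototype outside AE. Here it is as a plain-JavaScript sketch, with keyframes mocked as {time, value} objects (these names are illustrative, not the AE API):

```javascript
// Mock keyframes: one entry per note-on, value = MIDI pitch.
var keys = [
  { time: 0.0, value: 36 }, // bass
  { time: 0.5, value: 42 }, // hi-hat
  { time: 1.0, value: 36 }  // bass
];

// Walk backwards from the current time until we find the most recent
// key at or before `now` whose value matches the target pitch.
function lastMatchingKeyTime(keys, now, pitch) {
  for (var i = keys.length - 1; i >= 0; i--) {
    if (keys[i].time <= now && keys[i].value === pitch) return keys[i].time;
  }
  return null; // no matching key has occurred yet
}
```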

     

    Dan

     
  • Jan 15, 2012 6:34 PM   in reply to cheerbot

    I think it's all very doable, but you have a number of issues to deal with. This is the basic expression that will calculate how long it has been since a note-36 keyframe:

     

    myId = 36;
    note = thisComp.layer("midi").effect("ch_0_pitch")("Slider");
    n = 0;
    if (note.numKeys > 0){
      n = note.nearestKey(time).index;
      if (note.key(n).time > time) n--;
    }
    while (n > 0){
      if (note.key(n).value == myId) break;
      n--;
    }
    if (n > 0)
      time - note.key(n).time
    else
      0
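    If you want to test that logic outside AE, here is the same algorithm in plain JavaScript, with the Slider keyframes mocked as an array (names and the note-off value of 0 are just illustrative):

```javascript
// Mock of the effect("ch_0_pitch")("Slider") keyframes.
var noteKeys = [
  { time: 0.0, value: 36 },
  { time: 0.25, value: 0 },  // a note-off key that can confuse the scan
  { time: 1.0, value: 36 }
];

// Seconds elapsed since the most recent key whose value == myId,
// or 0 if no such key has occurred yet (mirrors the expression above).
function timeSinceNote(noteKeys, time, myId) {
  var n = noteKeys.length;                              // latest candidate index
  while (n > 0 && noteKeys[n - 1].time > time) n--;     // skip keys in the future
  while (n > 0 && noteKeys[n - 1].value !== myId) n--;  // skip non-matching keys
  return n > 0 ? time - noteKeys[n - 1].time : 0;
}
```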

     

    The problem is that the script generates keys for note on and note off and there doesn't seem to be a good way to know which is which. So the note off keys are going to mess up the timing.

     

    So maybe you need to modify the script to leave those out. That should work for animations that start when the note sounds (like animating the drum, for example), but your other animations need to start before the note (like an arm moving to strike the drum). So you probably need a marker on the animation layer to denote where the animation needs to be when the note fires. Then your expression needs to calculate the timing that will synchronize the drum key to the marker on the animation. I did a tutorial over at Creative Cow a while back that discusses synchronizing with markers, which might be of some help:

     

    http://library.creativecow.net/articles/ebberts_dan/audio_sync.php
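    The core of that timing calculation is small. As a plain-JavaScript sketch (hypothetical names; in AE this would feed the Time Remap value): if the strike pose sits at markerTime on the animation layer and the note fires at noteTime, offsetting by their difference makes the layer reach the marker exactly when the note sounds:

```javascript
// Source time to feed into time remapping so that the strike frame
// (at markerTime on the layer) lands exactly on the note at noteTime.
function remapForStrike(time, noteTime, markerTime) {
  return markerTime + (time - noteTime);
}

// 0.2 s before the note, the layer shows the frame 0.2 s before the marker.
remapForStrike(0.8, 1.0, 0.5); // ≈ 0.3
```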

     

     

    Dan

     
  • Jan 15, 2012 11:51 PM   in reply to cheerbot

    I think you're close, but it occurred to me that it might be better to modify the script to generate layer markers instead of keyframes. Markers can have parameters associated with them, so a single marker could contain the pitch, the duration, and the velocity. That would get rid of the pesky note-off keys. I think all you'd be losing is the note-off velocity (I don't know if that's critical to your plans). I'll take a look at the script tomorrow if I have time.

     

     

    Dan

     
  • Jan 16, 2012 10:16 AM   in reply to cheerbot

    If you go into the script's applyChannelToLayer() function and wrap the piece of code that populates the pitch, vel, and dur arrays in a test for non-zero velocity, I think that will eliminate the keyframes that mess up the expression:

     

     

    if (vel > 0){
      pitchTimes.push(t);
      pitchValues.push(pitch);
      velTimes.push(t);
      velValues.push(vel);

      var durS = "" + dur;
      if(vel && typeof(dur) != "undefined")
      {
        durTimes.push(t);
        durValues.push(dur);
      }

      lastT = t;
    }
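    The effect of that guard, sketched in plain JavaScript over a flat list of note events (the event structure here is assumed for illustration, not the script's actual data layout): a velocity of 0 marks a note-off, and the filter drops it.

```javascript
// A MIDI channel as a flat list of events; vel 0 marks note-off.
var events = [
  { t: 0.0, pitch: 36, vel: 100 },
  { t: 0.2, pitch: 36, vel: 0 },   // note-off: drop it
  { t: 0.5, pitch: 42, vel: 90 }
];

// Keep only note-on events, mirroring the `if (vel > 0)` guard above.
var noteOns = events.filter(function (e) { return e.vel > 0; });
```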

     

    Give that a try and see if it helps.

     

    Dan

     
  • Jan 16, 2012 4:45 PM   in reply to cheerbot

    I'm not sure you really need the note-off events, because you have the note duration. Note-off just gives the velocity of the key release, doesn't it? If you can live without that, you should be able to calculate the note-off time from note-on plus duration. Of course, I may be missing something (it wouldn't be the first time)...

     

    I don't know if you've seen this on my site, but it has some info on multiple animations on the same layer (delineated by markers) that might be useful:

     

    http://www.motionscript.com/design-guide/marker-sync.html

     

    I did try to convert the script to save all the data in markers, but it hung AE. I don't know if it's a bug I introduced or whether it choked on so many markers, but I may try it again, because that seems like the way to go if it works.

     

    Dan

     
  • Jan 16, 2012 4:56 PM   in reply to cheerbot

    Cheerbot -- I love the animation you linked http://andsuch.org/pub/saturn_v4_test.mov !

     

    Dan -- Looks like you found a good fix for his needs, eliminating the note-offs. I'll definitely add a checkbox for that (or maybe just remove all the note-offs, they're not so interesting with the Duration track...)

     

    Is there any benefit to using Markers instead of Keyframes? The Keyframes carry values, which seems nice.

     

    Cheerbot -- my thought was that in some cases one could use the music authoring tool to construct a reduced or altered MIDI track that was specifically for the AE import. Polyphony makes for confusing keyframes. Very simple scores (like the Two Part Invention example I used) work ok as-is.

     
  • Jan 16, 2012 5:06 PM   in reply to david van brink

    I was thinking markers might be a good choice because you can stuff the note, velocity, and duration in the parameters of one marker. That way you wouldn't have to correlate separate streams of keyframes.
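    Marker comments and parameters are ultimately plain strings, so one illustrative way to pack and read those three values from a single marker (a hypothetical format, not what the script emits) looks like this:

```javascript
// Parse a marker comment like "pitch=36;vel=100;dur=0.25"
// into an object with numeric fields.
function parseMarkerComment(comment) {
  var note = {};
  var pairs = comment.split(";");
  for (var i = 0; i < pairs.length; i++) {
    var kv = pairs[i].split("=");
    note[kv[0]] = Number(kv[1]);
  }
  return note;
}

var n = parseMarkerComment("pitch=36;vel=100;dur=0.25");
// n.pitch, n.vel, and n.dur now hold all three values from one marker.
```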

     

    Dan

     
  • Jan 16, 2012 8:01 PM   in reply to Dan Ebberts

    Oh! I... I never used markers. They can hold multiple attributes. Ok. Awesome. Got it. Thanks.

     
  • Jan 19, 2012 6:46 AM   in reply to cheerbot

    This has been very helpful. Nice project.

    I had just discovered David's script and also wanted to create keyframes for the note-on events only. I'm just painting notes like a piano roll, but with only one keyframe per pitch you can choose to paint a bezier or linear curve instead. Thanks to everyone involved.

     
