Assessing coordination / synchrony between two walkers?

New Here ,
Feb 19, 2018

Would it be possible to use AE to assess the amount of coordination or synchrony between two people walking?

We are new to AE and will be running an experiment where we want to measure synchrony between two people walking in a straight line side by side.

Assuming, of course, we had a high enough quality camera set up appropriately, and adequately distinct markers on people's shoes/ankles and on the ground, would it be possible?

We would want not only to track the location of the co-walkers' feet across frames, but also to compute a measure of relative phase between the two people's feet at various time points. Is there any way for AE to output the kind of data we would need, without us having to take measurements ourselves frame by frame?
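To be concrete about what we mean by relative phase: given the heel-strike times of each walker, each of person 2's strikes can be placed as a phase within person 1's stride cycle. A rough sketch in plain JavaScript (our own illustration of the measure, not anything AE provides):

```javascript
// Sketch: point-estimate relative phase between two walkers from their
// heel-strike times (in seconds or frames). Person 1's strides define
// the reference cycle; each of person 2's strikes is expressed as a
// phase (0-360 degrees) within the stride it falls into.
function relativePhases(strikes1, strikes2) {
  const phases = [];
  for (const t of strikes2) {
    // Find the stride of person 1 that contains time t.
    for (let i = 0; i < strikes1.length - 1; i++) {
      const start = strikes1[i], end = strikes1[i + 1];
      if (t >= start && t < end) {
        phases.push(360 * (t - start) / (end - start));
        break;
      }
    }
  }
  return phases;
}

// In-phase walkers: strikes coincide, phase 0 degrees.
console.log(relativePhases([0, 1, 2, 3], [0, 1, 2]));       // [0, 0, 0]
// Anti-phase walkers: person 2 strikes mid-stride, phase 180 degrees.
console.log(relativePhases([0, 1, 2, 3], [0.5, 1.5, 2.5])); // [180, 180, 180]
```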

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Community Expert ,
Feb 19, 2018

Motion tracking can give you the position data for each point you track; each tracker will give you a separate set of keyframes. But you are going to have to be a math wizard to write an expression that compares the position data of tracker 1 with the position data of tracker 2, and that also accounts for camera movement, the position of each body in relation to the ground, and all the other factors, before you get meaningful data telling you the difference in time between the left heel of person 1 striking the ground and the left heel of person 2 doing the same.

An easier approach may be to add two nulls to the composition, label null 1 "person 1" and null 2 "person 2", then manually set markers each time a left heel strikes the ground. You can do this by previewing the video and striking a key each time a foot hits the ground.
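Once those markers are set, the comparison step is small: pair each of person 1's strike times with the nearest strike of person 2 and read off the lag. A plain JavaScript sketch (illustrative only, not AE expression code; times could be in frames or seconds):

```javascript
// Sketch: pair each heel-strike time of person 1 with the nearest
// heel-strike time of person 2 and report the signed lag
// (positive = person 2 strikes later than person 1).
function strikeLags(strikes1, strikes2) {
  return strikes1.map(t1 => {
    let best = strikes2[0];
    for (const t2 of strikes2) {
      if (Math.abs(t2 - t1) < Math.abs(best - t1)) best = t2;
    }
    return best - t1;
  });
}

// Strike times in frames: person 2 lags by 3, 1, and 6 frames.
console.log(strikeLags([0, 30, 60], [3, 31, 66])); // [3, 1, 6]
```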

If you want to automate the process, you will have to establish a ground plane for each walker that stays at exactly the same Y position, then write collision detection against that value.
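That collision test amounts to checking, frame by frame, whether the tracked heel's Y value has entered a small tolerance band around the ground-plane Y. A plain JavaScript sketch of the idea (function name and threshold are illustrative; in AE this logic would live in an expression):

```javascript
// Sketch: given per-frame Y positions of a tracked heel and a fixed
// ground-plane Y for that walker, return the frame numbers where the
// heel first makes contact (enters the tolerance band).
function heelStrikeFrames(ySamples, groundY, tolerance) {
  const frames = [];
  let onGround = false;
  ySamples.forEach((y, frame) => {
    const touching = Math.abs(y - groundY) <= tolerance;
    if (touching && !onGround) frames.push(frame); // first frame of contact
    onGround = touching;
  });
  return frames;
}

// Heel bobs toward and away from a ground plane at y = 100, tolerance 2 px:
console.log(heelStrikeFrames([90, 95, 99, 100, 99, 94, 90, 98, 100], 100, 2));
// [2, 7]
```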

The more I think about this problem, the more I think After Effects can do it, but there is probably a much easier way to get this data. If you have not run the test yet, it might be easier to record the data from some kind of wearable (an Apple Watch or something similar) and analyze that instead. Accurately tracking the movement is pretty easy; establishing a ground plane for each walker is going to require some precise camera work and a lot of careful work in post, and the calculations are going to require even an expert like Dan Ebberts, the expression master of all expression masters, to spend several hours fiddling with them.

New Here ,
Feb 19, 2018

Hi Rick,

Thanks for your quick and helpful reply.

No, we haven't collected data yet; we're just exploring our options.

We were thinking of having a stable camera at one end of a hall and having people walk toward it (and then away from it) for maybe 20 meters, side by side in a straight line. Would that mean many of the issues you pointed out might not be present?

Yes, we were hoping to automate the process if possible.

Adding markers to the floor (where people would step) may actually be a possibility, which, if I understood correctly, would mean we could automate the process, right?

I'm not sure Apple Watches would detect the movement of each foot, though (although I must admit I hadn't thought of, or looked into, the possibility of using an Apple Watch, but I will; thanks for the idea). We do have other options in mind as well.

Thanks again for your input, Rick.

Community Expert ,
Feb 19, 2018

What I meant about the difficulty of automating the process was that if you track someone's foot, it is going to move up and down on the Y axis. Let me just talk through it and see what problems I run into.

If the person is walking toward you, then as they get closer to the camera the foot is going to be farther down the Y axis. You would have to write an expression that looked for two or three frames with the same Y value past a minimum threshold and record that number. The problem with expressions is that they do not store data; to look back, you have to write a recursive expression that examines every frame and then finds some predetermined value. This makes the process very slow. The calculations happen pretty quickly for the first 30 frames, but then the time to calculate each subsequent frame starts to go up exponentially. If the shot is a couple of minutes long, you could easily get into a situation where the calculations take several minutes per frame. There are just a lot of potential problems with that approach.
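Outside of an expression, where storing data is not a problem, that scan is straightforward. The sketch below (plain JavaScript; the function name and run-length criterion are illustrative) applies the rule of two or more consecutive frames with the same Y value past a threshold:

```javascript
// Sketch: find runs of two or more consecutive frames whose Y values are
// equal (within eps) and past a minimum threshold -- the "foot is planted"
// criterion. Screen Y grows downward, so "past the threshold" means larger.
// Returns the first frame of each run.
function plantedFrames(ySamples, minY, eps) {
  const strikes = [];
  let runStart = -1;
  for (let i = 1; i < ySamples.length; i++) {
    const flatAndLow =
      Math.abs(ySamples[i] - ySamples[i - 1]) <= eps && ySamples[i] >= minY;
    if (flatAndLow && runStart < 0) runStart = i - 1;   // run begins
    if (!flatAndLow && runStart >= 0) {                 // run ends
      if (i - runStart >= 2) strikes.push(runStart);
      runStart = -1;
    }
  }
  if (runStart >= 0 && ySamples.length - runStart >= 2) strikes.push(runStart);
  return strikes;
}

// Two plants: frames 2-4 and frames 8-9 sit flat at y = 540 (minY 535):
console.log(plantedFrames([500, 520, 540, 540, 540, 510, 500, 530, 540, 540], 535, 1));
// [2, 8]
```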

Let's try a different approach. Instead of tracking the motion, why not put a microswitch in each person's shoe that turns on a light when their foot strikes the floor? You could have a different light on the left and right foot. Then you could track the position of the lights and sample their color to tell when they turn on and off. This data could be used to animate the position of a line graph, so you could tell when each foot was on the ground. That might be the start of an idea. The different graphs could be aligned so you could see when each foot hit the ground and how in sync they were. I am not sure how you would go about measuring the time difference between the on and off pulses, but you could quite easily figure out the time each light turned on and record that value on a text layer. If the person walking took 300 steps, you would need 300 text layers, and you could then retrieve that data to make other calculations. This is looking very complicated and may be a dead end.
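The sampling side of that idea is the tractable part: an expression can call sampleImage() at each light's tracked position every frame, and the resulting brightness series reduces to on/off pulses. A plain JavaScript sketch of that reduction step (sampleImage is a real AE expression method; the function below and its threshold are illustrative):

```javascript
// Sketch: turn a per-frame brightness series (0..1, e.g. from sampling
// the tracked light) into "on" pulses: one [startFrame, endFrame] pair
// per foot contact, using a simple brightness threshold.
function lightPulses(brightness, threshold) {
  const pulses = [];
  let start = -1;
  brightness.forEach((b, frame) => {
    const on = b >= threshold;
    if (on && start < 0) start = frame;   // light just turned on
    if (!on && start >= 0) {              // light just turned off
      pulses.push([start, frame - 1]);
      start = -1;
    }
  });
  if (start >= 0) pulses.push([start, brightness.length - 1]);
  return pulses;
}

// Two foot contacts: light on during frames 2-4 and 7-8.
console.log(lightPulses([0, 0.1, 0.9, 1, 0.8, 0.1, 0, 0.9, 1], 0.5));
// [[2, 4], [7, 8]]
```

Aligning the pulse lists from the two walkers would then give the on/off timing comparison described above without any text layers.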

If I think of other approaches, I will let you know. If this were my project, I would want 10 days to run tests and try expressions to determine whether it is doable in AE, and I would offer no guarantee of successfully automating the process. Just for a frame of reference: I have been using After Effects on average probably 10 to 15 days a month for the last 25 years, and I write some new expressions about every 10 projects. I could probably get something done in a day for a 1-minute shot by simply counting frames manually. Unless you have a bunch of these tests to run, it would be much faster to just set some markers by hand and type in the numbers, or animate the graph by hand.
