
    lip sync in character animator not very accurate

    davidl7576562

Hello World,

       

I have just started experimenting with Character Animator and have set up a puppet with the mouth shapes as described in the tutorial.

      When I import my audio for lip sync, it makes a good attempt but is not particularly close to being accurate.

Is there a way to train the lip sync to my voice?

      Do I have to over-enunciate?

      Or is it because I am Australian and it has no idea what I am talking about??!

       

      Hope you can help.

       

      Thanks

        • 1. Re: lip sync in character animator not very accurate
          oksamurai Employee Moderator

          We are constantly working on lip sync to make it better, for everyone regardless of nationality or language. If there are particular sounds that don't seem to work well for you, please let us know here. Or even better, upload a short unlisted video to YouTube and show us how it messes up. And maybe try different example puppets to see if the mouths feel better or worse for you...sometimes it's a subjective thing what works best.

           

          You also have the option to edit your lip sync after recording, as explained about 3 minutes into this video: Recording & Editing with Takes (Adobe Character Animator Tutorial). This will be even easier in future releases, we're working on it.

           

          Hope that helps a little...

          • 2. Re: lip sync in character animator not very accurate
            davidl7576562 Level 1

Thanks for the reply, I will see if I can put together a YouTube video to show the current results.

            One of the most obvious problems is that it quickly reverts to the neutral pose between other sounds.

            I will check that video out for now, thanks again.

            • 3. Re: lip sync in character animator not very accurate
              impact900

              I am having the same sort of lip syncing problem.

For example: no matter how loudly and clearly I talk, whenever I'm making the W-Oo sound or the Oh sound, it picks up L. My puppet shows his tongue and everything. It makes me have to constantly edit the visemes after every performance.

              • 4. Re: lip sync in character animator not very accurate
                oksamurai Employee Moderator

                If possible, can you post here or DM me a link to an audio file that shows off your issue? Like, say a bunch of W-Oo or Oh words that seem to be incorrectly showing up as L sounds? That would help us tremendously and possibly fix the issue for future releases.

                • 5. Re: lip sync in character animator not very accurate
                  El Wombat Level 2

I've found that, for example, the English pronunciation of "Oh" as in "only" gets picked up way better than the German or Spanish ones. I assume that's because the "Oh" sound in "only" is a rather easy-to-hear diphthong, while both German and Spanish "ohs" are really close to a-class vowels.

                   

As I have mentioned before - but not yet made a video of it, sorry - there is a similar focus on English with the letter/viseme "R". While the "r" as in "further" or "corpus" gets picked up just fine, the rolling "r" that Hispanic speakers use so fondly, as in "Tarrrrrrragona", seems to not find any customers within Ch. I don't think that the French and German "R"s get picked up well, either. They are also guttural, like the US/UK "R"s, but are produced higher in the mouth; they sound like the "H" that Slavic speakers use when trying to speak English, like when they say "Chralftime" instead of "halftime". Basically, the German and French "R"s are "H"s with more pressure and power, produced in the same mouth area.

                   

                  This is all not very linguistic, I know, but I still hope this helps.

                  • 6. Re: lip sync in character animator not very accurate
                    El Wombat Level 2

…and having spent all my life pondering different languages and ways of pronunciation (I was raised bilingual), making Ch lip-syncing usable for any language seems realistic to me only if Ch provides the possibility to influence the actual "picking-up system" - ideally, I would be able to record sounds and then tell Ch what viseme to use for them.
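                    Purely for illustration, here is what such a user-defined correction table might look like, sketched in Python (Ch exposes no API like this today; the function and data below are hypothetical):

                    # Hypothetical example only: Character Animator has no scripting hook like this.
                    # The idea is a per-user table of recurring misdetections and the viseme each
                    # one should become, applied to a recorded take in one pass instead of by hand.

                    MY_CORRECTIONS = {
                        "L": "W-Oo",   # my rounded vowels keep getting detected as L
                    }

                    def apply_corrections(viseme_track, corrections):
                        """viseme_track is a list of (start_time_in_seconds, viseme_name) pairs."""
                        return [(t, corrections.get(v, v)) for t, v in viseme_track]

                    take = [(0.00, "Neutral"), (0.12, "L"), (0.34, "Ah"), (0.58, "L")]
                    print(apply_corrections(take, MY_CORRECTIONS))
                    # [(0.0, 'Neutral'), (0.12, 'W-Oo'), (0.34, 'Ah'), (0.58, 'W-Oo')]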

                     

While the lip sync keeps impressing me even after so many months of working with it, I also have the same kinds of problems that are being mentioned a lot in this thread, and for me, too, the mistakes are recurrent.

                    • 7. Re: lip sync in character animator not very accurate
                      oksamurai Employee Moderator

                      Yes, if you could send us an audio file with these recurring issues / sounds, that would be helpful!

                      • 8. Re: lip sync in character animator not very accurate
                        El Wombat Level 2

Just finally did the recording. PM'd you the whole project; hope that helps! Thanks for making the world a better animated place.

                        • 9. Re: lip sync in character animator not very accurate
                          impact900 Level 1

Hey, sorry for responding late; I want to show you some of these files so you can see some of the issues I'm having with lip sync.

                          Can I share a Dropbox folder with you? I want to send you an archive folder that has audio, screenshots of my unedited visemes, a video of how it looks together, and the .puppet file.

                          • 10. Re: lip sync in character animator not very accurate
                            oksamurai Employee Moderator

Sure, that would be great.

                            • 11. Re: lip sync in character animator not very accurate
                              mickeyd95133885

I'm Dutch and also experiencing problems with the O and W-Oo sounds translating into an L sound or something.

                               

It would be great if there were a tool inside Ch that learned how every user would like the visemes to appear and, if one came out wrong, let us correct it. That way every user could have their own unique and correct lip sync.

                              • 12. Re: lip sync in character animator not very accurate
                                oksamurai Employee Moderator

                                Agreed, that would be a nice feature.

                                • 13. Re: lip sync in character animator not very accurate
                                  Farish

Is there a way I can add additional mouth shapes to edit after the lip sync takes effect? For instance, I am saying "A" on a certain part, but my only options are "Ah", "Uh", "Oo", and "Oh".

                                  • 14. Re: lip sync in character animator not very accurate
                                    ivolea

W-Oo is definitely buggy. The shape being captured is obviously tighter and smaller, but it does not recognise it (sad).

                                    • 15. Re: lip sync in character animator not very accurate
                                      ivolea Level 1

Oh, and I tried to post a JPEG and a PNG exported from Adobe Photoshop itself, only to receive an error ... guess what the error was ... "File Type Forbidden".

                                      • 16. Re: lip sync in character animator not very accurate
                                        alank99101739 Level 4

Not sure what's going on with the forum for the screenshot... but I did not quite understand your comment about "W-Oo" being buggy.

                                         

The way I normally set up a Mouth with other expressions is:

                                         

                                        • Mouth Group
                                          • Happy
                                          • Grin
                                          • Angry
                                          • Mouth
                                            • Ah
                                            • R
                                            • L
                                            • S
                                            • Uh
                                            • Smile
                                            • Neutral
                                            • (etc)

That is, I have "Mouth Group" listing all the non-standard mouth positions, followed by a "Mouth" group under which go the standard visemes that lip sync uses. I then set up two swap sets: one for the standard visemes ("Smile", "Neutral", plus any other standard visemes I want to trigger) with "Neutral" as the default trigger in the swap set, plus another swap set at the Mouth Group level ("Happy", "Grin", "Angry", etc., plus "Mouth" as the default in the swap set). So by default it uses visemes, but if I apply the "Angry" trigger, that hides the "Mouth" group and displays the "Angry" layer instead. This has worked pretty well for me once I set it up this way.

                                         

Putting non-standard visemes in the "Mouth" group did not work so well - I had the standard layer being displayed at the same time as the non-standard layer. Hiding the whole "Mouth" group instead got around the problem.
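                                         To make that concrete, here is a tiny conceptual model in Python (Character Animator has no scripting language for this; the layer names below simply mirror the setup described above) of how the two swap sets decide which single mouth layer ends up visible:

                                         # Conceptual model only, not Character Animator scripting. "Mouth Group" holds
                                         # the expression layers plus a "Mouth" sublayer; "Mouth" holds the standard
                                         # lip-sync visemes. Each level is a swap set with a default member.

                                         EXPRESSION_LAYERS = {"Happy", "Grin", "Angry"}            # Mouth Group swap set
                                         STANDARD_VISEMES = {"Neutral", "Ah", "R", "L", "S",
                                                             "Uh", "Smile"}                        # Mouth swap set (etc.)

                                         def visible_layer(expression_trigger=None, lipsync_viseme="Neutral"):
                                             # An expression trigger swaps out the whole "Mouth" sublayer...
                                             if expression_trigger in EXPRESSION_LAYERS:
                                                 return expression_trigger
                                             # ...otherwise the default member "Mouth" shows, and inside it the viseme
                                             # swap set picks one standard shape, falling back to the default "Neutral".
                                             return lipsync_viseme if lipsync_viseme in STANDARD_VISEMES else "Neutral"

                                         print(visible_layer(None, "Uh"))      # Uh    (lip sync drives the mouth)
                                         print(visible_layer("Angry", "Uh"))   # Angry (trigger hides the Mouth group)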

                                        • 17. Re: lip sync in character animator not very accurate
                                          SharonfromMD Level 2

                                          I'm presuming here you are trying to have it lip sync live. I'm a composer of children's songs. I have found that the lip sync works best when I just have the program lip sync to the recorded voice parts only. It's not perfect, but way better than if I have the instrumentation with it. Generally, it records too many sounds, which I can get rid of. If I just concentrate on the long notes, it looks fairly good.

                                           

My suggestion as a workaround would be to record your voice as a decent digital recording (I use Logic Pro X) and have the program do it automatically. You would then import that as an audio file, then select it and select the scene you want. You then go up to the Timeline drop-down menu and select "Compute Lip Sync from Scene Audio". Good luck.
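 

                                          For anyone who wants to script that prep step, here is a rough sketch, assuming Python with the third-party pydub library (plus ffmpeg) and a vocals-only stem already bounced out of your DAW; the filenames are just placeholders. The idea is simply to hand Ch a clean, mono, well-levelled voice track rather than the full mix:

                                          # Assumes the third-party pydub library (pip install pydub) and ffmpeg.
                                          # Filenames are placeholders.
                                          from pydub import AudioSegment
                                          from pydub.effects import normalize

                                          # Load the vocals-only stem exported from the DAW (e.g. Logic Pro X).
                                          vocals = AudioSegment.from_file("vocals_stem.wav")

                                          # Downmix to mono and normalize so quieter syllables still register clearly.
                                          vocals = vocals.set_channels(1)
                                          vocals = normalize(vocals)

                                          # Export a clean WAV to import into Character Animator, then run
                                          # "Compute Lip Sync from Scene Audio" from the Timeline menu on the scene.
                                          vocals.export("vocals_for_lipsync.wav", format="wav")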