If anyone is interested in this workflow, here’s a more precise description of what I am currently doing. I would love to be able to optimize further, so if anyone has ideas please let me know.
General setup: I created a G3+ character in CC4 and pushed it into a Blender 4.5 project with the Blender Datalink (via the CC/iC Link plugin on the Blender side). I then used the Rigify option in the CC/iC Link plugin to generate a Rigify rig for this character, which is the main actor in my Blender scene.
Using iClone 8.6 and AccuLips, I created the lip sync on a copy of just the Body mesh of my character (no clothing, eyes, hair, etc., because I am only interested in the lip sync).
After the lip sync was to my satisfaction in iClone, I created a blank Blender project to act as the container for the transferred lip sync animation (rig control keyframes and shape key animation). I used the Blender Datalink plugin to send the avatar and its motion to Blender. In Blender, this creates actions that hold the rig animation keyframes as well as the shape key animation keyframes.
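As a rough sanity check at this point (plain bpy, nothing Datalink-specific), something like this lists what the transfer created, with frame ranges and channel counts:

```python
import bpy

# List every action in the interim file to confirm what the Datalink
# transfer actually created (rig action + shape key action).
for action in bpy.data.actions:
    start, end = action.frame_range
    print(f"{action.name}: frames {start:.0f}-{end:.0f}, {len(action.fcurves)} channels")
```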
In the interim Blender project, where I imported the rig action, I removed all of the rig action keyframe channels that I don't want. Basically, I remove every channel that isn't on the teeth.T, teeth.B, jaw_master, or tongue_master bones. (I wrote a Blender plug-in to make this part easier; a rough sketch of the idea is below.)
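The core of it is just filtering F-Curves by data path. A minimal sketch, where the action name "iClone_Face_Action" is a placeholder for whatever the Datalink actually created:

```python
import bpy

# Bones whose channels I want to keep; every other channel gets dropped.
KEEP_BONES = {"teeth.T", "teeth.B", "jaw_master", "tongue_master"}

action = bpy.data.actions["iClone_Face_Action"]  # placeholder name, adjust

for fcurve in list(action.fcurves):
    # Pose-bone channels use data paths like: pose.bones["jaw_master"].rotation_quaternion
    if not any(f'pose.bones["{bone}"]' in fcurve.data_path for bone in KEEP_BONES):
        action.fcurves.remove(fcurve)
```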
I also had to tweak the jaw_master rig control F-Curve (the X component of its quaternion rotation) to open the mouth slightly. For some reason, on this character the mouth sits slightly more closed after the Rigify conversion. I have not investigated the root cause of this yet, but the fix/workaround is fairly straightforward.
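The workaround amounts to adding a small constant offset to every keyframe on that curve. A sketch, assuming the jaw is keyed on rotation_quaternion and that array_index 1 (the X component) is the one driving open/close; the 0.02 offset is just a value to tune by eye:

```python
import bpy

action = bpy.data.actions["iClone_Face_Action"]  # placeholder name, adjust
JAW_PATH = 'pose.bones["jaw_master"].rotation_quaternion'
OFFSET = 0.02  # tune until the resting mouth shape looks right

for fcurve in action.fcurves:
    if fcurve.data_path == JAW_PATH and fcurve.array_index == 1:  # X component
        # Shift keyframe values and their handles together so the curve shape is preserved.
        for kp in fcurve.keyframe_points:
            kp.co[1] += OFFSET
            kp.handle_left[1] += OFFSET
            kp.handle_right[1] += OFFSET
        fcurve.update()
```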
Lastly, I had to make sure that all of the keyframes that came from iClone land at the right position in time. I don't know if this is a limitation of my knowledge of the Blender Datalink, but if I set the animation range in iClone to something like frame 300 to frame 1200, the keyframes in Blender all start at frame 1. This is annoying because I need the iClone frame numbers to stay in sync with the Blender frame numbers. Due to limitations of the Datalink, I can only export about 1200 frames at a time (at 24 fps) from iClone; otherwise the Datalink hangs at 30% progress and I have to force-quit iClone. So, for a 6000-frame lip sync animation, I need to export the keyframes in 1500-frame chunks. Since the frame numbers are not preserved, I have to align these manually in Blender.
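The manual alignment itself can be scripted by shifting every keyframe in a chunk's action by a constant frame offset. A sketch, where the offset of 299 assumes a chunk that was exported from iClone frame 300 but landed at Blender frame 1 (repeat the same shift for the chunk's shape key action):

```python
import bpy

action = bpy.data.actions["iClone_Face_Action"]  # placeholder name, adjust
FRAME_OFFSET = 299  # iClone start frame minus where the chunk landed in Blender

for fcurve in action.fcurves:
    # Move keyframes and their handles together so interpolation is unchanged.
    for kp in fcurve.keyframe_points:
        kp.co[0] += FRAME_OFFSET
        kp.handle_left[0] += FRAME_OFFSET
        kp.handle_right[0] += FRAME_OFFSET
    fcurve.update()
```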
Now that everything is cleaned, aligned, verified, and renamed in my interim Blender project, I save that project and open my main animation. I "append" the interim character into my main project to bring in both the shape key animations and the rig action. Using the Action Editor in the Dope Sheet, I assign the face rig action to my main character. Using a shape key transfer plug-in, I transfer the shape keys from my interim character to my main character.
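For what it's worth, the Action Editor step also has a scripted equivalent; a sketch, with the object and action names as placeholders for whatever the append step brought in:

```python
import bpy

rig = bpy.data.objects["rig"]                         # main character's Rigify armature (placeholder name)
face_action = bpy.data.actions["iClone_Face_Action"]  # the appended face action (placeholder name)

# Assigning the action here is the scripted equivalent of picking it in the Action Editor.
if rig.animation_data is None:
    rig.animation_data_create()
rig.animation_data.action = face_action
```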
This feels like a lot of work, but the flow is deterministic and repeatable, and in the big picture I can run it on a 6000-frame lip sync in about an hour. I would love to hear about possible optimizations. Three obvious ones would be changes to iClone's Blender Datalink plugin: (1) an option to filter what is sent to Blender; (2) an option to preserve the frame numbers between iClone and Blender; and (3) support for exporting a large number of frames without hanging. I'm new to iClone and Reallusion - is there a developer feedback process?
thanks