How do Unity and Unreal sequencers compare?

Can Unity sequencer:

  • instantiate objects and manage their lifetime per camera (for example, a light rig or a table that shows up only with certain cameras)
  • animate objects and retime animation per track
  • switch between IK and FK on characters
  • parent-constrain objects to others (e.g. a gun to a hand, then drop it) and bake constraints
  • bake physics of a track
  • layer animation on selected limbs, additively or averaged
  • for those who have used both in production: how do they compare?

I have not used Unreal, but the combination of Sequences + Cinemachine + Animation Rigging seems to cover most of the above.

  • Sequences makes it easier to manage lots of Timeline objects; you can then put whatever objects you want in each sequence (e.g. I have lots of “shot” sequences into which I put whatever lights etc. I want for that shot).
  • You can certainly animate objects per track, scale the timing, edit animation clips, etc. (I don’t know if the functionality is identical.)
  • I mix prerecorded animation clips with “Animation Rigging”, which allows weighted IK overrides to be layered on top of the animation clip.
  • There are Animation Rigging constraint classes, but I have not used them. From the description, that is what they would do. (I just cheat and have the gun as a child of the hand, then disable that and enable a separate gun to drop to the floor; see the sketch after this list. It’s all I have needed so far.)
  • You can record a track as an animation clip, but I am not sure whether that is the same as baking physics.
  • There are “avatar masks” to allow selective overrides of parts of a body (but it’s all or nothing, not weighted). There is also Animation Rigging, with which you can do weighted IK overrides. So I think “kind of” is the answer here.
  • No idea how it compares. It’s a hobby for me, with results at https://episodes.extra-ordinary.tv/. I am trying to move to the HDRP render pipeline at the moment though, which is changing how some things are done.
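
To make the gun trick above concrete, here is a minimal sketch, assuming a kinematic gun parented under the hand bone and a separate physics-enabled copy (the class and field names are mine, not a Unity API):

```csharp
using UnityEngine;

// Sketch of the "cheat" hand-off: a gun parented to the hand is swapped
// for a separate physics-enabled copy at the moment it is dropped.
public class GunDropCheat : MonoBehaviour
{
    public GameObject heldGun;     // child of the hand bone, no Rigidbody
    public GameObject droppedGun;  // loose copy with a Rigidbody, starts inactive

    // Call this from a Timeline signal or your own script when the drop happens.
    public void Drop()
    {
        // Teleport the physics copy to where the held gun is, then swap
        // visibility so it reads as one object falling.
        droppedGun.transform.SetPositionAndRotation(
            heldGun.transform.position, heldGun.transform.rotation);
        droppedGun.SetActive(true);
        heldGun.SetActive(false);
    }
}
```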

I hope that was of some use!


Hi Laurent,
I can maybe shed some more light on top of the already bright answers from Allan, especially around the Sequences package (I am a bit less knowledgeable on IK and FK character animation).

  1. Sequences works a bit like Control tracks. It allows you to automatically activate/deactivate GameObjects that are contained in a given “sequence” (a sequence being whatever editorial structure you want: a shot, a camera shot, a scene, etc.). So in this sequence you can put regular objects (props, set elements, characters, light rigs, camera rigs, audio, etc.) and they will be “tied” to this sequence. (A small code sketch of this activation idea follows the list.)

  2. You can do it using Animation tracks with animation clips or manual keyframing, contained in this sequence, so once again “valid for” and contained in this time range.

  3. You can use the Recorder package to bake animation to a single animation clip (or an FBX file). You can either use Recorder tracks and clips in the Timeline or the Recorder window to record things (Unity animation file, FBX, movie, image sequence, etc.).
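
For readers who want to see the activation idea from point 1 in code: Sequences automates this for you, but the underlying Timeline mechanism can be reproduced with a plain Activation track. A minimal sketch (the object and track names are mine):

```csharp
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

// Builds a Timeline with an Activation track so a prop (e.g. a light rig)
// is active only while its clip plays -- the mechanism Sequences wraps up.
public class ShotActivationSketch : MonoBehaviour
{
    public GameObject lightRig;  // prop that should exist only during this shot

    void Awake()
    {
        var director = gameObject.AddComponent<PlayableDirector>();
        var timeline = ScriptableObject.CreateInstance<TimelineAsset>();

        // The GameObject bound to an Activation track is active only
        // inside the track's clips.
        var track = timeline.CreateTrack<ActivationTrack>(null, "Light Rig");
        var clip = track.CreateDefaultClip();
        clip.start = 1.0;     // rig appears at 1s...
        clip.duration = 3.0;  // ...and is deactivated again at 4s

        director.playableAsset = timeline;
        director.SetGenericBinding(track, lightRig);
        director.Play();
    }
}
```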

Hope this helps, and I would be happy to answer your questions about Sequences and Recorder if you have any.


Thank you @akent99 and @ventrap .

  • In the US, Unreal has quite a bit more market share than Unity when it comes to virtual production or linear content production. Have you or your team members used Unreal's Sequencer to study what Unity needs to improve?
  • If so, how do both sequencers compare?
  • Would you say that there is feature parity?
  • How about ergonomics?
  • How have you adapted some of Unreal sequencer’s workflow to match Unity?
  • Timeline required far more clicks than Unreal's Sequencer; how is it now with Unity's Sequences?
  • Finally, do you have a video showing an artist doing the same sequence in both pieces of software? Not a tech demo or a tutorial, more of a real-life production.

Hi Laurent,

Well, the Unreal Sequencer is more the equivalent of the Unity Timeline editor for me: an editor where you see animations, activations, etc., aligned the way you want them in time through tracks and clips.
The Unity Sequences package is complementary to Timeline. It allows you to easily create an editorial structure without having to manually create GameObjects and Timeline assets, assign PlayableDirector components, and so on. It does all of that for you automatically, organizes the created assets in the folders of your project, organizes the GameObjects contained in your different editorial structures in the Scene Hierarchy, etc. It's more like a Timeline helper.
On top of that, Sequences can also be used to organize, manage, and use your prefabs. We call them Sequence Assets, as they are tagged by Sequences and assigned a Category (Set, Character, Props, Audio, etc.). If you use them, dedicated track groups are created for you in the Timeline, the Hierarchy is again automatically populated for you to maintain a good-practice organization, and so on. You also have access to an editor window, the Sequence Assembly, to inspect what is contained in your current sequence (editorial structure), and more.
More info can be found here: It’s showtime! New tools for cinematic creators | Unity Blog
I'm not aware of such a side-by-side comparison video of how it's done in Unreal versus how it's done in Unity, though.

I see. Auto scaffolding. What's the workflow to replace an object with another and keep all compatible animation?

“and now the animators can get to work” – any video of that part on the editing floor? That part is interesting for gauging the flow of work.

Timeline is an undercooked nightmare with so many gotchas and weird behaviors that you'll be pulling your hair out really fast.

Try to use it and see for yourself.

Thanks for the warning. I thought they had fixed all that before building on top of it.

I think it makes a difference whether you are trying to create a game or just using it as an animation engine for rendering animated video. I am not creating a game, but I have worked out my personal workflow for using it to create animated video clips. As of 2022.1, I can now use most of the tools I want together. (There were some key bugs for me that got solved recently; e.g. Animation Rigging 1.1 and Sequences, when used together, often resulted in exceptions. The now-released Animation Rigging 1.2 solved these problems.)

For example, for gameplay you can create an Animator Controller on characters to make them walk, run, roll, etc. while playing. Those don't blend in with Timeline (well, not easily). I find it more effective to use animation clips directly in the Timeline and never use the Animator Controller approach; a minimal script-level version of that "clip without a controller" idea is sketched below. That works for me because I am not developing a game with cutscenes; I am creating little video clips. I also am not trying to use signals to synchronize timelines with dynamic game actions. (I certainly do hear of people finding Timelines make their game design easier too, but I am just not in that space myself.)
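
For illustration, here is a minimal sketch of the "clip without a controller" idea at the script level, using Unity's playables API (in the Timeline workflow the clips simply sit on Animation tracks instead):

```csharp
using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

// Plays an AnimationClip directly on an Animator -- no Animator Controller
// asset, no state machine. Timeline's Animation tracks do the same job
// in the editor.
public class PlayClipDirectly : MonoBehaviour
{
    public AnimationClip clip;  // any clip, e.g. a Humanoid walk cycle
    PlayableGraph graph;

    void OnEnable()
    {
        var animator = GetComponent<Animator>();
        AnimationPlayableUtilities.PlayClip(animator, clip, out graph);
    }

    void OnDisable()
    {
        if (graph.IsValid())
            graph.Destroy();  // always release the graph when done
    }
}
```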

I do think trying it yourself is a good idea however!

I am not from Unity, but here is my answer in case it's useful.

If you are using an Animation Track (the main thing I use when animating characters), it helps to design all your characters to have the same components. E.g. I wrote my own “facial expression” component, then created little animation clips to set the strengths of these expressions. Each character gets its own blendshapes etc. for the different emotions.

[Screenshot: the custom facial expression component in the Inspector]
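
A minimal sketch of what such a component could look like (the class, field, and blendshape names here are illustrative, not the actual code):

```csharp
using UnityEngine;

// Exposes 0-1 expression strengths that animation clips can keyframe,
// and maps them onto this character's own blendshapes.
[ExecuteAlways]
public class FacialExpression : MonoBehaviour
{
    public SkinnedMeshRenderer face;
    [Range(0f, 1f)] public float smile;  // assumes a blendshape named "Smile"
    [Range(0f, 1f)] public float frown;  // assumes a blendshape named "Frown"

    void LateUpdate()
    {
        // Blendshape weights are 0-100 in Unity, and indices differ per
        // character, which is why each character maps the shared strengths
        // onto its own mesh.
        Apply("Smile", smile);
        Apply("Frown", frown);
    }

    void Apply(string shapeName, float strength)
    {
        int i = face.sharedMesh.GetBlendShapeIndex(shapeName);
        if (i >= 0)
            face.SetBlendShapeWeight(i, strength * 100f);
    }
}
```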

I also use “Humanoid” animation clips, which adds a level of abstraction between the animation movements and the character. After doing that, replacing a character is a drag and drop into the Animation Track binding section. Timeline assets (stored as files on disk) are not tied to characters. You have to create a scene and then drop the character you are using from the scene into the binding slot. (A code version of the same swap is sketched below.)

[Screenshot: the Animation Track binding slot in the Timeline window]
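
For completeness, the same rebinding can be done in script. This helper (the name and shape are illustrative) moves every track bound to one character over to another, which works cleanly with Humanoid clips because they retarget:

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Rebinds every Timeline track that pointed at one character's Animator
// so it points at another. Humanoid clips retarget automatically.
public static class CharacterSwap
{
    public static void Rebind(PlayableDirector director, Animator oldChar, Animator newChar)
    {
        foreach (var output in director.playableAsset.outputs)
        {
            if (director.GetGenericBinding(output.sourceObject) == oldChar)
                director.SetGenericBinding(output.sourceObject, newChar);
        }
        director.RebuildGraph();  // apply the new bindings immediately
    }
}
```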

I do simple animation inside Unity using their tools. They are pretty basic: keyframe properties at a particular time. (They do have the new “Live Capture” tools, but I am using some free open-source tools, so no experience with those yet.)

For example, I have a component to make the eyes of a character look at a target object. I then keyframed the Y position of the target object so the eyes went from looking straight ahead to looking down. Simple stuff, and it works fine. (A sketch of such a component is below.)
[Screenshot: keyframing the Y position of the eye target object]
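
A minimal sketch of that look-at idea (names are illustrative): rotate the eye bones toward a target each frame, and keyframe the target's position from the Timeline:

```csharp
using UnityEngine;

// Rotates the eye bones to face a target object. Animating the target's
// position (e.g. its Y) then drives where the character looks.
public class EyeLookAt : MonoBehaviour
{
    public Transform leftEye, rightEye;  // eye bones of the character
    public Transform target;             // keyframe this object's position

    // LateUpdate runs after animation, so this overrides the clip's eye pose.
    void LateUpdate()
    {
        leftEye.LookAt(target);
        rightEye.LookAt(target);
    }
}
```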

I would not personally want to create a full-body walk animation this way. Instead, I find existing animation clips from the Asset Store and other places. What also interests me are services like DeepMotion.com, where you upload a video clip and it generates an animation clip for you. There are also open-source tools around. They are getting better! (I think a full mocap suit is still best.)

I find it more useful to create animation clips (I am creating a series, so I reuse them a lot, e.g. different character poses, sitting, running). I am experimenting with webcam-based mocap (free software!), Leap Motion cameras, a Quest 2 headset and VR controllers, etc. All cheap things (this is a hobby for me!). Full mocap suits are tempting, but the price tag is still too high for me.

The end result is an animation clip or pose that I blend in the Timeline. I do some live recordings, but I minimize them so I can produce episodes faster. My favorite tool is a Quest 2 VR headset that generates Blender BVH files. I am not doing Hollywood-grade productions, just a hobby, so I can get away with this! :wink:
