AI-Enhanced Creator
Stylizing videos with AI
EbSynth vs. Runway Gen-1
Hello Creative Crew.
I’ve previously made a post showing how to use Runway Gen-1 to apply a style on top of existing videos.
Another tool, called EbSynth, offers similar functionality. I used it recently to make this video, and I want to show you the process and break down the differences between Gen-1 and EbSynth.
Let’s start creating!
PS: I added a referral program at the bottom of the newsletter. Refer 1 friend, and I’ll send you the book AI Explore: Collaborations 1 for free. If you enjoy these posts, why not share them, right? 😁
The best use of AI for videos at the moment is style transfer. We have original footage to which we apply an alternate style.
For this to work, we need three elements:
Original video footage
Style frame/image
Software to combine them together
Original video footage can come from recording ourselves or animating 3D models.
Both work, and you can experiment depending on your use case.
For this example, I found a free 3D model on Mixamo and chose an animation that would fit the video.
For the style frame, I generated a lot of concepts in Midjourney and made the final stylization choices in Stable Diffusion.
Now we come to the Software.
I’ve used both EbSynth and Gen-1 in various projects, and both work well if you know their limitations and how to work with them.
EbSynth works like a brush. We have a canvas, which is the original video, and we paint over it with the style frame. This works well when there is not a lot of movement: as long as the character’s appearance stays mostly the same, the paint stays where it is, showing the new picture as close to the original as possible.
When movement creates new shapes and pixels, the paint moves with it. This can result in distortions and empty spaces where there shouldn’t be any.
Gen-1, on the other hand, follows the movements and shapes of the original video and uses the style frame as a reference to create a new scene. This means moving characters keep their movement. However, because the software generates new images (which make up the video), they will vary from the original style frame.
So these are the limitations of the software:
EbSynth - Retains the style frame’s look, but struggles with excessive movement.
Gen-1 - Drifts from the style frame’s look, but movement stays clear and cohesive. It also has a green screen option, which isolates the subject quickly.
For my example, there is not much movement, and I wanted to have a more detailed character, so I went with EbSynth.
What is the EbSynth workflow?
Convert the original video into an image sequence. I used After Effects; online converters work as well. Put the images into a folder and name them in a sequence (converters usually do this for you already).
Create your style frame and put it in a separate folder. Make sure it has the same name as the corresponding image in your image sequence. In my example it’s 0000_0000.
Also make sure all the images have the same resolution. EbSynth will not work otherwise.
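If you want to double-check the two requirements above before launching EbSynth, here is a minimal sketch in pure Python. It assumes your frames are PNGs with zero-padded numbers at the end of the filename (like the 0000_0000 naming above); it reads the width and height straight from the PNG header, so no image library is needed. The function names are my own, not part of EbSynth:

```python
import re
import struct
from pathlib import Path

def png_size(path):
    """Read width/height from a PNG's IHDR header (first 24 bytes)."""
    with open(path, "rb") as f:
        header = f.read(24)
    # PNG layout: 8-byte signature, 4-byte chunk length, 4-byte "IHDR",
    # then width and height as big-endian 32-bit integers.
    width, height = struct.unpack(">II", header[16:24])
    return width, height

def check_sequence(folder):
    """Return (missing frame numbers, set of resolutions) for a folder
    of zero-padded PNG frames (e.g. 0000.png, 0001.png, ...)."""
    frames = sorted(Path(folder).glob("*.png"))
    numbers = [int(re.search(r"(\d+)$", p.stem).group(1)) for p in frames]
    missing = [n for n in range(numbers[0], numbers[-1] + 1)
               if n not in set(numbers)]
    sizes = {png_size(p) for p in frames}
    return missing, sizes
```

If `missing` is non-empty or `sizes` contains more than one resolution, fix the folder before running EbSynth.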
Run EbSynth and connect the two folders: drag the style reference folder into the “keyframe” box and the original video image sequence into the “video” box.
Press Run All and you’ll get your newly stylized video.
As you can see, there are some distortions after the character moves, but the style looks exactly like the style frame.
Here is a comparison of Gen-1 with the same image using the green screen feature.
These tools open up some interesting new ways to create.
I’ve been sharing a lot of workflows for visual storytelling, and I’m putting them all into an e-book for an expedited learning experience: A 2D Artist’s Guide to AI-Supported Workflows.
I’ve already written chapters on creating concept images, brainstorming, stylization, animation, and combining AI tools with non-AI tools.
If there is a workflow you’re curious about, if you have an end product in mind and are not sure how you could support it with AI, hit reply and send me a message! I would love to hear from you and make the e-book even more relevant for you.
It’s an exciting time to be a creator. We’re able to create in ways never possible before. Let’s explore and embrace them together. 😁
If you do enjoy these posts, why not share them with others?
Create your own referral link, invite a friend and I’ll send you a gift.
You can also stop by my website or social media, and as always,
Keep creating and don’t forget to have fun. ☀