Glyf.Space

2D to 3D Character Animation and Texture Tests (Glyf.Space)
 Role: Character Design, Rigging, Animation, 2D/3D Direction
 Tools: Adobe Illustrator, Adobe After Effects, Joysticks and Sliders, Duik, Cinema 4D, Glyf.Space, ChatGPT
Overview
 Glyf.Space brought me on to explore their AI animation tool through two creative experiments. The first involved transforming a rigged 2D character into a 3D render. The second tested fabric textures under cinematic lighting.
Approach
 I led concept, design, animation, and direction, collaborating closely with the Glyf.Space team to push the tool’s capabilities. Through several iterative test runs, I refined both the animation and rendering processes, blending traditional motion workflows with AI-assisted techniques.
Impact
 The project resulted in two polished sequences. One showcased a fully realized 3D version of the original 2D character, and the other demonstrated realistic fabric textures enhanced with cinematic lighting. These experiments highlighted the potential for AI integration within professional animation pipelines.

2D Character to 3D Render

I started with a 2D run cycle I designed and animated from scratch, one that was clean, fun, and full of energy. Then came the challenge: could I push it into 3D using Glyf’s AI?
At first, the results were… wonky. The style I had in mind wasn’t coming through. So I used ChatGPT to help build reference-based stills from my grayscale frames, dialing in tone, lighting, and vibe until the AI finally got it. Had to simplify the mouth too (the morphing was wild).
In the end, I got a 2D-to-3D hybrid that actually felt like my original vision, but elevated, cinematic, and just cool to look at.

C4D x Glyf: Shirt Test

For the second test, I went full product-hype and built out a shirt reveal in Cinema 4D, with bold camera moves and a clean grayscale render to give Glyf the best shot at understanding the motion.
Still, the AI got tripped up by depth and fast camera shifts. So I stepped in with ChatGPT and crafted reference-based visuals to lock in the lighting, reflections, and natural vibe I was aiming for: think soft terrain, reflective water, high-end product energy.
After a few rounds of tweaks, I found the sweet spot: cinematic, punchy camera motion that the AI could actually handle without warping. The result? A solid blend of 3D animation and AI enhancement, and a glimpse at how these tools can play nice when you guide them right.

Behind the Scenes

Here are a few behind-the-scenes GIFs from early test animations that didn’t make it into the final project. These versions had more movement and expressions than the AI could interpret, so I simplified the designs to ensure the final animations were successful. I’ve always loved looped animations, so I made sure both the 2D and 3D versions loop seamlessly. Even though these weren’t approved, they were fun to create and offer a glimpse at what happens behind the curtain before the final renders.

Test Styleframes

To get the Glyf software to produce the results I wanted, I explored multiple test versions of how my animations could look. For the 2D-to-3D test, the goal was to see if the software could translate a 2D design into 3D in my preferred style and determine if it could make a motion designer’s workflow easier.

For the 3D animation test, I wanted to see if the software could handle texturing automatically, removing the need to manually texture within traditional 3D software. I used ChatGPT to help define the look for both the 2D and 3D briefs, allowing me to generate styleframe concepts that I could feed directly into Glyf.Space for testing.

Below are the test styleframes and renders created during this process.

Glyf Test Renders

Here are some of the test renders I created while experimenting with Glyf.Space. It was exciting to see the results, but it also showed me where I would need to adjust my animations to get the best outcome. The morphing was a bit unpredictable, getting some moments right and missing others, but overall it was a fun and valuable experiment.

I can see real potential for this tool to help motion designers pitch motion concepts to clients and agencies before fully committing to animating in a specific style.