Posts

Animations

Working on implementing the animations was pretty fun. The hardest part was skeletons and how animations can use the same mesh but with a different skeleton. I took the animations from Mixamo, and testing was pretty hard, mainly because I would get error messages even though I was using the same skeleton for different animations. The issue was that Mixamo can change the skeleton based on the animation, so when importing a new animation the skeleton might not be compatible and the animation would not play.

Importing Animations

I'm using Assimp to import models, skeletons and animations. Importing models was pretty easy, but when I started to deal with animations I understood just how weird FBX files are. Because of how they separate the transforms, Assimp has to create "helper" nodes and add them to the skeleton. I had to write a function to collapse those nodes into the actual bone, like so: while (src->mNumChildren == 1 && IsAssimpFbxHelperNode(src-...
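Since the excerpt cuts the snippet off, here is a self-contained sketch of the idea, using a minimal stand-in Node struct instead of Assimp's real aiNode. The "$AssimpFbx$" name marker matches how Assimp labels its generated FBX pivot nodes, but the function names here are my own assumptions, not the post's actual code:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-in for Assimp's aiNode; only the name/children shape
// matters for this sketch.
struct Node {
    std::string name;
    std::vector<Node*> children;
};

// Assimp names its generated FBX pivot nodes like
// "Hips_$AssimpFbx$_Translation".
bool IsAssimpFbxHelperNode(const Node* node) {
    return node->name.find("$AssimpFbx$") != std::string::npos;
}

// Walk down a single-child chain of helper nodes until we reach the
// actual bone. A real importer would also accumulate each helper's
// local transform into the bone along the way.
const Node* SkipHelperChain(const Node* src) {
    while (src->children.size() == 1 && IsAssimpFbxHelperNode(src)) {
        src = src->children.front();
    }
    return src;
}
```

With a chain like Rotation → Translation → Hips, the walk collapses straight to the Hips bone.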

Lights

The first thing I wanted to add to my engine were lights. I had implemented lights in OpenGL before, but only in 2D. I had some knowledge of how light works, but implementing it is different. Fortunately, there are some great OpenGL articles explaining how lighting works. Understanding and adding Blinn-Phong lighting to my engine was a fun experience. Seeing the light move in the scene and on the object was very rewarding after a couple of days of trying to understand the math. The unfortunate part was Vulkan. Because of how buffers can be separated into descriptor sets and bindings, I had to rewrite the shaders and the pipelines multiple times to make everything work. It took me a while to understand how descriptor sets actually work, but once I did, everything was a lot easier to implement.

Light System

My lights are managed by a Light System object which is part of the world. Whenever I want a new light in the world I can call the function CreateLight. The light system is responsib...
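As a rough illustration of what such a system might look like, here is a hypothetical sketch; LightSystem, PointLight, and the handle scheme are my stand-ins, not the engine's actual types, and only CreateLight comes from the post:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

// Assumed light layout; a real engine would likely mirror the
// shader-side uniform struct here.
struct PointLight {
    Vec3 position{0.0f, 0.0f, 0.0f};
    Vec3 color{1.0f, 1.0f, 1.0f};
    float intensity = 1.0f;
};

using LightHandle = std::uint32_t;

class LightSystem {
public:
    // Called by the world whenever a new light is needed; returns a
    // handle instead of a raw pointer into the array.
    LightHandle CreateLight(const PointLight& light) {
        lights_.push_back(light);
        return static_cast<LightHandle>(lights_.size() - 1);
    }

    PointLight& Get(LightHandle handle) { return lights_[handle]; }
    std::size_t Count() const { return lights_.size(); }

    // Each frame this contiguous array could be uploaded into a
    // uniform buffer bound through a descriptor set for the
    // Blinn-Phong shader.
    const std::vector<PointLight>& Data() const { return lights_; }

private:
    std::vector<PointLight> lights_;
};
```

Keeping the lights in one contiguous array is what makes the descriptor-set upload cheap: one buffer write per frame rather than one per light.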

From OpenGL to Vulkan

The jump from OpenGL to Vulkan was pretty rough. The amount of boilerplate required just to get Vulkan running is insane. What surprised me even more was realizing how much abstraction OpenGL was actually handling behind the scenes. For me, the hardest part was wrapping my head around synchronization in Vulkan. With multi-threading in the mix, it was tough to even know where to start. In traditional apps, I could set breakpoints, even across different threads, and piece together what was going on. But Vulkan's command submission model, which separates the Host (CPU) and Device (GPU), makes that approach nearly impossible. So how did I deal with all of that?

Debugging

My main debugging tool early on was the validation layers. I expected them to be similar to OpenGL's, but they're actually much better. The amount of information you get from a Vulkan validation layer message is easily 10 times more helpful than in OpenGL, where you often had to set multiple breakpoints just to figure out wh...
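To show the kind of filtering a validation-layer debug callback typically does, here is a minimal sketch. The enum values are illustrative stand-ins mirroring the shape of Vulkan's VkDebugUtilsMessageSeverityFlagBitsEXT, not the real constants, and the function names are hypothetical:

```cpp
#include <cassert>
#include <string>

// Stand-in severity bits shaped like Vulkan's
// VkDebugUtilsMessageSeverityFlagBitsEXT (values illustrative).
enum Severity : unsigned {
    kVerbose = 1u << 0,
    kInfo    = 1u << 1,
    kWarning = 1u << 2,
    kError   = 1u << 3,
};

// Keep only warnings and errors by default so the layers' verbose
// chatter doesn't drown out the messages that matter.
bool ShouldLog(unsigned severity, unsigned mask = kWarning | kError) {
    return (severity & mask) != 0;
}

// Tag each message so errors stand out in the log.
std::string FormatMessage(unsigned severity, const std::string& text) {
    const char* tag = (severity & kError)   ? "[ERROR] "
                    : (severity & kWarning) ? "[WARN] "
                                            : "[INFO] ";
    return tag + text;
}
```

In a real Vulkan app this logic would live inside the callback registered via a debug messenger, with the mask chosen at messenger-creation time.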

Introduction

Working with Unity and Unreal is part of my day job, so I’ve spent a lot of time using high-level tools to build games and interactive content. Outside of work, I started exploring OpenGL to get a better grasp of how things actually work under the hood. That curiosity eventually led me to Vulkan, where I could dive into low-level details and take full control over the rendering pipeline. A big reason for that shift was the industry's move away from OpenGL toward modern graphics APIs—and all the exciting new tech they enable. Epic’s Nanite and Lumen systems in Unreal Engine 5 were big eye-openers for me. Nanite is a virtualized geometry system that lets you render incredibly high-detail assets—billions of triangles—by streaming and culling only the visible data on the fly. It eliminates the need for traditional LODs and baking, and is built around GPU-driven rendering and multi-threaded command generation—capabilities that older APIs like OpenGL struggle to support. Lumen, on the ot...