Moving On

As some of you know, a few weeks ago I left WB Games Montréal. I’ve accepted an amazing, life-changing opportunity at DICE’s Frostbite Labs to work on cutting-edge, industry-leading future graphics technologies, jaw-dropping creative experiences and other fantastic, yet-to-be-announced projects, alongside my good friend and graphics luminary Johan Andersson and a super-team of incredibly talented engineers (Graham Wihlidal, Niklas Nummelin, …) and amazing artists.

This was not an easy decision to make. Moving to another country, leaving the (two) teams I’ve built and my coworkers and friends at WBGM, and moving away from friends and family is a big deal. Building a new studio and team is not always easy, and I learned a lot. I’m grateful for the undeniable trust that was put in me, for all we’ve accomplished and have yet to show the world, and for having had the opportunity to initiate and expand WB Games’ worldwide tech sharing with my friend Jon Greenberg at NetherRealm, Jared Harp and Piotr Mintus at Monolith, and everyone else who contributed. I’m convinced you will be amazed by what is yet to come from WBGM, and from the other WB studios.😉

The beginning of my adventure in Stockholm is coming up quickly. More on this soon!😉


Finding Next-Gen – Part I – The Need For Robust (and Fast) Global Illumination in Games

Figure 1: Direct and Indirect Illumination from a single directional light source. [1]

This post is part of the series “Finding Next-Gen”. Original version posted on 2015/11/08. Liveblogging, because opinions evolve over time.

Global Illumination?

Global illumination (GI) is a family of algorithms used in computer graphics that simulate how light interacts with and transfers between objects in a scene. With its roots in light transport theory (the mathematics describing how light energy travels through and between various media, and what ultimately ends up being visible), GI takes into account both the light that comes directly from a light source (direct lighting/illumination) and how this light is reflected by and onto other surfaces (indirect lighting/illumination).

As seen in Figure 1, global illumination greatly increases the visual quality of a scene by providing a rich, organic and physically convincing simulation of light. Rather than depending solely on a manual (human) process to achieve the desired look, the mathematics behind GI allow lighting artists to create visually convincing scenes without having to replicate by hand the complexity of effects such as light scattering, color bleeding, and other phenomena that are difficult to represent artistically using only direct illumination.
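For readers who like to see the underlying math, all of these algorithms are, in one way or another, approximating the rendering equation; here is a compact, surface-only sketch of it (outgoing radiance equals emission plus incoming light reflected over the hemisphere):

L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o) + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\, L_i(\mathbf{x}, \omega_i)\, (\mathbf{n} \cdot \omega_i)\, \mathrm{d}\omega_i

Direct illumination evaluates L_i only for actual light sources, while global illumination also feeds the outgoing radiance of other surfaces back into L_i, which is what makes the problem recursive (and expensive) to solve.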

Continue reading “Finding Next-Gen – Part I – The Need For Robust (and Fast) Global Illumination in Games”

Finding Next-Gen: Index

It’s Been a While…

It’s been a while since I posted something here. My excuse: life, and being busy building a team and new graphics technology for a new game at WB Games Montréal. But you’re right, it’s no excuse, and I need to spend more time blogging. Hopefully this will get me sharing more, which I definitely miss doing. So here’s my attempt.😉

This is the index page for a series of blog posts I’m currently writing about some challenges in real-time rendering and my perspective on these topics.

In no way is this an attempt to sum it all up or provide perfect solutions; rather, it adds to the discussion on topics that are close to me and resonate with me, while respecting my various NDAs. What you will find here is undeniably inspired and fueled by the various presentations and discussions from the latest conferences, as well as by the various places where graphics programmers tend to hang out. The following wouldn’t be possible without this amazing community of developers who share on a daily basis – thanks to everyone for the inspiration and for always sharing your discoveries and opinions! It’s much needed for progress.

Also, this page will most likely evolve and change. Some topics might appear or be grouped together, and some might change significantly depending on how much content I can put together. Feel free to come back and check this page over time.

Please leave comments if need be, and thanks for reading!


Deformable Snow and DirectX 11 in Batman: Arkham Origins

It’s been a while, but I finally found some time for a quick post rounding up the presentations I gave this year at the Game Developers Conference (GDC) and NVIDIA’s GPU Technology Conference (GTC). These presentations showcase and explain some of the features developed for Batman: Arkham Origins.

Continue reading “Deformable Snow and DirectX 11 in Batman: Arkham Origins”

Blending Normal Maps?

What is the best way to blend two normal maps together? Why can’t I just add two normal maps together in Photoshop? I heard that to combine two normals together, you need to add the positive components and subtract the negative components, then renormalize. Looks right to me… Why shouldn’t I be using Overlay (or a series of Photoshop blend modes) to blend normal maps together? I want to add detail to surfaces. How does one combine normal maps in real-time so that the detail normal map follows the topology described by the base normal map?

If these are questions you’ve heard before, or asked yourself, check out this article on blending normal maps, written together with Stephen Hill (@self_shadow).
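As a quick teaser, here is a minimal HLSL sketch of the “Reoriented Normal Mapping” blend the article builds up to, written from memory; the function name and the assumption that both maps are stored in unsigned [0, 1] format are mine, and the article covers the derivation, comparisons and alternatives in detail.

// Reoriented Normal Mapping (RNM) blend, sketched from the article.
// Assumes both normal maps are stored in unsigned [0, 1] range (n * 0.5 + 0.5).
float3 BlendNormalsRNM(float3 baseSample, float3 detailSample)
{
    // Unpack the base normal, keeping z biased by +1 (part of the reorientation trick)
    float3 t = baseSample   * float3( 2,  2, 2) + float3(-1, -1,  0);
    // Unpack the detail normal with x and y negated
    float3 u = detailSample * float3(-2, -2, 2) + float3( 1,  1, -1);
    // Rotate the detail normal so it follows the orientation described by the base normal
    return normalize(t * dot(t, u) - u * t.z);
}

// Typical usage in a pixel shader (texture and sampler names are placeholders):
// float3 n = BlendNormalsRNM(baseMap.Sample(samp, uv).xyz,
//                            detailMap.Sample(samp, uv).xyz);

A nice property of this formulation is that a flat detail map returns the base normal unchanged, and a flat base map returns the detail normal unchanged, which the naive “add and renormalize” approaches don’t guarantee.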

Continue reading “Blending Normal Maps?”

Approximating Translucency Revisited – With “Simplified” Spherical Gaussian Exponentiation

Recently, someone at work brought up the translucency approximation Marc Bouchard and I developed back at EA [1], which ended up in DICE’s Frostbite engine [2] (aka the Battlefield 3 engine). Wanting to know more, we started browsing the slides one by one and revisiting the technique. While looking at the HLSL, an optimization came to mind, which I’ll discuss in this post. In case you missed the technique, here are a few cool screenshots made by Marc, as well as tips & tricks on implementing the technique and generating the inverted ambient-occlusion/thickness map. See the references for additional links.
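For context, here is a rough HLSL sketch of the original per-light approximation, reconstructed from the GDC slides [1]; parameter names are mine and may differ from the actual Frostbite code, and the pow() call is the part the “simplified” spherical Gaussian exponentiation replaces (roughly pow(x, n) ≈ exp2(k · (x − 1)), with k folded into an artist-tuned constant).

// Per-light translucency term (sketch). fLTThickness comes from the
// inverted ambient-occlusion/thickness map baked for the mesh.
half3 Translucency(half3 vLight, half3 vView, half3 vNormal, half3 cDiffuseAlbedo,
                   half3 cLightDiffuse, half fLightAttenuation, half fLTThickness,
                   half fLTDistortion, half fLTPower, half fLTScale, half fLTAmbient)
{
    // Distort the light vector along the surface normal to fake in-scattering
    half3 vLTLight = vLight + vNormal * fLTDistortion;
    // View-dependent "see-through" term; this pow() is the candidate for the
    // cheaper spherical-Gaussian-style exponentiation discussed in this post
    half fLTDot = pow(saturate(dot(vView, -vLTLight)), fLTPower) * fLTScale;
    // Add a local ambient term, then attenuate by light falloff and thickness
    half fLT = fLightAttenuation * (fLTDot + fLTAmbient) * fLTThickness;
    return cDiffuseAlbedo * cLightDiffuse * fLT;
}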

Continue reading “Approximating Translucency Revisited – With “Simplified” Spherical Gaussian Exponentiation”

A Taste of Live Code Editing With Visual Studio’s Tracepoints

Needless to say, compile times vary greatly from one software project to another. When debugging, we often spend a significant amount of time changing a few lines of code, recompiling, waiting, and then relaunching the software. While it’s true that one has to adapt their debugging “style” from one project to another (e.g. having a thorough understanding of the problem before “poking around” the code is definitely a plus when dealing with big code bases where edit-and-continue is not possible), waiting for the compiler and linker while debugging is never fun. It’s also very unproductive.

In parallel, some of the tools we use every day allow us to greatly improve our debugging efficiency, though sometimes we are completely oblivious to the power that is available to us. Regardless of how fast or slow compile and link times are on your current project, any tool that can mitigate the recompile cycle when debugging is welcome. One such tool is Visual Studio’s tracepoint: a breakpoint with a custom action associated with it.

Continue reading “A Taste of Live Code Editing With Visual Studio’s Tracepoints”