Inside Epic Games’ “dogfooding” of Unreal Engine using Fortnite

2022 was a big year for the concept of “dogfooding,” a tech-world slang term for developers using their own software like everyday users. The process is intended to help developers better understand how customers use the product and spot opportunities for fixes that might not be obvious while working on the back-end.

Ironically, it was in 2022 that we learned certain studios were not doing enough dogfooding. Meta criticized its employees for not spending enough time in its prototype metaverse product Horizon Worlds, and Unity cancelled an internal game project that had reportedly been spun up to let the company learn its own engine through a commercially released product.

One studio that hasn’t been shy about discussing its dogfooding process is Epic Games, the creator of both Fortnite and Unreal Engine. The studio published a blog at the end of January explaining how development on Fortnite let the Unreal Engine team “battle test” new Unreal 5.1 features like Lumen, Virtual Shadow Maps, and Local Exposure.

We wanted to learn a little more about Epic’s “battle test” process, and luckily, Technical Director Nick Penwarden was ready to chat with us. Here’s a little insight into what Epic Games’ “virtuous circle” looks like.

Fortnite developers help improve Unreal Engine features

Penwarden began our talk with a practical example of how the Fortnite team helped with Nanite, a feature that first appeared in Unreal Engine 5.0. With the 5.1 update, Penwarden said Epic was able to make improvements that came from looking at how the Fortnite team uses Nanite in the real world.

“Nanite was really good at drawing billions of polygons really fast,” he recalled. “However, it was limited to objects that are static or rigid, so they can’t be animated.”

That meant that when the Unreal team was working on tech demos to showcase Nanite, they focused a lot on static objects and environments like cityscapes, stone, and cars. After it shipped, the team needed to determine which capabilities Nanite should expand to next.

If you have played Fortnite, you will remember that the game world is not full of such environments. There are varied forests, snowy landscapes, and other, stranger areas.

Surveying Fortnite’s environments led the Unreal team to dive into how Nanite could be used on vegetation. To do that, the engine needed the ability to support vertex animation and alpha masks in order to render foliage. Most traditional vegetation rendering methods use alpha masks, so integrating that feature into Nanite became a priority.

“It’s helpful to have actual projects in the real world that indicate which [features] are the most important,” he said. The Unreal team gets some of that data from third-party developers who are using the engine, but those developers are working on a wide range of projects. Fortnite helps the team narrow its focus, because if that team struggles with something, so will some of Epic’s other customers.

A screenshot from Fortnite showing the new features available in Unreal Engine 5.1.

Penwarden also commented that shipping games in a real-world scenario forces toolmakers to look at technology outside of isolated, optimized test cases. In those cases, a developer has reliable access to all of a machine’s memory and processing power to pour graphical fidelity into a single frame.

“But how does that tool work when you use it in the context of a large-scale game, when you have a fraction of a frame and memory to work with?” he asked. “On the programming side, [the feature] has to fit in with all these other features and updates that also run as part of the game.”

The difference between internal test games and a game like Fortnite

In our many wanderings through the world of game development, we’ve met tool makers who demonstrate their technology by showing us a game their team created in-house. Some of these are more robust than others, but few are really intended to be full commercial projects.

Penwarden said there’s a big difference between doing projects like these and shipping games with your technology. “With tech demos, you can be very intentional about the features you want to test…in a way, the features you’re testing can drive the content you create,” he observed. “With a full game production, it’s weighted more heavily in the other direction.”

Knowing what content and gameplay experience you want to sell to players can show you the strengths and weaknesses of a tool you might be developing. In a tech demo, it might be easy to cut other content to make room for features that showcase the tool, but in a full game, that same feature might be a high priority that makes the whole project work.

That mentality apparently reflects how the Unreal and Fortnite teams interact within Epic. “We’re not necessarily going to go in there and say, ‘Hey Fortnite, you need to use this new technology,’” he said. “The Unreal Engine team typically builds tools, features, and bug fixes based on interactions with the Fortnite team and third-party developers, and it’s those developers who are ultimately the most excited to play with the technology.”

“They will start using [new features] in interesting ways, maybe in ways we didn’t expect,” Penwarden continued. Those unexpected uses help test new features that will eventually make their way into an Unreal Engine update.

What can tool makers who don’t make their own games learn from Epic?

The game tools dogfooding process has changed a lot over the history of game development. In the days when studios built their own engines and tools, there was more constant interaction between the developers creating the tools and the developers creating the games. At companies like EA, you’ll still see this relationship for tools like the Frostbite engine, which is used by many studios under the publisher.

But today, many developers build their games with technology fully licensed from other companies (perhaps with some internal modifications), and tool makers are often not in the business of releasing commercial products. How does Epic’s workflow work when you’re not… well, Epic?

Penwarden pointed to the company’s partnerships with virtual production tool makers as a key example of how other developers might think about this challenge. “Epic hasn’t made any movies or TV shows ourselves,” he said. So when the company partners with studios like Industrial Light & Magic on shows like The Mandalorian, it’s learning more about its own engine from experts who have a history in the world of visual effects for film and television.

A screenshot from The Mandalorian season 3. Din Djarin and Grogu look on from the cockpit of the N1 Starfighter.

“Being able to find a good partner who’s trying to build an actual product, or who’s trying to use your technology in production, and work closely with them” is really important, Penwarden said, noting that such close relationships can bring some of the same benefits that Epic gets from working with the Fortnite developers.

He said one of Epic’s hopes is that as the company continues to work with ILM and game studios, it will be able to help developers produce assets that can be used reliably across both mediums. Getting the engine to a state where there is a “convergence” in assets between linear media and gaming has been one of the company’s goals.

“That’s why we’re so interested in virtual production,” he said. “Being able to create content and assets for high-end linear experiences…and then being able to take those same assets and build a game out of them, you’re giving the world more ways to experience the content.”

Epic Games is in an unusual position in that, as a company, it can thrive as both a tool maker and a game creator. But the explosion of investment in UGC platforms like Roblox could mean that similar relationships between these types of developers become more relevant in the coming years.

And if you’re inside your studio creating great tools for content teams right now, we hope you’re enjoying “eating your own dog food” with your coworkers.