2020 Vision: The Witcher 2 was a stunning tech achievement that still looks great today

Publish date: 2023-01-17

Next-gen before their time? There's an elite selection of technologically advanced titles that appear towards the tail-end of any given generation, where developers are upping their game, experimenting with the kinds of techniques we'll see in the era to come - and it's typically on PC that we get these nascent next-gen experiences. CD Projekt RED's The Witcher 2 - released in May 2011 - is one such title: a game that required a radical process of re-architecting before arriving on Xbox 360 almost a year later. But what made The Witcher 2 so special, how did it push PC hardware and can even today's mainstream graphics tech handle the game's legendary ubersampling?

Of course, the profile of The Witcher 2 is especially heightened at the moment, with the recent series release on Netflix generating unprecedented interest in Geralt of Rivia's escapades - but it's not just the story and the world presented in these games that has driven their popularity. Starting with The Witcher 2, we've seen CD Projekt RED deliver some hugely ambitious, game-changing technology. The fact that the studio targeted PC - a format 'apparently' in decline back in 2011 - was remarkable in itself, but by not explicitly targeting mature (ie ageing) console hardware, CDPR pushed its game to the next level. Alongside titles like Far Cry 3, Battlefield 3 and Crysis 3, the PC showcased visuals much closer to those we would see on the current generation of console hardware - and it did so two-and-a-half years before those machines arrived.

The Witcher 2's next-gen credentials are first established when it comes to characters and environments. In 2011, CDPR's new showcase pushed geometry counts beyond the capabilities of the consoles, meaning that edges on models were much smoother than the average AAA game, and the amount of detail on models (represented by real geometry instead of just textures) was much higher than typical big-budget releases. Take the opening scenes of the game in the Temerian dungeon - if you look across many of the surfaces and individual assets, you see almost unprecedented levels of detail. Outdoor scenes thick with undergrowth and dappled lighting look even more beautiful.

While rich in detail, the environments were essentially a stage for the drama unfolding between The Witcher 2's characters. It's pretty standard to run the highest quality character models in cutscenes only, dropping to lower LOD representations in-game - but the PC version on its highest setting used the self-same models throughout. And the quality on display here was next-level stuff compared to its contemporaries: the main character cast of Iorveth, Roche, Triss and Geralt all have rich models with very high resolution textures. Even the supporting cast, right down to the most random NPC, had time and effort put into their designs and lavish amounts of GPU time spent rendering them. Also interesting is the sheer number of them rendered within any one scene. This is no big deal in the current generation, but it was certainly not common in 2011 - especially up against its RPG contemporaries.

2020 Vision: a technical analysis of The Witcher 2 today - and a look at how well the game runs on modern PC hardware.

There's a good reason for that. Typical triple-A fare was built to console specs while The Witcher 2 wasn't. At the time of release, The Witcher 2's highest settings reserved 600MB of GPU memory just for textures - not including other assets such as geometry or the framebuffer itself. That's more memory set aside for core artwork than was available in total on either of the 512MB-based consoles of the time. The amount of geometric detail on-screen also pushed beyond the limits of the consoles. Of course, The Witcher 2 did come to Xbox 360 in what is a simply brilliant downport. The conversion itself is a technological showcase, but the cuts made are obvious: every single asset on-screen has noticeably reduced polygon counts, including character models - even when they are close to the camera. Check out the video above and you'll see that profound, across-the-board compromise was required in order to bring The Witcher 2's extreme quality art assets across to the Microsoft console.
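
To put that 600MB figure into perspective, here's a quick back-of-the-envelope sketch. The compression format, texture size and mip assumptions below are ours rather than CDPR's - the studio never published a breakdown - but the arithmetic shows why a texture pool of that size simply cannot fit inside a 512MB console, before geometry, framebuffers, game code or the OS are even considered:

```python
# Back-of-the-envelope texture memory maths (illustrative only - the actual
# formats and texture counts CDPR used are not public). Assumes DXT5/BC3
# compression at 1 byte per texel and a full mip chain (~1.33x the base level).

def compressed_texture_mb(width, height, bytes_per_texel=1.0, mips=True):
    base_bytes = width * height * bytes_per_texel
    total = base_bytes * (4.0 / 3.0) if mips else base_bytes
    return total / (1024 * 1024)

per_texture = compressed_texture_mb(2048, 2048)   # ~5.3MB per 2048x2048 texture
budget_mb = 600                                    # The Witcher 2's texture pool
console_ram_mb = 512                               # total PS3/Xbox 360 memory

print(f"One 2048x2048 DXT5 texture with mips: {per_texture:.1f}MB")
print(f"Textures fitting in the 600MB pool:   {budget_mb / per_texture:.1f}")
print(f"Texture pool vs total console memory: {budget_mb}MB vs {console_ram_mb}MB")
```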

In fact, CD Projekt RED pushed so hard that even the PC version itself had some obvious limitations. The Witcher 2 ran on DirectX 9, and that ageing API's lack of multithreading support made for a release that couldn't fully utilise the CPUs of the era. With more of a single-thread performance focus, aspects like level of detail switching - even on the highest settings - look aggressive by today's standards. Just walking around many of the hub areas presents very noticeable geometric level of detail pop-in. It was perhaps not jarring for the time, but it was a necessary concession given the GPUs of the day, the sheer number of polygons on the models themselves and the fact that the game could not make full use of quad-core or eight-thread CPUs.
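
To illustrate the trade-off, here's a generic distance-based LOD picker with hysteresis - a sketch of the general technique rather than CDPR's actual system, and the switch distances are entirely made up. Pull the thresholds in and you save GPU time on those dense models; push them out and the pop-in recedes, but the per-frame cost climbs:

```python
# A generic distance-based LOD picker with hysteresis - not CDPR's actual
# system, just a sketch of the trade-off described above. Tighter switch
# distances save GPU time on dense models but make pop-in more visible.

LOD_DISTANCES = [0.0, 15.0, 40.0, 90.0]  # metres at which LODs 0..3 kick in (made up)
HYSTERESIS = 2.0                          # metres of slack to stop LODs flickering

def select_lod(distance, current_lod):
    """Pick the mesh LOD for an object at 'distance' metres from the camera."""
    target = 0
    for lod, threshold in enumerate(LOD_DISTANCES):
        if distance >= threshold:
            target = lod
    # Only switch once we're clearly past the boundary, to avoid popping back
    # and forth when the camera hovers around a threshold.
    if target > current_lod and distance < LOD_DISTANCES[target] + HYSTERESIS:
        return current_lod
    if target < current_lod and distance > LOD_DISTANCES[current_lod] - HYSTERESIS:
        return current_lod
    return target

# e.g. a character 16m away that was at LOD 0 stays there until ~17m
print(select_lod(16.0, current_lod=0))  # -> 0 (inside the hysteresis band)
print(select_lod(20.0, current_lod=0))  # -> 1
```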

While detail was pushed aggressively regardless, asset quality was not the defining visual feature of the game: The Witcher 2 shipped with a state-of-the-art lighting system too. It was based on deferred rendering, which enabled many more dynamic light sources than standard forward rendering systems. The sun casts real-time light, with individual characters and objects casting shadow maps onto themselves and the world. Likewise, all of the smaller light sources indoors and outdoors - torches, fires and the like - are dynamic in nature, and these smaller lights could cast real-time shadows as well.
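
The core idea behind deferred rendering is that geometry is rasterised once into a G-buffer - position, normal and albedo per pixel - and each light then only shades the pixels it actually reaches, so light count is decoupled from scene complexity. Here's a minimal CPU-side sketch of that shading step, purely illustrative and not The Witcher 2's actual renderer:

```python
# Minimal sketch of the deferred idea: geometry is rasterised once into a
# G-buffer, then each light only touches the pixels within its radius.
# Light cost no longer multiplies with scene geometry complexity.
from dataclasses import dataclass
import math

@dataclass
class GBufferTexel:
    position: tuple  # world-space position captured in the geometry pass
    normal: tuple    # world-space normal
    albedo: tuple    # surface colour

@dataclass
class PointLight:
    position: tuple
    colour: tuple
    radius: float

def shade(texel, lights):
    """Accumulate diffuse contribution from every light reaching this texel."""
    out = [0.0, 0.0, 0.0]
    for light in lights:
        to_light = [l - p for l, p in zip(light.position, texel.position)]
        dist = math.sqrt(sum(c * c for c in to_light))
        if dist >= light.radius:
            continue  # outside the light's influence: no work done at all
        n_dot_l = max(0.0, sum(n * (c / dist) for n, c in zip(texel.normal, to_light)))
        falloff = 1.0 - dist / light.radius
        for i in range(3):
            out[i] += texel.albedo[i] * light.colour[i] * n_dot_l * falloff
    return tuple(out)

texel = GBufferTexel(position=(0, 0, 0), normal=(0, 1, 0), albedo=(0.8, 0.6, 0.4))
torch = PointLight(position=(0, 2, 0), colour=(1.0, 0.7, 0.3), radius=5.0)
print(shade(texel, [torch]))
```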

This adherence to real-time lighting also applied to the game's indirect lighting. In indoor scenes where lighting did not actually change, global illumination or bounce light was emulated with manually placed point lights. Outdoor scenes take a simpler, less manual approach: in the hubs and overworld, an ambient brightness is added to shadowed regions, with an additional layer of ambient colour on top. Meanwhile, CDPR implemented screen-space ambient occlusion to simulate the shadowed occlusion of bounced ambient light on static and dynamic world objects.
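
For the uninitiated, screen-space ambient occlusion estimates how 'tucked away' each pixel is by comparing it against nearby samples in the depth buffer and darkening the ambient term accordingly. The sketch below follows the crude, depth-only spirit of early SSAO implementations - it is not CDPR's shader, and the sample count and bias are placeholder values:

```python
# A crude depth-only ambient occlusion estimate in the spirit of early SSAO -
# not CDPR's exact shader. Each pixel compares its depth against a handful of
# nearby depth samples; neighbours sitting closer to the camera are treated as
# occluders and darken the ambient term.
import random

def ssao(depth, x, y, radius=3, samples=12, bias=0.02):
    """Return an ambient factor in [0,1] for pixel (x, y) of a 2D depth buffer."""
    h, w = len(depth), len(depth[0])
    centre = depth[y][x]
    occluded = 0
    for _ in range(samples):
        sx = min(max(x + random.randint(-radius, radius), 0), w - 1)
        sy = min(max(y + random.randint(-radius, radius), 0), h - 1)
        if depth[sy][sx] < centre - bias:  # sample is nearer the camera
            occluded += 1
    return 1.0 - occluded / samples       # 1 = fully open, 0 = fully occluded

# Tiny 5x5 depth buffer: a near object (0.2) next to a far wall (0.9).
depth = [[0.2 if px < 2 else 0.9 for px in range(5)] for _ in range(5)]
print(ssao(depth, 2, 2))  # a wall pixel near the object gets partially darkened
```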

An alternative take on The Witcher 2 came via the Xbox 360 downport, which is also available to play at 9x resolution on Xbox One X. Here's what we thought of that.

All of this allowed for a fully dynamic time of day in any of the hub and open locations in the game, with different weather systems in play too. It's nifty tech for the time, but it also meant that time itself could play a part in the way the game plays, with some AI routines and quests feeding into time-of-day changes - adding a further layer to the storytelling.

But 'real-time, all the time' has its costs, especially in 2011 and especially with the extreme quality assets CDPR was working with - and the fact that even PC graphics tech of the era had its limitations. There may be real-time shadows from the sun, but the resolution of the shadow maps leaves much to be desired by 2020's standards: when moving the camera, they tend to jitter somewhat, which looks rather distracting. The shadows cast by point lights indoors are also very low resolution, without any form of filtering. It also appears that CDPR elected to disable self-shadowing on characters for such lights in many scenarios - and where you do see it, the quality is simply not a good match for the characters themselves, looking stilted and jagged.

Despite this, CDPR pushed even harder in other directions with a view towards scaling The Witcher 2 onto the PC hardware of the future. Alongside the standard options to scale textures, shadow quality and model LODs, The Witcher 2 shipped with two optional high-end settings - a tradition this game established and which would be seen again in The Witcher 3. I refer, of course, to cinematic depth of field and ubersampling. Nine years later, can modern PC hardware handle these features? Well, leave those GPU-killers switched off and both the AMD Radeon RX 580 and the GeForce GTX 1060 can run The Witcher 2 at 60fps or beyond at 1080p and even 1440p resolution. But with everything ramped up, the game crushes not just today's mainstream champions but top-end hardware too: at 4K resolution, the RTX 2080 Ti can't sustain 60fps and some cutscenes will even take you down into the 20s. So why does The Witcher 2 still challenge today's best hardware?

First, let's talk ubersampling - which is just a cheeky way of saying 2x2 ordered grid super-sampling. Back in the day, the game was accused of being 'badly optimised' when, in reality, it was rendering four times the chosen pixel count with ubersampling enabled. Selecting 720p would mean you were actually rendering internally at 1440p. By extension, those users enabling ubersampling at 1080p were actually demanding 4K from their top-end GTX 580 graphics cards. Put simply, when The Witcher 2 released in 2011, any kind of HD resolution with ubersampling enabled would effectively make the game unplayable. The funny thing is that I do actually recommend using the feature today - simply because anisotropic filtering in this game is broken and doesn't work with GPU control panel options either. But I'd still recommend a GTX 1080-class GPU to get this running at full HD resolution.
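
The maths is straightforward enough: 2x2 ordered grid super-sampling renders at double the width and double the height, then filters each 2x2 block of samples down to a single output pixel. The sketch below uses a plain box filter for the resolve - CDPR's actual resolve may well differ - but the pixel-count arithmetic is the same either way:

```python
# The arithmetic behind ubersampling: render at 2x width and 2x height, then
# average each 2x2 block down to one output pixel. The resolve here is a plain
# box filter - an assumption on our part - but the pixel counts are exact.

def ubersampled_resolution(width, height):
    internal = (width * 2, height * 2)
    ratio = (internal[0] * internal[1]) / (width * height)
    return internal, ratio

for w, h in [(1280, 720), (1920, 1080)]:
    internal, ratio = ubersampled_resolution(w, h)
    print(f"{w}x{h} with ubersampling -> renders {internal[0]}x{internal[1]} "
          f"({ratio:.0f}x the pixels)")

def resolve_2x2(image):
    """Average each 2x2 block of the super-sampled image into one pixel."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = [image[y][x], image[y][x + 1], image[y + 1][x], image[y + 1][x + 1]]
            row.append(sum(block) / 4.0)
        out.append(row)
    return out

print(resolve_2x2([[1.0, 0.0], [0.0, 0.0]]))  # four samples of an aliased edge -> [[0.25]]
```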

The future of CD Projekt RED: Cyberpunk 2077. The Digital Foundry team discuss the RTX-enabled Gamescom 2019 demo.

The second infamous forward-scaling option is cinematic depth of field. The PC version of the game offered up an extremely high quality, native resolution bokeh effect - which was not at all common in 2011. It was coded to an incredibly high level of precision for the time, to the point where programmer Bart Wronski literally describes it as "insane". Since the game was running under DX9, it tests the limits of the API to achieve the quality and feature set it possessed. Essentially, it is a bandwidth virus that scales with resolution and runs in two passes for near and far plane depth of field - and its performance cost corresponds with the size of the bokeh on out-of-focus regions. The larger those bokeh shapes are, the more expensive the effect becomes. This is why it manages to make an RTX 2080 Ti croak at 4K resolution. It's a stunning effect and cutscenes don't look as good without it, but clearly it's an effect that still needs to be deployed with care even on the hardware of today.
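
To see why bokeh size is the killer, consider a thin-lens circle-of-confusion calculation - used here purely as an illustration, with made-up camera values, rather than anything lifted from The Witcher 2's shaders. In a gather-style depth of field, the number of texture taps needed per pixel grows with the area of the bokeh disc, and that disc covers more pixels as output resolution rises:

```python
# A thin-lens circle-of-confusion calculation, used purely to show why a
# native-resolution bokeh gather gets so expensive - this is not The Witcher 2's
# shader, and the camera values below are made up. Taps per pixel grow with the
# *area* of the bokeh disc, which covers more pixels at higher resolutions.
SENSOR_HEIGHT_MM = 24.0  # assumed full-frame-style virtual camera

def coc_pixels(subject_m, focus_m, focal_mm, f_stop, image_height_px):
    """Circle of confusion diameter in output pixels for an out-of-focus subject."""
    f = focal_mm / 1000.0
    aperture = f / f_stop
    s1, s2 = focus_m, subject_m
    coc_m = aperture * abs(s2 - s1) / s2 * f / (s1 - f)   # diameter on the sensor
    return coc_m * 1000.0 / SENSOR_HEIGHT_MM * image_height_px

for height in (1080, 2160):
    diameter = coc_pixels(subject_m=8.0, focus_m=2.0, focal_mm=50.0,
                          f_stop=2.0, image_height_px=height)
    taps = 3.14159 * (diameter / 2.0) ** 2  # gather cost ~ area of the bokeh disc
    print(f"{height}p: bokeh disc ~{diameter:.0f}px wide, ~{taps:.0f} taps per pixel")
```

And remember that the pixel count itself also quadruples going from 1080p to 4K, so in this sketch the total gather work rises roughly sixteen-fold - which goes some way towards explaining the 'bandwidth virus' description.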

In most other respects, The Witcher 2 works well on today's PC hardware, despite the level of fidelity it aimed for. CD Projekt RED pushed out ahead of its contemporaries in terms of asset detail and also made the daring switch to 'real-time, all the time' lighting at a time when most games still baked out much - if not all - of their scene lighting. In doing so, CDPR pushed out a title that meaningfully challenged the best PC hardware of the era - but more importantly, it saw the studio begin to lay the groundwork for The Witcher 3. And even in those areas where the game did not shine so brightly, lessons were clearly learned and the shortcomings remedied in the sequel: The Witcher 2's over-reliance on single-threaded CPU performance and its basis in DirectX 9 were not deal-breakers in 2011, but they did negatively impact the way the title scaled onto future hardware. It's safe to say that The Witcher 3 is a radical improvement in that regard.

All of which leads us on to Cyberpunk 2077. At the surface level, there are parallels with The Witcher 2. It ships before the next-gen consoles arrive and its design is ambitious enough to seem beyond the capabilities of the PlayStation 4 and Xbox One. Demonstrations so far have seen the game running at 4K resolution at 30fps on top-end GTX and RTX graphics technology paired with high-end Intel CPUs. The ambition that powered The Witcher 2 seems to be alive and well, then - and the timing of Cyberpunk's 2020 release means that it'll still be hot when Xbox Series X and PlayStation 5 arrive later in the year.

However, there are some profound differences this time around. There will be current-gen console versions, presumably launching day and date with the PC build. This generation, consoles and PCs are much closer in design, while the 16x leap in memory capacity delivered by PS4 and Xbox One over their predecessors should serve the current wave of consoles well. On top of that, we also have the enhanced PS4 Pro and Xbox One X available to users already, delivering a meaningful upgrade in GPU power - plus CDPR has access to ready-made low-level APIs that should span PC and two console generations. Even so, based on what we've seen so far, getting Cyberpunk 2077 running well on vintage 2013 hardware must surely be a massive challenge - but then again, so was The Witcher 2 port for Xbox 360 eight years ago...

