Wednesday, September 27, 2023

Intel’s DLSS rival, XeSS, appears to be a success

Shreya Christina

Digital Foundry has published an in-depth look at the upscaling technology that will ship with Intel’s upcoming Arc GPUs, comparing its performance to Nvidia’s offering, Deep Learning Super Sampling (DLSS). Based on the tests it has run so far, Intel’s Xe Super Sampling, or XeSS for short, seems to hold up quite well against the more mature technology, although it’s worth noting that Digital Foundry only ran tests on Intel’s best card, the Arc A770, and mostly in a single game, Shadow of the Tomb Raider.

The idea behind XeSS and other similar technologies is to run your game at a lower resolution and then use machine learning algorithms to scale it up in a way that looks much better than more basic scaling methods. In practice, this lets you play games at higher frame rates, or enable fancy effects like ray tracing without sacrificing a huge amount of performance, because your GPU is actually rendering fewer pixels and then upscaling the resulting image, often using dedicated hardware. For example, according to Digital Foundry, if you have a 1080p screen, XeSS will run the game at 960 x 540 in the highest-performance (read: highest-FPS) mode and at 720p in the “Quality” mode before scaling the result up to your monitor’s native resolution. If you want more detail on exactly how it does this, I suggest reading Digital Foundry‘s write-up on Eurogamer.
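As a rough illustration of those numbers (this is sketch code, not Intel’s actual API; the per-axis scale factors are simply inferred from the resolutions Digital Foundry quotes), you can work out the internal render resolution from the output resolution and the chosen mode:

```python
# Per-axis downscale factors implied by Digital Foundry's figures:
# "Performance" halves each axis (1920x1080 -> 960x540, i.e. 1/4 of the
# pixels), while "Quality" shrinks each axis by 1.5x (1920x1080 -> 1280x720).
SCALE_PER_AXIS = {
    "performance": 2.0,
    "quality": 1.5,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal resolution the game renders at before upscaling."""
    s = SCALE_PER_AXIS[mode]
    return int(out_w / s), int(out_h / s)

print(render_resolution(1920, 1080, "performance"))  # (960, 540)
print(render_resolution(1920, 1080, "quality"))      # (1280, 720)
```

The same arithmetic explains why upscaling to 4K is more forgiving: a 3840 x 2160 output in Quality mode still renders at a fairly detailed 2560 x 1440 internally.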

Image with three cropped frames of a fishing net. The two on the left, produced by Nvidia’s DLSS and Intel’s XeSS in their quality modes, show the individual strands of the net. In the natively rendered frame on the right, parts of the net are missing.

In this case, Nvidia’s DLSS (left) and Intel’s XeSS (center) did a better job of preserving the details in these fishing nets than the game’s built-in anti-aliasing (right).
Image: Digital Foundry

In Digital Foundry‘s tests, XeSS did this job quite well when running on the Arc A770 (the technology will also be usable on other, non-Arc graphics cards, including the GPUs integrated into Intel’s processors and even Nvidia’s cards). It provided a solid increase in frame rates compared to playing the game at native 1080p or 4K, without the huge drop in quality you’d expect from simply rendering at a lower resolution. Compared to the results of Nvidia’s DLSS, which is more or less the gold standard for AI-powered upscaling at the moment, XeSS maintained a similar amount of sharpness and detail in many areas, such as foliage, character models and backgrounds.

Digital Foundry found that XeSS added two to four milliseconds to frame times, or the amount of time each frame takes to render before it is displayed. That could in theory make the game feel less responsive, but in practice the higher frame rate helps offset it.
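To see why the added cost can still be a net win, here’s some back-of-the-envelope arithmetic. The frame times below are made up for illustration; only the two-to-four-millisecond upscaling cost comes from Digital Foundry’s measurements:

```python
# Hypothetical numbers, not measurements: upscaling adds a fixed cost per
# frame, but rendering fewer pixels saves more than that cost, so the net
# frame time (and therefore responsiveness) still improves.
def fps(frame_time_ms: float) -> float:
    """Convert a per-frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

native_ms = 25.0     # assume 25 ms per frame at native resolution (40 fps)
lowres_ms = 12.0     # assume ~12 ms per frame at the lower internal resolution
xess_cost_ms = 3.0   # within the 2-4 ms range Digital Foundry measured

upscaled_ms = lowres_ms + xess_cost_ms
print(f"native: {fps(native_ms):.0f} fps")    # 40 fps
print(f"XeSS:   {fps(upscaled_ms):.0f} fps")  # ~67 fps
```

Under these assumed numbers, each frame arrives about 10 ms sooner despite the upscaler’s overhead, which is why the extra milliseconds rarely translate into a sluggish feel in practice.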

That said, XeSS had a few hiccups that were either nonexistent or noticeably less severe with DLSS. Intel’s technology struggled mainly with thin details, sometimes producing flickering moiré patterns or banding. These kinds of artifacts can definitely be distracting depending on where they pop up, and they got worse as Digital Foundry pushed the system harder, asking it to upscale lower and lower resolution images to 1080p or 4K (something you might run into with particularly demanding games). Nvidia’s technology wasn’t completely immune to these problems, especially in modes focused more on performance than image quality, but they certainly seemed less common. XeSS also added some extremely noticeable jitter to water and some less intense ghosting to certain models in motion.

Gif with the same scene rendered using three different technologies.  In the Intel version, it shows a flickering pattern on a man's shirt, which is not present in the other versions.

These kinds of motion issues may not be easy to spot in still images, but if you’re actually playing the game, they probably stand out like a sore thumb. Intel’s version is in the middle, Nvidia’s on the left, and a version without any upscaling is on the right.
Gif: Digital Foundry

Intel also struggled to keep up with Nvidia in a few specific areas: DLSS, in particular, handled Lara Croft’s hair significantly better than XeSS. There were times, however, when XeSS’s results looked better to my eye, so your mileage may vary.

XeSS is clearly still in its early stages, and details about the Arc GPUs that will primarily support it are only just starting to come out. That makes it hard to say how it will perform on Intel’s lower-end desktop lineup and on the laptop graphics chips that have been available for a few months. It’s also worth noting that, as with DLSS, XeSS won’t work with every game: so far, Intel’s site lists only 14 compatible games, compared to the roughly 200 titles that DLSS works with (although the company does say it is partnering with “a lot of game studios,” including Codemasters and Ubisoft, to get the technology into more games).

Still, it’s nice to at least get a taste of how it will work and to know that it’s competent. Without naming names, other first attempts at this kind of technology haven’t held up against DLSS as well as XeSS does. While we still don’t know whether Intel’s GPUs will actually be any good (especially compared to Nvidia’s upcoming RTX 40-series and AMD’s RDNA 3 GPUs; AMD has its own upscaling technology called FSR), it’s good to know that at least one aspect of them is a success. And if Intel’s cards do end up underpowered for gaming, maybe XeSS can help make up some of the difference. It’s the small wins, really.
