Struggling videogame conglomerate Embracer Group appears to be embracing the rise of artificial intelligence: In its latest annual report, the company laid out a strategy for incorporating AI in its future work, saying that the tech “has the capability to massively enhance game development by increasing resource efficiency” and “adding intelligent behaviors, personalization, and optimization to gameplay experiences.”

The use of generative AI in game development is a touchy topic, to say the least. Broadly speaking, creative types tend not to care for it, while high-priced executives really do: some people say it will inevitably put people out of work (you don’t have to pay a machine to spit out a picture, after all), while others (again, high-priced executives) insist, to borrow a phrase, no it won’t.

Regardless of where you come down on that particular divide, efforts to incorporate it into game development have faced significant criticism: Blizzard recently went so far as to reassure gamers that it’s not using generative AI in World of Warcraft. But that’s not putting off Embracer, which said the rapid development of large language models (LL…


Nvidia has rolled out its latest Game Ready driver for its GeForce RTX graphics cards. Version 536.99 goes big on Baldur’s Gate 3 support, with Nvidia claiming that enabling DLSS 2 boosts performance by an average of 93% across its latest RTX 40-series of GPUs.

Nvidia says the driver delivers “additional optimizations and enhancements to further improve the hundreds of hours you’ll spend playing this epic role-playing game.”

By way of example, Nvidia’s numbers show the RTX 4060 Ti jumping from 35.5fps at 4K max settings to 77.9fps with DLSS 2 enabled. To be clear, that’s the 4060 Ti’s performance with DLSS 2 disabled versus enabled; the new driver isn’t doubling performance at the same settings.
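To put a number on that per-card example, here’s a quick back-of-the-envelope check (our own arithmetic on Nvidia’s published figures, not Nvidia’s methodology; the variable names are just for illustration):

```python
# Back-of-the-envelope uplift calculation from Nvidia's published figures.
baseline_fps = 35.5   # RTX 4060 Ti, 4K max settings, DLSS 2 off
dlss_fps = 77.9       # same card and settings, DLSS 2 on

uplift_pct = (dlss_fps / baseline_fps - 1) * 100
print(f"DLSS 2 uplift: {uplift_pct:.0f}%")  # prints "DLSS 2 uplift: 119%"
```

That works out to roughly a 119% uplift for the 4060 Ti specifically, which sits above the 93% average Nvidia quotes across the whole RTX 40-series, as you’d expect for an average taken over several cards.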

Indeed, Nvidia isn’t making any specific claims about performance increases with like-for-like settings, so it’s probable that the gains are modest. After all, Nvidia would be sure to shout about a major step in pure driver-related performance.

Of course, DLSS 2 is compatible with RTX 30- and RTX 20-series cards, too, so there should be some good DLSS gains to be had with GPUs from both generations.

That said, the RTX 20-series in particular h…


Hogwarts Legacy has received a huge patch which points its wand at spiders and says “expelliarmus!” A new Arachnophobia mode has been added which, when toggled, changes the appearance of all the game’s spiders, removes their skitters and screeches, and removes certain spider-centric effects from the game. Interestingly enough, this seems to have been prompted by a popular mod that appeared shortly after launch which did the same thing.

After that, the bug jokes pretty much write themselves. Developer Avalanche has been on a squishing mission, and the patch notes are voluminous: multiple save game fixes, various fixes for issues that could cause the player to fall out of the world, multiple potential crash fixes, fixes for problems with the collections and Field Guide pages, and fixes for certain quests that could become impossible to complete if the player did certain things.

The full list is mind-numbingly extensive, but I sifted through it for the vaguely funny bits so you don’t have to.

  • Resolved Poacher Ranger treadmilling onto a wall when the Avatar casts Disillusionment and hides after a cutscene
  • Resolved bumping …

Helldivers 2 has a great metagame layered over all the bug- and bot-blasting: The Galactic War, which sees players defending and liberating planets in a dev-controlled campaign. The whole thing has this fun air of mystique to it at the moment—can you actually cut off supply lines? Are the devs making it hard on purpose? Are the bots secretly being controlled by Hatsune Miku? Okay, I made that last one up.

One theory has been particularly thorny—the insistence that farmers are ruining everything. For the uninitiated, a popular way to grind medals has emerged during the Helldivers’ current war against the automatons. Step one: Find an “Eradicate Automaton Forces” mission. Step two: Blitz your way through it. Step three: Grab your medals and bounce. We don’t recommend that last part ourselves, but it hasn’t stopped people from taking the warpath of least resistance.

So what’s the problem? Well, the current major order requires unlucky helldivers to defend eight planets, and every single operation (a collection of missions) will have at least one civilian evacuation mission. These are a lot harder than your regular doses of liber-tea even after nerfs, so play…


Intel has released a whitepaper outlining a way to simplify its CPU architectures by removing legacy 16-bit and 32-bit support, thereby making them 64-bit only. Intel believes this change will lead to better-optimized processors, meaning better performance and efficiency.

The building blocks of Intel’s processor range can be traced all the way back to the original 16-bit 8086 processor released in 1978. Rather than keep supporting decades-old software and operating systems in hardware, Intel believes virtualization technology has matured enough to emulate the features legacy systems require. In short, it’s saying it’s time to move on.

Today’s 64-bit CPUs have to “trampoline” their way into 64-bit operation. According to Intel, “Intel 64 architecture designs come out of reset in the same state as the original 8086 and require a series of code transitions to enter 64-bit mode. Once running, these modes are not used in modern applications or operating systems.”
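Very roughly, the boot dance Intel describes can be pictured as a chain of mode switches. The mode names below are real x86 modes, but the sequence is a deliberately simplified illustration of the idea, not firmware code:

```python
# Illustrative sketch of the x86 boot "trampoline" Intel describes:
# a modern CPU resets into 16-bit real mode and must step through
# legacy modes before reaching 64-bit long mode, the only mode modern
# OSes actually run in. A 64-bit-only design would drop the legacy
# steps entirely. (Simplified; real firmware does far more than this.)

LEGACY_BOOT_PATH = [
    "16-bit real mode",        # reset state, same as the original 8086
    "32-bit protected mode",   # reached via control-register setup
    "64-bit long mode",        # where everything modern actually runs
]

PROPOSED_BOOT_PATH = [
    "64-bit long mode",        # come out of reset here, no trampoline
]

transitions_removed = len(LEGACY_BOOT_PATH) - len(PROPOSED_BOOT_PATH)
print(f"Legacy mode switches eliminated: {transitions_removed}")  # prints 2
```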

In layman’s terms, there’s a bunch of stuff in there that’s basically useless, and has been for a long time. But it’s a bit more complicated than just switching everything over to 64-bit. Windows has…


There’s a freshly leaked image of a Radeon chiplet GPU design—reportedly from a now-cancelled Navi 4C die—and it’s rocking anywhere between 13 and 20 different chiplets on one GPU. I love a bit of ambition, but I think we’re starting to see why AMD really might be calling time on competing at the high end in its next graphics card generation.

The GPU design schematic has come from the Moore’s Law is Dead channel (via Videocardz) and shows a far more complex chiplet design than we’ve seen in the current Navi 31 silicon that powers the Radeon RX 7900 XTX. That was the first chiplet GPU in the gaming space, and honestly I thought it was a minor miracle AMD had managed to get the card working as well as it did as a first-gen implementation of the tech.

It must be said, though, that it wasn’t necessarily a true chiplet design in the same vein as AMD’s recent Ryzen processors. Those processors use multiple compute chiplets—multiple dies with multiple CPU cores on each—along with an accompanying I/O die to take care of the interconnects. The approach with the latest Radeon, on the other hand, is still chiplet-based, but only contains a single …
