Unity has had a rough year. The company was caught up in a firestorm of criticism in September of last year when it announced a runtime fee that would charge developers every time a Unity game was installed. Unity dug itself into such a deep hole with the game development community that it has spent over a year trying to win back the positive reputation it once maintained — and the work still isn’t done, even after a new CEO and the outright cancellation of the runtime fee.
But this week, Unity is hoping to turn things around with the release of Unity 6 — the first numbered Unity update in nearly a decade. Last year, Unity represented a growing pillar of game development trying to make a quick grab at cash. Now, it represents a rightfully battered company trying to win back some favor. And to do that, it needs a damn good game to show off its new tech. That game is Den of Wolves.
It’s the latest title from GTFO developer 10 Chambers Collective, which was formed from developers who worked on Payday and Payday 2. It’s also one of the first games to commit to using the new Unity 6 engine, despite the fact that 10 Chambers started work on an older version of Unity. I spoke with Hjalmar Vikström, co-founder of 10 Chambers, and Ryan Ellis, VP of Product at Unity, to understand what the new engine brings and the outlook from developers on everything from game optimization to generative AI.
If you’ve played GTFO, it’s probably surprising that the game was built on Unity. It doesn’t look like a Unity game, which is something Vikström acknowledged almost immediately.
“So many people were so surprised when they saw us, when we announced GTFO, and they’re like, ‘What? Unity?’” Vikström told me. “That hasn’t been the case like in Unity, you get the beautiful games. It’s always been Unreal, you know.”
Right, Unreal Engine. For more than a decade at this point, a big console or PC release meant an Unreal game, while Unity was for indies and the flood of mobile titles (some better than others) taking up space in the App Store. Vikström even shared that the team was approached by Unreal reps after the original GTFO trailer, who were shocked that the studio wasn’t using Unreal Engine or its own tech.
The obvious question: why? “Unity is much quicker for us. It’s quicker to iterate on. It’s, you know, you can create a small, empty project and do a simple prototype with boxes and things, and then, like, this is what I’m thinking. And it takes a couple of seconds to reload everything.”
It’s that lightweight, multiplatform approach that originally brought 10 Chambers to the engine. Although 10 Chambers has a team of nearly 100 people now, with ambitions of growing to several hundred, the collective started as a group of four or five former co-workers. For them, Unity represented a way to quickly get their vision up and running as team members wore multiple hats across programming and game design.
The negative attributes associated with Unity in the past were actually a positive for the team.
“The nice thing is that Unity’s base level performance, like an empty project, doesn’t have that much overhead. It’s quite cheap,” Vikström said. “It’s actually one of the benefits of them also being on mobile, and, you know, even web browser games. They can’t have a clunky, huge-ass default project because, when you start a new thing, it needs to be super slim to be able to run on anything, basically.”
Vikström wasn’t shy about acknowledging the spin around Unity 6, especially given the past year of troubles the company has faced. He recognized that Den of Wolves has become some sort of poster child for the new engine, and even went as far as to say that he didn’t “want to sound like a Unity shill.” To that end, Vikström described some of the issues the team has come up against with Unity in the past.
“I mean, it’s a ton of work, like our struggles with physics in old versions of Unity. I don’t know how many months I’ve spent on physics optimizations in Unity,” he said. “It’s just a lot of work, but you can, you know, if you know your s**t, you can get it done.”
Although Unity made sense for a small group of developers working on their first game under a new studio, that didn’t mean the team had to go with Unity 6 for Den of Wolves. Some changes in the latest version of the engine made it a more compelling offering, however.
There’s no denying that GTFO is a beautiful game, and it looks like Den of Wolves will be equally impressive. For GTFO, though, the visual accomplishments came despite Unity as the underlying engine, not because of it. Vikström highlighted the modularity of Unity as one of its strong suits: the team swapped out the default rendering pipeline in GTFO for its own tech. Now, the team is using what you get out of the box.
“The HDRP, you know, it’s come a long way now. We didn’t go with that for GTFO because it was too immature,” Vikström said. “One of the main reasons for switching to Unity 6 — because we didn’t obviously start on Unity 6 because we’ve been developing for a bit — it’s because of the visuals, you know? And that’s quite a strong statement, actually, for a Unity game.”
The Unreal renderer is why you associate so many beautiful games with that engine. Although Unity has plenty of flexibility, its rendering pipelines haven’t always kept pace. The engine offers URP, the Universal Render Pipeline, built for broad compatibility, alongside HDRP, the High Definition Render Pipeline, aimed at high-end visuals — and it’s HDRP that has matured considerably in Unity 6.
It includes advanced shaders for subsurface scattering and translucency, ray tracing and path tracing, volumetric fog and clouds, and a fully open pipeline that developers can rewrite in C#, the engine’s scripting language. And for 10 Chambers, the maturing of the rendering has made a world of difference.
It’s not just a matter of throwing more tech at the problem. Vikström held his ground that the new pipeline isn’t a solution that instantly makes a game look better. “I think it comes down to artistic choices more than cutting-edge tech,” the developer told me. “You need to make some choices. You can’t just say it’s gonna look hyper realistic, you know, because then you’re competing with everything trying to be hyper realistic. And then, especially in earlier versions of Unity, you were not in for a good time.”
The maturing of the rendering pipeline makes a big difference, but Ellis highlighted some specifics that make Unity 6 tick. “We made some very, very significant graphics improvements in Unity 6 overall. An example of that is a new capability, which is called GPU resident drawer, which essentially offloads things that work on the CPU to the GPU. We’ve seen improvements of up to four times in the CPU performance just by essentially toggling this thing on,” the Unity executive told me.
There are some other new additions, like Spatial Temporal Post-Processing, or STP, which looks like Unity’s take on the Temporal Super Resolution (TSR) feature in Unreal Engine 5. There’s also GPU occlusion culling, which culls objects from the rendering workload when they aren’t visible — once again, a feature already available in Unreal Engine. Unity may be known for mobile and indie releases, but it’s clear with Unity 6 that the engine is trying to scale up.
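The idea behind occlusion culling can be illustrated with a deliberately simplified sketch. This is a conceptual CPU-side illustration, not Unity's actual GPU implementation; the names and the fully-contained-box test are assumptions chosen for clarity. An object is skipped when its screen-space bounds sit entirely behind and inside a nearer object's bounds:

```python
from dataclasses import dataclass

@dataclass
class ScreenBox:
    """Screen-space bounding box of an object, plus its depth from the camera."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float
    depth: float  # smaller = closer to the camera

def covers(a: ScreenBox, b: ScreenBox) -> bool:
    """True if box a fully contains box b on screen."""
    return a.x0 <= b.x0 and a.y0 <= b.y0 and a.x1 >= b.x1 and a.y1 >= b.y1

def occlusion_cull(objects: list[ScreenBox]) -> list[ScreenBox]:
    """Keep only objects that are not fully hidden behind a nearer object."""
    visible = []
    for obj in objects:
        hidden = any(
            other is not obj and other.depth < obj.depth and covers(other, obj)
            for other in objects
        )
        if not hidden:
            visible.append(obj)
    return visible

wall = ScreenBox("wall", 0, 0, 100, 100, depth=5)
crate = ScreenBox("crate", 20, 20, 40, 40, depth=10)   # entirely behind the wall
tree = ScreenBox("tree", 90, 0, 150, 80, depth=12)     # partially sticks out past it

for obj in occlusion_cull([wall, crate, tree]):
    print(obj.name)  # the crate is culled; wall and tree are drawn
```

The real feature does this test against a depth buffer on the GPU, which is what makes it cheap enough to run every frame; the benefit is the same, though — hidden objects never cost draw time.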
I’m sure all of the changes in Unity 6 are great for developers, and I know that Unreal Engine has plenty of draw of its own. It’s great for game developers to duke it out over which is better, but I’m not a game developer (and I suspect most people reading this aren’t, either). So I moved on from the engine itself to one of the most pressing questions still on the minds of PC gamers — optimization.
I’ve previously spoken with developers about the optimization problem on PC, but even more than a year later, the issues persist, particularly with Unreal Engine releases — Silent Hill 2 comes to mind as a recent example. Surprisingly, Vikström didn’t point to the engine when talking about performance issues. I asked if 10 Chambers’ humble beginnings helped when it came to optimizing GTFO, and the developer didn’t mince words: “Yep, 100%.”
“It’s quite easy to do when you’re two programmers. We could just decide to do [optimization] and then we start doing it. But you still need to have that mindset of, you cannot have an underperforming shooter, right?”
The typical answer to the optimization problem is this — there are just too many hardware configurations on PC. That’s true, but Vikström focused more on the practical issues that come up when trying to optimize a big game across an even larger team.
“You just need to be at it. Spend time of it. You know, do the work on it,” Vikström said. “And I 100% understand why that is super hard on, like, a 500, 1,000-person team, because who owns that? Who makes 200 artists change their art? Or, like, ‘hey, you can’t do your shadow meshes like this.’ Hopefully in good studios you can get that done, but otherwise, it’s quite easy to lose that ownership.”
It’s a reality that game developers are well aware of but game players rarely consider. Particularly with large teams, work is often very siloed. You don’t have a creative, freeform environment where every programmer knows every artist, all of whom talk with designers. There are different departments, all of which need to work together to optimize a game. If a game is too heavy, that has implications for programmers, artists, and game designers, and it’s not as simple as waving a wand to mobilize hundreds of people to upend all the work they’ve put in.
Even with that organizational friction, Vikström recognized that optimizing a game for PC is just straight-up hard. “Optimization, we’re super passionate about it, but it’s hard. I mean, it’s super hard. But we’re still at the size where we can actually go, ‘hey artists, we need to change the shadow meshes,’ or ‘hey programmers, we need to optimize thread management.’ And that gives me a lot of hope because it’s just weeks and months of work.”
Still, Vikström says the team at 10 Chambers is committed to the process. “I’m not saying Den of Wolves will be, out of the gate in early access, the best-optimized game. It won’t be because, you know, that s**t is hard. We can’t allow any stuttering, or texture loading, or, you know, these CPU spikes. We want to get rid of them.”
Optimization is one of the big topics in gaming today, but the other one is AI. In particular, generative AI. Generative AI has already displaced the jobs of game developers, and it doesn’t show any signs of slowing down. According to Vikström and Ellis, it’s a topic that has to be approached with nuance because, like it or not, AI isn’t going anywhere in game development.
“We really believe that the creator needs to be at the center of all these things, and that AI is purely there to help provide assistance,” Ellis told me. “We’ve seen some incredible things that people can create with AI, and yet they also seem to lack a soul, or some sort of real spark of creativity. And in the gaming world, so much of this is about that spark.”
It’s great to hear that an executive recognizes the creativity required to make a game work, but Vikström, who’s working in the trenches of game development, pointed to some more practical examples. With a background in programming, the developer specifically called out repetitive tasks, such as writing boilerplate code — foundational code that can be reused across several different scenarios.
“I’ve done this 10 times before, but it was a year ago now, so I don’t remember the boilerplate code for it. I just asked, like, ChatGPT or something, ‘Hey, I need a boilerplate for opening a window, aligning three buttons.’ It’s not hard programming, but it would just take an hour to get there because you need to find the right boilerplate for it. So that saved me, you know, a couple of hours, which was great.”
The other area of impact is art — and that’s where we’ve seen displacement, particularly with 2D artists at Activision Blizzard. Vikström says that generative AI is useful for visual ideation, but that you really need to be an artist to use these tools in the first place.
“It’s not really concept art, but it’s more like, ‘this is the look we’re kind of going for,’” Vikström said. “You need to be artistically-minded to see those qualities… You can be totally unartistic and get a nice-looking image, but you don’t know what you’re looking at. Like, what is good about this image? So, you still need to be artistic.”
There’s not an easy answer to the AI question in game development, as even developers themselves are unsure how to properly (and creatively) leverage the tools. And we certainly haven’t seen the end of executives trying to automate away jobs that were once filled by people. It’s also clear that generative AI is an extremely powerful tool for game development, and hopefully, over time, it’ll make for better games — even if a few more lost jobs come with it.
Ellis summed up the AI point nicely: “You see these titles come from an indie creator who maybe didn’t have a whole lot of money, but they came up with this beautiful idea or approach that just feels novel and unique, and it can take off like wildfire. And, we don’t see those things as being things that AI can create.”
Just in time for the 2024 US elections, the call screening and fraud detection company Hiya has launched a free Chrome extension to spot deepfake voices. The aptly named Hiya Deepfake Voice Detector “listens” to voices played in video or audio streams and assigns an authenticity score, telling you whether it’s likely real or fake.
Hiya tells Engadget that third-party testers have validated the extension as over 99 percent accurate. That reportedly covers even AI-generated voices the detection model hasn’t trained on, and the company claims it can spot voices created by new synthesis models as soon as they launch.
We played around with the extension ahead of launch, and it seems to work well. I pulled up a YouTube video about the blues pioneer Howlin’ Wolf that I suspected used AI narration, and it assigned it a 1/100 authenticity score, declaring it likely a deepfake. Suspicions confirmed.
Hiya threw a well-earned jab at social media companies for making such a tool necessary. “It’s clear social media sites have a huge responsibility to alert users when the content they are consuming has a high chance of being an AI deepfake,” Hiya President Kush Parikh wrote in a press release. “The onus is currently on the individual to be vigilant to the risks and use tools like our Deepfake Voice Detector to check if they are concerned content is being altered. That’s a big ask, so we’re pleased to be able to support them with a solution that helps put some of the power back in their hands.”
The extension only needs to listen to a few seconds of a voice to spit out a result. It works on a credit system to prevent Hiya’s servers from getting slammed by excessive requests. You’ll get 20 credits daily, which may or may not cover the flood of manipulative AI content you’ll come across on social media in the coming weeks.
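Hiya hasn’t published how its credit system is implemented, but a fixed daily allowance like the one described is a common rate-limiting pattern. Here’s a minimal sketch of the idea; the class name, method names, and reset-at-the-start-of-each-day behavior are assumptions for illustration:

```python
from datetime import date

class DailyCreditLimiter:
    """Illustrative sketch of a daily credit system: each check spends one
    credit, and the balance resets at the start of each new day."""

    def __init__(self, daily_credits: int = 20):
        self.daily_credits = daily_credits
        self._remaining = daily_credits
        self._day = date.today()

    def _maybe_reset(self) -> None:
        # Refill the balance the first time we're called on a new day.
        today = date.today()
        if today != self._day:
            self._day = today
            self._remaining = self.daily_credits

    def try_spend(self) -> bool:
        """Deduct one credit and return True if any remain today."""
        self._maybe_reset()
        if self._remaining <= 0:
            return False
        self._remaining -= 1
        return True

limiter = DailyCreditLimiter(daily_credits=20)
results = [limiter.try_spend() for _ in range(25)]
print(results.count(True))   # 20 requests succeed
print(results.count(False))  # the remaining 5 are rejected
```

A production version would track credits server-side per account rather than in the client, so refreshing the extension can’t reset the balance.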
iPads are cheaper and much handier to carry around than MacBooks, but you often need an extra accessory or two to make them as useful. While an attachable keyboard can be great for anyone with a writing job (hello!), an Apple Pencil is critical for everything from studying to designing. Thankfully, it’s cheaper than ever to get the budget option, with the USB-C Apple Pencil on sale for $65, down from $79. The 18 percent discount brings the accessory to $5 less than its Prime Day price.
Apple released its USB-C Pencil in late 2023 as a cheaper option than its counterparts, the second-generation Apple Pencil and the Apple Pencil Pro. This Pencil is compatible with all iPads that have a USB-C port and offers the hover feature when used with an M2 iPad Air or an iPad Pro. It also has some great perks like low latency, tilt sensitivity, and pixel-perfect accuracy. However, it doesn’t offer pressure sensitivity like its fellow Apple Pencils.
Artificial intelligence is increasingly making its presence felt in more areas of our lives, certainly since the launch of ChatGPT. Depending on your view, it’s that big bad bogeyman that’s taking jobs and causing widespread copyright infringement, or a gift with the potential to catapult humanity into a new age of enlightenment.
What many have achieved with the new tech, from Midjourney and LLMs to smart algorithms and data analysis, is beyond radical. It’s a technology that, like most of the silicon-based breakthroughs that came before it, has a lot of potency behind it. It can do a lot of good, but also, many fear, a lot of bad. And those outcomes are entirely dependent on how it’s manipulated, managed, and regulated.
It’s not surprising, then, given how rapidly AI has forced its way into the zeitgeist, that tech companies and their sales teams are leaning into the technology, stuffing its various iterations into their latest products, all with the aim of encouraging us to buy their hardware.
Check out this new AI-powered laptop, that motherboard that utilizes AI to overclock your CPU to the limit, those new webcams featuring AI deep-learning tech. You get the point. You just know that from Silicon Valley to Shanghai, shareholders and company execs are asking their marketing teams, “How can we get AI into our products?” in time for the next CES or Computex, no matter how modest the value actually is for us consumers.
My biggest bugbear comes in the form of the latest generation of CPUs being launched by the likes of AMD, Intel, and Qualcomm. Now, these aren’t bad products, not by a long shot. Qualcomm is making huge leaps and bounds in the desktop and laptop chip markets, and the performance of both Intel and AMD’s latest chips is nothing if not impressive. Generation on generation, we’re seeing higher performance scores, better efficiency, broader connectivity, lower latencies, and ridiculous power savings (here’s looking at you, Snapdragon), among a whole slew of innovative design changes and choices. To most of us mere mortals, it’s magic way beyond the basic 0s and 1s.
Despite that, AI still gets slapped onto everything regardless of whether it actually adds anything useful to a product. New neural processing units (NPUs) are being added to chips: co-processors designed to accelerate the low-level operations behind AI workloads. These then go into low-powered laptops, letting them run advanced AI features such as Microsoft’s Copilot assistant and tick that AI checkbox, as if it makes a difference to a predominantly cloud-based solution.
The thing is, though: CPU performance, when it comes to AI, is insignificant. Like, seriously insignificant, to the point it’s not even mildly relevant. It’s like trying to launch NASA’s James Webb Space Telescope with a bottle of Coke and some Mentos.
Emperor’s new clothes?
I’ve spent the last month testing a raft of laptops and processors, specifically with regard to how they handle artificial intelligence tasks and apps. Using the Procyon benchmark suite from UL (the makers of the 3DMark series), you can run a Computer Vision inference test that spits out a score for each component. Intel Core i9-14900K? 50. AMD Ryzen 9 7900X? 56. Ryzen 9 9900X? 79 (that’s a 41 percent gen-on-gen performance increase, by the way, which is seriously huge).
Here’s the thing though: chuck a GPU through that same test, such as Nvidia’s RTX 4080 Super, and it scores 2,123. That’s a 2,587% performance increase compared to that Ryzen 9 9900X, and that’s not even using Nvidia’s own TensorRT SDK, which scores even higher than that.
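Those percentages follow from simple arithmetic on the published scores, which is easy to sanity-check:

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase going from the old score to the new one."""
    return (new - old) / old * 100

# Gen-on-gen CPU jump: Ryzen 9 7900X (56) -> Ryzen 9 9900X (79)
print(round(pct_increase(56, 79)))    # 41

# GPU vs. CPU: RTX 4080 Super (2,123) vs. Ryzen 9 9900X (79)
print(round(pct_increase(79, 2123)))  # 2587
```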
The simple fact of the matter is that AI demands parallel processing performance like nothing else, and nothing does that better than a graphics card right now. Elon Musk knows this – he’s just installed 100,000 Nvidia H100 GPUs in xAI’s latest AI training system. That’s more than $1 billion worth of graphics cards in a single supercomputer.
Obscured by clouds
To add insult to injury, the vast majority of popular AI tools today require cloud computing to fully function anyway.
LLMs (large language models) like ChatGPT and Google Gemini require so much processing power and storage that running the full-scale models on a local machine is out of the question. Even Adobe’s Generative Fill and AI smart filters in the latest versions of Photoshop rely on cloud computing to process images.
It’s simply not feasible to run the vast majority of today’s popular AI programs on your own home machine. There are exceptions, of course; certain AI image-generation tools are far easier to run locally, but even then, you’re better off using cloud computing in 99% of use cases.
The one big exception to this rule is localized upscaling and super-sampling. Things like Nvidia’s DLSS and Intel’s XeSS, and even to a lesser extent AMD’s own FSR (which is predominantly hand-tuned rather than deep-learning-based and runs on standard shader hardware, meaning you don’t need dedicated AI silicon), are fantastic examples of a good use of localized AI. Otherwise, though, you’re basically out of luck.
Yet still, here we are. Another week, another AI-powered laptop, another AI chip, much of which, in my opinion, amounts to a lot of fuss about nothing.
Halloween season is finally here, meaning there’s no better time to watch a horror movie. Be it a tale of exorcism or a psychological thriller about the dangers lurking around every corner, horror movies have a unique way of tackling our primal fears, making us more alert and giving us a much-needed fright. Netflix has a considerable collection of horror movies covering every subgenre and theme under the sun, so there’s no better place to be this Halloween season.
Some of the best new movies to stream offer chills and thrills while delivering a high-quality experience for terror-starved audiences. Netflix stays consistent every month with new and exciting arrivals that make up for whatever movies are leaving the service. We’ve also rounded up some of the best movies on Netflix to give you something to watch between scary movies. With supernatural stories, psychological thrillers, and good old-fashioned slashers, these are the best horror movies Netflix has to offer, and we wholeheartedly recommend them.