AMD quietly introduced two EPYC CPUs that intrigue me: the 8124P is a 16-core, 125W CPU, while the 8224P is an affordable 24-core "Threadripper Lite" alternative
AMD has quietly launched its EPYC Embedded 8004 Series of processors, aimed at high-demand applications such as networking, storage, and industrial edge environments.
Built on AMD’s “Zen 4c” architecture (the first processor series in the AMD embedded portfolio with these cores), they offer from 8 to 64 cores and support up to 1.152TB of DDR5 memory. With TDP values from 70W to 225W, the series is designed to combine power efficiency with reliable performance.
The lineup, based on the EPYC 8004 “Siena” series launched last year, features six models: the 64-core 8534P, 48-core 8434P, 32-core 8324P, 24-core 8224P, 16-core 8124P, and 12-core 8C24P, each catering to different performance needs.
“Siena” is coming to the embedded market
The 8224P, with its 24 cores and 180W TDP, is a standout thanks to its balance of power and affordability, making it a decent "Threadripper Lite" alternative. The 8124P, with 16 cores and a 125W TDP, offers a budget-friendly option for multi-threaded applications without high power consumption. Also worthy of mention is the 12-core EPYC 8C24P, which has a 100W TDP but a cTDP range of 70-100W.
The new embedded processors come equipped with advanced features like Direct Memory Access (DMA) for offloading data transfers, Non-Transparent Bridging (NTB) for reliable CPU communication, and DRAM Flush to NVMe for secure data retention during power loss. With dual SPI for secure boot and Device Identity Attestation for preventing unauthorized modifications, the series offers data integrity and system security.
Built around the compact SP6 socket, which is 19% smaller than the SP5 socket used by the Embedded 9004 Series, the EPYC 8004 Series fits into space-constrained systems while maintaining high efficiency. It carries seven-year lifecycle support, in line with the non-embedded 8004 processors, and includes Yocto Project support, allowing Linux-based operating systems to be tailored to specific embedded needs.
Summing up the new embedded processors, ServeTheHome says, “Slotted between the AMD EPYC 4004 series and EPYC 9004 series, the EPYC 8004 “Siena” platform is nice because it gives a large number of cores to a platform that costs less than the EPYC 9004 series, but that is more expandable than the EPYC 4004 series. For many applications, things like connectivity are more important than achieving maximum clock speed on a CPU. With the AMD EPYC Embedded 8004 series, the company is signaling that “Siena” is coming to the embedded market. To be clear, we think the EPYC 8004 is a stellar platform. We do wish, however, that AMD did a bit more branding work to differentiate why a chip is in its Embedded line.”
Just in time for the 2024 US elections, the call screening and fraud detection company Hiya has launched a free Chrome extension to spot deepfake voices. The aptly named Hiya Deepfake Voice Detector “listens” to voices played in video or audio streams and assigns an authenticity score, telling you whether it’s likely real or fake.
Hiya tells Engadget that third-party testers have validated the extension as over 99 percent accurate. The company says that coverage even extends to AI-generated voices the detection model hasn't been trained on, and it claims it can spot voices created by new synthesis models as soon as they're launched.
We played around with the extension ahead of launch, and it seems to work well. I pulled up a YouTube video about the blues pioneer Howlin’ Wolf that I suspected used AI narration, and it assigned it a 1/100 authenticity score, declaring it likely a deepfake. Suspicions confirmed.
Hiya threw a well-earned jab at social media companies for making such a tool necessary. “It’s clear social media sites have a huge responsibility to alert users when the content they are consuming has a high chance of being an AI deepfake,” Hiya President Kush Parikh wrote in a press release. “The onus is currently on the individual to be vigilant to the risks and use tools like our Deepfake Voice Detector to check if they are concerned content is being altered. That’s a big ask, so we’re pleased to be able to support them with a solution that helps put some of the power back in their hands.”
The extension only needs to listen to a few seconds of a voice to spit out a result. It works on a credit system to prevent Hiya’s servers from getting slammed by excessive requests. You’ll get 20 credits daily, which may or may not cover the flood of manipulative AI content you’ll come across on social media in the coming weeks.
While iPads are cheaper and much handier to carry around than MacBooks, you often need an extra accessory or two to make them as useful. An attachable keyboard can be great for anyone with a writing job (hello!), but an Apple Pencil is critical for everything from studying to designing. Thankfully, it's cheaper than ever to get the budget option, with the USB-C Apple Pencil on sale for $65, down from $79. The 18 percent discount brings the accessory to $5 less than its Prime Day price.
Apple released its USB-C Pencil in late 2023 as a cheaper option than its counterparts, the second generation Apple Pencil and Apple Pencil Pro. This Pencil is compatible with all iPads with a USB-C port and offers the hover feature when using an M2 iPad Air or the iPad Pro. It also has some great perks like low latency, tilt sensitivity and pixel-perfect accuracy. However, it doesn’t have pressure sensitivity like its fellow Apple Pencils.
Artificial intelligence is increasingly making its presence felt in more areas of our lives, certainly since the launch of ChatGPT. Depending on your view, it’s that big bad bogeyman that’s taking jobs and causing widespread copyright infringement, or a gift with the potential to catapult humanity into a new age of enlightenment.
What many have achieved with the new tech, from Midjourney and LLMs to smart algorithms and data analysis, is beyond radical. It’s a technology that, like most of the silicon-based breakthroughs that came before it, has a lot of potency behind it. It can do a lot of good, but also, many fear, a lot of bad. And those outcomes are entirely dependent on how it’s manipulated, managed, and regulated.
It’s not surprising then, given how rapidly AI has forced its way into the zeitgeist, that tech companies and their sales teams are equally leaning into the technology, stuffing its various iterations into their latest products, all in the aim of encouraging us to buy their hardware.
Check out this new AI-powered laptop, that motherboard that uses AI to overclock your CPU to the limit, those new webcams featuring AI deep-learning tech. You get the point. You just know that from Silicon Valley to Shanghai, shareholders and company execs are asking their marketing teams "How can we get AI into our products?" in time for the next CES or Computex, no matter how modest the value will actually be for us consumers.
My biggest bugbear comes in the form of the latest generation of CPUs being launched by the likes of AMD, Intel, and Qualcomm. Now, these aren’t bad products, not by a long shot. Qualcomm is making huge leaps and bounds in the desktop and laptop chip markets, and the performance of both Intel and AMD’s latest chips is nothing if not impressive. Generation on generation, we’re seeing higher performance scores, better efficiency, broader connectivity, lower latencies, and ridiculous power savings (here’s looking at you, Snapdragon), among a whole slew of innovative design changes and choices. To most of us mere mortals, it’s magic way beyond the basic 0s and 1s.
Despite that, we still get AI slapped onto everything regardless of whether or not it's actually adding anything useful to a product. We have new neural processing units (NPUs) added to chips, co-processors designed to accelerate the low-level operations that AI workloads rely on. These are then put into low-powered laptops, allowing them to use AI features such as Microsoft's Copilot assistant and tick that AI checkbox, as if it makes a difference to a predominantly cloud-based solution.
The thing is though, CPU performance, when it comes to AI, is insignificant. Like seriously insignificant, to the point it’s not even mildly relevant. It’s like trying to launch NASA’s JWST space telescope with a bottle of Coke and some Mentos.
Emperor’s new clothes?
I've spent the last month testing a raft of laptops and processors, specifically looking at how they handle artificial intelligence tasks and apps. UL's Procyon benchmark suite (from the makers of 3DMark) includes a Computer Vision inference test that spits out a nice, clean score for each component. Intel Core i9-14900K? 50. AMD Ryzen 9 7900X? 56. Ryzen 9 9900X? 79 (that's a 41% gen-on-gen performance increase, by the way, which is seriously huge).
Here’s the thing though: chuck a GPU through that same test, such as Nvidia’s RTX 4080 Super, and it scores 2,123. That’s a 2,587% performance increase compared to that Ryzen 9 9900X, and that’s not even using Nvidia’s own TensorRT SDK, which scores even higher than that.
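For reference, the percentage figures above follow directly from the raw benchmark scores. Here's a quick sketch in Python (the score values are the ones quoted in this article; the dictionary and function names are just for illustration):

```python
# Procyon Computer Vision inference scores cited in the text.
scores = {
    "Core i9-14900K": 50,
    "Ryzen 9 7900X": 56,
    "Ryzen 9 9900X": 79,
    "RTX 4080 Super": 2123,
}

def percent_gain(new: float, old: float) -> float:
    """Relative improvement of `new` over `old`, in percent."""
    return (new - old) / old * 100

# Gen-on-gen CPU jump: 7900X -> 9900X.
gen_on_gen = percent_gain(scores["Ryzen 9 9900X"], scores["Ryzen 9 7900X"])
# GPU vs. CPU: RTX 4080 Super vs. 9900X.
gpu_vs_cpu = percent_gain(scores["RTX 4080 Super"], scores["Ryzen 9 9900X"])

print(f"9900X over 7900X: {gen_on_gen:.0f}%")        # ~41%
print(f"RTX 4080 Super over 9900X: {gpu_vs_cpu:.0f}%")  # ~2587%
```

The takeaway is the scale of the gap: the GPU isn't a bit faster at inference, it's more than 25 times faster.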
The simple fact of the matter is that AI demands parallel processing performance like nothing else, and nothing does that better than a graphics card right now. Elon Musk knows this – he’s just installed 100,000 Nvidia H100 GPUs in xAI’s latest AI training system. That’s more than $1 billion worth of graphics cards in a single supercomputer.
Obscured by clouds
To add insult to injury, the vast majority of popular AI tools today require cloud computing to fully function anyway.
LLMs (large language models) like ChatGPT and Google Gemini require so much processing power and storage space that it’s impossible to run them on a local machine. Even Adobe’s Generative Fill and AI smart filter tech in the latest versions of Photoshop require cloud computing to process images.
It's just not feasible to run the vast majority of today's popular AI programs on your own home machine. There are exceptions, of course; certain AI image-generation tools are far easier to run locally, but even then, you're better off using cloud computing in 99% of use cases.
The one big exception to this rule is localized upscaling and super-sampling. Things like Nvidia's DLSS and Intel's XeSS, and even to a lesser extent AMD's own FSR (although FSR isn't based on deep-learning models; it runs analytically on standard shader hardware, meaning you don't need dedicated AI componentry), are fantastic examples of a good use of localized AI, or AI-adjacent, tech. Otherwise though, you're basically out of luck.
Yet still, here we are. Another week, another AI-powered laptop, another AI chip, much of which, in my opinion, amounts to a lot of fuss about nothing.
Halloween season is finally here, meaning there's no better time to watch a horror movie. Be it a tale of exorcism or a psychological thriller about the dangers lurking around every corner, horror movies have a unique way of tackling our primal fears, making us more alert, and giving us a much-needed fright. Netflix has a considerable collection of horror movies covering every subgenre and theme under the sun, so there's no better place to be this Halloween season.
Some of the best new movies to stream offer chills and thrills while delivering a high-quality experience for terror-starved audiences. Netflix stays consistent every month with new and exciting arrivals that make up for whatever movies are leaving the service. We also found some of the best movies on Netflix, to give you something to watch between scary movies. With supernatural stories, psychological thrillers, and good old-fashioned slashers, these are the best horror movies that Netflix has to offer, and we wholeheartedly recommend them.