‘Digital squatting’ hits new levels as hackers target brand domains

  • Decodo reports 68% rise in digital squatting scams over five years
  • Techniques include typosquatting, combosquatting, TLD squatting, and homograph attacks, tricking users into sharing credentials or payments
  • WIPO logged 6,200 domain disputes in 2025, the highest ever; Decodo urges brands to register domains beyond .com for protection

Digital squatting is becoming increasingly popular among scammers, ruining businesses and their reputations at an unprecedented pace.

This is according to a new report from Decodo, which said that there’s been a 68% increase in these cases in half a decade.
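
The techniques the report names (typosquatting, combosquatting, TLD squatting, and homograph lookalikes) all come down to registering domains that read like a brand's real one. As a rough illustration of why defenders register or monitor lookalike domains, here is a minimal Python sketch that generates a few candidate variants for a placeholder brand; it is not Decodo's tooling, and the brand name, TLD list, and character swaps are assumptions:

```python
# Minimal sketch (illustration only, not Decodo's methodology): generate the
# kinds of lookalike domains described above so they can be registered or
# monitored. "example" is a placeholder brand.
BRAND = "example"
TLDS = ["com", "net", "co", "io"]            # TLD squatting candidates
KEYWORDS = ["login", "support", "pay"]       # combosquatting suffixes
HOMOGLYPHS = {"l": "1", "o": "0", "e": "3"}  # crude homograph/typo swaps

candidates = set()

# TLD squatting: same name, different ending
candidates.update(f"{BRAND}.{tld}" for tld in TLDS if tld != "com")

# Combosquatting: brand plus a trust-inspiring keyword
candidates.update(f"{BRAND}-{kw}.com" for kw in KEYWORDS)

# Simple typo/homograph variants: swap one character at a time
for i, ch in enumerate(BRAND):
    if ch in HOMOGLYPHS:
        candidates.add(BRAND[:i] + HOMOGLYPHS[ch] + BRAND[i + 1:] + ".com")

for domain in sorted(candidates):
    print(domain)
```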

Bulk RRAM: Scaling the AI Memory Wall

The hunt is on for anything that can surmount AI’s perennial memory wall: even quick models are bogged down by the time and energy needed to carry data between processor and memory. Resistive RAM (RRAM) could circumvent the wall by allowing computation to happen in the memory itself. Unfortunately, most types of this nonvolatile memory are too unstable and unwieldy for that purpose.

Fortunately, a potential solution may be at hand. At December’s IEEE International Electron Devices Meeting (IEDM), researchers from the University of California, San Diego showed they could run a learning algorithm on an entirely new type of RRAM.

“We actually redesigned RRAM, completely rethinking the way it switches,” says Duygu Kuzum, an electrical engineer at the University of California, San Diego, who led the work.

RRAM stores data as a level of resistance to the flow of current. The key digital operation in a neural network—multiplying arrays of numbers and then summing the results—can be done in analog simply by running current through an array of RRAM cells, connecting their outputs, and measuring the resulting current.
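
To make that analog trick concrete, here is a minimal NumPy sketch (my illustration, not code from the paper) of how a crossbar of conductances performs a matrix-vector multiply: Ohm's law gives each cell's current, and summing the currents on each output line does the addition.

```python
# Minimal sketch (not from the paper): an RRAM crossbar as an analog
# matrix-vector multiply. Cell conductances play the role of weights,
# input voltages are the activations, and the summed column currents
# are the outputs.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_outputs = 4, 3

conductances = rng.uniform(1e-6, 1e-5, size=(n_inputs, n_outputs))  # siemens, G = 1/R
input_voltages = rng.uniform(0.0, 0.2, size=n_inputs)               # volts

# Ohm's law per cell (I = G * V) plus Kirchhoff's current law summing each
# output line is exactly a matrix-vector product.
column_currents = input_voltages @ conductances  # amps, one value per output line
print(column_currents)
```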

Traditionally, RRAM stores data by creating low-resistance filaments in the higher-resistance surrounds of a dielectric material. Forming these filaments often needs voltages too high for standard CMOS, hindering its integration inside processors. Worse, forming the filaments is a noisy and random process, not ideal for storing data. (Imagine a neural network’s weights randomly drifting. Answers to the same question would change from one day to the next.)

Moreover, most filament-based RRAM cells’ noisy nature means they must be isolated from their surrounding circuits, usually with a selector transistor, which makes 3D stacking difficult.

Limitations like these mean that traditional RRAM isn’t great for computing. In particular, Kuzum says, it’s difficult to use filamentary RRAM for the sort of parallel matrix operations that are crucial for today’s neural networks.

So, the San Diego researchers decided to dispense with the filaments entirely. Instead they developed devices that switch an entire layer from high to low resistance and back again. This format, called “bulk RRAM”, can do away with both the annoying high-voltage filament-forming step and the geometry-limiting selector transistor.

The San Diego group wasn’t the first to build bulk RRAM devices, but it made breakthroughs both in shrinking them and forming 3D circuits with them. Kuzum and her colleagues shrank RRAM into the nanoscale; their device was just 40 nm across. They also managed to stack bulk RRAM into as many as eight layers.

With a single pulse of identical voltage, the researchers could program an eight-layer stack of cells, each of which can take any of 64 resistance values, a number that’s very difficult to achieve with traditional filamentary RRAM. And whereas the resistance of most filament-based cells is limited to kiloohms, the San Diego stack operates in the megaohm range, which Kuzum says is better for parallel operations.

“We can actually tune it to anywhere we want, but we think that from an integration and system-level simulations perspective, megaohm is the desirable range,” Kuzum says.

These two benefits, a greater number of resistance levels and a higher resistance, could allow this bulk RRAM stack to perform more complex operations than traditional RRAM can manage.

Kuzum and colleagues assembled multiple eight-layer stacks into a 1-kilobyte array that required no selectors. Then, they tested the array with a continual learning algorithm: making the chip classify data from wearable sensors—for example, reading data from a waist-mounted smartphone to determine if its wearer was sitting, walking, climbing stairs, or taking another action—while constantly adding new data. Tests showed an accuracy of 90 percent, which the researchers say is comparable to the performance of a digitally-implemented neural network.
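
As a rough digital stand-in for the kind of task described above (not the researchers' continual-learning algorithm, and with made-up data in place of real sensor readings), a single linear layer trained with softmax cross-entropy shows the matrix-vector products an RRAM array would be asked to compute in analog:

```python
# Stand-in sketch with synthetic data: a tiny activity classifier of the sort
# the array was tested on. Every X @ W below is the operation an RRAM
# crossbar performs in analog.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_features, n_classes = 600, 16, 4  # hypothetical sizes

X = rng.normal(size=(n_samples, n_features))   # stand-in sensor features
true_w = rng.normal(size=(n_features, n_classes))
y = np.argmax(X @ true_w + rng.normal(scale=0.5, size=(n_samples, n_classes)), axis=1)

W = np.zeros((n_features, n_classes))
for _ in range(200):                           # plain softmax regression
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = X.T @ (p - np.eye(n_classes)[y]) / n_samples
    W -= 0.5 * grad

print("training accuracy:", np.mean(np.argmax(X @ W, axis=1) == y))
```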

This test exemplifies what Kuzum thinks can especially benefit from bulk RRAM: neural network models on edge devices, which may need to learn from their environment without accessing the cloud.

“We are doing a lot of characterization and material optimization to design a device specifically engineered for AI applications,” Kuzum says.

The ability to integrate RRAM into an array like this is a significant advance, says Albert Talin, a materials scientist at Sandia National Laboratories in Livermore, California, and a bulk RRAM researcher who wasn’t involved in the San Diego group’s work. “I think that any step in terms of integration is very useful,” he says.

But Talin highlights a potential obstacle: the ability to retain data for an extended period of time. While the San Diego group showed their RRAM could retain data at room temperature for several years (on par with flash memory), Talin says that its retention at the higher temperatures where computers actually operate is less certain. “That’s one of the major challenges of this technology,” he says, especially when it comes to edge applications.

If engineers can prove out the technology, then all types of models may benefit. The memory wall has only grown higher this decade, as traditional memory hasn’t been able to keep up with the ballooning demands of large models. Anything that allows models to compute in the memory itself could be a welcome shortcut.

Snapchat now lets you inform others when you have arrived at your destination

After launching a “Home Safe” feature that lets users notify friends and family when they’ve arrived home safely, Snapchat is now introducing additional alerts to inform others when users have arrived at other destinations.

The social media giant announced on Monday that with its new “Arrival Notifications,” users can now set one-time or recurring alerts for locations beyond their home, providing an automatic way to share when they’ve arrived at specific places.

“Arrival Notifications now work for everyday moments — like letting someone know you’re back for the night while traveling, or automatically sharing when you arrive at a weekly class, practice, or meeting — without needing to remember to send a message,” the company wrote in a blog post.

As with the platform’s Home Safe alerts, Arrival Notifications can only be sent to friends you choose to share your location with. It’s worth noting that location sharing on Snap Map is off by default. No one can see your location or receive an alert unless you choose to share it, Snapchat explained. One-time alerts expire after they’re sent or after 24 hours.

To use Arrival Notifications, you need to share your location with a trusted friend that you want to keep in the loop. Then, you need to tap on your friendship profile and scroll down to “Arrival Notifications.” You can pick a location on the map and give it a personal name. For example, you could set the location for your “run club” or the location for “piano lessons.” You can then choose a one-time or recurring alert, after which Snapchat will notify your friend when you arrive.

The new feature comes as Snapchat announced last summer that Snap Map now has more than 400 million monthly active users.

Snap Map, which launched in 2017, was originally a way for users to see their friends’ locations and browse public snaps from around the world. The feature now also offers ways for users to discover local hotspots and find things to do.

With its Home Safe and Arrival Notifications features, Snapchat is looking to further compete with services like the family location-sharing app Life360 and Apple’s “Find My.”

Living In The (LLM) Past

In the early days of AI, a common example program was the hexapawn game. This extremely simplified version of a chess program learned to play with your help. When the computer made a bad move, you’d punish it. However, people quickly realized they could punish good moves to ensure they always won against the computer. Large language models (LLMs) seem to know “everything,” but everything is whatever happens to be on the Internet, seahorse emojis and all. That got [Hayk Grigorian] thinking, so he built TimeCapsule LLM, an AI trained only on historical data.

Sure, you could tell a modern chatbot to pretend it was in, say, 1875 London and answer accordingly. However, you have to remember that chatbots are statistical in nature, so they could easily slip in modern knowledge. Since TimeCapsule only knows data from 1875 and earlier, it will be happy to tell you that travel to the moon is impossible, for example. If you ask a traditional LLM to roleplay, it will often hint at things you know to be true, but would not have been known by anyone of that particular time period.

Chatting with ChatGPT and telling it that it was a person living in Glasgow in 1200 limited its knowledge somewhat. Yet it was also able to hint about North America and the existence of the atom. Granted, the Norse apparently found North America around the year 1000, and Democritus wrote about indivisible matter in the fifth century BCE. But that knowledge would not have been widespread among common people in the year 1200. Training on period texts would surely give a better representation of a historical person.

The model uses texts from 1800 to 1875 published in London. In total, the training corpus contains about 90 GB of text files. Is this practical? There is academic interest in recreating period-accurate models to study history. Some also see it as a way to track the biases of the period and contrast them with biases found in data today. Of course, unlike the Internet, surviving documents from the 1800s are less likely to have trivialities in them, so it isn’t clear just how accurate a model like this would be for that sort of purpose.
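
The core of such a project is ruthless corpus filtering. A minimal sketch of what that could look like, assuming a hypothetical metadata file with one JSON record per document (this is an illustration, not [Grigorian]'s actual pipeline):

```python
# Illustrative sketch, not the TimeCapsule LLM pipeline: keep only documents
# published inside the historical window before training. File names and
# metadata fields are assumptions.
import json

EARLIEST_YEAR, CUTOFF_YEAR = 1800, 1875  # matches the window described above

def in_window(record: dict) -> bool:
    year = record.get("publication_year")
    return year is not None and EARLIEST_YEAR <= year <= CUTOFF_YEAR

with open("metadata.jsonl", encoding="utf-8") as src, \
        open("corpus_1800_1875.jsonl", "w", encoding="utf-8") as dst:
    for line in src:
        record = json.loads(line)
        if in_window(record):
            dst.write(json.dumps(record) + "\n")
```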

Instead of reading the news, LLMs can write it. Just remember that the statistical nature of LLMs makes them easy to manipulate during training, too.


Featured Art: Royal Courts of Justice in London about 1870, Public Domain

Worried AI means you won’t get a job when you graduate?

Lukasz Swiatek of the University of New South Wales Sydney discusses what advancements in technology might mean for future graduates.

The head of the International Monetary Fund, Kristalina Georgieva, has warned young people will suffer the most as an AI “tsunami” wipes out many entry-level roles in coming years.

The tasks being eliminated are usually those that entry-level workers do at present, so young people searching for jobs will find it harder to secure a good placement.

Georgieva is not alone. Other economic and business experts have warned about AI taking entry-level jobs.

As young people prepare to start or continue their university studies, they may be feeling anxious about what AI means for their job prospects. What does the current research say? And how can you prepare for a post-AI workforce while studying?

The situation around the world

At the moment, the impact of AI is uneven and depends on the industry.

A 2025 report from US think tank the Brookings Institution suggests, in general, AI adoption has led to employment and firm growth. Most importantly, AI has not led to widespread job loss.

At the same time, consulting firm McKinsey notes many businesses are experimenting with AI and redesigning how they work. So, some organisations are seeking more technically skilled employees.

Crucially, AI is affecting each industry differently. So, we might see fewer entry-level jobs in some industries, but more in others, or growth in specialist roles.

For example, international researchers have noted agriculture has been a slow adopter of AI. By contrast, my colleagues and I have found AI is being rapidly implemented in media and communications, already affecting jobs from advertising to the entertainment industries. Here we are seeing storyboard illustrators, copywriters and visual effects artists (among others) increasingly being replaced by AI.

So, students need to look carefully at the specific data about their chosen industry (or industries) to understand the current situation and predicted trends.

To do this, you can look at academic research about AI’s impacts on industries around the world, as well as industry news portals and free industry newsletters.

Get ready while studying

Students can also obviously build their knowledge and skills about AI while they are studying.

Specifically, students should look to move from “AI literacy” to “AI fluency”. This means understanding not just how AI works in an industry, but also how it can be used innovatively in different contexts.

If these elements are not already offered by your course, you can look at online guides and specific courses offered by universities, TAFE or other providers.

Students who are already familiar with AI can keep expanding their knowledge and skills. These students can discover the latest research from the world’s key publishers and keep up to date with other AI research news.

For students who aren’t really interested in AI, it’s still important to start getting to grips with the technology. In my research, I’ve suggested getting curious initially about three key things: opportunities, concerns and questions. These three elements can be especially helpful for getting across industry developments: how AI is being used, what issues it’s raising, and which impacts still need to be explored.

Free (online) courses, such as AI For Everyone and the Elements of AI, can help familiarise virtually anyone with the technology.

Strengthening other skills

All students, no matter how familiar they are with AI, can also concentrate on developing general competencies that can apply across any industry. US researchers have pinpointed six key “durable skills” for the AI age:

  • effective communication, to engage with others successfully
  • good adaptability, to respond to workplace, industry and broader social changes
  • strong emotional intelligence, to help everyone thrive in a workplace
  • high-quality creativity, to work with AI in innovative ways
  • sound leadership, to help navigate the challenges that AI creates
  • robust critical thinking, to deal with AI-related problems.

So, look for opportunities to foster these skills in and out of class. This could include engaging in teamwork, joining a club or society, doing voluntary work, or getting paid work experience.

Don’t forget ethics

Finally, students need to consider the ethical issues this new technology creates. Research suggests AI is bringing about changes in ethics across industries, and students need to know how to approach AI dilemmas.

For example, they need to feel confident tackling questions about when to use and not use AI, and whether the technology’s environmental impacts outweigh its benefits in different situations.

Students can do this through focused discussions with classmates, facilitated by teachers to tease out the issues. They can also do dedicated courses on AI ethics.

The Conversation

By Lukasz Swiatek

Lukasz Swiatek lectures in the School of the Arts and Media at UNSW Sydney. His main research areas are media and communication, higher education, and cultural studies. Over the years, he has taught a range of postgraduate and undergraduate courses, in media studies, communication, international and global studies.

Ask Hackaday: How Do You Detect Hidden Cameras?

The BBC recently published an exposé revealing that some Chinese subscription sites charge for access to their network of hundreds of hidden cameras in hotel rooms. Of course, this is presumably without the consent of the hotel management, and it probably isn’t a problem specific to China. After all, cameras can now be very tiny, so it is extremely easy to rent a hotel room or a vacation rental and bug it. This is illegal: China has laws against spy cameras, and hotels are required to check for them, the BBC notes. However, there is a problem: at least one camera found didn’t show up on conventional camera detectors. So we wanted to ask you, Hackaday: How do you detect hidden cameras?

How it Works

Commercial detectors typically use one of two techniques. It is easy to scan for RF signals, and if the camera is emitting WiFi or another frequency you expect cameras to use, that works. But it also misses plenty. A camera might be hardwired, for example. Or store data on an SD card for later. If you have a camera that transmits on a strange frequency, you won’t find it. Or you could hide the camera near something else that transmits. So if your scanner shows a lot of RF around a WiFi router, you won’t be able to tell whether it’s just the router or the router plus a small camera.

Fire alarm? Camera? It is both!

The other common method uses a beam of light or a laser to try to see reflections of lenses, which will be retroreflective. The user views the room through a viewfinder, and any light that comes directly back will show up in the view. Despite some false positives, this method will find cameras even if they are not powered or transmitting. Even shining a flashlight, maybe from the same cell phone, around a dark room might uncover some camera devices.

There are a few other techniques. If you assume a spy camera probably uses IR lighting to see you at night, you can scan for that. A good tip is that your cell phone camera can probably see IR. (Test it on an IR remote control.) So looking around with your phone camera is a good, free way to find some cameras. A thermal imager might show hidden equipment, too, although it might be hard to determine if it is actually a camera or not.
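
If you want to automate the flashlight-and-phone-camera trick, a little image processing goes a long way. Here is a minimal OpenCV sketch (my illustration, not a product, with a hypothetical photo filename) that flags small, unusually bright spots in a dark-room photo, the kind of glints a lens retroreflection or an IR illuminator produces:

```python
# Illustrative sketch: flag small, very bright spots in a dark-room photo.
# Lens retroreflections and IR illuminators show up as point-like glints.
import cv2

image = cv2.imread("dark_room.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical photo
blurred = cv2.GaussianBlur(image, (5, 5), 0)

# Anything far brighter than the rest of a dark frame is suspicious.
_, mask = cv2.threshold(blurred, 220, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if w * h < 400:  # small, point-like glints only
        print(f"possible glint at ({x}, {y}), size {w}x{h}")
```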

You might be thinking: just look for the camera. But that’s not always simple. In the BBC article, the camera was the size of a pencil eraser. Not to mention, a quick search of your favorite retailer will reveal cameras made to look like smoke detectors, stuffed toys, USB chargers, and more. You can even get small cameras that can mount a fake button or screw head on the lens.

Testing

[Project Farm] has a video that tests a few detectors. The problem, of course, is that there are different kinds of cameras. Detecting the test camera doesn’t mean it will detect all cameras. Still, you can get some idea of how effective some detectors are compared to others.

Your Turn?

Given that none of the current ways to detect cameras work perfectly, what would you build to find them? Maybe a nonlinear junction detector (NLJD)? Or maybe some tech to blind them? Tell us what you think in the comments.

Can the Teacher Shortage Be Solved by a Shift in Mindset?

In Mr. Seevers’ English class, the air feels different today. A quiet student draws an unexpected connection between “The Odyssey” and modern migrant stories. The room wakes up. One idea sparks another, and conversation bounces around the room. Mr. Seevers grins, scribbling connections on the whiteboard, forgetting the clock. Students lean in. Their voices matter. Teacher and students move together — focused, curious, absorbed. When the bell finally rings, Mr. Seevers realizes … this is why he never quit.

Moments like this still happen in classrooms, but maybe not often enough to sustain the enthusiasm most educators had when they started out. As a result, leaders are grappling with two familiar challenges: finding and keeping great teachers. Seats go unfilled, turnover disrupts continuity and costs balloon.

But more troubling is the quiet toll that teacher burnout takes on students. In an era of artificial intelligence, shifting expectations and high-stakes accountability, students need teachers who are present, enthusiastic, resilient and growth-minded.

The key to recruiting and retaining great teachers is helping them find and sustain a sense of “flow” — a state where their energy, purpose and performance align — and bring that vitality into classrooms. While burnout depletes a teacher’s psychic energy, flow replenishes it. The payoff for districts is profound: stronger retention, smoother recruitment and better student outcomes.

What Is Flow, and Why It Matters in the Classroom

Flow is the state of full absorption and intrinsic motivation. It’s the sweet spot between boredom (too easy) and anxiety (too hard). It’s that “in the zone” moment where “everything just clicks.”

In K-12 education, flow tends to center around student learning: creating lessons that challenge just enough, focusing attention without overwhelming it. But teachers benefit from that psychological sweet spot, too: planning in flow, teaching in flow and iterating in flow.

When teachers find flow, something subtle but powerful happens. Their focus and curiosity become contagious. Research on emotional contagion shows that a teacher’s mood shapes the climate of the classroom. Stress and frustrations spread quickly, but so do calm, curiosity and joy. Most teachers don’t realize how strongly their inner state influences student engagement, but it does.

Flow feeds on itself. The more teachers experience it, the more students do, creating cycles of focus, persistence and connection that drive better outcomes.

How the Pygmalion Effect Fuels Flow

The Pygmalion effect, also known as the Rosenthal effect, describes how higher expectations from teachers can lead to improved student performance. Educators communicate those expectations through tone, feedback, time and warmth, cues that shape how students view themselves as learners.

Those same beliefs drive teacher flow. While low expectations lead to overly easy lessons and unrealistically high expectations produce anxiety, teachers who believe their students can grow will naturally design learning that stretches skills without overwhelming them, the ideal balance for flow.

That energy is contagious, too. Studies indicate that teacher flow can cross over to students, creating a “flow contagion,” an upward spiral of shared engagement and persistence. The Pygmalion effect sets the stage; flow helps bring it to life.

The Recruitment and Retention Problem, and Why Mindset Matters

Districts today are busy chasing recruitment metrics (number of applicants, credentialing pipelines), but retention is where the real crisis lies. K-12 teachers now report the highest burnout rates in the country, across all jobs and industries. Systemic pressures like inadequate funding, excessive workloads, challenging student behaviors, parent scrutiny and lack of administrative support have them constantly on their back foot instead of in their zone of flow.

Consider a school district rolling out new AI tools to streamline lesson planning, grading or student feedback. On paper, it’s a practical move to save time and modernize instruction. But without coaching or dialogue, many teachers feel blindsided. They’re being asked to integrate complex tools while juggling a full teaching load, testing demands and students’ social-emotional needs.

The result? Instead of embracing the technology and feeling empowered, teachers feel alienated. Some worry that creativity, intuition and human connection matter less than AI adoption. Those already close to burnout may see AI as one more way their professional judgment is being replaced or devalued.

That stress doesn’t stay contained. It seeps into the classroom. Students pick up on frustration and unease just as easily as they absorb enthusiasm and curiosity.

When AI integration is paired instead with coaching and the spirit of exploration, teachers have space to process fears, experiment with tools and reflect on what works. They move from compliance to curiosity. This relational support can transform AI from a threat into a trusted collaborator, helping educators reclaim time, creativity and joy in their work.

The Benefits of Coaching and Teacher Flow

Districts that invest in coaching can strengthen multiple points in their teacher lifecycle:

  • Onboarding: New teachers get guidance on finding flow in their planning and instruction.
  • Burnout prevention: Coaches help identify stressors early, redesign workflows and create guardrails for energy.
  • Sustained engagement: Teachers experience coaching as developmental support from their district.
  • Improved recruiting: Prospective hires see a workplace that values professional well-being.

When retention improves, districts gain not only lower costs but also preserved institutional memory, relationships, curricular coherence and, most critically, consistently high instructional quality.

How Teacher Flow Translates to Student Outcomes

  • Sustained energy → better pacing: Teachers who maintain flow teach more responsively, observe more closely and adjust in the moment.
  • Mindset-aligned expectations → higher student growth: Teachers who believe in student potential push appropriately, scaffold growth and persist.
  • Emotional contagion → classrooms that hum: A teacher in flow models calm, curiosity and agency, and students respond in kind.
  • Upward spirals of engagement: As students engage, teachers get feedback, adapt and reenter flow.
  • Reduced classroom disruption: Lower turnover means fewer substitutes, fewer gaps and more continuity.

Prioritize Human-Centered Support Systems

Administrators don’t control every budget or class-size metric, but they can decide how people are supported, how leaders lead and how change takes shape. The difference between a district that churns teachers and one that nurtures them often comes down to access to coaching, a growth mindset, relational support and an environment that values energy, flow and reflective practice.


Districts that cultivate a coaching culture can move from perpetual retention crisis to a thriving educational community where teachers and students grow together. Learn more about how BetterUp supports educator well‑being.

Discord Will Require a Face Scan or ID for Full Access Next Month

Discord said today it’s rolling out age verification on its platform globally starting next month, when it will automatically set all users’ accounts to a “teen-appropriate” experience unless they demonstrate that they’re adults. From a report: Users who aren’t verified as adults will not be able to access age-restricted servers and channels, won’t be able to speak in Discord’s livestream-like “stage” channels, and will see content filters for any content Discord detects as graphic or sensitive. They will also get warning prompts for friend requests from potentially unfamiliar users, and DMs from unfamiliar users will be automatically filtered into a separate inbox.

[…] A government ID might still be required for age verification in its global rollout. According to Discord, to remove the new “teen-by-default” changes and limitations, “users can choose to use facial age estimation or submit a form of identification to [Discord’s] vendor partners, with more options coming in the future.” The first option uses AI to analyze a user’s video selfie, which Discord says never leaves the user’s device. If the age group estimate (teen or adult) from the selfie is incorrect, users can appeal it or verify with a photo of an identity document instead. That document will be verified by a third party vendor, but Discord says the images of those documents “are deleted quickly — in most cases, immediately after age confirmation.”

Hiroshima scientists crack the code for 3D printing tungsten carbide

The university’s team reports that their approach centers on controlled “softening” of the material rather than complete melting. The process, known as hot-wire laser irradiation, reshapes tungsten carbide while maintaining its exceptional hardness and minimizing defects – an achievement that could transform how cutting, drilling, and construction tools are manufactured.

Your Recap of Super Bowl 2026 Ads Is Here: Baby Yoda, Pokemon and Much More

We all watched the Seahawks beat the Patriots in Super Bowl LX on Sunday night, and in between all the plays, gameday ads filled up our screens. Our roundup for you includes some featuring artificial intelligence, movie trailers and some of the funniest spots of the evening (and Supergirl’s pop-up for Puppy Bowl). 

As you’re scrolling through these videos, we urge you to choose your own favorites, but don’t miss Baby Yoda, Ken’s Expedia spot, Melissa McCarthy’s telenovela or DoorDash’s cheeky 50 Cent ad. 

The Mandalorian and Grogu

In a short teaser clip, The Mandalorian and Grogu trek through the snow, reminding us “the journey never gets any easier.”

Supergirl for the Puppy Bowl

OK, so this isn’t an official gameday ad, but Supergirl showed up for the Puppy Bowl today, and DC dropped a teaser for the movie, which is due June 26. 

Liquid Death’s exploding heads

No, it’s not an episode of The Boys. Liquid Death’s new spot aims to blow your mind with its energy drink — but without losing your head. 

DoorDash and 50 Cent’s ‘beef’

Follow 50 Cent on Instagram, and at any moment, you’ll catch him trolling some of his famous peers — mostly to their detriment. This DoorDash gameday ad makes fun of his reputation for beefing with people (with a slick joke about Diddy), while he schools us on the “art of delivering beef.” Take notes. 

Manscaped serenade

You’ll have to watch this for yourself and arrive at your own conclusion, but we’re only posting the short version here. Head to YouTube for the extended cut. 

Southwest Airlines pokes fun at seating chaos

You know the airline recently switched to assigned seating, and this timely SB commercial jokingly looks back at the mayhem of an era before the new policy.

Scream 7

It’s the return of Sidney Prescott, Ghostface and a fan theory about Stu with this new, flame-filled big-game trailer that makes Sidney’s daughter a target. The movie arrives in theaters Feb. 27.

Google Gemini helps design a dream home

Google adds a human touch to showcase its Gemini AI assistant in this SB spot, where a little kid is genuinely excited to dream up what his new home will look like — including a place for his pup. 

Oakley Meta AI glasses

Marshawn Lynch is among the stars in Oakley Meta’s first Super Bowl ad that will debut this Sunday. Check out how the AI-powered glasses capture what Sky Brown, Sunny Choi, IShowSpeed, Spike Lee, Kate Courtney and Akshay Bhatia are doing on and off the ground. 

Prepping for FIFA World Cup 2026

Sofia Vergara and Owen Wilson are counting down to World Cup 2026 and its Spanish-language coverage on Peacock and Telemundo. The games begin in June.

Anthropic throws shade at AI competition

We’ve covered how this Claude ad from Anthropic takes shots at OpenAI and its plans to test ads, but Dr. Dre’s What’s The Difference playing in the background kind of nails the message’s tone, in case it wasn’t clear. 

Dairy Queen’s Taylor and Swift

It’s a play on words in this DQ spot starring Tyrod Taylor and D’Andre Swift that’s urging fans to order platters for their own halftime snack breaks.  

Backstreet Boys for T-Mobile

A T-Mobile store performance from the Backstreet Boys earns tears from Druski, and a couple of cameos from Machine Gun Kelly and The Wrong Paris actor Pierson Fodé.

State Farm teaser livin’ on a prayer

Danny McBride and Keegan-Michael Key rep “Halfway There Insurance” and serenade Hailee Steinfeld in this comedic teaser for State Farm’s ad. The girl group Katseye also makes a brief appearance.

Xfinity’s Wi-Fi saves Jurassic Park?

Xfinity goes the full nostalgia route in its Jurassic Park-themed Super Bowl ad starring Jeff Goldblum, Laura Dern and Sam Neill, where Xfinity stops the dino disaster at the park from ever happening.

Melissa McCarthy in an e.l.f. telenovela

Dramatic. Glamorous. High stakes. Melissa McCarthy. What more can you ask for in a gameday ad/telenovela? 

Eos fragrance x Netflix fake-out

In a nod to Netflix’s Is It Cake, this body spray spot from Eos has contestants guessing if the scent is real or if it’s coming from a dessert. 

Uber Eats: Matthew McConaughey annoys Bradley Cooper

Uber Eats is back for this year’s Super Bowl LX ad run, and Eagles fan Bradley Cooper isn’t trying to hear what Matthew McConaughey has to say about… food. Will they come to blows? Check out Jerry Rice, Parker Posey and a few other celeb cameos.

Pepsi nabs a polar bear

We’ve seen polar bears working as mascots for years for one particular cola brand, but Pepsi’s blind taste test for this bear has it feeling disloyal.

Super freaky Svedka Vodka

Vodka brand Svedka has used robots in its ads before, but the company hits a couple of firsts with this commercial, which is soundtracked by Rick James’ Super Freak. It’s the first time a vodka ad has rolled out during the Super Bowl in 30 years, and Svedka is the first brand to use mostly AI to create its spot.

Pokemon’s big anniversary

Who’s hitting the big 3-0? Pokemon! In honor of the franchise turning 30, the company launched a new Super Bowl ad to jumpstart a year-long celebration. Lady Gaga and Jigglypuff duet, Charles Leclerc expresses his respect for Arcanine and a few other Pokemon favorites are spotlighted. 

Be a hero with Ring 

Ring reminds us that pets are truly family with its tender new ad that showcases its Search Party feature. AI tech to help find and reunite lost pets? Sounds like a win. 

Ken travels solo with Expedia

No, Barbie isn’t tagging along with Ken on his jet-setting travel adventures. But he does have Expedia every step of the way in this gameday spot that may be one of our favorites this year. Go, Ken!

Toyota’s superhero belt

Not all commercials are made to be chaotic or cameo-filled surprises, and Toyota’s spot marries nostalgia, family and charm in this tender ad for the RAV4.

Kinder Bueno brings babies and merciful aliens together

Actor William Fichtner commands a space center in this Kinder Bueno ad with intergalactic travel, astronaut babies and cute little aliens that spare the planet for one reason. 

Chris Hemsworth is creeped out by Alexa Plus AI

If you haven’t been able to picture Chris Hemsworth afraid of anything, here’s your chance to see how he reacts to Alex Plus in his home. Amazon’s AI assistant works hard to prove its worth and trustworthiness in this ad.

Instacart in its disco era

Instacart recruited Ben Stiller and Benson Boone as disco-loving performers for a series of Super Bowl commercials to introduce its new app feature: Pick bananas the way you like them. Look out for these harmonizing brothers to drop another fresh ad during the big game. 

Liquid I.V. and EJae of KPop Demon Hunters 

Singer EJae goes a cappella with a version of Against All Odds in this Liquid I.V. teaser dubbed as a Tiny Vanity Concert. Who can’t relate to singing in front of a mirror? The full ad will go live for game week.   

Squarespace doesn’t want you to lose it

A laptop doesn’t stand a chance against Emma Stone in the Squarespace spot titled Unavailable. Stone and Yorgos Lanthimos team up in this black-and-white ad directed by Lanthimos, where the actor destroys a few laptops over the domain name emmastone.com being unavailable. 

Pringles x Sabrina Carpenter

Sabrina’s new man is constructed entirely out of Pringles in the full Super Bowl spot for the stackable chip brand. He’s tall and mustached, but is he too much of a snack? You’ll have to watch yourself to see if the pop star and her edible lover last.

Budweiser rings in a milestone

The beer brand gets sentimental in celebration of its 150th anniversary this year, and this ad features a Budweiser horse mascot taking flight. (No, that’s not Pegasus.)

Tree Hut 

Tree Hut is known for its sugar scrubs and other body care products, and this goopy ad redefines what a smear campaign can look like. 

Hims on rich people

Opening with a few lines and images about rich people and health care, this Super Bowl ad from Hims — narrated by Common — asks you to consider your own wellness. 

Grubhub teases money… and food

Grubhub has delivered on its promise to “put their mouth where their money is” — and it’s not just about the food. Listen to what George Clooney has to say. 

Fanatics Sportsbook and Kardashian Kurse 

OK, technically Kendall Jenner isn’t a Kardashian, but you get the drift — and the rumors — with this cheeky Super Bowl ad from Fanatics Sportsbook, a sports betting platform.

Oikos powers you up

Kathryn Hahn impressively pushes a trolley uphill in this Oikos ad that also features Derrick Henry.

Michelob Ultra and Kurt Russell’s wisdom

Ante up for Kurt Russell and Lewis Pullman hitting the slopes in the newest Michelob Ultra big game spot, which also features two Olympians: T.J. Oshie and Chloe Kim. 

Nerds hang with Andy Cohen

It might be weird seeing Andy Cohen outside of his Bravo hosting duties and trading banter with Real Housewives of any city, but here he is. Nerds Candy suits up and hits the red carpet with Cohen in this SB spot. 

Bud Light keg roll

In this commercial, wedding attendees go after a Bud Light keg in a slow-motion scramble set to Whitney Houston’s I Will Always Love You. It’s Post Malone, Shane Gillis and Peyton Manning versus a particularly steep hill.

Universal Orlando Resort wants to change everything

Through the lens of four different visitors (and ads), the theme park is launching a campaign called This Changes Everything to encourage guests to take “transformative” vacations. You can follow one family in this “Lil’ Bro” Super Bowl spot.

Chris Stapleton and Traveller Whiskey make a moment

The singer strikes a chord in this whiskey ad, recalling Stapleton’s past Super Bowl performance when he was tapped to sing the National Anthem. 

Too salty to party with Ritz? 

This spot transports viewers to Ritz Island, but this isn’t a reference to the popular reality franchise, Love Island (as far as we can tell). Jon Hamm and Bowen Yang observe a party — and tantalizing Ritz crackers — from afar. They end up joining the function with a bit of help from Scarlett Johansson.

YouTube TV: Don’t support what’s ‘Meh’

Jason and Kylie Kelce contemplate the worst aspects of a world filled with meh in this ad for YouTube TV.

Lay’s potato tear-jerker

Who knew a potato chip ad could be so softhearted? It’s a family affair when it comes to farming potatoes, and sweet memories line the way to retirement. 

Volkswagen wants you to jump around

You know what? Hell yeah to House of Pain’s Jump Around, no matter the context. In the VW Big Game ad, the auto company beckons you to get out, get up and get around. 

Turbo Tax drama with Adrien Brody

To ease everyone into our least-favorite time of year, Adrien Brody acknowledges that death and taxes are sure things for us, but do they need to be painful? 

AI’s GPU problem is actually a data delivery problem

Presented by F5


As enterprises pour billions into GPU infrastructure for AI workloads, many are discovering that their expensive compute resources sit idle far more than expected. The culprit isn’t the hardware. It’s the often-invisible data delivery layer between storage and compute that’s starving GPUs of the information they need.

“While people are focusing their attention, justifiably so, on GPUs, because they’re very significant investments, those are rarely the limiting factor,” says Mark Menger, solutions architect at F5. “They’re capable of more work. They’re waiting on data.”

AI performance increasingly depends on an independent, programmable control point between AI frameworks and object storage, one that most enterprises haven’t deliberately architected. As AI workloads scale, bottlenecks and instability happen when AI frameworks are tightly coupled to specific storage endpoints during scaling events, failures, and cloud transitions.

“Traditional storage access patterns were not designed for highly parallel, bursty, multi-consumer AI workloads,” says Maggie Stringfellow, VP, product management – BIG-IP. “Efficient AI data movement requires a distinct data delivery layer designed to abstract, optimize, and secure data flows independently of storage systems, because GPU economics make inefficiency immediately visible and expensive.”

Why AI workloads overwhelm object storage

AI workloads create bidirectional data patterns: massive ingestion from continuous data capture, simulation output, and model checkpoints. Combined with read-intensive training and inference workloads, these patterns stress the tightly coupled infrastructure on which storage systems rely.

While storage vendors have done significant work in scaling the data throughput into and out of their systems, that focus on throughput alone creates knock-on effects across the switching, traffic management, and security layers coupled to storage.

The stress on S3-compatible systems from AI workloads is multidimensional and differs significantly from traditional application patterns. It’s less about raw throughput and more about concurrency, metadata pressure, and fan-out considerations. Training and fine-tuning create particularly challenging patterns, like massive parallel reads of small to mid-size objects. These workloads also involve repeated passes through training data across epochs and periodic checkpoint write bursts.
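
To picture that access pattern, here is a minimal, hedged Python sketch of the concurrency side (bucket and key names are hypothetical; this illustrates the workload shape, not F5's product):

```python
# Illustrative sketch: the highly parallel, small-object read pattern that
# training workloads put on S3-compatible storage.
from concurrent.futures import ThreadPoolExecutor

import boto3

s3 = boto3.client("s3")
BUCKET = "training-shards"  # hypothetical bucket
keys = [f"epoch-data/shard-{i:05d}.bin" for i in range(512)]

def fetch(key: str) -> int:
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
    return len(body)

# Hundreds of concurrent GETs per worker, repeated every epoch, is what
# pushes object stores on concurrency rather than raw throughput.
with ThreadPoolExecutor(max_workers=64) as pool:
    total_bytes = sum(pool.map(fetch, keys))

print(f"read {total_bytes} bytes across {len(keys)} objects")
```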

RAG workloads introduce their own complexity through request amplification. A single request can fan out into dozens or hundreds of additional data chunks, cascading into further detail, related chunks, and more complex documents. The stress is concentrated less on capacity or raw storage speed and more on request management and traffic shaping.
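
A back-of-the-envelope example, with assumed numbers, shows how quickly that amplification adds up:

```python
# Illustrative arithmetic only; the counts are assumptions, not measurements.
TOP_K_CHUNKS = 50        # chunks returned by vector search
RELATED_PER_CHUNK = 2    # neighboring chunks pulled for extra context
PARENT_DOCS = 10         # full documents fetched for citations

object_reads = TOP_K_CHUNKS * (1 + RELATED_PER_CHUNK) + PARENT_DOCS
print(f"1 user query -> {object_reads} object reads")  # 1 user query -> 160 object reads
```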

The risks of tightly coupling AI frameworks to storage

When AI frameworks connect directly to storage endpoints without an intermediate delivery layer, operational fragility compounds quickly during scaling events, failures, and cloud transitions, which can have major consequences.

“Any instability in the storage service now has an uncontained blast radius,” Menger says. “Anything here becomes a system failure, not a storage failure. Or frankly, aberrant behavior in one application can have knock-on effects to all consumers of that storage service.”

Menger describes a pattern he’s seen with three different customers, where tight coupling cascaded into complete system failures.

“We see large training or fine-tuning workloads overwhelm the storage infrastructure, and the storage infrastructure goes down,” he explains. “At that scale, the recovery is never measured in seconds. Minutes if you’re lucky. Usually hours. The GPUs are now not being fed. They’re starved for data. These high value resources, for that entire time the system is down, are negative ROI.”

How an independent data delivery layer improves GPU utilization and stability

The financial impact of introducing an independent data delivery layer extends beyond preventing catastrophic failures.

Decoupling allows data access to be optimized independently of storage hardware, improving GPU utilization by reducing idle time and contention while improving cost predictability and system performance as scale increases, Stringfellow says.

“It enables intelligent caching, traffic shaping, and protocol optimization closer to compute, which lowers cloud egress and storage amplification costs,” she explains. “Operationally, this isolation protects storage systems from unbounded AI access patterns, resulting in more predictable cost behavior and stable performance under growth and variability.”

Using a programmable control point between compute and storage

F5’s answer is to position its Application Delivery and Security Platform, powered by BIG-IP, as a “storage front door” that provides health-aware routing, hotspot avoidance, policy enforcement, and security controls without requiring application rewrites.

“Introducing a delivery tier in between compute and storage helps define boundaries of accountability,” Menger says. “Compute is about execution. Storage is about durability. Delivery is about reliability.”

The programmable control point, which uses event-based, conditional logic rather than generative AI, enables intelligent traffic management that goes beyond simple load balancing. Routing decisions are based on real backend health, with monitoring of leading indicators to detect early signs of trouble. And when problems emerge, the system can isolate misbehaving components without taking down the entire service.
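
As a generic illustration of what health-aware routing means (this is not BIG-IP's implementation, and the thresholds are assumptions), the logic looks something like this:

```python
# Generic sketch: route each request to the healthiest storage backend and
# quietly drain backends whose leading indicators are degrading.
from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    error_rate: float      # rolling error fraction (leading indicator)
    p99_latency_ms: float  # recent tail latency

    def healthy(self) -> bool:
        return self.error_rate < 0.05 and self.p99_latency_ms < 500

backends = [
    Backend("store-a", error_rate=0.01, p99_latency_ms=120),
    Backend("store-b", error_rate=0.20, p99_latency_ms=900),  # degrading: isolated
    Backend("store-c", error_rate=0.02, p99_latency_ms=150),
]

def route() -> Backend:
    candidates = [b for b in backends if b.healthy()]
    if not candidates:
        raise RuntimeError("no healthy storage backends")
    return min(candidates, key=lambda b: b.p99_latency_ms)  # prefer the fastest healthy one

print("routing to", route().name)
```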

“An independent, programmable data delivery layer becomes necessary because it allows policy, optimization, security, and traffic control to be applied uniformly across both ingestion and consumption paths without modifying storage systems or AI frameworks,” Stringfellow says. “By decoupling data access from storage implementation, organizations can safely absorb bursty writes, optimize reads, and protect backend systems from unbounded AI access patterns.”

Handling security issues in AI data delivery

AI isn’t just pushing storage teams on throughput; it’s forcing them to treat data movement as both a performance and a security problem, Stringfellow says. Security can no longer be assumed simply because data sits deep in the data center. AI introduces automated, high-volume access patterns that must be authenticated, encrypted, and governed at speed. That’s where F5 BIG-IP comes into play.

“F5 BIG-IP sits directly in the AI data path to deliver high-throughput access to object storage while enforcing policy, inspecting traffic, and making payload-informed traffic management decisions,” Stringfellow says. “Feeding GPUs quickly is necessary, but not sufficient; storage teams now need confidence that AI data flows are optimized, controlled, and secure.”

Why data delivery will define AI scalability

Looking ahead, the requirements for data delivery will only intensify, Stringfellow says.

“AI data delivery will shift from bulk optimization toward real-time, policy-driven data orchestration across distributed systems,” she says. “Agentic and RAG-based architectures will require fine-grained runtime control over latency, access scope, and delegated trust boundaries. Enterprises should start treating data delivery as programmable infrastructure, not a byproduct of storage or networking. The organizations that do this early will scale faster and with less risk.”

Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact sales@venturebeat.com.
