Technology

Celebrating the 2025 Game Changers — Adaptation and resilience

Lightspeed partnered with GamesBeat, Nasdaq, and leading judges and mentors once again to spotlight the 25 most innovative startups reshaping the gaming and interactive media industry. 

For half a century, video games have profoundly shaped consumer behavior and acted as a catalyst for significant technological innovation.

As an ever-increasing share of our lives is spent in immersive virtual worlds, gaming is expected to continue its pivotal role in how we play, work, and connect.

And like every industry, the gaming world is defined by outliers — the few companies that push the boundaries and pave the way for how the next generation of games are made and played. 

That’s why, after the success of last year’s program, Lightspeed, GamesBeat, and Nasdaq again teamed up with industry-leading judges and mentors to recognize extraordinary startups revolutionizing gaming and interactive media.

This year’s theme: resilience and adaptation

The gaming industry is in constant flux, so this year’s list emphasizes resilience and adaptation — honoring ideas and strategies that demonstrate a startup’s ability to not only withstand the challenges facing the gaming & interactive media industry but excel in this turbulent environment. 

In June, we asked the gaming and venture communities to nominate standout startups across five key categories:

  • 3D technology & infrastructure
  • Generative AI & agents
  • Game studios & UGC
  • Interactive media platforms
  • Extended reality (AR & VR) 

To qualify, entrants had to have been in business no more than five years and have no more than 50 full-time employees. The goal of this list is to highlight startups that have unique and original visions with a strong focus on execution. 

Introducing the Top 25 

After receiving an overwhelming number of entries (almost 40% more than last year), Lightspeed investors and GamesBeat editors narrowed the list of entrants to 75.

From there, a star panel of judges — including C-level gaming executives and senior operators across companies like Activision Blizzard, Amazon, TikTok, Riot Games, Tencent, and DeepMind — scored each candidate (with ~10 votes per startup). We then took the average scores across all judges for each company, sorted the list, and arrived at our final 25.
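The averaging step described above is simple enough to sketch. In this toy version, the startup names and scores are invented for illustration; the actual data and scale were not published.

```python
# Toy sketch of the judging arithmetic: average each startup's judge
# scores, sort descending, and keep the top entries.
from statistics import mean

# Hypothetical startup name -> list of judge scores (~10 votes per startup)
scores = {
    "StartupA": [8, 9, 7, 8, 9, 8, 7, 9, 8, 8],
    "StartupB": [6, 7, 5, 6, 7, 6, 6, 7, 6, 6],
    "StartupC": [9, 9, 8, 9, 10, 9, 8, 9, 9, 9],
}

# Average across all judges for each company, then sort the list.
ranked = sorted(scores, key=lambda s: mean(scores[s]), reverse=True)
top_n = ranked[:25]  # the real program kept 25 of 75 entrants
print(top_n)  # ['StartupC', 'StartupA', 'StartupB']
```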

Here are our five best-in-category winners.

Best 3D technology & infrastructure: k-ID 

There are nearly a billion kids and teens globally who play games online. Yet today, children pay the price for publisher shortcomings: trolling, cyberbullying, exploitation, toxicity, harassment, grooming, and other forms of online abuse.

The judges were impressed by k-ID’s bold approach in tackling one of the industry’s most pressing, complex challenges: providing a platform that allows both families and publishers to immediately create safe spaces for kids and teens online. At Lightspeed, we believe innovation should drive meaningful and positive change, and k-ID sets that standard, building infrastructure and technology that prioritizes both entertainment and safety. (Note: k-ID is a Lightspeed portfolio company.)

Best Generative AI & agents: Bitmagic

The barrier to creating games has always been high. While there are three billion gamers worldwide, there are only ~200,000 professional game developers.

Bitmagic believes anyone should be able to create games. 

Bitmagic caught the judges’ attention for its groundbreaking use of generative AI — it’s the first system in the world that leverages generative AI to transform text prompts into fully interactive, multiplayer 3D games. By eliminating the traditional barriers to game development, we see that Bitmagic can act as a catalyst for game lovers, empowering users of all skill levels to bring their game ideas to life with unprecedented ease. 

With its recent availability on Steam Playtest, Bitmagic is well on its way to democratizing game development.

Best game studios & UGC: Giant Skull

Led by industry veteran Stig Asmussen, the mind behind God of War III and the Star Wars Jedi series, Giant Skull is focused on creating AAA, story-driven action-adventure games. 

While their technical and visual achievements blew away our judges, our panel was also impressed by the company’s commitment to sustainability practices, work-life balance, and diversity, opting to employ remote developers all over the world.

Best extended reality (AR & VR): Eggscape

Eggscape impressed the judges with its unique blend of mixed reality (MR) and humor, a rare combination in XR. 

From the team behind immersive VR experiences like Gloomy Eyes, featuring Colin Farrell, and Paper Birds, featuring Edward Norton, Eggscape is an extended reality game where players navigate vivid 3D levels and interact with fellow gamers in real time, all while fending off a relentless invasion of alien robots.

Paired with the Eggverse world-building tool, which lets players share their creations with friends, its combination of art direction, innovative gameplay, and social connection charts a new direction in XR.

Best interactive media platforms: Pok Pok

Gaming, especially for young children, can be overstimulating and addictive. Pok Pok stood out with its calm, creativity-driven approach. 

Inspired by Montessori principles, Pok Pok prioritizes open-ended, non-addictive educational games for children ages 2-8. Collaborating with the top minds in education, the interactive platform allows children to explore and learn at their own pace. 

The minimalist design and thoughtful execution have earned Pok Pok an Apple Design Award and highlight a shift in gaming toward more meaningful, balanced entertainment that resonates with both parents and children.

The 2025 Game Changers

Game Changers Judges on stage at GamesBeat NEXT in SF: Moritz Baier-Lentz, Dean Takahashi, Kylan Gibbs, Mihir Vaidya, and Lisha Li.

Gaming moves fast, and these companies are on the front lines of innovation. We’re thrilled to announce the additional 20 winners and to honor them on stage at the GamesBeat NEXT Summit tonight, followed by a reception at our San Francisco office with our judges and some previous winners.

You can learn more about our winners on our new home page.

A special thank you to our judges, leaders, and mentors 

Thank you to all of the companies that submitted entries, as well as our panel of judges, who were inspired to give back to the community and brought their expertise and passion when evaluating these companies.

  • Allen Adham, Co-Founder and former Chief Design Officer at Blizzard Entertainment
  • Anna Sweet, CEO of Bad Robot Games
  • Ben Feder, Managing Partner of Tirta Ventures
  • Bonnie Rosen, General Manager of Disney Accelerator
  • Chris Bell, CEO & Game Director of Gardens Interactive
  • Danny Lange, Vice President of BI & AI at Google
  • Dean Takahashi, Lead Writer of GamesBeat
  • Jim Yang, President of Hoyoverse / miHoYo
  • Joe Tung, Co-Founder & CEO of Theorycraft Games
  • Johanna Faries, President of Blizzard Entertainment
  • John Hanke, Founder & CEO of Niantic
  • John W. Thompson, Former Chairman of Microsoft
  • Ken Wee, Chief Strategy Officer of Mattel and former Chief Strategy Officer of Activision Blizzard
  • Kylan Gibbs, Co-Founder & CEO of Inworld AI
  • Leo Olebe, VP of Global Partnerships at Microsoft Xbox
  • Maria Park, Vice President of Corporate Development at Krafton
  • Michael Chow, Co-Founder & CEO of The Believer Company
  • Mihir Vaidya, Chief Strategy Officer of Electronic Arts
  • Moritz Baier-Lentz, Partner & Head of Gaming at Lightspeed
  • Riccardo Zacconi, Co-Founder and former CEO of King
  • Songyee Yoon, Former President & Chief Strategy Officer of NCsoft
  • V Pappas, Former COO of TikTok



The Simpsons will join Monday Night Football on ESPN+ and Disney+


The town of Springfield will host a National Football League game in December at Atoms Stadium, but neither the Springfield Atoms nor the Shelbyville Sharks will take the field.

Instead, the Bengals-Cowboys game on December 9 will be transformed into the world of TV’s longest-running sitcom, The Simpsons, for a special Funday Football edition of Monday Night Football. The special Simpsons-ized broadcast will air on the ESPN+ and Disney+ streaming services and the NFL+ mobile app. The game will be broadcast in its regular form on ESPN, ESPN+, ABC, ESPN2 and ESPN Deportes.

The game will use tracking technology to turn the players on the field and ESPN commentators Mina Kimes, Dan Orlovsky and Drew Carter into Simpsons characters. Kimes, Orlovsky and Carter will wear Meta Quest Pro headsets to see their virtual environments. The quarterbacks will be transformed into Bart for the Cincinnati Bengals and Homer for the Dallas Cowboys using Sony’s AI data analyzer and sports tracking and broadcast technology.

The game will also feature more characters and pre-animated scenes voiced by the show’s original cast, including Hank Azaria, Nancy Cartwright, Dan Castellaneta, Julie Kavner and Yeardley Smith, along with some surprise sports cameos. Characters like Lisa, Krusty the Clown, Carl, Lenny, Moe and Milhouse will be on the sidelines rooting for their respective teams. The announcement doesn’t mention Harry Shearer, so don’t expect Mr. Burns or Smithers to be at the game.

This isn’t the first time that ESPN has turned a regular-season NFL game into an animated spectacle. Last year, Disney, ESPN and the NFL teamed up to turn an October game between the Atlanta Falcons and the Jacksonville Jaguars into a Toy Story-themed game that transformed London’s Wembley Stadium into Andy’s room. The kids’ cable network Nickelodeon has also aired a few NFL games as NFL Slimetime broadcasts, featuring live commentary from animated characters like SpongeBob (voiced by Tom Kenny) and Patrick Star (voiced by Bill Fagerbakke), plus computerized slime spewing in the end zones after touchdowns.


Study finds LLMs can identify their own mistakes

A well-known problem of large language models (LLMs) is their tendency to generate incorrect or nonsensical outputs, often called “hallucinations.” While much research has focused on analyzing these errors from a user’s perspective, a new study by researchers at Technion, Google Research and Apple investigates the inner workings of LLMs, revealing that these models possess a much deeper understanding of truthfulness than previously thought.

The term hallucination lacks a universally accepted definition and encompasses a wide range of LLM errors. For their study, the researchers adopted a broad interpretation, considering hallucinations to encompass all errors produced by an LLM, including factual inaccuracies, biases, common-sense reasoning failures, and other real-world errors.

Most previous research on hallucinations has focused on analyzing the external behavior of LLMs and examining how users perceive these errors. However, these methods offer limited insight into how errors are encoded and processed within the models themselves.

Some researchers have explored the internal representations of LLMs, suggesting they encode signals of truthfulness. However, previous efforts were mostly focused on examining the last token generated by the model or the last token in the prompt. Since LLMs typically generate long-form responses, this practice can miss crucial details.

The new study takes a different approach. Instead of just looking at the final output, the researchers analyze “exact answer tokens,” the response tokens that, if modified, would change the correctness of the answer.
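As a toy illustration of that idea, the helper below locates the answer span inside a generated response using simple whitespace tokenization. The function name and example are hypothetical; the study works with the models' own tokenizers.

```python
# Sketch of locating "exact answer tokens": the response tokens that,
# if modified, would flip the answer's correctness. Whitespace splitting
# stands in for a real model tokenizer here.
def exact_answer_token_indices(response_tokens, answer):
    """Return indices of the contiguous token span matching the answer."""
    answer_tokens = answer.split()
    n = len(answer_tokens)
    for i in range(len(response_tokens) - n + 1):
        if response_tokens[i:i + n] == answer_tokens:
            return list(range(i, i + n))
    return []  # answer not found in the response

response = "The capital of France is Paris , of course .".split()
idx = exact_answer_token_indices(response, "Paris")
print(idx)  # [5] -- only this token decides the answer's correctness
```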

The researchers conducted their experiments on four variants of Mistral 7B and Llama 2 models across 10 datasets spanning various tasks, including question answering, natural language inference, math problem-solving, and sentiment analysis. They allowed the models to generate unrestricted responses to simulate real-world usage. Their findings show that truthfulness information is concentrated in the exact answer tokens. 

“These patterns are consistent across nearly all datasets and models, suggesting a general mechanism by which LLMs encode and process truthfulness during text generation,” the researchers write.

To predict hallucinations, they trained classifier models, which they call “probing classifiers,” to predict features related to the truthfulness of generated outputs based on the internal activations of the LLMs. The researchers found that training classifiers on exact answer tokens significantly improves error detection.

“Our demonstration that a trained probing classifier can predict errors suggests that LLMs encode information related to their own truthfulness,” the researchers write.
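A minimal sketch of such a probe follows, using synthetic Gaussian activations in place of real LLM hidden states and a plain least-squares linear probe; the paper's actual probe setup and dimensions may differ.

```python
# Probing-classifier sketch: fit a linear probe on hidden activations at
# the answer token to predict whether the generated answer is correct.
# Synthetic activations simulate a "truthfulness direction" in the
# hidden space; real work would extract activations from an LLM.
import numpy as np

rng = np.random.default_rng(0)
d, n = 64, 2000  # toy hidden size and number of labeled examples

# Correct answers' activations are shifted along a fixed direction.
truth_direction = rng.normal(size=d)
labels = rng.integers(0, 2, size=n).astype(float)  # 1 = correct answer
acts = rng.normal(size=(n, d)) + np.outer(labels, truth_direction)

# Fit the probe by least squares on the first 1500 examples...
X_train, y_train = acts[:1500], labels[:1500]
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# ...then evaluate on held-out activations: predict "correct" if > 0.5.
preds = (acts[1500:] @ w) > 0.5
accuracy = (preds == labels[1500:].astype(bool)).mean()
print(f"held-out probe accuracy: {accuracy:.2f}")  # well above chance
```

Because the synthetic signal is strong, the probe separates the two classes almost perfectly; the interesting empirical question in the paper is where in the response such signal lives, which is what the exact-answer-token analysis addresses.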

Generalizability and skill-specific truthfulness

The researchers also investigated whether a probing classifier trained on one dataset could detect errors in others. They found that probing classifiers do not generalize across different tasks. Instead, they exhibit “skill-specific” truthfulness, meaning they can generalize within tasks that require similar skills, such as factual retrieval or common-sense reasoning, but not across tasks that require different skills, such as sentiment analysis.

“Overall, our findings indicate that models have a multifaceted representation of truthfulness,” the researchers write. “They do not encode truthfulness through a single unified mechanism but rather through multiple mechanisms, each corresponding to different notions of truth.”

Further experiments showed that these probing classifiers could predict not only the presence of errors but also the types of errors the model is likely to make. This suggests that LLM representations contain information about the specific ways in which they might fail, which can be useful for developing targeted mitigation strategies.

Finally, the researchers investigated how the internal truthfulness signals encoded in LLM activations align with their external behavior. They found a surprising discrepancy in some cases: The model’s internal activations might correctly identify the right answer, yet it consistently generates an incorrect response.

This finding suggests that current evaluation methods, which solely rely on the final output of LLMs, may not accurately reflect their true capabilities. It raises the possibility that by better understanding and leveraging the internal knowledge of LLMs, we might be able to unlock hidden potential and significantly reduce errors.

Future implications

The study’s findings can help design better hallucination mitigation systems. However, the techniques it uses require access to internal LLM representations, which is mainly feasible with open-source models.

The findings, however, have broader implications for the field. The insights gained from analyzing internal activations can help develop more effective error detection and mitigation techniques. This work is part of a broader field of studies that aims to better understand what is happening inside LLMs and the billions of activations that happen at each inference step. Leading AI labs such as OpenAI, Anthropic and Google DeepMind have been working on various techniques to interpret the inner workings of language models. Together, these studies can help build more robust and reliable systems.

“Our findings suggest that LLMs’ internal representations provide useful insights into their errors, highlight the complex link between the internal processes of models and their external outputs, and hopefully pave the way for further improvements in error detection and mitigation,” the researchers write.



Here are the 5 Startup Battlefield finalists at TechCrunch Disrupt 2024


The time has finally come to announce the five finalists of the Startup Battlefield. It all started earlier this year when the TechCrunch editorial team selected 200 companies from thousands that applied. From there, the team chose the 20 finalists who pitched this week onstage at TechCrunch Disrupt 2024 to investor judges and packed crowds.

This year’s finalists follow in the footsteps of Startup Battlefield legends like Dropbox, Discord, Cloudflare and Mint on the Disrupt Stage. With over 1,500 alumni having participated in the program, Startup Battlefield Alumni have collectively raised over $29 billion in funding with more than 200 successful exits.

The five finalists will pitch again on the Disrupt Stage on Wednesday, October 30 at 11:30 a.m. PT to Navin Chaddha (Mayfield), Chris Farmer (SignalFire), Dayna Grayson (Construct Capital), Ann Miura-Ko (Floodgate), and Hans Tung (Notable Capital).

Now, without further ado, here are the five TechCrunch Startup Battlefield 2024 finalists: 

It looks fake, or at least like a good illusion: There’s Gecko Materials founder Capella Kerst dangling a full wine bottle from her pinky finger, the only thing keeping it from smashing to pieces being the super-strong dry-adhesive her startup has brought to market. But it’s no trick. It’s the result of years of academic research that Kerst built on by inventing a method to mass-manufacture the adhesive. Inspired by the way real-life geckos’ feet grip surfaces, the adhesive is like a new Velcro — except it only needs one side, leaves no residue, and can detach as quickly as it attaches. It can do this at least 120,000 times and, as Kerst noted in a recent interview with TechCrunch, can stay attached for seconds, minutes, or even years.

Luna is a health and well-being app for teen girls that is designed to help them navigate teenhood. The app lets teens ask questions about their health and wellness and get responses from experts. It also lets them track their periods, moods, and skin. The London-based startup presented today on the Startup Battlefield stage at TechCrunch Disrupt 2024 to detail its mission to educate and support teen girls. Luna is the brainchild of best friend duo Jas Schembri-Stothart and Jo Goodall, who came up with the idea for the startup as part of an assignment during their MBA program at Oxford. 

For anyone who parties or goes out dancing, the risk of accidentally taking adulterated drugs is real. MabLab has created a testing strip that detects the five most common and dangerous additives in minutes. Co-founders Vienna Sparks and Skye Lam met in high school, and during college the pair lost a friend to overdose. It’s a story that, sadly, many people (including myself) can identify with. Thankfully, testing strips are a common sight now at venues and health centers, with hundreds of millions shipping yearly.

Six years ago, while researching for a college entrepreneurship competition, Valentina Agudelo identified a troubling gap in breast cancer survival rates between Latin America and the developed world, with women in her native Colombia and the rest of the continent dying at higher rates due to late detection. She realized that breast cancer is highly treatable when diagnosed early, yet many Latin American countries have large rural populations lacking access to mammograms and other diagnostic tools. So Agudelo and her two best friends decided to create a theoretical portable device that would detect breast cancer early.

In the summer of 2020, a fire broke out onboard a naval ship docked in San Diego Bay. For more than four days, the USS Bonhomme Richard burned as helicopters dropped buckets of water from above, boats spewed water from below, and firefighters rushed onboard to control the blaze. Before the embers had even cooled, lidar (light detection and ranging) scans were taken to assess how bad the damage was and to figure out how the fire even started. But the investigation was stalled, partially because of how hard it is to send lidar scans.


AMD confirms its next-gen RDNA 4 GPUs will launch in early 2025


AMD’s Q3 2024 earnings call today wasn’t bullish on gaming revenue overall, but it did confirm a hot new rumor on GPUs — specifically, the launch of AMD’s next-gen RDNA 4 parts early next year. “We are on track to launch the first RDNA4 GPUs in early 2025,” said AMD CEO Lisa Su, and the company confirmed to PCWorld that it’s the first time it’s shared those plans publicly.

“In addition to a strong increase in gaming performance, RDNA 4 delivers significantly higher ray tracing performance and adds new AI capabilities,” Su said on the call.

AMD expects its gaming revenue to continue to decline this quarter, due in no small part to the PlayStation 5 and Xbox Series consoles aging out, and it’s not exactly the company’s primary focus these days anyhow. On today’s call, Su pointed out how gaming only accounts for two percent of the company’s revenue, while data center is now well over half of the company’s business. She says that after spending 10 years turning AMD around, her next task is to “make AMD the end-to-end AI leader.”


Apple’s keyboards, mice, and trackpads are finally improving – now it’s time for more peripherals


Apple has been dropping tons of new releases for its most popular product lines like the M4 iMac and M4 Mac mini this week, but one of the biggest surprises was the tech giant relaunching three of its best-known peripherals — Magic Mouse, Magic Keyboard, and Magic Trackpad — now equipped with USB-C connectivity.

However, when looking at the list of Apple-branded accessories currently available, it all feels a bit…lacking.


Follow Mars rover’s 18-mile trip in NASA’s animated route map


Perseverance Mars Rover Drive Path Animation

NASA has shared a fascinating animation showing the route taken by the Perseverance rover on Mars since its arrival there in February 2021.

Perseverance is NASA’s most advanced Mars rover to date, and while its general routes are decided by a team at NASA’s Jet Propulsion Laboratory in Southern California, the rover actually moves forward autonomously, checking for hazards and moving around any problematic objects as it goes.

The animation covers the entire 18.7 miles (about 30 kilometers) traveled by Perseverance over the last 44 months, and includes the locations where it’s been collecting samples of Mars rock and soil.

Those samples will be returned to Earth in the coming years so that scientists can study them in laboratory conditions to try to determine whether microbial life ever existed on the red planet.

Most of Perseverance’s travels have taken place inside Jezero Crater, a place once filled with water and which scientists believe has the best chance of containing evidence of ancient life.

In recent months, however, Perseverance has embarked on a challenging climb up the side of the crater and is now tackling its steepest inclines to date.

Because much of the material it’s currently driving over comprises loosely packed dust and sand with a thin, brittle crust, Perseverance has recently been slipping a lot and has covered only about 50% of the distance that it would have managed on a more stable surface. On one occasion, it managed only 20% of the planned route.

“Mars rovers have driven over steeper terrain, and they’ve driven over more slippery terrain, but this is the first time one had to handle both, and on this scale,” said JPL’s Camden Miller, who is a rover planner, or “driver,” on the Perseverance mission. “For every two steps forward Perseverance takes, we were taking at least one step back. The rover planners saw this was trending toward a long, hard slog, so we got together to think up some options.”

The team used a replica rover on Earth to test out some new maneuvers aimed at reducing slippage, and also considered alternative routes featuring different terrain. Assessing the data, the planners settled on altering the route, and Perseverance is continuing on its way at a steady pace.

“That’s the plan right now, but we may have to change things up the road,” Miller said. “No Mars rover mission has tried to climb up a mountain this big, this fast. The science team wants to get to the top of the crater rim as soon as possible because of the scientific opportunities up there. It’s up to us rover planners to figure out a way to get them there.”

Those opportunities include access to rocks from the most ancient crust of Mars, formed by a wealth of different processes. Rocks there have never been analyzed up close before, and they could potentially include material from once-habitable environments.


Copyright © 2024 WordupNews.com