We got a sense of the “metaverse stock price” as it stands in 2024 at our recent GamesBeat Next 2024 event.
Neal Stephenson addressed that notion in a talk about how to make sci-fi come true and turn the dreams of an open metaverse into reality. Stephenson famously coined the word “metaverse” in his novel Snow Crash, which debuted in 1992. I read the novel back then, and I was honored to co-moderate a fireside chat with Stephenson at the event.
Riz Virk, author of The Simulation Hypothesis, which is about whether we’re living in a simulation, joined me as co-moderator. I’m pretty sure our talk with Stephenson was real, and that Virk is also a faculty associate at Arizona State University, founder of Play Labs, and a venture partner at Griffin Gaming Partners.
Stephenson has written many science fiction novels, but he joined us in a session entitled “The science fiction future that we want.” And he is dedicated to turning some of his ideas, like the metaverse, into science fact. He is cofounder of Whenere, which is making a game where users can use AI to enhance their storytelling. Whenere is what creators would use to create linear narratives. And Stephenson is also cofounder of Lamina1, a Web3 company focused on fair compensation for digital creators.
We started out with his definition of the metaverse, which for him has a spatial element, and then we strayed into discussions of the “metaverse stock price” and whether games like Fortnite, Minecraft and Roblox count as metaverse applications.
We also discussed Whenere’s attempt to let users create their own stories, first around Jane Austen’s Pride and Prejudice universe (which is no longer under copyright). Interestingly, Stephenson said he doesn’t use AI to write because he “knows how to write.”
Asked about the kind of science fiction future he wants, he said he is concerned about “carbon” and the fact that so many people don’t know what is real. (Given recent events, I can relate to the latter one). We even talked about digital twins and the notion that the metaverse might be inside Microsoft Flight Simulator 2024. We quizzed him about his newest novel Polostan, about the pre-atomic bomb era, and whether it has parallels to our era ahead of artificial general intelligence. And we asked if there would be a Snow Crash 2 or a Snow Crash film.
Here’s an edited transcript of our fireside chat with Stephenson. You can also watch the video in this post.
Riz Virk: Neal, you were talking recently about Matthew Ball and Tim Sweeney. You offered a definition of the metaverse: a massively multiplayer online universe that has a sense of space, where there are experiences distributed around that space in a way that’s perceived by all of its users in the same way. You can move from one place to another and interact with other users who are not physically present. It’s not controlled by any one entity. Many creators large and small build things there.
Stephenson: That was me being somewhat off the cuff, but when you read it back, it covers most of the important bases of what we want from a metaverse.
GamesBeat: I noted that the word metaverse on Google trends saw its peak in 2021, after Mark Zuckerberg changed his company’s name to Meta. The word has had a slight comeback, but it’s nowhere near as popular as it was during the pandemic. What observation would you have on this?
Stephenson: Tim Sweeney, in that conversation you mentioned, which is a pretty interesting document – you can find it on Matt Ball’s website – he likened it to a stock whose value goes up and down. But it’s always there at some level. If somebody does something cool that’s connected with the idea of the metaverse then the stock rises. If somebody does something lame the value goes down. But the ups and downs are against the context that it’s an ongoing project. It doesn’t necessarily cease to exist just because it’s gone into a down phase.
GamesBeat: Fortnite, Roblox, Minecraft happened and the stock goes up. But if something in the market doesn’t pan out, it’s going down.
Stephenson: To the extent that people think–it’s clear, unequivocally, that Tim thinks of the three applications you mentioned as absolutely being metaverse applications. By that standard, there are many hundreds of millions of people using it all the time and it’s making money. If you have a different definition of what the metaverse is, if you think of it as exactly what’s described in the novel, then it’s still a little ways out.
Virk: Snow Crash had the idea of programs like the Librarian and other AI characters within the metaverse. Sometimes I like to joke that the AI in the metaverse are the real residents. The rest of us just visit as avatars. I’m curious about this recent trend of smart NPCs. Companies like Inworld and Replika are creating these NPCs that are basically light wrappers around LLMs like ChatGPT. What are your thoughts about how AI will evolve in the metaverse?
Stephenson: That’s one we’re working on with Whenere, which is the product that (emcee) Tadhg (Kelly) just alluded to. We started experimenting with Inworld’s AI technology at the beginning of 2023. We whipped up a demo, a character called Virj from the Snow Crash universe, who we created in Unreal Engine using the Inworld AI platform. We were impressed by it. It was fascinating, which is how we got going on our current project. We’re very much paying attention to that and using those tools in an intensive manner every day. We think there’s huge potential there, which is why we’re doing it.
GamesBeat: You have some more things going on at Whenere, like the Jane Austen novel, this marriage of AI and storytelling.
Stephenson: Like I said, the first thing we tried was this character from Snow Crash. On further reflection, one of my co-founders came up with the idea of instead starting with the world of Pride and Prejudice, for several reasons. One is that we love it, but beyond just that, it’s in the public domain. We don’t have to spend the first year fucking around with lawyers. It’s conversation-based. There’s no starship battles or gunfights or other things that are hard and expensive to bring to life in a game engine. It’s people sitting in rooms talking to each other. We thought it was a good test case to prove the point that we wanted to prove about whether this could be a rewarding and engaging platform.
Virk: Does that mean you play as one of the characters in Pride and Prejudice?
Stephenson: We’re kind of hardcore believers in linear narrative. We’re not trying to make a complete open world where you can go in and fundamentally change what happens in the story. People like story worlds for a reason. For example, if you made the world of the Lord of the Rings, you could go into the Green Dragon pub and wait for Frodo to come in and say, “Don’t go through Moria. It’s very dangerous. Go around.” You could say a lot of things to those characters that would screw up the story of the book. The story of the book is what people love. They don’t want to see that change.
We do think people might want to immersively sit in that world and have less consequential interactions with characters in those worlds. As well as be able to write their own stories and see those stories play out in those worlds.
Virk: Could you then allow people to create their own worlds based on their own stories, or is it more that the company is going to curate these worlds?
Stephenson: Building a world–I don’t need to explain to this audience that building a world convincingly is expensive. Someone has to do that. In theory, anyone who has the staff and the budget can create any world they want in a game engine. The engine we’re using is Unreal. But we think it would be a lot easier for users if a world is supplied to them with all the pieces there. Then you could make changes to it, but you wouldn’t have to build the entire thing from scratch.
Virk: A lot of people are using AI for writing these days. What is your writing process like, and are you thinking of using AI anywhere in that process?
Stephenson: No. I already know how to write, so I don’t need help on that front. The act of writing is pleasurable to me. Making art is both a form of enjoyment for artists and a way of enhancing their own powers, exercising their own intellect. There’s a quote–this is terrible, but I can’t remember the name of the writer who put this up on Twitter. I quote her and give her credit on my Substack. She says, “I don’t want AI to make art and poetry so I can do the dishes and run the laundry. I want AI to do the dishes and run the laundry so I can make art and poetry.”
GamesBeat: The interesting question there is, what if your users ask AI to write something better than Neal Stephenson?
Stephenson: It can try. There are all kinds of ways, seriously, that AI can–for example, the voices we’re using are from ElevenLabs. ElevenLabs is using some kind of AI system where you feed it some text and it figures out how to say that line of dialogue in a way that sounds like an actor. It’s not perfect, but it’s surprisingly good. That’s an example of making a tool powered by AI that gives creators some agency, as opposed to just jerking the steering wheel out of their hands.
GamesBeat: What is the science fiction future that we want?
Stephenson: We in this room?
GamesBeat: We in this room, the game industry, the world…
Stephenson: “We” questions are tricky. People in social media discourse are always using that word. We should do this. We shouldn’t do that. It gets complicated when you start to ask the question, “Who exactly is the ‘We’ we’re talking about?”
GamesBeat: Is there some science fiction that you want?
Stephenson: Talking about big picture social concerns, if that’s where we’re going with this, the two big things that I mostly worry about are carbon and the fact that people can’t agree on what is real. There’s all kinds of hard science fiction you could write about ways to deal with the carbon problem that would be nice if they came true. So far the second problem I mentioned is trickier to work out. I’m not sure if science fiction is ready to tackle that.
Virk: A few years ago you announced that you were co-founder of Lamina1. For many people that was like seeing an intersection of science fiction and real-world innovation. Can you give us an update on Lamina1 and what you’re up to there?
Stephenson: For those who aren’t familiar with it, the idea was that when the metaverse suddenly hit that spike in popularity in late 2021, early 2022, we would try to build a system that creators could use to track their contributions to an open, decentralized metaverse, and hopefully make money from them. The thing that was obvious to me, and still is, was that there was going to be a metaverse, by the definition quoted earlier. It would come out in the game industry in the sense that game industry people know how to use the tool chain that’s necessary to build those kinds of experiences. You can’t have millions of people using the metaverse unless there are experiences that millions of people enjoy. It’s the game industry that knows how to deliver that.
The thing I thought might be missing was some way that you could post your contributions to the metaverse, have them attributed to you, and hopefully have revenue flow into your wallet if the thing you made reached an audience and became popular. That’s the founding vision of Lamina1, which is a blockchain. I’m the chairman. For me it’s a couple of hours a week. The CEO and powerhouse behind it is Rebecca Barkin, who is someone I met when we were both at Magic Leap. She’s been working with a terrific engineering team of people who know what they’re doing with crypto and blockchain. In spite of serious headwinds that hit that industry in 2022 and 2023, they’ve managed to keep that going and launch the chain in May. It’s being used. The system works. We’re starting to flex our muscles a bit creatively and get some content up there.
GamesBeat: I thought it was interesting that the different pieces you’re highlighting point to a very similar view of the open metaverse that you see from Tim Sweeney. He doesn’t want it to be controlled by any one party, any big platforms. Is there a meeting of the minds there? Do you have your own views on how the open metaverse should be built?
Stephenson: For the most part Tim and I are more aligned than not. What I hear from him typically has me nodding my head in agreement. He’s still quite cautious and skeptical about blockchain. He thinks it’s an interesting technology that got adopted too soon. It should have spent more time in the lab. I think that’s the gist of what he says in the Matthew Ball interview. He has similar skepticism about AI, about LLMs, based on ethical considerations around the fact that these things are trained–the big models are trained on data with a provenance that isn’t fully nailed down. There’s some controversy about where the data sets came from.
One of the reasons we picked an old book to begin the Whenere project is that the specific training data for the characters in that world is all in the public domain. It’s all 200 years old. But there’s no getting around the fact that the big model that powers the whole thing has data from all over the place. I think Tim has some scruples around that, which I respect. He has a very principled set of rules he likes to follow in picking projects that he wants to advocate and work on.
Virk: You came out with Fall in 2019. That was the same year I came out with my book The Simulation Hypothesis, which is about this idea that we’re already living inside a simulated environment. I’ve often said that the future of the metaverse is heading toward a point where we’ll be unable to distinguish a virtual world from a physical world. You would be unable to distinguish AI characters from human-controlled avatars or uploaded characters. My question is, do you think we’ll get to that point where video games will be indistinguishable from reality?
Stephenson: They’re certainly getting damn good. I don’t know about indistinguishable. If you want to throw enough processing power at it, you can use metahumans and other features of a modern game engine to make something that’s definitely cinematic quality. Of course you’re still looking at it on a two-dimensional screen.
Beyond that we’re talking far, far out in the future. The thing that got me going on Fall was David Deutsch’s books. The second one is called The Beginning of Infinity. He talks about this problem of simulating reality and what kind of computation power it takes to make increasingly good simulations. I’m going to completely mangle his thesis and dumb it down to something I can work with, which is that to make a simulation that’s as good as the universe, you have to have a computer the size of the universe. If you take that point of view, that’s where I was going. That’s the idea I was playing with in the book you mentioned.
GamesBeat: Will Wright once said that a dog-eared copy of Snow Crash was the business plan for every startup in Silicon Valley. How do you feel about this ability to influence real life?
Stephenson: Riz has a connection with the Center for Science and the Imagination, which was actually started to address the thing you’re talking about. It happened probably 15 years ago when I was on a stage like this with Michael Crow, the president of Arizona State. He said, “When are science fiction writers going to stop writing all this dystopian crap and write something that inspires people again?” We actually wrote a book, created an anthology at CSI called Hieroglyph. We were trying to get a bunch of science fiction writers to do that.
It turned out to be surprisingly hard to break people out of the dystopian groove, but I still think it was a worthy experiment. I’m not sure how much of it exerted any influence per se, but from time to time a science fiction book can be somewhat useful in getting a bunch of people in a company roughly pointed in the same direction.
GamesBeat: We know you love history. Your books jump between the future and the past a lot. What’s your view of history as an influence on science fiction?
Stephenson: I think it’s always the case that if you scratch a science fiction writer, you’ll find a history geek. I was reading old anthologies of science fiction stories as a kid, and there were all kinds of historical stories sprinkled in there. They would find ways to send someone back in time or bring a historical character forward in time. That’s been the case forever with science fiction writers. I guess I’m no exception.
Virk: Since you write about the history of the atomic bomb, do you think there are any lessons here for what’s happening with AI today?
Stephenson: I guess the way I would put it is that once they figured out how to control the power of the atom, they went out and started making bombs. We obliterated an atoll from the map of the Pacific Ocean. That’s an impressive demo of the power of the atom. But a lot of people were of a mindset–gee, I kind of like the glow in the dark watch dial so I can tell the time at night. Maybe we should work on radiotherapy to treat certain diseases.
There’s a similar thing happening now with AI. The people making the big systems want to demonstrate the equivalent of blowing up an atoll. That’s all very impressive, but as I was mentioning before, I think the real utility of it is going to be much more focused, fine-grained tools that solve actual problems for people.
GamesBeat: There are lots of interesting projects underway around digital twins. The enterprises of the world are using game engines to make these for things like BMW factories before they build them. Once the digital twin is perfect, they build it in the physical world. These projects are so big that they’re building digital twins of the earth now. Microsoft’s Flight Simulator 2024 is essentially a digital twin of the earth. Nvidia has been working on something called Earth 2 to build a climate model to predict climate change in the decades to come. Are we going to be putting these versions of the earth together to create a metaverse that’s a full digital twin of our planet?
Stephenson: To be pedantic, that’s a different thing from the metaverse. In Snow Crash you also have an application called Earth that’s just a utility that looks like the earth made of cartographic data. A digital twin of the earth is a fascinating and cool project, it’s just a different kind of project from what I think of as the metaverse, which is an imaginary space full of imaginary experiences. But for sure, the ability to simulate climate and geological processes at scale in a digital twin of the earth is something I very much look forward to playing with.
GamesBeat: We know your novel Seveneves is coming to the small screen, with a project in the works at Legendary Pictures. Will we see a Snow Crash film, or a Snow Crash 2? What are some technological elements we could see in a Snow Crash 2?
Stephenson: I’ve written some prequel material in the Snow Crash universe. But nothing that I would consider Snow Crash 2, not a lot of sequel stuff. It’s hard enough to get a movie made of Snow Crash one. Seveneves is at Legendary and they’re starting to work on it as a TV idea. Snow Crash is at Skydance. They’re working on it as one or more feature films. Beyond that I can’t say anything. They’re pretty tight-lipped about announcing what’s going on.
The funny thing is that if it had happened earlier, it would have sucked. People in 1990 would have said, “Oh, cool, a computer graphics universe. Let’s make the metaverse.” And they would have made it look like computer graphics looked back then. We’d be looking at it now and cringing at the poor quality of the graphics. It would be campy at this point. There was a certain point when various people who’ve come and gone, people who talked about making a Snow Crash movie–they realized that the metaverse that existed in the book had to be full cinematic quality. It wasn’t meant to be distinguishable from film shot with human actors. We dodged a bullet, I think.
Question: This conversation has largely revolved around what you want in the future. What is the future that you think we’re actually going to get?
Stephenson: Obviously it’s been a crazy year for the game industry. There’s some kind of sea change happening. That’s the optimistic take on it. What we’ll see coming from the next generation of game projects may look very different from what we have now. I hope, as I’ve made clear–I think we’re at a threshold now where we have new ways of interacting with game worlds. Game worlds have, for a very long time, been based on what amounts to a point and click interface. You have a cursor on the screen. You get it over something. You click the mouse button or hit a key and something happens. Most commonly you shoot someone.
That’s great fun. I don’t knock it at all. But the thing that was already happening, and was massively accelerated by COVID, is that everyone now has microphones on their computers. They’re in the habit of talking into computers. The ability to interact with a game world by talking and listening, to make a really terrible pun, is a game-changer. That’s going to open up a lot of interesting creative avenues for the industry going forward. We may see other new kinds of interactive schemes available as well, based on the camera looking at the player’s face and so on.
Question: You talked about how AI will not write your stories for you, but you do believe in the tools side. Can you dive deeper into what you get most excited about in terms of AI as it relates to storytelling?
Stephenson: Everyone has their own creative strengths and weaknesses, things they know how to do, that they’re comfortable doing, and other areas where they feel a bit of help would be valuable, especially if it’s taking over something that feels like a chore, that’s not very rewarding to do. I was looking at DaVinci Resolve the other day. A big part of what that program is famous for is color grading, which is an infamously meticulous and detailed process. The people who do it are wizards, amazing contributors to the creative process. In a perfect world you could go out and hire someone who’s great at it, but for a lot of people it’s serious drudgery. You know it’s terribly important, but you don’t know quite how to do it. For everyone who works in creative areas there are things like that, where AI can provide tools that extend the artist’s power without taking away the artist’s prerogatives.
Someone moved the UK’s oldest satellite and there appears to be no record of exactly who, when or why.
Launched in 1969, just a few months after humans first set foot on the Moon, Skynet-1A was put high above Africa’s east coast to relay communications for British forces.
When the spacecraft ceased working a few years later, gravity might have been expected to pull it even further to the east, out over the Indian Ocean.
But today, curiously, Skynet-1A is actually half a planet away, in a position 22,369 miles (36,000km) above the Americas.
Advertisement
Orbital mechanics mean it’s unlikely the half-tonne military spacecraft simply drifted to its current location.
Almost certainly, it was commanded to fire its thrusters in the mid-1970s to take it westwards. The question is who that was and with what authority and purpose?
It’s intriguing that key information about a once vital national security asset can just evaporate. But, fascination aside, you might also reasonably ask why it still matters. After all, we’re talking about some discarded space junk from 50 years ago.
“It’s still relevant because whoever did move Skynet-1A did us few favours,” says space consultant Dr Stuart Eves.
Advertisement
“It’s now in what we call a ‘gravity well’ at 105 degrees West longitude, wandering backwards and forwards like a marble at the bottom of a bowl. And unfortunately this brings it close to other satellite traffic on a regular basis.
“Because it’s dead, the risk is it might bump into something, and because it’s ‘our’ satellite we’re still responsible for it,” he explains.
Dr Eves has looked through old satellite catalogues, the National Archives and spoken to satellite experts worldwide, but he can find no clues to the end-of-life behaviour of Britain’s oldest spacecraft.
It might be tempting to reach for a conspiracy theory or two, not least because it’s hard to hear the name “Skynet” without thinking of the malevolent, self-aware artificial intelligence (AI) system in The Terminator movie franchise.
Advertisement
But there’s no connection other than the name and, in any case, real life is always more prosaic.
What we do know is that Skynet-1A was manufactured in the US by the now defunct Philco Ford aerospace company and put in space by a US Air Force Delta rocket.
“The first Skynet satellite revolutionised UK telecommunications capacity, permitting London to securely communicate with British forces as far away as Singapore. However, from a technological standpoint, Skynet-1A was more American than British since the United States both built and launched it,” remarked Dr Aaron Bateman in a recent paper on the history of the Skynet programme, which is now on its fifth generation.
This view is confirmed by Graham Davison who flew Skynet-1A in the early 70s from its UK operations centre at RAF Oakhanger in Hampshire.
Advertisement
“The Americans originally controlled the satellite in orbit. They tested all of our software against theirs, before then eventually handing over control to the RAF,” the long-retired engineer told me.
“In essence, there was dual control, but when or why Skynet-1A might have been handed back to the Americans, which seems likely – I’m afraid I can’t remember,” says Mr Davison, who is now in his 80s.
Rachel Hill, a PhD student from University College London, has also been scouring the National Archives.
Her readings have led her to one very reasonable possibility.
Advertisement
“A Skynet team from Oakhanger would go to the USAF satellite facility in Sunnyvale (colloquially known as the Blue Cube) and operate Skynet during ‘Oakout’. This was when control was temporarily transferred to the US while Oakhanger was down for essential maintenance. Perhaps the move could have happened then?” Ms Hill speculated.
The official, though incomplete, logs of Skynet-1A’s status suggest final commanding was left in the hands of the Americans when Oakhanger lost sight of the satellite in June 1977.
But however Skynet-1A then got shifted to its present position, it was ultimately allowed to die in an awkward place when really it should have been put in an “orbital graveyard”.
This refers to a region even higher in the sky where old space junk runs zero risk of running into active telecommunications satellites.
Advertisement
Graveyarding is now standard practice, but back in the 1970s no-one gave much thought to space sustainability.
Attitudes have since changed because the space domain is getting congested.
At 105 degrees West longitude, an active satellite might see a piece of junk come within 50km of its position up to four times a day.
That might sound like they’re nowhere near each other, but at the velocities these defunct objects move it’s starting to get a little too close for comfort.
Advertisement
The Ministry of Defence said Skynet-1A was constantly monitored by the UK’s National Space Operations Centre. Other satellite operators are informed if there’s likely to be a particularly close conjunction, in case they need to take evasive action.
Ultimately, though, the British government may have to think about removing the old satellite to a safer location.
Technologies are being developed to grab junk left in space.
Already, the UK Space Agency is funding efforts to do this at lower altitudes, and the Americans and the Chinese have shown it’s possible to snare ageing hardware even in the kind of high orbit occupied by Skynet-1A.
“Pieces of space junk are like ticking time bombs,” observed Moriba Jah, a professor of aerospace engineering at the University of Texas at Austin.
“We need to avoid what I call super-spreader events. When these things explode or something collides with them, it generates thousands of pieces of debris that then become a hazard to something else that we care about.”
Strands is the NYT’s latest word game after the likes of Wordle, Spelling Bee and Connections – and it’s great fun. It can be difficult, though, so read on for my Strands hints.
SPOILER WARNING: Information about NYT Strands today is below, so don’t read on if you don’t want to know the answers.
Your Strands expert
Marc McLaren
NYT Strands today (game #251) – hint #1 – today’s theme
What is the theme of today’s NYT Strands?
• Today’s NYT Strands theme is… Generation jam
NYT Strands today (game #251) – hint #2 – clue words
Play any of these words to unlock the in-game hints system.
NYT Strands today (game #251) – hint #3 – spangram
What is a hint for today’s spangram?
• GrungePop?
NYT Strands today (game #251) – hint #4 – spangram position
What are two sides of the board that today’s spangram touches?
First: top, 3rd column
Last: bottom, 3rd column
Right, the answers are below, so DO NOT SCROLL ANY FURTHER IF YOU DON’T WANT TO SEE THEM.
NYT Strands today (game #251) – the answers
The answers to today’s Strands, game #251, are…
BLUR
OASIS
PAVEMENT
TOOL
NIRVANA
SUBLIME
SPANGRAM: NINETIESBANDS
My rating: Moderate
My score: Perfect
I have slightly mixed feelings about this Strands puzzle. On the one hand, the subject is NINETIESBANDS, and in many ways that’s my specialist subject. I’m a huge music fan (and former music journalist) and the ’90s was my era. Two of the bands in today’s list, PAVEMENT and BLUR, are among my favorites ever, and I have a soft spot for NIRVANA too.
On the flip side, what on earth does ‘Generation jam’, the theme clue, refer to? I have no idea whatsoever. And setting that aside, this may prove very difficult for people who are either too young or too old to be familiar with the likes of TOOL and SUBLIME, who aren’t necessarily household names (or aren’t these days, at least). Still, rather this than yesterday’s silly emoji-based game.
Yesterday’s NYT Strands answers (Friday, 8 November, game #250)
SHAKE
WAVE
CLAP
PRAY
PINCH
FIST
POINT
PEACE
SPANGRAM: HANDGESTURES
What is NYT Strands?
Strands is the NYT’s new word game, following Wordle and Connections. It’s now out of beta, so it’s a fully fledged member of the NYT’s games stable and can be played on the NYT Games site on desktop or mobile.
I’ve got a full guide to how to play NYT Strands, complete with tips for solving it, so check that out if you’re struggling to beat it each day.
A number of popular generative AI platforms are seeing consistent growth as users figure out how they want to use the tools — and ChatGPT tops the list with the most visits, at 3.7 billion worldwide. So many people are visiting the AI chatbot that its traffic rivals that of web browsers; in monthly users, it compares only to Google Chrome, which is estimated at around 3.45 billion.
Statistics from Similarweb indicate that ChatGPT saw 17.2% month-over-month (MoM) growth and 115.9% year-over-year (YoY) traffic growth. Among the highlights that spurred ChatGPT’s growth during 2024, its parent company, OpenAI, moved the tool from a subdomain, chat.openai.com, to a main domain, chatgpt.com. Traffic especially surged in May 2024, when the tool hit a 2.2-billion-visit milestone, and it has been growing ever since, according to Similarweb researcher David F. Carr.
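MoM and YoY growth figures like these are simple period-over-period ratios, which also lets you back out the implied prior-month traffic. A minimal sketch (the helper function and the back-calculated September figure are ours, not Similarweb’s):

```python
def growth_pct(current: float, previous: float) -> float:
    """Percentage change from the previous period to the current one."""
    return (current - previous) / previous * 100.0

# Figures from the article: ~3.7 billion October visits, 17.2% MoM growth
october_visits = 3.7e9
mom_growth = 17.2

# Implied prior-month visits, back-calculated from the reported growth rate
implied_september = october_visits / (1 + mom_growth / 100)

print(f"Implied September visits: {implied_september / 1e9:.2f} billion")
print(f"Check: {growth_pct(october_visits, implied_september):.1f}% MoM")
```

Running the numbers this way suggests roughly 3.16 billion visits the month before, consistent with the steady climb Similarweb describes since May 2024.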
ChatGPT has been a prominent presence in the technology industry since November 2022, and OpenAI continues to flesh out the product with new versions, features, and supplementary applications, or GPTs, to keep users interested. Recently, the company introduced ChatGPT Search, an in-house search engine feature that gives users real-time answers to queries such as sports scores, breaking news, and stock quotes.
Many note how the generative AI tool has come full circle to offer services very similar to the product it’s intended to displace: Google Search. Even so, the web-based versions of ChatGPT still rely on browsers to exist. In comparison, Google’s Chrome browser holds a solid lead with an estimated 3.45 billion users in 2024. It has seen minimal growth YoY but has grown 45.35% in the last five years, according to Statcounter.
Other recent news indicates that OpenAI has purchased the domain name Chat.com; however, there is no word on what the company plans to do with it.
Meanwhile, other AI tools continue to see traffic and growth, even if not at ChatGPT’s level. Despite recent plagiarism claims, the Perplexity chatbot saw 90.8 million visits in October, a 25.5% MoM growth and 199.2% YoY growth. Google’s Gemini chatbot saw 291.6 million visits in October, a 6.2% MoM growth and 19% YoY growth, after the company introduced a ChromeOS update that brought new AI features to its Chromebooks.
Anthropic’s Claude chatbot saw 84.1 million visits in October, a 25.5% MoM growth and 394.9% YoY growth, after recently rolling out a desktop application for Windows and macOS. Microsoft’s web-based Copilot saw 69.4 million visits in October, an 87.6% MoM growth.
Lastly, NotebookLM saw 31.5 million visits in October, an over 200% MoM growth. Built on Google’s generative models, the note-taking, document, link, and content-collection app lets users process and summarize different formats of information using AI prompts.