
Technology

Snow Crash author Neal Stephenson on the ‘metaverse stock price’ | The DeanBeat


We got a sense of the “metaverse stock price” as it stands in 2024 at our recent GamesBeat Next event.

Neal Stephenson addressed that notion in a talk about how to make sci-fi come true and turn the dream of an open metaverse into reality. Stephenson famously coined the word “metaverse” in his 1992 novel Snow Crash. I read the novel back then, and I was honored to co-moderate a fireside chat with Stephenson at the event.

Riz Virk, author of The Simulation Hypothesis, a book about whether we’re living in a simulation, joined me as co-moderator. I’m pretty sure our talk with Stephenson was real. Virk is also a faculty associate at Arizona State University, founder of Play Labs, and a venture partner at Griffin Gaming Partners.

Stephenson has written many science fiction novels, and he joined us for a session entitled “The science fiction future that we want.” He is dedicated to turning some of his ideas, like the metaverse, into science fact. He is cofounder of Whenere, which is making a game where users can use AI to enhance their storytelling; Whenere is what creators would use to create linear narratives. Stephenson is also cofounder of Lamina1, a Web3 company focused on fair compensation for digital creators.


We started out with his definition of the metaverse, which for him has a spatial element, and then we strayed into discussions of the “metaverse stock price” and whether games like Fortnite, Minecraft and Roblox count as metaverse applications.

We also discussed Whenere’s attempt to let users create their own stories, first around Jane Austen’s Pride & Prejudice universe (which is no longer copyrighted). Interestingly, Stephenson said he doesn’t use AI to write because he “knows how to write.”

Asked about the kind of science fiction future he wants, he said he is concerned about “carbon” and the fact that so many people don’t know what is real. (Given recent events, I can relate to the latter one.) We even talked about digital twins and the notion that the metaverse might be inside Microsoft Flight Simulator 2024. We quizzed him about his newest novel Polostan, about the pre-atomic bomb era, and whether it has parallels to our era ahead of artificial general intelligence. And we asked if there would be a Snow Crash 2 or a Snow Crash film.

Here’s an edited transcript of our fireside chat with Stephenson. You can also watch the video in this post.

Left to right: Riz Virk, Neal Stephenson and Dean Takahashi at GamesBeat Next 2024.

Riz Virk: Neal, you were talking recently about Matthew Ball and Tim Sweeney. You offered a definition of the metaverse: a massively multiplayer online universe that has a sense of space, where there are experiences distributed around that space in a way that’s perceived by all of its users in the same way. You can move from one place to another and interact with other users who are not physically present. It’s not controlled by any one entity. Many creators large and small build things there.

Stephenson: That was me being somewhat off the cuff, but when you read it back, it covers most of the important bases of what we want from a metaverse.

GamesBeat: I noted that the word metaverse on Google trends saw its peak in 2021, after Mark Zuckerberg changed his company’s name to Meta. The word has had a slight comeback, but it’s nowhere near as popular as it was during the pandemic. What observation would you have on this?

Stephenson: Tim Sweeney, in that conversation you mentioned, which is a pretty interesting document – you can find it on Matt Ball’s website – he likened it to a stock whose value goes up and down. But it’s always there at some level. If somebody does something cool that’s connected with the idea of the metaverse then the stock rises. If somebody does something lame the value goes down. But the ups and downs are against the context that it’s an ongoing project. It doesn’t necessarily cease to exist just because it’s gone into a down phase.

GamesBeat: Fortnite, Roblox, Minecraft happened and the stock goes up. But if something in the market doesn’t pan out, it’s going down.


Stephenson: To the extent that people think–it’s clear, unequivocally, that Tim thinks of the three applications you mentioned as absolutely being metaverse applications. By that standard, there are many hundreds of millions of people using it all the time and it’s making money. If you have a different definition of what the metaverse is, if you think of it as exactly what’s described in the novel, then it’s still a little ways out.

Lamina1 was started by Neal Stephenson and Peter Vessenes.

Virk: Snow Crash had the idea of programs like the Librarian and other AI characters within the metaverse. Sometimes I like to joke that the AI in the metaverse are the real residents. The rest of us just visit as avatars. I’m curious about this recent trend of smart NPCs. Companies like Inworld and Replika are creating these NPCs that are basically light wrappers around LLMs like ChatGPT. What are your thoughts about how AI will evolve in the metaverse?

Stephenson: That’s one we’re working on with Whenere, which is the product that (emcee) Tadhg (Kelly) just alluded to. We started experimenting with Inworld’s AI technology at the beginning of 2023. We whipped up a demo, a character called Virj from the Snow Crash universe, who we created in Unreal Engine using the Inworld AI platform. We were impressed by it. It was fascinating, which is how we got going on our current project. We’re very much paying attention to that and using those tools in an intensive manner every day. We think there’s huge potential there, which is why we’re doing it.

GamesBeat: You have some more things going on at Whenere, like the Jane Austen novel, this marriage of AI and storytelling.

Stephenson: Like I said, the first thing we tried was this character from Snow Crash. On further reflection, one of my co-founders came up with the idea of instead starting with the world of Pride and Prejudice, for several reasons. One is that we love it, but beyond just that, it’s in the public domain. We don’t have to spend the first year fucking around with lawyers. It’s conversation-based. There’s no starship battles or gunfights or other things that are hard and expensive to bring to life in a game engine. It’s people sitting in rooms talking to each other. We thought it was a good test case to prove the point that we wanted to prove about whether this could be a rewarding and engaging platform.

Whenere is an AI storytelling game, starting out with Jane Austen.

Virk: Does that mean you play as one of the characters in Pride and Prejudice?

Stephenson: We’re kind of hardcore believers in linear narrative. We’re not trying to make a complete open world where you can go in and fundamentally change what happens in the story. People like story worlds for a reason. For example, if you made the world of the Lord of the Rings, you could go into the Green Dragon pub and wait for Frodo to come in and say, “Don’t go through Moria. It’s very dangerous. Go around.” You could say a lot of things to those characters that would screw up the story of the book. The story of the book is what people love. They don’t want to see that change.

We do think people might want to immersively sit in that world and have less consequential interactions with characters in those worlds. As well as be able to write their own stories and see those stories play out in those worlds.

Virk: Could you then allow people to create their own worlds based on their own stories, or is it more that the company is going to curate these worlds?

Stephenson: Building a world–I don’t need to explain to this audience that building a world convincingly is expensive. Someone has to do that. In theory, someone who has the staff and the budget could create any world they want in a game engine. The engine we’re using is Unreal. But we think it would be a lot easier for users if a world is supplied to them with all the pieces there. Then you could make changes to it, but you wouldn’t have to build the entire thing from scratch.


Virk: A lot of people are using AI for writing these days. What is your writing process like, and are you thinking of using AI anywhere in that process?

Stephenson: No. I already know how to write, so I don’t need help on that front. The act of writing is pleasurable to me. Making art is both a form of enjoyment for artists and a way of enhancing their own powers, exercising their own intellect. There’s a quote–this is terrible, but I can’t remember the name of the writer who put this up on Twitter. I quote her and give her credit on my Substack. She says, “I don’t want AI to make art and poetry so I can do the dishes and run the laundry. I want AI to do the dishes and run the laundry so I can make art and poetry.”

Lamina1 content by @m1nal

GamesBeat: The interesting question there is, what if your users ask AI to write something better than Neal Stephenson?

Stephenson: It can try. There are all kinds of ways, seriously, that AI can–for example, the voices we’re using are from ElevenLabs. ElevenLabs is using some kind of AI system where you feed it some text and it figures out how to say that line of dialogue in a way that sounds like an actor. It’s not perfect, but it’s surprisingly good. That’s an example of making a tool powered by AI that gives creators some agency, as opposed to just jerking the steering wheel out of their hands.

GamesBeat: What is the science fiction future that we want?


Stephenson: We in this room?

GamesBeat: We in this room, the game industry, the world…

Stephenson: “We” questions are tricky. People in social media discourse are always using that word. We should do this. We shouldn’t do that. It gets complicated when you start to ask the question, “Who exactly is the ‘We’ we’re talking about?”

GamesBeat: Is there some science fiction that you want?


Stephenson: Talking about big picture social concerns, if that’s where we’re going with this, the two big things that I mostly worry about are carbon and the fact that people can’t agree on what is real. There’s all kinds of hard science fiction you could write about ways to deal with the carbon problem that would be nice if they came true. So far the second problem I mentioned is trickier to work out. I’m not sure if science fiction is ready to tackle that.

Jamil Moledina’s signed copies of Neal Stephenson books.

Virk: A few years ago you announced that you were co-founder of Lamina1. For many people that was like seeing an intersection of science fiction and real-world innovation. Can you give us an update on Lamina1 and what you’re up to there?

Stephenson: For those who aren’t familiar with it, the idea was that when the metaverse suddenly hit that spike in popularity in late 2021, early 2022, we would try to build a system that creators could use to track their contributions to an open, decentralized metaverse, and hopefully make money from them. The thing that was obvious to me, and still is, was that there was going to be a metaverse, by the definition quoted earlier. It would come out in the game industry in the sense that game industry people know how to use the tool chain that’s necessary to build those kinds of experiences. You can’t have millions of people using the metaverse unless there are experiences that millions of people enjoy. It’s the game industry that knows how to deliver that.

The thing I thought might be missing was some way that you could post your contributions to the metaverse, have them attributed to you, and hopefully have revenue flow into your wallet if the thing you made reached an audience and became popular. That’s the founding vision of Lamina1, which is a blockchain. I’m the chairman. For me it’s a couple of hours a week. The CEO and powerhouse behind it is Rebecca Barkin, who is someone I met when we were both at Magic Leap. She’s been working with a terrific engineering team of people who know what they’re doing with crypto and blockchain. In spite of serious headwinds that hit that industry in 2022 and 2023, they’ve managed to keep that going and launch the chain in May. It’s being used. The system works. We’re starting to flex our muscles a bit creatively and get some content up there.

GamesBeat: I thought it was interesting that the different pieces you’re highlighting point to a very similar view of the open metaverse that you see from Tim Sweeney. He doesn’t want it to be controlled by any one party, any big platforms. Is there a meeting of the minds there? Do you have your own views on how the open metaverse should be built?


Stephenson: For the most part Tim and I are more aligned than not. What I hear from him typically has me nodding my head in agreement. He’s still quite cautious and skeptical about blockchain. He thinks it’s an interesting technology that got adopted too soon. It should have spent more time in the lab. I think that’s the gist of what he says in the Matthew Ball interview. He has similar skepticism about AI, about LLMs, based on ethical considerations around the fact that these things are trained–the big models are trained on data with a provenance that isn’t fully nailed down. There’s some controversy about where the data sets came from.

One of the reasons we picked an old book to begin the Whenere project is that the specific training data for the characters in that world is all in the public domain. It’s all 200 years old. But there’s no getting around the fact that the big model that powers the whole thing has data from all over the place. I think Tim has some scruples around that, which I respect. He has a very principled set of rules he likes to follow in picking projects that he wants to advocate and work on.

Virk: You came out with Fall in 2019. That was the same year I came out with my book The Simulation Hypothesis, which is about this idea that we’re already living inside a simulated environment. I’ve often said that the future of the metaverse is going to this point where we’ll be unable to distinguish a virtual world from a physical world. You would be unable to distinguish AI characters from human-controlled avatars or uploaded characters. My question is, do you think we’ll get to that point where video games will be indistinguishable from reality?

Stephenson: They’re certainly getting damn good. I don’t know about indistinguishable. If you want to throw enough processing power at it, you can use metahumans and other features of a modern game engine to make something that’s definitely cinematic quality. Of course you’re still looking at it on a two-dimensional screen.

Beyond that we’re talking far, far out in the future. The thing that got me going on Fall was David Deutsch’s books. The second one is called The Beginning of Infinity. He talks about this problem of simulating reality and what kind of computation power it takes to make increasingly good simulations. I’m going to completely mangle his thesis and dumb it down to something I can work with, which is that to make a simulation that’s as good as the universe, you have to have a computer the size of the universe. If you take that point of view, that’s where I was going. That’s the idea I was playing with in the book you mentioned.


GamesBeat: Will Wright once said that a dog-eared copy of Snow Crash was the business plan for every startup in Silicon Valley. How do you feel about this ability to influence real life?

Neal Stephenson and Dean Takahashi talk about turning science fiction into reality in 2022.

Stephenson: Riz has a connection with the Center for Science and the Imagination, which was actually started to address the thing you’re talking about. It happened probably 15 years ago when I was on a stage like this with Michael Crow, the president of Arizona State. He said, “When are science fiction writers going to stop writing all this dystopian crap and write something that inspires people again?” We actually wrote a book, created an anthology at CSI called Hieroglyph. We were trying to get a bunch of science fiction writers to do that.

It turned out to be surprisingly hard to break people out of the dystopian groove, but I still think it was a worthy experiment. I’m not sure how much of it exerted any influence per se, but from time to time a science fiction book can be somewhat useful in getting a bunch of people in a company roughly pointed in the same direction.

GamesBeat: We know you love history. Your books jump between the future and the past a lot. What’s your view of history as an influence on science fiction?

Stephenson: I think it’s always the case that if you scratch a science fiction writer, you’ll find a history geek. I was reading old anthologies of science fiction stories as a kid, and there were all kinds of historical stories sprinkled in there. They would find ways to send someone back in time or bring a historical character forward in time. That’s been the case forever with science fiction writers. I guess I’m no exception.


Virk: Since you write about the history of the atomic bomb, do you think there are any lessons here for what’s happening with AI today?

Polostan is Neal Stephenson’s newest novel.

Stephenson: I guess the way I would put it is that once they figured out how to control the power of the atom, they went out and started making bombs. We obliterated an atoll from the map of the Pacific Ocean. That’s an impressive demo of the power of the atom. But a lot of people were of a mindset–gee, I kind of like the glow in the dark watch dial so I can tell the time at night. Maybe we should work on radiotherapy to treat certain diseases.

There’s a similar thing happening now with AI. The people making the big systems want to demonstrate the equivalent of blowing up an atoll. That’s all very impressive, but as I was mentioning before, I think the real utility of it is going to be much more focused, fine-grained tools that solve actual problems for people.

GamesBeat: There are lots of interesting projects underway around digital twins. The enterprises of the world are using game engines to make these for things like BMW factories before they build them. Once the digital twin is perfect, they build it in the physical world. These projects are so big that they’re building digital twins of the earth now. Microsoft’s Flight Simulator 2024 is essentially a digital twin of the earth. Nvidia has been working on something called Earth 2 to build a climate model to predict climate change in the decades to come. Are we going to be putting these versions of the earth together to create a metaverse that’s a full digital twin of our planet?

Stephenson: To be pedantic, that’s a different thing from the metaverse. In Snow Crash you also have an application called Earth that’s just a utility that looks like the earth made of cartographic data. A digital twin of the earth is a fascinating and cool project, it’s just a different kind of project from what I think of as the metaverse, which is an imaginary space full of imaginary experiences. But for sure, the ability to simulate climate and geological processes at scale in a digital twin of the earth is something I very much look forward to playing with.


GamesBeat: We know your novel Seveneves is coming to the small screen, with a project in the works at Legendary Pictures. Will we see a Snow Crash film, or a Snow Crash 2? What are some technological elements we could see in a Snow Crash 2?

Stephenson: I’ve written some prequel material in the Snow Crash universe. But nothing that I would consider Snow Crash 2, not a lot of sequel stuff. It’s hard enough to get a movie made of Snow Crash one. Seveneves is at Legendary and they’re starting to work on it as a TV idea. Snow Crash is at Skydance. They’re working on it as one or more feature films. Beyond that I can’t say anything. They’re pretty tight-lipped about announcing what’s going on.

Karen Laur and Neal Stephenson of Whenere.

The funny thing is that if it had happened earlier, it would have sucked. People in 1990 would have said, “Oh, cool, a computer graphics universe. Let’s make the metaverse.” And they would have made it look like computer graphics looked back then. We’d be looking at it now and cringing at the poor quality of the graphics. It would be campy at this point. There was a certain point when various people who’ve come and gone, people who talked about making a Snow Crash movie–they realized that the metaverse that existed in the book had to be full cinematic quality. It wasn’t meant to be distinguishable from film shot with human actors. We dodged a bullet, I think.

Question: This conversation has largely revolved around what you want in the future. What is the future that you think we’re actually going to get?

Stephenson: Obviously it’s been a crazy year for the game industry. There’s some kind of sea change happening. That’s the optimistic take on it. What we’ll see coming from the next generation of game projects may look very different from what we have now. I hope, as I’ve made clear–I think we’re at a threshold now where we have new ways of interacting with game worlds. Game worlds have, for a very long time, been based on what amounts to a point and click interface. You have a cursor on the screen. You get it over something. You click the mouse button or hit a key and something happens. Most commonly you shoot someone.


That’s great fun. I don’t knock it at all. But the thing that was already happening, and was massively accelerated by COVID, is that everyone now has microphones on their computers. They’re in the habit of talking into computers. The ability to interact with a game world by talking and listening, to make a really terrible pun, is a game-changer. That’s going to open up a lot of interesting creative avenues for the industry going forward. We may see other new kinds of interactive schemes available as well, based on the camera looking at the player’s face and so on.

Question: You talked about how AI will not write your stories for you, but you do believe in the tools side. Can you dive deeper into what you get most excited about in terms of AI as it relates to storytelling?

Stephenson: Everyone has their own creative strengths and weaknesses, things they know how to do, that they’re comfortable doing, and other areas where they feel a bit of help would be valuable, especially if it’s taking over something that feels like a chore, that’s not very rewarding to do. I was looking at DaVinci Resolve the other day. A big part of what that program is famous for is color grading, which is an infamously meticulous and detailed process. The people who do it are wizards, amazing contributors to the creative process. In a perfect world you could go out and hire someone who’s great at it, but for a lot of people it’s serious drudgery. You know it’s terribly important, but you don’t know quite how to do it. For everyone who works in creative areas there are things like that, where AI can provide tools that extend the artist’s power without taking away the artist’s prerogatives.



Technology

Cadillac reveals the 2026 Vistiq EV SUV


Cadillac is adding to its fleet of EVs with a new luxury SUV. The Vistiq is a three-row, all-electric SUV that will hit showrooms and dealerships sometime next summer with a starting price of $78,790.

The Vistiq’s dual-motor, all-wheel-drive system produces 615 horsepower and 650 pound-feet of torque and runs on a 102 kWh battery pack with a range of 300 miles. The Vistiq also supports vehicle-to-home (V2H) bidirectional charging: it can charge at home and also deliver electricity to your house during a power outage. Using the feature requires buying the GM Energy V2H bundle, though.

The SUV’s design borrows aesthetically from other Cadillac EVs. Like the Lyriq, it has flush door handles and features similar-looking lights and side panels. It also matches the Lyriq’s 300-mile range. The “swept-back windshield” and “Black Crystal Shield grill” evoke the Escalade IQ.

Of course, the Vistiq’s power and price are different from those of its Cadillac EV siblings. The new Cadillac EV SUV is less expensive than an Escalade IQ ($129,990) but more than a Lyriq ($58,595), and the Escalade IQ has a higher peak battery range at 450 miles.


The Vistiq comes with a 23-speaker AKG7 Studio Audio system with Dolby Atmos. The Android-powered infotainment system is baked into a 33-inch high-resolution LED display. The Verge also reports that the new EV’s navigation system uses Google Maps and can run other apps from the Google Play Store.

Apple CarPlay and Android Auto won’t be available in Cadillac’s newest EV. GM is phasing out Apple CarPlay and Android Auto from its EVs and plans to go with Android Automotive. GM’s executive director of digital cockpit experience, Edward Kummer, said that the carmaker didn’t want any features in its EVs “that are dependent on a person having a cellphone.”


Science & Environment

Study finds great white sharks less likely to attack surfboards with bright lights: “Like an invisibility cloak”



Covering your surfboard in bright lights sounds like an open invitation to great white sharks, but research released Tuesday by Australian scientists found it might actually stave off attacks.

Biologist Laura Ryan said the predator often attacked its prey from underneath, occasionally mistaking a surfer’s silhouette for the outline of a seal.

Ryan and her fellow researchers showed that seal-shaped boards decked with bright horizontal lights were less likely to be attacked by great white sharks.

This appeared to be because the lights distorted the silhouette on the ocean’s surface, making it appear less appetizing.


“There is this longstanding fear of white sharks and part of that fear is that we don’t understand them that well,” said Ryan, from Australia’s Macquarie University.

The study, published in the journal Current Biology, was conducted in the waters of South Africa’s Mossel Bay, a popular great white feeding ground.

Seal-shaped decoys were strung with different configurations of LED lights and towed behind a boat to see which attracted the most attention.

Brighter lights were better at deterring sharks, the research found, while vertical lights were less effective than horizontal.


Macquarie University Professor Nathan Hart, one of the study’s authors, said the lights caused a “complex interaction” with the shark’s behavior.

“It’s like an invisibility cloak but with the exception that we are splitting the object, the visual silhouette, into smaller bits,” Hart said.

The study’s authors released a video showing some of the research in action.



Video: “Lights stop Great White attacks,” new shark research by Macquarie University, on YouTube.

Ryan said the results were better than expected, and she is now in the process of building prototypes for use on the underside of kayaks and surfboards.

Australia has some of the world’s most comprehensive shark management measures, including monitoring drones, shark nets and a tagging system that alerts authorities when a shark is near a crowded beach.

Ryan said her research could allow less invasive mitigation methods to be used.

More research was needed to see if bull and tiger sharks — which have different predatory behavior — responded to the lights in a similar way, the authors said.

Covering your surfboard in bright lights may deter great white shark attacks, according to research released Tuesday by Australian scientists.

Ryan et al. / Current Biology


There have been more than 1,200 shark incidents in Australia since 1791, of which 255 resulted in death, official data shows.

Great white sharks were responsible for 94 of those deaths.

The overall number of deadly shark attacks worldwide in 2023 remained relatively low, but it was still twice the previous year’s total, according to the latest iteration of the International Shark Attack File — a database of global shark attacks run by the University of Florida. 

The report noted that a “disproportionate” amount of people died from shark bites in Australia last year compared with other countries around the world.




Technology

Snowflake’s ‘data agents’ leverage enterprise apps so you don’t have to



Today, data ecosystem giant Snowflake kicked off its BUILD developer conference with the announcement of a special new offering: Snowflake Intelligence.

Set to launch in private preview soon, Snowflake Intelligence is a platform that will help enterprise users set up and deploy dedicated ‘data agents’ to extract relevant business insights from their data, hosted within their data cloud instance and beyond, and then use those insights to take actions across different tools and applications, like Google Workspace and Salesforce.

The move comes as the rise of AI agents continues to be a prominent theme in the enterprise technology landscape, with both nimble startups and large-scale enterprises (like Salesforce) adopting them. It will further strengthen Snowflake’s position in the data domain, leaving the ball in rival Databricks’ court to come back with something bigger. 


However, it is important to note that Snowflake isn’t the very first company to toy with the idea of AI agents for improved data operations.

Other startups, including Redbird, Altimate AI and Connecty AI, are also exploring the idea of agents to help users better manage and extract value (in the form of AI and analytical applications) from their datasets. One key benefit of Snowflake’s approach is that the agent creation and deployment platform will live within the same cloud data warehouse or lakehouse provider, eliminating the need for another tool.

What to expect from Snowflake’s data agents?

Ever since former Neeva CEO Sridhar Ramaswamy took over as Snowflake’s CEO, the company has been integrating AI capabilities on top of its core data platform to help customers take advantage of all their datasets without running into technical complexities.

From the Document AI feature launched last year to help teams extract data from their unstructured documents, to the fully managed open LLM solution Cortex AI, to Snowflake Copilot, an assistant built with Cortex that writes SQL queries from natural language and extracts insights from data, Snowflake has been busy adding such AI features.


However, until now, the AI smarts were only limited to working with the data hosted within users’ respective Snowflake instances, not other sources.

How Snowflake Intelligence data agents work

With the launch of Snowflake Intelligence, the company is expanding these capabilities, giving teams the option to set up enterprise-grade data agents that could tap not only business intelligence data stored in their Snowflake instance, but also structured and unstructured data across siloed third-party tools — such as sales transactions in a database, documents in knowledge bases like SharePoint, information in tools like Slack, Salesforce, and Google Workspace. 

According to the company, the platform, underpinned by Cortex AI’s capabilities, integrates different data systems with a single governance layer and then uses Cortex Analyst and Cortex Search (part of Cortex AI architecture) to deploy agents that accurately retrieve and process specific data assets from both unstructured and structured data sources to provide relevant insights.

Users interact with the agents in natural language, asking business-related questions covering different subjects, while the agents identify the relevant internal and external data sources for those subjects, covering data types like PDFs and tables, and run analysis and summarization jobs to provide answers.


But that’s not all. Once the relevant data is surfaced, the user can ask the data agents to go a step further and take specific actions around the generated insights.

For instance, a user can ask their data agent to enter the surfaced insights into an editable form and upload the file to their Google Drive. The agent would immediately analyze the query, plan and make required API function calls to connect to the relevant tools and execute the task. It can even be used for writing to Snowflake tables and making data modifications.
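To make that flow concrete, here is a minimal, hypothetical sketch of the plan-and-execute pattern described above. None of these names come from Snowflake’s actual API; they only stand in for the steps the company describes: interpret a request, package the surfaced insights, and call the tools needed to act on them.

```python
# Hypothetical plan-and-execute loop for a data agent. These names are
# illustrative stand-ins, not Snowflake's real API.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ToolCall:
    tool: str                  # e.g. "google_drive.upload"
    arguments: Dict[str, str]

def plan(request: str, insights: List[str]) -> List[ToolCall]:
    """Turn a natural-language request plus surfaced insights into tool calls.
    A real agent would delegate this step to an LLM; here it is hard-coded."""
    report = "\n".join(f"- {line}" for line in insights)
    return [
        ToolCall("forms.create", {"title": request, "body": report}),
        ToolCall("google_drive.upload", {"filename": "insights.txt", "content": report}),
    ]

def execute(calls: List[ToolCall], tools: Dict[str, Callable[..., str]]) -> List[str]:
    """Run each planned call against the registered tool integrations."""
    return [tools[call.tool](**call.arguments) for call in calls]

if __name__ == "__main__":
    # Stub tools standing in for real API integrations.
    tools = {
        "forms.create": lambda title, body: f"form '{title}' created",
        "google_drive.upload": lambda filename, content: f"uploaded {filename}",
    }
    insights = ["Q3 revenue up 12%", "Churn concentrated in the EMEA region"]
    print(execute(plan("File the Q3 sales summary", insights), tools))
```

In a production agent, the planning step would be model-driven and the tools would be real API connectors, but the shape of the loop (plan, then execute against registered tools) is the same.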

Snowflake Intelligence data agent in action

We’ve reached out to Snowflake with specific questions about these data agents, including the breadth of data sources they can cover and the tasks they can (or cannot) execute, but had not heard back from the company at the time of writing.

It also remains to be seen how quickly and easily users can create and set up these data agents. For now, the company has said only that it takes a “few steps” to deploy them.

Baris Gultekin, the head of AI at Snowflake, says the unified platform “represents the next step in Snowflake’s AI journey, further enabling teams to easily, and safely, advance their businesses with data-driven insights they can act on to deliver measurable impact.”


No word on widespread availability 

While the idea of agents that can answer questions about business data and then take specific actions on the generated insights to do organizational work sounds very tempting, it is pertinent to note that the capability has only just been announced.

Snowflake has not given a timeline on its availability. It only says that the unified platform will go into private preview very soon.

However, the competition is intensifying fast, including from AI model provider startups such as Anthropic with its new Computer Use mode, giving users more options to choose from when it comes to turning autonomous agents loose on business data and having them complete tasks from a user’s text prompt instructions.

The company also notes that Snowflake Intelligence will be natively integrated with the company’s Horizon Catalog at the foundation level, allowing users to run agents for insights right where they discover, manage and govern their data assets. It will be compatible with both Apache Iceberg and Polaris, the company added. 


Snowflake BUILD runs from November 12 to 15, 2024.



Science & Environment

Jets of liquid bounce off hot surfaces without ever touching them


If you cook with stainless steel pans, you’re probably familiar with the Leidenfrost effect

Franck Celestini

A jet of liquid can bounce off of a hot plate without ever touching it. This extension of the Leidenfrost effect – the phenomenon that allows beads of water to skitter across a scorching pan – could help improve cooling processes, from nuclear reactors to firefighting.

Though first described nearly 300 years ago, the Leidenfrost effect has only been tested with fluid droplets, not squirts of liquid. Until now.


Franck Celestini at Côte d’Azur…




Technology

Particle launches an AI news app to help publishers, instead of just stealing their work


The media industry today may not have a very favorable view of AI — a technology that’s already been used to replace reporters with AI-written copy, while other AI companies have scooped up journalists’ work to feed their chatbots’ data demands, but without returning traffic to the publisher as search engines once did. However, one startup, an AI newsreader called Particle from former Twitter engineers, believes that AI could serve a valuable role in the media industry by helping consumers make sense of the news and dig deeper into stories, while still finding a way to support the publishers’ businesses.

Backed by $4.4 million in seed funding and a $10.9 million Series A led by Lightspeed, Particle was founded last year by the former senior director of Product Management at Twitter, Sara Beykpour, who worked on products like Twitter Blue, Twitter Video, and conversations, and who spearheaded the experimental app, twttr. Her co-founder is a former senior engineer at Twitter and Tesla, Marcel Molina.

From the consumers’ perspective, the core idea behind Particle is to help readers better understand the news with the help of AI technology. More than just summarizing stories into key bullet points for quick catch-ups, Particle offers a variety of clever features that let you approach the news in different ways.


But instead of simply sucking up publishers’ work for its own use, Particle aims to compensate publishers or even drive traffic back to news sites by prominently showcasing and linking to sources directly underneath its AI summaries.

To start, Particle has partnered with specific publishers to host some of their content in the app via their APIs, including outlets like Reuters, AFP, and Fortune. These partners receive better positioning and their links are highlighted in gold above others.


Already, beta tests indicate that readers are clicking through to publishers’ sites because of the app’s design and user interface, though that could shift now that the app is launching beyond news junkies to the general public. In time, the company intends to introduce other ways to work with the media, too, in addition to sending them referral traffic. The team is also having discussions with publishers about providing its users access to paywalled content in a way that makes sense for all parties.

“Having deep partnerships and collaboration is one of the things that we’re really interested in,” notes Beykpour.

To help with its traffic referral efforts, the app’s article section includes big tap targets, making it easy for readers to click through to the publisher’s site. Plus, Particle includes the faces of the journalists on their bylines, and readers can follow through links to publisher profiles to read more of their content or follow them.

Using the app’s built-in AI tools, news consumers can switch between different modes like “Explain Like I’m 5,” which gives a simplified version of a complicated story, or one that summarizes “just the facts” (the five Ws: who, what, when, where, and why). You can have the news summarized in another language besides English, or listen to an audio summary of a story or a personalized selection of stories while on the go. Particle can also pull out important quotes from a story and other links of reference.


But two of the more interesting features involve how Particle leverages AI to help present the news from different angles and allows you to further engage with the story at hand by asking questions.

In Particle, one tool called “Opposite Sides” aims to break users’ filter bubbles by presenting different viewpoints from the same story. This model has been tried before by other news apps, including the startup Brief and SmartNews. Unlike earlier efforts, Particle includes a story spectrum that shows how news is being reported across both “red” and “blue”-leaning sites, with bubbles placed to indicate how far to the left or right the news’ positioning is, and how outsized the coverage may be from one side or the other. The AI will also summarize both sides’ positions, allowing news consumers to reach their own opinions about the matter.


However, the app’s killer feature is an AI chatbot that lets you ask questions and get instant answers about a story. The app will include suggested questions and those asked by others. For example, if you’re reading about Trump’s immigration policy plans, you could ask the chatbot things like “What are the potential legal challenges to Trump’s deportation plans?” or “What are the potential costs of mass deportation?” among other things. Particle will then use its AI technology to find those answers and fact-check them for accuracy.

“The chat function uses OpenAI as well as…our own pre-processing and post-processing,” explains Beykpour, in an interview with TechCrunch. “It uses the content, searches the web a little bit — if it wants to find extra information on the web — to generate those answers.” She says that after the answer is generated, Particle includes an extra step where the AI has to go find the supporting material that matches those answers.
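As a rough illustration of that “answer first, then find supporting material” step, here is a small sketch under the assumption that an answer is drafted by a model and then checked against source passages. The function names and the naive overlap check are invented for this example; Particle’s actual pipeline uses OpenAI models plus its own pre- and post-processing.

```python
# Hypothetical sketch: draft an answer, then look for supporting material.
# The stub functions below are not Particle's real implementation.
from typing import List, Optional, Tuple

def generate_answer(question: str, context: str) -> str:
    """Stand-in for an LLM call that drafts an answer from story content."""
    return f"Draft answer to '{question}' based on the provided coverage."

def find_support(answer: str, sources: List[str]) -> Optional[str]:
    """Return a source passage that overlaps with the drafted answer, if any.
    A naive word-overlap check stands in for real search plus an LLM check."""
    answer_words = set(answer.lower().split())
    best = max(sources, key=lambda s: len(answer_words & set(s.lower().split())), default=None)
    if best and len(answer_words & set(best.lower().split())) >= 3:
        return best
    return None

def answer_with_check(question: str, context: str, sources: List[str]) -> Tuple[str, bool]:
    """Draft an answer and report whether supporting material was found."""
    answer = generate_answer(question, context)
    return answer, find_support(answer, sources) is not None
```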

Overall, the app encompasses tech like OpenAI’s GPT-4o and GPT-4o mini, Anthropic, Cohere, and others, including more traditional AI technologies, which are not LLM-based, from Google.

“We have a processing pipeline that takes related content and summarizes it into bullet points, into a headline, sub-headline, and does all the extractions,” she continues. “Then…we pull out quotes and links and all sorts of relevant information about [the story]. And we have our own algorithms to rank, so that the most important or relevant link is the one that you see first — or what we think is the most important or relevant quote is the one that you see first.”
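As a loose illustration of the kind of extraction-and-ranking pipeline Beykpour describes, here is a small sketch. The function names and the keyword-overlap scoring are invented for this example; Particle’s real system is model-driven and far more sophisticated.

```python
# Hypothetical sketch of an extract-and-rank step for a story page.
# Regex extraction and keyword scoring stand in for Particle's models.
import re
from typing import Dict, List

def extract_quotes(text: str) -> List[str]:
    """Pull quoted passages (20+ characters) out of the combined coverage."""
    return re.findall(r'"([^"]{20,})"', text)

def extract_links(text: str) -> List[str]:
    """Pull URLs referenced by the combined coverage."""
    return re.findall(r"https?://\S+", text)

def rank(items: List[str], story_keywords: List[str]) -> List[str]:
    """Order items so the ones most relevant to the story come first."""
    def score(item: str) -> int:
        return sum(1 for kw in story_keywords if kw.lower() in item.lower())
    return sorted(items, key=score, reverse=True)

def process_story(articles: List[str], keywords: List[str]) -> Dict[str, List[str]]:
    """Combine related articles, then return ranked quotes and links."""
    text = "\n".join(articles)
    return {
        "quotes": rank(extract_quotes(text), keywords),
        "links": rank(extract_links(text), keywords),
    }

if __name__ == "__main__":
    coverage = [
        'One report said "the policy will face immediate court challenges" (https://example.com/analysis).',
        'An economist noted "the costs could run into the tens of billions of dollars".',
    ]
    print(process_story(coverage, ["costs", "policy"]))
```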

The company claims that its technology reduces AI accuracy problems that would otherwise occur one out of every 100 times to a likelihood of one out of 10,000.


Particle will also use human editors as it grows to help better manage the AI content and curate its homepage, she notes.

The app is a free download on iOS for the time being and works across iPhone and iPad.


Technology

VMware Workstation and Fusion are now free for everyone


VMware made its Fusion and Workstation software that creates and manages virtual machines free for personal use earlier this year. Now, the company announced that as of Monday, it’s free for everyone, including commercial customers. Also, the Fusion (for Macs) and Workstation (for Windows and Linux) Pro versions are no longer available for purchase.

Broadcom’s $61 billion acquisition of VMware in 2022 was one of the biggest tech acquisitions ever. Since then, it has bundled the company’s products to “simplify its portfolio” and dropped many existing SKUs. It has already announced an end to offering VMware perpetual licensing for standalone offerings to push enterprises towards its Cloud Foundation or vSphere Foundation subscription products.

However, in a Business Insider report mentioned by Tom’s Hardware, some business customers claimed they’ve seen prices spike following the acquisition as the company focuses on subscriptions and its most lucrative customers to increase annual revenue. One unnamed corporate customer quoted by BI said their prices increased by 175 percent and compared the situation to being “held for ransom” because of the difficulty of switching to something else.

Commercial contracts will remain in effect for businesses, and they will receive the same level of support through the end of their contracts. After that, however, VMware is discontinuing its support ticketing for troubleshooting and says customers should instead rely on the community, documentation, and support articles available online.
