Technology

Quest 3S, Orion AR glasses and Meta AI updates

Although Meta Connect 2024 lacked a marquee high-end product for the holiday season, it still included a new budget VR headset and a tease of the “magic glasses” Meta’s XR gurus have been talking about for the better part of a decade. In addition, the company keeps plowing forward with new AI tools for its Ray-Ban glasses and social platforms. Here’s everything the company announced at Meta Connect 2024.

Image: A person wearing Meta’s Orion AR glasses. (Credit: Meta)

Today’s best mixed reality gear — like Apple’s Vision Pro and the Meta Quest 3 — are headsets with passthrough video capabilities. But the tech industry eventually wants to squeeze that tech into something resembling a pair of prescription glasses. We’ll let you judge whether the Orion AR glasses pictured above pass that test, but they’re certainly closer than other full-fledged AR devices we’ve seen.

First, the bad news. These puppies won’t be available this year and don’t have an official release date. A leaked roadmap from last year suggested they’d arrive in 2027. However, Meta said on Wednesday that Orion would launch “in the near future,” so take what you will from that. For its part, Meta says the full-fledged product prototype is “truly representative of something that could ship to consumers” rather than a research device that’s decades away from shipping.

The glasses include tiny projectors that display holograms on the lenses. Meta describes them as having a large field of view and immersive capabilities. Input comes from voice, eye gaze, hand tracking and an electromyography (EMG) wristband.

The glasses combine that sensory input with AI capabilities. Meta gave the example of looking in a refrigerator and asking the onboard AI to spit out a recipe based on your ingredients. It will also support video calls, the ability to send messages on Meta’s platforms and spatial versions of Spotify, YouTube and Pinterest apps.

Image: The Meta Quest 3S VR headset against a white background. (Credit: Meta)

This year’s new VR headset focuses on the entry-level rather than early adopters wanting the latest cutting-edge tech. The Meta Quest 3S is a $300 baby sibling to last year’s Quest 3, shaving money off the higher-end model’s entry fee in exchange for cheaper lenses, a resolution dip and skimpier storage.

The headset includes Fresnel lenses, which are familiar to Quest 2 owners, instead of the higher-end pancake ones in Quest 3. It has a 1,832 x 1,920 resolution (20 pixels per degree), a drop from the 2,064 x 2,208 (25 PPD) in the Quest 3. Meta says the budget model’s field of view is also slightly lower.

The Quest 3S starts with a mere 128GB of storage, which could fill up quickly after installing a few of the platform’s biggest games. But if you’re willing to shell out $400, you can bump that up to a more respectable 256GB. (Alongside the announcement, Meta also dropped the 512GB Quest 3 price to $500 from $650.)

The headset may outlast the Quest 3 in one respect: battery life. Meta estimates the Quest 3S will last 2.5 hours, while the Quest 3 is rated for 2.2 hours.

Those ordering the headset will get a special Bat-bonus. Quest 3S (and Quest 3) orders between now and April 2025 will receive a free copy of Batman: Arkham Shadow, the VR action game coming next month.

The Quest 3S is now available for pre-order. It begins shipping on October 15.

To celebrate the arrival of the Meta Quest 3S, Meta is kicking two older models to the curb. The Quest 2 and Quest Pro will be discontinued by the end of the year. The company says sales will continue until inventory runs out or the end of the year, whichever comes first.

The company now views the Quest 3S, with its much better mixed reality capabilities, as the new budget model, so the $200 Quest 2 no longer has a place. The Quest Pro, which never gained much traction with consumers, has inferior cameras and passthrough video compared with the two Quest 3-tier models. The Pro launched two years ago as a Metaverse-centric device — back when the industry was pounding that word as hard as it’s pushing “AI” now. The headset launched at a whopping $1,500 and was later reduced to $1,000.

Image: Glasses on a table. (Credit: Sam Rutherford for Engadget)

Although the hardware stays the same, Meta is adding new AI features to its tech-filled sunglasses. The Ray-Ban Meta smart glasses will get an updated AI assistant.

The assistant will now let you set reminders based on objects you see. For example, you could say, “Hey Meta, remind me to buy that book next Monday” to set an alert for something you see in the library. The glasses can also scan QR codes and dial phone numbers from text they recognize.

Meta’s assistant should also respond to more natural commands. You’ll need to worry less about remembering formal prompts to trigger it (“Hey Meta, look and tell me”). It will let you use more casual phrasing like “What am I looking at?” The AI can also handle complex follow-up questions for more fluid chats with the robot friend living in your sunglasses.

According to Meta, the glasses’ live translation is also getting better. While last year’s version struggled with longer text, the company says the software will now translate larger chunks more effectively. Live translations will arrive in English, French, Italian and Spanish by the end of 2024.

Image: A phone showing Meta AI, with a surfing goat. (Credit: Meta)

The company said Meta AI now supports voice chats. Although this capability existed before, it was limited to the Ray-Ban glasses.

Meta also partnered with celebrities to help draw customers into its chatbots. That’s right, folks: You can now hear Meta’s chatbot responses in the dulcet tones of the one and only John Cena! Other celebrity voices include Dame Judi Dench, Awkwafina, Keegan-Michael Key and Kristen Bell.

Meta’s AI can now edit photos with text prompts, performing tasks like adding or removing objects or changing details like backgrounds or clothes. AI photo editing will be available on Meta’s social apps, including Instagram, Messenger, and WhatsApp.

Meanwhile, Meta’s Llama 3.2 AI model introduces vision capabilities. It can analyze and describe images, competing with similar features in ChatGPT and Anthropic’s Claude.

Technology

Snowflake’s ‘data agents’ leverage enterprise apps so you don’t have to


Today, data ecosystem giant Snowflake kicked off its BUILD developer conference with the announcement of a special new offering: Snowflake Intelligence.

Set to launch in private preview soon, Snowflake Intelligence is a platform that will help enterprise users set up and deploy dedicated ‘data agents’ to extract relevant business insights from their data, hosted within their data cloud instance and beyond, and then use those insights to take actions across different tools and applications, like Google Workspace and Salesforce.

The move comes as the rise of AI agents continues to be a prominent theme in the enterprise technology landscape, with both nimble startups and large-scale enterprises (like Salesforce) adopting them. It will further strengthen Snowflake’s position in the data domain, leaving the ball in rival Databricks’ court to come back with something bigger. 

However, it is important to note that Snowflake isn’t the very first company to toy with the idea of AI agents for improved data operations.

Other startups, including Redbird, Altimate AI and Connecty AI, are also exploring the idea of agents to help users better manage and extract value (in the form of AI and analytical applications) from their datasets. One key benefit of Snowflake’s approach is that the agent creation and deployment platform will live within the same cloud data warehouse or lakehouse, eliminating the need for another tool.

What to expect from Snowflake’s data agents?

Ever since former Neeva CEO Sridhar Ramaswamy took over as Snowflake’s CEO, the company has been integrating AI capabilities on top of its core data platform to help customers take advantage of all their datasets without running into technical complexities.

From Document AI, a feature launched last year to help teams extract data from unstructured documents, to the fully managed open LLM solution Cortex AI and Snowflake Copilot, an assistant built with Cortex that writes SQL queries from natural language and extracts insights from data, Snowflake has been busy adding AI features.

However, until now, the AI smarts were only limited to working with the data hosted within users’ respective Snowflake instances, not other sources.

How Snowflake Intelligence data agents work

With the launch of Snowflake Intelligence, the company is expanding these capabilities, giving teams the option to set up enterprise-grade data agents that could tap not only business intelligence data stored in their Snowflake instance, but also structured and unstructured data across siloed third-party tools — such as sales transactions in a database, documents in knowledge bases like SharePoint, information in tools like Slack, Salesforce, and Google Workspace. 

According to the company, the platform, underpinned by Cortex AI’s capabilities, integrates different data systems with a single governance layer and then uses Cortex Analyst and Cortex Search (part of Cortex AI architecture) to deploy agents that accurately retrieve and process specific data assets from both unstructured and structured data sources to provide relevant insights.

Users interact with the agents in natural language, asking business questions on different subjects. The agents identify the relevant internal and external data sources for those subjects, covering data types like PDFs and tables, and run analysis and summarization jobs to provide answers.

But that’s not all. Once the relevant data is surfaced, the user can ask the data agents to go a step further and take specific actions around the generated insights.

For instance, a user can ask their data agent to enter the surfaced insights into an editable form and upload the file to their Google Drive. The agent would immediately analyze the query, plan and make required API function calls to connect to the relevant tools and execute the task. It can even be used for writing to Snowflake tables and making data modifications.
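
To make that flow concrete, here is a minimal, hypothetical Python sketch of the pattern; planner, retrievers, summarizer and tools are illustrative stand-ins, not Snowflake Intelligence APIs.

def run_data_agent(question, planner, retrievers, summarizer, tools):
    # Decide which data sources the question needs and which follow-up
    # actions were requested, e.g. {"sources": [...], "actions": [...]}.
    plan = planner(question)

    # Gather structured rows and unstructured passages from each source.
    context = []
    for source in plan["sources"]:
        context.extend(retrievers[source](question))

    # Condense the retrieved context into the insight shown to the user.
    insight = summarizer(question, context)

    # Carry out requested actions, e.g. upload a file or write to a table.
    for action in plan["actions"]:
        tools[action["tool"]](insight, **action.get("args", {}))

    return insight

In Snowflake’s description, Cortex Analyst and Cortex Search fill the retrieval roles, while the action step corresponds to the API function calls the agent plans and executes.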

Image: Snowflake Intelligence data agent in action.

We’ve reached out to Snowflake with specific questions about these data agents, including the breadth of data sources they can cover and the tasks they can (or cannot) execute, but had not heard back from the company at the time of writing.

It also remains to be seen how quickly and easily users can create and set up these data agents. For now, the company has said only that it takes a “few steps” to deploy them.

Baris Gultekin, the head of AI at Snowflake, says the unified platform “represents the next step in Snowflake’s AI journey, further enabling teams to easily, and safely, advance their businesses with data-driven insights they can act on to deliver measurable impact.”

No word on widespread availability 

While the idea of having agents that can answer questions about business data and then take specific actions on the generated insights to do organizational work sounds very tempting, it is worth noting that the capability has only just been announced.

Snowflake has not given a timeline on its availability. It only says that the unified platform will go into private preview very soon.

However, the competition is intensifying fast, including from AI model providers such as Anthropic with its new Computer Use capability, giving users more options for turning autonomous agents loose on business data and completing tasks from text prompt instructions.

The company also notes that Snowflake Intelligence will be natively integrated with the company’s Horizon Catalog at the foundation level, allowing users to run agents for insights right where they discover, manage and govern their data assets. It will be compatible with both Apache Iceberg and Polaris, the company added. 

Snowflake BUILD runs from November 12 to 15, 2024.



Science & Environment

Jets of liquid bounce off hot surfaces without ever touching them



Image: If you cook with stainless steel pans, you’re probably familiar with the Leidenfrost effect. (Credit: Franck Celestini)

A jet of liquid can bounce off of a hot plate without ever touching it. This extension of the Leidenfrost effect – the phenomenon that allows beads of water to skitter across a scorching pan – could help improve cooling processes, from nuclear reactors to firefighting.

Though first described nearly 300 years ago, the Leidenfrost effect has only been tested with fluid droplets, not squirts of liquid. Until now.

Franck Celestini at Côte d’Azur…




Technology

Particle launches an AI news app to help publishers, instead of just stealing their work


The media industry today may not have a very favorable view of AI, a technology that has already been used to replace reporters with AI-written copy, while other AI companies have scooped up journalists’ work to feed their chatbots’ data demands without returning traffic to publishers the way search engines once did. However, one startup, an AI newsreader called Particle built by former Twitter engineers, believes that AI could serve a valuable role in the media industry by helping consumers make sense of the news and dig deeper into stories, while still finding a way to support publishers’ businesses.

Backed by $4.4 million in seed funding and a $10.9 million Series A led by Lightspeed, Particle was founded last year by the former senior director of Product Management at Twitter, Sara Beykpour, who worked on products like Twitter Blue, Twitter Video, and conversations, and who spearheaded the experimental app, twttr. Her co-founder is a former senior engineer at Twitter and Tesla, Marcel Molina.

From the consumers’ perspective, the core idea behind Particle is to help readers better understand the news with the help of AI technology. More than just summarizing stories into key bullet points for quick catch-ups, Particle offers a variety of clever features that let you approach the news in different ways.

Image Credits: Particle

But instead of simply sucking up publishers’ work for its own use, Particle aims to compensate publishers or even drive traffic back to news sites by prominently showcasing and linking to sources directly underneath its AI summaries.

To start, Particle has partnered with specific publishers to host some of their content in the app via their APIs, including outlets like Reuters, AFP, and Fortune. These partners receive better positioning and their links are highlighted in gold above others.

Image Credits: Particle

Already, beta tests indicate that readers are clicking through to publishers’ sites because of the app’s design and user interface, though that could shift now that the app is launching beyond news junkies to the general public. In time, the company intends to introduce other ways to work with the media, too, in addition to sending them referral traffic. The team is also having discussions with publishers about providing its users access to paywalled content in a way that makes sense for all parties.

“Having deep partnerships and collaboration is one of the things that we’re really interested in,” notes Beykpour.

To help with its traffic referral efforts, the app’s article section includes big tap targets, making it easy for readers to click through to the publisher’s site. Plus, Particle includes the journalists’ faces on their bylines, and readers can follow links through to publisher profiles to read more of their content or follow them.

Using the app’s built-in AI tools, news consumers can switch between different modes, like “Explain Like I’m 5,” which offers a simplified version of a complicated story, or “just the facts,” which sticks to the five W’s (who, what, when, where and why). You can have the news summarized in another language besides English, or listen to an audio summary of a story, or a personalized selection of stories, while on the go. Particle can also pull out important quotes from a story as well as relevant reference links.

Image Credits: Particle

But two of the more interesting features involve how Particle leverages AI to help present the news from different angles and allows you to further engage with the story at hand by asking questions.

In Particle, one tool called “Opposite Sides” aims to break users’ filter bubbles by presenting different viewpoints from the same story. This model has been tried before by other news apps, including the startup Brief and SmartNews. Unlike earlier efforts, Particle includes a story spectrum that shows how news is being reported across both “red” and “blue”-leaning sites, with bubbles placed to indicate how far to the left or right the news’ positioning is, and how outsized the coverage may be from one side or the other. The AI will also summarize both sides’ positions, allowing news consumers to reach their own opinions about the matter.

Image Credits: Particle

However, the app’s killer feature is an AI chatbot that lets you ask questions and get instant answers about a story. The app will include suggested questions and those asked by others. For example, if you’re reading about Trump’s immigration policy plans, you could ask the chatbot things like “What are the potential legal challenges to Trump’s deportation plans?” or “What are the potential costs of mass deportation?” among other things. Particle will then use its AI technology to find those answers and fact-check them for accuracy.

“The chat function uses OpenAI as well as…our own pre-processing and post-processing,” explains Beykpour, in an interview with TechCrunch. “It uses the content, searches the web a little bit — if it wants to find extra information on the web — to generate those answers.” She says that after the answer is generated, Particle includes an extra step where the AI has to go find the supporting material that matches those answers.
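
As a rough illustration of that generate-then-verify step (not Particle’s actual code: generate_answer and find_support are hypothetical placeholders standing in for the OpenAI call plus Particle’s own pre- and post-processing and its web/content search, and the claim splitting here is deliberately naive), the flow might look like this:

def answer_with_verification(question, story_text, generate_answer, find_support):
    # Draft an answer from the story content (the real system also adds web
    # search results and its own pre- and post-processing at this stage).
    draft = generate_answer(question, story_text)

    # Verification pass: treat each line of the draft as a claim and look
    # for supporting material; keep only claims that something backs up.
    claims = [line.strip() for line in draft.splitlines() if line.strip()]
    support = {claim: find_support(claim) for claim in claims}
    verified = [claim for claim in claims if support[claim]]

    return {"answer": "\n".join(verified), "sources": support}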

Overall, the app draws on tech like OpenAI’s GPT-4o and GPT-4o mini, models from Anthropic and Cohere, and others, including more traditional AI technologies from Google that are not LLM-based.

“We have a processing pipeline that takes related content and summarizes it into bullet points, into a headline, sub-headline, and does all the extractions,” she continues. “Then…we pull out quotes and links and all sorts of relevant information about [the story]. And we have our own algorithms to rank, so that the most important or relevant link is the one that you see first — or what we think is the most important or relevant quote is the one that you see first.”
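
As a loose sketch of that kind of pipeline (the StoryCard structure and the summarize, extract and relevance callables are invented for illustration and are not Particle’s internals), the summarization and ranking step might look like:

from dataclasses import dataclass, field

@dataclass
class StoryCard:
    headline: str
    subheadline: str
    bullets: list[str] = field(default_factory=list)
    quotes: list[str] = field(default_factory=list)
    links: list[str] = field(default_factory=list)

def build_story_card(related_articles, summarize, extract, relevance):
    # Summarize the cluster of related coverage into headline material.
    headline, subheadline, bullets = summarize(related_articles)

    # Pull out quotes and source links, then rank them so the most
    # relevant item is the one a reader sees first.
    quotes, links = extract(related_articles)
    return StoryCard(
        headline,
        subheadline,
        bullets,
        sorted(quotes, key=relevance, reverse=True),
        sorted(links, key=relevance, reverse=True),
    )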

The company claims its technology cuts the likelihood of AI accuracy problems from roughly one in 100 occurrences to one in 10,000.

Particle will also use human editors as it grows to help better manage the AI content and curate its homepage, she notes.

The app is a free download on iOS for the time being and works across iPhone and iPad.


Technology

VMware Workstation and Fusion are now free for everyone


VMware made Fusion and Workstation, its software for creating and managing virtual machines, free for personal use earlier this year. Now, the company has announced that as of Monday, it’s free for everyone, including commercial customers. Also, the Pro versions of Fusion (for Macs) and Workstation (for Windows and Linux) are no longer available for purchase.

Broadcom’s $61 billion acquisition of VMware in 2022 was one of the biggest tech acquisitions ever. Since then, it has bundled the company’s products to “simplify its portfolio” and dropped many existing SKUs. It has already announced an end to offering VMware perpetual licensing for standalone offerings to push enterprises towards its Cloud Foundation or vSphere Foundation subscription products.

However, in a Business Insider report highlighted by Tom’s Hardware, some business customers claimed they’ve seen prices spike following the acquisition as the company focuses on subscriptions and its most lucrative customers to increase annual revenue. One unnamed corporate customer quoted by BI said their prices increased by 175 percent and compared the situation to being “held for ransom” because of the difficulty of switching to something else.

Commercial contracts will remain in effect for businesses, and they will receive the same level of support through the end of their contracts. After that, however, VMware is discontinuing its support ticketing for troubleshooting and says users should instead rely on the community, documentation and support articles available online.


Technology

Has Pakistan begun the crackdown on “unregistered” VPNs?


Image: A woman’s hands and the flag of Pakistan on a laptop keyboard.

In what would appear to be the beginning of the implementation of VPN-restricting legislation in Pakistan, many residents have reported issues accessing their VPN services. On November 9, 2024, people in Pakistan reported problems using popular VPN apps. After an initial silence, authorities later confirmed to local publications that this wave of restrictions was due to a “brief technical glitch” (VPNs were working as usual again by the end of the day), while reiterating the need for VPN providers to register their services in the country to avoid further disruptions.

The Pakistan Telecommunication Authority (PTA) announced plans to regulate the use of VPNs in August. The new legislation would aim to curb VPN misuse and security risks. Authorities deemed unregistered VPNs a “security risk” for Pakistan as they can be used to access “sensitive data.”

Data from Proton VPN shows a huge spike in usage of the VPN service after other providers began experiencing issues. (Image credit: Proton)

Proton VPN has since confirmed to TechRadar a spike in usage of over 350% above normal traffic following the reported VPN outages and connectivity issues (see graph above).


Technology

This robot can build anything you ask for out of blocks


An AI-assisted robot can listen to spoken commands and assemble 3D objects such as chairs and tables out of reusable building blocks.
