Technology

Amazon reportedly bumped back its AI-powered Alexa to next year


If you’re wondering what happened to Amazon’s new and improved version of its Alexa voice assistant, you’re not alone. Bloomberg reports that the new Alexa is still stuck in development and that Amazon has cut off access to its beta, including the new “Let’s Chat” feature. As a result, a planned late 2024 launch has been pushed back to next year.

The problem seems to be with its large language models (LLMs). The new Alexa is designed to handle more complex requests from users, but it’s also more likely to fail at some of the most basic things the old version could do quite easily, like creating a timer or operating smart lights, according to a follow-up report from Bloomberg.

Amazon originally planned to unveil the new version of Alexa in October, but the timeline has now been extended into next year. (As you might have noticed, October has come and gone.) The company had intended to premiere the next step in Alexa’s evolution on October 17, but it pivoted and used the date to show off its new line of Kindle ereaders instead. Back in August, news had surfaced that the new Alexa would be powered by Anthropic’s Claude and come with a monthly subscription fee.

As ChatGPT began to rise in popularity in the summer of 2023, Amazon CEO Andy Jassy wanted to see whether Alexa could compete if it got an AI upgrade. Jassy reportedly started peppering Alexa with sports questions “like an ESPN reporter at a playoff press conference,” and its answers were “nowhere near perfect.” It even made up a recent game score for Jassy.


Despite this, Alexa passed the “good enough” stage, and Jassy and his fellow executives felt their engineers could build a beta version by early 2024. Unfortunately, Amazon wasn’t able to meet that deadline.

Even with the new deadline, the new Alexa still has a long way to go to fix its problems. Some employees told Bloomberg that, beyond Alexa’s inner workings, the problem lies with Amazon’s overstuffed management and a lack of “a compelling vision for an AI-powered Alexa.”


Technology

Instagram reorganizes message requests for creators


An Instagram update posted earlier today could fix one of the platform’s most frustrating problems for creators. Adam Mosseri, the head of Instagram, announced new filtering options for creators’ inboxes.

Instagram users with a creator-designated account can now filter message requests in their inbox based on the sender, in a way similar to Gmail’s labels. Creators can still sort messages by the most “recent” received and by the sender’s “number of followers,” but they can now also filter out certain messages. The new filters include requests and messages from “verified accounts,” “businesses,” “creators” and “subscribers.”

The update also includes a way to sort all of your story replies. From the top of your inbox, you can sort and filter story replies “in case you just wanna get to these requests really quickly and easily,” Mosseri says.

“Now there’s a lot more to do to improve the inbox for creators and requests but hopefully this is one step in the right direction,” Mosseri adds in his video. He also said this feature was one a lot of creators were asking for, so hopefully Instagram will be adding more inbox tools in the near future to make that part of the app a bit cleaner.


Instagram has been toying with ways to tailor its platform for higher-profile users and creators for a long time now. The company started testing its creator account concept in 2018, which allowed celebrities and other high-profile social media stars to filter their direct messages and track follower stats.


Technology

AI on your smartphone? Hugging Face’s SmolLM2 brings powerful models to the palm of your hand




Hugging Face today released SmolLM2, a new family of compact language models that achieve impressive performance while requiring far fewer computational resources than their larger counterparts.

The new models, released under the Apache 2.0 license, come in three sizes — 135M, 360M and 1.7B parameters — making them suitable for deployment on smartphones and other edge devices where processing power and memory are limited. Most notably, the 1.7B parameter version outperforms Meta’s Llama 1B model on several key benchmarks.

Performance comparison shows SmolLM2-1B outperforming larger rival models on most cognitive benchmarks, with particularly strong results in science reasoning and commonsense tasks. Credit: Hugging Face

Small models pack a powerful punch in AI performance tests

“SmolLM2 demonstrates significant advances over its predecessor, particularly in instruction following, knowledge, reasoning and mathematics,” according to Hugging Face’s model documentation. The largest variant was trained on 11 trillion tokens using a diverse dataset combination including FineWeb-Edu and specialized mathematics and coding datasets.

This development comes at a crucial time when the AI industry is grappling with the computational demands of running large language models (LLMs). While companies like OpenAI and Anthropic push the boundaries with increasingly massive models, there’s growing recognition of the need for efficient, lightweight AI that can run locally on devices.


The push for bigger AI models has left many potential users behind. Running these models requires expensive cloud computing services, which come with their own problems: slow response times, data privacy risks and high costs that small companies and independent developers simply can’t afford. SmolLM2 offers a different approach by bringing powerful AI capabilities directly to personal devices, pointing toward a future where advanced AI tools are within reach of more users and companies, not just tech giants with massive data centers.

A comparison of AI language models shows SmolLM2’s superior efficiency, achieving higher performance scores with fewer parameters than larger rivals like Llama3.2 and Gemma, where the horizontal axis represents the model size and the vertical axis shows accuracy on benchmark tests. Credit: Hugging Face

Edge computing gets a boost as AI moves to mobile devices

SmolLM2’s performance is particularly noteworthy given its size. On the MT-Bench evaluation, which measures chat capabilities, the 1.7B model achieves a score of 6.13, competitive with much larger models. It also shows strong performance on mathematical reasoning tasks, scoring 48.2 on the GSM8K benchmark. These results challenge the conventional wisdom that bigger models are always better, suggesting that careful architecture design and training data curation may be more important than raw parameter count.

The models support a range of applications including text rewriting, summarization and function calling. Their compact size enables deployment in scenarios where privacy, latency or connectivity constraints make cloud-based AI solutions impractical. This could prove particularly valuable in healthcare, financial services and other industries where data privacy is non-negotiable.

Industry experts see this as part of a broader trend toward more efficient AI models. The ability to run sophisticated language models locally on devices could enable new applications in areas like mobile app development, IoT devices, and enterprise solutions where data privacy is paramount.

The race for efficient AI: Smaller models challenge industry giants

However, these smaller models still have limitations. According to Hugging Face’s documentation, they “primarily understand and generate content in English” and may not always produce factually accurate or logically consistent output.


The release of SmolLM2 suggests that the future of AI may not solely belong to increasingly large models, but rather to more efficient architectures that can deliver strong performance with fewer resources. This could have significant implications for democratizing AI access and reducing the environmental impact of AI deployment.

The models are available immediately through Hugging Face’s model hub, with both base and instruction-tuned versions offered for each size variant.
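For readers who want to experiment, the snippet below is a minimal sketch of loading the instruction-tuned 1.7B variant with the transformers library and generating a short completion. The repository id and generation settings are assumptions based on Hugging Face’s usual naming conventions, so verify them on the model hub before running.

```python
# Minimal sketch: run an instruction-tuned SmolLM2 checkpoint locally with
# Hugging Face transformers. The hub id below is an assumption; check the
# model hub for the exact repository name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM2-1.7B-Instruct"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build a chat-style prompt and generate; CPU works, though the smaller 135M
# and 360M variants will be snappier on phones and other edge hardware.
messages = [{"role": "user", "content": "Summarize why small language models matter."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```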



Technology

An Okta login bug bypassed checking passwords on some long usernames


On Friday evening, Okta posted an odd update to its list of security advisories. The latest entry reveals that under specific circumstances, someone could’ve logged in by entering anything for a password, but only if the account’s username had over 52 characters.

According to the note people reported receiving, other requirements to exploit the vulnerability included Okta checking the cache from a previous successful login, and that an organization’s authentication policy didn’t add extra conditions like requiring multi-factor authentication (MFA).

Here are the details that are currently available:

On October 30, 2024, a vulnerability was internally identified in generating the cache key for AD/LDAP DelAuth. The Bcrypt algorithm was…

Continue reading…
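The advisory excerpt cuts off there, but the mechanism it points to is a well-known property of bcrypt: the algorithm only hashes the first 72 bytes of its input. The sketch below is a hypothetical illustration, not Okta’s actual code; it assumes the cache key was derived by hashing a combined userId + username + password string, in which case a long enough username pushes the password past bcrypt’s 72-byte cutoff and the cached key stops depending on it.

```python
# Hypothetical illustration of the bug class (NOT Okta's implementation):
# bcrypt silently ignores everything past the first 72 bytes of its input,
# so a cache key built from userId + username + password no longer depends
# on the password once the prefix alone exceeds 72 bytes.
import bcrypt  # pip install bcrypt

def cache_key(user_id: str, username: str, password: str) -> bytes:
    # Assumed key construction, for illustration only.
    combined = (user_id + username + password).encode("utf-8")
    return bcrypt.hashpw(combined, bcrypt.gensalt())

user_id = "00u0123456789abcdef"  # hypothetical internal user id
username = "a.very.long.username.that.exceeds.52.characters@example.com"

cached = cache_key(user_id, username, "correct-horse-battery-staple")

# A later attempt with a completely wrong password still matches the cached
# key, because the password bytes sit beyond bcrypt's 72-byte limit.
attempt = (user_id + username + "totally-wrong-password").encode("utf-8")
print(bcrypt.checkpw(attempt, cached))  # True
```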


Technology

NYT Strands today — hints, answers and spangram for Saturday, November 2 (game #244)



Strands is the NYT’s latest word game after the likes of Wordle, Spelling Bee and Connections – and it’s great fun. It can be difficult, though, so read on for my Strands hints.

Want more word-based fun? Then check out my Wordle today, NYT Connections today and Quordle today pages for hints and answers for those games.


Technology

What’s new on Apple TV+ this month (November 2024)


Due to its unique model that includes only original content, Apple TV+ tends to have a very slim new release slate. However, just about every Apple TV+ release features A-list talent, and it has set a high bar for quality. Just look at Best Picture winner CODA and Emmy-winning drama Severance (returning in January).

This month is no exception, as there are only four new additions to the library in November. We’ve highlighted the two most anticipated, but don’t overlook Season 2 of the critically acclaimed comedy Bad Sisters or the Malala Yousafzai and Jennifer Lawrence documentary Bread & Roses.

There are only a few new arrivals on Apple TV+ each month, but they’re usually all worth at least a glance. Read on for everything coming to Apple TV+ in November 2024.

Looking for more content? Check out our guides on the best new shows to stream, the best shows on Apple TV+, the best shows on Netflix, and the best shows on Hulu.

Need more suggestions?

Our top picks for November

Everything new on Apple TV+ in November

November 13

November 15

November 22

Last month’s top picks







Technology

Google could add album art to ‘Now Playing’ on Pixel phones


Google may upgrade the “Now Playing” feature by adding much-needed album art to the history page. Now Playing has long been able to identify songs with a high degree of accuracy, but its history list has only included the name of the song and the artist.

Now Playing is constantly operating in the background, but only for music

Introduced way back in 2017 along with the Pixel 2, the Now Playing feature has remained exclusive to the Google Pixel phones. It essentially identifies songs that are playing nearby and works well even on the latest Pixel 9 devices.

Apps like Shazam have been recognizing music for quite some time. However, Now Playing has a few tricks of its own on Pixel phones: it works entirely in the background, so Pixel users don’t even need to pull out their phones.

While working in the background, Now Playing relies on the low-power efficiency cores to continuously analyze audio through the microphone. If it picks up audio that seems like music or a song, Now Playing requests the performance cores to record a few seconds of the audio.


Now Playing then matches the recorded audio against a locally stored database containing tens of thousands of fingerprints of the most popular songs in a particular region. After processing and matching, Now Playing displays the name and artist of the song on the lock screen as well as in a notification.
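To make the idea concrete, here is a toy, purely illustrative sketch of local fingerprint matching; it is not Google’s implementation, but it shows why recognition can happen without any audio leaving the phone: a short clip is reduced to a compact binary fingerprint and compared against an on-device database by Hamming distance.

```python
# Toy sketch of on-device song recognition (illustrative only): reduce a clip
# to a binary fingerprint and find the closest entry in a local database.
import numpy as np

def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    # Number of differing bits between two binary fingerprints.
    return int(np.count_nonzero(a != b))

# Hypothetical local database: song title -> 256-bit binary fingerprint.
rng = np.random.default_rng(seed=0)
database = {f"Song {i}": rng.integers(0, 2, size=256, dtype=np.uint8) for i in range(5)}

# Pretend this fingerprint came from a few seconds of recorded audio,
# with a few bits flipped to simulate microphone noise.
query = database["Song 3"].copy()
query[:10] ^= 1

best_title, _ = min(database.items(), key=lambda kv: hamming_distance(query, kv[1]))
print("Best match:", best_title)  # Best match: Song 3
```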

Needless to say, Now Playing is fairly accurate. However, the list of songs it recognizes contains only the name of the song, the artist, and a timestamp.

Google’s Now Playing feature for Pixel devices may get album art

The songs that Now Playing has recognized are visible under Settings > Sound & vibration > Now Playing. The page lists the history of identified songs in reverse chronological order.

Although there’s an icon next to each song, Google hasn’t attached any album art to the songs Now Playing recognizes. According to Android Authority, this might change in the future.


The hidden system app that downloads the Now Playing database may soon also grab album art. The code change is titled “#AlbumArt Add Now Playing album art downloads to the network usage log”.

Google has yet to assign a dedicated online repository from which Now Playing will download album art for the songs it recognizes. However, Ambient Music Mod, an open-source port of Now Playing by developer Kieron Quinn, already has the feature: the reverse-engineered version essentially replaces the generic music note icon with album art.

