It’s the perennial “cocktail party problem” – standing in a room full of people, drink in hand, trying to hear what your fellow guest is saying.
In fact, human beings are remarkably adept at holding a conversation with one person while filtering out competing voices.
However, perhaps surprisingly, it’s a skill that technology has until recently been unable to replicate.
And that matters when it comes to using audio evidence in court cases. Voices in the background can make it hard to be certain who’s speaking and what’s being said, potentially making recordings useless.
Electrical engineer Keith McElveen, founder and chief technology officer of Wave Sciences, became interested in the problem when he was working for the US government on a war crimes case.
“What we were trying to figure out was who ordered the massacre of civilians. Some of the evidence included recordings with a bunch of voices all talking at once – and that’s when I learned what the ‘cocktail party problem’ was,” he says.
“I had been successful in removing noise like automobile sounds or air conditioners or fans from speech, but when I started trying to remove speech from speech, it turned out not only to be a very difficult problem, it was one of the classic hard problems in acoustics.
“Sounds are bouncing round a room, and it is mathematically horrible to solve.”
The answer, he says, was to use AI to try to pinpoint and screen out all competing sounds based on where they originally came from in a room.
This doesn’t just mean other people who may be speaking – there’s also a significant amount of interference from the way sounds are reflected around a room, with the target speaker’s voice being heard both directly and indirectly.
In a perfect anechoic chamber – one totally free from echoes – one microphone per speaker would be enough to pick up what everyone was saying; but in a real room, the problem requires a microphone for every reflected sound too.
Mr McElveen founded Wave Sciences in 2009, hoping to develop a technology which could separate overlapping voices. Initially the firm used large numbers of microphones in what’s known as array beamforming.
However, feedback from potential commercial partners was that the system required too many microphones for the cost involved to give good results in many situations – and wouldn’t perform at all in many others.
“The common refrain was that if we could come up with a solution that addressed those concerns, they’d be very interested,” says Mr McElveen.
And, he adds: “We knew there had to be a solution, because you can do it with just two ears.”
The company finally solved the problem after 10 years of internally funded research and filed a patent application in September 2019.
What they had come up with was an AI that can analyse how sound bounces around a room before reaching the microphone or ear.
“We catch the sound as it arrives at each microphone, backtrack to figure out where it came from, and then, in essence, we suppress any sound that couldn’t have come from where the person is sitting,” says Mr McElveen.
The effect is comparable in certain respects to when a camera focuses on one subject and blurs out the foreground and background.
“The results don’t sound crystal clear when you can only use a very noisy recording to learn from, but they’re still stunning.”
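Wave Sciences' patented algorithm is not public, but the idea of keeping only sound that could have come from the target's location is the same one behind classical delay-and-sum beamforming, the array technique the firm started with. Below is a minimal, generic textbook sketch (not the company's method): two microphones, a target voice whose inter-microphone delay is known, and an interfering voice arriving with a different delay.

```python
import numpy as np

def delay_and_sum(signals, delays, fs):
    """Shift each microphone channel back by its known arrival delay
    (in seconds) for the target location, then average. Sound matching
    the expected delays adds coherently; sound arriving with other
    delays is partially cancelled."""
    out = np.zeros(signals.shape[1])
    for sig, d in zip(signals, delays):
        out += np.roll(sig, -int(round(d * fs)))
    return out / len(signals)

# Toy scene: a 440 Hz target reaches mic 2 five samples late; a
# 300 Hz interferer reaches mic 1 nine samples late.
fs = 8000
t = np.arange(1024) / fs
target = np.sin(2 * np.pi * 440 * t)
interf = np.sin(2 * np.pi * 300 * t)
mic1 = target + np.roll(interf, 9)
mic2 = np.roll(target, 5) + interf

# Steer towards the target: its two copies align and sum coherently,
# while the interferer's copies end up 14 samples apart, roughly half
# its period at this sample rate, so it largely cancels.
steered = delay_and_sum(np.vstack([mic1, mic2]), [0.0, 5 / fs], fs)
```

In a real room, every reflection behaves like another interferer with its own delay, which is why a naive array would need, as described above, a microphone for every reflected sound.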
The technology had its first real-world forensic use in a US murder case, where the evidence it was able to provide proved central to the convictions.
After two hitmen were arrested for killing a man, the FBI wanted to prove that they’d been hired by a family going through a child custody dispute. The FBI arranged to trick the family into believing that they were being blackmailed for their involvement – and then sat back to see the reaction.
While texts and phone calls were reasonably easy for the FBI to access, in-person meetings in two restaurants were a different matter. But the court authorised the use of Wave Sciences’ algorithm, meaning that the audio went from being inadmissible to a pivotal piece of evidence.
Since then, other government laboratories, including in the UK, have put it through a battery of tests. The company is now marketing the technology to the US military, which has used it to analyse sonar signals.
It could also have applications in hostage negotiations and suicide scenarios, says Mr McElveen, to make sure both sides of a conversation can be heard – not just the negotiator with a megaphone.
Late last year, the company released a software application using its learning algorithm for use by government labs performing audio forensics and acoustic analysis.
Eventually it aims to introduce tailored versions of its product for use in audio recording kit, voice interfaces for cars, smart speakers, augmented and virtual reality, sonar and hearing aid devices.
So, for example, if you speak to your car or smart speaker, it wouldn’t matter how much noise was going on around you – the device would still be able to make out what you were saying.
AI is already being used in other areas of forensics too, according to forensic educator Terri Armenta of the Forensic Science Academy.
“ML [machine learning] models analyse voice patterns to determine the identity of speakers, a process particularly useful in criminal investigations where voice evidence needs to be authenticated,” she says.
“Additionally, AI tools can detect manipulations or alterations in audio recordings, ensuring the integrity of evidence presented in court.”
And AI has been making its way into other aspects of audio analysis too.
Bosch has a technology called SoundSee, which uses audio signal processing algorithms to analyse, for instance, a motor’s sound to predict a malfunction before it happens.
“Traditional audio signal processing capabilities lack the ability to understand sound the way we humans do,” says Dr Samarjit Das, director of research and technology at Bosch USA.
“Audio AI enables deeper understanding and semantic interpretation of the sound of things around us better than ever before – for example, environmental sounds or sound cues emanating from machines.”
More recent tests of the Wave Sciences algorithm have shown that, even with just two microphones, the technology can perform as well as the human ear – better, when more microphones are added.
And they also revealed something else.
“The math in all our tests shows remarkable similarities with human hearing. There are little oddities about what our algorithm can do, and how accurately it can do it, that are astonishingly similar to some of the oddities that exist in human hearing,” says Mr McElveen.
“We suspect that the human brain may be using the same math – that in solving the cocktail party problem, we may have stumbled upon what’s really happening in the brain.”
It’s been quite some time since we heard anything about Netflix’s animated adaptation of Splinter Cell — but the streamer has finally provided some details on the show. The reveal comes in the form of a very brief teaser trailer, which shows a little bit of the show, but mostly showcases Liev Schreiber’s gravelly take on lead character Sam Fisher. We also have a proper name now: it’s called Splinter Cell: Deathwatch.
Horseshoe crabs: Ancient creatures who are a medical marvel – CBS News
Correspondent Conor Knighton visits New Jersey beaches along the Delaware Bay to learn about horseshoe crabs – mysterious creatures that predate dinosaurs – whose very blood has proved vital to keeping humans healthy by helping detect bacterial endotoxins. He talks with environmentalists about the decline in the horseshoe crab population, and with researchers who are pushing the pharmaceutical industry to replace its use of horseshoe crab blood with a synthetic alternative used in medical testing.
Strands is the NYT’s latest word game after the likes of Wordle, Spelling Bee and Connections – and it’s great fun. It can be difficult, though, so read on for my Strands hints.
SPOILER WARNING: Information about NYT Strands today is below, so don’t read on if you don’t want to know the answers.
Your Strands expert
Marc McLaren
NYT Strands today (game #201) – hint #1 – today’s theme
What is the theme of today’s NYT Strands?
• Today’s NYT Strands theme is… A way with words
NYT Strands today (game #201) – hint #2 – clue words
Play any of these words to unlock the in-game hints system.
DIRTY
STRICT
POSE
POSED
DEAN
DOSE
NYT Strands today (game #201) – hint #3 – spangram
What is a hint for today’s spangram?
• A bard’s domain
NYT Strands today (game #201) – hint #4 – spangram position
What are two sides of the board that today’s spangram touches?
First: left, 4th row
Last: right, 5th row
Right, the answers are below, so DO NOT SCROLL ANY FURTHER IF YOU DON’T WANT TO SEE THEM.
NYT Strands today (game #201) – the answers
The answers to today’s Strands, game #201, are…
RHYME
VERSE
METER
STANZA
SYNTAX
DICTION
SCANSION
SPANGRAM: POETRY
My rating: Moderate
My score: 2 hints
I’ve never been a fan of poetry, though I love words and language. Set it to music and it’s a different matter – and I guess the best lyricists are also poets. But ask me to talk about SCANSION and STANZAs and I’m a little lost. All of which is a way of justifying why I needed two hints to complete what for some people will probably be a fairly simple Strands puzzle.
I worked out what the theme was early on, with the clue of ‘A way with words’ and the fact that I found RHYME by accident combining to set me on the right track. But though I spotted a couple more, I couldn’t get them all without needing a helping hand for METER and STANZA. After that I spotted the spangram, and the others were solved pretty much by a combination of guesswork and my modicum of knowledge.
Yesterday’s NYT Strands answers (Thursday 19 September, game #200)
SPIDER
MILLIPEDE
BEETLE
TERMITE
EARWIG
SPANGRAM: CREEPYCRAWLIES
What is NYT Strands?
Strands is the NYT’s new word game, following Wordle and Connections. It’s now out of beta so is a fully fledged member of the NYT’s games stable and can be played on the NYT Games site on desktop or mobile.
I’ve got a full guide to how to play NYT Strands, complete with tips for solving it, so check that out if you’re struggling to beat it each day.
A blockchain entrepreneur, a cinematographer, a polar adventurer and a robotics researcher plan to fly around Earth’s poles aboard a SpaceX Crew Dragon capsule by the end of the year, becoming the first humans to observe the ice caps and extreme polar environments from orbit, SpaceX announced Monday.
The historic flight, which will launch from the Kennedy Space Center in Florida, will be commanded by Chun Wang, a wealthy bitcoin pioneer who founded f2pool and stakefish, “which are among the largest Bitcoin mining pools and Ethereum staking providers,” the crew’s website says.
“Wang aims to use the mission to highlight the crew’s explorational spirit, bring a sense of wonder and curiosity to the larger public and highlight how technology can help push the boundaries of exploration of Earth and through the mission’s research,” SpaceX said on its website.
Wang’s crewmates are Norwegian cinematographer Jannicke Mikkelsen, Australian adventurer Eric Philips and Rabea Rogge, a German robotics researcher. All four have an interest in extreme polar environments and plan to carry out related research and photography from orbit.
The mission, known as “Fram2” in honor of a Norwegian ship used to explore both the Arctic and Antarctic regions, will last three to five days and fly at altitudes between about 265 and 280 miles.
“This looks like a cool & well thought out mission. I wish the @framonauts the best on this epic exploration adventure!” tweeted Jared Isaacman, the billionaire philanthropist who chartered the first private SpaceX mission — Inspiration4 — and who plans to blast off on a second flight — Polaris Dawn — later this month.
The flights “showcase what commercial missions can achieve thanks to @SpaceX’s reusability and NASA’s vision with the commercial crew program,” Isaacman said. “All just small steps towards unlocking the last great frontier.”
Like the Inspiration4 mission before them, Wang and his crewmates will fly in a Crew Dragon equipped with a transparent cupola giving them a picture-window view of Earth below and deep space beyond.
No astronauts or cosmonauts have ever viewed Earth from the vantage point of a polar orbit, one tilted, or inclined, 90 degrees to the equator. Such orbits are favored by spy satellites, weather satellites and commercial photo-reconnaissance satellites because they fly over the entire planet as it rotates beneath them.
The high-inclination record for piloted flight was set in the early 1960s by Soviet Vostok spacecraft launched into orbits inclined 65 degrees. The U.S. record was set by a space shuttle mission launched in 1990 that carried out a classified military mission in an orbit tilted 62 degrees with respect to the equator.
The International Space Station never flies beyond 51.6 degrees north and south latitude. NASA planned to launch a space shuttle on a classified military mission around the poles in 1986, but the flight was canceled in the wake of the Challenger disaster.
“The North and South Poles are invisible to astronauts on the International Space Station, as well as to all previous human spaceflight missions except for the Apollo lunar missions but only from far away,” the Fram2 website says. “This new flight trajectory will unlock new possibilities for human spaceflight.”
SpaceX has launched 13 piloted missions carrying 50 astronauts, cosmonauts and private citizens to orbit in nine NASA flights to the space station, three commercial visits to the lab and the Inspiration4 mission chartered by Isaacman.
Isaacman and three crewmates plan to blast off Aug. 26 on another fully commercial flight, this one featuring the first civilian spacewalks. NASA plans to launch its next Crew Dragon flight to the space station around Sept. 24.
Bill Harwood has been covering the U.S. space program full-time since 1984, first as Cape Canaveral bureau chief for United Press International and now as a consultant for CBS News.
Today we’re launching a totally new, totally different app. Meet Orion.
Orion is a small, fun app that helps you use your iPad as an external HDMI display for any camera, video game console, or even VHS. Just plug in one of the bajillion inexpensive adapters, and Orion handles the rest.
But wait — we’re a camera company. Why an HDMI monitor?
We built this to scratch a few itches. First, in professional cinematography, it’s common to connect an external screen to your camera to get a better view of the action. Orion not only gives you a bigger screen, but you can even share screenshots with your crew with a couple of taps.
We also built this for… pure fun. When traveling with a Nintendo Switch, it’s a delight to play games on a bigger screen, especially alongside friends.
Orion goes a step beyond display. By default, inputs could look fuzzy on an iPad’s Retina display. (Why? The Switch runs at a modest 1080p resolution, and even if it ran at a higher resolution, most adapters on the market can only capture 60 frames per second at 1080p.)
Orion sharpens those low resolution inputs with an AI powered upscaler!
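Orion's actual upscaler is a learned model and isn't public, but the baseline it improves on is plain pixel duplication. A toy nearest-neighbour upscale, purely illustrative and not Orion's code, shows what "stretching" a 1080p frame onto a denser display does without AI:

```python
import numpy as np

def upscale_nearest(frame, factor=2):
    """Nearest-neighbour upscale: every source pixel becomes a
    factor x factor block. A learned upscaler replaces this with a
    model that infers plausible detail instead of duplicating pixels."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A 2x2 test pattern becomes a blocky 4x4 image.
frame = np.array([[0, 255], [255, 0]], dtype=np.uint8)
big = upscale_nearest(frame, 2)
```

The blockiness this produces is exactly the fuzziness described above; an AI upscaler's job is to fill those duplicated blocks with inferred detail instead.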
Another perk is control over the brightness of the image beyond the iPad’s screen brightness. If you’re trying to view video in daylight, crank up brightness to HDR range for extra help. If you’re on a late-night flight and don’t want to bother anyone around you, make things darker than the iPad’s darkest.
OK, I hear you ask, but how much does all of this cost? A camera monitor is hundreds of dollars. Well, Orion is free. Yep, free.
If you want to support the app, get Orion Pro: It packs AI upscaling, CRT emulation for retro games, and image adjustments (and whatever else we cook up). It’s a one-time upgrade for $5. It unlocks everything. No subscriptions.
As for those adapters, we found plenty available for under $20. Now it’s easy to get confused and accidentally buy, say, a USB-C hub with video output, which can’t capture anything. (Ask me how I know.) That’s why we personally tested the top ten adapters on Amazon and made a helpful buying guide with our recommendations and some other accessories, too.
The Story of Orion
This summer, Apple announced a set of awesome new features coming to iPadOS 17, and one of them was external-webcam support on iPad. After digging into the feature for our flagship app, Halide, we weren’t satisfied with the results in a camera app.
However, we did discover that a ton of companies sell tiny, inexpensive adapters that convert HDMI signals into webcams. “What if you could use an iPad as a portable screen?” Hmm! Intriguing. We had an idea, and we got to work.
We wrote the first line of code on August 6th, and we’re shipping on September 20th — 45 days later.
We’re launching at the start of new iPhone season, so we’re already super busy and shifting our focus to our flagship iPhone photography app, Halide. Orion won’t distract us from that, because we’re calling it a b-side.
B-sides are small, fun, focused projects. Apps like Halide need major work every year to keep up with new hardware, but we expect Orion will be “done” after a release or two. We’ll keep maintaining it so it doesn’t break, but we won’t revolve our lives around it. It’s a fun utility, and that’s why we’re only asking for a few bucks.
Beyond being fun to build and design, apps like Orion let us experiment with new developer tools earlier than in our flagship apps. In a mature app used by lots of people, it’s a good idea to wait a year or two before adopting a cutting-edge technology: Apple launched SwiftUI in 2019, but we waited until 2021 to add it to Halide. SwiftUI has been a huge win for certain types of problems, and we couldn’t have built Orion so quickly without it, but by waiting two years we had to play a lot of catch-up in 2021.
So apps like Orion allow us to scratch our own itch, which is how we got into building apps in the first place, and also help us keep up with where iOS is heading.
The Orion Video System Design
You might notice something about the styling of Orion — it’s very stylized in a… retro sort of way.
When we set off to design the app, we really wanted it to be fun. Starting with the basic idea—a portable screen—we thought of the era where televisions and video were still exciting, fresh technology. The techno-utopia of the early 1980s came to mind. We find this a delightful aesthetic.
Pastels, purples and pinks. Detailed technical illustrations and bright colors. Futuristic logos. Type that tracked far too tightly thanks to the invention of the photo-typesetter. And of course, the invention of bitmap typefaces and on-screen user interfaces and icons.
We didn’t want to just lean into the clichés – there are enough vaporwave sunsets with DeLoreans out there trying to seem ’80s – so we went and developed a visual language based on the electronics brochures and VCR interfaces of bygone days, one that conveys ‘modern’ in a way only the 1980s visual vernacular can.
In Halide, we did everything we could to make the app feel as tactile as a real camera. Great cameras are wonderfully tactile — every knob and switch has a weighted, deliberate feel and click to it.
In Orion, we wanted to give you the joy of your own ‘video system’. That meant starting from the beginning: you open the box to unpack it. Because, well, why not.
Instructions follow, so you can get started quickly.
And when not actively in use, you return to a glowing, slightly distorted nostalgic place of on-screen menus, where our custom-made pixel font called Radiant steals the show.
If it wasn’t obvious, we had a lot of fun doing this. And that’s what really mattered to us: if anything, Orion was a project to collaborate with friends on something fun and different.
Thank you
We want to build things with craft, fun and delight. To showcase that apps are an art form, and have no business being boring. We hope you enjoy the result — we know that we loved building it for you. Thanks to you, we get to do what we love.
Orion was a collaboration with friends. Some of the incredible design and typography on display (and our two custom typefaces) are the work of Jelmar Geertsma. Orion was co-engineered with Anton Heestand. The opening music (yes, opening music) is by Cabel Sasser. Extra thanks go to Louie Mantia for bézier wrangling and our families — especially Margo — for supporting us in doing what we love. If you are still reading here, please consider leaving us a review on our apps — it goes a very long way.
The saga of the large invasive Joro spiders that parachute through the air isn’t over. A new study found that the critters with 4-inch-long legs are truly built differently, with hearts that are able to withstand the loud and bustling noises of big cities.
University of Georgia researcher Andy Davis made the discovery while conducting cardiac stress tests on Joro spiders and their cousin, the golden silk spider. The research, published in Physiological Entomology on Monday, found that the species know how to chill out and stay calm when put in heart rate-raising situations.
The Joro spider, also known as Trichonephila clavata, “is known for making webs not only in natural green spaces but also in cities and towns, often on buildings and human dwellings,” the study says. “The stress reactions of Trichonephila spiders could be characterized as ‘even-tempered,’ which may factor into their ability to live in habitats with frequent disturbances.”
Davis and his team evaluated the physiological reactions of Joro spiders and golden silk spiders and compared them to those of another pair of similarly-sized species that are related to each other, garden spiders and banded garden spiders.
Researchers recorded baseline heart rates of the arachnids while they were resting and inactive, and then recorded their heart rates after restraining them under electronic sensors for 10 minutes.
“When subjected to the novel restraint stress, heart rates of all spider species became elevated, which is an expected reaction that other spider researchers have noted,” the study says. “However, there were differences among species in the magnitude of this elevation, and of how the responses progressed during the 10 min period.”
The garden spiders, both of which belong to the Argiope genus, showed “distinct periods of fluctuations during the restraint” and were even found to struggle against the restraints, researchers said. Joro spiders and their golden silk cousins, on the other hand, were “less variable and more even.” They were also observed entering a state of thanatosis for more than an hour after stressors, meaning they essentially froze up during that time.
The tests “are beginning to paint a picture of how the invasive Joro spider and its cousin, the golden silk spider, have a unique way of tolerating novel stressors, which may be the reason for their ability to occupy anthropogenic landscapes,” researchers said, noting that other spider species in their family line could share this trait, although that would need further investigation.
Joro spiders have been making headlines for years as they continue to spread up the East Coast. Originally from Asia, the spiders are believed to have been first introduced to north Georgia around 2010. They have since been found across nearly a dozen other states. In December, Davis told The New York Times that New York is “right in the middle of where they like to be.” It’s been predicted that they could pop up in the New York tri-state area this summer, although no reports of such have been made.
“They seem to be OK with living in a city,” he told the paper, adding that they’ve been seen hanging out on street lamps and telephone poles, where “regular spiders wouldn’t be caught dead in.”
The latest findings may not definitively prove that the spiders’ relaxed demeanor is the reason for “their affinity for urban settings,” the study says, adding that more research is needed. It does, however, bolster Davis’ research from February, which also found that Joro spiders don’t necessarily mind the increased noise and vibrations that come with city living.
“These Joro webs are everywhere in the fall, including right next to busy roads, and the spiders seem to be able to make a living there. For some reason, these spiders seem urban tolerant,” Davis said of his earlier research.
UGA student and co-author of that study, Alexa Schultz, agreed, saying, “It looks like Joro spiders are not going to shy away from building a web under a stoplight or an area where you wouldn’t imagine a spider to be.”
But don’t worry — while the spiders are venomous, they don’t pose a danger to humans, although they may elevate your heart rate more than you elevate theirs.
Li Cohen is a senior social media producer at CBS News. She previously wrote for amNewYork and The Seminole Tribune. She mainly covers climate, environmental and weather news.