
Tech

Are we ready to place lab experiments in non-human hands?


Stephen D Turner of the University of Virginia explores the importance of governance and oversight around AI in the design and execution of lab experiments.

Artificial intelligence is rapidly learning to autonomously design and run biological experiments, but the systems intended to govern those capabilities are struggling to keep pace.

AI company OpenAI and biotech company Ginkgo Bioworks announced in February 2026 that OpenAI’s flagship model GPT-5 had autonomously designed and run 36,000 biological experiments. It did this through a robotic cloud laboratory, a facility where automated equipment controlled remotely by computers carries out experiments. The AI model proposed study designs, and robots carried them out and fed the data back to the model for the next round. Humans set the goal, and the machines did much of the work in the lab, cutting the cost of producing a desired protein by 40pc.

This is programmable biology: designing biological components on a computer and building them in the physical world, with AI closing the loop.


For decades, biology mostly moved from observation toward understanding. Scientists sequenced the genomes of organisms to catalogue all of their DNA, learning how genes encode the proteins that carry out life’s functions. The invention of tools like CRISPR then allowed scientists to edit that DNA for specific purposes, such as disabling a gene linked to disease. AI is now accelerating a third phase, where computers can both design biological systems and rapidly test them.

The process looks less like traditional benchwork in a lab and more like engineering: design, build, test, learn and repeat. Where a traditional experiment might test a single hypothesis, AI-driven programmable biology explores thousands of design variations in parallel, iterating the way an engineer refines a prototype.
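The loop described above can be sketched as a toy optimizer. Everything here is invented for illustration: the mutation step stands in for an AI designer, the scoring function stands in for a robotic lab assay, and the sequences are arbitrary. A real system would pair a trained protein model with physical experiments.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def propose_variants(parent, n=50, rate=0.1):
    """Toy stand-in for the AI 'design' step: randomly mutate the parent sequence."""
    return ["".join(random.choice(AMINO_ACIDS) if random.random() < rate else c
                    for c in parent)
            for _ in range(n)]

def assay(design, target):
    """Toy stand-in for the robotic 'build and test' step: score against a target."""
    return sum(a == b for a, b in zip(design, target))

def design_build_test_learn(start, target, rounds=20):
    """Repeat the design-build-test-learn cycle, keeping the best performer ('learn')."""
    best = start
    for _ in range(rounds):
        candidates = propose_variants(best) + [best]
        best = max(candidates, key=lambda d: assay(d, target))
    return best

random.seed(0)
target = "MKTAYIAKQR"  # hypothetical goal sequence
start = "A" * len(target)
final = design_build_test_learn(start, target)
print(assay(final, target), "of", len(target), "positions matched")
```

Each pass tests many variants in parallel and feeds the winner back into the next round, which is the basic shape of the closed loop the article describes.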

As a data scientist who studies genomics and biosecurity, I research how AI is reshaping biological research and what safeguards that demands. Current safety measures and regulations have not kept pace with these capabilities, and the gap between what AI can do in biology and what governance systems are prepared to handle is growing.

What AI makes possible

The clearest example of how researchers are using AI to automate research is AI-accelerated protein design.


Proteins are the molecular machines that carry out most functions in living cells. Designing new ones has traditionally required years of trial and error because even small changes to a protein’s sequence can alter its shape and function in unpredictable ways.

Protein language models, which are AI systems trained on millions of natural protein sequences, can quickly predict how mutations will change a protein’s behavior or design new proteins. These AI models are designing potential new drugs and speeding vaccine development.

Paired with automated labs, these models create tight loops of experimentation and revision, testing thousands of variations in days rather than the months or years a human team would need.

Faster protein engineering could mean faster responses to emerging infections and cheaper drugs.


The dual-use problem

Researchers have raised concerns that these same AI tools could be misused, a challenge known as the dual-use problem: technologies developed for beneficial purposes can also be repurposed to cause harm.

For example, researchers have found that AI models integrated with automated labs can optimise how well a virus spreads, even without specialised training. Scientists have developed a risk-scoring tool to evaluate how AI could modify a virus’s capabilities, such as altering which species it infects or helping it evade the immune system.

Current AI models are able to walk users through the technical steps of recovering live viruses from synthetic DNA. Researchers have determined that AI could lower barriers at multiple stages in the process of developing a bioweapon, and that current oversight does not adequately address this risk.

Risk from bio AI

Experienced scientists are already using AI to plan and design biological experiments. The question of whether AI can help people with limited biology training carry out dangerous lab work is the subject of active research.


Two recent studies have reached different conclusions.

A study by AI company Scale AI and biosecurity nonprofit SecureBio found that when people with limited biology experience were given access to large language models, the type of AI behind tools like ChatGPT, they were able to complete biosecurity-related tasks such as troubleshooting complex virology lab protocols with four times greater accuracy. In some areas, these novices outperformed trained experts. Around 90pc of these novices reported little difficulty getting the models to provide risky biological information, such as detailed instructions on working with dangerous pathogens, despite built-in safety filters meant to block such outputs.

In contrast, a study led by Active Site, a research nonprofit that studies the use of AI in synthetic biology, found that AI help did not lead to significant differences in the ability of novices to complete the complex workflow to produce a virus in a biosafety laboratory. However, the AI-assisted group succeeded more often on most tasks and finished some steps faster, most notably on growing cells in the lab.

Hands-on work in the lab has traditionally been a bottleneck to translating designs into results. Even a brilliant study plan still depends on skilled human hands to carry it out. That may not last, as cloud laboratories and robotic automation become cheaper and more accessible, allowing researchers to send AI-generated experimental designs to remote facilities for execution.


Responding to AI-driven biological risks

AI systems are now able to run experiments autonomously and at scale, but existing regulations were not designed for this. Rules governing biological research do not account for AI-driven automation, and rules governing AI do not specifically address its use in biology.

In the US, the Biden administration issued a 2023 executive order on AI security that included biosecurity provisions, but the Trump administration revoked it. Screening the synthetic DNA that commercial providers make, to ensure it cannot be misused to produce pathogens or toxins, remains mostly voluntary. A bipartisan bill introduced in 2026 to mandate DNA screening does not yet address AI-designed sequences that evade current detection methods.

The 1975 Biological Weapons Convention, an international treaty prohibiting the production and use of bioweapons, contains no provisions for AI. The UK AI Security Institute and the US National Security Commission on Emerging Biotechnology have both called for coordinated government action.

The safety evaluations that AI labs run before releasing new models are often opaque and ill-suited to capturing real-world risk. Researchers have estimated that even modest improvements in an AI model’s ability to help plan pathogen-related experiments could translate to thousands of additional deaths from bioterrorism per year. Timelines for when these capabilities cross critical thresholds remain unclear.


The Nuclear Threat Initiative has proposed a managed access framework for biological AI tools, matching who can use a given tool to the risk level of the model rather than blanket restrictions. The RAND Center on AI, Security and Technology outlined a set of actions researchers could take to improve biosecurity, including improved DNA synthesis screening and model evaluations before release. Researchers have also argued that biological data itself needs governance, especially genomic data that could train models with dangerous capabilities.

Some AI companies have started voluntarily imposing their own safety measures. Anthropic activated its highest safety tier when it released its most advanced model in mid-2025. Around the same time, OpenAI updated its Preparedness Framework, revising the thresholds for how much biological risk a model can pose before additional safeguards are required. But these are voluntary, company-specific steps. Anthropic’s CEO, Dario Amodei, wrote that the pace of AI development may soon outrun any single company’s ability to assess the risk of a given model.

When used in a well-controlled setting, AI can help scientists quickly reach their research goals. What happens when the same capabilities operate outside those controls is a question that policy has not yet answered. Overreact, and talent and investment may move elsewhere while the technology continues advancing anyway. Underreact, and the risks of that technology could be exploited to cause real harm.

The Conversation

Stephen D Turner


Stephen D Turner is an associate professor of data science and an assistant dean for research at the University of Virginia School of Data Science. He has worked on biosecurity applications in national security and writes about AI, biosecurity and other topics.





Klipsch OJAS kO-R2 Speaker Debuts at Milan Design Week 2026: Only 600 Pairs, Don’t Expect Them to Last Long


Klipsch is returning to Milan Design Week 2026 with something that goes beyond another product launch; it’s a continuation of one of the more interesting collaborations in modern hi-fi. Following the limited-run kO-R1 in 2024, Klipsch and OJAS have officially unveiled the kO-R2, a new loudspeaker created with Devon Turnbull, the artist and acoustic designer behind OJAS, as part of Klipsch’s 80th anniversary.

That matters more than the usual show-floor debut. The first kO-R1 wasn’t just a speaker, it was a statement about where heritage audio could go when handed to someone outside the traditional engineering echo chamber. Turnbull approached Klipsch’s horn-loaded DNA with a minimalist, almost gallery-first mindset, and the result landed somewhere between serious hi-fi and functional art. It sold out quickly and didn’t need a stack of Audio Science Review graphs to justify itself. Turns out art and musical enjoyment still carry more weight than rigid objectivism.

The kO-R2 builds directly on that foundation. Klipsch and OJAS describe it as a blend of minimalist design, advanced acoustic thinking, and bespoke materials, with an emphasis on form that’s meant to live as comfortably in a design exhibition as it does in a listening room. There are no performance specifications or pricing details yet, which feels intentional. This isn’t being positioned as a spec war product; it’s being framed as a continuation of an idea.

Klipsch OJAS kO-R2

And that’s the real story. At a time when much of the industry is chasing incremental upgrades and feature checklists, Klipsch is doubling down on a collaboration that prioritizes identity, experience, and cultural relevance. Bringing the kO-R2 to Milan Design Week instead of a traditional audio show makes that point clear: this is as much about design language and audience expansion as it is about sound.

Whether the kO-R2 ultimately delivers on the acoustic side will come later. For now, Klipsch and OJAS have done something more difficult; they’ve made people outside the usual audiophile bubble pay attention. 


Unveiled at Milan Design Week 2026

Set against the backdrop of the Fondazione Luigi Rovati, in partnership with USM Modular Furniture and Karimoku, Klipsch and OJAS are hosting curated, appointment-only listening sessions during Milan Design Week through April 26, 2026. Those who get access are encouraged to bring their own music, turning the kO-R2 preview into something more personal than the usual show-floor demo.

After its debut in Milan, a broader launch for the kO-R2 is expected in June 2026.

“Working with Klipsch continues to be an exploration of how we can strip audio down to its most essential, emotional core,” said Devon Turnbull. “With the kO-R2, we focused on creating something that feels immediate and human—where the technology disappears, and the listener is left with a pure, physical connection to the music.”

kO-R2 Design Concept

The kO-R2 is a two-way, sectoral horn-loaded loudspeaker positioned as the next step in the Klipsch x OJAS collaboration. It’s handcrafted in Hope, Arkansas, by the same team behind Klipsch’s legacy designs, and features an OJAS-developed multisectoral horn paired with Baltic birch cabinetry. The goal is clear: deliver the dynamic, low-distortion traits horn systems are known for, while presenting something that looks just as considered as it sounds.

Klipsch OJAS kO-R2 Loudspeaker in Hammertone Silver.

The core of the latest speaker design is the OJAS 1506 multisectoral horn, fabricated from heavy cast aluminum and finished with electrophoresis and a flat black powder coat.

The exponential horn pulls from classic Western Electric and Altec Lansing design cues, but it’s not a straight throwback. The square, isosceles trapezoidal mouth is doing real work here, controlling dispersion in both planes rather than just looking the part. The result should be more even frequency distribution and a wider, more stable listening window, which is exactly what these older horn concepts were chasing in the first place.


The kO-R2 leans into a restrained, material-first design without skimping on the hardware. It uses a high-quality compression driver, anodized aluminum binding posts, and anti-vibration feet—nothing flashy, just components that make sense for a horn-loaded design like this.

Details like the laser-engraved metal ID plate add a layer of exclusivity without turning it into a gimmick, and the five-step high-frequency attenuator is there for a reason: dialing in top-end energy to match the room and placement, which matters more with horns than most speaker types.


Calling it a “museum piece” isn’t entirely off base, but the real goal here isn’t to redefine audiophile expectations. It’s to bridge two worlds that don’t usually overlap this cleanly: serious acoustic design and industrial design that people actually want to live with.

“The kO-R2 represents a powerful intersection of heritage and forward-thinking design. Partnering with Devon allows us to honor Klipsch’s 80-year legacy while pushing into new creative territory—delivering a product that is as culturally relevant as it is acoustically exceptional,” said Vinny Bonacorsi, COO of Klipsch.


The Bottom Line 

This isn’t a typical brand crossover. Klipsch is working within its core strength—horn-loaded design—while Devon Turnbull brings a different perspective on how these systems look and live in real spaces. The kO-R2 builds on the kO-R1 with a larger, more complex horn and a move to a floorstanding design, which should translate into greater scale and output.

There are still no detailed specifications or pricing, but the context matters. The kO-R1 launched at $8,498 per pair and sold out quickly. For the kO-R2, production is expected to be limited to around 600 pairs, so availability is going to be tight from the start.

It’s aimed at a specific buyer: someone who values both the design and the underlying acoustic approach, and who is comfortable buying into the concept without a full data sheet upfront. Between the prior pricing and limited run, this won’t be a mainstream Klipsch product—and that’s the point.

Klipsch OJAS kO-R2 Loudspeaker in Red Oak veneer.

Price & Availability

Once released (expected to be June 2026), 600 pairs of the kO-R2 will be available worldwide in either Red Oak veneer or Hammertone Silver with a powder-coated, matte-black horn. Price has yet to be announced.



Who Owns Carroll Gas Stations?






Two entrepreneurs, Benson Phelps and Carroll Faye, teamed up to open a small coal and wood delivery company in Baltimore in 1907. The business saw success in its early years, expanding rapidly over its first couple of decades. Faye decided to move on to other ventures and sold his stake in the business to Phelps, but the company continued to use Faye’s first name as its brand. The Carroll Independent Fuel Company began selling oil in the 1930s under the guidance of Phelps, and it never stopped. Today, drivers can still buy fuel from the same company, although they’ll now recognize it as Carroll Motor Fuels.

The Carroll network of gas stations might have grown significantly over its century-plus of trading, but its ownership structure has remained consistent. It’s still an independent, family-owned business, with various members of the Phelps family at the helm. John Phelps serves as the company’s CEO and President, while Richard B. Phelps III holds the title of Executive Vice President alongside C. Howard Phelps. Several more Phelps family members hold leadership roles.

Carroll isn’t the only gas station chain that has remained family owned since its inception. The Love’s chain of gas stations is also still owned by members of its founding family, and it has risen to become one of America’s largest privately owned companies.


The Carroll network operates under multiple brands

Alongside its own-brand gas stations, Carroll Independent Fuel also operates stations under various other names. The East Coast chain’s network includes stations that use Sunoco branding, which is most famously associated with the NASCAR Cup Series. Other locations are branded as BP gas stations, with Carroll working with the British-owned oil company since 2006.


In 2012, Carroll Independent Fuel also acquired High’s, a Baltimore-based chain of convenience stores. In an interview with the Baltimore Business Journal, Executive Vice President Howard Phelps said that the company realized that “competition on the gasoline retail side was transitioning to convenience,” and that Carroll wanted “to go toe to toe” with rivals like Sheetz and Wawa.

The Carroll network continues to grow, with the company acquiring seven new sites in 2022. The new locations helped develop its network outside the company’s home state of Maryland, with Delaware, New Jersey, and Pennsylvania all seeing new Carroll-operated locations launched.






Fluidic Contact Lens Treats Glaucoma


We’ve always been interested in fluidic computers, devices that use moving fluids to perform logic operations. Now, Spectrum reports that researchers have developed an electronics-free contact lens that monitors glaucoma and can even help treat it.

The lens is made entirely of polymer and features a microfluidic sensor that can monitor eye pressure in real time. It also has pressure-activated drug reservoirs that dispense medicine when pressure exceeds a fixed threshold. You can see Spectrum’s video on the device below.

This isn’t the first attempt to use a contact lens to treat glaucoma, a disease that affects more than 80 million people. In 2016, Triggerfish took a similar approach, but it used electronic components in the lens, which poses problems for manufacturing and for people wearing them.

Naturally, the device depends on 3D printed molds to create channels and reservoirs in the lens. A special silk sponge in the reservoirs can absorb up to 2,700 times its weight. One sponge holds a red fluid that is forced by pressure into a serpentine microchannel. A phone app uses a neural network to convert the image of the red fluid into a pressure reading.


Two more sponges hold drugs that release at a given pressure determined by the width of the associated microchannel. This allows the possibility of increasing the dose at a higher pressure or even delivering two drugs at different pressure levels.
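As a purely illustrative model (the actual lens is passive and fluidic, with no software in it), the width-determined release thresholds behave like a staged lookup table. The drug names and pressure values below are assumptions chosen only to show the staging.

```python
# Hypothetical opening pressures in mmHg, standing in for channel widths:
# in this toy model, a wider channel opens at a lower pressure.
RESERVOIRS = {
    "drug_A": 22,  # assumed threshold for the wider channel
    "drug_B": 30,  # assumed threshold for the narrower channel
}

def released(pressure_mmhg):
    """Return the reservoirs whose opening pressure the measured pressure meets."""
    return [name for name, threshold in sorted(RESERVOIRS.items(), key=lambda kv: kv[1])
            if pressure_mmhg >= threshold]

print(released(18))  # normal pressure: nothing releases
print(released(25))  # elevated: only the first drug releases
print(released(34))  # severe: both drugs release, i.e. a larger total dose
```

The point of the staging is exactly what the article describes: one threshold can add a second drug, or a higher dose, only when pressure climbs further.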

It is fairly hard to hack your own contact lenses, although we’ve seen it at least once. But smart contacts are not as rare as you might think.




‘Han Solo Wants to Be Me’: Artemis II’s Victor Glover on Flying the Orion


Even if you’re 250,000 miles from Earth, sleep is important. However, for all the life-sustaining accoutrements aboard the Orion spacecraft, the capsule lacked bedrooms, leaving the four-person Artemis II crew with a truly bizarre sleeping arrangement.

“I slept really close to an air conditioning vent. And so I’d wake up and I just see this big hunk of metal,” Glover told CNET during a video call. “And it was like, ‘Oh, I’m in space. I am weightless.’”

Sleep wasn’t just a means for the astronauts to recharge; it also grounded them during their historic journey. Glover explained, “What really resonated with me is we’re also humans. It’s like camping, and this is a very important part of this journey.”



Artemis II was the first crewed mission to the moon in over 50 years. It followed Artemis I, a 2022 uncrewed mission that was the first for NASA’s new Space Launch System rocket and Orion spacecraft. The goal for Artemis II was to have a crew test the spacecraft, life support systems, the SLS rocket and the procedures needed for future lunar missions that will involve landing on the moon and eventually building a base there.

Glover, the Orion’s pilot, along with commander Reid Wiseman and mission specialists Christina Koch and Jeremy Hansen, made up the Artemis II crew. The mission made a lot of history. It’s the first time a woman, a Black man or a Canadian has journeyed to the moon. The four Artemis II astronauts traveled 252,756 miles from Earth, farther than any humans before them, surpassing the record set by the 1970 Apollo 13 mission.


This image of NASA’s Orion spacecraft was taken with a camera mounted on its solar array wings.

NASA

This wasn’t Glover’s first time in space. In 2020, he piloted the Crew Dragon capsule, launched on a Falcon 9 rocket, to and from the International Space Station for NASA’s SpaceX Crew-1 mission, spending over 167 days in space. But Artemis II gave Glover the opportunity to be the first to fly the Orion, a new vehicle designed for Artemis missions. For the majority of the nearly 10-day journey, Orion was on autopilot. But Glover had several opportunities to take manual control of the spacecraft to test its handling.

“It was such a treat and a joy,” Glover said about flying the Orion. “It was a test pilot’s dream to fly a new spaceship for the first time by hand.”


Even after spending time training to fly in a simulator back on Earth, he was surprised by how responsive the Orion’s hand controller was and by the clarity of the cameras, used to maneuver the craft around the Interim Cryogenic Propulsion Stage that holds the fuel for the upper stage of liftoff. He said the view from the cameras and monitors was like “looking out a window.”


Artemis II astronaut and pilot Victor Glover wears an orange flight suit.

NASA

When I asked Glover if he felt like Han Solo when piloting the Orion, he retorted, “Han Solo wants to be me when he grows up!” Throughout my interview, Glover was gracious, passionate and funny.


“I get to do stuff that’s cooler than Han Solo. I mean, just the fact that it’s real, it’s better.”

While landing on the moon wasn’t in the cards for this trip, the Orion crew traveled about 4,000 miles beyond the moon, allowing them to see parts of the moon that had never been seen before. For comparison, Apollo missions flew about 70 miles above the moon to make landings, limiting how much of it they could actually see.
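A rough geometric check shows why that altitude difference matters. For a spherical body of radius R, the fraction of the surface visible from height h is h / (2(R + h)). This is a back-of-the-envelope sketch, not mission data: it assumes a spherical moon with a mean radius of roughly 1,080 miles and treats the 4,000-mile figure as an altitude.

```python
R_MOON = 1079.6  # approximate mean lunar radius in miles

def visible_fraction(h_miles):
    """Fraction of a sphere's surface visible from altitude h: h / (2 * (R + h))."""
    return h_miles / (2 * (R_MOON + h_miles))

f_apollo = visible_fraction(70)     # Apollo-era pass altitude
f_artemis = visible_fraction(4000)  # roughly Artemis II's distance beyond the moon
print(f"{f_apollo:.1%} of the surface vs {f_artemis:.1%}")
```

Under these assumptions, an Apollo-altitude pass sees only a few percent of the lunar surface at once, while a 4,000-mile vantage takes in closer to forty percent.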


Earthset captured through the Orion spacecraft window at 6:41 p.m. EDT, April 6, 2026, during the Artemis II crew’s flyby of the moon.


NASA

The images that Glover and the crew took of the moon were stunning. Shots like the Earthset were a reminder of how beautiful our planet is and our place within the solar system. The astronauts even witnessed a total solar eclipse as they rounded the far side of the moon. But none of the photos they took compares to what they saw, according to Glover.

“I could see the curvature of the moon. Depth is just one aspect that you cannot see in the pictures. But here’s the other thing, the pictures lack scale.”


When Artemis II flew over the terminator, the crew said that this boundary between day and night was “anything but a straight line,” according to NASA.


NASA

For the lunar flyby, the Orion was moving fast: 60,863 mph relative to Earth, but only 3,139 mph relative to the moon, according to NASA. The speed meant the shadows across the surface were constantly morphing into different shapes. Glover was particularly enamored with the moon’s terminator, where the light and dark sides of the moon meet. The terminator isn’t fixed and depends on the moon’s position relative to the sun. As Orion moved, it transformed into various shapes that looked like letters of the alphabet.

“People know, I fell in love with the terminator when I got to see the real one up close. I watched the terminator go from a letter C to a letter D, which means there was a point when the moon was half light, half dark. It was pointing right at me.”


The Artemis II astronauts take a selfie wearing eclipse glasses, using an iPhone 17 Pro Max.


NASA

Artemis II’s lunar flyby was a highlight of the journey for many of us on Earth, in part because we could watch it in real time on streaming services like Netflix. Nearly the entire mission was streamed live on NASA’s website and YouTube channel, making it feel like a reality show. One minute you’re watching the crew eat, work out, take photos of the moon; the next, there’s a random jar of Nutella floating by one of the cameras. I asked Glover whether it felt like he was on a TV show while on the Orion.

“It did not feel like a reality show on my end,” said Glover. “For you to see the science and hear us describing the moon, and to see us flying the spaceship by hand, and to see bedtime and bath time and teeth brush time, that’s what it’s like. The mission was all of those things.”

Glover was ecstatic to hear how I and others felt so connected to the crew during their mission. He said it was important to NASA to let the world in on everything it took to send four people a quarter of a million miles away.

“I think that maybe one of the really, most special things about this mission is how much you were able to see,” Glover said with a smile. “It makes me feel good that you felt like you were there.”




GoPro’s Mission 1 camera series will start at $600


We heard all about GoPro’s new action camera series last week, but the company is now unveiling the pricing across its Mission 1, Mission 1 Pro and Mission 1 Pro ILS cameras. The entry-level Mission 1 ($600) features GoPro’s new 50-megapixel 1-inch sensor, which the company says will offer a major leap in image quality and low-light performance over the Hero 13 line. While largely looking the same as the Hero series (and still waterproof), the Mission 1 can record 8K video at 30fps and 4K at 120fps. It lacks the higher frame rates of the other Mission 1 cameras, but supports 10-bit GP-Log2 color and 32-bit float audio.

The Mission 1 Pro ($700) is the flagship fixed-lens model this year, aimed at the professional (or semi-pro) videographer. It has upgraded frame-rate capture to 8K at 60 fps and 4K at 240 fps, along with an extreme “burst” slow-motion mode that hits 960 fps at 1080p. It also captures 4:3 “Open Gate” recordings at 8K/30fps and 4K/120fps, covering the entire sensor area, enabling more versatile editing and cropping across different screen sizes, including vertical video.

GoPro Mission 1 camera series

Steve Dent for Engadget

Then there’s the beastly Mission 1 Pro ILS (Interchangeable Lens System). It swaps the standard GoPro lens for a Micro Four Thirds (MFT) mount lens. It otherwise shares the same 1-inch sensor and high-speed 8K/60fps video specs as the Pro model. It also matches the Pro model’s $700 price, with an additional $100 discount for GoPro subscribers. However, it won’t be launching until Q3 2026.

All of the Mission 1 Series accessories will be available on a rolling basis beginning May 28, with GoPro’s own wireless mic system (take note, Rode and DJI) priced at $160. If you preorder a Mission 1 or Mission 1 Pro directly from GoPro now, you’ll get the point-and-shoot grip bundled for free. The company still doesn’t have an official release date for the cameras.



These are rumored to be the four iPhone 18 Pro colors


The rumor mill is still churning on the iPhone 18 Pro colors, with a new leak showing what the colors may be.

Four possible colors of iPhone 18 Pro

The iPhone rumor mill has been on a bit of a color kick lately, with multiple rumors claiming to know which colors Apple will use in 2026. For the iPhone 18 Pro, it seems that there could be four colors on the way.

The image shared by Weibo leaker Ice Universe shows what appear to be rear camera plateaus for the iPhone 18 Pro. It is unclear where they were sourced from, but they may be shots gathered from an accessory maker, rather than the actual Apple supply chain.



Flagship Rematch: Ryzen 7 5800X3D vs. Core i9-12900K


Four years on, we revisit the Ryzen 7 5800X3D vs Core i9-12900K with modern games and DDR4 vs DDR5 configs. The result: still neck and neck, but memory choice now makes a real difference.



The first CD recorder was shockingly expensive – guess how much


Before CDs went mainstream, recording one cost a small fortune. Denon made the first CD recorder in 1991.



I Was Cooking Bacon Wrong for Decades, and You Probably Are Too


Stop fighting a losing battle with a grease-spattered stovetop. If you’re buying high-end bacon, you want a perfect crunch without the 20-minute cleanup. The real problem with a frying pan isn’t the taste, though. It’s all that popping and the errant grease spots that mark your skin and kitchen walls. 


In an effort to find the best, cleanest way to make bacon for a Sunday brunch or BLT, I tried several methods, including the stovetop, oven and air fryer.

It turns out I’ve been doing it all wrong. 


A frying pan

  • Cooking time: 10 minutes
  • Hassle: 8/10
  • How much bacon: 7-8 strips

I grew up on pan-fried bacon but my test revealed there’s a better way. 

Mike Mackinven/Getty Images

This is the way I grew up cooking bacon and it’s perfectly fine. There isn’t much skill needed to fry bacon in a pan, although just about every batch I’ve ever made sends a healthy splatter over the stove. In more unfortunate instances, that infernal grease lands directly on my skin or clothes, presenting two distinct but equally aggravating problems.

Pan-fried bacon soaks up a ton of grease, which is why many turn to paper towels to drain it after cooking.  Pan-frying these strips of pork belly also tends to curl them into little bacon balls. While that has no impact on the taste, it can make for a suboptimal presentation.

I can feel the splatter bombs just looking at this photo.

David Watsky/CNET

Another drawback of cooking bacon in the frying pan is its limited capacity. A 10-inch frying pan can hold only about 7 average-sized strips of bacon at a time, although you can add more as they shrink during cooking. 

Then there’s the matter of cleaning said pan after use. It’s not recommended to put most cookware in the dishwasher, so you’ll have to manage that grease-soaked surface yourself.

The oven 

  • Cooking time: 18 minutes
  • Hassle: 6/10
  • How much bacon: 10-12 strips

Oven bacon is best for cooking large batches. 

CNET

While it requires more prep, oven-cooked bacon has clear advantages over pan-frying. For one, there is little concern about capacity, as a standard cookie sheet or baking tray can hold nearly a full package of bacon, making the oven ideal for cooking large quantities.

Using a baking tray and rack allows grease to drip off. That makes for crispier, less greasy results, but it does present a headache when it’s time to clean. Cookie sheets and baking trays don’t fit well in the sink, and there’s typically enough grease that you don’t want to run them through your dishwasher.

You can line the baking tray with aluminum foil, but it takes a lot of foil, and most of the time, bacon grease finds its way under or through it anyway.

Oven-cooked bacon takes longer than bacon cooked in a frying pan — about 18 minutes — but if you’re planning to cook a whole package and don’t want to tend to the stove while it cooks, your oven is the best bet.

The air fryer

  • Cooking time: 7 minutes
  • Hassle: 4/10
  • How much bacon: 6-7 strips

Thanks to its quick cooking time and hassle-free execution, the air fryer is my new go-to for making bacon.

David Watsky/CNET

There’s almost nothing I won’t try to make in the air fryer, but astoundingly, this was my first attempt at bacon. I anticipated a quick cook, because air fryers sizzle most food about 25% faster than a standard oven.

The air fryer proved to be my favorite way to make bacon, with one big caveat (more on that later). My favorite glass-bowl air fryer cooked those strips in about 7 minutes at 375°F — faster than the oven and the frying pan. Because air fryers include a crisping rack, grease naturally drips into the vessel below, so there was no need to nestle it in a paper-towel lasagna. 

The crisping tray drained excess fat while the bacon cooked.

David Watsky/CNET

The bacon turned out perfectly crispy and kept its shape better than when fried in a pan. 

And the mess was minimal. Because the air fryer cooking chamber fits easily in my sink, I was able to wash it in seconds with a sponge and soapy water. My glass-bowl air fryer chamber is also dishwasher-safe, so another option would have been to wipe out the grease and run it all through the dishwasher.

Air fryer bacon is really crispy, y’all.

David Watsky/CNET

The big caveat: Capacity

I use a modest 4-quart air fryer, so I can only fit about six strips in at a time. That’s plenty for my partner and me, but if I were making bacon for a group, I’d have to cook in batches or invest in a larger model.

That said…

Not having to keep watch over a sizzling, splattering pan or wrangle a grease-filled baking tray pulled from the oven is worth running a second batch to feed a group. There’s also no preheating needed, unlike with an oven, and the sheer speed and cleanliness gave the air fryer the edge over the other methods I’ve tried.

Tech

Sky Smart Home vs Ring: how much can you save with Sky’s new smart doorbell bundle?

Sky has mastered all things TV and broadband, and now it’s stepping into the smart home with its latest venture, Sky Smart Home, a service that could challenge rivals such as Ring and Blink.

The Smart Home Plan is Sky’s entry-level package, which unlocks advanced features including cloud storage for recordings, Smart Alerts, Activity Zones and more. There’s also the new Smart Home Plan+, which lets you add extra devices, including the Indoor Camera, Leak Pack or Motion Pack, to take your smart home ecosystem to the next level.

Copyright © 2025