
Tech

Quadratic Gravity Theory Reshapes Quantum View of Big Bang

Researchers at the University of Waterloo say a new “quadratic quantum gravity” framework could explain the universe’s rapid early expansion without adding extra ingredients to Einstein’s theory by hand. The idea is especially notable because it makes testable predictions, including a minimum level of primordial gravitational waves that future experiments may be able to detect. “Even though this model deals with incredibly high energies, it leads to clear predictions that today’s experiments can actually look for,” said Dr. Niayesh Afshordi, professor of physics and astronomy at the University of Waterloo and Perimeter Institute (PI). “That direct link between quantum gravity and real data is rare and exciting.”

Phys.org reports: The research team found that the Big Bang’s rapid early expansion can emerge naturally from this simple, consistent theory of quantum gravity, without adding any extra ingredients. This early burst of expansion, often called inflation, is a central idea in modern cosmology because it explains why the universe looks the way it does today.

Their model also predicts a minimum amount of primordial gravitational waves, which are tiny ripples in spacetime geometry created in the first moments after the Big Bang. These signals may be detectable in upcoming experiments, offering a rare chance to test ideas about the universe’s quantum origins.

[…] The team plans to refine their predictions for upcoming experiments to explore how their framework connects to particle physics and other puzzles about the early universe. Their long-term goal is to strengthen the bridge between quantum gravity and observational cosmology. The research has been published in the journal Physical Review Letters.



AirPods Max 2 review: Familiar features & design, but needs more


AirPods Max 2 finally got an actual update. They’re still excellent, but the added features aren’t really anything new.

AirPods Max 2 review: What’s old is new again

Apple half-heartedly updated the AirPods Max in September of 2024. It was such a meager update that it removed a prior feature — wired lossless — and didn’t get a new name.
Thankfully, Apple at least brought back wired lossless audio via a software update. That update delivered nothing else, though, and it arrived months later merely to restore a feature the Lightning version already had.
Continue Reading on AppleInsider | Discuss on our Forums



Google is now letting users in the US change their Gmail address


Google said on Tuesday that it is now rolling out a way for users in the U.S. to change their Gmail address without starting over or losing access to their data.

Users who have access to this feature can go to their Google Account settings and navigate to Personal info > Email > Google Account email to find a “Change Google Account email” button. Tap the button to start the process of changing your username.

Users will be able to change their username only once every 12 months. Plus, they won’t be able to delete their new email address for that period of time.

The company said users’ old emails will be preserved, and the old email address will serve as an alternate address for the account. Users will be able to sign in to Google services using both the old and the new addresses.

Google had previously rolled out this change in some Hindi-speaking regions, as noted by 9to5Google, which spotted the Hindi support page describing the process for changing the username.

The company’s support page says the feature is rolling out gradually, and users might not immediately have access to it.


Ollama is supercharged by MLX's unified memory use on Apple Silicon


Machine learning researchers using Ollama will enjoy a speed boost to LLM processing, as the open-source tool now uses MLX on Apple Silicon to fully take advantage of unified memory.

Ollama has been boosted by MLX on Apple Silicon

Anyone working with large language models (LLMs) wants results as quickly as possible. There are techniques to do this using multiple Macs, working in a cluster to increase the amount of processing at hand, but a framework made by Apple also provides an extra bit of assistance.
This has been undertaken by the developers working on the open-source model management and execution tool Ollama. In a March 30 update, the team announced that it is previewing a version of the tool for Apple Silicon that takes advantage of MLX.
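Ollama serves models through a local HTTP API (port 11434 by default) regardless of which backend runs the inference, so existing client code should pick up the MLX speedup unchanged. A minimal Python sketch of the `/api/generate` call; the model name is an arbitrary example, and the live request requires a running Ollama server:

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def generate(payload: dict, host: str = "http://localhost:11434") -> str:
    """Send a generation request to a locally running Ollama server."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the full text in the "response" field.
        return json.loads(resp.read())["response"]

# Live call (requires `ollama serve` running and the model pulled):
# print(generate(build_generate_request("llama3.2", "Say hello in five words")))
```

Because the API contract is backend-agnostic, swapping the inference engine underneath should be invisible to callers like this.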
Continue Reading on AppleInsider | Discuss on our Forums



5 Classic Muscle Cars That Make The Pontiac GTO Look Slow

If you were to say the term American muscle car, a Pontiac GTO will certainly spring to a lot of people’s minds, and for good reason. Originally a trim level of the 1964 Pontiac Tempest, the GTO nameplate, which stands for Gran Turismo Omologato (Italian for Grand Touring Homologation), became synonymous with big power in a modest package. Arguably, it started the whole muscle car trend, debuting before giants like the Mustang, Charger, and more. It had the muscle to back it up as well, with later examples boasting either a 400 or 455 cubic inch engine in top trim, with various options such as the famous Ram Air intake, characterized by its hood scoop.

Power figures are impressive for the time, boasting 360 hp and 500 lb-ft torque with the 455 big block, or 370 hp and 445 lb-ft torque with the Ram Air 400 in 1970. But how fast was it, really, in comparison to its peers? It’s hard to say in pure mathematical terms because of the variables; different magazines and journals list varying times, ranging from 14.6 at 99.6 mph to 13.6 at 104.5 mph with the 400 Ram Air and manual, the fastest configuration. The 455 was slower still, dropping down to 15 seconds.

Quite a few cars could certainly hang with the GTO, and more still could exceed it. For this article, we’ll take a look at the original GTO’s fastest year of 1970 and measure it against all muscle cars built up to that point, so nothing post-1970, and no special models like the Super Stock Hursts or Yenkos — these are common production cars only. Let’s kick it off.

1970 Dodge Challenger R/T 440 Six Pack: 13.6 @ 105 mph

Our opening car already matches the GTO’s best recorded time, and beats the 455 by over a second at the line, a massive margin in drag racing terms: 1970 was the debut year of the Dodge Challenger. Made famous by its starring role in the hit movie “Vanishing Point,” the 1970 Dodge Challenger, in this case a 440 Six Pack-equipped R/T trim, is one of the most iconic muscle cars ever made, though its status is somewhat deceptive; Challengers are actually pony cars.

Pony cars are smaller vehicles, in this case built on the Chrysler E-body, a crucial point when talking about power-to-weight ratio. This 1970 Dodge Challenger R/T houses the same engine as the midsize and full-size muscle cars, but with less mass to move: the ’69 Charger R/T 440 weighed 3,900 pounds, whereas the ’70 Challenger comes in at 3,395 pounds. With less weight and a smaller profile to move through the air, the Challenger will naturally be the faster of the two body styles, and certainly as fast or faster than the GTO.

The 440 Six Pack does all the heavy lifting here, of course, boasting 375 hp and 480 lb-ft torque. Much like the 455, these were large, powerful engines designed for cruising; Winnebago motorhomes used these engines, for example, albeit with different accessories and tunes. When you take that engine, give it three carbs and some performance upgrades, and shove it into a car the size of a Challenger, it’s no wonder they exceed the GTO’s figures.

1970 Ford Mustang BOSS 429: 13.6 @ 114 mph

Here’s another car with a bit of a spotty drag racing record; Motor Trend actually tested their own ’69 Boss 429 and got a blistering 12.3-second quarter mile time at 112 mph. For these purposes, let’s use the worst-case scenario — a 1970 model year, same engine, running a 13.6 at 114. And that shouldn’t be all too surprising, because here we have an example of another smaller car with a massive engine shoved under the hood.

Originally, the Mustang didn’t even have a big block at all; the Mustang is one of the progenitors of the term pony car — smaller, more nimble cars with small block V8s like the 289 Ford or 350 Chevy in the Camaro. That changed in 1967, when Ford introduced the 390 option for the Mustang. The company continuously experimented with the design over the next couple of years, and while the 1970 car shares the same basic architecture as the original, their bodies are very different — as are their engines.

The 429 cubic-inch big block is, ostensibly, a racing engine. In fact, the Boss 429 itself was designed to compete in NASCAR. A little-known fact is that the 429 actually uses a hemispherical combustion chamber, like the legendary 426 HEMI engine from Mopar; this configuration allows for extremely efficient combustion processes, especially at rev ranges expected in racing, making this engine particularly well-suited to high-speed runs. It’s believed that Ford underrated the engine at 375 hp and 450 lb-ft torque, which — coupled with the Mustang’s slim profile — made for an extremely potent muscle car.

1970 Buick GS / GSX Stage 1: 13.38 @ 105.5 mph

With 360 hp and a whopping 510 lb-ft torque, the 1970 Buick GS Stage 1 rips up a quarter-mile track in 13.38 seconds, according to Motor Trend’s January 1970 issue. The record here is a bit muddled, however; the same car was also tested by Hot Rod Magazine in November 1969, reaching the finish line after 14.40 seconds at 96 mph, albeit with the automatic. Since the manual is faster for both the GTO and the GS Stage 1, we’ll use those times instead for consistency.

Most people probably don’t say Buick and high-performance in the same breath anymore, but that wasn’t true in 1970. In fact, the GS Stage 1 was one of the fastest muscle cars on the market — and much like the previous entry, the GS and the GSX special edition were midsize cars, built on the same A-body platform as the GTO, Chevelle, Olds 4-4-2, and so on. The fastest Olds 4-4-2, a 1966 model with the W-30 and a manual, accomplished a 13.8-second time, making it about on par with the GTO. That makes the Buick GS the second-fastest GM A-body here in the quarter-mile.

The engine used by the GS Stage 1 was the 455 cubic inch (7.4-liter) big block, among the most powerful Buick engines ever produced, and it was also Buick’s biggest ever V8 fitted to a production car. While it doesn’t have the same power rating as some others on this list, that engine more than makes up for it in raw torque, especially with the close-ratio Muncie 4-speed it was often paired with.

1970 Chevrolet Chevelle SS 454: 13.12 @ 107.01 mph

Arguably the first true mid-size car on this list, the 1970 Chevrolet Chevelle SS 454 is yet another iconic muscle car, wearing its classic Le Mans racing stripes and SS badging. Moreover, the LS6 engine option code bumped up power to 450 horsepower and 500 lb-ft torque, more power than anything else on this list, at least in terms of factory ratings. This produced rapid times frequently teasing the low-13 second mark, with Hot Rod attaining a respectable 13.44 @ 108.17 mph in their best run, for instance.

The 1970 Chevelle SS came in several different variants, each with their own power and top speed figures, ranging from the entry-level L34-code 396 ci unit with 350 hp, up to the infamous LS6. LS6-powered Chevelles are sometimes referred to today as the king of muscle cars, directly competing with the likes of the infamous 426 HEMI, the 428 Cobra Jet, and more. Much like the 429, the LS6 was a bespoke high-performance engine, sporting an 11.25:1 compression ratio, aggressive solid-lifter camshaft, aluminum pistons, and more, topped off with a thirsty 800 cfm (cubic feet per minute) Holley carburetor. All that runs through a Muncie M-22 Rock Crusher transmission.

In short, the LS6-powered Chevelle is the 1970 equivalent of a supercar today, though it’s decidedly less refined than one. According to Motor Trend, the transmission is noisy and unrefined, the engine unhappy on unleaded gasoline due to its high compression ratio, and it’s almost impossible to drive hard without spinning tires if you’re running regular street rubber. It’s decidedly specialized for one purpose — going fast, and it does that very well, indeed.

1970 Plymouth Barracuda 426 Hemi: 13.10 @ 107.1 mph

It should come as no surprise that the top spot is secured by a Hemi, an engine that needs no introduction to drag racing enthusiasts. In truth, the infamous Elephant Block could likely accommodate several spots on this list, but the fastest among them, at least according to Car Craft magazine, is the 1970 Plymouth Hemi ‘Cuda. Much like Ford’s 429, the 426 Street Hemi is widely rumored to have had a significantly underreported horsepower rating throughout its production run — an impressive 425 hp and 490 lb-ft torque, so says Chrysler.

This was a massive, racing-oriented engine that just barely fit in a lot of these cars; getting hemi heads on the block required a lot of real estate, one reason why you don’t see them too often. The option itself cost an eye-watering $900, or over $7,500 today; you essentially bought a third of the car over again at the dealership. But what you get is, for all intents and purposes, the closest thing to a factory-built racecar without crossing the line into specialist vehicles. The E-body Barracuda was built with this in mind, serving as an early halo car alongside its sister, the Dodge Challenger.

To put it into perspective, the already (supposedly) underrated 426 Hemi can launch the infamous Hurst Hemi ’68 Barracuda deep into 10-second times at over 120 mph. That same engine, albeit tuned for street use, propels the 1970 Hemi ‘Cuda over a second and a half faster than the GTO down the strip. With its light weight and massive performance, it’s simply no contest for the Pontiac at this stage.



Facial Recognition Is Spreading Everywhere


Facial recognition technology (FRT) dates back 60 years. Just over a decade ago, deep-learning methods tipped the technology into more useful—and menacing—territory. Now, retailers, your neighbors, and law enforcement are all storing your face and building up a fragmentary photo album of your life.

Yet the story those photos can tell inevitably has errors. FRT makers, like those of any diagnostic technology, must balance two types of errors: false positives and false negatives. There are three possible outcomes.

Three Possible Outcomes

a) identifies the suspect, since the two images are of the same person, according to the software. Success!

b) matches another person in the footage with the suspect’s probe image. A false positive, coupled with sloppy verification, could put the wrong person behind bars and let the real criminal escape justice.

c) fails to find a match at all. The suspect may be evading cameras, but if cameras just have low-light or bad-angle images, this creates a false negative. This type of error might let a suspect off and raise the cost of the manhunt.

In best-case scenarios—such as comparing someone’s passport photo to a photo taken by a border agent—false-negative rates are around two in 1,000 and false positives are less than one in 1 million.

In the rare event you’re one of those false negatives, a border agent might ask you to show your passport and take a second look at your face. But as people ask more of the technology, more ambitious applications could lead to more catastrophic errors. Let’s say that police are searching for a suspect, and they’re comparing an image taken with a security camera with a previous “mug shot” of the suspect.

Training-data composition, differences in how sensors detect faces, and intrinsic differences between groups, such as age, all affect an algorithm’s performance. The United Kingdom estimated that its FRT exposed some groups, such as women and darker-skinned people, to risks of misidentification as high as two orders of magnitude greater than it did to others.

Less clear photographs are harder for FRT to process. iStock

What happens with photos of people who aren’t cooperating, or vendors that train algorithms on biased datasets, or field agents who demand a swift match from a huge dataset? Here, things get murky.

Facial Recognition Gone Wrong

THE NEGATIVES OF FALSE POSITIVES

2020: Robert Williams’s wrongful arrest cost him detention. The ensuing settlement requires Detroit police to enact policies that recognize FRT’s limits. iStock

ALGORITHMIC BIAS

2023: Court bans Rite Aid from using facial recognition for five years over its use of a racially biased algorithm. iStock

TOO FAST, TOO FURIOUS?

2026: U.S. immigration agents misidentify a woman they’d detained as two different women. VICTOR J. BLUE/BLOOMBERG/GETTY IMAGES

Consider a busy trade fair using FRT to check attendees against a database, or gallery, of images of the 10,000 registrants, for example. Even at 99.9 percent accuracy you’ll get about a dozen false positives or negatives, which may be worth the trade-off to the fair organizers. But if police start using something like that across a city of 1 million people, the number of potential victims of mistaken identity rises, as do the stakes.
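The scaling behind these scenarios is simple expected-value arithmetic. A minimal sketch — the function name is mine, the 99.9 percent figure is the article’s, and the model assumes every comparison fails independently, a simplification real systems violate because errors correlate with image quality and demographics:

```python
def expected_misidentifications(n_comparisons: int, error_rate: float) -> float:
    """Expected number of errors when each of n_comparisons
    independently fails at error_rate."""
    return n_comparisons * error_rate

# Trade fair: 10,000 registrants at 99.9% accuracy -> about a dozen errors.
print(expected_misidentifications(10_000, 1 - 0.999))

# The same error rate across a city of 1,000,000 -> roughly 1,000 errors.
print(expected_misidentifications(1_000_000, 1 - 0.999))
```

The point is that accuracy alone is meaningless without the gallery size: a fixed per-comparison error rate produces a linearly growing pile of mistaken identities as the population scales.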

What if we ask FRT to tell us if the government has ever recorded and stored an image of a given person? That’s what U.S. Immigration and Customs Enforcement agents have done since June 2025, using the Mobile Fortify app. The agency conducted more than 100,000 FRT searches in the first six months. The size of the potential gallery is at least 1.2 billion images.

At that size, assuming even best-case images, the system is likely to return around 1 million false matches, but at a rate at least 10 times as high for darker-skinned people, depending on the subgroup.

Responsible use of this powerful technology would involve independent identity checks, multiple sources of data, and a clear understanding of the error thresholds, says computer scientist Erik Learned-Miller of the University of Massachusetts Amherst: “The care we take in deploying such systems should be proportional to the stakes.”



How to back up your iPhone & iPad to your Mac before something goes wrong


Backing up your iPhone or iPad to your Mac is the fastest and most reliable way to protect your data, and is especially useful before updates, repairs, or device replacement.

How to back up your iPhone and iPad to Mac

Backing up your iPhone or iPad to your Mac remains the fastest and most complete way to protect your data before updates, repairs, or hardware changes. Apple built local backup support directly into macOS through Finder, allowing full-device backups without relying on an internet connection.
Local backups are like full system snapshots, saving your device settings, messages, app data, and media stored on your device. Backing up to iCloud does save your data, but restoring from a Mac is faster than restoring from iCloud because the data transfers directly over USB.
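Finder writes these local backups to a standard location on disk — on modern macOS that default is `~/Library/Application Support/MobileSync/Backup`, with one folder per device. A small Python sketch (the helper name is mine) for checking which backups you have and how fresh they are:

```python
from datetime import datetime
from pathlib import Path

# Default location Finder uses for local device backups on macOS.
DEFAULT_BACKUP_DIR = Path.home() / "Library/Application Support/MobileSync/Backup"

def list_local_backups(backup_dir: Path = DEFAULT_BACKUP_DIR) -> list[tuple[str, datetime]]:
    """Return (folder name, last-modified time) for each device backup folder."""
    if not backup_dir.exists():
        return []
    return sorted(
        (d.name, datetime.fromtimestamp(d.stat().st_mtime))
        for d in backup_dir.iterdir()
        if d.is_dir()
    )

for name, modified in list_local_backups():
    print(f"{name}  (last backed up {modified:%Y-%m-%d %H:%M})")
```

Checking the modification dates before an update or repair is a quick way to confirm the backup you are counting on is actually recent.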
Continue Reading on AppleInsider | Discuss on our Forums



EE TV is using AI to help you find something to watch


EE is taking aim at one of streaming’s biggest annoyances: endlessly scrolling for something to watch.

The company has launched Smart Search, a new AI-powered feature on EE TV. It lets users find content simply by describing what they’re in the mood for.

Instead of typing exact titles, Smart Search understands more natural queries like “a funny detective show” or even a quote from a scene. It then pulls results from across live TV, on-demand services, and integrated streaming apps, presenting everything in one place.

The idea is simple: less app-hopping, more watching.
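EE hasn’t published how Smart Search works, but features like this generally map the query and each title’s metadata into a common representation and rank by similarity. A deliberately tiny bag-of-words illustration of the ranking idea — the catalog and its descriptions are invented for this sketch:

```python
from collections import Counter
from math import sqrt

# Hypothetical catalog: title -> free-text description metadata.
CATALOG = {
    "Brooklyn Nine-Nine": "funny detective comedy show police precinct",
    "Miss Fisher's Murder Mysteries": "detective mystery period drama",
    "Planet Earth": "nature documentary wildlife",
}

def _vector(text: str) -> Counter:
    """Word-count vector for a piece of text."""
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(count * b[word] for word, count in a.items())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def smart_search(query: str, catalog: dict = CATALOG) -> list[str]:
    """Rank titles by similarity between the query and each description."""
    q = _vector(query)
    return sorted(catalog, key=lambda title: _cosine(q, _vector(catalog[title])), reverse=True)

print(smart_search("a funny detective show"))
```

A production system would use learned embeddings rather than raw word counts, which is what lets a query like a quoted line of dialogue match a show without sharing any keywords with its metadata.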

Alongside it, EE is introducing Mood Matcher, another AI-driven tool designed to tackle the “what should we watch?” problem. Users answer a few quick prompts about mood, genre or themes. The system then serves up tailored recommendations. This is something EE says is particularly useful when multiple people are trying to agree on what to watch.

The launch leans heavily on a very real problem. EE’s own research suggests 41% of viewers struggle to discover new content, while 45% fall back on rewatching shows just to avoid the effort of choosing. Perhaps more tellingly, 38% say deciding what to watch causes household tension, which probably sounds familiar.

EE TV Box Pro design
Image Credit (Trusted Reviews)

There’s also the issue of fragmentation. With 42% of users relying on their phones to find content and 61% wanting a more unified viewing experience, EE is positioning Smart Search as a way to bring everything together into a single interface.

That broader shift, making discovery as important as the content itself, is becoming a key battleground for TV platforms. As analyst Paolo Pescatore puts it, speed and simplicity are now just as critical as having a deep catalogue.

Smart Search and Mood Matcher are available now through the EE TV app on compatible devices. A wider rollout is planned for EE TV Pro and EE TV Box Edge hardware in the near future.

For EE, the pitch is clear: stop searching like a database, and start searching like a human.



What Are The Biggest Limitations Of Supercomputers?

Supercomputers are built to solve very large, difficult problems and do it quickly. Instead of relying on a single processor, supercomputers like El Capitan at Lawrence Livermore National Laboratory and Frontier at Oak Ridge National Laboratory use a large number of processors working together simultaneously. That makes them especially useful for jobs like climate modeling, genetic research, nuclear simulations, artificial intelligence, and identifying flaws in jet engine design.

We’re not talking about quantum computers here, though. A supercomputer is still a classical computer: it uses ordinary bits, which are either 0 or 1, and it solves problems by doing massive numbers of conventional calculations very quickly. A quantum computer works differently by using quantum bits, or qubits. Quantum computing is still largely in the experimental and early developmental stage. Right now, the real work is being done by classical supercomputers, helping scientists explore problems that would take ordinary computers far too long to solve. Today’s fastest machines, including El Capitan and Frontier, are exascale systems that can perform more than a quintillion calculations per second.

Even so, supercomputers are not all-powerful. Their biggest limitations usually come down to four things: workload scaling, data transfer issues, power consumption, and reliability. Engineers are making progress on all four, but none of these problems has disappeared.

Supercomputers work best when they can break tasks into chunks

One of the biggest limitations is that supercomputers are only useful for certain kinds of tasks. They are best at problems that can be broken into many smaller pieces and worked on concurrently. This is known as parallel processing; for example, a climate model can split the atmosphere and oceans into many sections and calculate each one in parallel. But some problems do not work that way. Some tasks have steps that must happen sequentially. When that happens, a supercomputer cannot speed things up very much. If part of a job has to wait for another task to be finished, the whole system slows down. The answer here often isn’t to add more hardware. Instead, it’s to redesign the software so more of the work can happen simultaneously. 
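The sequential bottleneck described above is quantified by Amdahl’s law: the fraction of work that cannot be parallelized caps the total speedup no matter how many processors you add. A small sketch — the 95 percent figure below is an illustrative assumption, not a number from the article:

```python
def amdahl_speedup(parallel_fraction: float, n_processors: int) -> float:
    """Maximum speedup when only parallel_fraction of the work
    can be split across n_processors (Amdahl's law)."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_processors)

# Even with 95% of the job parallelized, no processor count can push
# the overall speedup past 1 / 0.05 = 20x.
for n in (64, 1024, 65536):
    print(n, round(amdahl_speedup(0.95, n), 1))
```

This is why the fix is usually software redesign rather than more hardware: raising the parallel fraction moves the ceiling, while adding processors only approaches it.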

Another major limitation involves the process of moving data around. A supercomputer may be able to calculate incredibly quickly, but it still needs to fetch information from memory. In many cases, the machine is not limited by calculation speed, but by the time it takes to move data from one place to another. To mitigate this challenge, supercomputers store data physically closer to the processors to move it more efficiently. Researchers are also redesigning programs to reuse data more effectively instead of constantly fetching it.
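A back-of-the-envelope way to see whether a job is limited by calculation or by data movement is to compare the time the arithmetic needs against the time the memory system needs to deliver the bytes; the larger of the two dominates, as in the roofline model. A toy estimate — the hardware numbers are chosen purely for illustration:

```python
def kernel_time(flops: float, bytes_moved: float,
                peak_flops: float, bandwidth: float) -> tuple[float, str]:
    """Roofline-style estimate: runtime is bounded by the slower of
    compute (flops / peak_flops) and data movement (bytes / bandwidth)."""
    compute_time = flops / peak_flops
    memory_time = bytes_moved / bandwidth
    if memory_time > compute_time:
        return memory_time, "memory-bound"
    return compute_time, "compute-bound"

# Illustrative node: 10 TFLOP/s of compute, 1 TB/s of memory bandwidth.
# A vector update doing ~2 flops per 24 bytes moved is heavily memory-bound.
t, regime = kernel_time(flops=2e9, bytes_moved=24e9,
                        peak_flops=10e12, bandwidth=1e12)
print(regime)
```

This is why placing data closer to the processors and reusing it more aggressively pays off: both raise the flops performed per byte moved, pushing kernels toward the compute-bound side of the ledger.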

Supercomputers use a lot of power and have a lot of parts that can go wrong

Power use is also a huge limitation. The fastest supercomputers use enormous amounts of electricity. They also need advanced cooling systems to prevent overheating. This creates two problems. First, it makes supercomputers very expensive to run. Second, it raises environmental concerns, especially as people push back on the large data centers needed to house them. Building better supercomputers will depend not only on making them more powerful, but also on making them more energy-efficient.

Another problem is reliability. A supercomputer contains an enormous number of parts: processors, memory units, cables, storage systems, cooling equipment, and more. The more parts a machine has, the more chances there are for something to go wrong. A loose cable, faulty memory chip, or cooling issue can interrupt a major calculation. This matters because some scientific jobs run for hours or days. If something fails midway through, that work may need to be restarted or recovered from a saved checkpoint. Engineers employ tools like the Lawrence Livermore National Laboratory’s Scalable Checkpoint/Restart (SCR) to minimize the amount of work lost when an issue occurs, but there’s no way to fully prevent hardware issues from occurring. After all, building a massive machine also means there are a massive number of things that can break.



Daily Deal: StackSkills Premium Annual Pass


StackSkills Premium is your destination for mastering today’s most in-demand skills wherever and whenever your schedule allows. Now, with this exclusive limited-time offer, you’ll gain access to 1000+ StackSkills courses for just one low annual fee! Whether you’re looking to earn a promotion, make a career change, or pick up a side hustle to make some extra cash, StackSkills delivers engaging online courses featuring the skills that matter most today. From blockchain to growth hacking to iOS development, StackSkills stays ahead of the hottest trends to offer the most relevant courses and up-to-date information. Best of all, StackSkills’ elite instructors are experts in their fields and are passionate about sharing lessons based on first-hand successes and failures. If you’re ready to commit to your personal and career growth, you won’t want to pass on this incredible all-access pass to the web’s top online courses. It’s on sale for $60.

Note: The Techdirt Deals Store is powered and curated by StackCommerce. A portion of all sales from Techdirt Deals helps support Techdirt. The products featured do not reflect endorsements by our editorial team.



Flipsnack and the shift toward motion-first business content with living visuals


Interactive content now generates 52.6% higher engagement than static formats, with users spending significantly longer interacting with dynamic media and showing higher recall for brands that use it. In practical terms, that shift may have transformed expectations around how digital content should be produced, especially in commerce and B2B environments, where attention is often a […]

This story continues at The Next Web

