
Tech

Grab this Elevation Lab 10-year extended battery case for AirTag for only $16


If you’re an iPhone user who likes to keep tabs on where your stuff is, you can’t go far wrong with an AirTag. The second-gen model that Apple just released outpaces the original in every way (aside from the galling lack of a keyring hole, that is). While it’s easy enough to replace the battery in both versions of the AirTag, you might not want to have to worry about the device’s battery life for a very long time. Enter Elevation Lab’s extended battery case for the AirTag, which is currently on sale at Amazon for $16.

The case usually sells for $23, so that’s a 30 percent discount. It’s not the first time we’ve seen this deal, but it’s a pretty decent one all the same.


This is arguably one of the more useful AirTag accessories around for certain use cases. It won’t exactly be helpful for an AirTag that you put in a wallet or attach to your keys, as it’s too bulky for such a purpose — and it doesn’t have a hole for a keyring anyway. Still, if you’re looking for an AirTag case that you can place in a suitcase or backpack and not have to touch for years, this could be the ticket.

Elevation Lab says that, when you place a couple of AA batteries in the case, it can extend the tracker’s battery life to as much as 10 years (the brand recommends using Energizer Ultimate Lithium batteries for best results). The AirTag is slated to run for over a year on its standard CR2032 button cell.

The case gives the AirTag more protection as well. It’s sealed with four screws and carries an IP69 waterproof rating. What’s more, it doesn’t obviously look like an AirTag case, so someone who steals an item with one inside is less likely to realize that the pilfered object is being tracked.


There are some other downsides, though. Since the AirTag is locked inside a case, the sound it emits will be muffled. Elevation Lab says the device’s volume will be about two-thirds the level of a case-free AirTag. However, the second-gen AirTag is louder than its predecessor, which should mitigate that issue somewhat.





Tech

SpaceX and Blue Origin race to orbit while scientists question the physics


The pitch is seductive in its simplicity: AI needs more power than terrestrial grids can supply, so move the data centres into orbit, where the sun never sets and the electricity is free. SpaceX, Blue Origin, and a growing constellation of startups are now racing to make that vision real. The problem, according to the scientists and engineers who would have to make the physics work, is that the vision skips several chapters of thermodynamics, economics, and orbital mechanics that have not yet been written.

SpaceX filed with the Federal Communications Commission on 30 January for permission to launch up to one million satellites into low Earth orbit, each carrying computing hardware that would collectively form what the company described as a constellation with “unprecedented computing capacity to power advanced artificial intelligence models.” The satellites would operate at altitudes between 500 and 2,000 kilometres, in orbits designed to maximise time in sunlight, and route traffic through SpaceX’s existing Starlink network. SpaceX requested a waiver of the FCC’s standard deployment milestones, which typically require half a constellation to be operational within six years.

Seven weeks later, Blue Origin filed its own application. Project Sunrise proposes 51,600 satellites in sun-synchronous orbits between 500 and 1,800 kilometres, complemented by the previously announced TeraWave constellation of 5,408 satellites providing ultra-high-speed optical backhaul. Where SpaceX’s filing emphasised raw scale, Blue Origin’s emphasised architecture: the system would perform computation in orbit and relay results to the ground through TeraWave’s mesh network.

The startup ecosystem is moving even faster. Starcloud, formerly Lumen Orbit, raised $170 million at a $1.1 billion valuation in March, becoming the fastest unicorn in Y Combinator history just 17 months after completing the programme. The company launched its first satellite carrying an Nvidia H100 GPU in November 2025 and filed with the FCC in February for a constellation of up to 88,000 satellites. Aethero, a defence-focused startup building space-grade computers with Nvidia Orin NX chips wrapped in radiation shielding, raised $8.4 million and is testing hardware on orbit this year.


The commercial logic rests on a genuine problem. Global data centre electricity consumption reached roughly 415 terawatt-hours in 2024 and the International Energy Agency projects it could exceed 1,000 TWh by 2026, with accelerated AI servers driving 30 per cent annual growth. In Virginia alone, data centres consume 26 per cent of total electricity supply. Ireland’s share could reach 32 per cent by year’s end. The grid constraints are real, the permitting delays are real, and the political resistance to building more terrestrial capacity is real.


What is also real, scientists argue, is the physics that makes orbital computing spectacularly difficult at any meaningful scale. The most fundamental challenge is heat. In space, there is no air to carry heat away from processors, only radiative cooling, which requires vast surface areas. Dissipating just one megawatt of thermal energy while keeping electronics at a stable 20 degrees Celsius demands approximately 1,200 square metres of radiator, roughly four tennis courts. A several-hundred-megawatt data centre, the minimum threshold for commercial relevance, would require radiators thousands of times larger than anything ever deployed on the International Space Station.
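The radiator figure follows from the Stefan-Boltzmann law, which fixes how much heat a surface at a given temperature can shed into vacuum. A back-of-the-envelope sketch (assuming an ideal double-sided radiator with emissivity 0.9 at 20 degrees Celsius, and ignoring heat absorbed from the Sun and Earth; these are simplifying assumptions, not a published thermal model):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiator_area_m2(power_w: float, temp_k: float = 293.15,
                     emissivity: float = 0.9, sides: int = 2) -> float:
    """Radiator area needed to reject `power_w` of waste heat purely by
    radiation at surface temperature `temp_k` (best case: no incoming
    solar or Earth-shine flux is counted against the budget)."""
    flux_per_side = emissivity * SIGMA * temp_k**4  # W per m^2 per side
    return power_w / (flux_per_side * sides)

print(f"1 MW:   {radiator_area_m2(1e6):,.0f} m^2")    # on the order of the article's ~1,200 m^2
print(f"300 MW: {radiator_area_m2(300e6):,.0f} m^2")  # hundreds of thousands of square metres
```

Area scales linearly with power but falls with the fourth power of radiator temperature, which is why real designs run hot fluid loops; even so, the hundreds-of-megawatts regime implies radiating structures far larger than anything yet flown.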

Radiation presents the second structural problem. Low Earth orbit exposes unshielded chips to cosmic rays and trapped particles that induce bit flips and permanent circuit damage. Radiation hardening adds 30 to 50 per cent to hardware costs and reduces performance by 20 to 30 per cent. The alternative, triple modular redundancy, means launching three copies of every chip, three times the cooling, three times the electricity, and three times the mass. Starcloud’s approach of flying commercial GPUs with external shielding is an interesting experiment, but no one has demonstrated that it works at scale or over hardware lifetimes measured in years rather than months.

Latency is the third constraint. A million satellites spread across orbital shells from 500 to 2,000 kilometres cannot achieve the tight coupling required for frontier model training, where inter-node communication latencies must remain in the microsecond range. Low Earth orbit introduces minimum latencies of several milliseconds for inter-satellite links and 60 to 190 milliseconds for ground-to-orbit round trips, compared to 10 to 50 milliseconds for terrestrial content delivery networks. That makes orbital infrastructure potentially viable for inference workloads, not for training, which is where the overwhelming majority of AI compute demand currently sits.
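Those latency floors come straight from the speed of light; no protocol, hardware, or routing trick can beat the light-travel time. A quick sanity check of the physical minimum (straight-line vertical paths assumed, so real slant ranges, inter-satellite hops, and processing delays only add to these numbers):

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def min_rtt_ms(distance_km: float) -> float:
    """Hard lower bound on round-trip time over a straight-line path."""
    return 2 * distance_km * 1_000 / C_M_PER_S * 1_000

for altitude_km in (500, 2000):
    print(f"{altitude_km} km: >= {min_rtt_ms(altitude_km):.2f} ms round trip")
# 500 km gives ~3.3 ms and 2,000 km ~13.3 ms, already thousands of times
# the microsecond-scale coupling that frontier model training requires.
```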

Then there is cost. IEEE Spectrum estimated that a one-gigawatt orbital data centre would cost upwards of $50 billion, roughly three times the cost of an equivalent terrestrial facility including five years of operation. Google has said that launch costs must fall to under $200 per kilogram before space-based computing begins to make economic sense. SpaceX’s current Starlink economics operate at roughly $1,000 to $2,000 per kilogram. Some analysts argue the true threshold for competing with terrestrial refresh economics is $20 to $30 per kilogram, a figure no credible projection places within the next two decades. The economics look even less favourable when set against the deep-tech funding landscape on the ground, where terrestrial infrastructure projects can draw on established supply chains and proven unit economics.


Even OpenAI’s Sam Altman, who explored a multibillion-dollar investment in rocket maker Stoke Space as a potential SpaceX competitor for orbital data centres, has publicly called the concept “ridiculous” for the current decade. Altman told journalists that the rough maths of launch costs relative to terrestrial power costs simply does not work yet, and he pointedly asked how anyone plans to fix a broken GPU in space.

The astronomical community adds a separate objection entirely. The vast majority of the roughly 1,000 public comments on SpaceX’s FCC filing urged the commission not to proceed. If approved, the constellation would place more satellites than visible stars in the sky for large portions of the night throughout the year, further militarising and commercialising an orbital environment that is already straining under the weight of existing megaconstellations.

None of this means orbital data centres will never exist. SpaceX’s Starship, if it achieves its cost targets, could fundamentally change the mass-to-orbit economics that currently make the concept unworkable. Starcloud’s incremental approach of flying small payloads and iterating on radiation performance is the kind of engineering pathway that occasionally produces breakthroughs. And the terrestrial grid constraints driving the interest are not going away.

But the gap between filing an FCC application for a million satellites and actually making orbital computation economically competitive with a warehouse full of GPUs in Iowa is not measured in years. It is measured in physics problems that the current pace of AI infrastructure investment cannot shortcut, no matter how many billionaires are willing to try. The question scientists are asking is not whether space data centres are theoretically possible. It is why, given the magnitude of the unsolved engineering, anyone is treating them as a near-term solution to a problem that requires near-term answers. The sky, it turns out, is not the limit. The radiator is.


Tech

Ireland begins digital wallet testing and consultation


The wallet will also be used to verify age for accessing online platforms.

The Irish Government is inviting people to try out the new official ‘Government Digital Wallet’ as the platform enters its testing phase.

Countries including Nigeria, Laos and New Zealand – and the US state of California – are all piloting their own versions of a digital ID platform, as governments across borders try to bolster security and make administration smoother.

The digital wallet makes up a key part of the Government’s Digital Public Services Plan 2030, which aims to use digital technology to make accessing public services easier and more efficient.


It facilitates identity management that residents should be able to use within the EU to access public and private services. The wallet can be used both offline and online, and will allow users to self-manage how their data is shared.

The ID can help obtain a marriage certificate or register for key welfare supports, and holders can also obtain a digital version of their birth certificates, driving licences and other official documents. The wallet will also be used to verify age on online platforms, amid regional debate over banning social media for those under 16.

It is also expected to reduce the need to repeat the same information to different Government departments and make everyday interactions with state administration more seamless.

The EU mandates that all member states must make a digital wallet available to their citizens by the end of 2026. The Irish wallet will be developed to EU digital identity standards, the Government said.


The digital wallet will “make it simpler for people to verify their identity, apply for supports and access entitlements”, said Minister for Public Expenditure, Infrastructure, Public Service Reform and Digitalisation Jack Chambers, TD.

“The wallet is designed so that all personal data is fully protected, and the user stays in control of what information they put in the wallet and choose to share. Only the details needed for a service will be shared, and nothing more.”

Minister of State at the Department of Public Expenditure, Infrastructure, Public Service Reform and Digitalisation Frank Feighan, TD said that the wallet will be “a crucial element of the Government’s overall portfolio of digital services”.

He added: “It will be able to facilitate secure age verification capability as set out in Digital Ireland and the implementation of the Online Safety Code, under which designated platforms must have age verification measures in place to help protect, in particular, children and young people from online harm.”



Tech

Stop trying to make people read instructions: 10 startup lessons from Convoy co-founder Dan Lewis


Convoy co-founder and Microsoft Corporate Vice President Dan Lewis at the Seattle AI Startup Summit on April 2, 2026. (Ken Yeung Photo)

Dan Lewis’ career is hard to summarize in a sentence. He was a product manager at Microsoft, then an early employee at the Seattle AI startup Wavii, which Google later acquired. He made a stop at Amazon before ultimately boomeranging back to Microsoft, where he is today.

At this week’s Seattle AI Startup Summit, it was his experience building Convoy, the one-time unicorn trucking startup that shuttered in 2023, that he wanted to talk about. 

But instead of relitigating what led to Convoy’s collapse, Lewis used his time on stage to share lessons to help entrepreneurs build a startup from the ground up.

Be deliberate about culture

Every company develops a culture, whether the founder shapes it or not, Lewis said. “The question is, are you involved in influencing what that is and helping to shape it around something you think aligns with your mission and the people you want in the company?”

Codify values only after you see what’s working

Back when Lewis was at Amazon, he asked then-CEO Jeff Bezos how the leadership principles were derived. Bezos told him that “he started writing stuff down when he first created the company, and then he realized he didn’t quite know what he was doing. So he waited a year to see what was working and what wasn’t, just to get a feel for how things were going.”


Anything that Bezos wanted to keep was codified. Lewis mirrored this approach for Convoy.

Make sure people know why, not just what

Founders shouldn’t have a culture in which workers accept decisions simply because the CEO says so. Lewis called that dynamic “demotivating,” arguing that employees who don’t understand the reasoning behind decisions can’t act independently or feel real ownership. Without that context, he said, people won’t feel like they’re truly part of the company.

Name teams after problems, not solutions

Lewis urged founders to name teams after the customer problems they’re solving, not the products they’re building. He pointed to his time at Amazon, where he built a Q&A tool called “Ask a Question, Get an Answer” for the ratings and reviews team.

The team pushed back: their mandate was to grow ratings and reviews, not launch someone else’s product. Had the team been named around a broader goal like customer or buyer confidence, Lewis said, its members would have been more open to creative approaches rather than feeling like they were “executing somebody else’s plan.”

Innovate deliberately

Invest time and energy into the areas that will really differentiate your company and “give you a chance to win.” Lewis acknowledged that it can feel uncomfortable to copy someone else’s innovations in undifferentiated areas, but sometimes it’s OK, especially when you’re not spending time on things “that don’t matter a lot.” 

Storytelling is a startup superpower
Convoy co-founder Dan Lewis discusses the power of storytelling at the 2026 Seattle AI Startup Summit. (Ken Yeung Photo)

Another critical cultural value is the company’s story. Have you crafted a narrative that is interesting, something people can relate to, and want to be a part of? 

“Think about for what [you’re] doing, what’s the context in the world?” Lewis said. “What is the opportunity that’s just right there in front of us? What’s the tension point as to why we can’t get that opportunity? What is holding the world back from it, and how are we going to unlock it for everyone so it makes everything better?”

When it came to Convoy, for example, he had his work cut out for him early on trying to sign on new business. “Why would my customer, who’s never worked with a technology company, because they’re shipping freight, want to take a bet?” Lewis explained. “Because they want to be part of the story. It’s interesting.”

Clarify expectations bidirectionally

Trust between founders and employees doesn’t happen by accident. Lewis recommended sitting down — perhaps over a meal — and laying out expectations from both sides before the work begins. It’s a bidirectional process, meaning that both the leader and employee must be heard.

Hire deliberately — and reluctantly
Dan Lewis offers recruiting and hiring insights at the 2026 Seattle AI Startup Summit. (Ken Yeung Photo)

When it comes to hiring, Lewis offered three tips. 

First, every company wants team members who want to “show up every day, knock down walls, and make it happen.” But more established organizations also need an additional type of employee: those capable of operating and innovating on existing systems. This creates conflict inside a large business, Lewis said, because the two cultures can’t live in harmony, nor is it possible to have “two compensation structures that manage the risk-reward.”

He argued that startups have the “pure play” advantage where there’s one culture, one risk-reward trade-off, and founders can focus on the type of person they need. In fact, Lewis thinks 80% of the workforce should possess that “wall-knocker” mentality.

Second, startups must be deliberate in hiring, applying filters throughout the candidate funnel and rating how someone introduced themselves, spoke during the first meeting, and followed up. At the end of the process, companies will “only have people that really want to be there and want to be part of this.”

Founders shouldn’t invest a lot of time trying to convince someone to join their company. If they are, “you’re working too hard,” he said, and it’s “probably not the right sign for a startup.”


Lewis’ last tip: Don’t hire. He admitted that it may sound counterintuitive, but he wants founders to think that every time someone new is onboarded, “it was a failure to operate more efficiently and to innovate” in a way that wouldn’t have required bringing a new person aboard.

Instead, they should first ask whether there was an alternative way to complete the task — perhaps through AI — rather than increasing headcount. 

And to be clear, Lewis isn’t advocating for the end of great hiring. Rather, he wants leaders to approach it this way: “Always consider it to be the thing that you wish you didn’t have to do. You wish you could have gotten it done without hiring that person.”

People don’t read instructions

At Convoy, Lewis said, they designed an operations system assuming people would carefully read each other’s notes during multi-day truck jobs with multiple support shifts. Most skipped the notes and started from scratch, irritating customers who had to repeat themselves. 


When Lewis asked investor Henry Kravis of KKR for advice, the answer was blunt: “Stop building a system that assumes people are going to read.” 

The lesson applies beyond operations. Whether it’s customers, employees, or end users, people scan for a button rather than read text. Founders should design processes and products, especially in the AI era, that work even if nobody reads the instructions.

Use data, and embrace concrete examples
Convoy founder Dan Lewis urges startups to back up data with concrete examples at the 2026 Seattle AI Startup Summit. (Ken Yeung Photo)

One final piece of advice from Lewis: be data-driven. When something’s wrong or there’s confusion, leave the jargon behind and look to the data as you talk it through with your team or customer.

But also be specific — use clear, concrete examples, along with the exact words customers use, to clarify quickly.

Lewis closed his keynote with a note of humility. None of these lessons came easily, he acknowledged. In fact, many of them weren’t obvious to him until his experience at Convoy forced the issue. The company reached the heights of the startup world before closing its doors, but for entrepreneurs trying to build something that lasts, that hard-won experience may be exactly the point.


His talk kicked off a day of conversation at the second annual Seattle AI Startup Summit, a conference that brings together investors, founders, executives and others. 

In addition to Lewis, attendees heard from AI2 Incubator’s Managing Director, Yifan Zhang, CopilotKit’s CEO, Atai Barkai, Edge Delta’s Founder and CEO, Ozan Unlu, MotherDuck Co-Founder and CEO, Jordan Tigani, and OSS4AI CEO, Yujian Tang, who heads up the conference.


Tech

Burning Wood To Brew Wood To Preserve Wood: Pine Tar


Before there was pressure-treated wood, before modern paints, there was pine tar. Everything from tool handles to wagons to ships was once made of wood preserved with pine tar, and [woodbrew] wants to show you how to make it, how to use it, and why you might put it on your skin.

It starts with, you guessed it, pine! In the first part of the video, [woodbrew] creates a skin salve with pine resin and food-safe oil. The pine resin–which is the sticky goop that dries around wounds on evergreen trees–is highly antiseptic and has been used in wound salves since the Stone Age. The process is easy: melt it in a double boiler, then mix with equal parts oil. [woodbrew] also adds a touch of beeswax to firm it up, and a little eucalyptus extract for extra germ-killing power, and a nice smell to boot.

That’ll preserve your hands, but what about preserving wood? That starts at about 9 minutes in, and for that you’re going to need a lot more resin, so picking it off wounded trees like he does at the start of the video won’t work. [woodbrew] suggests starting with dead or dying pines, and harvesting the crooks of their branches for “fatwood” — wood with the highest resin content. He also suggests the center of stumps, again of trees that died or were severely injured before being cut down. Then it’s a matter of cooking those fine organic molecules out. This is where we burn the wood to save the wood. Well, to save other wood. Wood we didn’t burn, obviously.

The distillation process [woodbrew] uses is fairly traditional, and consists of a couple of buckets. One bucket is buried and collects the pine tar; the other, with holes in the bottom to allow the tar to drip out, is filled with fatwood and covered tightly before being surrounded by firewood which is set alight. You could use an alternate source of heat here, but if you just cut down a pine tree for its fatwood, well, you’d have the rest of the tree to work with. Inside the fatwood bucket, the heat of the fire cooks off the volatile compounds that make pine tar, while the lack of oxygen from being closed up keeps it from burning. Burying the collection bucket keeps it from getting so hot the volatiles all boil off.


If this sounds like the process for making charcoal or woodgas, that’s because it is! He’s letting the gas fraction flare off here, but you could probably capture it, though a true gasifier breaks the tar down into gaseous compounds as well. The charcoal, of course, stays in the bucket as a bonus.

To make it usable as a wood finish, [woodbrew] mixes his homemade pine tar 50:50 with linseed oil, thinning it to a spreadable consistency that helps it penetrate deep into the wood. By filling the voids in the wood, this mixture will help keep moisture out, and the antiseptic properties of the organic soup that is pine tar will help keep fungi at bay for potentially decades to come.

Thanks to [Keith Olson] for the tip!


Tech

Our Favorite iPad Is $50 Off


Need a new tablet for your casual couch surfing sessions? There are a variety of options out there, but we think most people will be happy with the standard 11-inch model from 2025. You can grab it right now at Amazon for just $300, a $50 discount from its usual price.


The outside of the iPad hasn’t changed all that much in the few years since it was last updated, with the screen growing a barely noticeable 0.1 inches, and the standard USB-C port and selfie camera, plus Touch ID built into the power button. Most of the changes are on the inside of the tablet, including a major processor upgrade to the A16 chip that makes this tablet much snappier and more responsive than the 2022 version. There’s also twice as much storage, with 128GB as a baseline and up to 512GB on the upgraded model, so you won’t need to keep deleting apps to make room for more movies.

While it does have the A16 processor, which is also found in the iPhone 14 Pro, iPhone 15, and iPhone 15 Plus, the reduced RAM means there’s no support for Apple Intelligence. Whether that’s a benefit or a drawback will depend on how much you like or dislike AI. Beyond the lack of Apple Intelligence, you’re really only making a compromise when it comes to the screen, which isn’t laminated, so the Apple Pencil doesn’t feel quite as sharp as it does on other iPads, and it isn’t nano-textured, so glare and bright rooms may be more of an issue.

For most folks, the 2025 A16 iPad will be more than enough tablet for streaming, web browsing, and even some light gaming. You can head over to Amazon to pick up the iPad in either Silver or Blue at the discounted $300 price, with similar discounts on the 256GB and 512GB models too, but availability by color varies as you climb up the storage ladder. If you’re interested in what the other, more premium iPads offer, make sure to check out our guide that covers the entire lineup.



Tech

Why ENIAC Was a Loom, Not Just a Calculator


This year marks the 80th anniversary of ENIAC, the first general-purpose digital computer. The computer was built during World War II to speed up ballistics calculations, but its contributions to computing extend well beyond military applications.

Two of ENIAC’s key architects—John W. Mauchly, its co-inventor, and Kathleen “Kay” McNulty, one of the six original programmers—married a few years after its completion and raised seven children together. Mauchly and McNulty’s grandchild Naomi Most delivered a talk as part of a celebration in honor of ENIAC’s anniversary on 15 February, which was held online and in-person at the American Helicopter Museum in West Chester, Pa. The following is adapted from that presentation.

There was a library at my grandparents’ farmhouse that felt like it went on forever. September light through the windows, beech leaves rustling outside on the stone porch, the sounds of cousins and aunts and uncles somewhere in the house. And in the corner of that library, an IBM personal computer.


When I spent summers there as a child, I didn’t yet know that the computer was closely tied to my family’s story.

My grandparents are known for their contributions to creating the Electronic Numerical Integrator and Computer, or ENIAC. But both were interested in more than just crunching numbers: My grandfather wanted to predict the weather. My grandmother wanted to be a good storyteller.

In Irish, the first language my grandmother Kathleen “Kay” McNulty ever spoke, a word existed to describe both of these impulses: ríomh.

I began to learn the Irish language myself five years ago, and I was struck by how certain words and phrases had multiple meanings. According to renowned Irish cultural historian Manchán Magan—from whom I took lessons—the word ríomh has at different times been used to mean to compute, but also to weave, to narrate, or to compose a poem. That one word can tell the story of ENIAC, a machine with wires woven like thread that was built to compute, make predictions, and search for a signal in the noise.

Advertisement

John Mauchly’s Weather-Prediction Ambitions

Before working on ENIAC, John Mauchly spent years collecting rainfall data across the United States. His favorite pastime was meteorology, and he wanted to find patterns in storm systems to predict the weather.

The Army, however, funded ENIAC to make simpler predictions: calculating ballistic trajectory tables. Start there, co-inventors J. Presper Eckert and Mauchly realized, and perhaps the weather would soon be computable.

Co-inventors John Mauchly [left] and J. Presper Eckert look at a portion of ENIAC on 25 November 1966. Hulton Archive/Getty Images

Weather is a system unfolding through time, and a model of a storm is a story about how that system might unfold. There’s an old Irish saying related to this idea: Is maith an scéalaí an aimsir. Literally, “weather is a good storyteller.” But aimsir also means time. So the usual translation of this phrase into English becomes “time will tell.”

Mauchly wanted to ríomh an aimsire—to weave the weather into pattern, to compute the storm, to narrate the chaos. He realized that complex systems don’t reveal their full purpose at conception. They reveal it through aimsir—through weather, through time, through use.


ENIAC’s First Programmers Were Weavers

Kathleen “Kay” McNulty was born on 12 February 1921, in Creeslough, Ireland, on the night her father—an IRA training officer—was arrested and imprisoned in Derry Gaol.

Family oral history holds that her people were weavers. She spoke only Irish until her family reached Philadelphia when she was 4 years old, entering American school the following year knowing virtually no English. She graduated in 1942 from Chestnut Hill College with a mathematics degree, was recruited to compute artillery firing tables by hand for the U.S. Army, and was then selected—along with five other women—to program ENIAC.

They had no manual. They had only blueprints.

McNulty and her colleagues learned ENIAC and its quirks the way you learn a loom: by touch, by memory, by routing threads of electricity into patterns. They developed embodied knowledge the designers could only approximate. They could narrow a malfunction to a specific failed vacuum tube before any technician could locate it.


McNulty and Mauchly are also credited with conceiving the subroutine, the sequence of instructions that can be repeatedly recalled to perform a task, now essential in any programming. The subroutine was not in ENIAC’s blueprints, nor in the funding proposal. The concept emerged as highly determined people extended their imagination into the machine’s affordances.

The engineers designed the loom. Weavers discovered its true capabilities.

In 1950, four years after ENIAC was switched on, Mauchly’s dream was realized as it was used in the world’s first computer-assisted weather forecast. That was made possible after Klara von Neumann and Nick Metropolis reassembled and upgraded the ENIAC with a small amount of digital program memory. The programmers who transformed the math into operational code for the ENIAC were Norma Gilbarg, Ellen-Kristine Eliassen, and Margaret Smagorinsky. Their names are not as well-known as they should be.

Before programming ENIAC, Kay McNulty [left] was recruited by the U.S. Army to compute artillery firing tables. Here, she and two other women, Alyse Snyder [center] and Sis Stump, operate a mechanical analog computer designed to solve differential equations in the basement of the University of Pennsylvania’s Moore School of Electrical Engineering. University of Pennsylvania

Kay McNulty, Family Storyteller

Kay married John Mauchly in 1948, describing him as “the greatest delight of my life. He was so intelligent and had so many ideas…. He was not only lovable, he was loving.” She spent the rest of her life ensuring he, Eckert, and the ENIAC programmers would be recognized.


When she died in 2006, I came to her funeral in shock, not fully knowing what I’d lost. As she drifted away, it was said, she had been reciting her prayers in Irish. Word of this quickly made its way over to Creeslough, in County Donegal, and awaited me when I visited to honor her memory with the dedication of a plaque right there in the center of town.

In her own memoir, she wrote: “If I am remembered at all, I would like to be remembered as my family storyteller.”

In Irish, the word for computer is ríomhaire. One who ríomhs. One who weaves, computes, and tells. My grandfather wanted to tell the story of the weather through computing. My grandmother wanted to be remembered as a storyteller. The language of her childhood already had a word that contained both of those ambitions.

Computers as Narrative Engines

When it was built, ENIAC looked like the back room of a textile production house. Panels. Switchboards. A room full of wires. Thread.


Thread does not tell you what it will become. We tend to think of computing as calculation—discrete and deterministic. But a model is a structured story about how something behaves.

Weather models, ballistic tables, economic forecasts, neural networks: These are all narrative engines, systems that take raw inputs and produce accounts of how the world might unfold. In complex systems, when parts are woven together through use, new structures arise that no one specified in advance.

Like ENIAC, the machines we are building now—the large models, the autonomous systems—are not merely calculators. They are looms.

Their most important properties will not be specified in advance. They will emerge through use, through the people who learn how to weave with them.


Through imagination.

Through aimsir.



AI companies are building huge natural gas plants to power data centers. What could go wrong?


Who doesn’t love a good round of FOMO? From dot-com to Web 2.0, virtual reality to blockchain, the tech industry has had its share of manias driven by the fear of missing out on a trend.

The AI bubble is the big daddy of them all. Its first offspring — the rush to lock down power for data centers — is now begetting a mad dash to secure natural gas supplies and equipment. If FOMOs could have babies, then the AI bubble is already having grandkids.

Microsoft said on Tuesday that it’s working with Chevron and Engine No. 1 to build a natural gas power plant in West Texas that could grow to produce 5 gigawatts of electricity. This week Google confirmed that it’s working with Crusoe to build a 933 MW natural gas power plant in North Texas. And last week, Meta announced that it was adding another seven natural gas power plants to its Hyperion data center in Louisiana, bringing the site to 7.46 GW of capacity — enough to power the entire state of South Dakota.

Are we missing anyone?


The recent investments are concentrated in the southern U.S., home to some of the largest natural gas deposits in the world. Recently, the U.S. Geological Survey estimated that there’s enough in one region to supply energy to the entire United States for 10 months by itself. Every data center operator seems to want a part of it.

The scramble for natural gas has led to a shortage of turbines for the power plants, with prices likely to rise 195% by the end of this year relative to 2019 prices, according to Wood Mackenzie. The equipment contributes 20% to 30% of the cost of a power plant. Companies won’t be able to place new orders until 2028, and it’s taking six years to get turbines delivered, the consultancy notes.

That means tech companies are betting that the AI fever won’t break, that AI will continue to need exponential amounts of power, and that natural gas generation will be necessary for success in the AI era.


They may come to regret that third assumption.


Natural gas supplies in the U.S. are plentiful, and because shipping the fuel isn’t cheap, the country remains somewhat insulated from the turmoil in the Middle East. But supplies aren’t unlimited, and recently, growth in production in the big three regions — responsible for three-quarters of all U.S. shale gas production — has slowed considerably.

It’s not clear how insulated tech companies are from price swings since none of them have disclosed specific terms of their agreements. A lot will depend on how firm the price is in those contracts. 

Even if the contracted prices are as firm as can be, the companies could still face repercussions.

Because natural gas generates about 40% of the electricity in the U.S., according to the Energy Information Administration, electricity prices are closely tied to natural gas prices. Tech companies might be able to shield themselves from scrutiny for a bit by moving their gas power plants behind the meter — by skipping the grid and connecting them directly to their data centers. But natural gas isn’t an unlimited resource, and if their ambitions grow too big, even the behind-the-meter operations could drive up power prices for everyone. We’ve all seen how that’s played out.


It won’t just be regular households getting upset either. Other industries, including those that remain much more dependent on natural gas and can’t yet turn to renewables, might balk at data centers grabbing so much of the resource. Powering a data center with wind, solar, and batteries is easy. Running a petrochemical plant? Not so much.

Then there’s the weather. One cold winter could change the calculus by driving up demand among households. Wellheads might freeze off, crimping supplies dramatically, as happened in Texas in 2021. When gas runs short, suppliers will face a choice: keep the AI data centers running or let people heat their homes?

By snapping up natural gas supplies and moving behind the meter, tech companies can claim that they’re “bringing their own power” and not straining the electrical grid. But in reality, they’re just shifting their use from one grid to another: the natural gas grid. The AI rush has illustrated just how physically constrained the digital world remains. Does it make sense for them to bet big on a finite resource? Tech companies might regret falling for the FOMO.



For Such A Small Program, ZX81 1K Chess Sure Packs A Lot In


The Sinclair ZX81 was hardly the most accomplished of 1980s 8-bit microcomputers, but its ultra-low-budget hardware was certainly pressed into service for some impressive work. Perhaps the most legendary piece of commercial software in this vein was 1K Chess, which packed an entire chess engine into the user-available bytes of the unexpanded 1K ZX’s memory map. In 2026, [MarquisdeGeek] has taken this vintage piece of code and subjected it to a thorough analysis, finding all the tricks along the way.

Though hackers have since found ways to trick the ’81 into displaying bitmap graphics, the machine as designed is text-only, with some limited block graphics. The chessboard is therefore rendered in text, and the program’s illusion of “thinking” about moves comes courtesy of the on-screen board doubling as the play-area memory. In the GitHub repository you can find disassembled and annotated versions as well as the original ZX binary, plus, as a bonus, a screen capture of the game as it appears in BASIC, complete with the ZX’s odd means of storing Z80 machine code in REM statements.
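That REM trick can be sketched from well-documented ZX81 internals (this is a reconstruction for illustration, not code from the repository): a stored BASIC line is a 2-byte big-endian line number, a 2-byte little-endian length, then the tokenised body, and the program area of an unexpanded ZX81 begins at address 16509, so machine code hidden in a first-line REM starts at 16514, hence the classic RAND USR 16514 launcher.

```python
# Sketch of how "1 REM ..." on a ZX81 hides Z80 machine code.
# Layout facts: the BASIC program area starts at address 16509; each line
# is a 2-byte line number (big-endian), a 2-byte length (little-endian),
# then the tokenised body ending in a NEWLINE byte (0x76). REM tokenises
# to 0xEA, so code placed after it sits at 16509 + 4 + 1 = 16514.
PROGRAM_START = 16509
REM_TOKEN = 0xEA
NEWLINE = 0x76

def rem_line(line_number: int, machine_code: bytes) -> bytes:
    """Build the byte image of `<line_number> REM <machine_code>`."""
    body = bytes([REM_TOKEN]) + machine_code + bytes([NEWLINE])
    return (line_number.to_bytes(2, "big")
            + len(body).to_bytes(2, "little")
            + body)

# A do-nothing payload: a single Z80 RET (0xC9), so USR returns at once.
image = rem_line(1, bytes([0xC9]))
print(image.hex())            # "00010300eac976"
print(PROGRAM_START + 4 + 1)  # 16514
```

The editor never displays the hidden bytes sensibly, which is exactly why machine-code listings on the ZX81 look like garbled REM lines.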

If that wasn’t enough, in the note giving us the tip, [MarquisdeGeek] reveals that much of the work was done in a ZX emulator running inside a Dragon emulator, and gives us a fun glimpse of the game running in an emulator on a Cheap Yellow Display tucked inside a 1K Chess cassette box. We like it, a lot!


If you need a greater ZX81 fix, take a look at how this machine chased the beam to make TV graphics on the cheap.



Five Underrated Tire Brands That Can Compete With Goodyear

Goodyear is a tire industry institution. Founded in 1898 in Akron, Ohio, it has spent more than a century building a name synonymous with reliability, performance, and American motoring heritage. Our ranking of major tire brands placed Goodyear second overall, highlighting its broad range of strong-performing models across multiple market segments. However, a trusted name does not guarantee a podium finish in every single test.

Tire science advances rapidly, and in the last two years a cluster of brands — some familiar to enthusiasts, some largely invisible to mainstream buyers — have outperformed Goodyear in credible, independent tests in specific, measurable ways. Consumer Reports‘ 2026 Best Tire Brands rankings placed Goodyear seventh among the brands it evaluated, notably behind several names that most drivers would not immediately associate with premium performance. Even so, this piece is not a case against Goodyear.

It is more a case for looking beyond the label and the brand. The five brands profiled below have each demonstrated, in controlled, verifiable testing, that they can stand toe-to-toe with — and in some areas even surpass — one of the world’s most recognized tire companies. Here is what you need to know, and where exactly Goodyear has an underdog problem.


1. Nokian

Nokian Tyres traces its roots back to 1898 in Finland and is best known in the Nordic markets for its legendary Hakkapeliitta winter tire line, but its all-season range has been making serious noise in European testing circles. In Tyre Reviews‘ 2025 best SUV all-season tire test, the new Nokian SeasonProof 2 delivered the shortest wet braking distance in the entire field. It stopped faster than the Goodyear Vector 4Seasons Gen 3, which finished behind the Nokian in both the wet braking and wet handling categories.

The tester noted that the Nokian was the fastest around the handling lap, all while having a superior blend of feedback, traction, and communication. This result is not just a one-off. When Consumer Reports tested top-ranked tires for winter and snow, the Nokian Hakkapeliitta R5 won in both SUV and passenger car/crossover categories, while the Nokian Tyres Remedy WRG5 was also placed number one in the all-season department. In both instances, these ranked higher than many well-known premium brands.


One notable 2025 test by TÜV SÜD, as covered by TyreReviews, compared five premium all-season 205/55 R16 tires. The Nokian Seasonproof 2 took first place, excelling in snow braking and traction (100%) and snow handling (99.6%), while remaining reasonable in wet metrics and rolling resistance. In contrast, Goodyear’s Vector 4Seasons Gen 3 finished last, struggling in dry and wet braking and hydroplaning, though its snow performance and rolling resistance were more than decent.


2. Vredestein

Vredestein is one of Europe’s oldest tire manufacturers, now owned by Apollo Tyres, and it has spent the better part of the last decade quietly compiling an impressive test record. The brand has also seen a strong reception from buyers, to the point that Consumer Reports rated it 2025’s best major tire brand for customer satisfaction. In TyreReviews‘ direct cross-test comparison of the Goodyear Vector 4Seasons Gen 3 and the Vredestein Quatrac All-Season, the two tires were evaluated across 15 shared tests.

In total, the Quatrac won 10 of them, while Goodyear won five. Interestingly, the Goodyear tire performed better in the snow: most of the wins it earned were tied to snow and ice performance. Conversely, in a separate 2024 ADAC test comparing the Vredestein Wintrac Pro and the Goodyear UltraGrip Performance 3, the Goodyear tire won overall while losing to Vredestein in snow and ice conditions.

In the summer segment, the Vredestein Ultrac earned perhaps its most high-profile result when it won the 2024 AutoExpress summer tire test (as covered by WhatTyre), beating the Goodyear Eagle F1 Asymmetric 6 to first place through performance across wet, dry, noise, and comfort categories. Best of all, it did so at a lower price point than most of its rivals.


3. Hankook

Hankook has been making tires since 1941 and operates as one of the world’s largest manufacturers, supplying OEM fitments to major automakers. However, its reputation among everyday buyers has not always kept pace with its test results. So, are Hankook tires better than Goodyear’s? Consumer Reports’ Best Tire Brands of 2025 placed Hankook ahead of Goodyear, a result of testing 30 brands across handling, braking, snow traction, noise, hydroplaning, and tread life.

The objective test data support this. In AutoBild’s 2025 EV tire test as reported by Hankook, Hankook’s iON evo took the overall test win — ahead of Michelin, Goodyear, and Continental — for the third consecutive year, earning the magazine’s top “Exemplary” rating. In TyreReviews‘ 2025 EV tire test, Hankook led the wet handling results with 74.4 kph (46.2 mph), narrowly beating Continental and finishing ahead of Goodyear, which took third place.

Overall, both Goodyear and Hankook have positioned themselves as strong performers in the market, and that competitiveness is reflected in customer feedback. Tyroola, one of Australia’s largest tire retailers, aggregates reviews for both brands, showing Goodyear rated 4.6 out of 5 and Hankook close behind at 4.5. Consumers, in other words, view the two brands similarly, and Hankook can indeed trade punches with the industry’s finest.


4. Falken

Falken is owned by Sumitomo Rubber Industries and has historically been viewed as a mid-tier brand: credible, but not headline-grabbing in the way Goodyear is. Recent testing, however, suggests that perception is out of date. TyreReviews‘ 2025 best performance summer tire test — a comparison that included the Goodyear Eagle F1 Asymmetric 6 and the Falken Azenis FK520 — showed just how closely the two brands can perform.

Goodyear tied for second place with Michelin and Continental, with Falken just behind them. The reviewer noted that the Falken tire was “incredibly grippy, incredibly stable, and very easy to drive fast,” and found the results good enough to have a second driver independently confirm them. Goodyear is traditionally known for making some of the quietest tires on the market, and even on that front the Falken finished just behind it in overall noise levels.

In the all-terrain segment, Falken has been equally competitive. TyreReviews‘ best all-terrain tire test found the Falken Wildpeak AT3W returning dry braking distances of 43.9 meters (144 feet) against the Goodyear Wrangler All-Terrain’s 44.6 meters (146 feet), while the publication concluded Falken was the best all-terrain tire overall, and Goodyear’s Wrangler ranked third.


5. Kumho

When asking whether Kumho tires are better than Goodyear’s, we first need to mention Consumer Reports’ 2026 best tire rankings, where Kumho placed fifth among all brands evaluated — two places above Goodyear, which came seventh. The case is sharpened considerably by specific head-to-head performance data.

In the 2026 AutoBild 245/45 R19 summer tire test, the Kumho Ecsta Sport PS72 and Goodyear Eagle F1 Asymmetric 6 traded punches in many categories. Kumho excelled in wet and dry braking and in value, outperforming Goodyear. On the other hand, Goodyear ranked higher overall thanks to its exceptional treadwear and balanced performance. In practical terms, Kumho offers targeted performance advantages, while Goodyear offers better longevity and consistency.


However, in the 2024 ADAC summer tire test, the Kumho Ecsta HS52 earned third place, while the Goodyear EfficientGrip Performance 2 ranked ninth. The two brands are closely matched in the eyes of consumers as well: many owners on Reddit are quick to point out that Kumho often feels slightly more comfortable and performance-oriented, while Goodyear is considered solid, reliable, and better for mileage and all-season use.


How we made the list

Comparing tire brands is tricky because there are so many variables — different models, submodels, sizes, and categories. While direct comparisons of specific tires can highlight strengths and weaknesses, judging an entire brand as a whole isn’t realistic. That wasn’t the goal of this article. Instead, we aimed to identify underrated, non-premium tire brands that can compete with — and sometimes even beat — Goodyear.

Experiences will naturally vary, but there’s enough credible data online to answer the main question. To create this list, we scoured verifiable tests, comparisons, expert analyses, and user reviews from sources like TyreReviews, AutoExpress, Consumer Reports, AutoBild, ADAC, Tyroola, WhatTyre, and TÜV SÜD. We cross-checked performance metrics, test results, and consumer feedback to show both sides of the coin and provide an honest assessment of where these brands stand.





Rubin Observatory team discovers 11,000 new asteroids, with help from University of Washington software


A model of the inner solar system shows asteroids discovered by the Rubin Observatory in light teal. Previously known asteroids are dark blue. The model highlights almost 12,700 asteroids that the Rubin team has discovered over the course of a year and a half. (Photo: NSF–DOE Vera C. Rubin Observatory / NOIRLab / SLAC / AURA / R. Proctor. Star map: NASA / GSFC Scientific Visualization Studio. Gaia DR2: ESA / Gaia / DPAC. Image Processing: M. Zamani / NSF NOIRLab)

The Vera C. Rubin Observatory’s science team has discovered more than 11,000 new asteroids — a feat made possible by the Simonyi Survey Telescope’s advanced capabilities and data-crunching software developed at the University of Washington.

Rubin’s deluge of discoveries, based on a million early-stage observations that were collected over the course of a month and a half last summer, includes roughly 380 trans-Neptunian objects, or TNOs, and 33 previously unknown near-Earth objects. (Don’t panic: None of those near-Earth objects poses a threat to Earth.)

The data set also includes more than 80,000 previously known asteroids, some of which had been “lost” to science because of uncertainty about their orbits. The findings were confirmed by the International Astronomical Union’s Minor Planet Center, the global clearinghouse for small solar system objects.

These aren’t the first finds for the $800 million observatory in Chile, which made its “First Look” debut last June. Astronomers previously reported finding more than 1,500 asteroids during earlier test rounds.

“This first large submission after Rubin First Look is just the tip of the iceberg and shows that the observatory is ready,” UW astronomer Mario Jurić, who heads Rubin’s solar system team, said in a news release. “What used to take years or decades to discover, Rubin will unearth in months. We are beginning to deliver on Rubin’s promise to fundamentally reshape our inventory of the solar system and open the door to discoveries we haven’t yet imagined.”

An accompanying video highlights the asteroids discovered at the Vera C. Rubin Observatory. The discoveries came in three bursts: 73 during the first early test observations using Rubin’s Commissioning Camera in late 2024; 1,514 during First Look observations in April and May 2025; and 11,000 more during Rubin’s early optimization surveys last summer.

The observatory’s centerpiece is the Simonyi Survey Telescope, named after the family of Seattle-area software billionaire Charles Simonyi. Equipped with the world’s largest digital camera, it can generate 20 terabytes of raw data per night. That data is analyzed and interpreted by scientific institutions around the world — including UW’s DiRAC Institute. (DiRAC stands for “Data-Intensive Research in Astrophysics and Cosmology.”)

“Rubin’s unique observing cadence required a whole new software architecture for asteroid discovery,” said Ari Heinze, a UW astronomer who worked with graduate student Jacob Kurlander to create the software that detected the asteroids. “We built it, and it works. It seems pretty clear this observatory will revolutionize our knowledge of the asteroid belt.”

Once it ramps up to full operation, the Rubin Observatory is expected to identify almost 90,000 new near-Earth objects, or NEOs, in the zone around our planet’s orbit. Some of those NEOs could be hazardous, and early detection would give scientists, engineers and policymakers a head start on the development of planetary defense strategies.

The trans-Neptunian objects that were found in the broad zone of the solar system beyond the orbit of Neptune include two icy bodies that appear to have extremely elongated orbits. The Rubin team says these two objects — designated 2025 LS2 and 2025 MX348 — reach distances that are roughly 1,000 times farther out from the sun than Earth. That would place them among the 30 most distant known celestial objects of their kind.


If the far reaches of the solar system harbor a large trans-Neptunian object — a hypothetical world known as Planet Nine or Planet X — Rubin should be able to detect it.

The specks of light teal shown in this rendering of the wider solar system represent the roughly 380 trans-Neptunian objects discovered using observations taken during Rubin’s early optimization surveys last summer. (Photo: NSF–DOE Vera C. Rubin Observatory / NOIRLab / SLAC / AURA / R. Proctor. Star map: NASA / GSFC Scientific Visualization Studio. Gaia DR2: ESA / Gaia / DPAC. Image Processing: M. Zamani / NSF NOIRLab)

“Searching for a TNO is like searching for a needle in a field of haystacks,” said Matthew Holman, a senior astrophysicist at the Harvard & Smithsonian Center for Astrophysics and former director of the Minor Planet Center. “Out of millions of flickering sources in the sky, teaching a computer to sift through billions of combinations and identify those that are likely to be distant worlds in our solar system required novel algorithmic approaches.”
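The combinatorial search Holman describes can be illustrated with a toy version of detection linking. This is only a sketch of the idea (production systems work in heliocentric coordinates and prune the search space aggressively); the function name, tolerance, and data below are invented for illustration:

```python
# Toy detection linker: find triplets of (time, RA, Dec) observations
# consistent with a single body moving at constant angular velocity.
# Real pipelines face billions of such combinations.
from itertools import combinations

def link_triplets(detections, tol=0.01):
    """Return triplets whose third point lies on the line extrapolated
    from the first two, within `tol` degrees on each axis."""
    found = []
    for (t1, r1, d1), (t2, r2, d2), (t3, r3, d3) in combinations(
            sorted(detections), 3):
        if t1 == t2:
            continue
        # Angular velocity implied by the first pair, extrapolated to t3.
        vr = (r2 - r1) / (t2 - t1)
        vd = (d2 - d1) / (t2 - t1)
        if (abs(r1 + vr * (t3 - t1) - r3) < tol
                and abs(d1 + vd * (t3 - t1) - d3) < tol):
            found.append(((t1, r1, d1), (t2, r2, d2), (t3, r3, d3)))
    return found

# One genuine mover (0.5 deg/day in RA) plus one unrelated detection.
obs = [(0.0, 10.0, 5.0), (1.0, 10.5, 5.0), (2.0, 11.0, 5.0),
       (1.5, 42.0, -7.0)]
print(len(link_triplets(obs)))  # 1: only the consistent trio links up
```

The brute force here is O(n³) in the number of detections, which is exactly why novel algorithmic approaches were needed at Rubin’s scale.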

Holman worked with Kevin Napier, a research scientist at the Center for Astrophysics, to develop the algorithms for detecting distant solar system objects with Rubin data.

The Rubin discoveries that have been announced to date are only a prelude to Rubin’s 10-year Legacy Survey of Space and Time. Simulations suggest that over the course of the coming decade, the Rubin Observatory will find millions of previously undetected asteroids.


Operations of the Vera C. Rubin Observatory are funded by the U.S. National Science Foundation and the U.S. Department of Energy’s Office of Science.


This research is available at the Rubin Asteroid Discoveries Dashboard. In addition to Jurić, Heinze, Kurlander, Holman and Napier, the research team members include Pedro Bernardinelli, a former DiRAC postdoctoral fellow at the UW, now at the Institute for Astronomy, Geophysics and Atmospheric Sciences of the University of São Paulo; Joachim Moeyens, a UW research software engineer and B612 Asteroid Institute team member who earned his doctorate in astronomy at the UW; Siegfried Eggl, a former UW postdoctoral researcher in astronomy, now at the University of Illinois Urbana-Champaign; and Erfan Nourbakhsh at Princeton University.
