
How Volunteers Saved A Victorian-Era Pumping Station From Demolition

D-engine of the Claymills Pumping Station. (Credit: John M)

Although infrastructure like a 19th-century pumping station generally tends to be quietly decommissioned and demolished, sometimes enough people look at such a structure and wonder whether it might be worth preserving. Such was the case with the Claymills Pumping Station in Staffordshire, England. After starting operations in the late 19th century, the pumping station remained in active use until 1971. A recent documentary by the Claymills Pumping Station Trust, the first video on its YouTube channel, covers the derelict state of the station at the time, as well as its long and arduous recovery since the Trust acquired the site in 1993.

After its decommissioning, the station was eventually scheduled for demolition. Many parts had by that time been removed for display elsewhere, discarded, or outright stolen for the copper and brass. Of the four Woolf compound rotative beam engines, units A and B had been shut down first and used for spare parts to keep the remaining units going. Between groundwater intrusion and a decaying roof, the station was in a sorry state after decades of neglect, and restoring it was a monumental task.

Arthur Woolf, the Cornish engineer who invented the compound beam engine, figured out how to expand steam in stages across multiple cylinders, yielding a markedly more efficient engine. While his engineering made pumping stations like these possible, the many workers and their families ensured that they kept running smoothly. Although firmly obsolete in the 21st century, pumping stations like these are excellent examples of the engineering and ingenuity that got us to where we are today, and preserving them is the best way to retain that knowledge and the memories associated with it.

For that reason, the volunteers who turned this piece of history into a museum deserve real congratulations. The museum features a static display of the restored machinery, and if you want to see it running, there are seven demonstrations of the station operating under steam every year, during which the six-story-tall machinery can be observed in all its glory.

Top image: Claymills Pumping Station in 2010. (Credit: Ashley Dace)



Minnesota Kicks Off Legal Battle With Trump Administration To Hold ICE Shooters Accountable


from the occupied-territories dept

This story was originally published by ProPublica. Republished under a CC BY-NC-ND 3.0 license.

They asked nicely at first. 

After an Immigration and Customs Enforcement agent shot and killed Renee Good, a 37-year-old mother of three who’d recently moved to Minneapolis, local law enforcement officials requested a partnership with the federal government to investigate the case, as they’d done in past shootings involving federal agents.


When the Trump administration refused to cooperate, Minnesota prosecutors ratcheted up their efforts. They sent a series of strongly worded legal letters demanding evidence in the Good shooting as well as the shootings of Julio Cesar Sosa-Celis, a Venezuelan immigrant who was wounded a week after Good was shot, and Alex Pretti, who was killed on Jan. 24.

Still, the administration rebuffed the requests.

This week, prosecutors from Hennepin County and the state of Minnesota took the next step to force the Trump administration’s hand. They filed a federal lawsuit against the departments of Homeland Security and Justice over the evidence in the shootings, an action that Hennepin County Attorney Mary Moriarty, whose jurisdiction covers Minneapolis, characterized as “unprecedented in American history.”

The Trump administration has declined to release the names of the agents involved in the shootings, even after the Minnesota Star Tribune and ProPublica identified the officers involved in the Good and Pretti incidents.


“The federal government has refused to cooperate with state law enforcement, which is unique, rare and simply cannot be tolerated,” Minnesota Attorney General Keith Ellison told reporters. “[We] can’t sit around and let them do it.”

In the standoff over evidence, the case has already become a game of constitutional chicken over states’ rights versus federal immunity, a battle that will have implications for others who wish to hold agents in the president’s immigration surge criminally accountable. 

So far, neither side is showing signs of backing down, foreshadowing a fight that could take years. If prosecutors do eventually file charges against federal agents involved in the shootings, legal experts said the path to trial, much less winning convictions, will be filled with legal and procedural challenges.

“State prosecutors across the country are going to be watching what happens in Minnesota really closely,” said Alicia Bannon, director of the judiciary program at the nonprofit Brennan Center for Justice.


The first test for prosecutors, if they file charges, would be to prove the agents don’t qualify for immunity through the Constitution’s supremacy clause, a rarely invoked legal doctrine that protects federal officers from state prosecutions if they’re acting lawfully and within the scope of their duties.

Failing to pass that test would likely end the case.

The U.S. Supreme Court hasn’t taken up a case involving supremacy clause immunity in over 100 years, Bannon said, and judges have come down differently on legal issues related to its application. 

There’s no easy answer as to whether Minnesota will be able to get past a supremacy clause defense, said Jill Hasday, a constitutional law professor at the University of Minnesota.


“That depends on the facts, but probably the odds are stacked against it,” she said.

Even if they survive such a fight, the cases could be dogged by a series of logistical challenges. Moriarty, who has been leading the investigations, has decided not to seek reelection and will leave office at the end of the year. That means whoever wins the election for her seat in November could inherit the prosecutions. 

In addition to not having the names of the agents, prosecutors don’t know where those agents are now. Minnesota may need to extradite them, potentially from a MAGA-leaning state that may balk at sending them to Hennepin County to stand trial. 

“Will the federal government or other states cooperate with that? I think the answer to that is sort of iffy,” said Ilya Somin, a law professor at George Mason University in Virginia. (Indeed, in a case involving a doctor charged with illegally mailing abortion medication to a Louisiana woman, the state of California has rejected an extradition request, citing its own laws protecting doctors from prosecution elsewhere.)


The fight is focused on three shootings. But Moriarty’s office has opened criminal investigations into 14 additional cases of potentially unlawful behavior by federal agents during Operation Metro Surge, which started in early December and has wound down over the past few weeks. 

The other cases Moriarty is examining involve allegations of excessive force or other misconduct by federal agents, such as an incident in early January in which agents allegedly used force on staff and students on the grounds of a high school.

Prosecutors are also investigating Gregory Bovino, the outgoing Border Patrol commander who helped to lead immigration surges into several American cities and who was seen on video lobbing green-smoke canisters into crowds at a park in Minneapolis. A Department of Homeland Security spokesperson said at the time that Bovino and other agents were responding to a “hostile crowd.”

The tension has played out in a series of demand letters sent by Moriarty to the Justice and Homeland Security departments. “Public transparency is vitally important in these cases — not just for the people of Hennepin County and Minnesota, but for the public nationwide,” Moriarty wrote in one of the letters. “The only way to achieve transparency is through investigation conducted at a local level.”


In January, after the shooting of Good, federal officials had agreed to participate in a joint investigation with the Bureau of Criminal Apprehension, Minnesota's state police agency tasked with investigating deadly-force cases, according to the letters signed by Moriarty.

State officials presumed they’d be able to examine evidence, such as the car Good was driving and the guns used to shoot her and the other victims. But the investigators later learned through public statements by high-ranking Trump administration officials that federal agents were no longer planning to share evidence, the letter states. 

Local and state prosecutors don't have the authority to subpoena federal agencies for evidence as they would in a typical criminal investigation. The demand letters, called Touhy letters, are formal written requests, used as an alternative to a subpoena, asking a federal agency to provide evidence or testimony in a case in which the government is not a party. Moriarty sought an extensive list of evidence in the shootings, from the guns fired by the agents in all three cases to official reports, agent GPS devices, and witness statements. The Touhy letters asked for a response by Feb. 17.

Normally, the federal government complies with Touhy letters as a matter of protocol, as long as releasing the information doesn’t violate an internal policy, said Timothy Johnson, a political science and law professor at the University of Minnesota. 


But on Feb. 13, the FBI told BCA investigators that it would not share investigative materials in the Pretti case, BCA Superintendent Drew Evans said in a statement. Evans said the police agency had reiterated its requests for evidence in the Good and Sosa-Celis cases.

More than a month after the deadline set by prosecutors, the Trump administration still hasn’t turned over the materials.

“There has been no cooperation from federal authorities,” BCA spokesperson Michael Ernster said. 

The agents involved in the shootings have not spoken publicly, but a spokesperson for the Department of Homeland Security defended Good’s shooting, saying the agent acted in self-defense. They said the Pretti shooting was under investigation by the FBI and the Department of Homeland Security, with the Border Patrol conducting its own investigation. Those investigations could result in discipline or charges, including for civil rights violations. 


The Department of Homeland Security spokesperson said federal officials found that, after Sosa-Celis’ shooting, officers made false statements. But the agency did not say whether it would cooperate with the local authorities or follow a court ruling requiring it to do so.

The Justice Department did not respond to a request for comment or to questions. Neither agency has responded to the lawsuit.

Moriarty called the lawsuit “critically important” to investigating the shooting cases but also said she had not made any decisions on whether her office will file charges.

“There has to be an investigation anytime a federal agent or a state agent takes the life of a person in our community,” she said. “And ultimately the decision may be it was lawful. You don’t know, but that’s why you do the investigation. You are transparent with the results of that investigation, and you are public with your transparency about the decision and how you got there.”


But a lawsuit does not guarantee that prosecutors will get all they want. “The question then becomes, even if Hennepin County or Minneapolis wins the suit, will they comply then?” Johnson asked. “And the answer is probably no.”

If the Trump administration did eventually defy a judge’s order, he said, prosecutors could try to appeal up to the U.S. Supreme Court. As far as what could happen next: “It’s anyone’s guess.”

Filed Under: alex pretti, doj, ice, investigations, julio cesar sosa-celis, keith ellison, minnesota, murder, renee good



Game Jam Winner Spotlight: CARAMENTRAN


from the gaming-like-it’s-1930 dept

It’s time for the second in our series of spotlight posts looking at the winners of our eighth annual public domain game jam, Gaming Like It’s 1930! We’ve already covered the Best Adaptation winner, and this week we’re looking at the winner of Best Deep Cut: CARAMENTRAN by RedSPINE and poymakes.

Sometimes, we get entries that were designed for more than one game jam, and this is one of them. In this case, the game was also created for the Themed Horror game jam, in which one of the themes was “macabre carnival”. CARAMENTRAN delves specifically into a Provençal carnival tradition from France, in which the “King of Carnival”, or Caramentran, is put on trial for all the year’s ills and then burned at the stake in punishment. As the player, you are Caramentran himself, trying to ward off accusations from the villagers while extinguishing the flames at your feet in a grimy, unsettling horror arcade game.

It’s a fitting premise for a horror game, but what makes it special for this game jam is its visual assets, drawn from a variety of public domain sources. The game’s hauntingly hideous aesthetic comes from a collage of archive images and postcards of actual carnivals in Southern France, combined with figures taken from American magazines, ads, and fashion plates.

Many of the materials are from 1930, while many others are from earlier, and the combination of wildly different styles is viscerally jarring in a way that amplifies the horror. There are no widely recognized images or famous works of art here, only fragments of visual language plucked piece by piece from the vast sea of imagery in the public domain, and for that it’s this year’s Best Deep Cut.

Congratulations to RedSPINE and poymakes for the win! You can play CARAMENTRAN in your browser on Itch. We’ll be back next week with another winner spotlight, and don’t forget to check out the many great entries that didn’t quite make the cut. And stay tuned for next year, when we’ll be back for Gaming Like It’s 1931!

Filed Under: game jam, games, gaming, gaming like it’s 1930, public domain, winner spotlight



OCSF explained: The shared data language security teams have been missing


The security industry has spent the last year talking about models, copilots, and agents, but a quieter shift is happening one layer below all of that: Vendors are lining up around a shared way to describe security data. The Open Cybersecurity Schema Framework (OCSF) is emerging as one of the strongest candidates for that job.

It gives vendors, enterprises, and practitioners a common way to represent security events, findings, objects, and context. That means less time rewriting field names and maintaining custom parsers, and more time correlating detections, running analytics, and building workflows that work across products. In a market where every security team is stitching together endpoint, identity, cloud, SaaS, and AI telemetry, a common data language long felt like a pipe dream, and OCSF now puts it within reach.

OCSF in plain language

OCSF is an open-source framework for cybersecurity schemas. It’s vendor neutral by design and deliberately agnostic to storage format, data collection, and ETL choices. In practical terms, it gives application teams and data engineers a shared structure for events so analysts can work with a more consistent language for threat detection and investigation.

That sounds dry until you look at the daily work inside a security operations center (SOC). Security teams have to spend a lot of effort normalizing data from different tools so that they can correlate events. For example, detecting an employee logging in from San Francisco at 10 a.m. on their laptop, then accessing a cloud resource from New York at 10:02 a.m. could reveal a leaked credential.
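
To make that concrete, here's a minimal sketch of such an "impossible travel" check over OCSF-shaped records. The field names follow OCSF's Authentication class (class_uid 3002), but the records are simplified illustrations rather than complete, validated OCSF payloads, and the 1,000 km/h cutoff is an arbitrary choice.

```python
# A minimal sketch of "impossible travel" detection over OCSF-shaped
# authentication events. Simplified illustration, not a full OCSF payload.
from math import radians, sin, cos, asin, sqrt

events = [
    {
        "class_uid": 3002,  # OCSF Authentication class
        "time": 1730456400000,  # epoch milliseconds, 10:00 a.m.
        "user": {"name": "jdoe"},
        "src_endpoint": {"location": {"lat": 37.77, "long": -122.42}},  # San Francisco
    },
    {
        "class_uid": 3002,
        "time": 1730456520000,  # 10:02 a.m., two minutes later
        "user": {"name": "jdoe"},
        "src_endpoint": {"location": {"lat": 40.71, "long": -74.01}},  # New York
    },
]

def km_between(a: dict, b: dict) -> float:
    """Great-circle (haversine) distance between two {lat, long} points."""
    lat1, lon1, lat2, lon2 = map(radians, (a["lat"], a["long"], b["lat"], b["long"]))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

first, second = sorted(events, key=lambda e: e["time"])
hours = (second["time"] - first["time"]) / 3_600_000  # milliseconds per hour
speed = km_between(first["src_endpoint"]["location"],
                   second["src_endpoint"]["location"]) / max(hours, 1e-9)

if speed > 1000:  # faster than any commercial flight; arbitrary cutoff
    print(f"Impossible travel for {first['user']['name']}: {speed:,.0f} km/h")
```

Because both events share one shape, the correlation itself is a few lines of arithmetic; without a shared schema, most of this code would instead be field-name translation.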

Advertisement

Setting up a system that can correlate those events, however, is no easy task: Different tools describe the same idea with different fields, nesting structures, and assumptions. OCSF was built to lower this tax. It helps vendors map their own schemas into a common model and helps customers move data through lakes, pipelines, and security information and event management (SIEM) tools without requiring time-consuming translation at every hop.
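
As a rough illustration of the vendor-side mapping, the sketch below renames fields from a hypothetical vendor login record into OCSF-style nested fields. The vendor field names are invented for illustration; real mappings also normalize types and enum values.

```python
# A sketch of mapping a hypothetical vendor login record into OCSF-style
# nested fields. The vendor field names on the left are invented.
VENDOR_TO_OCSF = {
    "evt_time": "time",
    "username": "user.name",
    "client_ip": "src_endpoint.ip",
}

def set_path(obj: dict, dotted: str, value) -> None:
    """Insert value at a dotted path, creating nested dicts as needed."""
    *parents, leaf = dotted.split(".")
    for key in parents:
        obj = obj.setdefault(key, {})
    obj[leaf] = value

def to_ocsf(raw: dict) -> dict:
    event = {"class_uid": 3002, "category_uid": 3}  # Authentication / IAM
    for vendor_field, ocsf_path in VENDOR_TO_OCSF.items():
        if vendor_field in raw:
            set_path(event, ocsf_path, raw[vendor_field])
    return event

print(to_ocsf({"evt_time": 1730456400000, "username": "jdoe", "client_ip": "203.0.113.7"}))
# {'class_uid': 3002, 'category_uid': 3, 'time': 1730456400000,
#  'user': {'name': 'jdoe'}, 'src_endpoint': {'ip': '203.0.113.7'}}
```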

The last two years have been unusually fast

Most of OCSF’s visible acceleration has happened in the last two years. The project was announced in August 2022 by AWS and Splunk, building on schema work contributed by Broadcom’s Symantec, with initial members including Cloudflare, CrowdStrike, IBM, Okta, Palo Alto Networks, Rapid7, Salesforce, Securonix, Sumo Logic, Tanium, Trend Micro, and Zscaler.

(Image: The OCSF community has kept up a steady cadence of releases over the last two years.)

The community has grown quickly. AWS said in August 2024 that OCSF had expanded from a 17-company initiative into a community with more than 200 participating organizations and 800 contributors, a figure that grew to 900 when OCSF joined the Linux Foundation in November 2024.

OCSF is showing up across the industry

In the observability and security space, OCSF is everywhere. AWS Security Lake converts natively supported AWS logs and events into OCSF and stores them in Parquet. AWS AppFabric can output OCSF-normalized audit data. AWS Security Hub findings use OCSF, and AWS publishes an extension for cloud-specific resource details.


Splunk can translate incoming data into OCSF with its Edge Processor and Ingest Processor, and Cribl supports converting streaming data into OCSF and compatible formats.

Palo Alto Networks can forward Strata Logging Service data into Amazon Security Lake in OCSF. CrowdStrike positions itself on both sides of the OCSF pipe, with Falcon data translated into OCSF for Security Lake and Falcon Next-Gen SIEM positioned to ingest and parse OCSF-formatted data. OCSF is one of those rare standards that has crossed the chasm from abstract specification to standard operational plumbing across the industry.

AI is giving the OCSF story fresh urgency

When enterprises deploy AI infrastructure, large language models (LLMs) sit at the core, surrounded by complex distributed systems such as model gateways, agent runtimes, vector stores, tool calls, retrieval systems, and policy engines. These components generate new forms of telemetry, much of which spans product boundaries. Security teams across the SOC are increasingly focused on capturing and analyzing this data. The central question often becomes what an agentic AI system actually did, rather than only the text it produced, and whether its actions led to any security breaches.

That puts more pressure on the underlying data model. An AI assistant that calls the wrong tool, retrieves the wrong data, or chains together a risky sequence of actions creates a security event that needs to be understood across systems. A shared security schema becomes more valuable in that world, especially when AI is also being used on the analytics side to correlate more data, faster.


For OCSF, 2025 was all about AI

Imagine a company uses an AI assistant to help employees look up internal documents and trigger tools like ticketing systems or code repositories. One day, the assistant starts pulling the wrong files, calling tools it should not use, and exposing sensitive information in its responses.

Updates in OCSF versions 1.5.0, 1.6.0, and 1.7.0 help security teams piece together what happened by flagging unusual behavior, showing who had access to the connected systems, and tracing the assistant’s tool calls step by step. Instead of only seeing the final answer the AI gave, the team can investigate the full chain of actions that led to the problem.
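
As a toy illustration of that kind of investigation, here's a sketch that replays recorded assistant actions in order and flags the one that fell outside policy. The event shape is invented for illustration and is not the exact schema of those OCSF releases.

```python
# A toy sketch of replaying an AI assistant's recorded actions in order.
# The event shape is invented; it is not the exact OCSF 1.5.0-1.7.0 schema.
events = [
    {"seq": 1, "kind": "retrieval", "target": "hr-handbook.pdf", "allowed": True},
    {"seq": 2, "kind": "tool_call", "target": "ticketing.create", "allowed": True},
    {"seq": 3, "kind": "tool_call", "target": "repo.read_secrets", "allowed": False},
]

for event in sorted(events, key=lambda e: e["seq"]):
    flag = "" if event["allowed"] else "  <-- outside policy"
    print(f'{event["seq"]}. {event["kind"]} -> {event["target"]}{flag}')
```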

What’s on the horizon

Imagine a company uses an AI customer support bot, and one day the bot begins giving long, detailed answers that include internal troubleshooting guidance meant only for staff. With the kinds of changes being developed for OCSF 1.8.0, the security team could see which model handled the exchange, which provider supplied it, what role each message played, and how the token counts changed across the conversation.

A sudden spike in prompt or completion tokens could signal that the bot was fed an unusually large hidden prompt, pulled in too much background data from a vector database, or generated an overly long response that increased the chance of sensitive information leaking. That gives investigators a practical clue about where the interaction went off course, instead of leaving them with only the final answer.
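
Here's a minimal sketch of what such a spike check could look like. The per-message token fields are hypothetical stand-ins for the kind of metadata described for OCSF 1.8.0, and the z-score cutoff is arbitrary.

```python
# A minimal sketch of flagging a token-count spike in a bot's traffic.
# The per-message fields are hypothetical stand-ins for the kind of
# metadata being developed for OCSF 1.8.0; the cutoff is arbitrary.
from statistics import mean, stdev

messages = [
    {"model": "support-bot", "prompt_tokens": 410, "completion_tokens": 120},
    {"model": "support-bot", "prompt_tokens": 395, "completion_tokens": 145},
    {"model": "support-bot", "prompt_tokens": 8200, "completion_tokens": 2900},  # suspicious
]

totals = [m["prompt_tokens"] + m["completion_tokens"] for m in messages]
baseline = totals[:-1]  # treat earlier traffic as the baseline
mu, sigma = mean(baseline), stdev(baseline)

latest = totals[-1]
z = (latest - mu) / sigma if sigma else float("inf")
if z > 3:
    print(f"Token spike: {latest} tokens vs. baseline mean {mu:.0f} (z = {z:.1f})")
```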


Why this matters to the broader market

The bigger story is that OCSF has moved quickly from being a community effort to becoming a real standard that security products use every day. Over the past two years, it has gained stronger governance, frequent releases, and practical support across data lakes, ingest pipelines, SIEM workflows, and partner ecosystems.

In a world where AI expands the threat landscape through scams, abuse, and new attack paths, security teams can rely on OCSF to connect data from many systems without losing context along the way.

Nikhil Mungel has been building distributed systems and AI teams at SaaS companies for more than 15 years.




AT&T’s New OneConnect Bundles Mobile and Home Internet but There’s a Catch


It’s easier now to stay connected wherever you are, but getting to that point is still complicated. Wireless plans for phones and home internet plans are typically two separate things, with some crossover or discounts if you get them from the same provider.

AT&T OneConnect puts wireless and home service together in one bundle, with unlimited mobile data for up to 10 voice lines and gigabit broadband at home. However, it's limited to new AT&T customers. Here's how the details break down.

OneConnect offers three pricing tiers, billed monthly:

  • Individual — $90: One member, one voice line, up to three data devices and one household with 1Gbps internet.

  • Duo — $120: Two members, two voice lines, up to six data devices and one household with 1Gbps internet.

  • Family — $225: Unlimited members, up to 10 voice lines, up to 10 data devices and one household with 1Gbps internet.

One notable detail is that the OneConnect subscription prices listed above include taxes and fees, a practice that's becoming increasingly rare among major carriers. On many plans, including AT&T's newest wireless plans, those costs are added on top.

For comparison, an AT&T bundle for two people with unlimited wireless and gigabit-speed home internet would cost about $225, including two lines on the AT&T Premium 2.0 plan and AT&T Internet 1000 fiber at $65. For one person, a single Premium 2.0 wireless plan costs $90, plus $65 for home fiber, a total of about $155 versus OneConnect's $90 Individual tier. (It's also important to note that speeds and availability vary depending on your location.)

As with any new connection plan, you’ll want to scrutinize the details so you know what you’re getting into.

For instance, OneConnect is currently limited to new customers; existing AT&T customers have no migration path to combine their broadband and wireless services under this digital umbrella. According to an AT&T spokesperson, “Once we gather customer feedback and validate the experience with our initial cohort, we will make OneConnect available to as many customers as possible.”


It’s also entirely BYOD — or ‘bring your own device’: “Limited to bring your own eSIM compatible, unlocked smartphones, tablets, and wearables,” reads the fine print on AT&T’s press statement. There are no phone deals tied to OneConnect, although the spokesperson didn’t rule out that possibility in the future.

Unlike AT&T’s standalone wireless plans, OneConnect follows a one-size-fits-all model. One benefit of AT&T mobile service is that each person on an account can select their own plan. For instance, a parent might choose AT&T Premium 2.0, while a teen could opt for the cheaper but more limited AT&T Value 2.0.

Other major carriers offer home internet and mobile service bundles, but they’re not packaged in the same way. Verizon and T-Mobile, for example, provide discounts if you’ve signed up for both types of plans. 

AT&T is betting that account owners will want a simpler, bundled service instead of two separate plans. With unlimited talk, texting, data and AT&T’s Active Armor service for filtering out unwanted calls and texts, that’s a size that does seem to fit all.



Lucid blames dip in Q1 sales on seat supplier issue


Lucid Group finished 2025 on an upswing — building twice as many EVs as the previous year and reporting a 55% uptick in sales. Then the first quarter of 2026 arrived.

The company, which makes the Air sedan and Gravity SUV, reported Friday that it sold 3,093 vehicles in the first quarter, a 42% drop from the previous quarter and about 0.5% lower than the same period last year. It had built many more, about 5,500 in total.

Lucid said the sales dip and the gap between production and deliveries are not signs of a demand problem. Instead, the company blames a supplier quality issue with its second-row seats, which disrupted deliveries of the Lucid Gravity for 29 days.

The supplier issue also prompted Lucid to recall more than 4,000 Gravity SUVs. Lucid told the National Highway Traffic Safety Administration that it discovered some of the anchors for the SUV’s second-row seat belts were not properly welded.


Lucid spokesperson Nick Twork confirmed to TechCrunch that the decrease in sales was tied to problems with the supplier. He said that due to an unapproved change made by a supplier, the company issued a stop on Gravity sales that lasted most of February to ensure proper vehicle quality before restarting them. Twork made a point of noting Lucid’s more recent success, saying that “following eight record quarters, we showed strong results in both January and March which very nearly achieved year-over-year growth on their own.” 

Lucid said in its securities filing Friday that the issue has been addressed, and the company seems confident that disruption won’t affect its production goals.

Lucid reaffirmed its previously announced production guidance of between 25,000 and 27,000 vehicles this year. Against the 18,378 EVs it built in 2025, that guidance would represent an increase of as much as 47%.


Lucid’s seat supplier troubles come as the company prepares to start building its first vehicle on a new lower-cost platform aimed at the mass market. Lucid has said that first vehicle will cost around $50,000, a price point that will put it in direct competition with the upcoming Rivian R2 SUV, as well as existing products like the Tesla Model Y, Tesla Model 3, and Chevrolet Equinox EV.



Seattle VR gaming studio Polyarc announces ‘significant’ layoffs


Moss: Book II. (Polyarc screenshot)

Polyarc, the Seattle-based VR gaming developer behind the award-winning Moss series, announced that it’s had to “significantly reduce the size of the company.”

The announcement, via LinkedIn on Monday, notes that the layoffs come after “an unsuccessful team-wide effort to secure funding following the cancellation of a major project.”

The company, which had around 52 employees according to LinkedIn, did not specify how many were affected, but said it plans to share a spreadsheet with information about those who were impacted to help them make connections in new job searches.

Polyarc was founded in 2016 by Tam Armstrong, Danny Bulla, and Chris Alderson, all three of whom had formerly worked on Destiny at Bungie. The studio’s debut project, the fantasy adventure Moss, came out in 2018 to critical and commercial success, which led to both a 2021 sequel and a multiplayer spinoff in 2023.

The Moss series, which began as exclusives for the PlayStation VR platform before going multiplatform, is on the short list of candidates for VR gaming’s “killer app.” In Moss, players take the role of a Reader, an unseen individual who discovers a magical book in a forgotten library. That book allows you to watch and affect events in the fantasy world of Moss, where a young mouse named Quill is on a quest to save her kingdom.


The problem for the VR market, however, is that much of it is driven by Meta, and Meta has been steadily stepping back from its VR endeavors for the better part of the last couple of years. In January, another wave of VR layoffs at Meta closed several studios and dramatically lowered the headcount at Bellevue, Wash.-based Camouflaj.

There are still major players in VR gaming, such as Valve and its upcoming Steam Frame. It’d be a mistake to say the sector is dead or dying, but Meta drove so much of the conversation around VR that its slow abandonment has destabilized the format.

In addition, the last three years have been a tough time to work in the video game industry, as numerous companies have been forced to slim or shut down. Other recently affected studios in the Pacific Northwest include Phoenix Labs, Monolith Productions, and Rec Room.



Amazon meets FedEx Office: A seamless return and one very dumb question about stamps


A microchip pet door awaits its fate at the FedEx Office on NW 46th Street in Seattle. (GeekWire Photo / Todd Bishop)

For a while now, since the closure of the Amazon Fresh Pickup in Seattle, I’ve been complaining about having to drive across the Ballard Bridge to Whole Foods to do my Amazon returns. 

So when news emerged that FedEx Office locations are now part of Amazon’s drop-off network, I jumped at the chance to try it. Turns out there’s a FedEx Office on the way to GeekWire HQ, near the PCC on NW 46th Street (across from the “Up” house in the Ballard Blocks complex).

I walked in with a microchip pet door (long story), showed the QR code on my phone, got it scanned, handed over the unpackaged item, and walked out with a receipt. No box, tape, or label required, just as with other drop-off locations. There was no line.

The refund hit my account the same day.

The one thing that made me scratch my head is that, unlike returning something at a Kohl’s or Whole Foods, there’s no real ancillary benefit for FedEx Office. I dropped off the package and there was nothing else to do in the store. I had no copies to make, nothing to ship, and no need for any of the miscellaneous supplies in their limited displays.


However, I was in need of traditional U.S. Postal Service stamps, so I asked if they sold them, and the guy looked at me like I was a complete idiot. Fair enough.

But for pure convenience, it seems like a win for Amazon customers. 

Amazon and FedEx severed their logistics relationship back in 2019 as Amazon built out its own delivery network. Now they’re patching things up, and more than 1,500 FedEx Office locations are accepting returns as part of a network of over 10,000 drop-off points nationwide. 

We discussed this (and much more) on this week’s GeekWire Podcast. Listen above, and subscribe to GeekWire in Apple Podcasts, Spotify, or wherever you listen.



KeeperDB brings zero-trust database access to privileged access management


Database credentials remain one of the most common attack vectors in enterprise breaches, yet most organisations still manage them through shared spreadsheets, hardcoded connection strings, or standalone credential vaults with no session oversight. Keeper Security, the Chicago-based cybersecurity company best known for its password management platform, is attempting to close that gap with KeeperDB, a new capability that embeds database access controls directly into its privileged access management (PAM) platform.

The product was announced at RSA Conference 2026 in San Francisco, where Keeper also collected 18 industry awards across categories including password management, privileged access management, and zero-trust security.

What KeeperDB actually does

KeeperDB adds a vault-native database access interface to KeeperPAM, Keeper’s unified privileged access management platform. In practical terms, this means developers, database administrators, and security teams can connect to MySQL, PostgreSQL, Oracle, and Microsoft SQL Server databases directly from the Keeper Vault, without exposing credentials in plaintext or relying on separate database management tools.

Every database session is governed by centralised policies, with full session recording for audit and compliance purposes. The idea is straightforward: if organisations already store their passwords, secrets, and privileged credentials in Keeper, database access should live there too, rather than requiring a separate tool with its own credential store.


“KeeperDB represents a natural evolution of our zero-trust architecture,” said Darren Guccione, CEO and co-founder of Keeper Security. “By embedding database access directly into the vault, we eliminate the credential sprawl that creates risk in most enterprise environments.”


The credential sprawl problem

The challenge KeeperDB addresses is well documented. Database credentials in most organisations are scattered across configuration files, environment variables, CI/CD pipelines, and individual developer machines. When an employee leaves or a credential is compromised, tracking down every instance of that credential becomes an exercise in archaeology.

Traditional database access tools compound the problem. Each tool maintains its own connection profiles and saved credentials, creating multiple copies of sensitive information outside any centralised governance framework. For organisations subject to SOC 2, HIPAA, PCI DSS, or similar compliance requirements, this fragmentation makes audit preparation significantly more time-consuming.

KeeperDB’s approach consolidates database access under the same zero-knowledge encryption and policy engine that already governs passwords, SSH keys, API tokens, and remote desktop sessions in KeeperPAM. Credentials are never exposed to users in plaintext, access is granted based on role-based policies, and every query session is recorded.

Proxy mode for existing workflows

Recognising that many teams have established workflows with existing database clients, Keeper is also introducing KeeperDB Proxy. This companion feature allows developers to continue using their preferred tools (pgAdmin, MySQL Workbench, DBeaver, and similar clients) while routing connections through Keeper’s infrastructure. The proxy maintains centralised policy enforcement, credential protection, and session visibility without requiring teams to abandon their existing tooling.
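
To make the proxy idea concrete, here's a rough sketch of what a connection through a credential-injecting local proxy can look like from the client's side. The host, port, and placeholder session identity are hypothetical; this illustrates the general pattern KeeperDB Proxy describes, not Keeper's actual configuration or API.

```python
# A rough sketch of the proxy pattern: the client keeps its usual
# PostgreSQL tooling but points at a local proxy endpoint instead of the
# database, so the real credential never touches the developer machine.
# Host, port, and the placeholder identity below are hypothetical; this
# is not Keeper's actual configuration or API.
import psycopg2  # standard PostgreSQL client library

conn = psycopg2.connect(
    host="localhost",    # local proxy endpoint, not the real DB host
    port=5433,           # port the hypothetical proxy listens on
    dbname="orders",
    user="dev-session",  # short-lived identity the proxy recognizes
    password="",         # real credential injected upstream by the proxy
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT current_user, now()")
    print(cur.fetchone())
```

To the client library, the proxy looks like just another PostgreSQL server, which is why tools such as pgAdmin and DBeaver can keep working unchanged.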


This is a pragmatic concession. Asking database administrators to switch from tools they have used for years is a reliable way to generate friction and reduce adoption. By offering both a native vault interface and a proxy mode, Keeper is betting that organisations will adopt whichever path creates the least disruption.

A broader PAM strategy

KeeperDB is the latest addition to a platform that has expanded considerably beyond its password management origins. KeeperPAM now includes password and passkey management, secrets management for DevOps and CI/CD pipelines, privileged session management with recording, remote browser isolation, secure remote desktop and SSH access via Keeper Connection Manager, and now database access.

The company’s strategy is to consolidate multiple point solutions into a single platform with a single credential store and a single policy engine. For managed service providers (MSPs), Keeper announced a revamped 2026 partner programme in February with tiered discounts and expanded enablement resources, suggesting that the mid-market and channel are key growth targets alongside direct enterprise sales.

The F1 connection

Keeper’s RSAC presence coincided with the company’s broader visibility push. Now in its third season as the official cybersecurity partner of the Atlassian Williams F1 Team, Keeper launched a global advertising campaign in March 2026 featuring driver Alex Albon. The campaign, filmed during pre-season testing in Bahrain, draws parallels between the real-time data protection required in Formula 1 operations and the identity-first security model that Keeper promotes for enterprise environments.


Williams uses KeeperPAM to protect passwords, infrastructure secrets, and privileged accounts both at its Grove headquarters and trackside, where race strategy, telemetry, and engineering systems depend on tightly controlled access to sensitive data.

What this signals

The broader trend KeeperDB reflects is the continued consolidation of identity and access management tools. Organisations that once maintained separate solutions for password management, secrets management, privileged access, remote connectivity, and database access are increasingly looking for unified platforms that reduce complexity and the number of credential stores to protect.

Keeper is not the only vendor pursuing this strategy. CyberArk, BeyondTrust, and Delinea have all expanded their PAM platforms in recent years. What distinguishes Keeper’s approach is its zero-knowledge architecture (meaning Keeper’s own servers cannot access customer data) and its consumer-grade user experience, which the company argues drives higher adoption rates than traditional enterprise PAM tools.

KeeperDB is available now for KeeperPAM customers, with support for MySQL, PostgreSQL, Oracle, and Microsoft SQL Server. KeeperDB Proxy is expected to follow in a subsequent release.



The Dreame Miracle Pro finally gives my scalp and hair the attention they deserve



Dreame Miracle Pro: two-minute review

The Dreame Miracle Pro is a premium dryer that does a lot more than just dry your hair. Alongside six modes — Cool, Scalp, Essence, Comfort, Quick Dry, AI Smart — it comes with a built-in essence mister, a ring of red and near-infrared light therapy around the barrel, and a distance sensor that automatically adjusts heat and airflow depending on how close the dryer is to your head.



Open Graphics Card Powers Cyberpunk “Laptop”


For once, we can avoid debating in the comments what constitutes a “cyberdeck”, because [LCLDIY] does not refer to his cyberpunk masterpiece as such: he calls it a laptop. Considering the form factor is more like an all-in-one with a built-in laser projection keyboard, that’s arguably an even more controversial label to use, but as stylish as this build is, it’s what’s inside it that interests us most.

This would be much easier than the original for our old eyes, especially in the dark.

No, not the cash-register motherboard that serves as the brain, though that has got to be worth some hacker cred. No, it’s the graphics card [LCLDIY] designed to drive 10″ electroluminescent (EL) displays that really has us interested. EL screens have a unique and beautiful glow that many find captivating, but we don’t see them all that often for two reasons. One is price: if you can’t find them surplus, they’re not cheap. The other is driving them, which [LCLDIY]’s project helps with, because the graphics card is open source.

The card is PCI, so you’ll need an adapter to plug it into a modern PCIe slot, or you’d have to redesign the thing. Since this isn’t elegant-engineering-a-day, we know which we’d do. The card is based on the CHIPS65548/5 chip, which means you should be able to find driver support under Linux and Windows. [LCLDIY] seems to be using Windows 2000, but that might just be because it’s all been downhill since then.

If the cyberpunk laptop wasn’t enough inspiration, [LCLDIY] also created a giant-scale Game Boy using the same 10″ screen and DIY graphics card. The soft glow of the EL display is particularly suited to the low-res nature of the retro games, as it’s not entirely unlike a CRT. You can see both builds in action in the videos embedded below.

The last time somebody posted an EL display here, they had to build the driver board for it, too.
