In late 2024, the federal government’s cybersecurity evaluators rendered a troubling verdict on one of Microsoft’s biggest cloud computing offerings.
The tech giant’s “lack of proper detailed security documentation” left reviewers with a “lack of confidence in assessing the system’s overall security posture,” according to an internal government report reviewed by ProPublica.
Or, as one member of the team put it: “The package is a pile of shit.”
For years, reviewers said, Microsoft had tried and failed to fully explain how it protects sensitive information in the cloud as it hops from server to server across the digital terrain. Given that and other unknowns, government experts couldn’t vouch for the technology’s security.
Such judgments would be damning for any company seeking to sell its wares to the U.S. government, but it should have been particularly devastating for Microsoft. The tech giant’s products had been at the heart of two major cybersecurity attacks against the U.S. in three years. In one, Russian hackers exploited a weakness to steal sensitive data from a number of federal agencies, including the National Nuclear Security Administration. In the other, Chinese hackers infiltrated the email accounts of a Cabinet member and other senior government officials.
The federal government could be further exposed if it couldn’t verify the cybersecurity of Microsoft’s Government Community Cloud High, a suite of cloud-based services intended to safeguard some of the nation’s most sensitive information.
Yet, in a highly unusual move that still reverberates across Washington, the Federal Risk and Authorization Management Program, or FedRAMP, authorized the product anyway, bestowing what amounts to the federal government’s cybersecurity seal of approval. FedRAMP’s ruling — which included a kind of “buyer beware” notice to any federal agency considering GCC High — helped Microsoft expand a government business empire worth billions of dollars.
“BOOM SHAKA LAKA,” Richard Wakeman, one of the company’s chief security architects, boasted in an online forum, celebrating the milestone with a meme of Leonardo DiCaprio in “The Wolf of Wall Street.” Wakeman did not respond to requests for comment.
It was not the type of outcome that federal policymakers envisioned a decade and a half ago when they embraced the cloud revolution and created FedRAMP to help safeguard the government’s cybersecurity. The program’s layers of review, which included an assessment by outside experts, were supposed to ensure that service providers like Microsoft could be entrusted with the government’s secrets. But ProPublica’s investigation — drawn from internal FedRAMP memos, logs, emails, meeting minutes, and interviews with seven former and current government employees and contractors — found breakdowns at every juncture of that process. It also found a remarkable deference to Microsoft, even as the company’s products and practices were central to two of the most damaging cyberattacks ever carried out against the government.
FedRAMP first raised questions about GCC High’s security in 2020 and asked Microsoft to provide detailed diagrams explaining its encryption practices. But when the company produced what FedRAMP considered to be only partial information in fits and starts, program officials did not reject Microsoft’s application. Instead, they repeatedly pulled punches and allowed the review to drag out for the better part of five years. And because federal agencies were allowed to deploy the product during the review, GCC High spread across the government as well as the defense industry. By late 2024, FedRAMP reviewers concluded that they had little choice but to authorize the technology — not because their questions had been answered or their review was complete, but largely on the grounds that Microsoft’s product was already being used across Washington.
Today, key parts of the federal government, including the Justice and Energy departments, and the defense sector rely on this technology to protect highly sensitive information that, if leaked, “could be expected to have a severe or catastrophic adverse effect” on operations, assets and individuals, the government has said.
“This is not a happy story in terms of the security of the U.S.,” said Tony Sager, who spent more than three decades as a computer scientist at the National Security Agency and now is an executive at the nonprofit Center for Internet Security.
For years, the FedRAMP process has been equated with actual security, Sager said. ProPublica’s findings, he said, shatter that facade.
“This is not security,” he said. “This is security theater.”
ProPublica is exposing the government’s reservations about this popular product for the first time. We are also revealing Microsoft’s yearslong inability to provide the encryption documentation and evidence the federal reviewers sought.
The revelations come as the Justice Department ramps up scrutiny of the government’s technology contractors. In December, the department announced the indictment of a former employee of Accenture who allegedly misled federal agencies about the security of the company’s cloud platform and its compliance with FedRAMP’s standards. She has pleaded not guilty. Accenture, which was not charged with wrongdoing, has said that it “proactively brought this matter to the government’s attention” and that it is “dedicated to operating with the highest ethical standards.”
Microsoft has also faced questions about its disclosures to the government. As ProPublica reported last year, the company failed to inform the Defense Department about its use of China-based engineers to maintain the government’s cloud systems, despite Pentagon rules stipulating that “No Foreign persons may have” access to its most sensitive data. The department is investigating the practice, which officials say could have compromised national security.
Microsoft has defended its program as “tightly monitored and supplemented by layers of security mitigations,” but after ProPublica’s story published last July, the company announced that it would stop using China-based engineers for Defense Department work.
In response to written questions for this story and in an interview, Microsoft acknowledged the yearslong confrontation with FedRAMP but also said it provided “comprehensive documentation” throughout the review process and “remediated findings where possible.”
“We stand by our products and the comprehensive steps we’ve taken to ensure all FedRAMP-authorized products meet the security and compliance requirements necessary,” a spokesperson said in a statement, adding that the company would “continue to work with FedRAMP to continuously review and evaluate our services for continued compliance.”
The program was an early target of the Trump administration’s Department of Government Efficiency, which slashed its staff and budget. Even FedRAMP acknowledges it is operating “with an absolute minimum of support staff” and “limited customer service.” The roughly two dozen employees who remain are “entirely focused on” delivering authorizations at a record pace, FedRAMP’s director has said. Today, its annual budget is just $10 million, its lowest in a decade, even as it has boasted record numbers of new authorizations for cloud products.
The consequence of all this, people who have worked for FedRAMP told ProPublica, is that the program now is little more than a rubber stamp for industry. The implications of such a downsizing for federal cybersecurity are far-reaching, especially as the administration encourages agencies to adopt cloud-based artificial intelligence tools, which draw upon reams of sensitive information.
The General Services Administration, which houses FedRAMP, defended the program, saying it has undergone “significant reforms to strengthen governance” since GCC High arrived in 2020. “FedRAMP’s role is to assess if cloud services have provided sufficient information and materials to be adequate for agency use, and the program today operates with strengthened oversight and accountability mechanisms to do exactly that,” a GSA spokesperson said in an emailed statement.
The agency did not respond to written questions regarding GCC High.
A “Cloud First” World
About two decades ago, federal officials predicted that the cloud revolution, providing on-demand access to shared computing via the internet, would usher in an era of cheaper, more secure and more efficient information technology.
Moving to the cloud meant shifting away from on-premises servers owned and operated by the government to those in massive data centers maintained by tech companies. Some agency leaders were reluctant to relinquish control, while others couldn’t wait to.
In an effort to accelerate the transition, the Obama administration issued its “Cloud First” policy in 2011, requiring all agencies to implement cloud-based tools “whenever a secure, reliable, cost-effective” option existed. To facilitate adoption, the administration created FedRAMP, whose job was to ensure the security of those tools.
FedRAMP’s “do once, use many times” system was intended to streamline and strengthen the government procurement process. Previously, each agency using a cloud service vetted it separately, sometimes applying different interpretations of federal security requirements. Under the new program, agencies would be able to skip redundant security reviews because FedRAMP authorization indicated that the product had already met standardized requirements. Authorized products would be listed on a government website known as the FedRAMP Marketplace.
On paper, the program was an exercise in efficiency. But in practice, the small FedRAMP team could not keep up with the flood of demand from tech companies that wanted their products authorized.
The slow approval process frustrated both the tech industry, eager for a share in the billions of federal dollars up for grabs, and government agencies that were under pressure to migrate to the cloud. These dynamics sometimes aligned the cloud industry and agency officials against FedRAMP. The backlog also prompted many agencies to take an alternative path: performing their own reviews of the products they wanted to adopt, using FedRAMP's standards.
It was through this “agency path” that GCC High entered the federal bloodstream, with the Justice Department paving the way. Initially, some Justice officials were nervous about the cloud and who might have access to its information, which includes highly sensitive court and law enforcement records, a Justice Department official involved in the decision told ProPublica. The department’s cybersecurity program required it to ensure that only U.S. citizens “access or assist in the development, operation, management, or maintenance” of its IT systems, unless a waiver was granted. Justice’s IT specialists recommended pursuing GCC High, believing it could meet the elevated security needs, according to the official, who spoke on condition of anonymity because they were not authorized to discuss internal matters.
Pursuant to FedRAMP’s rules, Microsoft had GCC High evaluated by a so-called third-party assessment organization, which is supposed to provide an independent review of whether the product has met federal standards. The Justice Department then performed its own evaluation of GCC High using those standards and ruled the offering acceptable.
By early 2020, Melinda Rogers, Justice’s deputy chief information officer, made the decision official and soon deployed GCC High across the department.
It was a milestone for all involved. Rogers had ushered the Justice Department into the cloud, and Microsoft had gained a significant foothold in the cutthroat market for the federal government’s cloud computing business.
Moreover, Rogers’ decision placed GCC High on the FedRAMP Marketplace, the government’s influential online clearinghouse of all the cloud providers that are under review or already authorized. Its mere mention as “in process” was a boon for Microsoft, amounting to free advertising on a website used by organizations seeking to purchase cloud services bearing what is widely seen as the government’s cybersecurity seal of approval.
That April, GCC High landed at FedRAMP’s office for review, the final stop on its bureaucratic journey to full authorization.
Microsoft’s Missing Information
In theory, there shouldn’t have been much for FedRAMP’s team to do after the third-party assessor and Justice reviewed GCC High, because all parties were supposed to be following the same requirements.
But it was around this time that the Government Accountability Office, which investigates federal programs, discovered breakdowns in the process, finding that agency reviews sometimes were lacking in quality. Despite missing details, FedRAMP went on to authorize many of these packages. Acknowledging these shortcomings, FedRAMP began to take a harder look at new packages, a former reviewer said.
This was the environment in which Microsoft’s GCC High application entered the pipeline. The name GCC High was an umbrella covering many services and features within Office 365 that all needed to be reviewed. FedRAMP reviewers quickly noticed key material was missing.
The team homed in on what it viewed as a fundamental document called a “data flow diagram,” former members told ProPublica. The illustration is supposed to show how data travels from Point A to Point B — and, more importantly, how it’s protected as it hops from server to server. FedRAMP requires data to be encrypted while in transit to ensure that sensitive materials are protected even if they’re intercepted by hackers.
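To make the encryption-in-transit requirement concrete: a minimal, illustrative sketch (not taken from FedRAMP's review or Microsoft's systems) of how a client can verify that a single network hop is encrypted, using Python's standard `ssl` module. FedRAMP's data flow diagrams were meant to provide this assurance for every hop in the system, not just the first one.

```python
import socket
import ssl

def check_transit_encryption(host: str, port: int = 443) -> tuple[str, str]:
    """Open a TLS connection to host:port and report the negotiated
    protocol version and cipher suite for this one hop."""
    context = ssl.create_default_context()  # verifies certificates by default
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            # version() and cipher() describe the protection on this hop only;
            # a complete data flow diagram must account for every such hop,
            # including internal server-to-server legs the client never sees.
            return tls.version(), tls.cipher()[0]
```

A client can see its own hop is protected; what the FedRAMP reviewers lacked was the equivalent visibility into the hops inside Microsoft's cloud.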
But when the FedRAMP team asked Microsoft to produce the diagrams showing how such encryption would happen for each service in GCC High, the company balked, saying the request was too challenging. So the reviewers suggested starting with just Exchange Online, the popular email platform.
“This was our litmus test to say, ‘This isn’t the only thing that’s required, but if you’re not doing this, we are not even close yet,’” said one reviewer who spoke on condition of anonymity because they were not authorized to discuss internal matters. Once they reached the appropriate level of detail, they would move from Exchange to other services within GCC High.
It was the kind of detail that other major cloud providers such as Amazon and Google routinely provided, members of the FedRAMP team told ProPublica. Yet Microsoft took months to respond. When it did, the former reviewer said, it submitted a white paper that discussed GCC High’s encryption strategy but left out the details of where on the journey data actually becomes encrypted and decrypted — so FedRAMP couldn’t assess that it was being done properly.
A Microsoft spokesperson acknowledged that the company had “articulated a challenge related to illustrating the volume of information being requested in diagram form” but “found alternate ways to share that information.”
Rogers, who was hired by Microsoft in 2025, declined to be interviewed. In response to emailed questions, the company provided a statement saying that she “stands by the rigorous evaluation that contributed to” her authorization of GCC High. A spokesperson said there was “absolutely no connection” between her hiring and the decisions in the GCC High process, and that she and the company complied with “all rules, regulations, and ethical standards.”
The Justice Department declined to respond to written questions from ProPublica.
A Fight Over “Spaghetti Pies”
As 2020 came to a close, a national security crisis hit Washington that underscored the consequences of cyber weakness. Russian state-sponsored hackers had been quietly working their way through federal computer systems for much of the year and vacuuming up sensitive data and emails from U.S. agencies — including the Justice Department.
At the time, most of the blame fell on a Texas-based company called SolarWinds, whose software provided hackers their initial opening and whose name became synonymous with the attack. But, as ProPublica has reported, the Russians leveraged that opening to exploit a long-standing weakness in a Microsoft product — one that the company had refused to fix for years, despite repeated warnings from one of its engineers. Microsoft has defended its decision not to address the flaw, saying that it received “multiple reviews” and that the company weighs a variety of factors when making security decisions.
In the aftermath, the Biden administration took steps to bolster the nation’s cybersecurity. Among them, the Justice Department announced a cyber-fraud initiative in 2021 to crack down on companies and individuals that “put U.S. information or systems at risk by knowingly providing deficient cybersecurity products or services, knowingly misrepresenting their cybersecurity practices or protocols, or knowingly violating obligations to monitor and report cybersecurity incidents and breaches.”
Deputy Attorney General Lisa Monaco said the department would use the False Claims Act to pursue government contractors “when they fail to follow required cybersecurity standards — because we know that puts all of us at risk.”
But if Microsoft felt any pressure from the SolarWinds attack or from the Justice Department’s announcement, it didn’t manifest in the FedRAMP talks, according to former members of the FedRAMP team.
The discourse between FedRAMP and Microsoft fell into a pattern. The parties would meet. Months would go by. Microsoft would return with a response that FedRAMP deemed incomplete or irrelevant. To bolster the chances of getting the information it wanted, the FedRAMP team provided Microsoft with a template, describing the level of detail it expected. But the diagrams Microsoft returned never met those expectations.
“We never got past Exchange,” one former reviewer said. “We never got that level of detail. We had no visibility inside.”
In an interview with ProPublica, John Bergin, the Microsoft official who became the government’s main contact, acknowledged the prolonged back-and-forth but blamed FedRAMP, equating its requests for diagrams to a “rock fetching exercise.”
“We were maybe incompetent in how we drew drawings because there was no standard to draw them to,” he said. “Did we not do it exactly how they wanted? Absolutely. There was always something missing because there was no standard.”
A Microsoft spokesperson said without such a standard, “cloud providers were left to interpret the level of abstraction and representation on their own,” creating “inconsistency and confusion, not an unwillingness to be transparent.”
But even Microsoft’s own engineers had struggled over the years to map the architecture of its products, according to two people involved in building cloud services used by federal customers. At issue, according to people familiar with Microsoft’s technology, was the decades-old code of its legacy software, which the company used in building its cloud services.
One FedRAMP reviewer compared it to a “pile of spaghetti pies.” The data’s path from Point A to Point B, the person said, was like traveling from Washington to New York with detours by bus, ferry and airplane rather than just taking a quick ride on Amtrak. And each one of those detours represents an opportunity for a hijacking if the data isn’t properly encrypted.
Other major cloud providers such as Amazon and Google built their systems from the ground up, said Sager, the former NSA computer scientist, who worked with all three companies during his time in government.
Microsoft’s system is “not designed for this kind of isolation of ‘secure’ from ‘not secure,’” Sager said.
A Microsoft spokesperson acknowledged the company faces a unique challenge but maintained that its cloud products meet federal security requirements.
“Unlike providers that started later with a narrower product scope, Microsoft operates one of the broadest enterprise and government platforms in the world, supporting continuity for millions of customers while simultaneously modernizing at scale,” the spokesperson said in emailed responses. “That complexity is not ‘spaghetti,’ but it does mean the work of disentangling, isolating, and hardening systems is continuous.”
The spokesperson said that since 2023, Microsoft has made “security‑first architectural redesign, legacy risk reduction, and stronger isolation guarantees a top, company‑wide priority.”
Assessors Back-Channel Cyber Concerns
The FedRAMP team was not the only party with reservations about GCC High. Microsoft’s third-party assessment organizations also expressed concerns.
The firms are supposed to be independent but are hired and paid by the company being assessed. Acknowledging the potential for conflicts of interest, FedRAMP has encouraged the assessment firms to confidentially back-channel to its reviewers any negative feedback that they were unwilling to bring directly to their clients or reflect in official reports.
In 2020, two third-party assessors hired by Microsoft, Coalfire and Kratos, did just that. They told FedRAMP that they were unable to get the full picture of GCC High, a former FedRAMP reviewer told ProPublica.
“Coalfire and Kratos both readily admitted that it was difficult to impossible to get the information required out of Microsoft to properly do a sufficient assessment,” the reviewer told ProPublica.
The back channel helped surface cybersecurity issues that otherwise might never have been known to the government, people who have worked with and for FedRAMP told ProPublica. At the same time, they acknowledged its existence undermined the very spirit and intent of having independent assessors.
A spokesperson for Coalfire, the firm that initially handled the GCC High assessment, requested written questions from ProPublica, then declined to respond.
A spokesperson for Kratos, which replaced Coalfire as the GCC High assessor, declined an interview request. In an emailed response to written questions, the spokesperson said the company stands by its official assessment and recommendation of GCC High and “absolutely refutes” that it “ever would sign off on a product we were unable to fully vet.” The company “has open and frank conversations” with all customers, including Microsoft, which “submitted all requisite diagrams to meet FedRAMP-defined requirements,” the spokesperson said.
Kratos said it “spent extensive time working collaboratively with FedRAMP in their review” and does not consider such discussions to be “backchanneling.”
FedRAMP, however, was dissatisfied with Kratos’ ongoing work and believed the firm “should be pushing back” on Microsoft more, the former reviewer said. It placed Kratos on a “corrective action plan,” which could eventually result in loss of accreditation. The company said it did not agree with FedRAMP’s action but provided “additional trainings for some internal assessors” in response to it.
The Microsoft spokesperson told ProPublica the company has “always been responsive to requests” from Kratos and FedRAMP. “We are not aware of any backchanneling, nor do we believe that backchanneling would have been necessary given our transparency and cooperation with auditor requests,” the spokesperson said.
In response to questions from ProPublica about the process, the GSA said in an email that FedRAMP’s system “does not create an inherent conflict of interest for professional auditors who meet ethical and contractual performance expectations.”
GSA did not respond to questions about back-channeling but said the “correct process” is for a third-party assessor to “state these problems formally in a finding during the security assessment so that the cloud service provider has an opportunity to fix the issue.”
FedRAMP Ends Talks
The back-and-forth between the FedRAMP reviewers and Microsoft’s team went on for years with little progress. Then, in the summer of 2023, the program’s interim director, Brian Conrad, got a call from the White House that would alter the course of the review.
Chinese state-sponsored hackers had infiltrated GCC, the lower-cost version of Microsoft’s government cloud, and stolen data and emails from the commerce secretary, the U.S. ambassador to China and other high-ranking government officials. In the aftermath, Chris DeRusha, the White House’s chief information security officer, wanted a briefing from FedRAMP, which had authorized GCC.
The decision predated Conrad’s tenure, but he told ProPublica that he left the conversation with several takeaways. First, FedRAMP must hold all cloud providers — including Microsoft — to the same standards. Second, he had the backing of the White House in standing firm. Finally, FedRAMP would feel the political heat if any cloud service with a FedRAMP authorization were hacked.
DeRusha confirmed Conrad’s account of the phone call but declined to comment further.
Within months, Conrad informed Microsoft that FedRAMP was ending the engagement on GCC High.
“After three years of collaboration with the Microsoft team, we still lack visibility into the security gaps because there are unknowns that Microsoft has failed to address,” Conrad wrote in an October 2023 email. This, he added, was not for FedRAMP’s lack of trying. Staffers had spent 480 hours of review time, had conducted 18 “technical deep dive” sessions and had numerous email exchanges with the company over the years. Yet they still lacked the data flow diagrams, crucial information “since visibility into the encryption status of all data flows and stores is so important,” he wrote.
If Microsoft still wanted FedRAMP authorization, Conrad wrote, it would need to start over.
A FedRAMP reviewer, explaining the decision to the Justice Department, said the team was “not asking for anything above and beyond what we’ve asked from every other” cloud service provider, according to meeting minutes reviewed by ProPublica. But the request was particularly justified in Microsoft’s case, the reviewer told the Justice officials, because “each time we’ve actually been able to get visibility into a black box, we’ve uncovered an issue.”
“We can’t even quantify the unknowns, which makes us very uncomfortable,” the reviewer said, according to the minutes.
Microsoft and the Justice Department Push Back
Microsoft was furious. Failing to obtain authorization and starting the process over would signal to the market that something was wrong with GCC High. Customers were already confused and concerned about the drawn-out review, which had become a hot topic in an online forum used by government and technology insiders. There, Wakeman, the Microsoft cybersecurity architect, deflected blame, saying the government had been “dragging their feet on it for years now.”
Meanwhile, to build support for Microsoft’s case, Bergin, the company’s point person for FedRAMP and a former Army official, reached out to government leaders, including one from the Justice Department.
The Justice official, who spoke on condition of anonymity because they were not authorized to discuss the matter, said Bergin complained that the delay was hampering Microsoft’s ability “to get this out into the market full sail.” Bergin then pushed the Justice Department to “throw around our weight” to help secure FedRAMP authorization, the official said.
That December, as the parties gathered to hash things out at GSA’s Washington headquarters, Justice did just that. Rogers, who by then had been promoted to the department’s chief information officer, sat beside Bergin — on the opposite side of the table from Conrad, the FedRAMP director.
Rogers and her Justice colleagues had a stake in the outcome. Since authorizing and deploying GCC High, she had received accolades for her work modernizing the department's IT and cybersecurity. But without FedRAMP's stamp of approval, she would be the government official left holding the bag if GCC High were involved in a serious hack. At the same time, the Justice Department couldn't easily back out of using GCC High because once a technology is widely deployed, pulling the plug can be costly and technically challenging. And from its perspective, the cloud was an improvement over the old government-run data centers.
Shortly after the meeting kicked off, Bergin interrupted a FedRAMP reviewer who had been presenting PowerPoint slides. He said the Justice Department and third-party assessor had already reviewed GCC High, according to meeting minutes. FedRAMP “should essentially just accept” their findings, he said.
Then, in a shock to the FedRAMP team, Rogers backed him up and went on to criticize FedRAMP’s work, according to two attendees.
In its statement, Microsoft said Rogers maintains that FedRAMP’s approach “was misguided and improperly dismissed the extensive evaluations performed by DOJ personnel.”
Bergin did not dispute the account, telling ProPublica that he had been trying to argue that it is the purview of third-party assessors such as Kratos — not FedRAMP — to evaluate the security of cloud products. And because FedRAMP must approve the third-party assessment firms, the program should have taken its issues up with Kratos.
“When you are the regulatory agency who determines who the auditors are and you refuse to accept your auditors’ answers, that’s not a ‘me’ problem,” Bergin told ProPublica.
The GSA did not respond to questions about the meeting. The Justice Department declined to comment.
Pressure Mounts on FedRAMP
If there was any doubt about the role of FedRAMP, the White House issued a memorandum in the summer of 2024 that outlined its views. FedRAMP, it said, “must be capable of conducting rigorous reviews” and requiring cloud providers to “rapidly mitigate weaknesses in their security architecture.” The office should “consistently assess and validate cloud providers’ complex architectures and encryption schemes.”
But by that point, GCC High had spread to other federal agencies, with the Justice Department’s authorization serving as a signal that the technology met federal standards.
It also spread to the defense sector, since the Pentagon required that cloud products used by its contractors meet FedRAMP standards. While it did not have FedRAMP authorization, Microsoft marketed GCC High as meeting the requirements, selling it to companies such as Boeing that research, develop and maintain military weapons systems.
But with the FedRAMP authorization up in the air, some contractors began to worry that by using GCC High, they were out of compliance. That could threaten their contracts, which, in turn, could impact Defense Department operations. Pentagon officials called FedRAMP to inquire about the authorization stalemate.
The Defense Department acknowledged but did not respond to written questions from ProPublica.
Rogers also kept pressing FedRAMP to “get this thing over the line,” former employees of the GSA and FedRAMP said. It was the “opinion of the staff and the contractors that she simply was not willing to put heat to Microsoft on this” and that the Justice Department “was too sympathetic to Microsoft’s claims,” Eric Mill, then GSA’s executive director for cloud strategy, told ProPublica.
Authorization Despite a “Damning” Assessment
In the summer of 2024, FedRAMP hired a new permanent director, government technology insider Pete Waterman. Within about a month of taking the job, he restarted the office’s review of GCC High with a new team, which put aside the debate over data flow diagrams and instead attempted to examine evidence from Microsoft. But these reviewers soon arrived at the same conclusion, with the team’s leader complaining about “getting stiff-armed” by Microsoft.
“He came back and said, ‘Yeah, this thing sucks,’” Mill recalled.
While the team was able to work through only two of the many services included in GCC High, Exchange Online and Teams, that was enough for it to identify “issues that are fundamental” to risk management, including “timely remediation of vulnerabilities and vulnerability scanning,” according to a summary of the team’s findings reviewed by ProPublica.
Those issues, as well as a lack of “proper detailed security documentation” from Microsoft, limit “visibility and understanding of the system” and “impair the ability to make informed risk decisions.”
The team concluded, “There is a lack of confidence in assessing the system’s overall security posture.”
A Microsoft spokesperson said in a statement that the company “never received this feedback in any of its communications with FedRAMP.”
When ProPublica read the findings to Bergin, the Microsoft liaison, he said he was surprised.
“That’s pretty damning,” Bergin said, adding that it sounded like language that “would’ve generally been associated with a finding of ‘not worthy.’ If an assessor wrote that, I would be nervous.”
Despite the findings, to the FedRAMP team, turning Microsoft down didn’t seem like an option. “Not issuing an authorization would impact multiple agencies that are already using GCC-H,” the summary document said. The team determined that it was a “better value” to issue an authorization with conditions for continued government oversight.
While authorizations with oversight conditions weren’t unusual, arriving at one under these circumstances was. GCC High reviewers saw problems everywhere, both in what they were able to evaluate and what they weren’t. To them, most of the package remained a vast wilderness of untold risk.
Nevertheless, FedRAMP and Microsoft reached an agreement, and the day after Christmas 2024, GCC High received its FedRAMP authorization. FedRAMP appended a cover report to the package laying out its deficiencies and noting it carried unknown risks, according to people familiar with the report.
It emphasized that agencies should carefully review the package and engage directly with Microsoft on any questions.
“Unknown Unknowns” Persist
Microsoft told ProPublica that it has met the conditions of the agreement and has “stayed within the performance metrics required by FedRAMP” to ensure that “risks are identified, tracked, remediated, and transparently communicated.”
But under the Trump administration, there aren’t many people left at FedRAMP to check.
While the Biden-era guidance said FedRAMP “must be an expert program that can analyze and validate the security claims” of cloud providers, the GSA told ProPublica that the program’s role is “not to determine if a cloud service is secure enough.” Rather, it is “to ensure agencies have sufficient information to make these risk decisions.”
The problem is that agencies often lack the staff and resources to do thorough reviews, which means the whole system is leaning on the claims of the cloud companies and the assessments of the third-party firms they pay to evaluate them. Under the current vision, critics say, FedRAMP has lost the plot.
“FedRAMP’s job is to watch the American people’s back when it comes to sharing their data with cloud companies,” said Mill, the former GSA official, who also co-authored the 2024 White House memo. “When there’s a security issue, the public doesn’t expect FedRAMP to say they’re just a paper-pusher.”
Meanwhile, at the Justice Department, officials are finding out what FedRAMP meant by the “unknown unknowns” in GCC High. Last year, for example, they discovered that Microsoft relied on China-based engineers to service the department’s sensitive cloud systems despite its prohibition against non-U.S. citizens assisting with IT maintenance.
Officials learned about this arrangement — which was also used in GCC High — not from FedRAMP or from Microsoft but from a ProPublica investigation into the practice, according to the Justice employee who spoke with us.
A Microsoft spokesperson acknowledged that the written security plan for GCC High that the company submitted to the Justice Department did not mention foreign engineers, though he said Microsoft did communicate that information to Justice officials before 2020. Nevertheless, Microsoft has since ended its use of China-based engineers in government systems.
Former and current government officials worry about what other risks may be lurking in GCC High and beyond.
The GSA told ProPublica that, in general, “if there is credible evidence that a cloud service provider has made materially false representations, that matter is then appropriately referred to investigative authorities.”
Ironically, the ultimate arbiter of whether cloud providers or their third-party assessors are living up to their claims is the Justice Department itself. The recent indictment of the former Accenture employee suggests it is willing to use this power. In a court document, the Justice Department alleges that the ex-employee made “false and misleading representations” about the cloud platform’s security to help the company “obtain and maintain lucrative federal contracts.” She is also accused of trying to “influence and obstruct” Accenture’s third-party assessors by hiding the product’s deficiencies and telling others to conceal the “true state of the system” during demonstrations, the department said. She has pleaded not guilty.
There is no public indication that such a case has been brought against Microsoft or anyone involved in the GCC High authorization. The Justice Department declined to comment. Monaco, the deputy attorney general who launched the department’s initiative to pursue cybersecurity fraud cases, did not respond to requests for comment.
She left her government position in January 2025. Microsoft hired her to become its president of global affairs.
A company spokesperson said Monaco’s hiring complied with “all rules, regulations, and ethical standards” and that she “does not work on any federal government contracts or have oversight over or involvement with any of our dealings with the federal government.”
A former core infrastructure engineer has pleaded guilty to locking Windows admins out of 254 servers as part of a failed extortion plot targeting his employer, an industrial company headquartered in Somerset County, New Jersey.
According to court documents, 57-year-old Daniel Rhyne from Kansas City, Missouri, remotely accessed the company’s network without authorization using an administrator account between November 9 and November 25.
Throughout this time, he allegedly scheduled tasks on the company’s Windows domain controller to delete network admin accounts and to change the passwords for 13 domain admin accounts and 301 domain user accounts to “TheFr0zenCrew!”.
The prosecutors also accused Rhyne of scheduling tasks to change the passwords for two local admin accounts, which would affect 3,284 workstations, and for two more local admin accounts, which would impact 254 servers on his employer’s network. He also scheduled some tasks to shut down random servers and workstations on the network over multiple days in December 2023.
Subsequently, on November 25, Rhyne sent a number of his coworkers a ransom email titled “Your Network Has Been Penetrated,” saying that all IT administrators had been locked out of their accounts and that server backups had been deleted to make data recovery impossible.
Additionally, the emails threatened to shut down 40 random servers daily over the next ten days unless the company paid a ransom of 20 bitcoin (worth roughly $750,000 at the time).
“On or about November 25, 2023, at approximately 4:00 p.m. EST, network administrators employed at Victim-1 began receiving password reset notifications for a Victim-1 domain administrator account, as well as hundreds of Victim-1 user accounts,” the criminal complaint reads.
“Shortly thereafter, the Victim-1 network administrators discovered that all other Victim-1 domain administrator accounts were deleted, thereby denying domain administrator access to Victim-1’s computer networks.”
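The flood of password-reset notifications described above is exactly the kind of signal a log-monitoring script can surface. Below is a minimal sketch, assuming Windows Security log entries (event ID 4724 marks a password-reset attempt) have already been exported as parsed records; the record structure, window and threshold are illustrative assumptions, not details from the case:

```python
from datetime import datetime, timedelta

# Hypothetical parsed Security log records: 300 reset attempts seconds apart,
# mimicking a mass reset like the one described in the complaint.
EVENTS = [
    {"event_id": 4724,
     "target": f"user{i:03d}",
     "time": datetime(2023, 11, 25, 16, 0) + timedelta(seconds=i)}
    for i in range(300)
]

def flag_mass_resets(events, window=timedelta(minutes=10), threshold=50):
    """Return True if more than `threshold` password resets occur in any `window`."""
    resets = sorted(e["time"] for e in events if e["event_id"] == 4724)
    start = 0  # sliding-window left edge
    for end, t in enumerate(resets):
        while t - resets[start] > window:
            start += 1
        if end - start + 1 > threshold:
            return True
    return False

print(flag_mass_resets(EVENTS))  # True: 300 resets within minutes
```

Real deployments would read events from the live log or a SIEM rather than a list, but the sliding-window count is the core idea.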
Forensic investigators found that on November 22, Rhyne used a hidden virtual machine and his account to search the web for information on clearing Windows logs, changing domain user passwords, and deleting domain accounts as he planned his extortion plot.
One week earlier, Rhyne made similar web searches on his laptop, including “command line to remotely change local administrator password” and “command line to change local administrator password.”
Rhyne was arrested in Missouri on Tuesday, August 27, and released after his initial appearance in federal court. The hacking and extortion charges to which he pleaded guilty carry a maximum penalty of 15 years in prison.
Earlier this month, a North Carolina data analyst contractor was found guilty of extorting his employer, Brightly Software (a Software-as-a-Service company previously known as SchoolDude), for $2.5 million.
Ronan Rogers and Ruth Callanan discuss innovation in the west of Ireland and the evolution of Ireland’s STEM careers.
Ireland’s medtech sector is moving beyond traditional biomedical engineering, according to Ronan Rogers, the senior R&D director for cardiac ablation solutions at Medtronic. He explained that the region has built “real depth”, not just in medtech but across key areas such as pharmaceutical science, advanced analytics and digital technology, areas that are now “increasingly converging”.
“That diversity of opportunity is a huge strength for Ireland,” he told SiliconRepublic.com. “It allows people from different professional backgrounds to find meaningful, high‑impact careers in healthcare, while helping Ireland move further up the value chain as a centre for complex, globally relevant innovation.”
Having recently expanded its Galway-based pharmaceutical laboratory, the Medtronic facility now serves as a west of Ireland hub for high-tech innovation and the evolving needs of the global healthcare space. Rogers is of the opinion that this is reflective of the convergence of the country’s medtech divisions.
He noted that the primary purpose of the lab “is to integrate pharmaceutical, engineering and analytical expertise under one roof to address the complex challenges of combination products, [that is] where a medical device and a medicine work together”.
“We see that convergence very clearly in this laboratory and there is a wide range of career paths in our industry, whether that’s a pharmacist drawn to the faster innovation cycles and applied science of medtech, or a software developer who wants to use their skills to solve real healthcare challenges and code with a deeper sense of purpose.”
What opportunities exist?
With the expansion comes the opportunity for students and professionals to consider a new role, either as part of Medtronic or within Galway’s thriving life science and medtech spaces.
“Galway offers a unique innovation ecosystem where infrastructure, academic partnerships and a significant medtech footprint all provide a strong foundation for sustaining Ireland’s leadership in the life sciences sector,” said Ruth Callanan, Medtronic’s director of site quality.
With the investment focused on significantly expanding R&D capability and technical depth within a critical space in the Irish medtech sector, Medtronic has increased lab space by almost half and introduced analytical technologies that didn’t exist there before.
Callanan said: “This creates the conditions for future high‑value work as programmes grow. It strengthens Galway’s ability to attract and retain highly specialised talent, such as pharmaceutical scientists and chemical and materials engineers, and it allows work that was previously outsourced internationally to be done here in Ireland.
“Over time, as demand and activity scale, we do expect this capability to support additional specialist roles, phased in over the coming years. Importantly, it reinforces Ireland’s position at the forefront of advanced medtech R&D and reflects a broader industry trend toward self-sufficiency in high-tech analytical testing.”
Step into the future
She explained the new lab will enable experts to integrate processes as the facility will be responsible for the entire life cycle of product development, from early phase R&D through to post-market oversight.
She added: “The laboratory utilises advanced LCMS [liquid chromatography-mass spectrometry] and GCMS [gas chromatography-mass spectrometry] technologies, which act as ‘molecular microscopes’. This allows our scientists to identify unknown compounds or impurities at extremely precise levels.”
According to Rogers, the new lab has a role to play in what he believes is the reshaping of how STEM careers in Ireland are perceived and pursued, with Callanan noting that it creates opportunities for students and professionals to engage with careers that bridge various scientific disciplines.
“A laboratory of this size and complexity requires students and professionals with a wide range of skills and experience across multiple disciplines,” she said.
“Just as importantly,” added Rogers, “we’re sending a clear signal to pharmacists, chemists and analytical scientists that medtech offers deep, intellectually challenging career paths that go well beyond traditional manufacturing or even classical biomedical engineering.”
Don’t miss out on the knowledge you need to succeed. Sign up for the Daily Brief, Silicon Republic’s digest of need-to-know sci-tech news.
Colorado is rolling out an average-speed camera system that tracks vehicles across multiple points instead of catching them at a single camera, making it much harder for drivers to dodge tickets with apps like Waze and Radarbot. Motor1 reports: The state’s new automated vehicle identification systems (AVIS) use several cameras to calculate your average speed between them, and if it is 10 miles per hour or more over the limit, you get a ticket. No longer will you be able to slow down as you approach a camera and speed back up after passing it, not that you should be speeding on public roads in the first place.
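The math behind an average-speed system is simple: divide the known distance between camera points by the elapsed time, and cite only when the result is 10 mph or more over the limit. A sketch (the distances and limits here are illustrative, not Colorado's actual installations):

```python
def average_speed_mph(distance_miles: float, elapsed_seconds: float) -> float:
    """Average speed between two camera points."""
    return distance_miles / (elapsed_seconds / 3600.0)

def should_ticket(distance_miles: float, elapsed_seconds: float,
                  limit_mph: float, tolerance_mph: float = 10.0) -> bool:
    """Colorado's AVIS issues a ticket only at 10+ mph over the posted limit."""
    return average_speed_mph(distance_miles, elapsed_seconds) >= limit_mph + tolerance_mph

# A car covering 2 miles between cameras in 90 seconds averages 80 mph.
print(average_speed_mph(2, 90))   # 80.0
print(should_ticket(2, 90, 65))   # True: 80 >= 75
print(should_ticket(2, 110, 65))  # False: about 65.5 mph
```

This is why slowing down only at each camera doesn't help: the elapsed time over the whole segment is what sets the average.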
Colorado began deploying this new camera system after legislators changed the law in 2023, allowing AVIS for law enforcement use. The systems, installed on various roads and highways throughout the state, first began issuing warnings, but police began issuing tickets late last year.
The most recent section of road to fall under surveillance is a stretch of I-25 north of Denver, which brought the state’s growing panopticon to our attention. It began issuing tickets on April 2. The Colorado Department of Transportation installed the cameras along a construction zone. The fine is $75, with no license points assessed, and police issue it to the vehicle’s owner, regardless of who is driving.
Oracle’s Cloud Experience Center in downtown Seattle. (GeekWire File Photo / Todd Bishop)
Oracle is laying off 491 employees in Washington state, according to a filing Tuesday from the state Employment Security Department.
The cuts impact workers at two Seattle offices as well as remote employees and take effect June 1. The cloud and database giant stated in its WARN letter that the offices will not be closing.
Earlier this month, Bloomberg and others reported that Oracle was planning to cut thousands of jobs across the company as it tries to fund the high-cost deployment of new data centers. The reductions are also the result of AI-driven efficiencies within the organization, according to comments by Mike Sicilia, Oracle’s co-chief executive, in an earnings call March 10.
“The use of AI coding tools inside Oracle is enabling smaller engineering teams to deliver more complete solutions to our customers more quickly,” Sicilia said, according to the publication CIO.
Oracle declined to comment on the newest job cuts.
The Washington layoffs affect more than 230 software developers across multiple seniority levels and an additional 48 employees with the title of software development. The cuts include workers in senior director and vice president roles, as well as managers, product developers, product managers, program managers, site reliability developers, technical analysts, user experience developers and others.
The layoffs are the latest in a series of Oracle reductions. In August the company laid off 161 workers, followed by 101 employees in October. By last fall, Oracle had approximately 3,800 employees in the Seattle area, according to LinkedIn.
Oracle has grown its presence in the region over the past decade, tapping into the area’s engineering talent pool as it battled Amazon and Microsoft in the cloud. In recent years, the company has established partnerships with both Seattle-area giants.
Now all three, plus other tech companies, have been undergoing multiple rounds of job reductions, with recent Meta cuts impacting 168 Washington workers and T-Mobile confirming new layoffs last Friday.
Finalists for GeekWire’s 2026 Sustainable Innovation Award, from left going clockwise: Helion Energy fusion device (Helion Photo), OCOchem team (OCOchem Photo), TerraPower’s mock fuel rods (TerraPower Photo), Ravel team (Ravel Photo), and IUNU’s image capturing system. (IUNU Photo)
Climate change is battering the earth with record-setting high temperatures, more powerful storms and devastating wildfires. A slate of cutting-edge sustainability companies are fighting back with technologies that aim to curb carbon emissions and help humanity navigate a changing world.
This award, presented by Amazon, recognizes the Pacific Northwest’s leaders in this space. The Sustainable Innovation Award finalists this year are Helion, IUNU, OCOchem, Ravel and TerraPower.
Now in its 18th year, the GeekWire Awards is the premier event recognizing the top leaders, companies and breakthroughs in Pacific Northwest tech, bringing together hundreds of people to celebrate innovation and the entrepreneurial spirit. It takes place May 7 at the Showbox SoDo in Seattle.
Carbon Robotics, an ag-tech company building weed-killing machines that use artificial intelligence and computer vision to recognize and zap unwanted plants, won the category last year.
Continue reading for information on this year’s finalists, which were chosen by a panel of independent judges from community nominations. You can help pick the winner: Cast your ballot here or in the embedded form at the bottom of this story. Voting runs through April 16.
Helion Energy has spent 13 years working to replicate the physics that power the sun and stars — pursuing nearly limitless clean energy for the grid. The Everett, Wash.-based company is currently developing its seventh-generation prototype while simultaneously building what it hopes will be the world’s first commercial fusion plant, in Eastern Washington.
Backers include OpenAI CEO Sam Altman, and Microsoft has signed a deal to purchase power from that first facility. Helion has raised more than $1 billion toward its goal — though whether it can deliver remains an unanswered question.
The Seattle ag tech startup IUNU wants to bring computer vision and AI to the commercial greenhouse — deploying autonomous rail-mounted cameras and canopy-level sensors that can spot early signs of disease, track plant growth, and tell growers exactly what to do about it.
Pronounced “you-knew,” IUNU was founded in 2013 by CEO Adam Greenberg, the son of a botanist and co-founder of a clean water startup called Pure Blue Technologies. The company has deployed its technology across six countries, has additional offices in Canada and the Netherlands, and has raised $60 million.
Unwanted carbon dioxide has a higher purpose thanks to OCOchem. The Richland, Wash., startup is taking water and captured industrial CO2 and turning them into chemicals that can be converted into clean-burning hydrogen fuel, used in aviation deicers, or fed to microorganisms that biosynthesize proteins.
The company has raised $11.2 million from investors plus additional grant dollars, and has multiple pilot projects underway as it scales up production. Todd Brix launched OCOchem in 2017 after a nearly two-decade career at Microsoft.
Seattle’s Ravel has developed a proprietary, planet-friendly technology that unwinds the components of fabric blends through a process it calls “purification recycling.” Ravel’s target is elastane, which is known as spandex or Lycra and is added to essentially every category of apparel.
The startup recovers the elastane, turning it into cost-competitive, recycled plastic pellets that serve as the raw material for making polyester fabrics. Ravel launched in 2019 and last year announced a pre-seed funding round.
In March, TerraPower became the first next-generation nuclear company in the U.S. to receive federal construction approval — a milestone for the Bill Gates-backed startup, which is engineering smaller, modular reactors designed to be assembled from factory-built components. Each reactor generates 345 megawatts and pairs with a molten salt energy storage system that can supply additional power.
The Bellevue, Wash., company broke ground on a demonstration plant in Kemmerer, Wyo., in 2024 and aims to start splitting atoms there by the end of 2030. TerraPower has raised $1.66 billion from investors and secured a $2 billion federal grant.
The event will feature a VIP reception, sit-down dinner and fun entertainment mixed in. Tickets go fast. A limited number of half-table and full-table sponsorships are available. Contact events@geekwire.com to reserve a spot for your team today.
Although many of us associate hotels with cushy business trips or relaxing holiday getaways, frequent travelers will know that hotel stays come with their own set of issues. While some minor annoyances, like not being able to stream your content, can be solved by bringing a Fire TV Stick, other problems, such as bed bugs, are harder to solve.
Despite being around for millions of years, bed bug infestations are still a recurring problem, even for expensive hotel chains. And, as anyone who has to deal with them can tell you, you may need to hire professional help if they ever reach your home. Because of this, it’s best to follow the standard bed bug prevention protocol, such as using suitcase stands and inspecting the room with tools like UV flashlights. If you’re looking for one such tool that is affordable, Harbor Freight sells a UV flashlight for under $8 that might be perfect for your next business trip.
Harbor Freight has been known to sell well-rated flashlights, with most of them under the Braun label. Priced at $7.99, the Braun UV Leak Detector LED Flashlight generates 395 nm UV light and is the cheapest UV flashlight on offer at Harbor Freight as of March 2026. Apart from helping you spot pests, UV flashlights can also be used to detect all kinds of stains, leaks, and even counterfeit currency, which could all be valuable uses when you’re on the road or at home. Here’s what else you should know.
The Braun UV flashlight is rated highly by those who have bought it
Running on three AAA batteries, Harbor Freight says this flashlight has a 5.5-hour total run time, so it can be convenient when traveling to locations with no sockets or portable chargers. For an improved grip, this flashlight has both a knurled body as well as a ridged collar. It has a 10-foot range, but this model can’t be used as a normal flashlight and it doesn’t have the standard white light.
As of this writing, the Braun UV Leak Detector does not have a significant number of reviews, so it’s hard to say what customers think of it. For what it’s worth, however, the four buyers who have left reviews all rated it 5 stars, with one reviewer saying it was “not super bright but gets the job done.” If you want a tool that has both UV and white lights, Braun sells a more compact UV flashlight that can also double as a normal flashlight. For $24.99, the Braun 400 Lumen Rechargeable Penlight with UV Light is highly rated and can run a half hour longer than the $8 UV model. Of course, these extra features are going to cost you.
Man City vs. Liverpool will air in the US on ESPN and ESPN Plus, and is also available via ESPN Select or ESPN Unlimited.
The pick of this weekend’s FA Cup quarterfinals sees Man City host Liverpool in a blockbuster cup clash at the Etihad Stadium.
Man City’s goal with this last-eight faceoff is to move a step closer to claiming the prize following last month’s Carabao Cup triumph over Arsenal. City’s route to the quarterfinals has seen it beat Exeter and Salford before easing past Premier League Newcastle 3-1 at St. James’ Park in the previous round.
Liverpool, meanwhile, comes into this cup tie looking to return to winning ways following their Premier League defeat to Brighton before the international break. With the Reds out of the EPL title race and also eliminated from the Champions League, this tournament provides their final opportunity to claim silverware this season, as well as ease the mounting pressure on manager Arne Slot amid what has so far been a disappointing campaign.
Manchester City takes on Liverpool at the Etihad Stadium on Saturday. Kickoff is set for 12:45 p.m. BST local time in the UK, which is 7:45 a.m. ET or 4:45 a.m. PT in the US and Canada, and 10:45 p.m. AEDT in Australia.
Pep Guardiola’s Manchester City have won each of their last 17 home fixtures in the FA Cup.
Livestream Man City vs. Liverpool in the US
Every match from this point in the tournament will be available to stream live on ESPN Plus, which is accessible via the network’s ESPN Select or ESPN Unlimited streaming packages. ESPN Select carries ESPN Plus and is the cheaper option at $13 per month.
ESPN’s streaming platforms have been shaken up in recent months. The sports network now offers two tiers with its new direct-to-consumer setup: ESPN Select and ESPN Unlimited. ESPN Select is essentially what ESPN Plus used to be, with the same content available to subscribers, including FA Cup soccer, for $13 per month. If you want full access to ESPN’s networks and services, such as ESPN, ESPN2, ESPN3, ESPNews and ESPN Deportes, as well as all of ESPN Select’s content, then ESPN Unlimited is the way to go. It costs $30 per month.
Livestream Man City vs. Liverpool in the UK
TNT Sports and the BBC are sharing duties for the FA Cup this season, with this Saturday afternoon game set to be shown on TNT Sports 1.
You can access TNT Sports via Sky Q, Virgin Media and EE TV as part of a TV package.
Alternatively, TNT Sports has a new streaming home with the launch of HBO Max in the UK. A bundle including HBO Max’s entertainment plan alongside TNT Sports currently costs £31 per month and includes Discovery Plus’ library of documentary content.
Livestream Man City vs. Liverpool in Canada
Canadian soccer fans looking to watch this FA Cup fixture can watch all the action live via Sportsnet.
Sportsnet is available via most cable operators, but cord-cutters can subscribe to the standalone streaming service Sportsnet Plus instead, with prices starting at CA$30 per month or CA$250 per year for the standard plan.
Livestream Man City vs. Liverpool in Australia
Football fans in Australia can watch FA Cup matches live on the streaming service Stan Sport.
Stan Sport will set you back AU$20 a month, on top of a Stan subscription, which starts at AU$12. It is worth noting the streaming service is offering a seven-day free trial. On top of select FA Cup matches, a subscription gives you access to Premier League, Champions League and Europa League action, along with international rugby and Formula E.
Looking for the most recent Mini Crossword answer? Click here for today’s Mini Crossword hints, as well as our daily answers and hints for The New York Times Wordle, Strands, Connections and Connections: Sports Edition puzzles.
Need some help with today’s Mini Crossword? When you solve it, the puzzle makes a colorful shape and spells out a very California phrase. Read on for all the answers. And if you could use some hints and guidance for daily solving, check out our Mini Crossword tips.
If you’re looking for today’s Wordle, Connections, Connections: Sports Edition and Strands answers, you can visit CNET’s NYT puzzle hints page.
Mercedes-Benz is about to change something fundamental about how cars feel to drive, and it’s not just another software update. The company is bringing steer-by-wire tech to a production vehicle for the first time, starting with the refreshed EQS, and it’s a pretty big departure from how steering has worked for over a century.
And yes, this is the same kind of tech that’s been used in aircraft for years, and was even showcased on the Mercedes-Benz Vision Iconic. Now, it’s finally making its way into a luxury sedan.
What does “steer-by-wire” actually mean here?
In simple terms, Mercedes is removing the physical connection between the steering wheel and the front wheels. Instead of a mechanical linkage, your inputs are sent electronically to actuators that turn the wheels.
That might sound a bit unnerving at first, but Mercedes says it has built in multiple redundancies, sensors, and control systems to ensure safety. In fact, the company has already tested the setup for over a million kilometers before bringing it to production. There are also some real advantages here. Because everything is software-controlled, the steering ratio can change dynamically depending on speed, making parking easier while keeping things stable at highway speeds.
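The speed-dependent steering ratio described above is easy to sketch: map vehicle speed to a ratio, then divide the driver's wheel input by that ratio to get the road-wheel angle. The ratios and speed breakpoints below are illustrative assumptions, not Mercedes' actual tuning:

```python
def steering_ratio(speed_kmh: float, low: float = 10.0, high: float = 20.0,
                   v_low: float = 30.0, v_high: float = 120.0) -> float:
    """Interpolate between a quick low-speed ratio and a relaxed high-speed one.

    A low ratio means small wheel inputs turn the road wheels a lot (easy
    parking); a high ratio damps inputs for highway stability.
    """
    if speed_kmh <= v_low:
        return low
    if speed_kmh >= v_high:
        return high
    frac = (speed_kmh - v_low) / (v_high - v_low)
    return low + frac * (high - low)

def road_wheel_angle(wheel_input_deg: float, speed_kmh: float) -> float:
    """Angle actually commanded at the front wheels for a given input."""
    return wheel_input_deg / steering_ratio(speed_kmh)

print(road_wheel_angle(90, 20))   # 9.0 degrees: quick ratio while parking
print(road_wheel_angle(90, 130))  # 4.5 degrees: relaxed ratio on the highway
```

A production system would also shape steering feel and force feedback, but the variable ratio alone shows why the same quarter-turn of the yoke can mean very different things at 20 km/h and 130 km/h.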
And then there’s the design twist. Since there’s no need for a traditional steering column, Mercedes is pairing this system with a yoke-style steering wheel. It’s flatter, more futuristic, and designed to improve visibility of the instrument cluster.
Why this could be a turning point for cars
With steer-by-wire, carmakers get far more flexibility in how steering behaves, how interiors are designed, and even how future autonomous features are integrated. It also opens the door to a more “software-defined” driving experience. Things like steering feel, responsiveness, and feedback can be tuned digitally, rather than being locked in by hardware.
Of course, there’s still a trust factor to overcome. Removing a direct mechanical link between driver and wheels is a bold move, and not everyone will be comfortable with it right away. But if Mercedes gets the balance right, this could end up being one of those changes that feels strange at first… and completely normal a few years down the line.
With a denser battery, the Arlo Pro 6 adds more battery life over the previous iteration while maintaining the excellent 2K image quality and flexible installation. With an Arlo Secure subscription you get very powerful object detection, with the highest tier adding person and vehicle recognition into the mix, plus custom AI detection that can spot an open gate, a missing wheelie bin or pretty much anything else you can think of. All of this together makes the Arlo Pro 6 one of the best and most comprehensive security cameras, but subscriptions are also very expensive and have relatively short video history periods compared to the competition.
You need Arlo Secure for cloud storage and object detection
Introduction
The Arlo Pro 6 2K is a somewhat familiar-looking device.
In fact, it looks pretty much like every Arlo camera back to the Arlo Pro 3. Don’t judge this camera on its external looks, as there are enough internal changes that make it a worthy successor to the previous generation (the Arlo Pro 5), including easier setup and a denser battery.
With a more powerful cloud subscription service behind the camera, the Pro 6 can form part of a very capable security system, just don’t expect it to be cheap.
Design and Installation
USB-C Charging
Wall mountable
Can connect to Wi-Fi or a Smart Hub
You can buy the Arlo Pro 6 2K in packs of one, two, three or four, with more expensive kits working out cheaper per camera.
Take a look at the Arlo Pro 5, and the Pro 6 doesn’t seem that different: both look the same, have the same resolution, have a spotlight and are controlled via the same app and cloud service.
But, look a little more closely, and there are some clear changes. First, the camera has a USB-C port, rather than the old magnetic connector of the previous model. That’s a good change, as any USB-C cable can be used, and you don’t have to worry about losing the proprietary connector. In my experience, the USB-C cable seems to charge the battery slightly faster, too.
Image Credit (Trusted Reviews)
Talking of the battery, the new version has a higher-density pack, with 15% more battery life. That should help reduce how often you have to take the camera down for charging, although how often will depend on where it's pointed and how frequently recording is triggered.
Bluetooth is a new addition to the camera, too, which speeds up discovery time when installing the camera. Granted, you only need that once, but I'll take anything that makes life easier.
This camera can be connected to Wi-Fi directly or to a Smart Hub, if you have one. A Smart Hub also provides offline recording, although you lose many of the camera's best features if you do.
If you want to go offline and avoid paying for a cloud subscription, something like the EufyCam S4 might make more sense.
The Arlo Pro 6 comes with a fully adjustable wall mount, which is the same as the one the company has used for years. That's handy, as you can unscrew an older camera and fit the new one if you need to.
If starting from scratch, the mount is easy to attach to a wall and gives plenty of flexibility to point the camera where you want it.
Features
Needs a subscription to get the most out of the camera
Custom AI detection with the highest subscription tier
Flexible object detection
The Arlo Pro 6 slots into the Arlo app alongside any other cameras you might have. It remains one of my favourite security apps, as it’s so configurable. There’s a home screen that lets me select the location’s modes: Arm Away, Arm Home and Standby.
Just like with a security system, such as the Ring Alarm, these modes let me choose which cameras are active at any time. For example, I have my outdoor cameras record when set to Arm Home, and everything turned on when set to Arm Away.
This page also has customisable widgets, so you can have shortcuts to any camera you want, without having to show previews of every camera.
As mentioned above, if you have a Smart Hub you can record offline, but you lose out on all of the smart features. Realistically, then, you need to have an Arlo Secure plan, just be prepared to pay a lot for it.
Arlo Secure gives you cloud recording for one camera at a resolution of up to 2K, with just seven days of history (very stingy), plus Person, Animal, Vehicle and Package Detection.
Upgrade to Secure Multi-Cam and you get cloud storage for four cameras, but otherwise the same features as the single camera package. This costs £11.99 a month, which is still expensive but better overall value than the single camera option if you have more than one camera.
The most advanced features come with the Arlo Secure Plus subscription, which upgrades recording to a maximum of 4K (not relevant here, but it is if you have an Ultra camera), 14 days of cloud history and the new AI detection features, which I’ll get into shortly. This costs £19.99 a month, making it very expensive.
With the more basic package, I can easily cut down on alerts by using motion zones to focus the camera on important areas, combined with the excellent person, animal and vehicle detection. Get the right mix, and the number of alerts plummets.
Pay for the more expensive package and you get person recognition (facial recognition, as most people would call it). You can let the camera pick up people and name them, or feed in photos from your photo library to give the Pro 6 a head start.
Oddly, person recognition is only available on a single camera in your home, so pick the one that makes most sense; most other systems that I've tested run facial recognition across all devices.
Vehicle recognition is another new feature. It’s like facial recognition for cars, in that you can tell the camera to spot certain vehicles. This can run on all cameras.
There’s also Custom Detection, which involves taking two snapshots with something different between them: a gate open or a wheelie bin missing, for example. You can then get alerts when the action is detected, either through motion being triggered, by firing the rule at a set time, or when the mode changes.
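Conceptually, the two-snapshot idea boils down to asking which reference image the current frame more closely resembles. The sketch below is a toy approximation using simple pixel differencing — Arlo's actual Custom Detection runs a trained AI model, and all names here are illustrative:

```python
# Toy approximation of two-snapshot change detection. Each "snapshot"
# is a grayscale image as a 2D list of 0-255 brightness values. Arlo's
# real system uses a proprietary AI model, not raw pixel comparison.

def fraction_changed(before, after, pixel_threshold=30):
    """Fraction of pixels whose brightness differs by more than the threshold."""
    changed = total = 0
    for row_b, row_a in zip(before, after):
        for pb, pa in zip(row_b, row_a):
            total += 1
            if abs(pb - pa) > pixel_threshold:
                changed += 1
    return changed / total

def state_detected(reference_open, reference_closed, current):
    """Return which of the two trained reference snapshots the frame resembles."""
    return ("open" if fraction_changed(reference_closed, current)
            > fraction_changed(reference_open, current) else "closed")

# Tiny 3x4 example: a "gate" whose right side brightens when open
closed_ref = [[10, 10, 10, 10]] * 3
open_ref = [[10, 10, 220, 220]] * 3
print(state_detected(open_ref, closed_ref, open_ref))  # prints "open"
```

This also hints at why small or distant changes are hard to catch: if too few pixels differ between the two references, ordinary scene noise can dominate the comparison.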
I set one up to watch for the back door opening, but it proved unreliable, often triggering on any motion. I think the glass doors, and the distance from the camera, confused the system, so Custom Detection might work better with bigger, more obvious changes.
It's all very clever, and the possibilities are virtually limitless, provided you can train the system, but it's a very expensive option to have.
All video is recorded to the cloud (assuming you have a subscription), and is available in the Feed section. This can be filtered by date, by device, and then by event type, of which there are far too many to name here. There’s enough granularity to quickly find a clip, although Arlo doesn’t have the fancy AI search that Ring now has.
Performance
Sharp 2K video
Excellent night vision
Arlo has long been towards the top of the quality tables, and the Pro 6 keeps that record up. Footage is very similar to that from the Pro 5, which isn’t a criticism.
During the day, the footage is exceptionally sharp, with the 160° lens capturing a lot of what's going on. Colours are excellent and there's detail right through the frame. This is about as good as you can expect from a 2K video camera.
At night, the Pro 6 can use its spotlight to shoot in full colour, and the results are impressive, with almost as much detail as during the day. The only real change is that motion gets a bit blurry, so it takes a bit of hunting to find a clip where someone’s face is clear; those frames do exist. Again, I’ve not seen better from a 2K camera.
Arlo says that the battery can last up to eight months on a single charge, although how that pans out will depend on where the camera's pointing. I recommend angling any battery-powered security camera away from high-activity areas, such as a main road, to increase battery life.
Based on initial testing, I think that I’d get a good five months between charging, if not longer.
Should you buy it?
You want excellent quality and flexibility
Brilliant 2K footage day and night, flexible placement and long battery life all make this camera a winner whether it’s inside or out.
You want something cheaper to run
This camera works best with an Arlo Secure subscription, which is very expensive compared to the competition, even though it is very good.
Final Thoughts
The overall Arlo system and app remain one of the best available, and the new AI features let you do more than with any other camera, thanks to the training mode. But you have to be prepared to pay for the luxury, and Arlo Secure is expensive and has limited video history compared to the competition.
If you’ve got Arlo Pro 5 cameras, there’s very little here to make it worth the upgrade, but if you’ve got older cameras or are starting from scratch, the Arlo Pro 6 is a brilliant, high-quality camera. If you’d rather have something with cheaper running costs, then read my guide to the best outdoor security cameras.
How we test
Unlike other sites, we test every security camera we review thoroughly over an extended period of time. We use industry standard tests to compare features properly. We’ll always tell you what we find. We never, ever, accept money to review a product.
Find out more about how we test in our ethics policy.
Used as our main security camera for the review period
We test compatibility with the main smart systems (HomeKit, Alexa, Google Assistant, SmartThings, IFTTT and more) to see how easy each camera is to automate.
We take samples during the day and night to see how clear each camera’s video is.
FAQs
Do you need a cloud subscription to use the Arlo Pro 6 2K?
Without a subscription you can view the live feed and get basic notifications, and record to a hub; you need a subscription for cloud storage and for the more advanced detection options.
What’s the difference between the Arlo Pro 6 2K and the Arlo Pro 5?
The Pro 6 has a higher-density battery, USB-C charging and Bluetooth for faster setup.