Sitting in a recent district administrator meeting, I found myself excited about a new student data platform my district is rolling out. This new tool, called by a catchy acronym and presented on a flashy dashboard, would collect a variety of information about student skills, mindsets and achievement. It would let us break down information by subgroup and assign overall scores to students, helping us identify who needs additional support.
Initially, I was enthusiastic about how it could empower teachers to better understand students and improve outcomes. But since then, after conversations with the teachers in my building and reflecting on my own experiences using data in the classroom, I’ve begun to wonder whether we are focusing on the wrong data or placing too much emphasis on data overall.
I love looking at data. I’m excited when data surprises me or shows me something more clearly. It’s motivating to see trend lines sloping upward and green arrows pointing toward the sky. Data can help us see the bigger picture when looking at larger systems. We can see which schools are suspending too many students of color and which districts are improving reading scores. As an administrator, I find this illuminating and helpful in guiding how schools make decisions.
But as data trickles down to classrooms and individual students, the usefulness and impact get murkier. In the Montessori school where I teach, where our focus is guiding the child according to their interests and readiness, the data we have to collect affects what we focus on, often in unexpected ways, and sometimes to the detriment of the system itself.
Teaching to the Test
My school is a successful one, and looking at our annual school report card should be a source of pride for the teachers. The report card is based primarily on our state test scores in math and reading, and various calculations are made from our students’ performance on it. But when we shared the most recent report card that showed our school once again exceeded expectations, the results were met with shrugs and muted applause. It isn’t that they aren’t proud of what our students can do; they just recognize the narrowness of the data and how indirectly it connects to what is happening in their Montessori classrooms.
When I pointed out that our report card showed math achievement was an area for improvement, the response was, “Are you saying we should teach to the test?” They know that we could game the system by focusing on test prep and the specific questions their students might encounter. Because we follow a Montessori curriculum with three grade levels in our classrooms, our sequence doesn’t always align with grade-level standards, which can show up on tests, with students scoring poorly on topics they haven’t been introduced to yet. We could align our curriculum with the test and focus our teaching on what the test assesses, but doing so goes against our philosophy of allowing students to make choices about their learning at their own pace.
With this tension in mind, I wonder whether data distorts the focus of education. Our current focus on reading and math scores, based on standardized testing, captures only part of what we want our schools to do. But teachers know that students are capable of achieving much more than our report cards show. Is there some golden indicator that we just haven’t found yet — a measurement like happiness or flourishing — that would be more meaningful? And of course, if we find it, won’t it also become distorted?
Information Overload
There is also a heavy focus in our district on using data to determine which students qualify for additional support through differentiation, interventions and individualized instruction. Administration requires us to hold monthly meetings to review student data and determine who is progressing and who might need more support. On one level, this seems like a great practice for identifying who needs help, but in reality, the system’s capacity to act on that information is overstretched, leading to distortion and ultimately to burnout.
I remember my frustrations as a teacher in these meetings. The data was interesting and could help you to confirm or question ideas you had about students based on your classroom observations. But it didn’t often provide helpful information for supporting students. The time spent in these meetings outweighed the benefit I got from them, and took away from the little time I had to prepare and plan for my students.
Teachers I work with have regularly expressed feeling overwhelmed by the amount of information they need to consider and the testing required to gather it. In our early grades, due to a new state law mandating early literacy assessments, students are tested monthly on letter-sound identification and oral reading fluency. This generates an unending stream of data to grapple with and a constant feeling of needing to do more to address it, all of which adds to stress on teachers, students and the system. I’ve seen amazing teachers, skilled at connecting with kids and providing rich learning experiences, brought to tears because there was too much red on a data spreadsheet.
Teachers don’t have the time to assess and examine all the data they’re now expected to, and monthly checks of early reading indicators take time away from actually teaching those skills. Being responsive to the data you gather means stopping what you’re doing and finding new ways to help kids learn what the data says they need. Teachers are expected to find new resources and determine when and how to work with small groups that need similar support, while also providing meaningful learning opportunities for other students. And, of course, different kids need different things, so you’d need to do this for multiple groups, which is unrealistic to expect all teachers to have the capacity to do.
Meaningful Measurement
Schools, as they are currently designed, were never meant to be responsive to the amount of data we’re collecting. They were designed to teach a group of students a set of information in a specific sequence each year, and then grade them on how well they learned what they were expected to learn. They were designed to tell us which students could meet the standards and which couldn’t, not to ensure that each child could learn and flourish.
When I was a classroom teacher, I kept track of how many books my students read each month. It wasn’t research-backed or scientifically valid, but I found the data helpful for identifying who was and wasn’t reading, and thinking about how I could support them. In some cases, it helped me direct kids to books that they might get excited about; in other cases, it just let me know that a particular kid wasn’t that into reading, and that that might have to be OK for now. The data wasn’t complicated, but it let me quantify what I was observing in my classroom in a way that was meaningful to me and, most importantly, helped me connect with my students as whole people.
A key component of Montessori philosophy is the teacher as observer — watching and documenting what students choose and do to understand and assess what they are ready for. Every teacher should have the time and space to measure and track what feels meaningful and helpful to them.
This may look different for every teacher, but the important factor is that it has meaning to them and is connected to their students and their practice. Likewise, we need to remember that standardizing the expectations for students goes against what we know about how people develop. There’s always going to be variation in a dataset — there’s no metric on which we are all the same.
As an administrator, my responsibility is to understand and use data in ways that are helpful, while also protecting teachers and students from distractions and distortions that undermine the larger goals of creating opportunities for growth and learning for all students.
Ultimately, data should serve as a guide rather than a governor, informing our decisions without eclipsing the human elements of teaching and learning. If we can strike that balance, we can create systems that honor both the complexity of children and the professional wisdom of the educators who know them best.
The Veehop 4WD Scooter is worth a look for anyone who wants to take a scooter somewhere a standard two-wheeler simply could not handle. Four wheels, each with its own electric motor and independent suspension, give it the kind of all-terrain capability that the name suggests, and with the stem folded down it is compact and light enough to fit in most car trunks.
Each of the four hub motors produces 750 watts nominally and up to 1,500 watts at peak, combining for a total output of 6 kilowatts and 177 pound-feet of torque. Top speed on flat ground sits at 31 mph, and a 50 percent incline is handled without complaint. With power going to each wheel independently, mud, rocks, gravel, and shallow water are all manageable terrain rather than reasons to turn back.
Independent suspension keeps the deck level even when one side drops into a rut, and sturdy plating protects the frame and battery bay from scrapes across rough ground. At 154 pounds it is heavier than a standard scooter, but that weight starts to justify itself the moment the terrain gets interesting. Total load capacity sits at 441 pounds, and an optional saddle lets you ride seated while still controlling the throttle and steering through weight shifts.
The 60-volt battery pulls out in seconds for quick swaps, with a capacity of up to 40 amp-hours giving you around 37 miles of range at a steady pace, or closer to 25 miles if you are pushing hard. A full charge takes four to five hours with the standard charger, and anyone planning longer sessions can simply carry a spare battery and swap it out as needed.
The folding stem keeps storage straightforward, sliding into most car trunks without much fuss, though hauling it up a flight of stairs is a workout given the weight. A small handlebar display shows speed, battery level, and a basic ride overview, with a thumb throttle and a few simple buttons handling all the controls. The Veehop 4WD starts at $3,750 for the full four-wheel-drive option, while the two-wheel-drive model comes in at $3,350. Both are shipping right now, and the company is inviting customers to schedule a test ride. It also offers extra batteries and a few other smaller accessories for sale. [Source]
The basic principle of radar systems is simple enough: send a radio signal out, and measure the time it takes for a reflection to return. Given the abundant sources of RF signals – television signals, radio stations, cellular carriers, even Wi-Fi – that surround most of us, it’s not even necessary to transmit your own signal. This is the premise of passive radar, which uses existing RF sources as its illumination to form an image. The RF signal doesn’t even need to come from a terrestrial source, as [Jean-Michel Friedt] demonstrated with a passive radar illuminated by the NISAR radar-imaging satellite (pre-print paper).
NISAR is a synthetic-aperture radar satellite jointly built by NASA and ISRO, and it completes a pass over the world every twelve days. It uses an L-band chirp radar signal, which can be picked up with GNSS antennas. One antenna points up towards the satellite, and has a ground plane blocking the signal from directly reaching the second antenna, which picks up reflections from the landscape under observation. Since the satellite would illuminate the scene for less than a minute, [Jean-Michel] had to predict the moment of peak intensity, and achieved an accuracy of about three seconds.
The signals themselves were recorded with an SDR and a Raspberry Pi. High-end, high-resolution SDRs such as the Ettus B210 gave the best results, but an inexpensive homebuilt MAX2771-based SDR also produced recognizable images. This setup won’t be providing any particularly detailed images, but it did accurately show the contours of the local geography – quite a good result for such a simple setup.
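At the heart of that processing is delay estimation: cross-correlate the direct-path (reference) channel against the surveillance channel and find the lag where the two line up. A minimal sketch of the idea in Python, using white noise as a synthetic stand-in rather than NISAR's actual chirp waveform or sample rates:

```python
import numpy as np

def estimate_delay(reference, surveillance, sample_rate):
    """Estimate the echo delay by cross-correlating the direct-path
    (reference) channel with the reflection (surveillance) channel."""
    corr = np.correlate(surveillance, reference, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(reference) - 1)
    return lag / sample_rate  # delay in seconds

# Synthetic stand-in: white noise as the illuminator, plus a delayed,
# attenuated copy of it as the ground reflection.
rng = np.random.default_rng(0)
fs = 1_000_000                       # 1 MHz sample rate (illustrative)
ref = rng.standard_normal(4096)
delay_samples = 250
surv = np.concatenate([np.zeros(delay_samples), 0.3 * ref])[: len(ref)]

tau = estimate_delay(ref, surv, fs)
extra_path_m = tau * 3e8             # extra path length the echo travelled
```

Repeating the same correlation across a set of Doppler shifts yields the range-Doppler surface that passive radar processing actually builds images from.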
If you’re more interested in tracking aircraft than surveying landscapes, check out this ADS-B-synchronized passive radar system. Although passive radar doesn’t require a transmitter license, that doesn’t mean it’s free from legal issues, as the KrakenSDR team can testify.
Tuesday, March 31, is World Backup Day. It’s that necessary annual reminder to back up all our important data and documents. And with so many different ways to back up your files, once you’ve set up a backup you can leave it to run unattended, safe in the knowledge that if the worst happens, you’re protected.
Below you’ll find a comprehensive collection of our favorite backup products and methods, including HDDs and SSDs — both internal and external — portable flash drives, SD cards, NAS systems, and software and cloud backup solutions.
Where possible, I’ve tried to hunt down the lowest prices available for each item, focusing on deals and limited-time discounts. Over the course of the week, I’ll be checking regularly to make sure prices stay accurate and all the products are still in stock.
The ongoing memory crunch does mean prices for some items remain higher than usual, particularly SSDs and other flash-based storage, and demand for high-capacity memory and storage continues to grow. Although deals still appear regularly, they don’t usually last long.
We’re also running our World Backup Day 2026 live blog with news, updates, and expert advice.
Internal SSDs
Internal SSDs are standard in modern PCs, with SATA and NVMe types delivering fast storage in sizes up to 8TB, although 2TB currently offers the best balance of cost and capacity.
Internal HDDs
Internal HDDs remain a popular choice for large-capacity storage, with desktop and NAS models offering dependable performance in sizes up to 24TB, making them ideal for backups, media libraries, and long-term data storage at a lower cost per terabyte.
Portable SSDs
Portable SSDs offer fast, compact storage for backups and travel, with capacities reaching up to 8TB, although 1TB to 4TB models remain the most common choices for everyday use.
External HDDs
External HDDs offer convenient, plug-and-play storage for backups and large files, with portable and desktop models available in sizes up to 24TB, making them a practical choice for expanding storage without opening your PC or upgrading internal hardware.
Desktop NAS
Desktop NAS systems provide centralized storage for homes and small offices, with multi-bay designs supporting large-capacity drives and RAID options, making them ideal for backups, media streaming, and secure file sharing across multiple devices on the same network.
USB drives & memory cards
Flash drives and memory cards provide compact, portable storage for photos, videos, and everyday files, with capacities ranging from a few gigabytes to several terabytes, making them a convenient choice for cameras, laptops, and quick file transfers between compatible devices.
Software & cloud
Software and cloud backup solutions protect important files by creating secure copies on local drives or remote servers, helping guard against hardware failure, accidental deletion, or cyber threats while making it easy to restore data when problems occur.
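However the copies are made, it’s worth verifying that a backup actually matches the source before you need it. A minimal sketch in Python (the function names and folder layout here are illustrative, not taken from any particular backup product):

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk=1 << 20):
    """Hash a file in 1 MB chunks so large backups don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_backup(source_dir, backup_dir):
    """Return source files that are missing from, or differ in, the backup."""
    problems = []
    for src in Path(source_dir).rglob("*"):
        if src.is_file():
            dst = Path(backup_dir) / src.relative_to(source_dir)
            if not dst.is_file() or sha256_of(src) != sha256_of(dst):
                problems.append(src)
    return problems
```

Run it after each backup completes; an empty list means every file made it across intact.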
You don’t hear much about blockchain these days. Back in the late 2010s, when everyone was talking about NFTs and cryptocurrency, companies were keen to put “blockchain” front and center on their press releases. “Look at us,” they were saying, “we’re embracing modern technology.” But after the sad evolution of cryptocurrency, brands seemed to decide they didn’t need the baggage that came with the word. That doesn’t mean the technology has gone away — companies just call it something different now, so you’re more likely to hear it referred to as distributed ledgers or “on-chain” tech.
According to the cryptocurrency exchange Coinbase, 60% of Fortune 500 companies are working on blockchain initiatives. The sectors that use blockchain the most are banking and finance, which account for around 20% of its use, but it’s used across all types of business, including the automotive industry.
Before we look at how carmakers use blockchain, it’s useful to understand what exactly blockchain is. At its most basic, blockchain is a shared digital record that isn’t controlled by any single company or authority. Instead, identical copies are stored across a network of computers, and new information is added in secure, time-stamped “blocks” that are linked together. Because each new entry is verified by the network and connected to what came before it, the record is very difficult to alter or tamper with. This immutability makes it useful for automakers who are looking to provide things like digital battery passports and vehicle provenance. However, some car manufacturers are planning to take the tech even further.
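The hash-linking that makes the record tamper-evident can be sketched in a few lines of Python. This is a toy illustration only (the provenance entries are invented, and a real blockchain adds networked consensus and distribution on top):

```python
import hashlib
import json
import time

def _digest(block):
    """Hash a block's contents (everything except its own stored hash)."""
    payload = {k: block[k] for k in ("timestamp", "data", "prev_hash")}
    return hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()

def make_block(data, prev_hash):
    """Create a time-stamped block chained to the previous block's hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = _digest(block)
    return block

def verify_chain(chain):
    """Valid only if every block's hash matches its contents and each
    block points at the hash of the block before it."""
    for i, block in enumerate(chain):
        if block["hash"] != _digest(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# A toy provenance trail, in the spirit of a battery passport.
chain = [make_block("cell lot refined", "0" * 64)]
chain.append(make_block("cells assembled into pack", chain[-1]["hash"]))
chain.append(make_block("pack installed in vehicle", chain[-1]["hash"]))
```

Change any field in any block and `verify_chain` fails, because the altered block’s hash no longer matches its contents and every later block points at a hash that no longer exists. Storing identical copies across many parties is what turns this tamper-evidence into practical tamper-resistance.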
Blockchain is used to store records about supply chains and provenance
Blockchain is useful when it comes to storing digital battery passports. These are electronic records tracking the lifecycle of an EV battery and are going to be required in all countries in the European Union by 2027. This regulation affects all automakers who are selling into Europe — including those headquartered in the United States. Automakers need traceability data, and supply chains are international. A modern electric vehicle battery isn’t a single bill of materials so much as a web of upstream mining, refining, processing, cell manufacturing, pack assembly, recycling, and logistics. A blockchain-powered distributed ledger can serve as one definitive record of permissions and provenance that can be shared by different companies.
In June 2024, Volvo Cars launched what it claimed to be the world’s first EV battery passport for its EX90 SUV. The passport uses blockchain to record information such as the origins of raw materials, recycled content, and carbon footprint. Volvo plans to expand the scheme to more of its cars. Meanwhile, Tesla has implemented blockchain solutions to trace the provenance of cobalt in its supply chains. Hyundai and Kia developed an Integrated Greenhouse Gas Information System (IGIS), using blockchain to record emissions across the whole lifecycle of a vehicle.
Another use for blockchain is providing proof of provenance for collectible cars. Porsche is utilizing its unalterable nature to launch a blockchain-based digital passport pilot for classic cars, as well as other collectibles like watches or paintings. Automakers aren’t the only ones using blockchain for car records. In July 2024, Reuters reported that the California Department of Motor Vehicles had digitized 42 million car titles using blockchain technology to detect fraud and streamline title transfers.
Other uses for blockchain in the automotive industry
One of the main uses of blockchain in the automotive industry is handling companies’ finances. For example, BMW uses a blockchain system from JPMorgan to handle international financial transactions automatically. However, some pilots and plans suggest that there may be more innovative uses in the future. The much-hyped — but still not yet available — Sony/Honda Afeela EV sedan promises an “on-chain mobility service platform leveraging a token-based incentive model.” Details are still pretty fuzzy, but it does indicate another use for blockchain in the automotive industry, even if it is just persuading people to share their data by giving them cryptocurrency. Nissan is proposing something similar with its Nissan Passport, which it describes as a “digital certificate that expands the range of experiences you can access based on your actions.”
Toyota, the world’s largest automaker, is betting big on blockchain tech and has its own “Blockchain Lab” exploring how blockchain could be used to give vehicles a secure digital identity, bundle fleets into investable portfolios, and make it easier to attract funding for things like electric vehicle fleets and new mobility services. It is proposing a new blockchain-based protocol called the Mobility Orchestration Network (MON), which would link vehicles with other agencies, like regulators, on one all-encompassing digital platform. Toyota’s interest in blockchain goes beyond car manufacturing. It created Woven City, a blockchain-integrated smart city, in September 2025. The goal here is to use blockchain as a trusted digital system that lets people safely share vehicles, electricity, and city services without needing middlemen or paperwork.
Cybersecurity firm F5 Networks has reclassified a BIG-IP APM denial-of-service (DoS) vulnerability as a critical-severity remote code execution (RCE) flaw, warning that attackers are exploiting it to deploy webshells on unpatched devices.
BIG-IP APM (short for Access Policy Manager) is a centralized access management proxy solution that enables admins to secure and manage user access to their organizations’ networks, cloud, applications, and application programming interfaces (APIs).
Tracked as CVE-2025-53521, this security flaw can be exploited by attackers without privileges to perform remote code execution when targeting BIG-IP APM systems with access policies configured on a virtual server.
In addition to flagging the vulnerability as being exploited in the wild, F5 published indicators of compromise (IOCs) and advised defenders to check their BIG-IP systems’ disks, logs, and terminal history for signs of malicious activity.
“This known vulnerability was previously categorized and remediated as a Denial-of-Service (DoS) vulnerability. Due to new information obtained in March 2026, the original vulnerability is being re-categorized to an RCE. The original CVE remediation has been validated to address the RCE in the fixed versions. We have learned that this vulnerability has been exploited in the vulnerable BIG-IP versions,” F5 warned in an advisory update published this Sunday.
“F5 strongly recommends that you consult your corporate security policy for guidelines about incident handling procedures including but not limited to forensic best practices, that are specific to your organization. More specifically, review the policies to ensure that they comply with evidence collection and forensics procedures for a security incident before you attempt to recover the system,” the company added.
Internet threat-monitoring non-profit organization Shadowserver now tracks over 240,000 BIG-IP instances exposed online; however, there is no information on how many have a vulnerable configuration or have already been secured against CVE-2025-53521 attacks.
F5 BIG-IP systems exposed online (BleepingComputer)
The U.S. Cybersecurity and Infrastructure Security Agency (CISA) also added the vulnerability to its list of actively exploited flaws on Friday and ordered federal agencies to secure their BIG-IP APM systems by midnight on Monday, March 30.
“This type of vulnerability is a frequent attack vector for malicious cyber actors and poses significant risks to the federal enterprise,” it warned.
“Apply mitigations per vendor instructions, follow applicable BOD 22-01 guidance for cloud services, or discontinue use of the product if mitigations are unavailable.”
F5 is a Fortune 500 technology giant that provides cybersecurity, application delivery networking (ADN), and various other services to more than 23,000 customers worldwide, including 48 of the Fortune 50 companies.
Starcloud’s latest funding round values the space compute company at $1.1 billion, making it one of the fastest startups to reach unicorn status after graduating from Y Combinator.
The company’s Series A, which closed 17 months after its demo day presentation, was led by Benchmark and EQT Ventures. It’s another sign of the interest in outsourcing data centers to orbit as resource and political obstacles slow their development on Earth, but the business model depends on unproven technology and significant capital expenditure.
Starcloud has now raised a total of $200 million, and launched its first satellite with an Nvidia H100 GPU in November 2025. The company will launch a more powerful version, Starcloud 2, later this year with multiple GPUs, including an Nvidia Blackwell chip and an AWS server blade, as well as a bitcoin mining computer.
The company will also begin developing a data center spacecraft designed to launch from Starship, the reusable heavy-lift rocket being built by Elon Musk’s SpaceX. Starcloud 3, as the spacecraft is named, will be a 200-kilowatt, three-ton spacecraft that fits the “Pez dispenser” system SpaceX designed to deploy its Starlink satellites from Starship.
CEO and founder Philip Johnston said he expects that will be the first orbital data center that is cost-competitive with terrestrial data centers, with costs on the order of $0.05 per kilowatt-hour of power — if commercial launch costs land around $500 per kilogram.
The challenge is that Starship isn’t flying yet; Johnston says he expects commercial access to open up in 2028 or 2029. That’s the reality facing all the big space data center projects: powerful space computers will be cost-prohibitive until a new generation of rockets starts launching at a high operational cadence, something that might not happen until the 2030s.
“If it ends up being delayed, we’ll just carry on launching the smaller versions on Falcon 9,” Johnston said. “We’re not going to be competitive on energy costs until Starship is flying frequently.”
“There’s kind of two business models,” Johnston explains: One is selling processing power to other spacecraft on orbit; the company’s first satellite, for example, analyzes data collected by Capella Space’s radar spacecraft. Then, in the future when launch costs go down, more powerful distributed data centers could potentially pull work from their terrestrial counterparts.
That gets at how new this industry really is. When Nvidia CEO Jensen Huang unveiled the company’s Vera Rubin Space-1 chip modules at his company’s annual GPU Technology Conference last week, he didn’t mention that none had yet been produced or shared with the company’s development partners.
In fact, the advanced GPUs on orbit number in the dozens, while Nvidia is estimated to have sold nearly 4 million to terrestrial hyperscalers in 2025.
Or consider that SpaceX’s Starlink communications network, the largest satellite network in orbit with 10,000 spacecraft, produces something around 200 megawatts of power, while data centers with more than 25 gigawatts of capacity are currently under construction in the U.S., according to Cushman & Wakefield.
Johnston argues that his company is well ahead of the competition, with the first terrestrial GPU deployed in orbit. It was used to train an AI model in orbit, a first, according to Starcloud, and run a version of Gemini. Beyond the performance, Johnston says Starcloud now has valuable data about what it takes to run a powerful chip in space.
“An H100 is probably not the best chip for space, to be honest, but the reason we did it is we wanted to prove that we could run state of the art terrestrial chips in space,” he told TechCrunch. That hard-won knowledge — another GPU, an Nvidia A6000, failed during launch — will influence future designs.
There is a laundry list of technical challenges to be solved, including efficient power generation and cooling the hot-running chips. Starcloud 2 will have the largest deployable radiator flown on a private satellite, and Johnston said he expects at least two additional versions of that spacecraft to head to orbit.
Then there is the challenge of synchronization. The largest datacenter workloads, often for training, require hundreds or thousands of GPUs to work in tandem. Doing that in space will either require fantastically large spacecraft, or powerful and reliable laser links between spacecraft flying in formation. Most companies working on this technology expect those workloads to come long after simpler inference tasks take place on orbit.
Besides Starcloud, Aetherflux, Google’s Project Suncatcher, and Aethero — which launched Nvidia’s first space-based Jetson GPU in 2025 — are all developing space data center businesses.
The elephant in the room is SpaceX itself, which has asked the U.S. government for permission to build and operate a million satellites for distributed compute in space.
Going head-to-head with SpaceX is a daunting task for any entrepreneur, but Johnston sees room for coexistence.
“They are building for a slightly different use case than us,” he told TechCrunch. “They’re mainly planning on serving Grok and Tesla workloads. It may be at some point that they offer a third party cloud service, but what I think they are unlikely to do is what we’re doing [as] an energy and infrastructure player.”
Can you charge Li-ion cells that feature built-in USB-C charging ports without taking them out of the device? While this would seem to be answered with an unequivocal ‘yes’, [Colin] recently found out that doing so could easily have destroyed the device they were to be installed in.
After being tasked with finding a better way to keep the electronics of some exercise bikes powered than simply swapping the C cells all the time, [Colin] was led to consider using these Li-ion cells in such a manner. Fortunately, rather than just sticking the whole thing together and calling it a day, he decided to take some measurements to satisfy some burning safety questions.
As it turns out, at least with the cells he tested – charged from a cable with twin USB-C connectors on a single USB-A plug – all the negative terminals and USB-C grounds are connected together. Since the cells are installed in a typical series configuration in the device, where one cell’s positive terminal connects to the next cell’s negative, a shared charging ground would have made for an interesting outcome. Although you can of course use separate USB-C leads and chargers per cell, it’s still somewhat disconcerting to run it without any kind of electrical isolation.
In this regard, the suggestion by some commentators to use NiMH cells and trickle-charge them in situ, much like garden PV lights do, might be one of the least crazy solutions.
Reportedly, the crime group accessed more than 350GB of data, including dumps of mail servers, databases, confidential documents, contracts and other sensitive material.
The extortion group ShinyHunters has been linked to the recent (24 March) breach of the European Commission’s Europa.eu platform, in which a reported 350GB of data, across multiple databases, was accessed and stolen.
In a statement issued after the incident (27 March), the European Commission said its early findings suggest that private data was accessed and that Union entities affected by the attack will be contacted. The Commission’s internal systems are not believed to have been affected.
The Commission explained that it will continue to monitor the situation and take the necessary precautions to ensure the security of its systems and data, while analysing what happened so it can use the results to improve its cybersecurity capabilities.
While the Commission has not shared further details on the incident, alleged data dumps uploaded to ShinyHunters’ Tor data leak site are said to include content from mail servers, internal communications systems, databases, confidential documents, contracts and additional sensitive material. Some 90GB of the information allegedly stolen from the European Commission’s compromised cloud network has already been shared.
ShinyHunters is an extortion group, active since around 2020, that has carried out a number of high-profile, financially motivated attacks on organisations such as Salesforce, Allianz Life, SoundCloud and Ticketmaster. The criminal organisation also claimed responsibility for an attack on Match Group, which owns Tinder, Hinge, Meetic, Match.com and OkCupid.
In July 2024, AT&T paid a member of the ShinyHunters hacking group $370,000 to delete the data of millions of customers following a massive data breach of its systems. Reportedly, the stolen data exposed the calls and texts of nearly all of the carrier’s 110m cellular customers after ShinyHunters stole the information from the cloud data giant Snowflake.
The White House did not respond to a request for comment about the meetings, but an official who was not authorized to speak on the record told WIRED at the time: “The White House does not comment on mysterious meetings with unnamed staffers.”
Simultaneously, Trump has also sought to absolve officials of any wrongdoing in the wake of the 2020 election. Last year, Trump gave “full, complete and unconditional” pardons to a slate of people who had tried, and failed, to help him overturn the 2020 election results. In recent months, Trump has pressured Colorado governor Jared Polis to release Tina Peters, the former county clerk in Mesa County, Colorado, who became a hero for the right’s election deniers when she facilitated a security breach during a software update of her county’s election management system.
Peters was found guilty of four felonies, but Trump has been mounting a campaign in recent months to get her released, going so far as to say he “pardoned” her, though he has no power to do so, since she was convicted on state charges.
Election Day Interference
While Trump has not announced specific plans to deploy troops to polling locations or seize voting machines, he and his administration have certainly been suggesting that such action is not off the table.
In January, Trump lamented not having the National Guard seize certain voting machines after the 2020 election. In early February, White House press secretary Karoline Leavitt told reporters that while she hasn’t specifically heard Trump discussing the possibility, she couldn’t “guarantee that an ICE agent won’t be around a polling location in November.” (The question was in response to former White House adviser Steve Bannon stating: “We’re going to have ICE surround the polls come November. We’re not going to sit here and allow you to steal the country again … We will never again allow an election to be stolen.”)
Earlier this month, during his confirmation hearing to head up the Department of Homeland Security, Senator Markwayne Mullin said he would be willing to deploy ICE to polling locations to address “a specific threat.”
The result of the Trump administration’s drip feed of threats and dog whistles is that those who are running elections in states across the country are already war-gaming what happens if ICE or the National Guard show up at their voting locations.
Michael McNulty, the policy director at Issue One, a nonprofit that tracks the impact of money in politics, also points to the fact that the Department of Justice sent monitors to oversee elections in November in New Jersey and California, despite no federal elections being held. “The concern is that this could become a massive deployment of, quote unquote, observers by the DOJ in 2026 who might do something more, whether it’s intimidation, whether it’s interfering with local election officials, to get data to confirm conspiracy theories,” McNulty tells WIRED.
FBI Raids
On January 28, the FBI raided the election office in Fulton County, Georgia, executing a search warrant that allowed it to seize ballots, ballot images, tabulator tapes, and the voter rolls related to the 2020 election. The search warrant affidavit, unsealed a few weeks ago, shows that the FBI relied on the work of Kurt Olsen, a lawyer who was appointed by the administration to investigate election security in October and who has a long history of working with some of the country’s biggest election deniers, including Patrick Byrne, Mike Lindell, and Kari Lake. Olsen’s claims are based on debunked and previously investigated conspiracy theories about the 2020 election.
The raid was also notable for the presence of Tulsi Gabbard, the director of national intelligence, who is, according to The Guardian, running a parallel investigation into the 2020 election with the apparent tacit approval of Trump.
The product is an active optical cable (AOC) for HDMI. Instead of relying solely on copper, it carries most of its signal over fiber-optic strands. Inside the cable, HDMI electrical signals are converted into optical signals for the journey between the two ends, then converted back to electrical signals at…