The Nothing Ear (a) wireless earbuds, priced at $59 (was $109), deliver impressive performance without breaking the bank. Many consider them the go-to pick for active noise cancellation in this price range, and after a closer look, you’ll see why. Nothing built these earbuds with an impressively clean, transparent stem that showcases the internal components.
Each bud weighs approximately 4.8 grams, and the charging case is small enough to slip easily into a pocket. The case’s design is inspired by pill packaging, and it does an excellent job of staying compact while remaining practical. You get three sizes of silicone tips that match the color of the buds, which is a nice addition; they not only help you establish a snug fit, but also let you wear the buds for long stretches without discomfort.
Powerful 11mm Driver: We’ve engineered an even more compact driver that delivers twice the power of Ear (2). In Nothing Ear (a) wireless earbuds,…
45 dB Smart Active Noise Cancellation: Nothing Ear (a) earbuds continuously monitor the seal between the earbud and your ear canal. If noise leakage…
Clear Voice Technology: Crystal-clear calls, anytime, anywhere. Nothing Ear (a) earbuds isolate your voice from background noise, making on-the-go…
The sound comes from an 11mm dynamic driver in each earbud, and the audio is pleasant with plenty of bass presence, yet not so overbearing that the mids or higher frequencies get lost. The companion app has a basic EQ for any modifications you may want to make, and if you have an Android device, you can use the LDAC codec to stream at higher resolutions when your connection allows it. They also support AAC and SBC to ensure compatibility with almost anything else.
Active noise cancellation can reach up to 45 decibels and offers several modes: high, mid, low, and adaptive, in which the buds adjust to the seal in your ear and filter out noise as they go. Transparency mode allows outside noise to be heard when necessary. The buds include six microphones that work together to keep calls extremely clear, and background noise is virtually non-existent during conversations.
Battery life is one area where the Ear (a) earbuds excel: with ANC turned off, you can get up to 9.5 hours from the buds alone or roughly 42.5 hours with the case. Even with noise cancellation on, playback lasts a long time before requiring another charge, and a quick 10-minute top-up buys you a useful amount of extra listening time. The case itself charges via USB-C, but Nothing has opted out of wireless charging to keep costs down.
The controls are all handled via pinch gestures on the stems, which take some getting used to but are very responsive once you’ve had some practice. You can control the volume by swiping, pinch to play or pause, or press and hold to switch between ANC modes or summon your voice assistant. They also have Bluetooth 5.3, which ensures a rock-solid connection, and multipoint pairing, which allows you to connect up to two devices and switch between them.
The Ear (a) earbuds are designed to survive some rough handling, with an IP54 rating for dust and water resistance, so you should be fine using them in the gym or outside in light rain. The case is not as durable, but it is adequate for general use. Nothing also includes a few extras, such as a low-latency mode for gamers to reduce annoying audio lag.
It’s been several years since we last did this, but I’d like to remind you all that the National Football League plays a lot of make-believe when it comes to what its “Super Bowl” trademarks do and do not allow it to do in terms of enforcement. Thanks largely to media outlets that repeat the false narrative the NFL puts out there, far too many people think that businesses, or even members of the public, simply cannot use the phrase “Super Bowl” in any capacity whatsoever if there is any commercial component to it.
TV companies advertising their goods and telling you to “be prepared for the Super Bowl”? Can’t do it. A church holding a party for the game with invitations to the Super Bowl and a $5 cover charge? Verboten. And this way of thinking is perpetuated by posts like this one from TVLine.
The term “Super Bowl” is an NFL trademark, and licensing that trademark is very, very expensive. After all, the NFL makes a lot of money from “Super Bowl” commercials – 30-second slots for this year’s game have cost upward of $10 million.
Of course, there are ways around not being able to mention the Super Bowl in commercials. Brands that aren’t willing or able to license the name will refer to it as “the big game” or something along those lines instead. What’s more, the brands that pay to license the name still have to work within strict parameters. According to L.A. Tech & Media Law, parties that purchase Super Bowl ad spots can only mention the name of the event for a limited period of time.
In the past, the league has sent cease-and-desists to bars and even churches that host Super Bowl parties and charge an admission fee. In short, if an entity of any kind uses the term for commercial gain, they can expect a letter from the NFL’s lawyers.
Yes, they can, but that shouldn’t be the entirety of the post. The NFL can send whatever letters it likes. What matters is whether it is asserting rights it actually has. Otherwise, posts like this leave the public with, at best, an incomplete idea of what rights the NFL has and what rights it doesn’t.
The NFL certainly has a trademark on “Super Bowl.” That does not automagically mean it can fully control all uses of that mark, even where there is money involved. Fair use defenses still apply, of course, as does the general standard that the use has to either confuse the public as to the source of the product or service, or falsely imply an association between the company and the NFL. Not all uses, even commercial ones, will do that.
Stop giving the NFL power it doesn’t actually have. A restaurant putting out a sidewalk sign that says it will have the Super Bowl on its TVs is not trademark infringement by any sane reading of the law. An advertisement merely acknowledging the existence of the Super Bowl does not in and of itself make it infringing.
Yes, the NFL pulls overly protectionist crap with this trademark all the time. Yes, it would take coordinated pushback from more than one corporate entity with deep pockets to fight it. But it’s a fight worth fighting and, at the very least, none of us have to pretend that the NFL has rights it doesn’t have.
Before the Internet, there was a certain value to knowing how to find out about things. Reference librarians could help you locate specialized data like the Thomas Register, the EE and IC Masters for electronics, or even an encyclopedia or CRC handbook. But if you wanted up-to-date info on any country of the world, you’d often turn to the CIA. The Factbook, originally a classified document, was what the CIA knew about every country in the world. Well, at least what they’d admit to knowing, anyway. But now, the Factbook is gone.
The publication started in 1962 as the classified “National Basic Intelligence Factbook”; it went public in 1971 and became “The World Factbook” in the 1980s. While it is gone, you can rewind it on Archive.org, including a snapshot taken just before it went dark.
Browsing the archives, it looks like the last update was in September of 2025. It would be interesting to see a project like Wikipedia take the dataset, house it, and update it, although you can presume the CIA was better equipped. The data is public domain, after all.
Want to know things about Croatia? Unfortunately, the archive seems to have missed some parts of some pages. However, there are other mirrors, including some that have snapshots of the data in one form or another. Of course, these are not always the absolute latest (the link has data from 2023). But we would guess the main languages (Croatian and Serbian) haven’t changed. You can also find the internet country suffix (.hr) and rankings (for example, in 2020, Croatia ranked 29th in the world for broadband internet subscribers scaled for population and 75th in total broadband usage).
We are sorry to see such a useful reference go, but reference books are definitely an endangered species these days.
Those new to photography are often mesmerized by the latest and greatest cameras, where large megapixel counts and full-frame sensors are equated with better image quality. I experienced this myself when I was new to photography some 20 years ago, and I even had similar thoughts when I bought a camera again recently.
But instead of splurging all your money on one of the best mirrorless cameras or a DSLR, I suggest that you get a more affordable camera or even a decent used mirrorless camera instead. You can then use the savings from that to buy a good set of lenses that will do more for your photography. Stepping away from the cheap kit lenses often included in entry-level and even some mid-range camera models can let you unlock your creativity and even give you more flexibility in executing your vision.
However, there are a ton of lenses available on the market, and they can get quite expensive once you start buying everything. This can make shopping for lenses confusing, as it’s hard to know which to prioritize when you’re building your kit. Of course, there’s no one-size-fits-all answer, as lens preferences vary between shooting styles. But, at the very least, these are some of the lenses every photographer should try at least once, since each opens up different styles and capabilities.
50mm for everyday use
The 50mm lens is popularly known as the “nifty fifty” in photography circles, and if you ask any professional photographer or serious hobbyist, this would be one of the first lenses they’d recommend. Many say that 50mm approximates what the human eye sees, but that is debatable. Nevertheless, it’s still recommended because this focal length has limited distortion, so the photos it takes typically look natural. It’s also quite versatile: I’ve used it for portraiture, still and product photography, travel photography, and photojournalism.
More importantly, it’s one of the cheapest “fast” lenses you can buy, usually offering an f/1.8 aperture or wider. You can find a brand-new Canon EF 50mm f/1.8 STM lens on Amazon for just $166, while a comparable Sony FE 50mm F1.8 Standard lens goes for $278. If that’s still a bit steep, there are several third-party options from manufacturers like Yongnuo and Viltrox. You can get them even cheaper by buying a used camera lens, but you need to know what to look for.
Note that if you have a cropped-sensor camera, the 50mm lens behaves like a 75mm (for Fujifilm and Nikon), an 80mm (for Canon), or a 100mm (for Lumix) lens, depending on the model and camera brand. But if you have a Canon camera with an APS-C sensor and want to recreate the field of view (FOV) of a 50mm lens on it, a 35mm lens is the closest you can get from the brand (although a 30mm lens from a third-party manufacturer like Sigma is closer to the FOV of the 50mm on a full-frame camera).
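If you want to run the crop-factor math yourself, the equivalent focal length is simply the lens’s actual focal length multiplied by the sensor’s crop factor. Here is a minimal Python sketch using the commonly quoted crop factors; the exact value varies slightly by model, so treat the output as an approximation:

```python
# Equivalent focal length = actual focal length x crop factor.
# Crop factors below are the commonly quoted values; exact sensor
# dimensions (and therefore crop factors) vary slightly by model.
CROP_FACTORS = {
    "full frame": 1.0,
    "APS-C (Fujifilm/Nikon)": 1.5,
    "APS-C (Canon)": 1.6,
    "Micro Four Thirds (Lumix)": 2.0,
}

def equivalent_focal_length(focal_mm: float, crop_factor: float) -> float:
    """Full-frame-equivalent field of view for a lens on a given sensor."""
    return focal_mm * crop_factor

if __name__ == "__main__":
    for sensor, crop in CROP_FACTORS.items():
        print(f"50mm on {sensor}: ~{equivalent_focal_length(50, crop):.0f}mm equivalent")
    # The article's later example: a 24mm lens on a Canon APS-C body
    print(f"24mm on Canon APS-C: ~{equivalent_focal_length(24, 1.6):.1f}mm equivalent")
```

Running this reproduces the figures quoted above, including the roughly 38.4mm equivalent for a 24mm lens on a Canon APS-C body mentioned later in this article.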
35mm for street photography
Although the 50mm is a handy lens for nearly every situation, its FOV is rather narrow, especially if you’re shooting in enclosed spaces or want to include the environment for a bit of context. That’s why street photographers prefer the wider 35mm focal length: it lets them capture a wider view without the distortion you’d get from even wider lenses. It’s also still useful for portrait photography; the lead photographer on the wedding team I worked with bought a Sigma 35mm f/1.4 Art lens after trying my more basic Canon EF 35mm f/2.
As usual, you won’t get the same FOV when you mount a 35mm lens on a cropped-sensor camera. And since I sold all my full-frame cameras when I retired from the wedding photography industry and “downgraded” to this cheap yet high-quality digital camera, I bought a Canon EF-S 24mm f/2.8 STM lens, which equates to about 38.4mm when attached to my Canon EOS 200D Mk II.
I love this lens for street photography because it’s relatively small and unassuming, even with a lens hood attached; the Instagram post above shows a sample photograph taken with it. The only downside is that since it’s a prime lens, the only way I can “zoom” is to physically get closer to my subject.
100mm macro for portraits and details
I would’ve recommended the 85mm as a must-try lens, but I tend to relegate it to portraiture and candid photography duties. Instead, I’d suggest a 100mm macro lens, which achieves a similar effect of compressing the space between you and the subject (albeit at a smaller f/2.8 maximum aperture versus the f/1.8 found on the 85mm), resulting in more flattering portraits. But what I like best about the 100mm is that it’s also a macro lens, and it unlocks a whole new world that you wouldn’t otherwise get from other lenses.
The 100mm macro lens lets me get much closer to my subject than my other lenses. My old Canon EF 100mm f/2.8 Macro focuses as close as 12 inches (versus the 33-inch minimum focusing distance of the Canon EF 85mm f/1.8), letting you see finer details and even reveal the textures of the surfaces you’ve photographed. That’s why it’s one of the essential lenses you need if you want to get into product photography.
24-70mm f/2.8: the standard lens
When I worked as a wedding and event photographer, almost everyone in the industry had this lens, even though it’s quite expensive. For example, the Canon EF 24-70mm f/2.8L USM is currently priced at more than $1,200 on Amazon, making it even more expensive than some entry-level and even mid-range cameras. But it’s worth the investment because of how well-rounded it is.
The lens can capture wide areas at the 24mm end of its range, and you can even use its distortion to create an effect. It also takes decent portraits at 70mm, while covering the 35mm and 50mm focal lengths we discussed above. More importantly, it has a constant f/2.8 aperture, so you don’t need to push your camera’s ISO sensitivity as hard when shooting in low light.
Of course, this lens has its downsides, too. Aside from being quite expensive, it’s a large and heavy piece of equipment, weighing 805 grams. It’s a great lens, and I love its wide range and bright aperture for covering events, but its size and heft make it poorly suited to street photography, especially as you lose the discretion of smaller and lighter prime lenses.
70-200mm f/2.8: a fast zoom lens
As a newbie photographer, I always wanted the long reach of a zoom lens, and I got it with the 70-200mm. However, this is more than just a zoom lens — aside from getting me nearer to the action, its narrow FOV compresses the scene, bringing the background much closer and making it look far bigger than what you’d see with the naked eye.
You can see in the sample photo above the shallow depth of field that lets me isolate my subject from the foreground and background, making it easier to guide the viewer to what I want them to see. This is next to impossible to achieve with wider lenses, unless you edit the image on your phone or computer.
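To get a feel for why the long lens isolates subjects so dramatically, you can plug numbers into the standard depth-of-field formulas. The sketch below is a rough approximation only: it assumes a full-frame sensor with the usual 0.03mm circle of confusion, and the 5m subject distance is hypothetical, not taken from the photo above:

```python
def total_dof_m(focal_mm: float, f_number: float, subject_dist_m: float,
                coc_mm: float = 0.03) -> float:
    """Approximate total depth of field in metres, using the standard
    hyperfocal-distance formulas (thin-lens approximation)."""
    s = subject_dist_m * 1000.0                          # subject distance in mm
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm   # hyperfocal distance in mm
    near = s * (h - focal_mm) / (h + s - 2 * focal_mm)
    far = s * (h - focal_mm) / (h - s) if s < h else float("inf")
    return (far - near) / 1000.0

# Same aperture and (hypothetical) 5m subject distance, different focal lengths:
for focal in (35, 70, 200):
    print(f"{focal}mm f/2.8 at 5m: ~{total_dof_m(focal, 2.8, 5):.2f} m in focus")
# The 200mm keeps only about 10cm in focus, which is why the background melts away.
```

With these assumptions, the 35mm lens keeps several metres in focus while the 200mm keeps only centimetres, which is the effect described above.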
This is going to be an important part of your kit if you’re into sports and wildlife photography. The 200mm end gives you enough reach to capture the action up close even if you’re sitting courtside or behind the safety barriers of an F1 race. It also lets you photograph birds and other animals without endangering them or yourself.
However, just like the 24-70mm, this lens is quite expensive. The Canon EF 70-200mm f/2.8L IS III USM currently costs $2,399 on Amazon, a small fortune for most hobbyists but a crucial investment for professionals. But whether you plan to turn your passion into a business or just want to enjoy capturing the beauty of the world the way you see it, you need to try out this lens at least once in your life to see the possibilities that it will give you.
Despite looking much the same as the older models, Bowers & Wilkins has given these headphones a significant overhaul. The headband has been redesigned to fit a wider range of heads, the controls have been reshaped to be easier to find and use, and the headphones are slimmer for a more attractive profile.
The only issue we have is with the controls, which we didn’t feel needed to be changed, though they work well enough.
These headphones feature noise cancelling and a transparency mode, and despite Bowers & Wilkins’ claims of improving both areas, the noise cancelling isn’t as strong as the Bose QuietComfort Ultra Headphones or Sony WH-1000XM6. The transparency mode could be clearer too. ANC is not these headphones’ strongest point.
The Bowers & Wilkins Music app offers the means to customise bass and treble, as well as a custom EQ option to create your own sound profile, a first for a pair of Bowers wireless headphones.
These headphones keep the feature set relatively simple and aren’t as ‘smart’ or as feature-laden as the less expensive Sony WH-1000XM5, but the app does have built-in streaming support for services such as Qobuz, Deezer, and Tidal.
The battery life remains 30 hours of listening from one charge, though in our tests we found it could go longer with an Android smartphone.
Bluetooth support includes aptX Lossless, one of the highest-quality wireless codecs, and as usual the wireless connection is excellent.
These are also among the best headphones for call quality, offering great clarity and detail while keeping background sounds to a minimum.
The sound quality here is the best it’s been for the Px7 range. It’s energetic, clear, expressive and natural, and the headphones’ levels of detail, dynamism and sense of spaciousness make them one of the best-sounding models on the market.
Low frequencies have more depth and power, the midrange is detailed and the high frequencies clear. If you’re after a pair of wireless headphones for the sound, there’s none better at this price than the Px7 S3.
Sitting in a recent district administrator meeting, I found myself excited about a new student data platform my district is rolling out. This new tool, called by a catchy acronym and presented on a flashy dashboard, would collect a variety of information about student skills, mindsets and achievement. It would let us break down information by subgroup and assign overall scores to students, helping us identify who needs additional support.
Initially, I was enthusiastic about how it could empower teachers to better understand students and improve outcomes. But since then, after conversations with the teachers in my building and reflecting on my own experiences using data in the classroom, I’ve begun to wonder whether we are focusing on the wrong data or placing too much emphasis on data overall.
I love looking at data. I’m excited when data surprises me or shows me something more clearly. It’s motivating to see trend lines sloping upward and green arrows pointing toward the sky. Data can help us see the bigger picture when looking at larger systems. We can see which schools are suspending too many students of color and which districts are improving reading scores. As an administrator, I find this illuminating and helpful in guiding how schools make decisions.
But as data trickles down to classrooms and individual students, the usefulness and impact get murkier. In the Montessori school where I teach, where our focus is guiding the child according to their interests and readiness, the data we have to collect affects what we focus on, often in unexpected ways, and sometimes to the detriment of the system itself.
Teaching to the Test
My school is a successful one, and looking at our annual school report card should be a source of pride for the teachers. The report card is based primarily on our state test scores in math and reading, and various calculations are made from our students’ performance on it. But when we shared the most recent report card that showed our school once again exceeded expectations, the results were met with shrugs and muted applause. It isn’t that they aren’t proud of what our students can do; they just recognize the narrowness of the data and how indirectly it connects to what is happening in their Montessori classrooms.
When I pointed out that our report card showed math achievement was an area for improvement, the response was, “Are you saying we should teach to the test?” They know that we could game the system by focusing on test prep and the specific questions their students might encounter. Because we follow a Montessori curriculum with three grade levels in our classrooms, our sequence doesn’t always align with grade-level standards, which can show up on tests, with students scoring poorly on topics they haven’t been introduced to yet. We could align our curriculum with the test and focus our teaching on what the test assesses, but doing so goes against our philosophy of allowing students to make choices about their learning at their own pace.
With this tension in mind, I wonder whether data distorts the focus of education. Our current focus on reading and math scores, based on standardized testing, is part of what we want our schools to do. But teachers know that students are capable of achieving much more than our report cards show. Is there some golden indicator that we just haven’t found yet — a measurement like happiness or flourishing — that would be more meaningful? And of course, if we find it, won’t it also become distorted?
Information Overload
There is also a heavy focus in our district on using data to determine which students qualify for additional support through differentiation, interventions and individualized instruction. Administration requires us to hold monthly meetings to review student data and determine who is progressing and who might need more support. On one level, this seems like a great practice for identifying who needs help, but in reality, the system’s capacity to act on that information is overstretched, leading to distortion and ultimately to burnout.
I remember my frustrations as a teacher in these meetings. The data was interesting and could help you confirm or question ideas you had about students based on your classroom observations. But it didn’t often provide helpful information for supporting students. The time spent in these meetings outweighed the benefit I got from them and took away from the little time I had to prepare and plan for my students.
Teachers I work with have regularly expressed feeling overwhelmed by the amount of information they need to consider and the testing required to gather it. In our early grades, due to a new state law mandating early literacy assessments, students are tested monthly on letter-sound identification and oral reading fluency. This generates an unending stream of data to grapple with and a constant feeling of needing to do more to address it, all of which adds to stress on teachers, students and the system. I’ve seen amazing teachers, skilled at connecting with kids and providing rich learning experiences, brought to tears because there was too much red on a data spreadsheet.
Teachers don’t have the time to assess and examine all the data they’re now expected to, and monthly checks of early reading indicators take time away from actually teaching those skills. Being responsive to the data you gather means stopping what you’re doing and finding new ways to help kids learn what the data says they need. Teachers are expected to find new resources and determine when and how to work with small groups that need similar support, while also providing meaningful learning opportunities for other students. And, of course, different kids need different things, so you’d need to do this for multiple groups, which is an unrealistic expectation of any teacher’s capacity.
Meaningful Measurement
Schools, as they are currently designed, were never meant to be responsive to the amount of data we’re collecting. They were designed to teach a group of students a set of information in a specific sequence each year, and then grade them on how well they learned what they were expected to learn. They were designed to tell us which students could meet the standards and which couldn’t, not to ensure that each child could learn and flourish.
When I was a classroom teacher, I kept track of how many books my students read each month. It wasn’t research-backed or scientifically valid, but I found the data helpful for identifying who was and wasn’t reading, and thinking about how I could support them. In some cases, it helped me direct kids to books that they might get excited about; in other cases, it just let me know that a particular kid wasn’t that into reading, and that that might have to be OK for now. The data wasn’t complicated, but it let me quantify what I was observing in my classroom in a way that was meaningful to me and, most importantly, helped me connect with my students as whole people.
A key component of Montessori philosophy is the teacher as observer — watching and documenting what students choose and do to understand and assess what they are ready for. Every teacher should have the time and space to measure and track what feels meaningful and helpful to them.
This may look different for every teacher, but the important factor is that it has meaning to them and is connected to their students and their practice. Likewise, we need to remember that standardizing the expectations for students goes against what we know about how people develop. There’s always going to be variation in a dataset — there’s no metric on which we are all the same.
As an administrator, my responsibility is to understand and use data in ways that are helpful, while also protecting teachers and students from distractions and distortions that undermine the larger goals of creating opportunities for growth and learning for all students.
Ultimately, data should serve as a guide rather than a governor, informing our decisions without eclipsing the human elements of teaching and learning. If we can strike that balance, we can create systems that honor both the complexity of children and the professional wisdom of the educators who know them best.
Memory prices across DRAM, NAND and HBM have surged 80 to 90% quarter-over-quarter in Q1 2026, according to Counterpoint Research’s latest Memory Price Tracker. The price of a 64GB RDIMM has jumped from a Q4 2025 contract price of $450 to over $900, and Counterpoint expects it to cross $1,000 in Q2.
NAND, relatively stable last quarter, is tracking a parallel increase. Device makers are cutting DRAM content per device, swapping TLC SSDs for cheaper QLC alternatives, and shifting orders from the now-scarce LPDDR4 to LPDDR5 as new entry-level chipsets support the newer standard. DRAM operating margins hit the 60% range in Q4 2025 — the first time conventional DRAM margins surpassed HBM — and Q1 2026 is on track to set all-time highs.
With an ultimate target of €1bn, Spain’s Mundi Ventures closed on €750m this week for its Kembara Fund for deep tech and climate start-ups.
Deep tech targets the world’s biggest problems, from climate and energy to defence and healthcare. Europe has the talent and the start-ups, but has struggled on the capital side when it comes to scaling them. This is the gap Mundi Ventures’ Kembara Fund aims to address with its focus on Series B and C funding of €15m-€40m, and beyond, for companies based in EU member states.
According to the Kembara team, Europe produces 28pc of global deep tech innovation, but only 3pc of European deep tech companies successfully raise Series B or C rounds. It is that very gap that the Kembara Fund is hoping to bridge using “€1bn dedicated to backing Europe’s deep tech champions at the exact moment when technology is proven and global scale becomes possible”.
In his own words, Kembara partner Yann de Vries says the funding cliff is “deeply personal” to him after his experience with Lilium, which declared insolvency in 2025 having failed to raise adequate investment. Its patents now belong to Archer Aviation.
“A decade ago at Lilium – a leading electric aviation company – we went from a sci-fi idea in a hangar to a NASDAQ-listed company in five years,” de Vries said in a LinkedIn post yesterday.
“I saw first-hand how brutally hard it is for European deep tech teams to raise €50m-€100m rounds and scale globally. That journey is why we built Kembara.”
The European Investment Fund (EIF) is a lead backer of Kembara, announcing in July last year that it would invest €350m in Kembara Fund 1. At the time, the EIF said it was the experience of the Kembara management team and its “differentiated strategy” that were key to receiving its support.
“Companies that achieve strategic autonomy in critical technologies – from AI and quantum computing to space systems and clean energy – have the potential to become trillion-dollar global champions,” said de Vries in his post.
“Our ambition is clear: fix Europe’s growth-stage funding gap, where only 3pc of deep tech companies make it to Series B/C today.”
Other Kembara partners include Javier Santiso, Robert Trezona, Pierre Festal and Siraj Khaliq, who de Vries says have a combined experience of 100 years in deep tech, in companies like SpaceX, Palantir, PsiQuantum, OpenAI, Lilium, Ceres Power, Anduril and The Exploration Company.
“We are entrepreneurs united by a shared mission: to build Europe’s leading deep tech platform – one that keeps Europe competitive in the global technology race, tackles the world’s most pressing challenges and delivers outsized returns,” said de Vries. “This is only the beginning…”
Hollywood’s recent attempts to build entertainment around AI have consistently underperformed or outright flopped, whether the AI in question is a plot device or a production tool. The horror sequel M3GAN 2.0, Mission: Impossible — The Final Reckoning, and Disney’s Tron: Ares all disappointed at the box office in 2025 despite centering their narratives on AI.
The latest casualty is Mercy, a January 2026 crime thriller in which Chris Pratt faces an AI judge bot played by Rebecca Ferguson; one reviewer has already called it “the worst movie of 2026,” and its ticket sales have been mediocre. AI-generated content hasn’t fared any better. Darren Aronofsky executive-produced On This Day…1776, a YouTube web series that uses Google DeepMind video generation alongside real voice actors to dramatize the American Revolution. Viewer response has been brutal — commenters mocked the uncanny faces and the fact that DeepMind rendered “America” as “Aamereedd.”
A Taika Waititi-directed Xfinity commercial set to air during this weekend’s Super Bowl, which de-ages Jurassic Park stars Sam Neill, Laura Dern and Jeff Goldblum, has already been mocked for producing what one viewer called “melting wax figures.”
The removal of an ICE-monitoring app almost a year ago has triggered new questions about whether the US Department of Justice crossed a constitutional line in its dealings with Apple and Google.
How ICEBlock appeared in the App Store before being pulled
On Friday, House Judiciary Committee ranking member Jamie Raskin vowed to investigate the Department of Justice over allegations that it pressured tech giants into removing ICE tracking apps. In a letter addressed to Attorney General Pam Bondi, Raskin asks, “Why is the Department of Justice (DOJ) violating the First Amendment by coercing big tech to block access to lawful apps that the American people use to record, report, and monitor the actions of our own government officers?”
For decades, PC gaming meant owning a monolith: a massive, flashing tower that dominated your floor space. But in 2026, the era of the giant box is over. Components have become efficient enough that you no longer need 60 liters of air to cool them. The “Small Form Factor” (SFF) movement’s gone mainstream, proving that you can fit an RTX 5080 and a top-tier CPU into a case the size of a shoebox.
It’s minimal, it’s sophisticated, and it looks a lot better on a desk than a plastic tower. If you’re ready to downsize without downgrading performance, this is where you should start.
The Terra changed the game by proving a PC could look like mid-century furniture. Featuring a genuine walnut wood front panel and anodized aluminum sheets, it’s designed to be seen. The “sandwich” layout puts the GPU on one side and the CPU on the other, allowing it to stay incredibly small (10.4 liters) while still fitting full-sized graphics cards.
While the Terra is rustic, the Era 2 is pure modern elegance. The sculpted silver aluminum exterior feels like high-end audio equipment. It’s optimized for airflow with a unique chimney design, pulling cool air from the bottom and exhausting it out the top. It’s the perfect housing for a professional creative workstation.
This is the reference standard for water-cooled SFF builds. Collaborating with DAN Cases, Lian Li created a sub-11 liter case that somehow fits a 240mm AIO liquid cooler. It’s an industrial, no-nonsense aluminum box that maximizes every millimeter of internal space. If you want the smallest possible footprint with liquid cooling, this is it.
Building in a small case can be intimidating. Cooler Master solves this with the MAX V2. It comes pre-installed with a custom 280mm AIO cooler and an 850W Gold power supply, with the cables already routed and managed. You just drop in your motherboard and GPU, and you are done. It is designed to handle next-gen power, officially ready for cards like the RTX 5080. It is also 12% off right now.
Small builds used to mean low power, but not anymore. The Loki pushes 850W of Platinum-rated efficiency, enough to drive top-tier silicon. It uses the slightly longer “SFX-L” standard to fit a larger 120mm fan, making it quieter than standard small power supplies. Plus, it includes an RGB fan if you want a subtle glow.
Ask any SFF builder what PSU to buy, and they’ll say “Corsair SF750.” The 2024 refresh brings ATX 3.1 compliance and native PCIe 5.1 cables for modern GPUs. It is incredibly dense, reliable, and features a zero-RPM mode so the fan doesn’t even spin during light work. You can grab it now for 20% off.
In cases like the Fractal Terra, you can’t fit a big liquid cooler. You need high-performance air cooling that stays low. The Big Shuriken 4 is designed exactly for this. At just 67mm tall, it fits where standard coolers can’t, yet it can handle up to 200W of heat thanks to its dense fin stack and high-static pressure fan.
The bottom line
If you want a PC that doubles as home decor, the Fractal Design Terra is the clear winner. For first-time builders who don’t want to stress about cable management or part compatibility, the Cooler Master NR200P MAX V2 is a cheat code that saves hours of frustration. But if you need absolute maximum cooling for high-end components in the smallest possible footprint, the Lian Li A4-H2O remains the gold standard.