
Today’s NYT Mini Crossword Answers for April 22


Looking for the most recent Mini Crossword answer? Click here for today’s Mini Crossword hints, as well as our daily answers and hints for The New York Times Wordle, Strands, Connections and Connections: Sports Edition puzzles.


Today’s Mini Crossword features some Earth Day-specific clues. Read on for all the answers. And if you could use some hints and guidance for daily solving, check out our Mini Crossword tips.

If you’re looking for today’s Wordle, Connections, Connections: Sports Edition and Strands answers, you can visit CNET’s NYT puzzle hints page.

Read more: Tips and Tricks for Solving The New York Times Mini Crossword

Let’s get to those Mini Crossword clues and answers.

The completed NYT Mini Crossword puzzle for April 22, 2026.

NYT/Screenshot by CNET

Mini across clues and answers

1A clue: It’s clearly recyclable!
Answer: GLASS

6A clue: ___ Day (April 22nd observance)
Answer: EARTH

7A clue: Thick, underground part of a plant stem
Answer: TUBER

8A clue: Small cluster of trees
Answer: GROVE

9A clue: Rowed, as a boat
Answer: OARED

Mini down clues and answers

1D clue: From the ___ (right at the beginning)
Answer: GETGO

2D clue: Author Ingalls Wilder who wrote “Little House on the Prairie”
Answer: LAURA

3D clue: ___ Day, observance on the last Friday of April
Answer: ARBOR

4D clue: Actor Buscemi
Answer: STEVE

5D clue: Rip into bits, as paper
Answer: SHRED

Google’s new Deep Research and Deep Research Max agents can search the web and your private data


Google on Monday unveiled the most significant upgrade to its autonomous research agent capabilities since the product’s debut, launching two new agents — Deep Research and Deep Research Max — that for the first time allow developers to fuse open web data with proprietary enterprise information through a single API call, produce native charts and infographics inside research reports, and connect to arbitrary third-party data sources through the Model Context Protocol (MCP).

The release, built on Google’s Gemini 3.1 Pro model, marks an inflection point in the rapidly intensifying race to build AI systems that can autonomously conduct the kind of exhaustive, multi-source research that has traditionally consumed hours or days of human analyst time. It also represents Google’s clearest bid yet to position its AI infrastructure as the backbone for enterprise research workflows in finance, life sciences, and market intelligence — industries where the stakes of getting information wrong are extraordinarily high.

“We are launching two powerful updates to Deep Research in the Gemini API, now with better quality, MCP support, and native chart/infographics generation,” Google CEO Sundar Pichai wrote on X. “Use Deep Research when you want speed and efficiency, and use Max when you want the highest quality context gathering & synthesis using extended test-time compute — achieving 93.3% on DeepSearchQA and 54.6% on HLE.”

Both agents are available starting today in public preview via paid tiers of the Gemini API, accessible through the Interactions API that Google first introduced in December 2025.

Why Google built two research agents instead of one

The launch introduces a tiered architecture that reflects a fundamental tension in AI agent design: the tradeoff between speed and thoroughness.

Deep Research, the standard tier, replaces the preview agent Google released in December and is optimized for low-latency, interactive use cases. It delivers what Google describes as significantly reduced latency and cost at higher quality levels compared to its predecessor. The company positions it as ideal for applications where a developer wants to embed research capabilities directly into a user-facing interface — think a financial dashboard that can answer complex analytical questions in near-real time.

Deep Research Max occupies the opposite end of the spectrum. It leverages extended test-time compute — a technique where the model spends more computational cycles iteratively reasoning, searching, and refining its output before delivering a final report. Google designed it for asynchronous, background workflows: the kind of task where an analyst team kicks off a batch of due diligence reports before leaving the office and expects exhaustive, fully sourced analyses waiting for them the next morning.

The Google DeepMind team framed the distinction on X: “Deep Research: Optimized for speed and efficiency. Perfect for interactive apps needing quicker responses. Deep Research Max: It uses extra time to search and reason. Ideal for exhaustive context gathering and tasks happening in the background.”

“Deep Research was our first hosted agent in the API and has gained a ton of traction over the last 3 months, very excited for folks to test out the new agents and all the improvements, this is just the start of our agents journey,” Logan Kilpatrick, who leads developer relations for Google’s AI efforts, wrote on X.

MCP support lets the agents tap into private enterprise data for the first time

Perhaps the most consequential feature in today’s release is the addition of Model Context Protocol support, which transforms Deep Research from a sophisticated web research tool into something more closely resembling a universal data analyst.

MCP, an emerging open standard for connecting AI models to external data sources, allows Deep Research to securely query private databases, internal document repositories, and specialized third-party data services — all without requiring sensitive information to leave its source environment. In practical terms, this means a hedge fund could point Deep Research at its internal deal-flow database and a financial data terminal simultaneously, then ask the agent to synthesize insights from both alongside publicly available information from the web.

Google disclosed that it is actively collaborating with FactSet, S&P, and PitchBook on their MCP server designs, a signal that the company is pursuing deep integration with the data providers that Wall Street and the broader financial services industry already rely on daily. The goal, according to the blog post authored by Google DeepMind product managers Lukas Haas and Srinivas Tadepalli, is to “let shared customers integrate financial data offerings into workflows powered by Deep Research, and to enable them to realize a leap in productivity by gathering context using their exhaustive data universes at lightning speed.”

This addresses one of the most persistent pain points in enterprise AI adoption: the gap between what a model can find on the open internet and what an organization actually needs to make decisions. Until now, bridging that gap required significant custom engineering. MCP support, combined with Deep Research’s autonomous browsing and reasoning capabilities, collapses much of that complexity into a configuration step. Developers can now run Deep Research with Google Search, remote MCP servers, URL Context, Code Execution, and File Search simultaneously — or turn off web access entirely to search exclusively over custom data. The system also accepts multimodal inputs including PDFs, CSVs, images, audio, and video as grounding context.
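
The article does not include code, but the shape of that "configuration step" can be sketched. Everything below is illustrative: the field names (`agent`, `tools`, `mcp_server`, and so on) are assumptions made up for this sketch, not the actual Gemini API schema.

```python
# Hypothetical sketch of assembling a research request that mixes open-web
# tools with private MCP data sources, per the article's description.
# All field names here are assumptions, not Google's real API.

def build_research_request(query, use_web=True, mcp_servers=(), files=()):
    """Assemble a hypothetical Deep Research request body."""
    tools = []
    if use_web:
        # Open-web tooling mentioned in the article.
        tools += ["google_search", "url_context", "code_execution"]
    # Private/enterprise sources: remote MCP servers and uploaded files.
    tools += [{"mcp_server": url} for url in mcp_servers]
    if files:
        tools.append({"file_search": list(files)})
    return {"agent": "deep-research", "input": query, "tools": tools}

# Web access turned off entirely: research runs only over custom data.
req = build_research_request(
    "Summarize our Q3 deal flow against public comps",
    use_web=False,
    mcp_servers=["https://mcp.example-hedgefund.internal"],
    files=["deals_q3.csv"],
)
```

The point of the sketch is the collapse the article describes: choosing which data universes the agent can see becomes a matter of toggling entries in one request rather than custom engineering.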

Native charts and infographics turn AI reports into stakeholder-ready deliverables

The second headline feature — native chart and infographic generation — may sound incremental, but it addresses a practical limitation that has constrained the usefulness of AI-generated research outputs in professional settings.

Previous versions of Deep Research produced text-only reports. Users who needed visualizations had to export the data and build charts themselves, a friction point that undermined the promise of end-to-end automation. The new agents generate high-quality charts and infographics inline within their reports, rendered in HTML or Google’s Nano Banana format, dynamically visualizing complex datasets as part of the analytical narrative.

“The agent generates HTML charts and infographics inline with the report. Not screenshots. Not suggestions to ‘visualize this data.’ Actual rendered charts inside the markdown output,” noted AI commentator Shruti Mishra on X, capturing the practical significance of the change.

For enterprise users — particularly those in finance and consulting who need to produce stakeholder-ready deliverables — this transforms Deep Research from a tool that accelerates the research phase into one that can potentially produce near-final analytical products. Combined with a new collaborative planning feature that lets users review, guide, and refine the agent’s research plan before execution, and real-time streaming of intermediate reasoning steps, the system gives developers granular control over the investigation’s scope while maintaining the transparency that regulated industries demand.

How Deep Research evolved from a consumer chatbot feature to enterprise platform infrastructure

Today’s release crystallizes a strategic narrative Google has been building for months: Deep Research is not merely a consumer feature but a piece of infrastructure that powers multiple Google products and is now being offered to external developers as a platform.

The blog post explicitly notes that when developers build with the Deep Research agent, they tap into “the same autonomous research infrastructure that powers research capabilities within some of Google’s most popular products like Gemini App, NotebookLM, Google Search and Google Finance.” This suggests that the agent available through the API is not a stripped-down version of what Google uses internally but the same system, offered at platform scale.

The journey to this point has been remarkably rapid. Google first introduced Deep Research as a consumer feature in the Gemini app in December 2024, initially powered by Gemini 1.5 Pro. At the time, the company described it as a personal AI research assistant that could save users hours by synthesizing web information in minutes. By March 2025, Google upgraded Deep Research with Gemini 2.0 Flash Thinking Experimental and made it available for anyone to try. Then came the upgrade to Gemini 2.5 Pro Experimental, where Google reported that raters preferred its reports over competing deep research providers by more than a 2-to-1 margin. The December 2025 release was the pivot to developer access, when Google launched the Interactions API and made Deep Research available programmatically for the first time, powered by Gemini 3 Pro and accompanied by the open-source DeepSearchQA benchmark.

The underlying model driving today’s improvements is Gemini 3.1 Pro, which Google released on February 19, 2026. That model represented a significant leap in core reasoning: on ARC-AGI-2, a benchmark evaluating a model’s ability to solve novel logic patterns, 3.1 Pro scored 77.1% — more than double the performance of Gemini 3 Pro. Deep Research Max inherits that reasoning foundation and layers autonomous research behaviors on top of it, achieving 93.3% on DeepSearchQA (up from 66.1% in December) and 54.6% on Humanity’s Last Exam (up from 46.4%).

Google’s new Deep Research Max agent outperformed its December predecessor across nearly all qualitative dimensions in internal expert evaluations — but the older version held an edge in internal consistency and faithfulness. (Source: Google DeepMind)

Google faces a crowded field of competitors building autonomous research agents

Google is not operating in a vacuum. The launch arrives amid intensifying competition in the autonomous research agent space. OpenAI has been developing its own agent capabilities within ChatGPT under the codename Hermes, which includes an agent builder, templates, scheduling, and Slack integration, according to reports circulating on social media. Perplexity has built its business around AI-powered research. And a growing ecosystem of startups is attacking various slices of the automated research workflow.

What distinguishes Google’s approach is the combination of its search infrastructure — which gives Deep Research access to the broadest and most current index of web information available — with the MCP-based connectivity to enterprise data sources. No other company currently offers a research agent that can simultaneously query the open web at Google Search’s scale and navigate proprietary data repositories through a standardized protocol. The pricing structure also signals Google’s intent to drive adoption: according to Sim.ai, which tracks model pricing, the Deep Research agent in the December preview was priced at $2 per million input tokens and $2 per million output tokens with a 1 million token context window — positioning it as cost-competitive for the volume of research output it generates.

Not everyone greeted the announcement with unalloyed enthusiasm, however. Several users on X noted that the new agents are available only through the API, not in the Gemini consumer app. “Not on Gemini app,” observed TestingCatalog News, while another user wrote, “Google keeps punishing Gemini App Pro subscribers for some reason.” Others raised concerns about the presentation of benchmark results, with one user arguing that Google’s charts could be “misleading” in how they represent percentage improvements. These complaints point to a broader tension in Google’s AI strategy: the company is increasingly directing its most advanced capabilities toward developers and enterprise customers who access them through APIs, while consumer-facing products sometimes lag behind.

Deep Research Max led all competitors on DeepSearchQA and BrowseComp, but GPT 5.4 edged ahead on Humanity’s Last Exam, a benchmark measuring reasoning and knowledge. All results were evaluated by Google DeepMind using publicly available model APIs. (Source: Google DeepMind)

What Deep Research Max means for finance, biotech, and the future of knowledge work

The practical implications of today’s launch are most immediately felt in industries that depend on exhaustive, multi-source research as a core business function. In financial services, where analysts routinely spend hours assembling due diligence reports from scattered sources — SEC filings, earnings transcripts, market data terminals, internal deal memos — Deep Research Max offers the possibility of automating the initial research phase entirely. The FactSet, S&P, and PitchBook partnerships suggest Google is serious about making this work with the data infrastructure that financial professionals already use.

In life sciences, the blog post notes that Google has collaborated with Axiom Bio, which builds AI systems to predict drug toxicity, and found that Deep Research unlocked new levels of initial research depth across biomedical literature. In market research and consulting, the ability to produce stakeholder-ready reports with embedded visualizations and granular citations could compress project timelines from days to hours.

The key question is whether the quality and reliability of these automated outputs will meet the standards that professionals in these fields demand. Google’s benchmark numbers are impressive, but benchmarks measure performance on standardized tasks — real-world research is messier, more ambiguous, and often requires the kind of judgment that remains difficult to automate. Deep Research and Deep Research Max are available now in public preview via paid tiers of the Gemini API, with availability on Google Cloud for startups and enterprises coming soon.

Eighteen months ago, Deep Research was a feature that helped grad students avoid drowning in browser tabs. Today, Google is betting it can replace the first shift at an investment bank. The distance between those two ambitions — and whether the technology can actually close it — will define whether autonomous research agents become a transformative category of enterprise software or just another AI demo that dazzles on benchmarks and disappoints in the conference room.

SpaceX and Cursor strike partnership that might end in a $60 billion acquisition


SpaceX and AI company Cursor have struck a new partnership that could see the owner of X buy the AI company for $60 billion later this year. “SpaceXAI and  @cursor_ai  are now working closely together to create the world’s best coding and knowledge work AI,” SpaceX wrote in a post on X.

According to SpaceX, the deal allows it either to invest $10 billion in the company known for its AI coding tool, or to acquire it entirely “later this year” for $60 billion. If an acquisition were to happen, it’s not clear at what point Cursor could officially join the fold of Elon Musk’s rapidly expanding and increasingly enmeshed web of companies. SpaceX bought xAI, the billionaire’s AI company that also controls X, earlier this year. SpaceX is currently preparing to go public this summer in what will likely be the biggest initial public offering (IPO) in history.

Cursor, which has reportedly been in talks to raise its own $2 billion round of funding, is known for its AI coding tool of the same name that’s become the vibe coding platform of choice for many developers. It allows people to use either its own models or those from other leading AI companies, including OpenAI, Google, Anthropic and xAI.

In a statement, Cursor said its partnership with SpaceX will “accelerate our model training efforts” while addressing infrastructure-related issues that have slowed it down in the past. “We’ve wanted to push our training efforts much further, but we’ve been bottlenecked by compute,” the company said. “With this partnership, our team will leverage xAI’s Colossus infrastructure to dramatically scale up the intelligence of our models for coding and beyond.”

The Electromechanical Computer Of The B-52’s Star Tracker


The Angle Computer of the B-52, opened. (Credit: Ken Shirriff)

In the era before convenient global positioning satellites could be queried for one’s current location, military aircraft required dedicated navigators in order to not get lost. This changed with increasing automation, including the arrival of ever more sophisticated electromechanical computers, such as the angle computer in the B-52 bomber’s star tracker that [Ken Shirriff] recently had a poke at.

We covered star trackers before, with these devices enabling the automation of celestial navigation. In effect, as long as you have a map of the visible stars and an accurate time source, you will never get lost on Earth — or a few kilometers above its surface, as the case may be.

The B-52’s Angle Computer is part of the Astro Compass, which is the star tracker device that locks onto a star and outputs a heading that’s accurate to a tenth of a degree, while also allowing for position to be calculated from it. Inside the device a lot of calculations are being performed as explained in the article, though the full equations are quite complex.

Not burdening the navigator of a B-52 with having to ogle stars themselves with an instrument and scribbling down calculations on paper is a good idea, of course. Instead the Angle Computer solves the navigational triangle mechanically, essentially by modelling the celestial sphere with a metal half-sphere. The solving is thus done using this physical representation, involving numerous gears and other parts that are detailed in the article.
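
As a rough sense of what the metal half-sphere is computing, the core of the navigational triangle can be written out in the standard spherical-trigonometry form: altitude and azimuth of a star from the observer's latitude, the star's declination, and the local hour angle. This is a minimal sketch of the trigonometry involved; the Angle Computer's actual equations, as the article notes, are considerably more complex.

```python
import math

def solve_navigational_triangle(lat_deg, dec_deg, ha_deg):
    """Altitude and azimuth (degrees) of a star, given observer latitude,
    star declination (from the star map), and local hour angle (from the
    accurate time source). Standard spherical-trig solution."""
    lat, dec, ha = map(math.radians, (lat_deg, dec_deg, ha_deg))
    # Altitude of the star above the horizon.
    sin_alt = (math.sin(lat) * math.sin(dec)
               + math.cos(lat) * math.cos(dec) * math.cos(ha))
    alt = math.asin(sin_alt)
    # Azimuth measured from true north.
    cos_az = ((math.sin(dec) - math.sin(lat) * sin_alt)
              / (math.cos(lat) * math.cos(alt)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))  # clamp rounding error
    if math.sin(ha) > 0:  # star west of the meridian
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)

# A star on the meridian (hour angle 0) with declination 20°, seen from
# 50° N: it stands 60° above the horizon, due south (azimuth 180°).
alt, az = solve_navigational_triangle(50.0, 20.0, 0.0)
```

Running this comparison backwards — observing altitude and azimuth and solving for position — is how a star fix yields a location, which the mechanical computer does with gears instead of trig tables.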

In addition to the mechanical components there are of course the motors driving it, the feedback mechanisms, and the ways to interface with the instruments. For the 1950s this was definitely the way to design a computer like this, but as semiconductor transistors swept the computing landscape, this marvel of engineering would before long find itself replaced by a fully digital version.

NYT Strands hints and answers for Wednesday, April 22 (game #780)


Looking for a different day?

A new NYT Strands puzzle appears at midnight each day for your time zone – which means that some people are always playing ‘today’s game’ while others are playing ‘yesterday’s’. If you’re looking for Tuesday’s puzzle instead then click here: NYT Strands hints and answers for Tuesday, April 21 (game #779).

Strands is the NYT’s latest word game after the likes of Wordle, Spelling Bee and Connections – and it’s great fun. It can be difficult, though, so read on for my Strands hints.

Western Electric Revives U.S. Vacuum Tube Manufacturing, Unveils New Amplifier Designs at AXPONA 2026


Western Electric didn’t just show up at AXPONA 2026 with new amplifier designs. It tapped into something a lot of us have been thinking about for years.

I’ve had a thing for tubes since I was a kid, learning the basics with my grandfather before I was ten. Decades later, that interest hasn’t faded. It has just gotten more expensive and harder to justify shelf space. I’ve owned just about every type of tube amplifier you can name and built a few along the way. My wife would argue the collection peaked years ago, but I’m still not walking past a good 6922 or KT88 without at least thinking about it.

The bigger issue hasn’t been the gear. It has been the tubes themselves. Options have thinned out to the point where “choice” often feels like theater. I remember standing in a large musician supply shop staring at five different brands of 12AX7. Different logos, different boxes, different prices. Same factory in Russia. Same tube.

That’s a long way from where things used to be. There was a time when American manufacturing alone offered a deep bench. RCA, GE, Sylvania, Tung Sol. All building serious product, alongside a strong European presence from Mullard, Telefunken, Philips, and others. Today, new production is concentrated in a handful of places: Slovakia, China, Russia, and yes, Rossville, GA, USA. Which is why what Western Electric is doing right now actually matters.

The operation in Rossville, GA, USA belongs to Western Electric, one of the most storied names in American tube manufacturing. For a long stretch, that name carried more history than output. Western Electric was not producing tubes at all. Even now, the lineup coming out of that factory is focused and limited. Two tubes. The 300B and the 308B.

The 300B remains the centerpiece. It powers Western Electric’s Type 91E integrated amplifier and continues to define what the brand does best. The 308B is a different story. Production is being ramped back up to support the new 100E monoblock amplifiers, which signals a broader push beyond a single legacy tube.

Both amplifiers were on display in Western Electric’s room at AXPONA 2026. And yes, they deliver exactly what you think they will. If you have even a passing interest in tubes, this is the kind of gear that stops you mid sentence and makes you reconsider your financial priorities.

The 300B Reality

The 300B has been around since the 1930s and, during the golden age of tubes, it powered everything from PA systems to theater installations and clubs around the world. It wasn’t boutique back then. It was the workhorse.

Today, it sits on the other end of the spectrum. Among the most coveted tubes in the new old stock market, with prices that can start just under a thousand dollars and climb into several thousand for early examples. Add the premium for matched pairs or quads and the cost of keeping a 300B amplifier running gets uncomfortable fast. And that’s before you factor in the risk. Most NOS tubes come with little to no warranty. You’re buying history and hoping it holds up.

That’s where Western Electric shifts the conversation. Their current production 300B comes in at $699 each or $1,499 for a matched pair. Still not cheap, but grounded in reality compared to NOS pricing. More importantly, they back it with a five-year warranty. That alone changes the math for anyone serious about running a 300B-based system long term.

The 308B: Big Glass, Big Power, and Still a Work in Progress

The 308B is not subtle. It stands roughly 14 inches tall and close to 4 inches in diameter. This is the kind of tube that makes everything around it look like it needs to hit the gym.

In Western Electric’s 100E monoblock, a single 308B is rated to deliver 160 watts. That’s more output from one tube than many push pull designs manage with a quad of KT88s. It’s an ambitious play and one that suggests Western Electric is not content to stay in the 300B comfort zone.

Details are still catching up to the product. Pricing and availability have not been finalized, and even the web page listed in the company’s show materials was still under construction the week after AXPONA 2026. That tells you where this sits. Early, promising, and not quite ready for prime time.

91E and 100E: How Western Electric Is Actually Using These Tubes

Western Electric wasn’t just putting tubes on pedestals at AXPONA 2026. They showed how they’re being used across two very different amplifier designs.

The 300B plays two roles here. In the 91E integrated amplifier, it’s the output tube. In the 100E monoblock, it shifts upstream and handles the mid stage. The spotlight in the 100E belongs to the 308B, which drives the final output stage and does the heavy lifting.

The 91E integrated amplifier, priced at $8,000, uses a pair of 300B tubes to deliver roughly 20 watts per channel. That number will not impress anyone chasing big power, but that’s not the point. Western Electric built flexibility into the design with interchangeable output modules for 4, 8, and 16 ohm loads. That opens the door to a wide range of loudspeakers, although higher sensitivity designs will make the most sense here.

Connectivity is more modern than you might expect. There are moving coil and moving magnet phono stages built-in, along with RCA inputs for a tuner, CD player, and additional analog sources. On the digital side, the 91E includes Bluetooth, USB, and Ethernet, with an ESS DAC handling up to 16-bit/96 kHz for incoming Bluetooth and USB signals.

Outputs include line out and pre out for system integration, plus dual sets of binding posts. It’s a tube integrated that leans into flexibility without pretending to be something it isn’t. No apps, no ecosystem pitch, and definitely not a Class D network amplifier. It just makes music and throws off enough heat to remind you that winter is coming.

100E Monoblocks and A2 Loudspeakers: Open Window Listening

The rest of what Western Electric brought to AXPONA 2026 leans newer, and in some cases, still short on published detail. The 100E monoblocks were impossible to miss. Physically and visually, they owned the room.

Each chassis is built around that 14 inch tall 308B, and yes, it glows in a way that will stop you in your tracks. Subtle is not part of the brief here. Rated at 160 watts per amplifier, the 100E is doing something few tube designs attempt, delivering serious output from a single ended architecture that looks more like industrial art than consumer audio.

Size is part of the story. At roughly 32 inches deep and close to 22 inches wide, these amplifiers are going to dictate the layout of most rooms. Weight is estimated around 160 pounds each, so once they are in place, they are staying there. This is not gear you casually move around on a Saturday afternoon.

The topology is just as unconventional. A 12AT7 handles the input stage, a 300B is used in the mid stage, and the 308B takes over as the output tube. Seeing a 300B in that middle role tells you everything about the scale of this design. Nothing about it is typical.

Heat is not an afterthought either. With plate voltage around 1500 volts and plate dissipation exceeding 220 watts, these amplifiers are going to generate serious thermal output. Ventilation is not optional, especially in smaller rooms.

The 100E is impedance matched to the 91E, so building a complete Western Electric system is straightforward if you are willing to commit. At $75,000 each, the monoblocks sit in a very different bracket than the 91E, and the new A2 loudspeakers at $70,000 per pair make it clear this is a full system play, not just a statement amplifier. Your accountant would like a word.

Big System Energy, Small Room Reality

The A2 loudspeakers from Western Electric are a hybrid design built around air motion transformer tweeters and midrange drivers, paired with dual dynamic bass drivers. The goal is broad, even coverage with a 180 degree dispersion pattern. This is meant to fill a room, not lock one listener into a single chair.

That ambition ran into a familiar problem at AXPONA 2026. The hotel room was simply too small. The A2 sounded like it wanted more space, more air, more distance to breathe. Instead, it was confined to a setup that forced it to hold back. This is the kind of speaker that needs a larger ballroom or dedicated listening space to make sense.

Feeding the system was the new WE 203C CD player, priced at $12,000. It served as the primary source for a system that, all in, lands around $310,000 before you even start thinking about cables or adding a turntable.

The Bottom Line

What stuck with me most from Western Electric at AXPONA 2026 wasn’t the big glass. It was the small signal tubes quietly doing their job up front. The 12AX7 in the first stage of the 100E may not draw a crowd, but it matters more than it looks.

Western Electric is ramping up production of the 12AX7 and aiming to expand into other small signal tubes as well. If that includes something like a 6SN7, a lot of people are going to pay attention. This is not a niche development. It’s a structural shift. For the first time in decades, American amplifier manufacturers could have a domestic source for one of the most widely used tubes in both hi fi and instrument amplifiers.

For years, “made in the USA” has come with an asterisk. Chassis, transformers, assembly. Sure. Tubes? Usually sourced from Russia, Slovakia, or China. Bringing small signal tube production back to the U.S. changes that conversation in a real way.

With the factory in Rossville, GA, USA only a few hours from me, there’s a good chance I’ll see this firsthand. And if that happens, it’s worth documenting. People should see how this is being done, not just read about it.

Honestly, if Western Electric had shown nothing but that 12AX7 effort, it still would have been one of the most important rooms at the show.

For more information: westernelectric.com

SpaceX Strikes Deal With Coding Startup Cursor For $60 Billion


An anonymous reader quotes a report from the New York Times: SpaceX, Elon Musk’s rocket and satellite company, said on Tuesday that it had struck a deal with the artificial intelligence start-up Cursor that could result in its acquiring the young company for $60 billion. SpaceX is making the deal just as it prepares to go public in what is likely to be one of the largest initial public offerings ever. In a social media post, SpaceX said the combination with Cursor, which makes code-writing software, would “allow us to build the world’s most useful” A.I. models.

SpaceX added that the agreement gave it the option “to acquire Cursor later this year for $60 billion or pay $10 billion for our work together.” It is unclear if the companies plan to consummate the deal before or after SpaceX’s I.P.O., which could happen as early as June. […] Cursor, which has raised more than $3 billion in funding, was founded in 2022 and made waves as a fast-growing A.I. start-up. It was under pressure in recent months after OpenAI and Anthropic announced competing code-writing products that were embraced by tech companies. Cursor had been in talks to raise funding in recent weeks.

Framework Has a Better, More Take-Apartable Laptop

Framework, the company that makes laptops designed for optimal repairability, announced a new version of its main product, a 13-inch-screen laptop. It’s called the Framework Laptop 13 Pro, and it has far better battery life, a touchscreen, a haptic touchpad, and Intel processors.

At an event in San Francisco today, Framework CEO Nirav Patel showed off the company’s new tech, opening with a joke about making Framework AI—something the company is very much not doing. Framework’s whole thing, after all, is aiming to give users control over the physical tech they use.

“That industry is fighting for you to own nothing, and they own everything,” Patel said about the AI industry. “We’re fighting for a future where you can own everything and be free.”

Framework used the event to detail other updates coming to its 16-inch laptop. It also showed off previews of an official developer kit and a wireless keyboard for controlling your rig from the couch.

Framework 13 Pro

The Framework Laptop 13 Pro.

Courtesy of Framework

As the name implies, the 13 Pro is a step up from the company’s last version, the Framework 13. It’s also pricier, starting at $1,199 for a DIY Edition that requires assembling the computer yourself. Prebuilt units start at $1,499 but can be upgraded with more features. Framework says it will start shipping the 13 Pro in June.

Framework’s signature move for its products is the ability to take the thing apart. The 13 Pro is made with that ethos in mind, so its parts can be easily swapped out, upgraded, or replaced. Four Thunderbolt 4 interfaces let you pick which ports (USB-C, HDMI, etc.) you want and then choose where to place them. Framework says it planned the laptop with cross-generation compatibility in mind, so current Framework 13 owners will be able to take new 13 Pro parts like the mainboard, display, and battery and put them into their existing machines.

The big changes in the guts of the 13 Pro come from Framework’s shift from an AMD processor to Intel’s Core Ultra Series 3 processors, which Framework described in its press release as “just insanely efficient.” That efficiency, along with a bigger battery, translates to a claimed battery life of more than 20 hours while streaming 4K Netflix video, or almost 12 hours longer than the Framework 13.


Report: New Apple CEO's biggest challenge will be retiring leadership & regular churn

Industry-high employee retention levels and executives holding their posts for decades are apparently going to be significant hurdles for incoming Apple CEO John Ternus.

John Ternus can’t invent a time machine fast enough, so he’s going to have to pick new Apple leadership, eventually. Image source: Apple

There’s been a trend in tech reporting that attempts to make every employment change from the top down a calamitous occasion. Whether it’s a dozen engineers out of thousands leaving or executives being poached with insane pay packages, every departure is treated as a serious problem.
I’m still not entirely sure why.
Hershey’s Electric Railway in Cuba

Why does a chocolatier build a railroad? For Milton S. Hershey, it was a logical response to a sugar shortage brought on by World War I. The Hershey Chocolate Co. was by then a chocolate-making powerhouse, having refined the automation and mass production of its products, including the eponymous Hershey’s Milk Chocolate Bar and the bite-size Hershey’s Kiss. To satisfy its many customers, the company needed a steady supply of sugar. Plus, it wanted a way to circumvent the American Sugar Refining Co., also known as the Sugar Trust, which had a virtual monopoly on sugar processing in the United States.

Why Did Hershey Build an Electric Railroad in Cuba?

Beginning in 1916, Hershey looked to Cuba to secure his sugar supply. According to historian Thomas R. Winpenny, the chocolate magnate had a “personal infatuation” with the lush, beautiful island. What’s more, U.S. business interests there were protected by a treaty known as the Platt Amendment, which made Cuba a satellite state of the United States.

Like many industrialists of the day, Hershey believed in vertical integration, and the company’s Cuban operation eventually expanded to include five sugar plantations, five modern sugar mills, a refinery, several company towns, and an oil-fired power plant with three substations to run it all.

A 1943 rail pass entitled the holder to travel on all ordinary passenger trains of the Hershey Electric Railway. Hershey Community Archives

The company also built a railroad. To maximize the sugar yield, the cane needed to be ground promptly after being cut; the rail system offered an efficient means of transporting it to the mills and ensured that they operated around the clock during the harvest. By 1920, one of Hershey’s three main sites was processing 135,000 tonnes of cane, yielding 14.4 million kilograms of sugar.
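As a rough back-of-the-envelope check on those 1920 figures, the implied sugar recovery rate can be computed directly (this sketch uses only the numbers quoted above; the tonne-to-kilogram conversion is the only assumption added):

```python
# Article's figures: 135,000 tonnes of cane ground, 14.4 million kg of sugar produced.
CANE_TONNES = 135_000
SUGAR_KG = 14_400_000

cane_kg = CANE_TONNES * 1_000          # 1 metric tonne = 1,000 kg
yield_pct = 100 * SUGAR_KG / cane_kg   # sugar recovered as a share of cane mass

print(round(yield_pct, 1))  # → 10.7
```

In other words, the mill was recovering sugar equal to roughly a tenth of the cane's mass.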

Initially, the Hershey Cuban Railway consisted of a single 56-kilometer standard-gauge track served by seven steam locomotives that burned coal or oil. But because of the high cost of the imported fuel and the inefficiency of the locomotives, Hershey began electrifying the line in 1920. It was the first electrified rail line in Cuba, though lines in Europe and the United States were already being electrified.

In addition to powering the various Hershey entities, the generating station supplied Matanzas and the smaller towns with electricity. F.W. Peters of General Electric’s Railway and Traction Engineering Department published a detailed account of the system in the April 1920 General Electric Review.

Hershey’s Company Towns

The company town of Central Hershey became the headquarters for Hershey’s Cuba operations. (“Central” is the Cuban term for a mill and the surrounding settlement.) It sat on a plateau overlooking the port of Santa Cruz del Norte, about halfway between Havana and Matanzas in the heart of Cuba’s sugarcane region.

Hershey imported the industrial utopian model he had established in Hershey, Penn., which was itself inspired by Richard and George Cadbury’s Bournville Village outside Birmingham, England.

The chocolate magnate Milton S. Hershey had a “personal infatuation” with Cuba. Underwood Archives/Getty Images

In Cuba as in Pennsylvania, Hershey’s factory complex was complemented by comfortable homes for his workers and their families, as well as swimming pools, baseball fields, and affordable medical clinics staffed with doctors, nurses, and dentists. Managers had access to a golf course and country club in Central Hershey. Schools provided free education for workers’ children.

Milton Hershey himself had very little formal education, and so in 1909 he and his wife, Catherine, established the Hershey Industrial School in Hershey, Penn. There, white, male orphans received an education until they were 18 years old. Now known as the Milton Hershey School, the school has broadened its admission criteria considerably over the years.

Hershey duplicated this concept in the Cuban company town of Central Rosario, founding the Hershey Agricultural School. The first students were children whose parents had died in a horrific 1923 train accident on the Hershey Electric Railway. The high-speed, head-on collision between two trains killed 25 people and injured 50 more.

Milton Hershey was a generous philanthropist, and by most accounts he truly cared for his employees and their welfare, and yet his early 20th-century paternalism was not without fault. He was a fierce opponent of union activity, and any hard-won pay increases for workers often came at the expense of profit-sharing benefits. Like other U.S. businessmen in Cuba, Hershey employed migrant seasonal labor from neighboring Caribbean islands, undercutting the wages of local workers. Historians are still wrangling with how to capture the long-lasting effects of U.S. economic imperialism on Cuba.

Can the Hershey Electric Railway Be Revived?

Hershey continued to acquire new sugar plantations in Cuba throughout the 1920s, eventually owning about 24,300 hectares and leasing another 12,000 hectares. In 1946, a year after Milton Hershey’s death and amid growing political uncertainty on the island, the company sold its Cuban interests to the Cuban Atlantic Sugar Co. In addition to Hershey’s sugar operations, the sale included a peanut oil plant, four electric plants, and 404 km of railroad track plus locomotives and train cars.

Service on the Hershey Electric Railway in Cuba continued into at least the 2010s but became increasingly sporadic, with aging equipment like this car at the Central Hershey station. Hershey Community Archives

The Central Hershey sugar refinery continued to operate even after the Cuban Revolution but eventually closed in 2002. Passenger service, meanwhile, continued on the Hershey Electric Railway, albeit sporadically: by 2012, there were only two trips a day between Havana and Matanzas. A video from 2013 gives a good sense of the route.

A colleague of mine who studies Cuban history told me that in his travels to the country over almost 30 years, he has never been able to ride the Hershey electric train. It was always out of service or had restricted service due to the island’s chronic electricity shortages, which have only gotten worse in recent years. I’ve been trying to find out if any part of the line is still operating. If you happen to know, please add a comment below.

Cuba’s frequent power outages make it difficult to operate the Hershey Electric Railway. In this 2009 photo, passengers await the restoration of electricity so they can continue their journey. Adalberto Roque/AFP/Getty Images

A 2024 analysis of the economic potential and challenges of reactivating Cuba’s Hershey Electric Railway noted that an electric railway could be a hedge against climate change and geopolitical factors. But it also acknowledged that frequent power outages and damaged infrastructure argue against reactivating the electrified railway, and it favored the diesel engines used on most of Cuba’s rail network.

Cuba has been mostly off-limits to U.S. tourists for my entire life, but it was one of my grandmother’s favorite vacation spots. I would love to imagine a future where political ties are restored, the power grid is stabilized, and the Hershey Electric Railway is reopened to the Cuban public and to curious visitors like me.

Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology.

An abridged version of this article appears in the May 2026 print issue as “This Chocolate Empire Ran on Electric Rails.”


Xbox Game Pass just got cheaper, and I’m not complaining about the pivot it comes with

If there’s one thing the gaming industry loves more than hype cycles, it’s a good ol’ value shake-up. And right now, Xbox Game Pass is right in the middle of one. Microsoft has officially cut prices across Game Pass tiers, making the service easier on the wallet at a time when subscription fatigue is very, very real. But, as always, there’s a twist. And it’s a big one.

The price drop that comes with a twist

Let’s get the numbers out of the way first, because they’re genuinely compelling. Xbox Game Pass Ultimate has dropped from $29.99 to $22.99 per month, while PC Game Pass now costs $13.99 instead of $16.49. That’s not pocket change. Over a year, that’s a noticeable saving, especially for players juggling multiple subscriptions.
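To put "noticeable saving" in concrete terms, here is a small sketch that annualizes the price cuts quoted above (the dollar figures are the article's; the `annual_savings` helper is just for illustration):

```python
# Monthly prices from the article, in USD.
OLD_ULTIMATE, NEW_ULTIMATE = 29.99, 22.99   # Xbox Game Pass Ultimate
OLD_PC, NEW_PC = 16.49, 13.99               # PC Game Pass

def annual_savings(old_monthly: float, new_monthly: float) -> float:
    """Savings over 12 months at the new price, rounded to the cent."""
    return round((old_monthly - new_monthly) * 12, 2)

ultimate_savings = annual_savings(OLD_ULTIMATE, NEW_ULTIMATE)
pc_savings = annual_savings(OLD_PC, NEW_PC)
print(ultimate_savings, pc_savings)  # → 84.0 30.0
```

That works out to about $84 a year on Ultimate and about $30 a year on PC Game Pass.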

Game Pass Ultimate has become too expensive for too many players. Starting today, we’re dropping the price from $29.99 to $22.99/month.
Future Call of Duty titles will no longer join Game Pass Ultimate on day one. They will join this tier the following holiday after launch (about…

— Asha (@asha_shar) April 21, 2026

But here’s the catch. New entries from Call of Duty are no longer launching day one on the service. Instead, they’ll arrive much later, roughly a year after release. Just to be clear, older Call of Duty titles aren’t going anywhere, so the back catalog remains intact. What’s gone is the instant access to one of gaming’s biggest annual releases, which, let’s be honest, was a huge part of Game Pass’s flex.

The community is… conflicted

The reaction? Exactly as chaotic as expected. There’s a sizable chunk of genuinely relieved players. You see, not everyone subscribes to Game Pass for Call of Duty, and for those users, this feels like getting a discount without losing anything meaningful. If COD wasn’t part of the weekly rotation anyway, the lower price is a straight-up win.

Then there’s the other side. For a lot of players, Game Pass built its reputation on the idea of “pay once, play everything day one.” Losing a flagship franchise from that promise feels like a crack in the foundation. It’s not just about Call of Duty; it’s about what this could mean going forward.

Microsoft just lowered Game Pass prices while quietly removing Call of Duty Day One launches.

They’re charging you less for a worse product and calling it ‘a response to feedback’.

Don’t fall for the trap.

It’s a downgrade disguised as marketing. pic.twitter.com/xn7dFQmcvw

— Yorch Torch Games (@YorchTorchGames) April 21, 2026

And then comes the third wave of takes, arguably the most interesting. Some fans are now asking if Microsoft should go even further and start trimming other bundled perks like EA Play or Fortnite Crew to reduce prices even more.

The thinking is simple. If removing one expensive piece lowers the cost, why not customize the whole thing?

Why Microsoft drew the line here

Here’s where the conversation shifts from emotional to practical. Call of Duty isn’t just another title in a catalog. It’s a yearly blockbuster with a massive, loyal player base that often buys the game regardless of subscriptions. That creates a strange value mismatch. Either players were going to pay for it anyway, or they didn’t care about it much in the first place.

Xbox gave up more than $300 million in sales of Call of Duty on consoles and PCs last year – Bloomberg

From Microsoft’s perspective, that makes it an incredibly expensive inclusion with limited upside. Worse, it likely eats into direct sales, turning what should be a revenue driver into a cost center. And while some fans are calling for more cuts, like removing EA Play, it’s not so simple. Game Pass thrives on being an all-in-one ecosystem. Start unbundling too much, and it risks turning into a fragmented, pick-and-pay service that loses its identity.

With Microsoft even exploring bundling services like Netflix into Game Pass, stripping away more perks would start to chip away at its whole “all-in-one” appeal. At that point, it’s not a powerhouse bundle anymore; it’s just a menu with items missing.

The End of “Too Good to Be True”?

For years, Xbox Game Pass felt like a cheat code. Day-one AAA games, a massive library, and a price that almost didn’t make sense. But eventually, reality caught up. Keeping a giant like Call of Duty in that mix from day one was always going to be expensive, and more importantly, unsustainable.

And honestly, this change feels like Microsoft finally admitting that. Instead of hiking prices even further, they’ve trimmed one of the costliest perks and made the service more accessible again. It’s not perfect, and sure, some fans will miss the old days, but this feels less like a downgrade and more like a smart reset. Not as flashy, but a lot more built to last.
