The deep learning revolution has a curious blind spot: the spreadsheet. While Large Language Models (LLMs) have mastered the nuances of human prose and image generators have conquered the digital canvas, the structured, relational data that underpins the global economy — the rows and columns of ERP systems, CRMs, and financial ledgers — has so far been treated as just another file format similar to text or PDFs.
That’s left enterprises to forecast business outcomes using the typical bespoke, labor-intensive data science process of manual feature engineering and classic machine learning algorithms that predate modern deep learning.
Emerging from stealth, Fundamental is debuting NEXUS, a Large Tabular Model (LTM) designed to treat business data not as a simple sequence of words, but as a complex web of non-linear relationships.
Fundamental co-founders Jeremy Fraenkel, Annie Lamont, and Gabriel Suissa. Credit: Fundamental
The tech: moving beyond sequential logic
Most current AI models are built on sequential logic — predicting the next word in a sentence or the next pixel in a frame.
However, enterprise data is inherently non-sequential. A customer’s churn risk isn’t just a timeline; it’s a multi-dimensional intersection of transaction frequency, support ticket sentiment, and regional economic shifts. Existing LLMs struggle with this because they are poorly suited to the size and dimensionality constraints of enterprise-scale tables.
“The most valuable data in the world lives in tables and until now there has been no good foundation model built specifically to understand it,” said Jeremy Fraenkel, CEO and Co-founder of Fundamental.
In a recent interview with VentureBeat, Fraenkel emphasized that while the AI world is obsessed with text, audio, and video, tables remain the largest modality for enterprises. “LLMs really cannot handle this type of data very well,” he explained, “and enterprises currently rely on very old-school machine learning algorithms in order to make predictions.”
NEXUS was trained on billions of real-world tabular datasets using Amazon SageMaker HyperPod. Unlike traditional XGBoost or Random Forest models, which require data scientists to manually define features — the specific variables the model should look at — NEXUS is designed to ingest raw tables directly.
It identifies latent patterns across columns and rows that human analysts might miss, effectively reading the hidden language of the grid to understand non-linear interactions.
The tokenization trap
A primary reason traditional LLMs fail at tabular data is how they process numbers. Fraenkel explains that LLMs tokenize numbers the same way they tokenize words, breaking them into smaller chunks. “The problem is they apply the same thing to numbers. Tables are, by and large, all numerical,” Fraenkel noted. “If you have a number like 2.3, the ‘2’, the ‘.’, and the ‘3’ are seen as three different tokens. That essentially means you lose the understanding of the distribution of numbers. It’s not like a calculator; you don’t always get the right answer because the model doesn’t understand the concept of numbers natively.”
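Fraenkel's point can be made concrete with a toy sketch. Real LLM tokenizers use learned BPE vocabularies, so the exact splits vary by model; the character-level split below is a simplified stand-in for how an unfamiliar number gets shredded into pieces.

```python
# Toy illustration of the tokenization problem Fraenkel describes.
# Real tokenizers use learned BPE vocabularies, so actual splits vary;
# this character-level split is a simplified stand-in.

def toy_tokenize(text: str) -> list[str]:
    """Split a string into single-character tokens, mimicking how
    tokenizers can fragment numbers they don't have whole entries for."""
    return list(text)

number = toy_tokenize("2.3")
print(number)  # ['2', '.', '3'] -- three tokens, no notion of magnitude

# A tabular model instead ingests the value as a single float,
# preserving its position on the number line:
value = float("2.3")
print(value < 2.5)  # True -- an ordering a token sequence doesn't encode
```

The contrast is the crux of the argument: three disconnected tokens carry no notion of "2.3 is slightly less than 2.5," while a native numeric representation does.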
Furthermore, tabular data is order-invariant in a way that language is not. Fraenkel uses a healthcare example to illustrate: “If I give you a table with hundreds of thousands of patients and ask you to predict which of them has diabetes, it shouldn’t matter if the first column is height and the second is weight, or vice versa.”
While LLMs are highly sensitive to the order of words in a prompt, NEXUS is architected to understand that shifting column positions should not impact the underlying prediction.
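The order-invariance property is easy to demonstrate in miniature. The sketch below (with made-up weights, not anything from NEXUS) keys a toy model on column names rather than column positions, so permuting the columns cannot change the prediction.

```python
# Sketch of column-order invariance: a model keyed by column *name*
# rather than position gives identical predictions when columns are
# permuted. The weights here are invented for illustration only.

def predict_risk(row: dict[str, float]) -> float:
    """Toy linear scorer that looks features up by name, not position."""
    weights = {"height_cm": 0.01, "weight_kg": 0.03, "age": 0.02}
    return sum(weights[col] * val for col, val in row.items())

patient_a = {"height_cm": 170.0, "weight_kg": 80.0, "age": 45.0}
# The same patient with columns listed in a different order:
patient_b = {"age": 45.0, "weight_kg": 80.0, "height_cm": 170.0}

print(predict_risk(patient_a) == predict_risk(patient_b))  # True
```

A sequence model that consumed the same row as ordered text would have no such guarantee.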
Fraenkel distinguishes Fundamental’s work from LLM-based spreadsheet assistants as operating at a fundamentally different layer: the predictive layer. “What they are doing is essentially at the formula layer—formulas are text, they are like code,” he said. “We aren’t trying to allow you to build a financial model in Excel. We are helping you make a forecast.”
NEXUS is designed for split-second decisions where a human isn’t in the loop, such as a credit card provider determining if a transaction is fraudulent the moment you swipe.
While tools like Claude can summarize a spreadsheet, NEXUS is built to predict the next row—whether that is an equipment failure in a factory or the probability of a patient being readmitted to a hospital.
Architecture and availability
The core value proposition of Fundamental is the radical reduction of time-to-insight. Traditionally, building a predictive model could take months of manual labor.
“You have to hire an army of data scientists to build all of those data pipelines to process and clean the data,” Fraenkel explained. “If there are missing values or inconsistent data, your model won’t work. You have to build those pipelines for every single use case.”
Fundamental claims NEXUS replaces this entire manual process with just one line of code. Because the model has been pre-trained on a billion tables, it doesn’t require the same level of task-specific training or feature engineering that traditional algorithms do.
As Fundamental moves from its stealth phase into the broader market, it does so with a commercial structure designed to bypass the traditional friction of enterprise software adoption.
The company has already secured several seven-figure contracts with Fortune 100 organizations, a feat facilitated by a strategic go-to-market architecture where Amazon Web Services (AWS) serves as the seller of record on the AWS Marketplace.
This allows enterprise leaders to procure and deploy NEXUS using existing AWS credits, effectively treating predictive intelligence as a standard utility alongside compute and storage. For the engineers tasked with implementation, the experience is high-impact but low-friction; NEXUS operates via a Python-based interface at a purely predictive layer rather than a conversational one.
Developers connect raw tables directly to the model and label specific target columns—such as a credit default probability or a maintenance risk score—to trigger the forecast. The model then returns regressions or classifications directly into the enterprise data stack, functioning as a silent, high-speed engine for automated decision-making rather than a chat-based assistant.
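NEXUS's actual interface has not been published, so the sketch below is entirely hypothetical: the class name, method signature, and dummy scores are invented stand-ins meant only to illustrate the workflow described above — raw rows in, a named target column, predictions out.

```python
# Hypothetical sketch of the workflow described above. NEXUS's real API
# is not public; the class and method names here are invented stand-ins.

class TabularFoundationModel:
    """Stand-in for a pre-trained tabular model: no per-use-case feature
    engineering, just raw rows in and predictions out."""

    def predict(self, rows: list[dict], target: str) -> list[float]:
        # A real LTM would run inference here; we return a dummy score
        # per row so the sketch is runnable.
        return [0.5 for _ in rows]

model = TabularFoundationModel()

transactions = [
    {"amount": 42.10, "merchant_id": 993, "country": "US"},
    {"amount": 9800.0, "merchant_id": 17, "country": "GB"},
]

# The "one line of code" the company describes: point the model at a raw
# table and name the target column to forecast.
scores = model.predict(transactions, target="fraud_probability")
print(len(scores))  # one score per row
```

The design choice worth noting is that the target is just a column label, which is what lets the same pre-trained model serve fraud scoring, maintenance risk, or readmission probability without separate pipelines.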
The societal stakes: beyond the bottom line
While the commercial implications of demand forecasting and price prediction are clear, Fundamental is emphasizing the societal benefit of predictive intelligence.
The company highlights key areas where NEXUS can prevent catastrophic outcomes by identifying signals hidden in structured data.
By analyzing sensor data and maintenance records, NEXUS can predict failures like pipe corrosion. The company points to the Flint water crisis — which cost over $1 billion in repairs — as an example where predictive monitoring could have prevented life-threatening contamination.
Similarly, during the COVID-19 crisis, PPE shortages cost hospitals $323 billion in a single year. Fundamental argues that by using manufacturing and epidemiological data, NEXUS can predict shortages 4-6 weeks before peak demand, triggering emergency manufacturing in time to save lives.
On the climate front, NEXUS aims to provide 30-60 day flood and drought predictions, such as for the 2022 Pakistan floods which caused $30 billion in damages.
Finally, the model is being used to predict hospital readmission risks by analyzing patient demographics and social determinants. As the company puts it: “A single mother working two jobs shouldn’t end up back in the ER because we failed to predict she’d need follow-up care.”
Performance vs. latency
In the enterprise world, the definition of better varies by industry. For some, it is speed; for others, it is raw accuracy.
“In terms of latency, it depends on the use case,” Fraenkel explains. “If you are a researcher trying to understand what drugs to administer to a patient in Africa, latency doesn’t matter as much. You are trying to make a more accurate decision that can end up saving the most lives possible.”
In contrast, for a bank or hedge fund, even a marginal increase in accuracy translates to massive value.
“Increasing the prediction accuracy by half a percent is worth billions of dollars for a bank,” Fraenkel says. “For different use cases, the magnitude of the percentage increase changes, but we can get you to a better performance than what you have currently.”
Ambitious vision receives big backing
The $225 million Series A, led by Oak HC/FT with participation from Salesforce Ventures, Valor Equity Partners, and Battery Ventures, signals high-conviction belief that tabular data is the next great frontier.
Notable angel investors including leaders from Perplexity, Wiz, Brex, and Datadog further validate the company’s pedigree.
Annie Lamont, Co-Founder and Managing Partner at Oak HC/FT, articulated the sentiment: “The significance of Fundamental’s model is hard to overstate—structured, relational data has yet to see the benefits of the deep learning revolution.”
Fundamental is positioning itself not just as another AI tool, but as a new category of enterprise AI. With a team of approximately 35 based in San Francisco, the company is moving away from the bespoke model era and toward a foundation model era for tables.
“Those traditional algorithms have been the same for the last 10 years; they are not improving,” Fraenkel said. “Our models keep improving. We are doing the same thing for tables that ChatGPT did for text.”
Partnering with AWS
Through a strategic partnership with Amazon Web Services (AWS), NEXUS is integrated directly into the AWS dashboard. AWS customers can deploy the model using their existing credits and infrastructure. Fraenkel describes this as a “very unique agreement,” noting Fundamental is one of only two AI companies to have established such a deep, multi-layered partnership with Amazon.
One of the most significant hurdles for enterprise AI is data privacy. Companies are often unwilling to move sensitive data to a third-party infrastructure.
To solve this, Fundamental and Amazon achieved a massive engineering feat: the ability to deploy fully encrypted models—both the architecture and the weights—directly within the customer’s own environment. “Customers can be confident the data sits with them,” Fraenkel said. “We are the first, and currently only, company to have built such a solution.”
Fundamental’s emergence is an attempt to redefine the OS for business decisions. If NEXUS performs as advertised—handling financial fraud, energy prices, and supply chain disruptions with a single, generalized model—it will mark the moment where AI finally learned to read the spreadsheets that actually run the world. The Power to Predict is no longer about looking at what happened yesterday; it is about uncovering the hidden language of tables to determine what happens tomorrow.
The race to regulate artificial intelligence infrastructure has arrived at a crossroads in Washington state.
After weeks on the sidelines, Microsoft publicly declared its opposition to a controversial state bill that aims to rein in the environmental and economic impacts of the massive data centers powering the AI boom.
Labeling the proposed regulations “uniquely anti-competitive,” Microsoft’s senior director of Washington state government affairs, Lauren McDonald, urged Senate leaders on Friday evening to reconsider key features of House Bill 2515.
“We respectfully urge the committee not to advance the bill without significant changes,” McDonald said in testimony before the Senate Committee on Ways & Means.
The bill would require utilities and data center companies to create agreements that protect ratepayers from increased power costs, and would bring transparency to the environmental impacts of the facilities.
Microsoft, which operates roughly 30 data centers in Washington alone, plans to spend up to $140 billion on global infrastructure this year, while Amazon has committed to spending $200 billion this year on capital expenditures worldwide, predominantly for its Amazon Web Services cloud business.
Elected officials, communities and tribal leaders nationwide are increasingly anxious about data center deployments driving up electricity rates with their power-hungry electronics and consuming vast quantities of water to cool the devices. President Trump and other officials are pursuing commitments to ensure tech companies protect ratepayers from price increases.
Tech companies, labor organizations and municipalities that have seen job creation and the benefits of taxes generated by the facilities have pushed back against the regulations. Microsoft President Brad Smith last month launched a community-focused initiative pledging to bear its own electrical costs and emphasizing its support of local taxes.
At the same time, the Seattle Times reported today that Microsoft and Amazon have been working aggressively behind the scenes to weaken HB 2515, and that Amazon is currently “neutral” on the bill. The company, which has historically concentrated its Pacific Northwest data center footprint in Oregon, has not testified publicly on the legislation.
The legislation
HB 2515 has passed the House and is edging closer to a vote from the full Senate — though tech sector opposition could sink the measure. The bill is shifting and evolving with different amendments and new language under consideration. The legislation’s main components include:
Ratepayer Protection: Utilities must create tariffs or policies that insulate ratepayers from short- and long-term financial risks associated with data center energy use.
Transparency: Data centers must publish annual reports on water, energy, refrigerant use, and air pollution, with a comprehensive sustainability report every three years.
Resource Forecasting: Data centers must coordinate with regulators and utilities on energy load forecasting.
Carbon Credits: The availability of free carbon credits to meet state regulations would be limited.
Clean Energy Certification: Facilities that open or expand after July 1, 2026, must certify their use of new clean energy, using 80% clean power by 2030 and all clean energy by 2045.
McDonald raised concerns at the hearing about the legislation preventing a data center in Malaga, Wash., that was built in 2023 from being able to open later this year, presumably due to the clean energy requirements.
One particularly controversial piece — which was not included in the version of the bill that passed the House but is still being discussed — requires data centers to curtail or stop drawing power from the grid in energy emergency situations. Opponents said the rule could disable facilities that support essential operations such as access to electronic medical records or tech to dispatch first responders.
Seeking statewide standards
Proponents of HB 2515 frame the measure as a necessary step to put rules in place for a sector that is rapidly expanding, stoked by the soaring use of artificial intelligence.
“The game is changing on data centers before our very eyes,” Zach Baker, policy director for the nonprofit NW Energy Coalition, told lawmakers. “The common sense guardrails in this bill are needed to protect affordability, grid reliability and the environment.”
Washington is currently home to approximately 126 data centers and related facilities. Microsoft operates more data centers in the state than any other company, while Sabey Data Centers has eight of the facilities, according to the research firm Baxtel.
Rep. Beth Doglio, D-Olympia, lead sponsor of the legislation, earlier this month testified that 16 new data center projects are planned for Walla Walla and an expansion underway in Vantage is tapping new gas-powered energy.
The bill would create a statewide standard for utilities siting new facilities in their communities, she said. “I just hope that we are able to make sure that we do data centers right in this state.”
I try to keep my use of cliches to a minimum but damn… they just don’t make ‘em like this anymore. Ben-Hur is a sweeping, nigh-four-hour saga of vengeance vs. virtue, set against the rise of Christianity. The story follows larger-than-life hero Ben-Hur (who else but Charlton Heston?), a Judean prince betrayed by his best friend and doomed to a life of brutal servitude. Through his unbreakable spirit and unimaginable grit, he survives to seek retribution, only to find redemption as his hatred is eclipsed by the parallel life and sacrifice of Jesus of Nazareth.
At the time of its release in 1959, this was reportedly the most expensive film ever made, surpassing the previous champ, Heston’s The Ten Commandments from three years earlier. The arena for the chariot race was the largest film set ever built, covering 18 acres and requiring thousands of extras to fill. The production techniques were on the cutting edge as well, a deliberate middle finger at the burgeoning television medium, combining special anamorphic lenses with 65mm film to bring audiences an ultra-wide 2.76:1 aspect ratio with exceptional image clarity and precision. When projected in 70mm, that extra 5mm was reserved for the movie’s six-track stereophonic sound, quite different from what we know today yet discrete and high-fidelity. Its monumental financial success likely saved the studio, MGM, from ruin, and its record 11 Oscars represented a win in almost every nominated category, save for its adapted screenplay.
Apparently no longer satisfied with the previous 8K scan of the original 65mm camera negative, Warner undertook a brand-new one for this 4K Ultra HD debut, yielding one of the all-time great masters of the format. Cinematographer Robert Surtees’ framing captures all the spectacle without cropping or the need for excessive panning, while the exceptional depth of focus keeps the actors fully present in three-dimensional space. The costumes are a celebration of Technicolor, most frequently the Roman reds but sumptuous blue and purple cloaks as well. The chariot race is destined to be played over and over for system demos, and the awe-inspiring scale on display is impossible to overstate, although the thicker horizontal black bars top and bottom mean that we’re using less of our screen’s real estate than usual. (TV vs. cinema: “It goes on. The race… is not over!”) The movie is spread across two discs (a BD100 for the longer first half plus a BD66), assuring a high bitrate.
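A quick back-of-envelope check of that screen-real-estate point, assuming a standard 16:9 television (an assumption on my part, not anything from the disc's packaging):

```python
# Back-of-envelope letterboxing math for a 2.76:1 film on a 16:9 screen.
# The 16:9 display is an assumed baseline, not from the disc packaging.

tv_ratio = 16 / 9      # ~1.78:1, the standard modern television
film_ratio = 2.76      # Ben-Hur's ultra-wide frame

# Fraction of the screen height a 2.76:1 image occupies when letterboxed:
height_used = tv_ratio / film_ratio
print(f"{height_used:.0%} of the screen height")  # roughly 64%

# versus a common 2.39:1 "scope" film, for comparison:
print(f"{(tv_ratio / 2.39):.0%}")  # roughly 74%
```

In other words, the bars eat a noticeably larger share of the panel than a typical scope release, which is exactly the trade-off the review describes.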
The Dolby Atmos remix is sonically spectacular as well, with a generous spread that includes remarkably active height channels. The Romans love their trumpets and their brassy twang has a way of filling a room, while the trebly jingling of a jailer’s keys wafts through the air in several scenes. Below decks of the galley with the rowers, we have a palpable sense of the deck above and the water all around. The four-horse teams pulling the chariots shook the walls of my home theater with their thundering hoofbeats, a thrill further amplified by the 360-degree cheering of the enormous, enthusiastic crowd. And if you’re a fan who thought that Miklós Rózsa’s hieratic musical score hit before, just you wait until you hear it remixed and remastered in this immersive new rendition, complete with overture, intermission and entr’acte. (You’ll find it isolated on an alternate channel in Dolby Digital 2.0 as well.) The original-release six-track stereo has been carried over here too, as a 5.1 option, noted on the packaging as 5.0.
Ben-Hur 4K Ultra HD Back Cover (2026)
Clearly the emphasis with this three-disc release is audio/video quality above all, but completists will notice that a handful of significant extras from the 2011 “50th Anniversary” Blu-ray have gone missing. There’s still plenty here to pick through on the bundled HD platter: screen tests with Leslie Nielsen and others, an hour-long “making of” and a largely anecdotal profile of Heston. A couple of short, lightweight new featurettes have been added, so kudos for the effort. The archival commentary track is edited together between separate sessions with the star and historian T. Gene Hatcher, which keeps it moving and avoids long stretches of silence.
Also available as a SteelBook ($89.99 at Amazon), Ben-Hur is a landmark of filmmaking that genuinely deserves its many accolades, and Warner’s new 4K edition likewise deserves a spot in your library.
The U.S. Navy got its official start on October 13, 1775, when the Continental Congress formally established the first Continental Navy. The first four ships in this newly formed naval force were the Alfred, the Columbus (both 24-gun frigates), the Andrew Doria, and the Cabot (14-gun brigantines). Three schooners — the Hornet, Wasp, and Fly — quickly followed them into this so-called “fleet.” Today, the Navy has approximately 296 battle force ships ready to deploy at a moment’s notice.
However, this number changes based on the shifting global political climate at any given time. Some estimates claim the Navy has as many as 472 total “assets,” of which 11 are mighty aircraft carriers, around which a strike group (CSG) is formed. A typical CSG consists of one carrier, two guided-missile cruisers, two anti-aircraft warships, and one or two anti-submarine destroyers or frigates. These vessels can remain deployed at sea for extended periods, depending on their mission.
Determining the Navy’s longest deployed ship isn’t as straightforward as one might think. Well, it is, but it’s not telling the full story. Technically, the current single-longest deployment belongs to the aircraft carrier USS Midway (CVA-41), which has since become a museum and can be visited in San Diego, California. Between April 10, 1972, and March 3, 1973, it spent 332 days at sea during the Vietnam War. However, when talking about these deployment records, many sources include a caveat along the lines of “since 1964,” with deployments by ships in the modern era being referred to as occurring in the post-Cold War or post-Vietnam era.
There might be a new winner
Now, here’s the rest of the story. Trailing closely behind Midway’s rooster tail is the USS Coral Sea (CVA-43). According to Naval History and Heritage Command (an official U.S. Navy website), the ship spent 331 days at sea. However, the independent news service for the U.S. Naval Institute claims it was only 329. Whatever the number, it still spent 11 months cruising 105,000 miles while deployed in the Western Pacific, fighting the Vietnam War.
As for the modern era, the CSG led by the Nimitz-class nuclear-powered USS Abraham Lincoln (CVN-72) was deployed on April 1, 2019, from Norfolk, Virginia. It didn’t return to port in San Diego, California, until January 20, 2020 — just as the COVID-19 pandemic started to rear its ugly head. Its 10-month, 295-day deployment is considered the longest — in the post-Cold War era.
What about the saga of the USS Nimitz (CVN-68), which for 341 days sailed through the Persian Gulf and the South China Sea during the pandemic? Its deployment fittingly began on April 1, 2020, and didn’t return to port until February 26, 2021. That would indeed be historical, except most sources don’t count the extra days it was forced to sequester at sea due to the pandemic – above and beyond its official 263-day deployment. All those records might soon be in jeopardy, though. The USS Gerald R. Ford (CVN-78), the world’s largest aircraft carrier, has been at sea since June 24, 2023 (240 days and counting). President Donald Trump recently sent it to the Middle East as tensions between Iran and the U.S. escalate, which could ultimately allow the Ford to shatter the record. Only time will tell.
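The day counts quoted above are easy to reproduce with Python's `datetime`; the widely cited figures appear to count both the departure and return days, so the raw date difference comes out one short. A minimal check using the Abraham Lincoln's dates:

```python
from datetime import date

# Reproducing the USS Abraham Lincoln deployment length quoted above.
depart = date(2019, 4, 1)
home = date(2020, 1, 20)

elapsed = (home - depart).days
print(elapsed)       # 294 full days between the two dates
print(elapsed + 1)   # 295 when departure and return days both count
```

The same inclusive-counting caveat is worth remembering when comparing any two of these deployment records, since sources differ on how they tally the endpoints.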
There’s been so much happening in School Spirits season 3 that I’ve hardly had time to think.
Kyle’s (Ari Dalbert) spirit remains safe in the ghost world after the shooting, but the same cannot be said for Van Heidt (Michael Adamthwaite). Not only that, but we also finally know the identity of White Eyes, which could come to a chaotic head next week… and don’t even get me started on poor Simon (Kristian Ventura).
Jennifer Tilly’s villainous superintendent Dr. Deborah Hunter-Price also wants to shut down the school altogether… so does that mean the upcoming episode 8 is the final-ever outing for Paramount+‘s supernatural hit?
Don’t be so sure. While nothing has been confirmed at this stage, Ventura has hinted that School Spirits season 4 could easily be ours in the foreseeable future.
‘I think the show’s going to run… I hope it does’
“I’ll say that the cliffhanger is gonna have everyone just breaking their phones,” Ventura teases about upcoming season 3 episode 8. “But of course there’s room for more ghosts and chaos.
“These writers are amazing. You can feel where the story is going. We’ve just about explored what we wanted to, and there’s only so many untapped regions. I really don’t feel this is a show that will waste the viewers’ experience.
“There is definitely a little bit more to tap into. And cliffhanger is great. I think the show’s gonna run, and I hope it does,” he added.
It’s rare to see so much confident positivity from somebody waiting to see if a show is renewed, particularly in interviews. Usually, actors and crew politely reply that they hope whatever they’re working on will return for more, but this reaction feels much more charged.
If School Spirits season 4 does go ahead, I have a feeling it will be the final one. There will likely be some loose threads left over from the season 3 finale, but not enough storytelling to warrant a broad future. Here’s hoping I’m right.
Compal Electronics, the company that makes many of the laptops you see under famous brand names, has introduced a concept called the AI Book. This design replaces the traditional palm rest and trackpad with a color E Ink touchscreen. You can rest your hands on an active display that shows information, responds to taps, and even accepts stylus input for handwriting or sketching.
The main screen sits above a standard keyboard layout, but below the keys, an E Ink panel spreads over the laptop’s body, right where your wrists would ordinarily rest. This provides you with a secondary display that continues to work even when the laptop is turned off, allowing writers to scribble down fast notes while sitting in a meeting, artists to sketch out preliminary ideas, or anyone to just peek at reminders without having to divert their focus to the main screen. The E Ink technology is amazing at keeping graphics sharp and stable, and it’s also good at consuming little power once it has shown you something; this is a trick that e-readers have been exploiting for years.
The design remains useful even when the lid is closed. A hinge allows the palm rest area to flip out to the side slightly, so a small portion of the E Ink display stays visible, showing things like notifications, calendar items, and brief messages. Flip it out further and more of the display is exposed, transforming the closed laptop into a tiny notepad or reference tool you can peek at whenever you want. Because E Ink retains its image without power, none of this drains the battery or requires the main system to be turned on.
Compal calls this device an AI Book, which implies that it will display a variety of AI-generated content on the lower screen, such as summaries, suggestions, and images. The description emphasizes how nicely handwriting input, rapid references, and generated material all work together, whether the laptop is open on your desk or closed and ready for you to take it up again. They’ve also included some ambient lighting to provide subtle hints to draw your attention if you’re pausing for a moment.
Compal has consistently entered design awards with forward-thinking ideas, and the AI Book just received an iF Design Award in 2026. Some of their previous concepts have been somewhat wacky, such as modular structures or expandable screens, but they have yet to make it into stores. There is still no information on the specs of this item, such as the size of the display, the type of processor inside, or the size of the battery; the attention is solely on the palm rest innovation they have developed. [Source]
Anthropic’s chatbot Claude seems to have benefited from the attention around the company’s fraught negotiations with the Pentagon.
As first reported by CNBC, as of Saturday afternoon, Claude is currently ranked number two among free apps in Apple’s US App Store — the number one app is OpenAI’s ChatGPT, and number three is Google Gemini.
According to data from SensorTower, Claude was just outside the top 100 at the end of January, and has spent most of February somewhere in the top 20. Its ranking has climbed in the last few days, from sixth on Wednesday to fourth on Thursday to second on Saturday (today).
After Anthropic attempted to negotiate for safeguards preventing the Department of Defense from using its AI models for mass domestic surveillance or fully autonomous weapons, President Donald Trump directed federal agencies to stop using all Anthropic products and Secretary of Defense Pete Hegseth said he’s designating the company a supply-chain threat.
In the late 1980s, a little monochrome television appeared in certain public spaces, and for a few quarters, you could see some programming on it. Known as the Vend-O-Vision, this small device transformed idle waiting into something you might pay to see.
Mini-TV USA got the ball rolling in 1989, with the first documented use being on November 29th of that year. Whether you ran a laundromat, a restaurant, an airport, or a hotel, you could install one of these devices and make some additional money while customers waited. The idea was simple: put one in a waiting area and collect the quarters. Customers faced no monthly bills or ownership hassles, just the straightforward act of inserting a coin.
Each Vend-O-Vision contained a regular Panasonic black and white set, such as a TR5040P, housed inside a strong metal case. The screen was modest, which was common for portable TVs at the time. It picked up VHF and UHF channels fine with a simple antenna setup, and a coin acceptor on the front had a reject button for when customers put in bad coins. Then, once a quarter was inserted, a timer activated and powered the set for the duration you specified, which might be 10 minutes, 15 minutes, or 20 minutes, depending on your settings.
A small slider on the device allowed you to adjust how long the set would stay on for each quarter. You had to manually tune the channels and use the TV’s knobs and dials to get what you wanted. When the timer ran out, it cut the power. Quarters accumulated inside the locked case, and some versions included a counter to track total insertions for easy revenue checks. It ran everything on a compact 9-volt power supply and had a power pass-through outlet out back for added convenience.
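The coin-to-timer mechanism described above is simple enough to sketch in a few lines of modern code; all the values here (the 15-minute default, the counter) are illustrative, mirroring the slider-adjustable duration and the optional revenue counter rather than any documented schematic.

```python
# Sketch of the Vend-O-Vision's coin-timer logic as described above.
# All values are illustrative; the real unit did this with analog
# hardware, not software.

class CoinTimer:
    def __init__(self, minutes_per_quarter: int = 15):
        self.minutes_per_quarter = minutes_per_quarter  # the slider setting
        self.remaining_minutes = 0
        self.total_coins = 0  # the revenue counter some units included

    def insert_quarter(self) -> None:
        self.total_coins += 1
        self.remaining_minutes += self.minutes_per_quarter

    def tick(self, minutes: int = 1) -> None:
        """Advance time; power cuts off once the timer runs out."""
        self.remaining_minutes = max(0, self.remaining_minutes - minutes)

    @property
    def power_on(self) -> bool:
        return self.remaining_minutes > 0

unit = CoinTimer(minutes_per_quarter=15)
unit.insert_quarter()
print(unit.power_on)   # True -- the set switches on
unit.tick(15)
print(unit.power_on)   # False -- timer expired, power cut
```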
It’s difficult to find any of these devices today because Mini-TV USA ran into some problems early on. Starting around 1990, corporate salesmen were showing these devices off at trade shows, assuring customers they could earn a fortune, but it’s safe to say that wasn’t exactly accurate. By 1995, the Federal Trade Commission had taken action against the company for deceptive marketing practices. Operations were mostly winding down by then, and a few years later they were gone for good, leaving behind a handful of units, some of which were still sealed in their original packaging. [Source]
Bloodborne fans may not be happy to hear that a remake was reportedly rejected, but that doesn’t mean it’s completely off the table. Bluepoint Games, the now-shuttered Sony studio behind many PlayStation remakes, pitched a remake of the classic Gothic horror RPG in early 2025 but was blocked by the game’s developer, FromSoftware, according to a Bloomberg report.
As Bloomberg reported, Bluepoint pitched a Bloodborne remake after several years of working on a live-service title in the God of War franchise that was ultimately canceled. Looking for its next project, a modern-day version of Bloodborne made a lot of sense: the title came out in 2015, and Bluepoint was responsible for the successful Demon’s Souls remake in 2020. However, Bloomberg‘s sources said that FromSoftware was against it but didn’t offer a concrete reason why. With some digging, Bloomberg‘s Jason Schreier pointed to a Kinda Funny Games interview with PlayStation exec Shuhei Yoshida that aired last year. In the video, Yoshida mentioned that FromSoftware’s president, Hidetaka Miyazaki, wanted to pursue a Bloodborne remake but was too busy to do it himself and “doesn’t want anyone else to touch it.”
After failing to get the Bloodborne remake greenlit, Bluepoint wasn’t able to secure another project for more than a year, according to the Bloomberg report. Now that Bluepoint has been shut down, we’re likely even further away from a remake. That’s not to say a remake will never happen, but if it does, it’ll have to get a stamp of approval and likely a lot of oversight from FromSoftware.
Anthropic’s Claude AI assistant “jumped to the No. 2 slot on Apple’s chart of top U.S. free apps late on Friday,” reports CNBC:
The rise in popularity suggests that Anthropic is benefiting from its presence in news headlines, stemming from its refusal to have its models used for mass domestic surveillance or for fully autonomous weapons… OpenAI’s ChatGPT sat at No. 1 on the App Store rankings on Saturday, while Google’s Gemini was at No. 3… On Jan. 30, [Claude] was ranked No. 131 in the U.S., and it bounced between the top 20 and the top 50 for much of February, according to data from analytics company Sensor Tower… [And Friday night, for 85.3 million followers] pop singer Katy Perry posted a screenshot of Anthropic’s Pro subscription for consumers, with a heart superimposed over it.
On Friday, Anthropic posted: “We are deeply grateful to our users, and to the industry peers, policymakers, veterans, and members of the public who have voiced their support in recent days. Thank you.”
The acquisition comes after Anthropic unveiled Claude Sonnet 4.6, its best model yet for computer usage.
Anthropic has acquired Seattle-based AI computer interface builder Vercept for an undisclosed amount to help further the Claude product’s agentic abilities.
Vercept was founded in 2024 by former Allen Institute for AI (AI2) researchers Matt Deitke, Kiana Ehsani, Ross Girshick, Luca Weihs and Oren Etzioni.
Etzioni served as AI2’s founding CEO and is a co-founder of AI2 Incubator and a venture partner with Madrona, both of which have backed Vercept.
Shortly after emerging from stealth in early 2025, the AI start-up released its flagship product Vy, a cross-platform AI agent that lets users control their computers with natural language to navigate apps and content.
The start-up has raised more than $50m, including a $16m round in January 2025. Its backers include Fifty Years VC founding partner Seth Bannon, who also served on Vercept’s board, as well as Point Nine Capital, AI2 Incubator and Madrona.
Big Tech leaders, including former Google CEO and chair Eric Schmidt, Jeff Dean, the chief scientist at Google DeepMind, and Kyle Vogt, the founder and former CEO of Cruise, reportedly participated in the January 2025 raise.
Not all of Vercept’s founding team was pleased with the acquisition by Anthropic.
Etzioni, in a post on LinkedIn, said: “After a little bit more than a year, Vercept is throwing in the towel and giving their customers 30 days to get off the platform. Sad.” Vy is scheduled to shut down on 25 March.
In a separate post, he held lead investor Bannon partly responsible for Vercept “failing to hire a single product [or] business person”. He alleged that the start-up’s board was led by Bannon and CEO Ehsani, who had “zero experience”.
Meanwhile, founding member Deitke left Vercept to join Meta last summer for a pay package that reportedly amounted to $250m over four years.
Last December, Anthropic acquired Bun, a coding toolkit founded in 2021, to accelerate Claude Code. Bun, according to Anthropic, had improved the JavaScript and TypeScript developer experience by optimising for reliability and speed.
On Tuesday (24 February), Anthropic and DocuSign announced the integration of Claude Cowork to enable DocuSign users to create, review and manage agreements using natural language prompts.