Jeff Bezos framed this copy of a 2006 BusinessWeek cover, reflecting Wall Street’s skepticism about AWS at the time. (Jeff Bezos via X, May 2022)
In the early days of Amazon Web Services, technical evangelist Jeff Barr was putting in long hours on the road, pitching a novel concept: rent computing power for 10 cents an hour, and storage for 15 cents a gigabyte per month — no servers to buy, no data centers to build.
Barr remembers calling his wife to check in at the end of the day. Get a nice dinner, she told him, you deserve it. But later, at the restaurant, looking at the menu and doing the math in his head, he couldn’t help but ask himself if the pennies were adding up.
“Did enough people start using these servers to buy me a decent steak?” he wondered.
He probably should have ordered the filet.
Two decades later, AWS generates nearly $129 billion a year in revenue. That’s enough to rank in the top 40 of the Fortune 500 if it were a standalone company, ahead of the likes of Comcast, AT&T, Tesla, Disney, and PepsiCo. Companies such as Netflix, Airbnb, Slack, Stripe, and thousands more have built massive businesses on its platform.
When AWS goes down, it ripples across the web, taking down apps, websites, and services that most users never knew were on a common infrastructure.
But the business that defined cloud computing — bankrolling Amazon’s expansion into everything from streaming to same-day delivery — is now grappling with the most significant challenge since it launched. The rise of AI has upended the industry, empowering Microsoft, Google and others, and creating competitive dynamics that seem to change every month.
For the first time, AWS faces questions about its long-term ability to lead the market it created.
With Amazon marking the 20th anniversary of AWS this month, GeekWire spoke with early builders, current AWS insiders, and longtime observers of the company to tell the story of how the business got started, how it won the cloud, and what it’s up against now.
Scalable, reliable, and low-latency
Officially, Amazon pegs the public launch of AWS to March 14, 2006. That’s when it announced “a simple storage service” that offered software developers “a highly scalable, reliable, and low-latency data storage infrastructure at very low costs.”
Dubbed S3, it was Amazon’s first metered cloud service: the first time developers could pay for exactly what they used, billed in tiny increments, with no upfront commitment.
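To make the novelty concrete, here is a hypothetical back-of-the-envelope calculation using the launch-era rates quoted earlier in this story (10 cents per server-hour, 15 cents per gigabyte-month). The function and figures are illustrative, not Amazon’s actual billing logic:

```python
# Illustrative sketch of AWS's early metered pricing, using the rates
# cited in this story: $0.10 per server-hour, $0.15 per GB-month.
# A simplification for scale, not Amazon's real billing system.
EC2_PER_HOUR = 0.10
S3_PER_GB_MONTH = 0.15

def monthly_bill(server_hours: float, storage_gb: float) -> float:
    """Pay only for what you use: no servers to buy, no upfront commitment."""
    return server_hours * EC2_PER_HOUR + storage_gb * S3_PER_GB_MONTH

# One server running around the clock for a 30-day month, plus 50 GB stored:
print(round(monthly_bill(30 * 24, 50), 2))  # 79.5
```

At a time when a single physical server could cost thousands of dollars upfront, a monthly bill measured in tens of dollars was the radical part.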
“We think it can be a meaningful, financially attractive business.” A Bloomberg News story quotes Jeff Bezos about AWS in November 2006. S3 launched earlier in the year.
All of this might seem mundane in a modern world where the cloud and internet services are almost like electricity and water, seemingly always there when you need them.
But remember the context of that moment: Facebook was available only on college campuses. Netflix arrived on DVDs in the mail. The iPhone was still a year away from being unveiled. And over at Microsoft in Redmond, they were finally getting ready to ship Windows Vista.
The asterisk in the headline
The history of Amazon Web Services is more complicated than it might seem, and it’s actually a subject of some disagreement behind the scenes. There are multiple origin stories, including one offered by Amazon itself, and others by former employees who say the company has tidied up the narrative over the years to shape the lore around its current leaders.
Journalist Brad Stone, author of the canonical Amazon book, “The Everything Store,” discovered this when Andy Jassy — the longtime AWS CEO who would go on to succeed Jeff Bezos as Amazon CEO — disputed aspects of his telling of the AWS story in a one-star review.
One point of contention: the origins of EC2, the AWS service built by a small team in South Africa, and the degree to which it sprang from the process Jassy led or was born independently.
Part of the challenge: Amazon, despite operating the storehouse of the internet, isn’t great at preserving its own history. The company, which cooperated with this piece, wasn’t able to unearth key documents such as Jassy’s original AWS six-pager from September 2003.
Some former Amazon leaders take things further back, to a set of e-commerce APIs that Amazon released in July 2002, allowing outside developers to access its product catalog and build applications on top of it. By that accounting, AWS is closer to 24 years old.
Overcoming internal opposition
The effort was led by business leader Colin Bryar, who ran Amazon’s affiliates program, along with technical leader Robert Frederick, whose Amazon Anywhere team (focusing on making Amazon’s site and features available on mobile devices) had been working since 1999 on internal web services that became the foundation for the external APIs.
Amazon in those days was on Seattle’s Beacon Hill, in the landmark art deco Pacific Medical Center tower overlooking downtown. Jeff Bezos was directly involved from the early days, as a believer in the vision that Amazon’s infrastructure capabilities could become a big business.
In 2002, when Bryar initially pitched a roomful of senior leaders on the idea of opening up Amazon’s product catalog and features as web services to outside developers, nearly all of them said no, as Frederick recalled in a recent interview.
The objections piled up: it would cannibalize existing business, it would educate competitors. Then, as Frederick remembers it, Bezos looked around the table and let out one of his trademark piercing laughs. Amazon’s founder wanted to see what developers would do.
“Let’s do it,” Frederick recalls Bezos saying, “and let’s have them surprise us.”
Later, in a July 2002 press release announcing “Amazon.com Web Services,” Bezos used nearly identical language: “We can’t wait to see how they’re going to surprise us.”
Big developer response
Within months, tens of thousands of developers had signed up. Increasingly, they were asking for things like storage, hosting, and compute, recalled Frederick, who worked at Amazon through mid-2006. He went on to found IoT platform Sirqul in 2013 and remains its CEO.
Another veteran of those early days agreed that the developer response to those initial e-commerce APIs may have opened the minds of Amazon’s leaders to the larger possibilities.
“Maybe that’s where Andy’s brain lit up. … Maybe that’s where Jeff’s brain lit up,” said Dave Schappell, referring to Jassy and Bezos. Schappell arrived at Amazon in 1998 as Jassy’s MBA intern, dropped out of Wharton to stay, and spent the next seven years working with him.
Schappell ran the associates program after Bryar, became an early head of product for AWS, and hired the original product managers. Those product managers included Jeff Lawson, who went on to found Twilio. Schappell himself became a well-known Seattle entrepreneur before returning to AWS for four years after Amazon acquired his startup TeachStreet.
The ‘crystal-clear movie moment’
Jeff Barr was one of the developers who noticed.
Now an Amazon VP and longtime AWS chief evangelist, Barr was working as an outside consultant in the web services field when he logged into his Amazon Associates account one day in 2002 and noticed a new message.
AWS Chief Evangelist Jeff Barr joined Amazon in the early days of the business. (Amazon Photo)
Amazon now had XML, it said, referring to the data-formatting standard that allowed software systems to communicate over the internet. Amazon was making its product catalog available as a web service and connecting it to the affiliate program, a surprising move at the time.
“I clicked through, I signed up for the beta. I downloaded it right away,” Barr recalled.
He sent feedback to the email address in the documentation. They actually replied.
Before long, he was invited to a small developer conference at Amazon’s headquarters — maybe four or five attendees at the Pacific Medical Center tower, in a semicircular open space with a view of the city. The developers sat in the middle, with Amazon employees around them.
At some point, one of the Amazon presenters announced that they were so impressed by how developers had found the APIs and started publishing apps within 48 hours that they were going to look around the rest of the company for more services to open up.
“That was that crystal-clear movie moment,” Barr said. He turned to an Amazon employee nearby and told her: “I have to be a part of this.”
Creating the cloud
But what Frederick and team had built was essentially a way for outside developers to access Amazon’s product data. It was not yet the cloud as we know it today.
That move started in mid-2003, as Jassy told the story in a 2013 talk at Harvard Business School. Jassy, then serving as Bezos’s technical advisor, was tasked with figuring out why software projects across Amazon were taking so long. It turned out that engineers were spending months building storage, database, and compute solutions from scratch.
In a meeting of six or seven people that summer, someone made the observation that would change the company’s trajectory. Jassy recalled the thinking during his HBS talk: “We’re pretty good at this. And if we’re having so many problems, and we don’t have anything we can use externally, I imagine lots of other companies probably have the same problem.”
Around the same time, Amazon recruited Werner Vogels, a Cornell distributed systems researcher, as its chief technology officer. He almost didn’t take the call. “It’s an online bookstore,” he recalled in a LinkedIn post last week. “How hard could their scaling be?”
But the company was wrestling with every problem he and his colleagues had been theorizing about — fault tolerance, consistency, availability at scale — live in production, every day.
Fundamental building blocks
Schappell remembers those early days as a non-stop cycle of six-page memos and meetings with Jassy and Bezos, all focused on trying to figure out what to build.
The concept that would define AWS — breaking every capability down to its most basic building block, or “primitive” — didn’t arrive fully formed. “I don’t think he said that on day one,” Schappell said of Bezos. “I think he said it after he read 47 of our six-pagers.”
Each primitive would stand on its own, and customers would pay only for what they used, billed in tiny increments. It was a direct rebuke to the licensing models of companies such as database giant Oracle, where customers paid for everything whether they used it or not.
Rahul Singh, who joined AWS in January 2004 as one of its first engineers, recalled the early technical plans going through just one layer of review before reaching Bezos and Jassy. (It’s the kind of streamlined decision-making that Jassy is now trying to restore across the company.)
Fault tolerant by design
In one early meeting, Bezos told the engineers he wanted a server touched exactly twice: once when installed in the data center, and once years later when it was pulled out. In between, nothing. The software had to be built to tolerate failures, leaving dead machines behind and moving on. It was a philosophy that would define the architecture of the cloud.
On Singh’s first day, his manager Peter Cohen sat him down in the lunch area and handed him a planning document (a “PR/FAQ” in Amazon lingo) that had just been approved by Bezos.
“We’re calling this S4,” Cohen said. Singh looked at the name of the product, Simple Server-Side Storage Service, and pointed out that it should be called S5. Singh recalls Cohen’s response: “Yeah, you’re really smart, aren’t you? Let’s see if you can actually build this.”
It was eventually shortened to Simple Storage Service, or S3.
The queuing service SQS had launched in beta in 2004 (adding further to the debate over the origin story and what counts as the launch), but S3 was the first service made generally available.
A billion-dollar business?
Jassy, then the VP in charge of AWS, would hold all-hands meetings in a conference room with four or five engineers, most of them straight out of college and grad school, as Singh recalled in an interview. Jassy ran them with the discipline of a much larger organization, repeating over and over that AWS could be a billion-dollar business, at a time when it had no revenue at all.
Singh remembers being highly skeptical.
“I was young and naive, and I remember thinking: a billion, that’s a really big number,” Singh said. Years later, he would joke with Jassy that the prediction had been completely wrong: it turned out to be a multi-billion-dollar business, many times over.
In a LinkedIn post marking the March 14 anniversary, current AWS CEO Matt Garman — who joined the company as a summer intern in 2005, before the launch of S3 — recalled how early customers like FilmmakerLive and CastingWords took a bet on the fledgling platform.
“That shift changed the economics of building technology almost overnight,” he wrote.
Meanwhile, in Cape Town …
While one team was building S3 in Seattle, the compute side of the equation was taking shape 10,000 miles away. Chris Pinkham, an Amazon VP who wanted to move back to his native South Africa, was given permission to set up a development office in Cape Town.
His small team built EC2 — the Elastic Compute Cloud — largely independent of the Seattle operation. The local tech community was a bit bewildered by what Amazon was doing.
“We knew this bookstore had arrived in town,” recalled Dave Brown, who was working at a local payments startup at the time. He asked his friends who had joined what they were doing.
Dave Brown, Vice President, AWS Compute & ML Services, at AWS re:Invent 2025. (Amazon Photo)
“It’s kind of like, you know, you can rent a computer on the internet,” they told him.
Brown asked about the revenue. “Tens of dollars every single day,” they said.
He remembers wondering why they were wasting their time on that.
The answer became clear when EC2 launched in August 2006, five months after S3, adding compute to storage as another fundamental building block of AWS and the cloud.
Early customers showed EC2’s range: a Spider‑Man movie used it for rendering, and Facebook apps like FarmVille and Animoto spun up instances on demand, as Brown recalled.
A New York Times engineer, told by the company that the job would be cost-prohibitive using traditional approaches, used a personal credit card to run optical character recognition on the paper’s scanned archives over a weekend, making the entire archive searchable. It cost a grand total of a couple hundred bucks, even though he initially botched the job and had to run it again.
Typing ahead of the characters
Brown joined in August 2007, the 14th person on the EC2 team. They worked out of a tiny office in Constantia, the winelands part of Cape Town, across the highway from vineyards.
They occupied part of one floor of an office building. There was one conference room, and two offices. The rest was open plan. The team was 14 engineers, one product manager, and Peter DeSantis, the leader who came from Seattle to help build the service.
The internet connection was a four-megabit DSL line shared by the entire office, with 300 milliseconds of latency to the data centers in the U.S. When engineers typed, each character had to make the round trip across the ocean and back before it appeared on screen.
“You get really good at typing ahead of where the actual characters are appearing,” Brown said.
Every morning, someone had to find the VPN token to get the office online. It lasted about 10 hours before it automatically reset. “Everybody would be shouting, where’s the VPN token?”
Scrambling to keep up
One day, they were running low on computing capacity. DeSantis came out of his office and told the engineers to shut down the machines they were using for testing. That freed up enough capacity to keep the service going for a few days until the next racks of hardware came online.
Marc Brooker, now an AWS VP and distinguished engineer working on agentic AI, joined the EC2 team in Cape Town in 2008. He could see the entire team from his desk. When Brown was away one day, Brooker and the team covered every surface of his office in sticky notes — the kind of prank that only works in a small office where everyone knows everyone else.
Brooker was drawn in by something he heard about in his job interview: the team had built a way to make a distributed system look like a physical hard drive to the operating system.
“Wow, that is so cool,” he recalled thinking. “Here’s 20 other things I can think of that we could do with that kind of technology.”
That instinct, that the building blocks of the cloud could be combined and recombined in ways no one at Amazon had imagined, was at the core of what made AWS catch on.
AWS VP Mai-Lan Tomsen Bukovec, who oversees AWS’s core data and analytics services, in front of a whiteboard on which she mapped the evolution from the early days of S3 to the AI era at Amazon’s re:Invent building in Seattle. (GeekWire Photo / Todd Bishop)
“The world would be in a very different place if you didn’t have the freedom to experiment, to pilot, to try something, to move on to some other idea, that AWS first introduced,” said Mai-Lan Tomsen Bukovec, an AWS VP who has led S3 for 13 of its 20 years.
Prasad Kalyanaraman, now the AWS vice president who oversees global infrastructure, previously spent years building supply-chain forecasting systems for Amazon’s retail operation. Around 2011, Charlie Bell, then a senior AWS leader, asked him to help with a problem: the team was forecasting its compute demand using spreadsheets.
He adapted the supply-chain forecasting tools for AWS, but the cloud business kept outrunning every model he built.
“The funny thing about forecasts is that forecasts are always wrong,” he said. “It’s very hard to actually predict exponential growth.”
How AWS grew
It began with startups. Airbnb, Instagram, and Pinterest, companies that would define the next era of technology, all got their start on AWS.
John Rossman, a former Amazon exec and author of books including “The Amazon Way” and “Big Bet Leadership,” remembers Jassy pulling him aside for coffee at PacMed around 2008. Rossman had left Amazon and was working as a consultant to large businesses. Jassy wanted to know: did he think big companies would ever be interested in on-demand computing?
Maybe, maybe not, Rossman said. He was working with Blue Shield of California at the time, and tried to imagine the insurer running on AWS. It was hard to picture. The typical AWS customer then was a startup developer with little budget for infrastructure; the idea of a big insurance company running on AWS seemed like a stretch.
“I was a little bit of a pessimist on it,” Rossman said.
But soon things started to change.
Netflix moved its streaming infrastructure to AWS starting in 2009, a decision that carried particular weight because it competed with Amazon in video. In 2013, the CIA awarded AWS a contract over IBM, signaling that the platform was trusted at the highest levels of security.
Microsoft tips its hat
AWS’s pricing model, in which customers paid only for what they used, was a direct threat to the licensing businesses of tech’s old guard. Whether burying their heads in the sand or just preoccupied, the companies that would become the biggest AWS rivals were slow to respond.
Microsoft didn’t unveil its cloud platform — code-named “Red Dog,” and initially launched as “Windows Azure” — until October 2008, more than two years after S3 debuted. Bill Gates had left his day-to-day role at Microsoft a few months earlier. The company was still recovering from the aftermath of the Vista flop.
“I’d like to tip my hat to Jeff Bezos and Amazon,” said Ray Ozzie, then Microsoft’s chief software architect, at the launch event — a rare public acknowledgment of a competitor’s lead.
Azure didn’t reach general availability until 2010, and its early incarnation was more a platform for applications than the raw infrastructure that made AWS so popular with developers. It took Microsoft years to build out comparable offerings.
Google launched App Engine, a platform for running applications, in 2008, but didn’t offer raw computing infrastructure to rival EC2 until Compute Engine arrived in 2012.
‘The AWS IPO’
For years, AWS grew in something close to silence. Amazon said little about the overall growth, and didn’t break out the financial results for the business in its quarterly earnings reports.
Then, in April 2015, Amazon reported its first-quarter earnings with AWS broken out in detail for the first time, and it stunned the industry. The business had a $6 billion annual revenue run rate and was growing 50% a year.
The modest expo hall at the first AWS re:Invent, under construction in 2012, left. Last year’s conference, right, drew 60,000 people to Las Vegas. (2012 Photo Courtesy Jeff Barr; 2025 Photo by Todd Bishop)
AWS generated more than $250 million in profit that quarter alone, with operating margins around 17%. This was a stark contrast with the rest of Amazon, scraping by on traditional retail margins of 2% to 3%. AWS was making significantly more profit on every dollar of revenue.
The hosts of the Acquired podcast, in their extensive 2022 history of the rise of Amazon Web Services, would later call this moment, in effect, “the AWS IPO.”
Amazon stock jumped 15% on the news.
“I was blown away,” said Schappell, the early AWS product leader who left in 2004 and later listened to the first AWS earnings breakout while training for a marathon. For years, he had assumed Amazon was losing billions on AWS. The reality was the opposite: AWS had become so profitable that it was effectively bankrolling Amazon’s future.
The margins kept climbing, reaching 35% by early 2022.
Then the pandemic cloud boom faded. Inflation spiked amid broader economic uncertainty. Customers scrutinized their cloud bills and pulled back spending. AWS revenue growth fell from 37% to 12% over the course of the year, the slowest in its history. Margins fell to 24%.
The ChatGPT moment
Then everything changed, for Amazon and everyone else.
On November 30, 2022, OpenAI released ChatGPT with little fanfare. The consumer AI chatbot quickly became the fastest-growing application in history, reaching 100 million users in two months and sending the technology world into a frenzy.
For AWS, the stakes were huge. Every major wave of technology over the previous 15 years, from mobile to social to streaming to e-commerce, had been built on its platform.
If AI was the next wave, AWS needed to lead the way again.
Amazon was far from absent in AI. AWS had launched SageMaker in 2017, giving developers tools to build and deploy machine learning models. It had released custom AI chips for inference and training. Alexa, the voice assistant, had been processing natural language queries since 2014. Amazon had spent many years and billions of dollars on machine learning.
But none of it looked or worked like ChatGPT. The new model could write code, draft essays, answer complex questions, and hold a conversation. It was not a feature. It was a product people wanted to use. And it was built by an AI lab running on Microsoft Azure.
‘AWS sneaked in there’
The irony: OpenAI didn’t start on Microsoft’s cloud. It launched on AWS.
When the AI lab debuted in December 2015, AWS was listed as a donor. OpenAI was running its early research on Amazon’s infrastructure under a deal worth $50 million in cloud credits.
Microsoft CEO Satya Nadella learned about it after the fact. “Did we get called to participate?” he wrote to his team that day, in an email that surfaced only recently in a court filing from Elon Musk’s suit against Microsoft and OpenAI. “AWS seems to have sneaked in there.”
Microsoft moved fast. Within months, Nadella was courting OpenAI. The AWS contract was up for renewal in September 2016. “Amazon started really dicking us around on the [terms and conditions], especially on marketing commits,” Sam Altman wrote to Musk, who was then OpenAI’s co-chair. “And their offering wasn’t that good technically anyway.”
By that November, Microsoft had won the business.
Six years later, with the launch of ChatGPT, that bet paid off in ways no one could have predicted. Microsoft stock surged. Amazon, like many others in the industry, was scrambling to figure it all out — suddenly trying to keep up with the future of a market it had long defined.
Pivoting to generative AI
The AWS CEO at the time was Adam Selipsky, who had helped build the business from its earliest days before leaving in 2016 to run Tableau, the data visualization company. He returned in May 2021 to lead AWS after Jassy was promoted to succeed Bezos as Amazon CEO.
In a May 2024 interview with Selipsky, on one of his last days in the role, GeekWire asked him directly if Amazon had been caught flat-footed by the rise of generative AI.
After a member of his team interjected to say the question seemed to be informed by reading too many Microsoft press releases, Selipsky dismissed the idea that AWS was behind.
While that narrative might have “more sizzle” and generate clicks, Selipsky said, the reality was different, as evidenced by Amazon’s years of work in AI and machine learning.
AWS had announced Inferentia, a chip for deep learning, in 2018, building on its 2015 acquisition of Annapurna Labs, the Israeli chip startup. It began work on CodeWhisperer, an AI coding assistant, in 2020 — before GitHub Copilot existed, the company notes. In 2021, it launched Trainium, a chip designed to train models with 100 billion or more parameters.
Dario Amodei, CEO of Anthropic, right, speaks with Adam Selipsky, then CEO of Amazon Web Services, at AWS re:Invent on Nov. 28, 2023. (GeekWire File Photo / Todd Bishop)
At the same time, Selipsky acknowledged that AWS had “pivoted many thousands of people from other interesting, important projects to work on generative AI” — a scale of reallocation signaling something other than business as usual inside the company.
Tomsen Bukovec, who now oversees AWS’s core data services including S3, analytics, and streaming, said her team’s response was less a pivot than a process of learning.
They educated themselves on what the technology meant for their services, she said, and thought deeply about what it would look like for AI to both create and consume data at scale.
The question her team started asking in late 2022: what does the world look like when 70 to 80 percent of the usage of your services comes through AI?
“AI is going to use it at 10 times to 100 times the rate of a human, and it’s going to do it all day long, all the time, 24 hours,” she said. “AI never goes to sleep.”
Scrambling to meet the moment
The pressure to catch up in generative AI was felt across the company. In a lawsuit filed in Los Angeles Superior Court, an AI researcher who worked on Amazon’s Alexa team alleged that a director instructed her to ignore internal copyright policies because “everyone else is doing it.”
The complaint described ChatGPT’s launch in late November 2022 as causing “panic within the organization.” Amazon has denied the allegations, and the case is still pending.
On Amazon’s earnings call in early February 2023, two months after ChatGPT’s launch, CEO Andy Jassy did not discuss generative AI or large language models.
Matt Garman, AWS CEO, speaks at AWS re:Invent 2025. (GeekWire File Photo / Todd Bishop)
By the next quarter’s call, in late April 2023, he spoke about it for nearly ten minutes, describing it as “a remarkable opportunity to transform virtually every customer experience that exists.”
In September 2023, the company announced an investment of up to $4 billion in Claude maker Anthropic, the AI startup founded by former OpenAI researchers. The investment would eventually grow to $8 billion — which seemed like a lot at the time.
Selipsky left AWS in mid-2024. Garman, whom Selipsky had hired as a product manager in 2006, succeeded him as CEO, charged with leading the cloud business into the new era.
From CodeWhisperer to Bedrock
The roots of Amazon’s response actually predated ChatGPT by more than two years, although it faced initial skepticism internally. In 2020, Atul Deo, an AWS product director, wrote a six-page memo proposing a generative AI service that could write code from plain English prompts.
Jassy, who was still leading AWS at the time, wasn’t sold. His reaction, as Deo later told Yahoo Finance, was that it seemed like a pipe dream. The project launched in 2023 as CodeWhisperer, an AI coding assistant.
But by then, ChatGPT had redrawn the landscape, and the team realized they could offer something broader: a platform giving customers access to a range of foundation models through a single service. AWS called it Bedrock. The name reflected an ambition to do for AI models what the company had done years earlier with its Relational Database Service, which wrapped MySQL, Oracle, and other database engines in a common management layer.
Bedrock would do the same for large language models.
The decision to offer multiple models rather than push a single in-house option was deliberate, and rooted in a pattern AWS had followed for years. It brought multiple CPUs to the cloud: AMD, Intel, and its own Graviton. It offered Nvidia GPUs alongside its own Trainium chips.
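The "one API, many models" pattern described above can be sketched in a few lines. This is a hedged illustration, not Amazon's implementation: the model IDs are abbreviated placeholders, and no AWS call is made here (with credentials, a request shaped like this could be handed to the Bedrock runtime's Converse API via boto3):

```python
# Sketch of the multi-model pattern: the request shape stays the same,
# and only the model identifier changes. Model IDs here are illustrative
# placeholders, not exact Bedrock identifiers.
def build_request(model_id: str, prompt: str) -> dict:
    """Build one Converse-style request; swapping models means swapping one field."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 256},
    }

for model_id in ("anthropic.claude", "meta.llama", "amazon.nova"):
    request = build_request(model_id, "Summarize last quarter's cloud spend.")
    # With AWS credentials, this dict could be passed to
    # boto3.client("bedrock-runtime").converse(**request)
    print(request["modelId"])
```

The design mirrors what the article describes for RDS: a common management layer over interchangeable engines, so customers can switch providers without rewriting their applications.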
Fastest-growing AWS service
Amazon’s view is that choice drives competition, which drives down prices for customers.
“We knew there was never going to be one model to rule everybody,” said Dave Brown, the AWS vice president who oversees EC2, networking, and custom silicon. “And even the best model was not going to be the best model all the time.”
Bedrock launched in preview in April 2023 and reached general availability that September, with models from Anthropic, Meta, and others alongside Amazon’s own. Two years later, it had become the fastest-growing service AWS had ever offered, with more than 100,000 customers.
On Amazon’s most recent earnings call, Jassy described it as a multi-billion-dollar business, with customer spending growing 60% from one quarter to the next.
At the end of 2024, Amazon added its own entry to the model race. The company introduced a family of foundation models called Nova, positioned as a lower-cost, lower-latency alternative to the third-party models on the Bedrock platform.
Amazon CEO Andy Jassy unveils the Nova models at AWS re:Invent in December 2024. (GeekWire Photo / Todd Bishop)
As Fortune’s Jason Del Rey observed, it was a page from the e-commerce playbook: build the marketplace first, then stock it with a house brand. Just as Amazon sells goods from thousands of merchants alongside its own private-label products, Bedrock offered models from Anthropic, Meta, and others, and now Amazon’s own models to go along with them.
At re:Invent in late 2025, AWS pushed further, unveiling what it called “frontier agents” — autonomous AI systems designed to work for hours or days without human involvement.
One, built into Amazon’s Kiro coding platform, can navigate multiple code repositories to fix bugs while a developer sleeps. Last month, the Financial Times reported that Amazon’s own AI coding tools caused at least one AWS service disruption. Amazon acknowledged the incident but publicly disputed aspects of the reporting, citing a misconfigured role, not the AI itself.
The $200 billion bet
Like its rivals, AWS is also building the physical infrastructure to back up its AI ambitions. In 2025, less than a year after it was announced, AWS opened Project Rainier, one of the world’s largest AI compute clusters, centered in Indiana and powered by more than 500,000 of Amazon’s Trainium2 chips.
Named after the mountain visible from Seattle, Rainier was built to train and run Anthropic’s next generation of Claude models on Amazon’s own silicon rather than Nvidia GPUs.
Kalyanaraman, the AWS vice president who oversees global infrastructure, said the project forced AWS to rethink its supply chain from the ground up. The goal was to minimize the time between a chip leaving its fabrication facility and serving a customer workload.
Rainier was built at a faster pace than anything AWS had ever done, Kalyanaraman said, with more than 100,000 Trainium chips available to Anthropic in under a year. But it wasn’t a one-off. He called it the new template for how AWS would build AI infrastructure going forward.
Then, late last month, came the deal that brought the story full circle.
OpenAI — the company that launched on AWS in 2015 and left for Microsoft Azure the following year — announced a partnership with Amazon that included up to $50 billion in investment and a cloud agreement worth more than $100 billion over eight years.
OpenAI committed to run workloads on Amazon’s custom Trainium chips, making it the second major AI lab after Anthropic to do so. The two companies had been talking since at least May 2023, according to SEC filings, but Microsoft’s right of first refusal on OpenAI’s compute had blocked a deal until those restrictions were loosened in the latest renegotiation.
By late 2025, AWS revenue was growing at its fastest pace in more than three years, up 24% to $35.6 billion a quarter, and the company disclosed that its Trainium and Graviton chips had reached a combined annual revenue run rate of more than $10 billion.
The competitive picture was also coming into sharper focus.
In mid-2025, Microsoft disclosed standalone Azure revenue for the first time: $75 billion a year, up 34%. Google Cloud had crossed a $50 billion annual run rate. AWS, at more than $116 billion a year at the time, was still larger — but no longer running away with the market.
All of this helps to explain Amazon’s record capital spending. On the company’s latest earnings call, Jassy defended plans to spend $200 billion this year, most of it on AI infrastructure.
The figure is so large it would consume nearly all of Amazon’s operating cash flow. Facing a Wall Street backlash, Jassy called artificial intelligence “an extraordinarily unusual opportunity to forever change the size of AWS and Amazon as a whole.”
What’s next: Bear and bull cases
Longtime observers are divided on the company’s AI bet.
Corey Quinn, a cloud economist who works with AWS customers through his Duckbill consultancy, sees little real-world traction for Amazon’s Nova models. “You know someone is an Amazon employee when they talk about Nova, because no one else is,” he said.
Some businesses bypass Amazon’s Bedrock platform entirely because of capacity constraints and slower speeds, he said, going to third-party providers like Anthropic rather than inserting Bedrock as a “middleman” — unless they’re trying to retire their committed AWS spend.
Looking forward, Quinn pointed to a historical parallel. At the height of the dot-com boom, Cisco was the most valuable company in the world, the backbone of the internet. Today it is a profitable but largely invisible utility. AWS, he said, could be headed for the same fate.
“It’s very clear that there will be a 40th anniversary for AWS, because that inertia does not go away,” Quinn said. “But will it be at the center of tech policy and giant companies, or is it going to be a lot more like the Cisco of today?”
Om Malik, the veteran tech writer, cast a critical eye on Amazon’s OpenAI investment.
By his math, Amazon is paying roughly 16 times more per percentage point of OpenAI than Microsoft did, with none of the exclusive IP rights, revenue share, or primary API access that Microsoft locked up years ago. The cost of being late, Malik wrote, is measured in billions.
The lobby at AWS headquarters, the re:Invent building in Seattle. (GeekWire Photo / Todd Bishop)
Rossman, the former Amazon executive who was once skeptical about AWS demand from big business, sees a different picture. He agrees that AWS is strong in infrastructure, the picks and shovels. But where Quinn sees that as a ceiling, Rossman sees it as a moat.
The models are the commodity, Rossman contends. They leapfrog each other constantly. What matters is everything the models run on and through: the chips, the servers, the data centers, the power. AWS is building more of that stack than most competitors.
“That’s where the value is,” he said.
Rossman said he could envision AWS operating nuclear power plants someday. The long-term winners, he said, will be the companies that deliver the best AI at the lowest cost per token. That’s where AWS’s vertical integration — from Trainium chips to Bedrock to the data center itself — gives it an advantage competitors can’t easily replicate.
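Cost per token is simple arithmetic, but it is the metric that vertical integration is meant to drive down. A toy illustration with purely hypothetical prices (not actual rates from AWS or any other provider) shows why small per-token differences matter at datacenter scale:

```python
# Toy cost-per-token math with hypothetical prices, not actual rates
# from AWS or anyone else. The point: small per-token price differences
# compound enormously at high volume.

def monthly_cost(price_per_million_tokens: float, tokens_per_month: int) -> float:
    """Total monthly spend given a price per million tokens and volume."""
    return price_per_million_tokens * tokens_per_month / 1_000_000

VOLUME = 500_000_000_000  # hypothetical: 500 billion tokens per month

# Hypothetical scenario: an integrated chips-to-datacenter stack shaves
# 30% off the per-token price.
baseline = monthly_cost(price_per_million_tokens=10.00, tokens_per_month=VOLUME)
integrated = monthly_cost(price_per_million_tokens=7.00, tokens_per_month=VOLUME)

print(f"baseline:   ${baseline:,.0f}/month")    # $5,000,000/month
print(f"integrated: ${integrated:,.0f}/month")  # $3,500,000/month
```

At that hypothetical volume, a 30% per-token discount is $1.5 million a month, which is the kind of gap Rossman argues owning the stack can create.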
As for the risk of spending too much, Rossman put it simply: you have to decide which side of history you’d prefer to fail on, overbuilding or underbuilding. Amazon has made its choice.
In an internal all-hands meeting last week, Jassy said AI could help AWS reach $600 billion in annual revenue, double his own prior estimate, Reuters reported. He had been thinking for years that AWS could be a $300 billion business in a decade. AI, he said, changed the math.
Jack Conte created Patreon to try and earn extra from his YouTube videos. The musician-turned-businessman is now managing a platform with 3 million monthly active users, and has plenty to say to big corporations operating chatbots and other AI platforms. First and foremost, these AI companies should stop crying foul… Read Entire Article Source link
A new bill proposed in California “goes after big tech companies” writes Semafor. Supported by Y Combinator, Cory Doctorow , and the nonprofit advocacy group Fight for the Future, it’s called the “BASED” act — an acronym which stands for “Blocking Anticompetitive Self-preferencing by Entrenched Dominant platforms.”
As announced by San Francisco state representative Scott Wiener, the bill “will restore competition to the digital marketplace by prohibiting any digital platform with a market capitalization greater than $1 trillion and serving 100 million or more monthly users in the U.S., from favoring their own products and services on the platforms they operate.”
For years, giant digital platforms like Apple, Amazon, Google, and Meta have used their immense power to promote their own products and services while stifling competitors — a practice also known as self-preferencing. The result has been higher prices, diminished service, and fewer options for consumers, and less innovation across the technology ecosystem.
Advertisement
Self-preferencing also locks startups and mid-sized companies out of the online marketplace unless they play by rules set by their competitors. As a new generation of AI-powered startups seeks to enter the marketplace, their success — and public access to the innovations they produce — depends on their ability to compete on an even playing field.
“Anticompetitive behavior is everywhere on the internet,” said Senator Wiener, “from rigged search results, to manipulative nudges boosting the ‘house’ product, to anti-discount policies that raise prices, to the dreaded green bubble that ‘breaks’ the group chat. When the world’s largest digital platforms rig the game to favor their own products and services, we all lose. By prohibiting these anticompetitive practices, the BASED Act will protect competition online, empower consumers and startups, and promote innovations to improve all our lives.” The announcement includes a quote from Teri Olle, VP of the nonprofit Economic Security California Action, saying the act would “safeguard merit-based market competition. This legislation stands for a simple principle: owning the stadium doesn’t mean that you get to rig the game.”
Some conduct prohibited by the proposed bill includes
Manipulating the order of search results to favor a provider’s products or services, irrespective of a merit-based process,
Using non-public data generated by third-party sellers — including sales volumes, pricing, and customer behavior — to develop competing products that are subsequently boosted above the third-party sellers’ product…
And the announcement also notes that “under the terms of the bill, providers could not prevent consumers from obtaining a portable copy of their own data or restrict voluntary data sharing (by consumers) with third parties.”
“This is exactly the kind of common-sense antitrust reform we need if we want the next generation of startups to have a fair shot rather than watching Big Tech pull up the ladder behind them.”
— Jeremy Stoppelman CEO and Co-Founder, Yelp
“California has led the way on privacy, and now it has a chance to lead on digital competition. SB 1074 would prohibit the self-preferencing tactics that dominant platforms use to box out competitors — the same tactics that make it harder for people to discover and switch to privacy-respecting alternatives like DuckDuckGo.”
Advertisement
— Kamyl Bazbaz, Chief Communications and Policy Officer, DuckDuckGo
“When users can freely choose privacy-focused alternatives without artificial barriers, everyone benefits — from independent developers to everyday people who deserve control over their digital lives.”
— Raphael Auphan, Chief Operating Officer, Proton
“[The BASED act] is about stopping market corruption — the moment when a platform uses its control over the pipes to bury rivals, tax every transaction, and quietly swallow the open web. This bill restores something simple and very American: if you build something great, you should win or lose on the merits, not on whether a gatekeeper decides to rig the rules.”
Advertisement
— Garry Tan, CEO of Y Combinator
“If there’s one thing we’ve learned from the enshittification of digital platforms, it’s this: *someone* is going to regulate the way you use the internet. If governments don’t step in, that regulator will be a powerful *company*, a platform that structures markets to maximize its interests, at the expense of technology makers, technology users, buyers *and* sellers.”
The distinction between the models lies in their platforms. The AI+ versions use Intel’s latest Panther Lake platform and are powered by Core Ultra Series 3 chips. These configurations support higher performance, featuring two DDR5 memory slots. Read Entire Article Source link
Odds are, you’ve taken pills before; it’s a statistical certainty that some of you reading this took several this morning. Whenever you do, you’re at the mercy of the manufacturer: you’re trusting that they’ve put in the specific active ingredients in the dosage listed on the package. Alas, given the world we live in, that doesn’t always happen. Double-checking actual concentrations requires expensive lab equipment like gas chromatography. It turns out checking for counterfeit pills is easier than you’d think, thanks to a technique called Disintegration Fingerprinting.
The raw voltage signal from the sensor is stored as a “disintegration fingerprint” of particles detected per minute.
It’s delightfully simple: all you need is a clear plastic cup, a stir plate, and a handful of electronic components — namely, a microcontroller, a servo, and an IR line-following sensor. You’ve probably played with just such a sensor: the cheap ones that are a matched pair of LED and photodetector. It works like this: the plastic cup, filled with water, sits upon the stir plate. To start the device, you turn on the stir plate and actuate the servo to drop the pill in the water. The microcontroller then begins recording the signal from the photo-diode. As the pill breaks up and/or dissolves in the water, the swirling bits are going to reflect light from the IR LED. That reflectance signal over time is the Disintegration Fingerprint (DF), and it’s surprisingly effective at catching fakes according to the authors of the paper linked above. Out of 32 different drug products, the technique worked on 90% of them, and was even able to distinguish between generic and brand-name versions of the same drug.
Of course, you do need a known-good sample to generate a trustworthy fingerprint, and there’s that pesky 10% of products the technique doesn’t work on, but this seems like a great way to add some last-mile QA/QC to the drug distribution chain, particularly in low and middle-income countries where counterfeit drugs are a big problem.
An artist’s conception shows Portal Space Systems’ Starburst spacecraft at left and its larger Supernova platform in the distance at right, both outfitted with Paladin Space’s Triton payload for orbital debris tracking and removal. (Portal Space Systems Illustration)
Bothell, Wash.-based Portal Space Systems is partnering with an Australian venture called Paladin Space on a commercial service that would round up and dispose of potentially dangerous orbital debris.
The concept — known as Debris Removal as a Service, or DRAAS — is meant to address one of the most pernicious problems facing spacecraft operators: how to dodge tens of thousands of pieces of space junk that are zipping through Earth orbit.
Since its founding in 2021, Portal has been focusing on the development of maneuverable orbital vehicles that could rendezvous with other satellites, either for servicing or for disposal. Its flagship is the Supernova in-space mobility platform, which will be equipped with an innovative solar thermal propulsion system. There’ll also be a smaller version of the spacecraft, called Starburst. Starburst-1 is due for launch as early as this year, and Supernova is scheduled to make its debut in 2027.
Meanwhile, Paladin Space has been working on a reusable payload called Triton, which is designed to track and capture tumbling pieces of orbital debris that are less than 1 meter (3 feet) in size. That small-to-medium size category accounts for most of the debris that’s being tracked in orbit.
“Triton is built to remove dozens of those objects in a single mission, which fundamentally changes the cost structure of debris remediation and provides the greatest benefit to satellite operators,” Paladin CEO Harrison Box said today in a news release.
Advertisement
A space debris hit to the space shuttle Endeavour’s radiator was found after one of its missions. The entry hole was about a quarter-inch wide, and the exit hole was twice as large. (USGS Photo / circa 2007)
The Portal-Paladin partnership calls for installing Triton hardware on Starburst spacecraft. Portal’s orbital platform would go out in search of space junk, and Paladin’s payload would grab the debris. When Triton’s trash bin is full, it would be dropped off for safe disposal while the spacecraft remains in orbit for continued servicing.
The companies are targeting an initial deployment in 2027, focusing on heavily trafficked bands of low Earth orbit. Future missions may take advantage of Supernova’s added capabilities to service a wider variety of orbits.
Other efforts to remove orbital debris are in the works: A Japanese company called Astroscale executed two orbital test missions (ELSA-d and ADRAS-J) and is now gearing up for follow-up demonstration missions (COSMIC, ADRAS-J2 and ELSA-M). A Swiss company called ClearSpace is working with the European Space Agency on an experimental mission that would take a defunct satellite out of orbit.
Portal CEO Jeff Thornburg said DRAAS will be much more than a one-off demonstration. “This is about making debris removal operational, not experimental,” he said. “Satellite data underpins communications, navigation, weather forecasting and national security. Maintaining that infrastructure requires active debris management. For the first time, we can do that as a repeatable service.”
Portal has already attracted millions of dollars in financial support from SpaceWERX, a division of the U.S. Space Force that focuses on bridging the gap between commercial technologies and military needs. Its partnership with Paladin targets a different market for in-space services. NASA has estimated that debris avoidance maneuvers cost U.S. satellite operators roughly $58 million annually.
Advertisement
At least one potential customer is going public about its interest. Portal said Starlab Space, a joint venture that is working on a commercial space station, has signed a letter of intent to integrate the DRAAS service into future station operations. Starlab’s team includes Airbus, Voyager Technologies, Northrop Grumman, Mitsubishi and Palantir.
“Safety is the foundation of everything we’re building at Starlab,” said Brad Henderson, Starlab’s chief commercial officer. “We’re engineering a station designed to last for decades, one that must meet the highest standards of integrity to protect our crew and the science that will live aboard. Capabilities that reduce collision risk and limit the need for frequent collision avoidance maneuvers directly serve that mission.”
There are two main kinds of coffee subscription providers: roasters and retailers. Both roasters and multi-roaster retailers sell great coffee. This guide contains a mix of both.
Roasters are cafés and small-batch producers who buy raw beans from farmers all over the world and roast them to perfection. By buying from a roaster, you’re directly supporting the people who make your favorite coffees. The downside is you usually won’t have as broad a selection. Roasters usually sell only their own coffee, but that often means special blends and single origins are available from a roaster that you can’t get from a retailer. Your local roaster down the street may also have subscription offers, giving you the chance to buy local without leaving your house—and often catch a discount.
Advertisement
Retailers or Multi-Roaster Subscriptions are coffee subscription providers who buy their beans from many different roasters, then ship bags of coffee to you. A multi-roaster retailer will often have a much broader selection of high-quality coffee available (from multiple brands) to ship to your doorstep—often selected and curated carefully by coffee experts. The downside on some subscriptions is that you’re not buying directly from a roaster, which means the coffee may not be as fresh. (That’s where this guide comes in. We can tell you how fresh they are, because we always test each one and take note of the roast dates on each coffee bag.)
Subscription Beans vs. Locally Roasted Beans
Look: If you live in a big city with great coffee—and let’s be clear, nearly every midsize city in the United States has at least a couple of excellent roasters—the best way to try fresh roasts and new beans, and learn about them, is to … go to your local roaster. Look up your local coffee roasters or visit your favorite coffee shop and ask where they get their beans. Buy the beans. Talk to people. It’s fun, if you like talking to people.
Heck, this is also true when you’re traveling. The best coffee you can find is often the cup you drink when you’re on the road, in a new place, tasting something new. Even if you don’t live on the road, it’s fun to explore different shops when you do travel.
Advertisement
But the wonder of the internet is that you’re not limited to only the best of what’s local. Subscriptions allow you to take the temperature of the most interesting roasters from all over the country, without going anywhere in particular. Heard about that one roaster in Delaware or North Carolina making crazy coffee with co-ferments and natural fermentation? A roaster in Guatemala highlighting beans from their neighbors? Let them surprise you. Are you new to the world of premium coffee, and you’d like some help from the curators at Trade Coffee or Podium Coffee Club to learn what you like?
This is why you might take a subscription. The world is at your door—even the world you’ve never even visited. I’m also lazy enough to order subscriptions from roasters a 15-minute drive away, but this is between you and your local ecologist.
But also, sometimes it’s homesickness for what used to be local. One of the best, most interesting, and kinda attitudinal roasters I know in this country is a tiny spot in South Jersey called Royal Mile. They used to be my favorite local coffee shop, when I lived in Philly and would drive to Jersey to get the coffee. Now they aren’t local at all, because I moved. But through the magic of the internet and the US Postal Service, I can still get their truly wild, surprising, mad-scientist single-origin bags anytime I want. What a privilege.
How We Test Coffee Subscriptions
Advertisement
To test these subscriptions, we try a variety of beans from each service, both our own picks and any curated options. We brewed each bag in different ways to see which beans were best suited to which brewing method. Over subscriptions he tested, Scott Gilbertson covered the spectrum of grinds with espresso, moka pot, French press, pour over, and Turkish or cowboy coffee. Matthew Korfhage wanders through espresso, AeroPress, drip, cold brew, pour-over, and a wealth of somewhat unclassifiable devices.
It’s worth doing the same if you have access to different brewing methods, especially if you opt for a subscription that offers a lot of variety. A roast that makes a great shot of espresso does not necessarily make the best pour-over coffee, and vice versa. Some roasters, like the excellent Equator Coffee, offer one subscription specifically for espresso, one for decaf, and another for light single-origin roasts that lend themselves to drip and pour-over. It can also be rewarding to take notes on your favorites. Some of these services offer a way to do this on the site, which is handy, though a paper notebook works well enough. If you’d like some more pointers on brewing, be sure to read our guide to brewing better coffee at home.
Are Coffee Subscriptions Worth It?
A delivery coffee subscription service often does offer discounts on shipping or the base cost of each bag, as compared to buying single bags for delivery. But usually, subscriptions will be premium beans, so it won’t be as cheap as the less-fresh, often preground coffee from your grocery store.
Advertisement
But if you’re the sort who likes to try the best freshly roasted single-origin Ethiopian or Guatamalan beans from roasters all over the country? This is where coffee subscriptions shine. You’re also often getting the best speciality bags a roaster has to offer, or a curated selection from a certified Q-grader—meaning you’re a lot more likely to find new roasts and origins you wouldn’t have come across on your own.
But a coffee subscription gives me access to beans from all over the country and world. It’s a mix of ease and adventure, and a chance to be a barista at my own home multiroaster café. I enjoy that I can get fresh-roasted beans from a coffee farm in Guatemala who roasts their own impossibly fresh beans onsite, alongside world-famous beans from other farmers right down the road—or taking a world tour each month with beans from my favorite globe-hopping roaster, Atlas Coffee Club.
But for others, a coffee subscription is just a way to get a steady drip of their favorite bag from their favorite roaster, guaranteed to arrive every week or every two weeks. Simple convenience is its own form of worth it.
Advertisement
How Does WIRED Select Coffee Subscriptions to be Reviewed?
There’s a lot of good coffee out there. And I am never not trying coffee—drip, espresso, cold brew, I’m consistently drinking it and testing out new roasters. I’ve been writing about coffee for 15 years on both coasts, and I’ve always been on the lookout for new and exciting growers, roasters, and beans.
Coffee can be subjective, of course, and everyone has their preferences. I include my personal favorite roasters among this list, rotate in new discoveries I figure readers might be interested in, and also solicit favorites from other very… wired… WIRED reviewers with different palates. But when deciding what subscriptions to include in our small, curated list, I also ask: What does this subscription offer that others don’t? I’m often looking for coffee subscriptions that best serve particular types of drinkers—a new service, a new delivery method, a clever way to cater to what you (whoever you may be) really want at your doorstep each week.
Advertisement
Often, a unique or uniquely useful or just kinda cool subscription model or roaster will be the first I’m in line to test. Other times, I get sent a sample bag of beans and it sends me over a moon. Always feel free to send a note about a particularly terrific roaster or subscription, at [email protected].
How Have Tariffs Affected Coffee Prices?
Ain’t gonna lie. Tariffs don’t help coffee prices. Pretty much all coffee roasted and sold in the United States is imported. If it costs more to bring into the Unites States, it will eventually cost more to buy.
This is one of many factors that affected coffee prices throughout last year, including extreme weather in Brazil and Vietnam, increasing demand, and relatively flat supply. All of these factors, including tariffs, have contributed to coffee prices rising drastically since the beginning of 2025. By fall 2025, commodity coffee bean prices were 40 percent higher than the same time the previous year. In late 2025, fully a quarter of our dozen top-pick coffee subscriptions raised prices by a buck or two a bag.
Advertisement
This year has been kinder. While still up considerably since 2024, coffee commodity prices seem to have stabilized a bit after a small bipartisan delegation of lawmakers introduced a bill that would specifically exempt coffee from tariffs. In November 2025, most of the largest coffee tariffs were rolled back by presidential decree. The Supreme Court then rolled back all tariffs in February, and nixed a presidential attempt to unilaterally instate another round of 10 percent tariffs, raising the specter of tariff refunds.
But lingering effects remain, and it’s not clear coffee prices have gone back down after last year’s hikes. This is true especially because many roasters absorbed higher costs for a number of months before hiking consumer prices. The best I can say is that none of my top coffee subscription picks raised prices in 2026.
Subscriptions can absorb high coffee commodity prices in part by selecting which beans get sent. In many cases, subscriptions are able to charge less than the individual bags you see at the supermarket, because of guaranteed sales (kinda the same way subscribing to a magazine costs less than buying at the newsstand.)
There are so many coffee subscriptions out there, and honestly, a lot of them are very good coffee. Some are even amazing coffee. This list would need to be three times as long to capture every one of them at the least. I have way more subscriptions I’ve loved than I have space to talk about them, so here I’ve gathered some past picks that we here at WIRED like; some of these provide very specific services too. Have a favorite we haven’t tried? Send an email to [email protected]
Gento Coffee for $48 for two bags: Gento is part of a new and welcome trend: growers who roast their own coffee and ship directly from the source. In Guatemala, Gento takes this a step further, roasting beans from other local growers that rank among the most esteemed bean farms in the world. But in this case, the beans might only travel down the road to be roasted. The single-origin subscription is really the play, here. Alongside roasts from Gento’s own beans from the Prentice family farm, you might find roasts from esteemed Guatemalan growers like Genaro Juarez and Patrona Perez. If those names don’t mean anything to you yet, they will after you try them.
Advertisement
Photograph: Matthew Korfhage
Camber Coffee for $20+ per 12-ounce bag: Bellingham, Washington, roaster Camber Coffee slipped under the radar for me for maybe too long—but amid its 10-year anniversary celebrations, I finally remedied this. Camber makes distinguished, aromatic, balanced single-origin coffees and a truly chocolatey espresso blend called Big Joy that lives up to its name: it’s like a fudge brownie in espresso form. Subscriptions net you 10 percent off list price on each bag.
Sunday Coffee Project for $27 per box ($45 for two): Portland’s Sunday Coffee Project is a roaster without a café, a fun art project, and a home to some of the most distinctive, funky, fruity, interesting coffee I know in this country. This could be a yeast-fermented Thai light roast that tastes a whole lot like Sangria, or an Ethiopian so floral you’ll swear you got invited to a spring wedding. Plus, your coffee comes in a little art box, designed to look like a coffee-themed children’s cereal complete with games on the back and a little cartoon character on the front: maybe a sheep lifting weights or a snake playing tennis. It’s a wee roaster, and they’ve dialed back their offerings from weekly new roasts to monthly new roasts. But if you like light and adventurous coffee, a box from Sunday Coffee Project may be your favorite thing you get in the mail that month.
Advertisement
Courtesy of Trucup
Trücup for $17 per 12-ounce bag: Are you sensitive to the acids in coffee, but you love coffee? Trücup makes unique, low-acid coffee through what it calls a natural steam process, which makes it a great option for caffeine lovers with sensitive stomachs or those who suffer from gastroesophageal reflux disease or heartburn. (Standard disclaimer: If you’ve been diagnosed with GERD, talk to your physician before you try any coffee.) Either way, even those with well-fortified stomachs may want to take note. WIRED Reviewer Scott Gilbertson loves this coffee for a more mellow cup in the afternoon or evening.
Grounds and Hounds for $19: We’ve recommended this as a top pick in the past, for its mix of feel-good donations to animal shelters and excellent roasts. Grounds and Hounds offers small-batch roasted blends and single-origin beans, with 20 percent of its profits going to benefit animal shelters. The brand has some of WIRED reviewer Scott Gilbertson’s personal favorite coffees, especially the dark roasts. (Try the Snow Day Winter Roast when it’s available.) Subscriptions are mostly recurring, individual-bag subscriptions.
Advertisement
Wonderstate Coffee for $19 to $21 per 10.5 ounce bag: Wisconsin’s Wonderstate, previously named Kickapoo, is quite possibly the nation’s first fully solar-powered roaster—and has a long and vocal commitment to providing higher pay to farmers. It’s also a quite excellent roaster. The most recent batch of single origins I tried had a tendency toward light, subtle, mild-mannered, and lightly tannic brews—a cosmopolitan palate that’s also Midwestern-polite.
Photograph: Scott Gilbertson
French Truck Coffee for $18 to $22 a bag: French Truck Coffee got its start in New Orleans and now has a dozen of its signature yellow storefronts scattered around town. WIRED operations manager Scott Gilbertson is a fan of the Big River blend, which has a deep, rich, and very robust flavor profile that’s especially well-suited to pour-over brewing. In fact, French Truck has some of the most detailed brewing instructions around.
Birds & Beans Coffee for $18+ a bag: Like birds? Clear-cut coffee farms can be hard on them. But Birds & Beans is a coffee roaster devoted to making sure its coffee is grown in Smithsonian-certified, bird-friendly farms with tree cover that helps birds thrive. The dark roasts in particular are delicious and genuinely dark: Scarlet Tanager is a favorite of WIRED operations manager Scott Gilbertson.
Advertisement
Stone Creek Coffee for $40 (two bags): Milwaukee-based Stone Creek Coffee delivers its fresh, flavorful coffee in big 1-pound bags, with a variety of blends and single-origin options available. The Cream City blend in particular is a delightful medium roast with some warmer flavor notes like chocolate and brown sugar rounded out by some fruity flavors, according to former WIRED coffee writer Jaina Grey, giving the coffee an almost cacao nib flavor. Add a little milk and it’s almost like drinking hot cocoa. A monthly subscription delivers two bags a shipment.
Grit Coffee for $17+ a bag: From its roastery in Charlottesville, Virginia, Grit Coffee roasts up excellent blends, including the roasty, chocolatey Side Hustle, with a subtle high note of acidity to balance it out. But what really differentiates Grit from other roasters is grit: The roaster makes long-term, often 10-year commitments to its coffee farmers.
Photograph: Jaina Grey; Getty Images
Lady Falcon for $49 (two bags): Lady Falcon Coffee Club may draw you in with the art nouveau-style bags. But the luscious, velvety coffee within is what will keep you coming back, according to former WIRED reviewer Jaina Grey. Each coffee blend is thoughtfully mixed to heighten the flavors present in the contributing coffees, and the flavor notes are spot-on.
Angels’ Cup for $28 a bag: Angels’ Cup is more like a distance-learning coffee school than a box subscription service, and the Black Box subscription is like a blind coffee tasting from afar. You will learn what you actually like and dislike about coffee, along with some education through the app, roaster’s notes, and notes from fellow tasters.
Mistobox for $20+ a bag: With more than 500 different coffees from 50-plus roasters, Mistobox makes a good gift subscription, especially if you don’t know what kind of coffee to get someone. Somewhere in those 500 choices, your coffee fanatic should find something that makes them happy. One of the most compelling and surprising offerings: Mistobox lets you choose the most you’re willing to pay per shipment, and your selections change accordingly. Delivery frequency can also be customized down to the day. But the company appears to be migrating to a new back end, so we’ll give it a moment before assessing the new website and ordering system.
When you have multiple keyboards installed, you can manage them on iOS by opening Settings, then choosing General > Keyboard > Keyboards. To swap between keyboards you’ve installed, tap and hold the globe icon that appears in the lower left corner of all your keyboards.
On Android, open Settings and choose System > Keyboard > On-screen keyboard to manage your keyboards. To switch between them, tap and hold the globe icon that appears in the lower right corner whenever a keyboard is on screen.
The Best Phone Keyboards to Try
Gboard (Android, iOS) is a good option to start with here. It’s preinstalled by default on Pixel phones, but it’s also an excellent keyboard pick for iPhones and Android phones not made by Google. It’s fast and clean, works really well for GIFs, emoji, and stickers, and supports glide typing (where you swipe over letters to form words rather than tapping on each individual letter).
Then there’s SwiftKey (Android, iOS), which is developed by Microsoft. As you might expect, there’s Copilot AI integration built right in, so if you’re stuck for something to say, you can use generative AI to do your writing for you. SwiftKey will also learn your writing style as you go, meaning autocorrections and suggestions get more accurate over time.
SwiftKey comes with a range of settings to play around with.
Photograph: David Nield
Typewise (Android, iOS) demonstrates how third-party keyboards can be a little out of the ordinary. It offers an unusual layout that makes use of hexagonal letter and character tiles, which Typewise says can seriously speed up your typing. There’s also support for multiple languages, AI integrations, and custom gestures.
You may be familiar with Grammarly from the web and the desktop (and from the recent news about its missteps), but the grammar and spell checker service is also available as a keyboard on iOS and as a keyboard extension on Android. As well as checking on your writing, Grammarly puts AI front and center: You can get writing suggestions from a prompt, for example, or change the tone of an existing message with a couple of taps.
If you’re interested in customization options above everything else, then consider Mister Keyboard for iOS. It’s stacked with ways to tweak the look and layout of your iPhone’s keyboard, and to access features like emoji and the clipboard. Either pick one of the preset themes, or take pixel-by-pixel control over the keyboard.
Mister Keyboard isn’t available for Android, but there is theming support in Futo Keyboard for Android. It also includes smart autocorrect and text editing tools, and prides itself on its privacy. The keyboard app doesn’t ask for permission to connect to the internet, so you know that your keystrokes aren’t being sent anywhere.
The Securities and Exchange Commission has closed its investigation into electric vehicle startup Faraday Future, despite SEC staff on the case recommending an enforcement action last year, TechCrunch has learned.
Four sources familiar with the investigation, who were granted anonymity to speak about the government case, told TechCrunch that the SEC informed the company and people involved in the probe about the closure this past week.
The dismissal of the case comes amid a historic drop in enforcement actions by the SEC, which only initiated four cases against publicly-traded companies in its 2025 fiscal year, a recent report shows. The SEC did not respond to an after-hours request for comment.
The investigation into Faraday Future lasted for nearly four years. The SEC was looking at whether the EV startup made “false and misleading statements” when it went public in a 2021 merger with a special purpose acquisition company (SPAC), and was also probing whether Faraday Future faked the sales of its first electric vehicles in 2023 — a claim that’s been made by at least three former employee whistleblowers.
The financial regulator sent the startup multiple subpoenas, regulatory filings from Faraday Future show. The SEC also took depositions of multiple former employees and executives in 2024 and 2025, three of the people familiar with the case have told TechCrunch.
In July 2025, Faraday Future revealed the SEC had sent the company and multiple executives — including founder Jia Yueting — letters known as “Wells Notices.” The SEC sends Wells Notices when staff working a case have decided to recommend the agency take enforcement action.
Faraday Future does not appear to have formally responded to the Wells Notices sent last year. As recently as February, the company disclosed in regulatory filings that it had not. “The Company and executives plan to engage with the SEC to explain why enforcement action is not warranted,” Faraday Future wrote in one such filing last month. A company spokesperson said Sunday that Faraday Future would share more information later in the day.
The Department of Justice also sent Faraday Future requests for information after the SEC opened its investigation in 2022. Faraday Future has referred to this as an “investigation” in regulatory filings; the DOJ has never confirmed if it opened a full probe, and it did not respond to an after-hours request for comment.
It is rare for the SEC to not pursue an enforcement action after sending a Wells Notice. One study done at the Wharton School in 2020 showed that around 85% of targets who receive a Wells Notice wind up in court with the SEC.
The SEC investigated nearly every electric vehicle startup that went public in a SPAC merger over the last six years. In almost all of those cases, the agency reached a settlement with the startups. It dismissed an investigation into Lucid Motors in 2023, and as TechCrunch first reported in February, the SEC ended a probe into bankrupt EV startup Fisker late last year.
Origins of the investigation
Faraday Future was founded in California in 2014 by Jia, a businessman who at the time was running a booming tech conglomerate in China known as LeEco. It was one of many new companies trying to become the “next Tesla” or, optimistically, a “Tesla killer.”
Faraday snapped up talent from Tesla, other automakers, and tech companies like Apple, at one point employing around 1,400 people. But things got bumpy quickly. The company turned heads, in both good and bad ways, at the 2016 Consumer Electronics Show, with a flashy concept car and the lofty goal of being as disruptive as the iPhone.
The company revealed its first vehicle the following year: a luxury electric SUV called the FF91. By the end of 2017, though, the company was nearly out of cash and had laid off or furloughed hundreds of workers. Jia’s company in China had collapsed, and he self-exiled to California as the government in his home country placed him on a debtor blacklist. (It was at this time that a close business associate of Jeffrey Epstein pitched the sex criminal on investing in Faraday Future, as well as other EV startups, as TechCrunch recently revealed. Epstein never invested.)
Faraday Future was rescued by an investment from major Chinese real estate conglomerate Evergrande. But that relationship fell apart quickly, too, with Evergrande walking away by the end of 2018 and Faraday Future laying off even more employees.
Jia nominally stepped aside as CEO in 2019 and also filed for personal bankruptcy to settle billions of dollars of LeEco debt he had personally guaranteed. But behind the scenes, he was still largely in charge of the company.
This became an issue when Faraday Future went public in 2021 and raised about $1 billion. Members of the newly-appointed public company board believed that Faraday’s executives had misrepresented Jia’s control over the day-to-day operations — especially after a short seller report was published that scrutinized Faraday Future — and formed a special committee to investigate.
That committee hired an outside law firm and a forensic accounting firm, and within the first few months it started reporting its findings directly to the SEC, the three people familiar with the investigation told TechCrunch.
Between January and April 2022, Jia was sidelined as a result of the board’s investigation, a senior VP named Matthias Aydt (who is now co-CEO with Jia) was placed on probation for six months, and another VP named Jerry Wang (who is Jia’s nephew) was suspended. (Wang ultimately resigned after “failure to cooperate with the investigation,” according to company filings, but is now back with Faraday Future.)
The committee’s work also showed that Faraday Future had, in the two years before it went public, survived in part on multi-million-dollar loans made to the company by low-level employees with connections to Jia — known as “related party transactions” in legal parlance.
On March 31, 2022, Faraday Future disclosed that the SEC had opened its investigation. The startup revealed the requests for information from the DOJ in June.
Dodging another bullet
Through the rest of 2022, and amid the early stages of the SEC investigation, employees and people close to Jia waged a campaign to regain control of the board and his company. This eventually resulted in death threats against some directors, who ultimately resigned, paving the way for people close to Jia to run the company once more.
Faraday Future finally delivered the first few FF91 SUVs in early 2023. Former employees have sued the company alleging that these were not true sales, and that the company had misled investors. The SEC investigators working the case subpoenaed Faraday Future about issues related to these sales, filings show.
Former executives and employees were initially deposed by the SEC in 2024, according to the people familiar with the investigation. The SEC sat some of them for longer depositions in the first half of 2025, the people said.
The Wells Notice sent in July 2025 said SEC staff had made “a preliminary determination to recommend that the Commission file an enforcement action against the Company alleging violations of various anti-fraud provisions of the federal securities laws.”
Specifically, the Wells Notice referenced “purported false or misleading statements” made during the SPAC merger process about “related party transactions” and Jia’s “role in the Company.” Jia, his nephew Wang, and two other unnamed employees also received Wells Notices.
Faraday Future is still trying to sell the FF91, but it has also recently changed its business in a few ways. The company is importing more affordable hybrid and electric vans from China. It also appears to be selling re-badged versions of Chinese robots, and turned a publicly-traded biotechnology company into a firm focused on crypto.
Those efforts have not stopped the company’s struggles. On Friday, the company announced it had received a warning from the Nasdaq that its stock price had fallen below the $1 minimum, which could eventually lead to the company being delisted.
Research underway at UW Medicine’s Institute for Protein Design. (GeekWire Photo / Lisa Stiffler)
Nobel Laureate David Baker will lead a new University of Washington initiative that’s launching with $7 million to develop designer enzymes and proteins to solve challenges in medicine, technology and sustainability.
The funding comes from the Washington Research Foundation (WRF), a nonprofit organization based in Seattle that supports research and entrepreneurship in the state. WRF granted nearly $200,000 last year to the UW Institute for Protein Design (IPD), which Baker leads, to create a plan for the new program.
The goal of the four-year initiative is to accelerate IPD’s work by educating new scientists, translating discoveries into commercially viable tools and supporting the launch of startups.
Baker received the 2024 Nobel Prize in Chemistry for using AI and machine learning to create never-before-seen proteins. He is the director of the IPD and a UW biochemistry professor.
Enzymes, which are a category of proteins, have key applications in global industries such as pharma, agriculture, energy and manufacturing. They’re able to orchestrate the transformation of molecules and dramatically speed up essential chemical reactions.
“With AI, we can now design these molecules from scratch, tailored precisely to the task at hand,” Baker said in a statement. “This grant from WRF will help us push this technology further and train a new generation of scientists to bring designed enzymes from the computer to the lab to the market.”
IPD has already launched more than 10 startups, including PvP Biologics, acquired by Takeda; Icosavax, acquired by AstraZeneca; A-Alpha Bio; and Neoleukin Therapeutics.
The project will start receiving grant dollars from the Washington Research Foundation around June of this year. Further support is coming from philanthropist and former Citigroup CEO Sanford Weill; the Fund for Science and Technology, which is part of Microsoft co-founder Paul Allen’s philanthropies; and the IPD Breakthrough Fund. The UW is providing IPD with additional office and lab space in Seattle’s South Lake Union as part of the initiative.
The grant comes from the foundation’s BioInnovation Grants program, which launched last year and has funded three additional efforts to advance Washington state’s life sciences sector. The program has committed more than $32 million across five institutions.
The result of the mod is less a novelty case mod and more a proof of concept for what a hybrid Xbox/PC box could look like in practice, arriving months before Microsoft’s own Project Helix promises official support for PC titles on next-gen Xbox hardware.