
Tech

Instacart Promo Code: Save on Groceries in March 2026

Like many others, I first used Instacart during the early days of the pandemic when it was a lifesaver. Literally. As the primary caretaker of my immunocompromised grandmother, I was at a loss for how to do something as simple as feed her without risking dangerous exposure. And although I love delivery from a restaurant, it’s expensive and unhealthy. With Instacart, I was able to get her healthy groceries and favorite comfort foods delivered to us right at home without having to risk exposure.

Even now, post-COVID, Instacart still saves the day for me. I live in Brooklyn, and there are many heavy things I need from the grocery store that I physically can’t haul up and down the subway stairs and throughout the streets. With Instacart, I can have any grocery store of my choosing come to me. Welcome to the future. If you’ve been curious about this grocery delivery service, now is a great time to check it out and save big. If you’re an old stan like me, fear not: we have an Instacart promo code and various deals to slash those grocery bills.

Find the Best Instacart Promo Code for Massive Savings in March 2026

Here’s the best place to find the latest Instacart savings and Instacart discount codes, as we are constantly scouring the web to find things like savings for first-time users, deals on specific items or brands, discounts on Instacart+ memberships, and more. Make sure you check back often; we update these Instacart coupons when we find great new deals and add more when new seasonal deals pop up.

Get Free Delivery on Your First 3 Orders

This deal is so good that I wish I were a new customer. If you’ve never used Instacart, now is a fantastic time to try it out and save massively. Right now, when you sign up for Instacart, you’ll get $0 delivery fees on the first three orders. Depending on your order size and location, that can amount to some serious cash savings. Like other delivery services, you are paying for convenience, and delivery fees can add up fast. To sign up, you’ll just need to enter your email address, set up a profile, and get to shopping! Get free delivery on your first 3 orders today; just make sure you hit that $10 order minimum, and know that some service fees may still apply.


Claim a $50 Instacart Credit by Applying for the Instacart Mastercard

If you use Instacart as the primary way you get groceries, getting an Instacart Mastercard might be a good idea to help you save even more on the purchases you were already going to make. And right now, you’ll get a $50 Instacart credit, which will be automatically loaded to your account once the card has been approved. With your eligible Mastercard, you’ll get three free months of Instacart+, along with $10 off your second order placed each month.

Snag $10 Off Petco Orders for Your Pets

If you’re already an Instacart member, you can still save. Right now, pet parents don’t have to haul heavy bags of kibble or litter to and fro. With this Instacart coupon, you’ll get a $10 discount on orders over $50 from Petco through Instacart. So whether you want to save a trip or save your back, Petco on Instacart can help you get everything from chew toys to treats and litter delivered to your door with a $10 Instacart discount.

Enjoy $10 Off Walgreens Essentials Delivered Fast

In complete honesty, these days I use Instacart mostly when I’m sick or hungover. Instacart has partnered with Walgreens to deliver essentials like Gatorade, Advil, soup, and whatever else your ailments might call for. Plus, you’ll get a $10 discount on orders over $40 placed at Walgreens through Instacart. Go ahead, stay in bed and let the goodies come to you.

Refer a Friend to Shop & Get a $10 Bonus

If you know someone who hasn’t tried Instacart, now’s a great time to spread the love. When an Instacart customer makes a referral, they can get up to $40 in credits to use across their first two orders, and you’ll get a $10 credit for referring them once their delivery is complete. That’s big savings, and it’s easier than ever to refer. You can refer through text, email, or social media using your personalized referral code or link. Plus, there is no limit to the number of users you can refer to Instacart. To start in-app, tap the Account icon in the upper right corner of the app, tap Refer a friend, and share your referral link via contacts, text, email, social media, or other channels. If you do this on the web, click the horizontal lines, click “Invite friends,” and share your referral link or code to secure the Instacart discount.


Activate Unlimited Free Delivery With Instacart+ Membership

Having an Instacart+ membership has tons of benefits, including free delivery on orders over $10, $0 delivery fees on eligible restaurant orders over $25, a free Peacock subscription, and access to New York Times Cooking. You can unlock all of these perks with Instacart+ for $99 per year or $10 per month.


This Is How Trump Is Already Threatening the Midterms


The White House did not respond to a request for comment about the meetings, but an official who was not authorized to speak on the record told WIRED at the time: “The White House does not comment on mysterious meetings with unnamed staffers.”

Simultaneously, Trump has also sought to absolve officials of any wrongdoing in the wake of the 2020 election. Last year, Trump gave “full, complete and unconditional” pardons to a slate of people who had tried, and failed, to help him overturn the 2020 election results. In recent months, Trump has pressured Colorado governor Jared Polis to release Tina Peters, the former county clerk in Mesa County, Colorado, who became a hero for the right’s election deniers when she facilitated a security breach during a software update of her county’s election management system.

Peters was found guilty of four felonies, but Trump has been mounting a campaign in recent months to get her released, going so far as to say he “pardoned” her, even though he has no power to do so given that she was convicted on state charges.

Election Day Interference

While Trump has not announced specific plans to deploy troops to polling locations or seize voting machines, he and his administration have certainly been suggesting that such action is not off the table.


In January, Trump lamented not having the National Guard seize certain voting machines after the 2020 election. In early February, White House press secretary Karoline Leavitt told reporters that while she hasn’t specifically heard Trump discussing the possibility, she couldn’t “guarantee that an ICE agent won’t be around a polling location in November.” (The question was in response to former White House adviser Steve Bannon stating: “We’re going to have ICE surround the polls come November. We’re not going to sit here and allow you to steal the country again … We will never again allow an election to be stolen.”)

Earlier this month, during his confirmation hearing to head up the Department of Homeland Security, Senator Markwayne Mullin said he would be willing to deploy ICE to polling locations to address “a specific threat.”

The result of the Trump administration’s drip feed of threats and dog whistles is that those who are running elections in states across the country are already war-gaming what happens if ICE or the National Guard show up at their voting locations.

Michael McNulty, the policy director at Issue One, a nonprofit that tracks the impact of money in politics, also points to the fact that the Department of Justice sent monitors to oversee elections in November in New Jersey and California, despite no federal elections being held. “The concern is that this could become a massive deployment of, quote unquote, observers by the DOJ in 2026 who might do something more, whether it’s intimidation, whether it’s interfering with local election officials, to get data to confirm conspiracy theories,” McNulty tells WIRED.


FBI Raids

On January 28, the FBI raided the election office in Fulton County, Georgia, executing a search warrant that allowed it to seize ballots, ballot images, tabulator tapes, and the voter rolls related to the 2020 election. The search warrant affidavit, unsealed a few weeks ago, shows that the FBI relied on the work of Kurt Olsen, a lawyer who was appointed by the administration to investigate election security in October and who has a long history of working with some of the country’s biggest election deniers, including Patrick Byrne, Mike Lindell, and Kari Lake. Olsen’s claims are based on debunked and previously investigated conspiracy theories about the 2020 election.

The raid was also notable for the presence of Tulsi Gabbard, the director of national intelligence, who is, according to The Guardian, running a parallel investigation into the 2020 election with the apparent tacit approval of Trump.


Fiber HDMI cables enable full-bandwidth 8K over runs up to 990 feet



The product is an active optical cable (AOC) for HDMI. Instead of relying solely on copper, it carries most of its signal over fiber-optic strands. Inside the cable, HDMI electrical signals are converted into optical signals for the journey between the two ends, then converted back to electrical signals at…

In with a bang, out in silence — the end of the Mac Pro


For almost two decades, the Mac Pro swung from coveted and beloved to derided and forgotten. Now, it’s finally over.

Apple is reportedly pressing the off switch on the Mac Pro

All political careers end in failure, and all devices fade out as they are eventually superseded. Yet this time it’s more that the Mac Pro has been usurped, and possibly even stabbed in the back.
If you’re a Mac Pro fan, you knew this day was coming, and you probably don’t want to believe it. It’s true that the Mac Pro long ago lost its crown as the most powerful Mac, but still, this is the legendary Mac Pro.


Is It Time For Open Source to Start Charging For Access?


“It’s time to charge for access,” argues a new opinion piece at The Register. Begging billion-dollar companies to fund open source projects just isn’t enough, writes long-time tech reporter Steven J. Vaughan-Nichols:


Screw fair. Screw asking for dimes. You can’t live off one-off charity donations… Depending on what people put in a tip jar is no way to fund anything of value… [A]ccording to a 2024 Tidelift maintainer report, 60 percent of open source maintainers are unpaid, and 60 percent have quit or considered quitting, largely due to burnout and lack of compensation. Oh, and of those getting paid, only 26 percent earn more than $1,000 a year for their work. They’d be better paid asking “Would you like fries with that?” at your local McDonald’s…

Some organizations do support maintainers. For example, HeroDevs has a $20 million Open Source Sustainability Fund, whose mission is to pay maintainers of critical, often end-of-life open source components so they can keep shipping patches without burning out. Sentry’s Open Source Pledge/Fund has given hundreds of thousands of dollars per year directly to maintainers of the packages Sentry depends on. Sentry is one of the few vendors that systematically maps its dependency tree and then actually cuts checks to the people maintaining that stack, as opposed to just talking about “giving back.”

Sentry is on to something. We have the Linux Foundation to manage commercial open source projects, the Apache Foundation to oversee its various open source programs, the Open Source Initiative (OSI) to coordinate open source licenses, and many more for various specific projects. It’s time we had an organization with the mission of ensuring that the top programmers and maintainers of valuable open source projects get a cut of the tech billionaire pie.


We must realign how businesses work with open source so that payment is no longer an optional charitable gift but a cost of doing business. To do that, we need an organization to create a viable, supportable path from big business to individual programmer. It’s time for someone to step up and make this happen. Businesses, open source software, and maintainers will all be better off for it.

One possible future… Bruce Perens wrote the original Open Source definition in 1997, and now proposes a not-for-profit corporation developing “the Post Open Collection” of software, distributing its licensing fees to developers while providing services like user support, documentation, hardware-based authentication for developers, and even help with government compliance and lobbying.


From “Hello, World!” to AI: What Skills Actually Prepare Students for the Future?


This article is part of the collection: Teaching Tech: Navigating Learning and AI in the Industrial Revolution.


A little over a decade ago, schools were swept into what many described as a movement to prepare students for the future of work. That work was coding — “Hello, world!”

Districts introduced new courses, nonprofits expanded access to computer science education and a growing ecosystem of programs promised to teach students the skills needed to enter the tech workforce. For many, it felt like a necessary correction to a rapidly digitizing world. But over time, a more complicated picture emerged.

While access to computer science education expanded, the relationship between early coding exposure and long-term workforce outcomes became uneven. The “learn to code” movement raised an important question that still lingers today: Which skills actually endure when technologies change? That question has resurfaced in a new form.


Today, generative AI is driving a similar wave of urgency. Schools are once again being encouraged to adapt quickly, often with the same underlying rationale that teachers must prepare students for a future shaped by emerging technologies.

But if the instructional role of AI remains unclear, and if the tools themselves are likely to evolve rapidly, the more persistent challenge may lie elsewhere.

After conducting a two-year research project alongside teachers who are adapting to and open to integrating AI, we found that uptake is still minimal. Most of our participants, including engineering and computer science teachers, still struggle to identify a clear or universal instructional use case for widespread AI integration.

So, what should students learn to help them adapt to whatever comes next?


A growing body of research suggests that the answer may lie not in teaching students how to use a particular AI system, but in helping them understand the computational ideas that make those systems possible.

The Limits of Teaching the Tool

In recent years, many discussions about AI education have centered on teaching students how to use generative tools effectively. Prompt engineering, for example, has become a common topic in professional development workshops and online tutorials.

Yet focusing heavily on tool-specific skills can create a familiar educational problem: technology changes faster than curricula.

Teaching students how to interact with a specific interface risks becoming the equivalent of teaching to standardized tests, rather than teaching students important lessons that don’t appear on state exams.


The history of computing education offers a useful example. In the early 2010s, a wave of coding initiatives encouraged schools to teach programming skills broadly. While many of those programs expanded access to computer science education, subsequent analysis showed that workforce pipelines in technology remained uneven, and many students learned tool-specific skills without developing deeper computational reasoning abilities.

That experience offers a cautionary lesson for the current AI moment. If the goal of integrating AI into education is long-term preparation for technological change, focusing narrowly on how to use today’s tools may not be the most durable strategy.

The Skill That Outlasts the Tool

A growing body of research suggests that computational thinking is a more durable educational objective.

Computational thinking refers to a set of problem-solving practices used in computer science and other analytical disciplines. These include:

  • breaking complex problems into smaller components
  • recognizing patterns
  • designing step-by-step processes
  • evaluating the outputs of automated systems

These skills apply not only to programming but also to fields ranging from engineering to public policy.
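As a concrete illustration (a hypothetical classroom-style exercise, not one from the research project described here), these practices map naturally onto even a tiny program: decomposing a task into small functions, defining a step-by-step process, and sanity-checking the output rather than trusting it blindly:

```python
# Toy exercise (hypothetical): compute a class's average quiz score.

def parse_scores(raw):
    # Decomposition: isolate the messy input-handling step.
    return [float(s) for s in raw.split(",") if s.strip()]

def average(values):
    # Step-by-step process: a clearly defined procedure.
    return sum(values) / len(values)

def check(result, values):
    # Evaluating outputs: does the answer pass a sanity check?
    return min(values) <= result <= max(values)

raw = "88, 92, 79, 95"
scores = parse_scores(raw)
avg = average(scores)
assert check(avg, scores)
print(avg)  # 88.5
```

The point of such an exercise is not the arithmetic but the habits: each function is a named sub-problem, and the final check models the skepticism students should bring to any automated system's output.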

Importantly, they also help students understand how algorithmic systems operate.

When students learn computational thinking, they gain the ability to analyze how technologies like AI produce results rather than simply accepting those results as authoritative.

In this sense, computational thinking provides a conceptual bridge between traditional academic skills and emerging digital systems.

What Teachers Are Already Doing

Many teachers in our study were already moving in this direction, often without using the term computational thinking.


When teachers asked students to analyze chatbot errors, they were encouraging students to examine how algorithmic systems produce outputs. When they designed exercises comparing training data and algorithms to everyday processes, they were helping students reason about how automated systems work.

These approaches do not require students to rely heavily on AI tools themselves. Instead, they position AI as a case study for examining how technology shapes information.

That framing aligns with longstanding educational goals around critical thinking, media literacy and problem-solving.

Implications for Educators

If the instructional use case for generative AI remains uncertain, educators may benefit from focusing on skills that remain valuable regardless of which tools dominate in the future.


Several practical approaches are already emerging in classrooms. Teachers can use AI systems as objects of analysis, asking students to evaluate outputs, identify errors and investigate how models generate responses.

Lessons can connect AI to broader topics such as data quality, algorithmic bias and information reliability.

Assignments that emphasize reasoning, structured problem solving and evidence evaluation continue to support the kinds of cognitive work that remain central to learning.

These approaches allow students to engage with AI without allowing the technology to replace the thinking process itself.


Implications for EdTech Developers

The experiences teachers described also highlight an opportunity for edtech companies.

Many current AI tools were developed as general-purpose language systems and later introduced into education contexts. As a result, teachers are often left to determine whether and how those tools align with classroom learning goals. Future products may benefit from deeper collaboration with educators during the design process.

Teachers in our conversations were already experimenting with small classroom applications, designing AI literacy lessons and building course-specific chatbots.

These experiments resemble early-stage product development.


Partnerships between educators, edtech developers and product managers could help identify instructional problems that AI systems could realistically address.

The Next Phase of the Research

The conversations described in this series represent an early attempt to document how teachers are navigating the arrival of generative AI.

As schools continue experimenting with these tools, the next challenge will be to develop governance frameworks that help educators evaluate when and how AI should be used in learning environments.

Our research team is beginning the next phase of this work by partnering with school districts to develop guidance for AI governance and inviting edtech companies interested in exploring these questions collaboratively.


Rather than assuming that AI will inevitably transform classrooms, this phase of the project will focus on identifying the conditions under which AI tools actually support teaching and learning and how to reduce harm when they don’t.

The fourth-grade teacher’s question remains a useful guide: What can I actually use this for in math?

Until the answer becomes clearer, many teachers will likely continue doing what professionals in any field do when new technologies appear: experimenting cautiously, adopting what works and relying on their judgment to decide where or if the tool belongs.


If your school, district, organization, or edtech company is interested in learning more about joining our next project on AI governance, contact our research team at research@edsurge.com.


French AI start-up Mistral raises $830m in debt


The Paris-based company is building out ‘cutting-edge’ European data centres with a total capacity ambition of 200MW by 2027.

French AI start-up Mistral has raised $830m in its first debt financing to fund its data centre near Paris.

The company said the deal, supported by a consortium of seven “top-tier” global banks, would pay for Nvidia Grace Blackwell infrastructure with 13,800 Nvidia GB300 GPUs at the “cutting-edge” centre, bringing powered capacity to 44MW.
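As a rough back-of-the-envelope check (our arithmetic, under the assumption that the 44MW figure covers the whole facility, including cooling and networking, rather than the GPUs alone), the quoted numbers imply roughly 3.2kW of powered capacity per GB300:

```python
# Sanity check on the quoted figures: 44MW of powered capacity
# spread across 13,800 Nvidia GB300 GPUs.
site_power_w = 44e6
gpu_count = 13_800

watts_per_gpu = site_power_w / gpu_count
print(f"{watts_per_gpu:.0f} W of facility power per GPU")  # 3188 W
```

That per-GPU budget is facility-level, so it bundles accelerator draw with cooling, networking, and host overhead; it should not be read as the power rating of an individual GPU.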

The data centre at Bruyères-le-Châtel, scheduled to be operational in the first half of this year, was previously earmarked to train AI models belonging to Mistral and its customers, while also “delivering high-performance inference services”, according to the company.


Last month, Mistral said it would spend over $1.4bn in Sweden on digital infrastructure, including a data centre, building towards its stated goal of 200MW of capacity across Europe by 2027.

“Scaling our infrastructure in Europe is critical to empower our customers and to ensure AI innovation and autonomy remain at the heart of Europe,” said Arthur Mensch, CEO of Mistral AI.

“We will continue to invest in this area, given the surging and sustained demand from governments, enterprises and research institutions seeking to build their own customised AI environment, rather than depend on third-party cloud providers.”

Mistral said it is building a “vertically integrated AI company” comprising “frontier open-weight models, deep enterprise integration, production deployments and its own compute infrastructure”.


It counts organisations in the tech, retail, logistics and public sectors among its customers, and has already partnered with the likes of ASML, Ericsson and the European Space Agency to train models on their proprietary data.

Earlier this month, Mistral launched both ‘Small 4’, the newest model in its fully open-source ‘Small’ series with an aim of consolidating capabilities of its flagship models, and ‘Forge’, a platform that lets enterprises build custom models trained on their own data.

Last September, the 2023-founded French AI darling announced a Series C raise of around $2bn at a post-money valuation of more than $13bn, led by Dutch chipmaker ASML. Existing investors DST Global, Andreessen Horowitz, Bpifrance, General Catalyst, Index Ventures, Lightspeed and Nvidia took part.

Although a frontrunner in the European AI space, Mistral is far behind US competitors such as OpenAI and Anthropic in terms of funding levels and valuations.


Mistral is a founding member of the Nvidia Nemotron Coalition. As part of the initiative, Mistral and Nvidia plan to co-develop frontier open-source AI models.



NASA Picks Intuitive Machines for a 2030 Artemis Moon Delivery Loaded with Science Tools and a Human Time Capsule


NASA has awarded Intuitive Machines a $180.4 million contract to deliver seven science payloads to a carefully chosen site near the lunar south pole. The Houston-based company will use one of its larger lander configurations for the mission, designated IM-5, with a target landing date of around 2030 at Mons Malapert. The location was selected for good reason: the ridge maintains a fairly consistent line of sight with Earth, receives relatively steady sunlight, and sits close to permanently shadowed regions that may hold water ice, a resource that could prove critical to sustaining long-term human operations on the Moon.



The lander arrives loaded with instruments ready to start collecting data from the moment it touches down. A stereo camera package developed at NASA’s Langley Research Center, called the Stereo Cameras for Lunar Plume Surface Studies, will capture how the descent engines disturb the fine lunar soil, information that will help engineers design landing systems that cause less disruption to the surface. A near infrared spectrometer mounted on a small rover from Honeybee Robotics, led by NASA’s Ames Research Center, will then scan for minerals and potential ice deposits while also measuring surface temperatures and mapping how the soil composition varies across the landing area.



A mass spectrometer called MSolo, built at NASA’s Kennedy Space Center, will analyze gases present at the landing site immediately after touchdown, focusing on lightweight molecules that could prove useful for future lunar explorers. Radiation monitoring is handled by a set of four detectors developed by the Korea Astronomy and Space Science Institute, measuring surface exposure levels to assess risks for both equipment and future crew while also providing insight into the geological history of the surrounding area.


A set of small sensors aboard the Australian Space Agency’s Roo-ver will track how landing plumes interact with surface materials across varying distances over time, part of NASA Goddard Space Flight Center’s Multifunctional Nanosensor Platform. The Roo-ver will also demonstrate its ability to navigate and move independently across uneven lunar terrain. A Laser Retroreflector Array, also out of Goddard, rounds out the payload with a compact set of mirrors designed to bounce laser signals back to orbiting spacecraft, improving navigation accuracy for future missions passing overhead or coming in to land nearby and helping establish reliable reference points across the lunar surface.



Rounding out the cargo is Sanctuary on the Moon, a time capsule developed in France containing information about human civilization, science, technology, culture, and the human genome, etched onto 24 durable synthetic sapphire discs. It is built to last, and designed to be found.


Google’s new compression drastically shrinks AI memory use while quietly speeding up performance across demanding workloads and modern hardware environments



  • Google TurboQuant reduces memory strain while maintaining accuracy across demanding workloads
  • Vector compression reaches new efficiency levels without additional training requirements
  • Key-value cache bottlenecks remain central to AI system performance limits

Large language models (LLMs) depend heavily on internal memory structures that store intermediate data for rapid reuse during processing.

One of the most critical components is the key-value cache, described as a “high-speed digital cheat sheet” that avoids repeated computation.
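To make the bottleneck concrete, here is a minimal sketch of single-head attention with a key-value cache (our illustration in NumPy, not Google's TurboQuant code; the weight names and toy dimensions are assumptions for the demo). Each new token's key and value are computed once and stored, so the cache grows linearly with sequence length, which is exactly the memory that compression schemes target:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # head dimension (toy size)

# Toy projection weights for a single attention head.
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class KVCache:
    """Stores keys/values for already-processed tokens so each
    token's K/V projection is computed exactly once."""
    def __init__(self):
        self.keys, self.values = [], []

    def step(self, x):
        # x: embedding of the newest token, shape (d,)
        q = x @ Wq
        self.keys.append(x @ Wk)    # cached, never recomputed
        self.values.append(x @ Wv)
        K = np.stack(self.keys)     # (t, d) -- grows with every token
        V = np.stack(self.values)
        attn = softmax(K @ q / np.sqrt(d))
        return attn @ V             # attention output for this token

cache = KVCache()
tokens = rng.standard_normal((5, d))
outputs = [cache.step(t) for t in tokens]
# After 5 steps the cache holds 5 K/V pairs. At real model scale
# (thousands of tokens, many layers and heads, 16-bit floats) this
# cache dominates inference memory -- hence quantizing it pays off.
```

Quantization schemes like the one described above shrink those stored K/V tensors to fewer bits per entry, trading a small amount of precision for a large cut in memory traffic.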


Apple’s Early Days: Massive Oral History Shares Stories About Young Wozniak and Jobs


Apple’s 50th anniversary is this week — and Fast Company’s Harry McCracken just published an 11,000-word oral history with some fun stories from Apple’s earliest days and the long and winding road to its very first home computers:


Steve Wozniak, cofounder, Apple: I told my dad when I was in high school, “I’m going to own a computer someday.” My dad said, “It costs as much as a house.” And I sat there at the table — I remember right where we were sitting — and I said, “I’ll live in an apartment.” I was going to have a computer if it was ever possible. I didn’t need a house.

Woz even remembers trying to build a home computer early on with a teenaged Steve Jobs and Bill Fernandez from rejected parts procured from local electronics companies. Woz designed it — “not from anybody else’s design or from a manual. And Fernandez was one of those kids that could use a soldering iron.”

Bill Fernandez: The computer was very basic. It was working, and we were starting to talk about how we could hook a teletype up to it. Mrs. Wozniak called a reporter from the San Jose Mercury, and he came over with a photographer. We set up the computer on the floor of Steve Wozniak’s bedroom.

Well, the core integrated circuit that ran the power supply that I built was an old reject part. We turned on the computer, and the power supply smoked and burnt out the circuitry. So we didn’t get our photos in the paper with an article about the boy geniuses.

But within a few years Jobs and Wozniak both wound up with jobs at local tech companies. Atari cofounder Nolan Bushnell remembers that Steve Jobs “wasn’t a good engineer, but he was a great technician. He was pristine in his ability to solder, which was actually important in those days.” Meanwhile Allen Baum had shared Wozniak’s high school interest in computers, and later got Woz a job working at Hewlett-Packard — where employees were allowed to use stockroom parts for private projects. (“When he needed some parts, even if we didn’t have them, I could order them.”) Baum helped with the Apple I and II, and joined Apple a decade later.


Wozniak remembers being inspired to build that first Apple I by the local Homebrew Computing Club, people “talking about great things that would happen to society, that we would be able to communicate like we never did [before] and educate in new ways. And being a geek would be important and have value.” And once he’d built his first computer, “I wanted these people to help create the revolution. And so I passed out my designs with no copyright notices — public domain, open source, everything. A couple of other people in the club did build it.”

But Woz and Jobs had even tried pitching the computer as a Hewlett-Packard product, Woz remembers:

Steve Wozniak: I showed them what it would cost and how it would work and what it could do with my little demos. They had all the engineering people and the marketing people, and they turned me down. That was the first of five turndowns from Hewlett-Packard. Steve Jobs and I had to go into business on our own.

In the end, Randy Wigginton, Apple employee No. 6, remembers witnessing Jobs, Wozniak, and Ronald Wayne sign Apple’s founding contract, “which is pretty funny, because I was 15 at the time.” And it was Allen Baum’s father who gave Wozniak and Jobs the bridge loan to buy the parts they’d need for their first 500 computers.

After all the memories, the article concludes that “Trying to connect every dot between Apple, the tiny, dirt-poor 1970s startup, and Apple, the $3.7 trillion 21st-century global colossus, is impossible.”

But this much is clear: The company has always been at its best when its original quirky humanity and willingness to be an outlier shine through.

Mark Johnson, Apple employee No. 13: I was in Cupertino just yesterday. It’s totally different. They own Cupertino now.

Jonathan Rotenberg, who cofounded the Boston Computer Society in 1977 at age 13: People want to hate Apple, because it is big and powerful. But Apple has an underlying moral purpose that is immensely deep and expansive…

Mike Markkula, the early retiree from Intel whose guidance and money turned the garage startup into a company: The culture mattered. People were there for the right reasons — to build something transformative — not just to make money. That alignment produced extraordinary results…

Steve Wozniak: Everything you do in life should have some element of joy in it. Even your work should have an element of joy… When you’re about to die, you have certain memories. And for me, it’s not going to be Apple going public or Apple being huge and all that. It’s really going to be stories from the period when humble people spotted something that was interesting and followed it.

I’ll be thinking of that when I die, along with a lot of pranks I played. The important things.

Kandou AI raises $225M from SoftBank and Synopsys to solve AI’s memory wall

Kandou AI, a Swiss semiconductor company that builds chip-to-chip interconnect technology, has raised $225 million in what it calls a Series A round, led by Maverick Silicon with strategic participation from SoftBank, Synopsys, Cadence Design Systems, and Alchip Technologies. The round values the company at $400 million. The label is worth pausing on: Kandou was founded in 2011 and previously raised more than $163 million across Series B and C rounds under the name Kandou Bus. The “Series A” designation reflects a rebrand and leadership change, not a fresh start.

The company’s new chief executive, Srujan Linga, a former Goldman Sachs managing director, took over in 2025 from founder Amin Shokrollahi, an EPFL professor of mathematics and computer science who invented the core technology. Shokrollahi’s contribution, a signalling method called Chord that sends correlated signals across multiple wires to increase bandwidth by a factor of two to four while halving power consumption, remains the technical foundation. The rebrand to Kandou AI and the repositioning toward artificial intelligence infrastructure is Linga’s doing, and it appears to have worked: the $225 million raise is the largest in the company’s history and brings SoftBank, one of the most aggressive AI infrastructure investors, onto the cap table.
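The idea of sending correlated signals across multiple wires can be illustrated with a toy model. The sketch below maps 3 bits onto 4 wires using the rows of a Hadamard matrix, so that 4 wires carry 3 bits instead of the 2 bits that per-pair differential signalling would allow. This is an illustrative model of the general chordal-coding concept, not Kandou’s actual Chord implementation, and the specific basis rows and normalisation are assumptions for the example.

```python
# Toy sketch of Hadamard-based "chordal" signalling: 3 bits on 4 wires,
# versus 2 bits for conventional differential pairs on the same 4 wires.
# Illustrative only -- NOT Kandou's actual Chord encoding.

ROWS = [  # orthogonal Hadamard basis rows, excluding the all-ones row
    (1, -1, 1, -1),
    (1, 1, -1, -1),
    (1, -1, -1, 1),
]

def encode(bits):
    """Map 3 bits to 4 correlated wire levels (signed sum of basis rows)."""
    assert len(bits) == 3 and all(b in (0, 1) for b in bits)
    signs = [1 if b else -1 for b in bits]
    return tuple(
        sum(s * row[i] for s, row in zip(signs, ROWS)) / 3 for i in range(4)
    )

def decode(wires):
    """Recover each bit by correlating the wire levels with its basis row."""
    return [
        1 if sum(w * r for w, r in zip(wires, row)) > 0 else 0 for row in ROWS
    ]

# Every 3-bit pattern round-trips, and the wire levels always sum to zero
# (a balanced code), which is what distinguishes a chord code from simple
# per-pair differential signalling.
for n in range(8):
    bits = [(n >> k) & 1 for k in range(3)]
    assert decode(encode(bits)) == bits
```

Because the basis rows are orthogonal, each bit can be recovered independently even though every wire carries a mixture of all three bits.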

The bet against light

What makes Kandou AI’s position unusual is not the problem it is trying to solve but the material it proposes to solve it with. The AI industry’s interconnect bottleneck is real and well documented. As models scale to hundreds of billions of parameters and training clusters expand to tens of thousands of GPUs, the speed at which data moves between processors and memory has become the binding constraint on performance. At signalling speeds of 224 gigabits per second, traditional copper interconnects consume roughly 30 per cent of total cluster power, with signal degradation so severe that reach is limited to less than a metre without amplification.

The prevailing industry response has been to move to optics. Ayar Labs raised $500 million in March 2026 at a $3.8 billion valuation for its co-packaged optical interconnects. Marvell completed a $3.25 billion acquisition of Celestial AI in February, buying photonic fabric technology that claims 25 times the bandwidth of copper alternatives at a tenth of the latency. The optical interconnect market for AI data centres is projected to grow from $3.75 billion in 2025 to $18.36 billion by 2033.

Kandou AI is betting that copper is not finished. Its Chord signalling technology, the company claims, can operate close to the Shannon capacity of the channel, reducing power consumption and system costs by a factor of ten while extending copper links to 448 gigabits per second and beyond. If that claim holds, it would mean that the billions being spent on optical interconnect transitions are at least partially premature, and that existing copper infrastructure can be made to work for several more hardware generations at a fraction of the cost.
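The Shannon limit referenced here is concrete: for an AWGN channel, capacity is C = B · log2(1 + SNR). The numbers below (60 GHz of usable copper bandwidth, ~23 dB SNR) are illustrative assumptions, not measured channel data, but they show the regime in which a 448 Gb/s copper link is information-theoretically possible.

```python
import math

def shannon_capacity_gbps(bandwidth_ghz, snr_db):
    """Shannon limit C = B * log2(1 + SNR) for an AWGN channel."""
    snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_ghz * math.log2(1 + snr)

# Illustrative: a copper channel with ~60 GHz of usable bandwidth and
# ~23 dB SNR has a Shannon limit of roughly 459 Gb/s -- above the
# 448 Gb/s target, but with little margin to spare.
print(round(shannon_capacity_gbps(60, 23), 1))
```

The point of the arithmetic is that "close to Shannon capacity" is a strong claim: at these speeds the gap between practical copper links and the theoretical ceiling is small, which is exactly why most of the industry is reaching for optics.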

The strategic investors tell the story

The composition of the investor syndicate matters more than the headline figure. Synopsys and Cadence are the two dominant providers of electronic design automation tools. Their participation is not purely financial; it signals potential integration of Kandou AI’s serialiser/deserialiser intellectual property into the design flows that chip architects use to build processors and memory controllers. Alchip, a Taiwanese ASIC design services company, provides a path to manufacturing. SoftBank, which has invested more than $100 billion in AI-adjacent companies through its Vision Fund and direct investments, adds the scale capital and the strategic network.

The practical implication is that Kandou AI’s technology could appear inside chips designed by other companies rather than requiring customers to adopt Kandou’s own silicon. This is a licensing and IP model, similar in structure to Arm’s approach in mobile processors, and it is a more capital-efficient path to market dominance than manufacturing and selling chips directly. Whether Kandou can execute on that model with a $400 million valuation and $225 million in fresh capital, against optical competitors valued at ten times as much, is the central question.

The valuation gap

At $400 million, Kandou AI is valued at roughly a tenth of Ayar Labs and an eighth of what Marvell paid for Celestial AI. That gap could reflect market scepticism about copper’s longevity in AI infrastructure, or it could reflect the fact that Kandou’s technology, if it works as claimed, does not require the industry to rip out its existing wiring. Copper is already in every data centre. If Kandou’s signalling technology can make it fast enough for another generation of AI workloads, the adoption curve would be faster and cheaper than an optical transition.
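The ratios quoted above check out, as a quick calculation shows (figures taken from the article itself):

```python
# Sanity check of the valuation ratios quoted above.
kandou_val = 400e6        # Kandou AI's valuation in this round
ayar_val = 3.8e9          # Ayar Labs' March 2026 valuation
celestial_price = 3.25e9  # Marvell's acquisition price for Celestial AI

print(ayar_val / kandou_val)         # 9.5  -> "roughly a tenth"
print(celestial_price / kandou_val)  # 8.125 -> "an eighth"
```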

The risk is that “another generation” may not be long enough. AI model sizes and training cluster scales are growing at a pace that consistently outstrips infrastructure predictions. What is adequate at 448 gigabits per second today may be inadequate at the terabit-per-second speeds that next-generation models will demand within two to three years. Optical interconnects, for all their cost and complexity, offer a higher theoretical ceiling.

Kandou AI’s $225 million buys it time to prove that the ceiling can wait. The company’s 15-year history and the technical credibility of Chord signalling, which has been deployed commercially in consumer electronics since the mid-2010s, lend substance to the bet. But the AI infrastructure market has a pattern of rewarding ambition over incrementalism, and a company arguing that the existing material is good enough faces a harder narrative sell than one promising to replace it entirely. The investors on this round appear to be betting on engineering pragmatism. Whether the market agrees will depend on how quickly the optical transition matures, and whether Kandou’s copper can keep pace with an industry that has shown little interest in waiting for anything.
