Building the Second Web
Plus: making gasoline from air, metals from myths, zippers from buttons, new builds from bricks, and more in Roundup #43
Hello fellow aspiring cognoscenti,
I recently happened upon the word “cognoscenti.” It’s the name of a website section from the public radio station run by my old university. I looked up the meaning and loved it. Plus, it is fun to say.
This issue leads with another AI-centric piece.
We pick up from the last issue, fueled by recent announcements from a range of developer conferences (including events my agency leads for brands like Microsoft and Confluent).
So, after spending a few weeks deep in developer, techie, nerdy playgrounds, I’m going a bit deeper into the emerging AI tech stack.
Then we roll into a round-up that starts with actual alchemy and then moves through lots of intrigue — and, after that, there are a few other fun bits to explore.
Oh, and I am on Team Writes with Emdashes.
Have been for many, many years. Even though emdashes are now a sign of AI-first writing, I’m sticking with them in my Actual Human writing — even when I use them in grammatically incorrect ways.
Okay, now — let’s dig in.
Building the Second Web
A parallel internet optimized for AI agents is taking shape, and the tools to build it have finally arrived.
In the last piece, we explored how AI agents will change brand engagement by redefining B2B as a new mess of Brand-to-Bot, Bot-to-Brand, and Bot-to-Bot. This piece picks up where that left off.
As AI agents take on more tasks—discovery, filtering, decision-making—they need infrastructure built for them. And, with that need, the web itself is splitting:
The visible web, still built for humans.
The invisible one, now emerging, built for bots.
And during this season of developer conferences—Microsoft Build, Google I/O, Confluent Current, and others—the tools and platforms (and acronym soups) are dropping.
A New Stack for a New Web
At Microsoft Build, we saw the foundation of this new web taking shape around two major components:
Natural Language Web (NLWeb): A new protocol that gives agents access to semantic, structured content on websites. Instead of parsing HTML designed for humans, agents will interpret meaning natively.
Model Context Protocol (MCP): A system that lets agents securely interact with apps, APIs, and data sources—pulling in context from subscription services, drives, and other connected systems and enabling personalized, agent-initiated actions across the web.
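To make the NLWeb idea a little more concrete, here is a minimal sketch, assuming a site that already publishes schema.org-style JSON-LD alongside its human-facing HTML (the URL and fields below are hypothetical, and this is plain Python rather than any official NLWeb or MCP client): instead of scraping markup designed for eyeballs, an agent reads the machine-readable description directly.

# Illustrative sketch only (not an official NLWeb or MCP client):
# an agent reads a page's structured, schema.org-style JSON-LD description
# instead of parsing the HTML meant for human readers.
import json
import re
import urllib.request

def fetch_structured_description(url: str) -> list[dict]:
    """Return every JSON-LD block embedded in the page at `url`."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Pull out the <script type="application/ld+json"> blocks -- the
    # machine-readable layer many sites already ship next to their markup.
    blocks = re.findall(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        html,
        flags=re.DOTALL | re.IGNORECASE,
    )
    return [json.loads(block) for block in blocks]

if __name__ == "__main__":
    # Hypothetical product page; any site publishing JSON-LD would work.
    for item in fetch_structured_description("https://example.com/products/espresso-machine"):
        print(item.get("@type"), "-", item.get("name"))

The idea behind a full NLWeb or MCP integration is that the site exposes that same semantic layer through a defined endpoint agents can query directly, rather than making them dig it out of the page.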
At Google I/O, the company announced developer initiatives including Project Astra, its next-gen AI interface, and a growing integration of LearnLM to help agents learn from user behavior.
Together, these protocols and initiatives are framing what we could call the Agentic Stack—and that stack is the foundation the big tech companies need to win the AI integration race.
Watch the MAPs
Now that the agentic web is coming into focus, it’s time for brands to start releasing their experiments and explorations—and with that we’re entering the age of the MAP: Minimum Agentic Product.
Just as the MVP was the buzzy phrase when brands explored their first “web 2.0 websites” that tapped into APIs, “mobile apps” that played with social graphs, and “web 3.0 apps” that did whatever they were trying to do (looking at you, Nike .SWOOSH), MAPs will become a brand’s first playground for agentic actions.
The first MAPs are already emerging in the places where agents need access but struggle with today’s web:
Booking and scheduling for events, restaurants, travel, and more
Product discovery and purchasing, like the recent AI commerce push from Visa, Mastercard, and PayPal
Service comparisons for phone plans, insurance, SaaS products, and more
Knowledge interfaces for agent-to-agent attempts to resolve customer issues before escalating to humans
And, of course, there will be marketing stunts, brand mashups, and April Fool's pranks. Because, because.
Things We Humans Can Try Now
While much of the agentic web is still being built, we can already try early prototypes.
Google’s Project Mariner is a new experiment in human-agent interaction inside Chrome.
Google’s first examples include:
Finding personalized jobs: using information from a resume to find personalized job listings
Hiring a tasker to build furniture: navigating to an email inbox, finding a recent furniture order, and then going to TaskRabbit to find a tasker who can help assemble the item
Ordering missing ingredients: looking through Google Drive to find a family recipe, noting which ingredients the user is missing, and navigating to Instacart to purchase them
Joining Google is Opera with Neon, a new “fully agentic” browser that can surf the web for us.
And both of them (and many other MAPs that emerge every day) are joining the agentic browser OG: OpenAI’s Operator. (It is the “OG” because it was released waaay back in January.)
(Re)Building an Open Web
Last thought.
This agentic web is still forming. And yes, many of its standards and protocols are being shaped by the usual tech giants. But there’s a chance to get it right this time.
As Anil Dash recently wrote, this shift is a chance to reimagine what we once tried to build with Web 2.0.
His piece—“Web 2.0 2.0”—outlines how the open, creative, human-first ideals of early social platforms were lost to walled gardens.
This time, with agents doing the navigating, we need a new kind of openness. One that prioritizes the interoperability, accessibility, and structure we tried to build 20 years ago with Web 2.0 ideals.
Because for the new B2B to meet its glorious potential, that future must live in an open, agentic web.
The Roundup
Almost Alchemy
Gasoline from Air,
this new machine (not a prototype, but an actual working device!) runs on renewable electricity and produces gasoline that’s fully compatible with existing engines, requiring no modifications
Non-Newtonian Motorcycle Helmets,
(aka where I learned what a “non-Newtonian substance” is)
Olo,
a new color that has been discovered (or created, depending on your perspective on the science here)
Gold from Lead,
this type of alchemy was made real—sure, only for a split second, and in quantities trillions of times smaller than would be needed to make a piece of jewelry—but scientists made it happen
Seeing More
Gold-in-Eye,
injecting gold into the eye has been shown to restore vision for those with retinal damage
Autofocus Glasses,
these glasses, a world first, intelligently adapt to their wearer’s eye movements, automatically focusing to help them see more and see sharply
Augmented Reading Lenses,
these lenses see the content the user is reading and activate the related visuals and sounds
Speed It Up
Intelligent Speed Assist,
prevents a car from exceeding the speed limit — coming to a US state near you
Self-propelled Zippers,
just push the button on a wireless remote
Answer Engine Optimization (AEO),
an emerging subset of SEO that focuses on improving the content AI pulls from and links to when answering a question
LegoGPT,
this new system from researchers at Carnegie Mellon University not only designs Lego models that match prompts but also ensures they can be built in the real world
Behavior Mods
Virtual Weddings,
more couples are choosing digital ceremonies hosted in the virtual spaces where their relationships first blossomed—like Minecraft
Armored Medieval Combat,
aka AMMA, it’s not LARPing (Live Action Roleplay) or reenactment; it’s a contact sport fought with weapons—swords, axes, maces, spears, and shields—made of real metal, like those forged in the Middle Ages
Proteinification,
every aisle of the grocery store has seemingly become the protein aisle
Unbossing,
Gen Z professionals are intentionally steering clear of traditional management roles to prioritize their mental health and well-being
Future Movie Plots (Informed by Real Life)
A Special Add-on Roundup
Over the years, The New New has rounded up a range of “straight from a science fiction movie”, “horror stories start like this”, or “plot of a future James Bond movie” entries (looking at you, vaccines via mosquitoes and shape-morphing robots). For this issue, let’s give them their own section…
Mysterious Bacteria Not Found on Earth Are Growing on China’s Space Station
Alarming Spy Device Can Read Text in an Open Book From Nearly a Mile Away
Good luck sleeping, all.
Because It’s Beautiful
…and gets our mind off the things I put in that section above.
A Restaurant That Looks Like a Sketch
In Japanese, Shirokuro means “black and white.”
At Shirokuro — a new omakase restaurant in NYC — the entire dining experience unfolds inside what looks like a 2D drawing.
Every surface is hand-painted in black and white lines, creating the illusion that guests have stepped inside a living sketchbook.
It’s part manga, part mind-bending stage set.
Beautifully All Together.
Stacks on Stacks
Racing Back to a Different AI-centric Stack
In Issue 34, I mapped the quiet but consequential race playing out across Big Tech:
the push to vertically integrate AI across the full tech stack.
The idea is simple: in order to deliver a truly personal AI — one that works across your devices, understands your habits, and anticipates your needs — tech companies need to control every layer.
That means more than just apps or assistants. It requires alignment across productivity tools, operating systems, personal devices, home computing, the cloud, and more. That issue set up and dug into the AI stack in full.
Meanwhile, the roundup captured emerging signals from other corners of innovation:
Out of the Labs: Cement-as-battery tech, gel-based semi-solid-state batteries, electric bandages, and mosquito detection lasers
On the Internet: AI slop, social slams, and the rise of “Digitine” — the digital guillotine
In the Real World: Social prescribing for well-being, drone-based memorials, and subtle shifts in work culture (“busy bragging”)
In Style: Mobility wearables, hybrid heels, and AI-assisted outfits born in your Notes app
There was a lot to explore.
Click in, scroll through, and see how much of the future had already started forming:
Issue 34 → The AI Integration Race
The New New’s mission is to fuel foresight. Every issue delivers a curated view into the discoveries, launches, trends, and movements shaping tomorrow—all explored through broad landscapes, from labs and studios to businesses and culture.
Each month(ish), this is pulled together by me, Brent Turner, and published on LinkedIn, Substack, and my site.
Okay, I'm off to have AI help me rewrite Naughty By Nature’s OPP to be about MCPs.
- B
⌘
PS: for the search crawlers and AI bots, the piece on Building the Second Web was originally published over here.