Apps just for you

In the future, we’ll all have highly personal apps that handle our very specific tasks, professional and personal.

These won’t be off-the-shelf tools like Gemini, Claude, or ChatGPT.

They will be tools that we build using these off-the-shelf subscriptions.

Why is it important to start automating repetitive, time-wasting tasks right now, knowing that these models will only improve?

Let me use an analogy:

If you are a lucky homeowner, you know the difference between the bones of your house and a remodel job.

The bones are what you buy. The ability to remodel is the ability to give your home a facelift every time $50,000 is burning a hole in your pocket.

Given unlimited resources, I’m sure no homeowner would mind a regular remodel: an easy way to keep their home fresh for decades.

Well, that’s how AI works.

You use the tools available today to build the bones of your app. You set up the databases, the structures, and the concepts behind the personal apps that automate your life and business. You accept that these apps are imperfect works in progress.

Here’s the beautiful part: each time a new model drops, with greater capabilities, you get to have it go through your codebase and improve it, fix errors, and apply a new coat of paint. In this way, your house of code gets better with every release, often in one fell swoop.

I’ve built over 20 apps at this point, and as soon as Anthropic announced Claude Opus 4.7, the first thing I did was have it run through my codebase to update, refactor, and polish. This is how my apps are constantly improving, becoming more functional and useful over time.

But you can’t remodel a house that doesn’t exist. You can’t improve on an app that you haven’t built.

So today is the day to build the foundation of your house. And trust that you will be able to remodel it perpetually over the years. Just think what kind of palace you’ll have five years from now if you start today.

Earth Day

I spoke recently about the profound lessons the Artemis II crew learned from seeing our tiny, fragile planet at a distance.

I alluded to the idea that their revelation is nothing new.

To prove my point, I went back to the Apollo archives.

Jim Lovell, from Apollo 8: “One of the most fascinating parts of space flight is the observation of the Earth…

…the problems everybody has appear to be smaller… It’s hard to imagine why people cannot live more peacefully with one another.”

Every single person who views our tiny planet from a distance is made keenly aware of how valuable and rare it is.

And to bring another perspective into it, from a book I’m reading that I’ve thoroughly enjoyed:

“Remarkably, we [humans] are even quite closely related to fruit and vegetables. About half the chemical functions that take place in a banana are fundamentally the same as the chemical functions that take place in you.

It cannot be said too often: all life is one. That is, and I suspect will forever prove to be, the most profound true statement there is.”

— Bill Bryson, A Short History of Nearly Everything

So on this day, take a moment to ask whether your work, your mission, your company, your leadership is bringing us closer to the realization that we are all one—that we all must peacefully coexist here on this planet, our only home, or whether you’re working to take us further away from these ideals.

Focusing in the age of AI

When TV had three channels, we all shared the same context.

We watched the same shows. Laughed at the same late-night host. Bought the same toothpaste. And the next morning at the water cooler, everyone had something to talk about because everyone had watched the same thing.

That ended when the internet fractured us into a billion content silos.

Right now, it's almost impossible that you and I are reading the same books. Watching the same shows. Using the same apps. Our feeds are so personalized that two people sitting next to each other on the subway might as well be living in different centuries.

But something strange is happening. For the first time in a long time, we're all reading the same book again. Except it’s not a book; it’s AI.

AI is the water cooler moment right now for our entire species.

It's all anyone is talking about — at work, at dinner, at conferences, in group chats, in boardrooms, in classrooms. And honestly? It's a really good book. It's equal parts Stephen King, Isaac Asimov, Sigmund Freud, and Jackson Pollock. It's thrilling and unsettling and deeply personal and nobody can quite figure out what they're looking at. And none of us knows how this book ends. It’s a real nail-biter!

Just like The Da Vinci Code, you don't have to read this book. But everyone else is reading it. And the conversations are happening whether you're in them or not.

Now — what if we're wrong?

What if AI is all hype? What if it's not lasting technology? What if it turns out to be a complete dead end, a flashy chapter in human history that leads absolutely nowhere, like my pog collection collecting dust in my parents’ basement? The most expensive Tamagotchi of all time?

Well, then it just turns out to be a bad book. Or worse — a great book with a terrible ending, which is the worst kind of book.

But when have you ever regretted reading a book?

Even the bad books taught you something. Even the ones you abandoned halfway through changed how you thought about the next one you picked up. Everything you've ever learned has only made you better at learning the next thing.

Nobody is going to look back and regret paying attention here. The people who will have regrets are the ones who heard everyone talking about the most important book of their generation and said "I'll wait for the movie."

In my opinion? Don’t wait for the movie. This is one book you have to read for yourself.

"Planet Earth: You Are a Crew"

Twenty-four hours after seeing Earth as a tiny speck in an endless black abyss, the Artemis II crew gave their first interview. And I was riveted. Their message was not subtle: “It’s a special thing to be a human, and it’s a special thing to be on planet Earth.”

Each member reiterated how rare and special our planet is, and while constantly hugging each other, they sought to remind us that from space, there are no boundaries, countries, or races.

You could tell their lives had been irrevocably changed, and that they were struggling to find the words to convey the importance to us of what they’d just witnessed.

The Overview Effect

Astronauts have been saying the same thing since the space program began: When you see our pale blue dot from a distance, you fully comprehend how rare it is. The magic isn’t out there somewhere. And it’s not on Mars. It’s right here, on this planet that we barely understand. At this moment, 99%+ of our deep oceans are completely unexplored. We have no clue what’s going on even just a few miles below our feet. Our ignorance of our own planet is boundless. And yet we seek to conquer others…

The Overview Effect is the name for the spiritual sense of one-ness that astronauts feel when they see Earth from a distance. And astronauts have been practically screaming at us to get the message: TAKE CARE OF THIS PLANET! STOP FIGHTING LIKE YOU AREN’T ALL THE SAME. WE ARE ALL IN THIS TOGETHER!

And like any profound message on YouTube, the comments tell the story. The trip was fake. It’s the Democrats’ fault. If only Israel had done… Yuck.

In the face of one of humanity’s most powerful messages, the hate machine of social media can instantly trivialize it, making way for divisive, tribal bickering.

That’s why it’s our job—our job—to always remain focused on the big picture, and on the truth. Artemis II crew, I heard you. And I love you for it.

“For now, while we breathe and are among our fellow humans, let's cherish the qualities that make us human.”

— Seneca, Anger, Mercy, Revenge

The message has been the same for thousands of years: We need to live here and now on this planet. This is our lifeboat. This is our ship. We're the only crew we'll ever have.

Leadership advice has been the same for 2,500 years

I just finished former Navy SEAL Jocko Willink's book Extreme Ownership.

It’s a great book about taking radical responsibility for our actions. Like many leadership principles, it only works if the leader follows the program too. A leader who expects extreme ownership from their subordinates without truly following the wisdom themselves is a tyrant.

While reading, I couldn't stop thinking about how Extreme Ownership is essentially the same book as How to Win Friends and Influence People, just with wildly different packaging.

It’s also the same book as Meditations by Marcus Aurelius. And the Tao Te Ching. And Ben Franklin’s Poor Richard's Almanack. And Seneca's letters to Lucilius two thousand years ago.

The core message is always the same: Take radical responsibility for your own actions. Listen more than you speak. Meet people where they are, not where you wish they were.

Jocko says "there are no bad teams, only bad leaders." Carnegie said "any fool can criticize, condemn, and complain — and most fools do." Seneca wrote "If you live according to nature, you will never be poor; if you live according to opinion, you will never be rich." Lao Tzu taught that "the wise leader does not push; he lets things happen." Marcus Aurelius reminded himself daily: "you have power over your mind, not outside events."

It’s always the same story, isn’t it?

Jocko's version is for people who respond to discipline and direct orders from someone whose voice is hoarse from decades of screaming. Carnegie's is for people who respond to warmth and social grace. The Stoics wrote for people who respond to philosophical reflection. Lao Tzu wrote for people who resonate with paradox and gentle stillness.

The wisdom hasn't changed in 2,500 years. The packaging changes because the audience changes.

And this is exactly what I see happening in AI right now.

On my podcast, dozens of tech founders have echoed the same core idea about implementing AI: automate what's repeatable so humans can focus on what requires judgment.

I, for example, have learned that my time is best spent judging which of several objects is, in fact, cake.

But one version of this timeless story is wrapped in bro hype and rocket emojis. Another is dripping with fear and "your job is disappearing" mortal terror.

Same medicine. Different bottles.

The question isn't whether you have the right AI strategy. It's whether you can make the people who need to execute it actually understand what you're asking them to do and why they should do it. And you need to be intellectually honest enough to know and admit why you’re really doing what you’re doing.

The chief problem of our time isn’t one of technology: it’s one of translation, accountability, and communication.

And it's been the same problem for 2,500 years.