If you saw the news about Jack Dorsey restructuring Block around AI last week and felt a little unsettled, I want you to actually read the article before you spiral. Because the headline and the memo are telling two very different stories, and the one worth paying attention to isn't the one that went viral.

Dorsey's argument is that organizational hierarchy has existed for one reason: information routing. As companies grew, they needed layers of management to aggregate updates from below and pass decisions from above. His claim is that AI can now do that automatically, which means the layers become unnecessary. In their place, he proposes two roles — DRIs, who own specific problems and set direction across teams, and player-coaches, who develop people while still doing technical work themselves.

Where the premise breaks down

I want to be fair to the argument before I push back on it. There is information routing in my job. I give my manager updates on where my team stands, what we're blocked on, where we need help getting resources or clearing the path. That's real, and I'm not going to pretend it isn't part of the work. But if I'm being honest about how I actually spend my time, that's maybe 20% of it. The rest is reviewing work and giving feedback, coaching people through problems they haven't encountered before, setting the strategy for what my team builds and why, managing the stakeholder relationships that determine what we prioritize, and navigating the conversations that don't have clean answers. None of that is information routing. That's the actual job.

And here's what I can't get past when I read Dorsey's proposal: everything he's describing a DRI and a player-coach doing is exactly what a good manager already does. DRIs set direction and own cross-functional problems with real authority. Player-coaches develop people while staying technically grounded. He hasn't designed a replacement for management — he's described a new org structure that still requires every skill that makes management effective. He just attached different labels to the roles.

The part of his argument I think deserves the most scrutiny is the idea that AI can handle organizational coordination. Right now, my team is deploying a new attribution model for our marketing org — a more robust methodology that gives us a real read on incrementality and makes our spend optimization significantly better. The analytical case for it is airtight. And we have been navigating internal pushback for months. The biggest blocker hasn't been technical complexity. It's been Finance, who are hesitant to change the way they've been measuring things for years. Getting them aligned has required sustained relationship-building, addressing their concerns one at a time, and earning enough trust that they'd actually agree to change their methodology. That's not something that resolves faster because an AI world model tells everyone it's the right call. The resistance is human. The solution has to be human too.

Dorsey does acknowledge that people will still sit at the edges of his model, handling the things AI can't navigate. But those people have opinions and emotions and incentives, and they're going to push back regardless of what the system recommends. Someone still has to facilitate consensus across those people, know the right way to have that conversation and with whom, and move things forward when the system can't. That's a management function, even if it lives inside a DRI title.

I genuinely think Dorsey believes what he's writing, and I think his model might actually work at Block specifically. A fully remote company where all work generates machine-readable outputs is a different environment than most of us operate in. But for organizations where decisions still get made through relationships, where trust is built in real conversations over time, I don't see this translating.

What this actually means for you

If you're a manager who read that headline and felt your stomach drop this week, I'd actually read Dorsey's article as an argument for doubling down on the skills that make management matter — direction-setting, people development, trust-building, facilitation across competing interests. He named all of those as the things that will survive whatever comes next. He just gave them different job titles.

And more fundamentally, there is no AI that builds trust for you, or develops a junior team member's judgment over the course of a year, or sits in a difficult conversation and navigates it with the nuance the moment requires. You can get advice on how to do those things better. But you still have to do them.

P.S.

A lot of the skills you've built working with AI — breaking down complex problems, iterating quickly, synthesizing information — translate directly into management. But they also break down in specific ways when people are involved. I put together a free guide that maps exactly where that translation works and where it doesn't:
