The Tools We Trusted Are Due for a Reckoning
This post continues a thread I've been pulling on for a few weeks now — how AI is reshaping not just how we write code, but the entire ecosystem built around software development. This time I want to zoom out further, because I think the implications go well beyond engineering teams.
I've been in and around software development long enough to have watched whole categories of tools go from essential to vestigial. I remember when source control was optional. When "continuous integration" sounded like a research project. When the idea that a SaaS product could manage your entire sales pipeline in a browser felt like science fiction.
Each of those transitions had a moment — usually quiet, usually underestimated — where the old way of doing things stopped being the obvious choice.
I think we're in several of those moments simultaneously right now. And I keep coming back to three things: how we run development teams, which tools we actually need, and whether the SaaS contracts we've been paying for are still the best answer to the problems they solve.
Agile Solved the Wrong Problem
Let me be direct about something that I think the industry is going to spend the next two years dancing around: Agile, as we practice it, was designed to manage human-speed code production. Story points. Velocity. Sprint burndown. These are all tools for estimating and measuring how fast humans can write software.
What happens when that stops being the bottleneck?
I had a conversation with a friend who leads an engineering team at a mid-size fintech. A few months ago, his team shipped a feature in four days that he'd budgeted an entire sprint for. Not because they worked nights. Because they leaned into AI-assisted development and the constraint — which had always been the act of writing code — just wasn't there anymore. The planning artifact, the estimate, the velocity chart: all instantly outdated.
His problem wasn't delivery. His problem was that his stakeholders didn't trust a four-day timeline for something that had historically taken three weeks. The process had created an expectation, and now the process was getting in the way.
That's the paradox I keep seeing. Agile gave us a language for managing uncertainty in human-speed delivery. But when delivery compresses, the uncertainty doesn't disappear — it relocates. It moves from "can we build this?" to "should we build this?" and "is what we built actually correct?"
Sprint ceremonies, done well, were always about communication and alignment. Those needs don't go away. But the cadence and the currency both need to change. The estimate as a negotiated unit of effort starts to feel dishonest when effort is no longer the scarce resource. Judgment is.
DORA Is Not the Problem
I want to be careful here, because I've seen a version of this conversation go sideways fast: none of this means we stop measuring things.
DORA metrics — deployment frequency, lead time for changes, mean time to recovery, change failure rate — are still genuinely useful signals of delivery health. If anything, they matter more in an AI-accelerated environment, because the cost of building the wrong thing quickly is higher than the cost of building the right thing slowly.
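To make that concrete, here's a minimal sketch of what two of those signals might look like in code, assuming deployment records in a simple in-memory shape; the record fields here are illustrative assumptions, not any particular platform's API.

```python
# Minimal sketch: computing two DORA signals from deployment records.
# The Deployment shape and its fields are illustrative assumptions,
# not any real delivery platform's API.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Deployment:
    deployed_at: datetime
    caused_incident: bool  # did this change trigger a production failure?

def deployment_frequency(deploys: list[Deployment], window_days: int = 30) -> float:
    """Average deployments per day over the trailing window."""
    cutoff = datetime.now() - timedelta(days=window_days)
    recent = [d for d in deploys if d.deployed_at >= cutoff]
    return len(recent) / window_days

def change_failure_rate(deploys: list[Deployment]) -> float:
    """Fraction of deployments that caused a production failure."""
    if not deploys:
        return 0.0
    return sum(d.caused_incident for d in deploys) / len(deploys)
```

Notice that nothing in those calculations cares who, or what, wrote the code. The signals stay meaningful as authorship shifts to AI.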
What DORA doesn't tell you is whether your team has the judgment to steer AI toward value. It tells you how fast and safely you're shipping. It doesn't tell you what you're shipping or why.
The question I'd be asking if I were leading an engineering organization right now isn't "how do I measure velocity in an AI-assisted world?" It's "what does good judgment look like, and how do I build a team where that's the thing I'm hiring and developing for?"
Specification quality — the ability to articulate what needs to be built and why, precisely enough that AI can execute on it usefully — is going to be a core engineering competency. Right now most teams treat it as a nice-to-have. That's going to flip.
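What might that look like? One hypothetical shape: behavior pinned down with executable examples before anyone, human or AI, writes the production logic. The function name and the refund rules below are invented for illustration.

```python
def prorate_refund(charge: float, days_used: int, trial_days: int = 14,
                   cycle_days: int = 30) -> float:
    """A refund policy specified precisely enough to hand off.

    Cancelling within the trial refunds the full charge:
    >>> prorate_refund(30.00, days_used=10)
    30.0

    After the trial, refund the unused share of the billing cycle:
    >>> prorate_refund(30.00, days_used=20)
    10.0

    A fully used cycle refunds nothing:
    >>> prorate_refund(30.00, days_used=30)
    0.0
    """
    if days_used <= trial_days:
        return charge
    return round(charge * (cycle_days - days_used) / cycle_days, 2)
```

Run it with `python -m doctest` and the specification checks itself. The point isn't this particular style; it's that ambiguity gets resolved before execution, which is exactly where the leverage now is.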
JetBrains Didn't Do Anything Wrong
I've been a JetBrains user for years. IntelliJ in particular is one of the most thoughtfully built pieces of software I've used. The people who made it understood something important: the IDE isn't just a text editor, it's a cognitive prosthetic. It extends your ability to hold complexity in mind, navigate a large codebase, catch problems early.
That framing — IDE as cognitive prosthetic — is precisely the thing that's under pressure right now.
The traditional IDE was designed around a specific model: a human reads code, reasons about code, writes code, debugs code. Everything in the IDE was built to make those steps faster and less error-prone. Code completion. Static analysis. Refactoring tools. Debuggers.
Here's what I've noticed in my own workflow over the past several months: I spend less time in those features than I used to. Not because they got worse. Because the nature of what I'm doing has shifted. More of my time goes to reviewing, evaluating, and steering, and less to the details of authoring. The altitude has changed.
That's not JetBrains' fault. They built an exceptional tool for the job as it was defined. But the job is being redefined, and the question now is whether the IDE can reinvent itself around the new shape of the work.
The tools that survive this will be the ones that reorient around the human-AI collaboration model rather than the human-as-author model. Not "help me write code" but "help me and an AI build software together, with me focused on what I'm actually good at." That's a genuinely different product. And the native AI development environments being built from scratch around this model are moving fast.
JetBrains has the engineering talent and the user trust to make this transition. But so did a lot of companies that didn't. The ones that thrive in platform shifts are the ones willing to cannibalize their own products before someone else does.
The Deal We Made with SaaS
There's an implicit contract at the center of every enterprise SaaS relationship. It goes something like: we will build software that solves a general version of your problem. You will adapt your processes to fit our software. In return, you get reliability, continuous improvement, and the comfort of knowing you're not alone.
Salesforce is the apex expression of that deal. I know companies that have built their entire go-to-market motion around how Salesforce thinks about pipelines, stages, and opportunity management. They didn't just buy software — they adopted a worldview.
The problem with that deal is that it was always a compromise. General solutions require general processes. Customization was possible, but expensive. Real bespoke behavior meant certified partners, development sprints, and change management projects that made even simple workflow adjustments feel like construction.
Here's the thing: the barrier to building custom software was never imagination. Businesses have always known what they actually need. A CRM that reflects the specific shape of their sales process, not a generalized one. Support tooling organized around their customer segments. Reporting that surfaces the metrics their business actually runs on.
For a long time, the cost of building those things was prohibitive enough that the SaaS compromise made sense. I used to tell startup founders: don't build internal tools until you absolutely have to. Buy, don't build. Use your engineering capacity for your core product.
I'm not sure that advice holds anymore.
When a business can build a working custom CRM in weeks — not quarters — and maintain it with a fraction of the headcount the legacy vendor required, the value proposition of the general solution changes fundamentally.
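To give a sense of the scale involved: the core of a bespoke CRM is often a small, almost boring data model plus the workflow rules that generalized platforms make expensive. A sketch, with a schema invented for illustration rather than taken from any real product:

```python
# A deliberately small sketch of a bespoke CRM core. The schema and the
# workflow rule are invented for illustration, not modeled on any product.
from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    # Stages named for this business's actual process, not a vendor's worldview.
    INTRO_CALL = "intro_call"
    PILOT = "pilot"
    SECURITY_REVIEW = "security_review"
    CONTRACT = "contract"
    CLOSED = "closed"

@dataclass
class Account:
    name: str
    segment: str  # whatever segmentation the business actually runs on

@dataclass
class Opportunity:
    account: Account
    stage: Stage = Stage.INTRO_CALL
    notes: list[str] = field(default_factory=list)

    def advance(self, to: Stage) -> None:
        # The kind of bespoke rule that costs a change-management project
        # on a generalized platform: deals can't skip the security review.
        if self.stage is Stage.PILOT and to is Stage.CONTRACT:
            raise ValueError("deals must pass security review before contract")
        self.stage = to
```

The hard part was never typing this out. It's deciding that these are the right stages and rules, and owning them as they change. That's the judgment work, and it doesn't go away.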
This Doesn't Mean Salesforce Is Dead
I want to be precise here, because "Salesforce is dead" is a take that's easy to write and almost certainly wrong.
The enterprises that will remain Salesforce customers are the ones where the complexity is genuine: global compliance requirements, deep integration dependencies, data governance at a scale that makes custom builds legitimately hard. That market isn't going anywhere.
What does change is the mid-market — which has long been the growth engine for most enterprise SaaS companies. That segment is going to start making different calculations. Some of those companies are going to realize that the thing they were paying $200k a year for can be rebuilt in a form that actually fits the way they work, by a much smaller team, in a much shorter timeframe.
The SaaS companies whose value proposition was "we built the tool so you didn't have to" are the exposed ones; that reason to exist is getting weaker. The ones that survive will be offering what AI genuinely can't replicate: trust infrastructure (security, compliance, audit trails baked in), meaningful network effects (data that becomes more valuable the more organizations contribute), and platform depth that's genuinely hard to reassemble from scratch.
If your moat is "we did the hard work of building this," you need a new moat.
The Common Thread
Here's what strikes me about all three of these shifts: they're all versions of the same thing.
AI doesn't just make the old ways faster. It moves the constraint.
In development process, the constraint moves from execution to judgment. In developer tooling, the constraint moves from authoring to evaluating. In enterprise SaaS, the constraint moves from building software to deciding what to build and being willing to own it.
Every system we built — our processes, our tools, our vendor relationships — was optimized for where the constraint used to live. The organizations that are going to lead the next decade are the ones that can honestly assess where the constraint is now, and rebuild around that.
That's harder than it sounds. It means letting go of frameworks and tools and contracts that have been reliable for years. It means developing capabilities — judgment, specification quality, software ownership — that weren't traditionally core to how most teams operated.
But it also means that a lot of things that used to require massive resources, scale, and patience are suddenly within reach. The bespoke CRM. The development process that actually fits how your team works. The tooling built for how you develop software now, not how you developed it ten years ago.
That's not a threat. That's an opening. The question is whether you're positioned to walk through it.

