Why “how long will this take?” is the wrong question to ask

Jess Wolfe, Customer Success Manager · Feb 6, 2026

You’re in a leadership sync, and someone from the C-suite asks a straightforward question:

When will the self-service product be ready?

All eyes turn to you. Marketing needs a date for the launch, sales has already set expectations with prospects, and now you’re expected to give an answer.

So you offer to circle back with a firmer answer and go to your team with what feels like a logical question:

How long will this take?

What follows is the all-too-familiar estimate dance — engineers pad the tasks to account for unknowns, product managers push back to keep timelines reasonable, and someone suggests breaking it into story points as a compromise.

Eventually, you land on a number that everyone knows is approximate, but it’s good enough to report up.

You can probably guess what’s going to happen on or around that date: the feature hasn’t shipped, and the business is disappointed in engineering, again. (This is one of many reasons we don’t set deadlines at Swarmia.) Your team didn’t give bad estimates; it’s just that no one stopped to ask whether they were ready to estimate at all.

Engineering organizations lean on output measures (velocity, story points, hours logged) when they’re trying to make plans feel predictable. The problem is that these numbers get treated as commitments rather than signals, especially when plans don’t work out.

Stop handing out blank checks

When you ask, “How long will this take?” without any constraints, you’re handing your team a blank check.

You’re saying: here’s the full vision, including every edge case and future requirement we can think of — now tell me exactly when it will be finished.

That combination (unbounded scope paired with a demand for certainty) makes accurate estimation hard. Teams can increase estimates to protect themselves from unknowns, which makes them look slow. Or they can offer optimistic timelines that fall apart at the first hiccup. Most teams have been burned by both.

As one product leader once put it to me:

If you give me a blank check, I can do everything — and it’ll take forever.

You see this in organizations everywhere. A product manager arrives with a real customer need and asks how long the work will take. There’s no clarity on budget, no shared understanding of what “good enough” looks like, and no prioritization between must-haves and nice-to-haves. The team is left to infer scope, anticipate technical complexity, and account for risks they don’t yet understand, all while producing a timeline that won’t come back to bite them later.

Until you decide what you’re actually building, any timeline you produce is a best guess.

Flip the question

So instead of asking, “How long will this take?” start by introducing constraints and asking a different kind of question.

For example:

  • If we had four weeks, what would be the most valuable thing we could ship?
  • With this budget, what could we build that would meaningfully reduce uncertainty?
  • What’s the smallest version of this idea that would let us see whether it works?

When time or money is fixed and scope becomes the variable, the quality of the conversation improves. Engineers can talk about tradeoffs and technical risk. Product managers are pushed to make prioritization decisions explicit. Rather than negotiating over dates, the group is talking about value.

This is where predictability appears — not because the work suddenly becomes simple, but because it’s deliberately bounded.

This doesn’t mean you should never think about timeframes. At Swarmia, we size work deliberately — stories aim for days, tasks aim for hours. But that’s different from starting with “how long will this take?” The distinction is crucial: we constrain the time first (a few days for a story), then shape the scope to fit. We’re designing bounded solutions, not estimating an unbounded problem.

Planning means making bets under uncertainty

Once you introduce constraints, it becomes easier to treat product development planning for what it is: a series of bets made under uncertainty.

Instead of “when will this be done?” you’re asking:

  • What are we trying to learn?
  • How much are we willing to invest to learn it?
  • What would justify continued investment?

Instead of committing to a full build upfront, teams fund experiments: small chunks of work designed to test assumptions. After each one, you make a decision: continue, change direction, or stop entirely.

Uncertainty doesn’t disappear, but it becomes visible and manageable. You’re no longer pretending to know the future, and you’ve got room to learn and adjust as you go.

When leadership wants dates anyway

This approach works well when you have stakeholder buy-in. But what if your VP or board demands a committed date?

Here’s how to navigate those conversations:

When asked “When will SSO be ready?”:

Don’t say: “I’ll get back to you with an estimate.”

Try: “I can give you a confident timeline once we’ve de-risked the technical approach. Can we fund 4 weeks to validate whether this is 2 months of work or 6 months? That investment will give us enough information to commit to a date with confidence.”

When marketing needs a launch date:

Don’t say: “Engineering thinks 3 months, but it could slip.”

Try: “We can commit to having something ready in 3 months. The question is scope. With that timeline, we can support the top 3 enterprise providers. Full coverage of all providers would take 5-6 months. Which matters more to you: timing or comprehensive coverage?”

When you’re told “just give me your best guess”:

Don’t say: “Probably 3 months, but no promises.”

Try: “My best guess is 3 months, but that’s based on assumptions we haven’t validated. I’d much rather give you a confident commitment in 4 weeks after we’ve looked into the technical unknowns. Would you prefer a rough guess now, or a solid commitment soon?”

The goal here is to avoid committing before you have enough information to stand behind your answer.

How to introduce this in your organization

If you’re working in an environment where everyone expects estimates upfront, you can’t flip the entire organization overnight. But you can start with one initiative and demonstrate a better way.

Instead of answering how long something will take, ask what constraints you’re working under. What’s the runway for this initiative? How quickly does the business want to learn whether this idea is viable? What level of investment feels reasonable right now?

These questions clarify the difference between building indefinitely and building something useful within a defined window. Those two paths require different planning approaches, and it’s better to be explicit about which one you’re on.

A simple change looks like this:

  1. Set a constraint, usually time or money
  2. Be explicit about what you’re trying to learn
  3. Decide deliberately whether to continue investing

In larger organizations, this shows up as participatory budgeting, where leaders make clear choices about what they’re willing to bet on in a given quarter. Leaders commit not to delivery at all costs, but to learning enough to make better decisions later.

Build vs. buy conversations get easier too. When investment is capped, it becomes clearer when buying an off-the-shelf solution makes more sense than building something yourself.

The role of breaking work down

I’m not saying predictability is impossible. Prediction does get easier as work gets smaller and more consistently sized. At the level of individual tasks (writing tests, fixing bugs, updating configuration) teams and individuals are usually quite accurate.

Trouble starts when organizations apply that same confidence to initiatives that span months or quarters, where uncertainty compounds quickly. Hidden dependencies, unfamiliar systems, and evolving requirements all make precise estimation difficult.

Breaking work down during sprint planning serves two purposes: it surfaces complexity that wasn’t visible before, and when done consistently, it creates predictable flow. When developers walk through the steps involved, they uncover assumptions. And when stories consistently complete in similar timeframes, that consistency becomes the foundation for forecasting larger efforts.
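To make the forecasting idea concrete, here’s a minimal sketch (not a Swarmia feature, just a common technique) of how consistent weekly story counts can feed a Monte Carlo forecast for a larger effort. The throughput history and backlog size are made-up illustration values:

```python
import random

def forecast_weeks(weekly_throughput, stories_remaining, trials=10_000):
    """Monte Carlo forecast: repeatedly sample past weekly story counts
    until the backlog is exhausted, and report how many weeks it took
    at the 50th and 85th percentiles (a 'likely' and a 'safer' answer).

    Assumes every sampled week completes at least one story; a history
    containing zero-throughput weeks would need guarding against."""
    results = []
    for _ in range(trials):
        remaining, weeks = stories_remaining, 0
        while remaining > 0:
            remaining -= random.choice(weekly_throughput)  # sample a past week
            weeks += 1
        results.append(weeks)
    results.sort()
    return results[trials // 2], results[int(trials * 0.85)]

# Hypothetical team: 3-6 stories finished per week, 40 stories left
likely, safe = forecast_weeks([3, 4, 4, 5, 6, 4, 3, 5], 40)
```

The point of the exercise is the shape of the answer: a range with a confidence attached, not a single date. That range is only as trustworthy as the consistency of the story sizing behind it, which is exactly why breaking work down matters.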

For larger initiatives — like “add semantic search” or “migrate to the new payment processor” — you’re still better off treating the work as a sequence of bounded bets rather than a single plan to be executed. When the bets pay off, they graduate to the roadmap as evidence-backed priorities, rather than hopeful predictions.

Where to start

The next time someone asks you how long something will take, flip the question before passing it on to your team. Ask about constraints, investment, and learning instead of dates.

Start with one initiative and see how the conversation changes when scope is no longer assumed to be unlimited. Notice whether tradeoffs get discussed earlier, whether decisions feel clearer, and whether missed commitments turn into useful signals instead of finger-pointing.

This way of working isn’t always comfortable. It means admitting that you don’t yet know exactly what you’re building — and gaining that certainty through learning, not guessing.

Jess Wolfe
Jess Wolfe is a customer success manager at Swarmia. She brings deep experience in engineering metrics, agile delivery, and product management from her work at Atlassian and beyond.

