How long deals take to close, where they stall, and what separates won deals from lost ones. Generated by AI via Proxuma Power BI MCP server.
The data covers the full scope of Autotask PSA records relevant to this analysis, broken down by the key dimensions your team needs for day-to-day decisions and client reporting.
Who should use this: Sales leads, MSP owners, and account managers tracking pipeline health
How often: Weekly for pipeline reviews, monthly for forecasting, quarterly for strategy
```dax
EVALUATE ROW(
    "AvgDaysToCloseMeasure", [Conversion - Avg Days to Close],
    "WonCount", CALCULATE(COUNTROWS('BI_Autotask_Opportunities'), 'BI_Autotask_Opportunities'[status_name] IN {"Closed","Implemented"}),
    "WonAvgDays", AVERAGEX(FILTER('BI_Autotask_Opportunities', 'BI_Autotask_Opportunities'[status_name] IN {"Closed","Implemented"} && NOT(ISBLANK('BI_Autotask_Opportunities'[closed_date])) && NOT(ISBLANK('BI_Autotask_Opportunities'[create_date]))), DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[closed_date], DAY)),
    "LostAvgDays", AVERAGEX(FILTER('BI_Autotask_Opportunities', 'BI_Autotask_Opportunities'[status_name] = "Lost" && NOT(ISBLANK('BI_Autotask_Opportunities'[lost_date])) && NOT(ISBLANK('BI_Autotask_Opportunities'[create_date]))), DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[lost_date], DAY)),
    "ActiveAvgAge", AVERAGEX(FILTER('BI_Autotask_Opportunities', 'BI_Autotask_Opportunities'[status_name] = "Active" && NOT(ISBLANK('BI_Autotask_Opportunities'[create_date]))), DATEDIFF('BI_Autotask_Opportunities'[create_date], TODAY(), DAY))
)
```
Average days from creation to close, broken down by how the opportunity ended
| Cycle Bucket | Deals | Revenue | Avg Days |
|---|---|---|---|
| <= 7 days | 481 | $1,337,013.51 | 1.9 |
| 8–30 days | 179 | $982,317.92 | 15.6 |
| 31–60 days | 40 | $442,641.70 | 42.6 |
| 61–90 days | 25 | $418,524.63 | 72.0 |
| 91–180 days | 42 | $297,369.26 | 125.6 |
| 180+ days | 60 | $465,312.02 | 248.3 |
```dax
EVALUATE
GROUPBY(
    ADDCOLUMNS(
        FILTER('BI_Autotask_Opportunities',
            'BI_Autotask_Opportunities'[status_name] IN {"Closed","Implemented"}
                && NOT(ISBLANK('BI_Autotask_Opportunities'[closed_date]))
                && NOT(ISBLANK('BI_Autotask_Opportunities'[create_date]))),
        "DaysToClose", DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[closed_date], DAY),
        "Bucket",
            SWITCH(TRUE(),
                DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[closed_date], DAY) <= 7, "1. <=7 days",
                DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[closed_date], DAY) <= 30, "2. 8-30 days",
                DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[closed_date], DAY) <= 60, "3. 31-60 days",
                DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[closed_date], DAY) <= 90, "4. 61-90 days",
                DATEDIFF('BI_Autotask_Opportunities'[create_date], 'BI_Autotask_Opportunities'[closed_date], DAY) <= 180, "5. 91-180 days",
                "6. 180+ days"
            )
    ),
    [Bucket],
    "Deals", COUNTX(CURRENTGROUP(), 'BI_Autotask_Opportunities'[opportunity_id]),
    "Revenue", SUMX(CURRENTGROUP(), 'BI_Autotask_Opportunities'[amount]),
    "AvgDays", AVERAGEX(CURRENTGROUP(), [DaysToClose])
)
ORDER BY [Bucket]
```
Opportunity distribution by stage (Dutch stage names from Autotask), ranked by deal count
| # | Stage | Deals | Total Value | Avg Deal | Share |
|---|---|---|---|---|---|
| 1 | Getekend, verwerkt naar ticket | 606 | $1.84M | $3,036 | 42.9% |
| 2 | Offerte | 530 | $7.16M | $13,509 | 37.5% |
| 3 | Getekend, verwerkt naar project | 191 | $1.97M | $10,314 | 13.5% |
| 4 | Offerte verstuurd | 46 | $3.06M | $66,522 | 3.3% |
| 5 | Offerte maken | 38 | $762K | $20,053 | 2.7% |
```dax
EVALUATE
ADDCOLUMNS(
    SUMMARIZE(
        BI_Autotask_Opportunities,
        BI_Autotask_Opportunities[stage_name]
    ),
    "DealCount", CALCULATE(COUNTROWS(BI_Autotask_Opportunities)),
    "TotalValue", CALCULATE(SUM(BI_Autotask_Opportunities[amount])),
    "AvgDeal", CALCULATE(AVERAGE(BI_Autotask_Opportunities[amount])),
    "Share", DIVIDE(
        CALCULATE(COUNTROWS(BI_Autotask_Opportunities)),
        COUNTROWS(BI_Autotask_Opportunities))
)
ORDER BY [DealCount] DESC
```
Side-by-side comparison of won and lost deals across key metrics
| Metric | Won | Lost | Difference |
|---|---|---|---|
| Deal count | 720 | 509 | +211 won |
| Avg days to close | 65 | 104 | Lost deals take 60% longer |
| Total value | $3.44M | $7.06M | 2x more value lost than won |
| Avg deal size | $4,778 | $13,872 | Lost deals are ~2.9x larger |
```dax
EVALUATE
GROUPBY(
    ADDCOLUMNS(
        -- Won = "Closed"/"Implemented", consistent with the queries above;
        -- won deals use closed_date, lost deals use lost_date
        FILTER('BI_Autotask_Opportunities',
            'BI_Autotask_Opportunities'[status_name] IN {"Closed","Implemented","Lost"}),
        "Outcome",
            IF('BI_Autotask_Opportunities'[status_name] = "Lost", "Lost", "Won"),
        "DaysToClose",
            VAR EndDate =
                IF('BI_Autotask_Opportunities'[status_name] = "Lost",
                   'BI_Autotask_Opportunities'[lost_date],
                   'BI_Autotask_Opportunities'[closed_date])
            RETURN
                IF(NOT(ISBLANK(EndDate)) && NOT(ISBLANK('BI_Autotask_Opportunities'[create_date])),
                   DATEDIFF('BI_Autotask_Opportunities'[create_date], EndDate, DAY))
    ),
    [Outcome],
    "DealCount", COUNTX(CURRENTGROUP(), 'BI_Autotask_Opportunities'[opportunity_id]),
    "AvgDaysToClose", AVERAGEX(CURRENTGROUP(), [DaysToClose]),
    "TotalValue", SUMX(CURRENTGROUP(), 'BI_Autotask_Opportunities'[amount]),
    "AvgDealSize", AVERAGEX(CURRENTGROUP(), 'BI_Autotask_Opportunities'[amount])
)
```
How fast value moves through your pipeline, and where the bottlenecks are
The “Offerte verstuurd” stage holds $3.06M across just 46 deals, giving it the highest average deal size in the pipeline at $66,522. These are proposals that have been sent but not signed. They represent the biggest concentration of stalled revenue. The “Offerte maken” stage has another 38 deals worth $762K that have not even been sent yet.
Together, these two pre-decision stages hold $3.82M in unsent or unsigned proposals, nearly the entire $3.94M active pipeline. Every week a proposal sits unsigned, the probability of closing it drops.
```dax
EVALUATE
ADDCOLUMNS(
    FILTER(
        SUMMARIZE(
            BI_Autotask_Opportunities,
            BI_Autotask_Opportunities[stage_name]
        ),
        BI_Autotask_Opportunities[stage_name] IN {
            "Offerte verstuurd", "Offerte maken"}
    ),
    "DealCount", CALCULATE(COUNTROWS(BI_Autotask_Opportunities)),
    "TotalValue", CALCULATE(SUM(BI_Autotask_Opportunities[amount])),
    "AvgDealSize", CALCULATE(AVERAGE(BI_Autotask_Opportunities[amount])),
    "AvgAge", CALCULATE(
        AVERAGEX(
            BI_Autotask_Opportunities,
            DATEDIFF(
                BI_Autotask_Opportunities[create_date],
                TODAY(),
                DAY)))
)
ORDER BY [TotalValue] DESC
```
The core finding is straightforward: won deals close in 65 days on average, lost deals take 104 days. That is a 60% longer cycle for deals that end in a loss. The pattern is consistent across deal sizes and stages. When a deal drags past 90 days without a decision, the odds shift against you.
The win rate of 58.6% (720 won out of 1,229 decided deals) is solid, but the value story is different. Lost deals account for $7.06M in total value against $3.44M for won deals, which means the larger deals are the ones being lost: the average lost deal is $13,872 versus $4,778 for a won deal. Bigger proposals take longer, and the longer they sit, the more likely they are to fall through.
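The 58.6% figure can be turned into a reusable measure. This is a sketch against the same `BI_Autotask_Opportunities` table and status names used in the queries on this page; the measure name itself is an assumption:

```dax
Win Rate % =
VAR WonCount =
    CALCULATE(
        COUNTROWS('BI_Autotask_Opportunities'),
        'BI_Autotask_Opportunities'[status_name] IN {"Closed", "Implemented"}
    )
VAR LostCount =
    CALCULATE(
        COUNTROWS('BI_Autotask_Opportunities'),
        'BI_Autotask_Opportunities'[status_name] = "Lost"
    )
RETURN
    -- 720 won / (720 + 509) decided ≈ 58.6%
    DIVIDE(WonCount, WonCount + LostCount)
```

Dropping this measure on a matrix sliced by month or account manager would show whether the 58.6% rate holds across segments, since the CALCULATE filters respect whatever slicer context is applied.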
The stage data confirms where deals stall. 530 deals sit in the "Offerte" stage, which is the broadest category covering proposal activity. The 46 deals in "Offerte verstuurd" are the most actionable: these proposals have already been sent and are waiting on a decision. At $66,522 average deal size, each one of those stalled proposals represents serious revenue.
Active deals average just 17 days old, which suggests the current pipeline is relatively fresh. But converting the $3.94M active pipeline at the historical 58.6% win rate would deliver only about $2.3M in closed revenue, and if the larger deals in "Offerte verstuurd" are lost at the same rate as historical big deals, the actual yield could be lower still.
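The $2.3M expectation is just the active pipeline value weighted by the historical win rate. A minimal sketch, assuming the same table and "Active" status used in the queries above:

```dax
EVALUATE
VAR ActivePipeline =
    CALCULATE(
        SUM('BI_Autotask_Opportunities'[amount]),
        'BI_Autotask_Opportunities'[status_name] = "Active"
    )
VAR HistoricalWinRate = 0.586  -- 720 won of 1,229 decided deals
RETURN
    ROW(
        "ActivePipeline", ActivePipeline,                     -- ≈ $3.94M per the analysis above
        "ExpectedYield", ActivePipeline * HistoricalWinRate   -- ≈ $2.3M
    )
```

Note that this weights by deal count; since larger deals historically lose more often, a value-weighted win rate would likely produce a lower, more conservative figure.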
5 priorities based on the findings above
1. **Follow up on the 46 sent proposals in "Offerte verstuurd".** These proposals have been sent but not signed. At $3.06M total value and $66,522 average deal size, this is the single largest pool of stalled revenue. Pull the list, sort by age, and call every prospect that has had a proposal for more than 30 days. A direct follow-up call converts more stalled proposals than another email.
2. **Flag any deal that crosses 75 days.** Won deals close in 65 days on average; lost deals average 104. The inflection point is somewhere around 75 days, so any deal that passes this mark without a clear next step should be flagged for a pipeline review. Speed correlates with winning: deals that take longer are not just slow, they are more likely to fail.
3. **Dig into why larger deals are lost.** The average lost deal is worth $13,872 compared to $4,778 for won deals, a nearly 3x difference. This could indicate pricing issues, proposal complexity, or the wrong stakeholders being involved on bigger deals. Pull the top 20 lost deals by value and look for patterns: same competitor, same objection, same stage where they dropped off.
4. **Clear the "Offerte maken" backlog.** Thirty-eight deals worth $762K are in the proposal drafting stage, meaning a proposal has not even been sent yet. Each day of delay extends the sales cycle and reduces the chance of closing. Set a target: every proposal in "Offerte maken" should be sent within 5 business days, or the deal gets flagged for escalation.
5. **Use the 65-day baseline in forecasting.** With 720 won deals averaging 65 days, you have a statistically meaningful baseline. In monthly pipeline reviews, weight any deal created more than 65 days ago at a lower probability. Applied to the current $3.94M active pipeline, this builds a more realistic revenue forecast than counting everything at face value.
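Priority 1 above calls for pulling the stalled proposals sorted by age. A query along these lines would produce that list (a sketch reusing the columns from the queries on this page; the 30-day cutoff is the threshold suggested in priority 1):

```dax
EVALUATE
ADDCOLUMNS(
    FILTER(
        'BI_Autotask_Opportunities',
        'BI_Autotask_Opportunities'[stage_name] = "Offerte verstuurd"
            && NOT(ISBLANK('BI_Autotask_Opportunities'[create_date]))
            && DATEDIFF('BI_Autotask_Opportunities'[create_date], TODAY(), DAY) > 30
    ),
    -- age in days since the opportunity was created
    "AgeDays", DATEDIFF('BI_Autotask_Opportunities'[create_date], TODAY(), DAY)
)
ORDER BY [AgeDays] DESC
```

Ideally age would be measured from the date the proposal was actually sent, if your Autotask instance tracks that; `create_date` is used here because it is the only creation timestamp referenced on this page.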
Days to close is the number of calendar days between the opportunity creation date (create_date) and the date the deal reached its final status: closed_date for won deals, lost_date for lost deals. For closed deals this reflects the actual sales cycle length; for active deals, age is measured from create_date to today.
The win rate is calculated by dividing won deals by the total of won plus lost deals. Active and implemented deals are excluded from the win rate calculation because they have not reached a final outcome yet. This gives you a cleaner picture of your actual close performance.
The stage names come directly from the Autotask PSA configuration. This demo dataset uses Dutch stage names (Offerte = Proposal, Getekend = Signed, Verstuurd = Sent, Maken = Creating). Your own Autotask instance may use different names depending on how your pipeline stages are configured.
**What is a healthy sales cycle length for an MSP?** It depends on deal size and complexity. For standard managed services agreements, 30 to 60 days is typical. For larger project-based deals, 60 to 120 days is common. The key metric to watch is not the absolute number but the gap between won and lost deals: if lost deals take significantly longer, slow decisions are costing you revenue.
**Can I filter this analysis by time period?** Yes. Add a date filter to the DAX queries using the create_date column. For example, filter to the last 12 months to see recent trends, or compare this quarter to last quarter. Filtering by time period can reveal whether your sales cycle is getting shorter or longer over time.
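For example, a rolling last-12-months restriction on the won-deal averages could look like this (a sketch; `EDATE(TODAY(), -12)` rolls the window forward automatically each time the query runs):

```dax
EVALUATE
CALCULATETABLE(
    ROW(
        "WonDeals", CALCULATE(
            COUNTROWS('BI_Autotask_Opportunities'),
            'BI_Autotask_Opportunities'[status_name] IN {"Closed", "Implemented"}),
        "WonAvgDays", AVERAGEX(
            FILTER('BI_Autotask_Opportunities',
                'BI_Autotask_Opportunities'[status_name] IN {"Closed", "Implemented"}
                    && NOT(ISBLANK('BI_Autotask_Opportunities'[closed_date]))),
            DATEDIFF('BI_Autotask_Opportunities'[create_date],
                     'BI_Autotask_Opportunities'[closed_date], DAY))
    ),
    -- keep only opportunities created in the last 12 months
    'BI_Autotask_Opportunities'[create_date] >= EDATE(TODAY(), -12)
)
```

The same CALCULATETABLE wrapper can be applied to any of the other queries on this page without changing their internals.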
**Can I run this analysis on my own data?** Yes. Connect Proxuma Power BI to your Autotask PSA, add an AI tool (Claude, ChatGPT, or Copilot) via MCP, and ask the same question. The AI writes the DAX queries, runs them against your real opportunity data, and produces a report like this in under fifteen minutes.
Connect Proxuma Power BI to your PSA, RMM, and M365 environment, use an MCP-compatible AI to ask questions, and generate custom reports in minutes, not days.