RMM Alerts vs Manual Tickets: Comparing Resolution Time and SLA Performance
AI-GENERATED REPORT

A side-by-side comparison of automated RMM alert tickets and manually created tickets across volume, effort, and customer satisfaction. Data sourced from Datto RMM and Autotask PSA through Proxuma Power BI.

Built from: Datto RMM

How this report was made:
1. Autotask PSA (multiple data sources combined)
2. Proxuma Power BI (pre-built MSP semantic model, 50+ measures)
3. AI via MCP (Claude or ChatGPT writes DAX queries, executes them, formats output)
4. This report (KPIs, breakdowns, trends, recommendations)

Ready in under 15 minutes.


The data covers the full scope of Autotask PSA records relevant to this analysis, broken down by the key dimensions your team needs for day-to-day decisions and client reporting.

Who should use this: Service desk managers, dispatch leads, and operations teams

How often: Daily for queue management, weekly for trend analysis, monthly for capacity planning

Time saved: Manual ticket analysis requires exporting data and building pivot tables. This report does it automatically.
Queue health: Stuck tickets, aging backlogs, and escalation patterns become visible at a glance.
Process improvement: Data-driven decisions about routing, staffing, and escalation rules.
Report category: Ticketing & Helpdesk
Data source: Autotask PSA · Datto RMM · Datto Backup · Microsoft 365 · SmileBack · HubSpot · IT Glue
Refresh: Real-time via Power BI
Generation time: Under 15 minutes
AI required: Claude, ChatGPT, or Copilot
Audience: Service desk managers, dispatch leads
Where to find this in Proxuma
Power BI › Ticketing › RMM Alerts vs Manual Tickets: Compari...
What you can measure in this report
Overview
Ticket Type Comparison
Resolution Efficiency
CSAT by Ticket Source
RMM Alert Pipeline
Volume & Effort Analysis
Key Findings & Analysis
Recommended Actions
Frequently Asked Questions

1.0 Overview
Total Tickets: 67,521 (5 ticket types)
Alert Ticket Share: 29.3% (19,790 alert tickets)
Avg Hours (Alerts): 0.74 (18% less than incidents)
Alert CSAT Positive: 93.7% (74 of 79 responses)
How to read this report: This report compares ticket types from Autotask PSA. Alert (RMM) tickets (type 5) are created automatically by Datto RMM when a monitoring threshold is breached. Incidents (type 2), Service Requests (type 1), and Change Requests (type 4) are created manually by your team. The data covers 67,521 tickets across all clients. CSAT data comes from SmileBack surveys linked to closed tickets.
2.0 Ticket Type Comparison

Incidents make up the largest share of your ticket volume at 41.0%, followed by RMM alerts at 29.3%. Service requests account for 18.7% and change requests for 10.7%. Problem tickets are rare at just 167 total.

Incident: 27,664 (41.0%)
Alert (RMM): 19,790 (29.3%)
Service Request: 12,653 (18.7%)
Change Request: 7,247 (10.7%)
Problem: 167 (0.2%)
Show DAX query — Ticket Type Breakdown
EVALUATE
SUMMARIZECOLUMNS(
    'BI_Autotask_Tickets'[ticket_type],
    "TicketCount", COUNTROWS('BI_Autotask_Tickets'),
    "AvgResolutionHours", AVERAGE('BI_Autotask_Tickets'[resolution_duration_hours])
)
3.0 Resolution Efficiency

Alert tickets require the least effort at 0.74 average billable hours, making them 18% faster to close than incidents (0.90 hrs) and 33% faster than service requests (1.10 hrs). Problem tickets sit at the other end of the spectrum at 1.43 hours, but with only 167 tickets they have minimal impact on total workload.

Problem: 1.43 hrs
Service Request: 1.10 hrs
Change Request: 0.91 hrs
Incident: 0.90 hrs
Alert (RMM): 0.74 hrs
Why this matters: If your alert tickets averaged the same effort as incidents, those 19,790 tickets would cost an extra 3,166 billable hours per year. At typical MSP rates, that is a significant saving from automation alone.
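That saving can be checked directly in the model. The sketch below is illustrative rather than one of the report's shipped queries: it reuses the ticket_type and billable_hours columns from the queries on this page and assumes the type labels are stored as the literals "Incident" and "Alert (RMM)", so adjust those to match your model.

```dax
// Hypothetical-cost check: what the alert tickets would cost
// if each one took incident-level effort instead.
EVALUATE
VAR AvgIncidentHrs =
    CALCULATE(
        AVERAGE('BI_Autotask_Tickets'[billable_hours]),
        'BI_Autotask_Tickets'[ticket_type] = "Incident"
    )
VAR AvgAlertHrs =
    CALCULATE(
        AVERAGE('BI_Autotask_Tickets'[billable_hours]),
        'BI_Autotask_Tickets'[ticket_type] = "Alert (RMM)"
    )
VAR AlertCount =
    CALCULATE(
        COUNTROWS('BI_Autotask_Tickets'),
        'BI_Autotask_Tickets'[ticket_type] = "Alert (RMM)"
    )
RETURN
    ROW("ExtraHoursAtIncidentRate", AlertCount * (AvgIncidentHrs - AvgAlertHrs))
```

With the figures in this report, that works out to 19,790 × (0.90 − 0.74) ≈ 3,166 hours.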
Show DAX query — Average Billable Hours by Type
EVALUATE
SUMMARIZECOLUMNS(
    'BI_Autotask_Tickets'[ticket_type],
    "TicketCount", COUNTROWS('BI_Autotask_Tickets'),
    "AvgBillableHrs", AVERAGE('BI_Autotask_Tickets'[billable_hours])
)
ORDER BY [AvgBillableHrs] DESC
4.0 CSAT by Ticket Source

Alert tickets show a 93.7% positive CSAT rate (74 of 79 responses), compared to 86.4% for incidents and 89.2% for service requests. The catch: only 0.4% of alert tickets generate a CSAT response, while 5.1% of incidents do. With just 79 total CSAT responses on 19,790 alert tickets, the sample is too small to draw firm conclusions.

Alert (RMM): 93.7% positive (79 responses)
Incident: 86.4% positive (1,410 responses)
Service Request: 89.2% positive (613 responses)
Change Request: 90.5% positive (388 responses)
CSAT response rate gap: Alert tickets have a 0.4% CSAT response rate (79 responses on 19,790 tickets). Incidents have 5.1% (1,410 on 27,664). This likely reflects that many alert tickets are resolved without direct customer interaction, so the survey either does not fire or gets ignored. The high positive rate on alert CSAT may be survivorship bias: only customers who noticed the fix bother to respond.
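The response-rate gap can be queried per type. The sketch below is an assumption-laden starting point: 'BI_Smileback_Reviews' is a hypothetical table name standing in for the actual SmileBack table in your Proxuma model, and it presumes a relationship between reviews and tickets; substitute the real names before running it.

```dax
// Sketch only: 'BI_Smileback_Reviews' is a hypothetical table name.
// CSAT response rate = linked survey responses per ticket, by type.
EVALUATE
SUMMARIZECOLUMNS(
    'BI_Autotask_Tickets'[ticket_type],
    "Tickets", COUNTROWS('BI_Autotask_Tickets'),
    "CsatResponses", COUNTROWS('BI_Smileback_Reviews'),
    "ResponseRate",
        DIVIDE(
            COUNTROWS('BI_Smileback_Reviews'),
            COUNTROWS('BI_Autotask_Tickets')
        )
)
ORDER BY [ResponseRate] ASC
```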
5.0 RMM Alert Pipeline

Your Datto RMM generated 135,387 total alerts. Of those, 132,018 (97.5%) resolved automatically with an average auto-resolve time of just 5.45 minutes. Only 19,790 alerts turned into Autotask tickets. That means roughly 14.6% of all RMM alerts required a ticket, and the rest were handled by the system without human involvement.

Total RMM alerts: 135,387
Auto-resolved: 132,018 (97.5%)
Became tickets: 19,790 (14.6%)
Unresolved: 3,369 (2.5%)
Auto-resolve performance: With 97.5% of alerts resolving in an average of 5.45 minutes, your RMM policies are working well. The 3,369 unresolved alerts (2.5%) are worth reviewing. These may point to recurring hardware issues, misconfigured thresholds, or alerts that need manual intervention policies.
Show DAX query — RMM Alert KPIs
EVALUATE
ROW(
    "TotalAlerts", COUNTROWS('BI_Datto_Rmm_Alerts'),
    "ResolvedAlerts", CALCULATE(COUNTROWS('BI_Datto_Rmm_Alerts'), 'BI_Datto_Rmm_Alerts'[resolved] = TRUE()),
    "TotalSites", DISTINCTCOUNT('BI_Datto_Rmm_Alerts'[site_name]),
    "AvgAutoResolve", AVERAGE('BI_Datto_Rmm_Alerts'[autoresolve_mins])
)

-- Alert Priority Distribution
EVALUATE
SUMMARIZECOLUMNS(
    'BI_Datto_Rmm_Alerts'[priority],
    "AlertCount", COUNTROWS('BI_Datto_Rmm_Alerts'),
    "ResolvedCount", CALCULATE(COUNTROWS('BI_Datto_Rmm_Alerts'), 'BI_Datto_Rmm_Alerts'[resolved] = TRUE()),
    "AvgAutoResolve", AVERAGE('BI_Datto_Rmm_Alerts'[autoresolve_mins])
)
ORDER BY [AlertCount] DESC
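The unresolved tail flagged in the callout above can be listed directly. This sketch uses only columns already referenced in this report (resolved, site_name, priority); treat it as a starting point to adapt rather than a pre-built measure:

```dax
// The 2.5% unresolved tail, grouped by site and priority,
// largest clusters first.
EVALUATE
CALCULATETABLE(
    SUMMARIZECOLUMNS(
        'BI_Datto_Rmm_Alerts'[site_name],
        'BI_Datto_Rmm_Alerts'[priority],
        "UnresolvedCount", COUNTROWS('BI_Datto_Rmm_Alerts')
    ),
    'BI_Datto_Rmm_Alerts'[resolved] = FALSE()
)
ORDER BY [UnresolvedCount] DESC
```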
6.0 Volume & Effort Analysis

While alert tickets account for 29.3% of ticket volume, they only consume 22.8% of total billable effort. Incidents dominate both volume (41.0%) and effort (38.7%). Service requests punch above their weight in effort, taking 18.7% of tickets but 21.6% of hours. This confirms that automation is pulling its weight: alert tickets generate volume without a proportional effort cost.

Ticket Type        Tickets   % Volume   Avg Hours   Total Hours   % Effort
Incident            27,664      41.0%        0.90        24,898      38.7%
Alert (RMM)         19,790      29.3%        0.74        14,645      22.8%
Service Request     12,653      18.7%        1.10        13,918      21.6%
Change Request       7,247      10.7%        0.91         6,595      10.3%
Problem                167       0.2%        1.43           239       0.4%
Volume vs Effort Distribution (Incident / Alert (RMM) / Service Request / Change Request / Problem): volume 41% / 29% / 19% / 11% / <1%; effort 39% / 23% / 22% / 10% / <1%.
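The volume-versus-effort comparison can be reproduced straight from the model. The query below reuses the ticket_type and billable_hours columns from the report's other queries; the effort-share expression is a sketch rather than one of the model's pre-built measures:

```dax
// Volume, total hours, and each type's share of all billable effort.
EVALUATE
SUMMARIZECOLUMNS(
    'BI_Autotask_Tickets'[ticket_type],
    "TicketCount", COUNTROWS('BI_Autotask_Tickets'),
    "TotalHours", SUM('BI_Autotask_Tickets'[billable_hours]),
    "EffortShare",
        DIVIDE(
            SUM('BI_Autotask_Tickets'[billable_hours]),
            // Denominator: billable hours across all tickets, ignoring
            // the per-type grouping.
            CALCULATE(
                SUM('BI_Autotask_Tickets'[billable_hours]),
                ALL('BI_Autotask_Tickets')
            )
        )
)
ORDER BY [TotalHours] DESC
```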
7.0 Key Findings & Analysis
1. RMM automation is delivering measurable efficiency gains

Alert tickets average 0.74 billable hours compared to 0.90 for incidents. Across 19,790 alert tickets, that is roughly 3,166 hours saved per year compared to handling them as standard incidents. On top of that, 97.5% of all RMM alerts resolve automatically in under six minutes, meaning only a fraction ever reach your service desk. The ROI on your Datto RMM investment is clearly visible in the data.

2. CSAT data on alert tickets is nearly nonexistent

Only 79 out of 19,790 alert tickets received a CSAT response (0.4%). That makes it impossible to draw reliable conclusions about customer satisfaction with automated resolutions. The 93.7% positive rate looks good on paper, but with a sample this small, a handful of responses could shift it dramatically. If you want real CSAT data on alert tickets, you need to review whether SmileBack surveys are firing for this ticket type and whether the survey recipient makes sense for automated resolutions.

3. Service requests consume disproportionate effort

Service requests make up 18.7% of ticket volume but 21.6% of total effort. At 1.10 average hours, they take 49% longer than alert tickets and 22% longer than incidents. This suggests that service requests may include work that could be partially automated, pre-filled with templates, or broken into smaller tasks. Reviewing the top service request categories for automation opportunities could free up significant capacity.

8.0 Recommended Actions
1. Review the 3,369 unresolved RMM alerts

With 97.5% of alerts auto-resolving, the remaining 2.5% deserve attention. Pull a list of unresolved alerts grouped by alert type and site. Look for patterns: are these the same monitors failing across multiple clients? Are they stale alerts from decommissioned devices? Cleaning up this tail will improve your resolve rate and reduce noise for the service desk.

2. Investigate SmileBack configuration for alert tickets

A 0.4% CSAT response rate on alert tickets means you are flying blind on customer satisfaction for nearly a third of your ticket volume. Check whether SmileBack surveys are being sent for alert ticket closures and whether the survey goes to the right contact. Consider adding a brief resolution note to alert tickets so the customer understands what was fixed before they receive the survey.

3. Audit high-effort service request categories

Service requests average 1.10 hours, the highest of any volume ticket type. Break this down by service request category and look for types that could use self-service portals, pre-approved automation, or better documentation. Even reducing the average by 0.10 hours across 12,653 tickets saves over 1,265 hours per year.

4. Use alert efficiency data in QBRs and sales conversations

The numbers tell a strong story: 135K alerts monitored, 97.5% auto-resolved in under six minutes, and the tickets that do get created cost 18% less effort than standard incidents. Package this into your QBR slides and sales proposals. Prospects and existing clients want proof that proactive monitoring actually reduces downtime and cost. This data is that proof.

5. Track the alert-to-ticket ratio monthly as an efficiency KPI

Your current ratio is 14.6% (19,790 tickets from 135,387 alerts). Set this as a monthly KPI. If the ratio climbs, it means your alert thresholds may need tuning or you have new device types generating noisy alerts. If it drops, your automation policies are improving. Either way, the trend matters more than the absolute number.
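The KPI itself is a single query over the two tables already used in this report. As a sketch, it assumes ticket types are stored as the literal "Alert (RMM)"; to trend it monthly, add a month column from your date table as a group-by:

```dax
// Alert-to-ticket ratio: Autotask tickets created per RMM alert.
EVALUATE
VAR AlertTickets =
    CALCULATE(
        COUNTROWS('BI_Autotask_Tickets'),
        'BI_Autotask_Tickets'[ticket_type] = "Alert (RMM)"
    )
VAR TotalAlerts = COUNTROWS('BI_Datto_Rmm_Alerts')
RETURN
    ROW(
        "TotalAlerts", TotalAlerts,
        "AlertTickets", AlertTickets,
        "AlertToTicketRatio", DIVIDE(AlertTickets, TotalAlerts)
    )
```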

9.0 Frequently Asked Questions
What counts as an "alert ticket" in this report?

Alert tickets are Autotask tickets with ticket type 5 (Alert). These are created automatically by Datto RMM when a monitoring alert triggers and is configured to create a ticket in Autotask. They differ from incidents (type 2), service requests (type 1), and change requests (type 4), which are created manually by technicians or through customer-facing portals.

Why is the CSAT response rate so low for alert tickets?

Most alert tickets resolve without direct customer interaction. The customer may never know a ticket existed. SmileBack surveys are sent to the ticket contact upon closure, but for automated alert tickets, that contact may not be aware of the issue. This results in surveys going to people who did not experience the problem and do not feel compelled to respond.

How are total hours calculated in the effort table?

Total hours are estimated by multiplying the average billable hours per ticket by the ticket count for each type. For example, 19,790 alert tickets at 0.74 average hours gives approximately 14,645 total hours. These are estimates based on averages and may differ slightly from a direct sum of all individual ticket hours.
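For readers who want the exact figure rather than the estimate, a query along these lines returns both side by side. It reuses the billable_hours column from the report's own queries; treat it as a sketch to adapt:

```dax
// Estimate (avg × count) vs. direct sum per ticket type.
// AVERAGE ignores blank rows while COUNTROWS counts every row,
// which is why the two figures can differ slightly.
EVALUATE
SUMMARIZECOLUMNS(
    'BI_Autotask_Tickets'[ticket_type],
    "EstimatedHours",
        COUNTROWS('BI_Autotask_Tickets')
            * AVERAGE('BI_Autotask_Tickets'[billable_hours]),
    "DirectSumHours", SUM('BI_Autotask_Tickets'[billable_hours])
)
```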

What does "auto-resolve" mean for RMM alerts?

Auto-resolve means the condition that triggered the alert cleared on its own without human intervention. For example, a disk space alert may fire when usage hits 90%, but if a scheduled cleanup runs and brings it below the threshold, the alert resolves automatically. In Datto RMM, the resolved field marks these. The average auto-resolve time of 5.45 minutes indicates most alerts are transient conditions.

Can I break this down by client or site?

Yes. Add the company name or site name column to the DAX queries to group results by client. This is particularly useful for identifying clients who generate a disproportionate number of alerts or have a low auto-resolve rate, which may indicate infrastructure problems worth addressing proactively.
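As a concrete example, a per-site version of the alert KPIs needs only the site_name column that already appears in the RMM queries above. This is a sketch; adjust names to your model:

```dax
// Alert volume and auto-resolve rate per site, noisiest sites first.
// A low auto-resolve rate can flag infrastructure problems.
EVALUATE
SUMMARIZECOLUMNS(
    'BI_Datto_Rmm_Alerts'[site_name],
    "AlertCount", COUNTROWS('BI_Datto_Rmm_Alerts'),
    "AutoResolveRate",
        DIVIDE(
            CALCULATE(
                COUNTROWS('BI_Datto_Rmm_Alerts'),
                'BI_Datto_Rmm_Alerts'[resolved] = TRUE()
            ),
            COUNTROWS('BI_Datto_Rmm_Alerts')
        )
)
ORDER BY [AlertCount] DESC
```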

Why are there 7,688 unlinked CSAT responses?

The unlinked CSAT responses (7,688 total, 7,195 positive, 253 negative) are SmileBack reviews that could not be matched to a specific ticket type. This usually happens when the ticket was deleted, the ticket type was changed after the survey was sent, or the SmileBack-Autotask sync has gaps. These responses are still valid satisfaction data but cannot be attributed to a specific workflow.

Can I run this report against my own data?

Yes. Connect Proxuma Power BI to your Datto RMM and Autotask accounts, add an AI tool (Claude, ChatGPT, or Copilot) via MCP, and ask the same question. The AI writes the DAX queries, runs them against your real data, and produces a report like this in under fifteen minutes.

Generate this report from your own data

Connect Proxuma Power BI to your PSA, RMM, and M365 environment, use an MCP-compatible AI to ask questions, and generate custom reports in minutes, not days.
