
Why do response numbers differ between the Dashboard, Reports, and Outbox?

Written by Alex Bitca
Updated over 2 weeks ago

It is normal to see different response counts across the Outbox, Dashboard, Campaign Reports, and Feedback pages.

These pages answer different questions and apply date filters differently.

This article explains how each page works and why discrepancies can appear.

Quick summary

  • Outbox focuses on surveys sent and answered in a selected period.

  • Dashboard and Reports focus on responses received in a selected period.

  • Dashboard and Reports deduplicate respondents and keep only the latest response per customer.

  • The Feedback page shows all responses, including multiple responses from the same customer.

Because of this, response counts can be higher or lower depending on where you look.

How the Outbox page works

The Outbox page is delivery-focused.

When you apply a date filter, for example, January 1 to January 31:

  • Only surveys sent during that period are included.

  • All delivery stats are calculated only from those surveys, such as:

    • Opened

    • Responded

    • Unsubscribed

    • Bounced

Important: The Outbox never includes responses to surveys that were sent outside the selected date range, even if the response itself was received during that time.
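The Outbox logic can be sketched in a few lines. This is an illustrative model only, not the product's actual implementation; the record layout and field names (`sent_at`, `responded_at`) are assumptions made for the example.

```python
from datetime import date

# Hypothetical survey records; sent_at / responded_at are illustrative field names.
surveys = [
    {"sent_at": date(2023, 12, 10), "responded_at": date(2024, 1, 5)},   # answered late
    {"sent_at": date(2024, 1, 3),   "responded_at": date(2024, 1, 7)},
    {"sent_at": date(2024, 1, 20),  "responded_at": None},               # not answered
]

start, end = date(2024, 1, 1), date(2024, 1, 31)

# Outbox: include only surveys SENT within the selected period.
outbox = [s for s in surveys if start <= s["sent_at"] <= end]
responded = [s for s in outbox if s["responded_at"] is not None]

print(len(outbox))     # surveys sent in January
print(len(responded))  # the December survey's January reply is excluded
```

With a January filter, the sketch keeps two surveys and counts one response: the reply received on January 5 is dropped because its survey was sent in December.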

How the Dashboard and Campaign Reports work

The Dashboard and Campaign Reports are insight-focused.

When you apply the same date filter, for example, January 1 to January 31:

  • All responses received during that period are included.

  • It does not matter when the survey was originally sent.

This means responses can be counted even if:

  • The survey was sent in December.

  • The survey was sent several months earlier.

  • The respondent answered late.

Reason 1: Responses received vs surveys sent

This is the most common source of confusion.

Example

  • A survey was sent on December 10.

  • The customer responded on January 5.

Result:

  • Outbox (January filter): response is not counted.

  • Dashboard and Reports (January filter): response is counted.

Why:

  • Outbox filters by send date.

  • Dashboard and Reports filter by response date.
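The example above reduces to a single comparison on different date fields. A minimal sketch, again using assumed field names rather than the real schema:

```python
from datetime import date

start, end = date(2024, 1, 1), date(2024, 1, 31)

# One survey, sent in December and answered in January (hypothetical record).
survey = {"sent_at": date(2023, 12, 10), "responded_at": date(2024, 1, 5)}

in_outbox    = start <= survey["sent_at"] <= end       # Outbox filters by send date
in_dashboard = start <= survey["responded_at"] <= end  # Dashboard/Reports filter by response date

print(in_outbox)     # sent outside January
print(in_dashboard)  # answered inside January
```

The same response is excluded by the first check and included by the second, which is exactly the discrepancy described in this reason.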

Reason 2: Only the latest response per respondent is counted in reports

Dashboard and Reports reflect the current status of each respondent.

If the same customer answers multiple surveys during the selected period:

  • Only their latest response is counted.

  • Earlier responses from the same customer are ignored in reports.

Example

A customer submits:

  • Promoter response on January 5

  • Promoter response on January 12

  • Passive response on January 25

Result in Dashboard and Reports:

  • Only the January 25 Passive response is counted.

Why:

  • Reports show the most recent sentiment per respondent.
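The deduplication behavior can be sketched as "last write wins" per customer. This is a simplified model with made-up identifiers, not the actual reporting code:

```python
from datetime import date

# Hypothetical responses from the same customer during January.
responses = [
    {"customer": "c1", "date": date(2024, 1, 5),  "rating": "Promoter"},
    {"customer": "c1", "date": date(2024, 1, 12), "rating": "Promoter"},
    {"customer": "c1", "date": date(2024, 1, 25), "rating": "Passive"},
]

# Keep only the latest response per customer, as Reports do.
latest = {}
for r in sorted(responses, key=lambda r: r["date"]):
    latest[r["customer"]] = r  # later responses overwrite earlier ones

print(latest["c1"]["rating"])  # the January 25 response wins
```

Sorting by date and overwriting means only the most recent sentiment per respondent survives, matching the example above.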

Why Promoters + Passives + Detractors may not match total responses

This can happen in two opposite scenarios.

Scenario A: Dashboard shows more responses than Outbox

This happens when:

  • Responses were received during the period.

  • Surveys were sent outside the period.

Dashboard and Reports include them.

Outbox does not.

Scenario B: Dashboard shows fewer responses than the Feedback page

This happens when:

  • A customer responded multiple times during the selected period.

Result:

  • Feedback page shows all responses.

  • Dashboard and Reports count only the latest response per customer.
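Scenario B amounts to counting rows versus counting unique customers. A quick sketch with invented data:

```python
from datetime import date

# Hypothetical responses in the selected period; two come from the same customer.
responses = [
    {"customer": "c1", "date": date(2024, 1, 5)},
    {"customer": "c1", "date": date(2024, 1, 25)},
    {"customer": "c2", "date": date(2024, 1, 10)},
]

feedback_count  = len(responses)                           # Feedback: every response
dashboard_count = len({r["customer"] for r in responses})  # Dashboard: one per customer

print(feedback_count, dashboard_count)
```

Here the Feedback page would show three responses while the Dashboard counts two, because customer "c1" is counted once.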

Which page should I trust?

It depends on what you are analyzing.

  • Use the Outbox when you want to understand delivery and engagement for surveys sent in a specific period.

  • Use the Dashboard and Reports when you want to understand customer sentiment and trends during a specific period.

  • Use the Feedback page when you want to audit or review individual responses.

All numbers are correct within their own context.

Key takeaway

Differences in response counts are expected and intentional.

They exist because:

  • Some pages filter by send date, others by response date.

  • Some pages deduplicate respondents, others do not.

Understanding these distinctions helps you interpret your data correctly and avoid false assumptions about inconsistencies.
