
34 — TestimonialDailyAnalytics Model & Rollup Jobs (Blueprint §5.11)

Cursor-ready plan for daily analytics aggregation model, rollup job design, backfill strategy, and query contracts for dashboard/analytics pages.


Source: 02-Implementation-Blueprint.md §5.11 — new model: TestimonialDailyAnalytics.

This document is a build spec only. No code changes are implied until a task references this file.

Related: 14 (analytics screen), 19 (dashboard), 13 (requests log), 20 (schema/migrations), 24 (UTC standards).


Create a reliable daily rollup table so KPI-heavy screens load fast and show consistent numbers without expensive live aggregation over large event datasets.


Blueprint fields:

  • id String @id @default(uuid())
  • shopId String @map("shop_id")
  • date DateTime @map("date") // UTC day bucket
  • requestsSent Int @default(0) @map("requests_sent")
  • requestsClicked Int @default(0) @map("requests_clicked")
  • submissionsCount Int @default(0) @map("submissions_count")
  • approvedCount Int @default(0) @map("approved_count")
  • publishedCount Int @default(0) @map("published_count")
  • videoSubmissionsCount Int @default(0) @map("video_submissions_count")
  • photoSubmissionsCount Int @default(0) @map("photo_submissions_count")
  • createdAt DateTime @default(now()) @map("created_at")
  • updatedAt DateTime @updatedAt @map("updated_at")

Unique:

  • @@unique([shopId, date])
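Taken together, the fields and unique constraint above translate to a Prisma model roughly like the following sketch. The `@@map` table name and the `@@index([date])` are assumptions for illustration, not blueprint requirements:

```prisma
model TestimonialDailyAnalytics {
  id                    String   @id @default(uuid())
  shopId                String   @map("shop_id")
  date                  DateTime @map("date") // UTC day bucket (00:00:00Z)
  requestsSent          Int      @default(0) @map("requests_sent")
  requestsClicked       Int      @default(0) @map("requests_clicked")
  submissionsCount      Int      @default(0) @map("submissions_count")
  approvedCount         Int      @default(0) @map("approved_count")
  publishedCount        Int      @default(0) @map("published_count")
  videoSubmissionsCount Int      @default(0) @map("video_submissions_count")
  photoSubmissionsCount Int      @default(0) @map("photo_submissions_count")
  createdAt             DateTime @default(now()) @map("created_at")
  updatedAt             DateTime @updatedAt @map("updated_at")

  @@unique([shopId, date])
  @@index([date]) // optional: supports global maintenance scans
  @@map("testimonial_daily_analytics") // assumed table name
}
```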

In scope:

  • Prisma model + migration
  • Daily rollup job design
  • Incremental update strategy
  • Backfill/rebuild strategy
  • Read query contract for analytics/dashboard

Out of scope:

  • Hourly or real-time stream processing
  • BI warehouse export

Use UTC day buckets only:

  • Bucket key = YYYY-MM-DD 00:00:00Z
  • All metrics in a row represent that UTC day

Never bucket by store-local timezone in this model (keep local formatting at UI layer).
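The bucketing rule can be sketched as a small helper; the function name is illustrative, not an existing utility:

```typescript
// Derive the UTC day bucket for a timestamp: the "YYYY-MM-DD" key and the
// bucket start (YYYY-MM-DD 00:00:00Z). toISOString() always formats in UTC,
// so no timezone handling is needed here.
export function utcDayBucket(at: Date): { key: string; bucketStart: Date } {
  const key = at.toISOString().slice(0, 10);
  return { key, bucketStart: new Date(`${key}T00:00:00.000Z`) };
}
```

Note that an event at 23:30 local time in UTC-5 lands in the *next* UTC day bucket; that is intentional, and local formatting stays at the UI layer.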


4) Source-of-truth mapping for each metric

| Daily field | Source table(s) | Rule |
| --- | --- | --- |
| requestsSent | TestimonialRequest | count rows with sentAt on that UTC day |
| requestsClicked | TestimonialRequest or TestimonialRequestEvent | count distinct requests clicked on the day (clickedAt preferred) |
| submissionsCount | TestimonialSubmission | count rows with submittedAt on the day |
| approvedCount | TestimonialSubmission | count approvals on the day (approvedAt preferred; else status-change fallback) |
| publishedCount | TestimonialSubmission | count rows with publishedAt on the day |
| videoSubmissionsCount | TestimonialSubmission | submissions with mediaType = 'video' submitted on the day |
| photoSubmissionsCount | TestimonialSubmission | submissions with mediaType = 'photo' submitted on the day |
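The mapping above can be sketched as a pure function over in-memory rows. This is illustration only: the real job would issue Prisma `count` queries per metric rather than load rows, and the row types here are assumed shapes mirroring the source tables:

```typescript
// Assumed row shapes (subset of fields relevant to daily metrics).
type RequestRow = { sentAt?: Date; clickedAt?: Date };
type SubmissionRow = {
  submittedAt?: Date;
  approvedAt?: Date;
  publishedAt?: Date;
  mediaType?: "video" | "photo" | "text";
};

// True when a timestamp falls on the given UTC day key ("YYYY-MM-DD").
const onUtcDay = (at: Date | undefined, dayKey: string): boolean =>
  !!at && at.toISOString().slice(0, 10) === dayKey;

// Compute one day's metrics per the source-of-truth mapping table.
export function computeDailyMetrics(
  requests: RequestRow[],
  submissions: SubmissionRow[],
  dayKey: string
) {
  return {
    requestsSent: requests.filter((r) => onUtcDay(r.sentAt, dayKey)).length,
    requestsClicked: requests.filter((r) => onUtcDay(r.clickedAt, dayKey)).length,
    submissionsCount: submissions.filter((s) => onUtcDay(s.submittedAt, dayKey)).length,
    approvedCount: submissions.filter((s) => onUtcDay(s.approvedAt, dayKey)).length,
    publishedCount: submissions.filter((s) => onUtcDay(s.publishedAt, dayKey)).length,
    videoSubmissionsCount: submissions.filter(
      (s) => s.mediaType === "video" && onUtcDay(s.submittedAt, dayKey)
    ).length,
    photoSubmissionsCount: submissions.filter(
      (s) => s.mediaType === "photo" && onUtcDay(s.submittedAt, dayKey)
    ).length,
  };
}
```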

For accurate daily metrics, ensure the following timestamps exist:

  • sentAt
  • clickedAt
  • submittedAt
  • approvedAt (recommended add if missing)
  • publishedAt

If approvedAt is missing, either:

  1. add migration for it (recommended), or
  2. define fallback logic and accept reduced accuracy.

The rollup runs as a cron-triggered server job (e.g. every hour, plus a nightly finalization pass).

Two modes:

  1. Incremental daily upsert (default)
  2. Rebuild range (manual/admin maintenance)

For each active shop and each target day in window:

  1. calculate each metric from source tables
  2. upsert TestimonialDailyAnalytics by (shopId, date)
  3. log rollup result summary (counts, duration)

Use a transaction per (shop, day) batch, or per shop, depending on volume.

Each run recomputes:

  • today
  • yesterday
  • optionally last 7 days (to heal delayed events)

This makes late-arriving updates converge without full rebuild.
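The recompute window can be sketched as follows; the helper name and `healDays` parameter are assumptions for illustration:

```typescript
// UTC day keys an incremental rollup run should recompute.
// healDays = 0 covers today + yesterday; healDays = 7 also heals the last
// 7 days so late-arriving events converge without a full rebuild.
export function recomputeWindow(now: Date, healDays = 0): string[] {
  const days: string[] = [];
  // Start of today's UTC day in epoch milliseconds.
  const todayStart = Date.UTC(now.getUTCFullYear(), now.getUTCMonth(), now.getUTCDate());
  for (let i = 0; i <= 1 + healDays; i++) {
    days.push(new Date(todayStart - i * 86_400_000).toISOString().slice(0, 10));
  }
  return days;
}
```

Each returned key is then recomputed and upserted by (shopId, date).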


Provide one manual job endpoint/script:

  • inputs: shopId?, fromDate, toDate
  • operation: delete/recompute rows for range

Use after:

  • schema changes affecting metrics
  • bug fixes in metric logic
  • historical corrections
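The rebuild tool needs to enumerate the target day range before deleting and recomputing each row. A sketch, with an assumed helper name and inclusive bounds:

```typescript
// Enumerate UTC day keys in [fromDate, toDate], inclusive.
// Inputs are "YYYY-MM-DD" strings, interpreted as UTC day buckets.
export function daysInRange(fromDate: string, toDate: string): string[] {
  const out: string[] = [];
  const end = Date.parse(`${toDate}T00:00:00Z`);
  // Fixed 86,400,000 ms steps are safe because everything is UTC (no DST).
  for (let t = Date.parse(`${fromDate}T00:00:00Z`); t <= end; t += 86_400_000) {
    out.push(new Date(t).toISOString().slice(0, 10));
  }
  return out;
}
```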

Read contract for analytics/dashboard:

  • Sum selected range for KPI cards
  • Optionally fetch time series for charts
  • Funnel-related cards can read from this table for speed
  • If the funnel needs stages not stored here (e.g. opened), combine with a live event query or extend the model carefully
  • Missing row for day = treat as zero metrics
  • UI should render zero-series gracefully
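The missing-row-equals-zero rule can be sketched as a range sum over rollup rows; the row shape is abbreviated to two metrics for illustration:

```typescript
// Abbreviated rollup row (real model has all daily metric columns).
type DailyRow = { date: string; requestsSent: number; submissionsCount: number };

// Sum metrics over the requested day keys; a day with no stored row
// contributes zero, so sparse data never breaks KPI cards or charts.
export function sumRange(rows: DailyRow[], dayKeys: string[]) {
  const byDay = new Map(rows.map((r) => [r.date, r]));
  let requestsSent = 0;
  let submissionsCount = 0;
  for (const key of dayKeys) {
    const row = byDay.get(key); // undefined => zero metrics for that day
    requestsSent += row?.requestsSent ?? 0;
    submissionsCount += row?.submissionsCount ?? 0;
  }
  return { requestsSent, submissionsCount };
}
```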

Run periodic validation job:

  • compare rollup totals vs live aggregate for random shops/date windows
  • alert when drift exceeds threshold (e.g. >1% or >N records)

This catches silent metric regressions.
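The drift check might look like the following sketch; the function name and default thresholds are assumptions mirroring the ">1% or >N records" rule above:

```typescript
// True when a rollup total drifts from the live aggregate beyond either
// an absolute record-count threshold or a relative percentage threshold.
export function driftExceeded(
  rollupTotal: number,
  liveTotal: number,
  maxRelative = 0.01, // alert when relative drift exceeds 1%
  maxAbsolute = 10    // alert when absolute drift exceeds N records
): boolean {
  const diff = Math.abs(rollupTotal - liveTotal);
  if (diff > maxAbsolute) return true;
  const base = Math.max(liveTotal, 1); // avoid division by zero
  return diff / base > maxRelative;
}
```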


Required:

  • @@unique([shopId, date])

Recommended additional index:

  • @@index([date]) for global maintenance scans (optional)

Read patterns should be:

  • where: { shopId, date: { gte, lte } }

Acceptance checks:

  • Rollup job creates/updates rows for active days.
  • Re-running job is idempotent (same outputs).
  • Dashboard/analytics range sums match live-source spot checks.
  • Backfill tool can rebuild a date window safely.
  • UTC bucketing consistent around timezone boundaries.

12) Suggested implementation order (for Cursor)

  1. Add model + migration.
  2. Add approvedAt timestamp migration if needed for fidelity.
  3. Implement rollup service with upsert-by-day.
  4. Add scheduled cron route/job trigger.
  5. Add rebuild command/route for maintenance.
  6. Switch analytics/dashboard reads to rollups where appropriate.

  • 02-Implementation-Blueprint.md — §5.11
  • 14-analytics-blueprint-screen-11.md
  • 19-dashboard-screen-1.md
  • 20-database-design-and-migrations-blueprint-section-5.md
  • 24-build-notes-execution-standards-blueprint-section-10.md

This folder already includes 05 through 33 plans. This file is 34-....