Documentation

fsds/bbu-basket-dwell-time-analytics.md

Basket Dwell Time Analytics & Map Visualization

Goal

Answer customer questions:

  1. "How many baskets did I send to each location?"
  2. "How long have they been there?" (dwell time)
  3. Show this on a MAP with Mapbox (API key available)

Keep it simple - just show the data, no risk/alert abstractions.


Implementation Overview

Stage 1: SQL Views (in SupersetAnalyticsViewsSql.cs)

View Name                          Purpose
bbu.basket_dwell_by_location       Location-level: basket count, avg/max dwell days
bbu.basket_dwell_detail            Individual basket details with dwell time
bbu.basket_dwell_by_timeframe      Baskets grouped by time buckets (< 1wk, 1-2wk, etc.)
bbu.basket_location_map            Map-ready: coordinates + basket metrics
bbu.location_return_performance    Historical return rates by location

Stage 2: Migration

  • Use /generate-migration with BBU context
  • Name: AddBasketDwellAnalyticsViews

Stage 3: Superset Dashboard

New Dashboard: "Basket Location Tracking"

+--------------------------------------------------+
|  [Total Baskets Out]  [Avg Dwell]  [Locations]   |  <- KPI Big Numbers
+--------------------------------------------------+
|  [Baskets by Location - Bar]  |  [Dwell Buckets] |  <- Where are they + how long
+-------------------------+-------------------------+
|          [MAP: Basket Locations with Dwell Time]  |  <- deck.gl with Mapbox
+--------------------------------------------------+
|  [Baskets by Dwell Time]  |  [Return Performance] |  <- Trends
+-------------------------+-------------------------+
|           [Basket Detail Table]                   |  <- Drill-down
+--------------------------------------------------+

Key SQL Views

1. basket_dwell_by_location (Summary by Location)

CREATE OR REPLACE VIEW bbu.basket_dwell_by_location AS
WITH delivered_baskets AS (
    SELECT
        sia.item_id AS basket_id,
        sia.tenant_id,
        d.location_id,
        d.status_updated_at_utc AS delivered_at_utc,
        EXTRACT(EPOCH FROM (NOW() - d.status_updated_at_utc)) / 86400.0 AS dwell_days
    FROM transport.shipment_item_allocations sia
    JOIN transport.deliveries d ON d.id = sia.delivery_id
    WHERE sia.expect_return = true
      AND d.status = 'DELIVERED'
      AND sia.status NOT IN ('RECEIVED', 'MISSING')
)
SELECT
    db.tenant_id,
    db.location_id,
    l.name AS location_name,
    l.location_full_name,
    l.erp_location AS location_code,
    COUNT(DISTINCT db.basket_id) AS basket_count,
    ROUND(AVG(db.dwell_days)::numeric, 1) AS avg_dwell_days,
    ROUND(MAX(db.dwell_days)::numeric, 1) AS max_dwell_days,
    MIN(db.delivered_at_utc) AS oldest_delivery
FROM delivered_baskets db
JOIN spatial.locations l ON l.id = db.location_id
GROUP BY db.tenant_id, db.location_id, l.name, l.location_full_name, l.erp_location;

2. basket_dwell_detail (Individual Basket Details)

CREATE OR REPLACE VIEW bbu.basket_dwell_detail AS
SELECT
    sia.item_id AS basket_id,
    sia.tenant_id,
    d.location_id,
    l.name AS location_name,
    l.location_full_name,
    ROUND((EXTRACT(EPOCH FROM (NOW() - d.status_updated_at_utc)) / 86400.0)::numeric, 1) AS dwell_days,
    s.tracking_number AS shipment_tracking,
    d.status_updated_at_utc AS delivered_at,
    -- Get basket identifier (EPC if available)
    (SELECT i.value FROM prism.identifiers i
     JOIN core.identifier_types it ON it.id = i.identifier_type_id
     WHERE i.associated_object_id = sia.item_id AND it.code = 'EPC' LIMIT 1) AS basket_epc
FROM transport.shipment_item_allocations sia
JOIN transport.deliveries d ON d.id = sia.delivery_id
JOIN transport.shipments s ON s.id = sia.shipment_id
JOIN spatial.locations l ON l.id = d.location_id
WHERE sia.expect_return = true
  AND d.status = 'DELIVERED'
  AND sia.status NOT IN ('RECEIVED', 'MISSING');

3. basket_dwell_by_timeframe (Time Buckets)

CREATE OR REPLACE VIEW bbu.basket_dwell_by_timeframe AS
SELECT
    tenant_id,
    CASE
        WHEN dwell_days < 7 THEN 'Under 1 week'
        WHEN dwell_days < 14 THEN '1-2 weeks'
        WHEN dwell_days < 30 THEN '2-4 weeks'
        WHEN dwell_days < 60 THEN '1-2 months'
        ELSE '2+ months'
    END AS dwell_timeframe,
    COUNT(*) AS basket_count
FROM bbu.basket_dwell_detail
GROUP BY tenant_id,
    CASE
        WHEN dwell_days < 7 THEN 'Under 1 week'
        WHEN dwell_days < 14 THEN '1-2 weeks'
        WHEN dwell_days < 30 THEN '2-4 weeks'
        WHEN dwell_days < 60 THEN '1-2 months'
        ELSE '2+ months'
    END;
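
The bucket boundaries in the CASE expression above can be sanity-checked with a small helper that mirrors the same thresholds (illustrative only; the labels and cutoffs come from the view definition):

```python
def dwell_timeframe(dwell_days: float) -> str:
    """Mirror of the CASE expression in bbu.basket_dwell_by_timeframe."""
    if dwell_days < 7:
        return "Under 1 week"
    if dwell_days < 14:
        return "1-2 weeks"
    if dwell_days < 30:
        return "2-4 weeks"
    if dwell_days < 60:
        return "1-2 months"
    return "2+ months"
```

Note the boundaries are half-open: a basket at exactly 7 days falls into "1-2 weeks", matching the SQL's `< 7` / `< 14` ordering.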

4. basket_location_map (Map-Ready with Coordinates)

CREATE OR REPLACE VIEW bbu.basket_location_map AS
SELECT
    bdl.tenant_id,
    bdl.location_id,
    bdl.location_name,
    bdl.location_full_name,
    bdl.basket_count,
    bdl.avg_dwell_days,
    bdl.max_dwell_days,
    -- Get coordinates (Azure preferred, fallback to basic address)
    COALESCE(NULLIF(az.latitude, '')::numeric, NULLIF(addr.latitude, '')::numeric) AS latitude,
    COALESCE(NULLIF(az.longitude, '')::numeric, NULLIF(addr.longitude, '')::numeric) AS longitude,
    COALESCE(az.freeform_address, CONCAT(addr.city, ', ', addr.state)) AS address_display,
    addr.city,
    addr.state
FROM bbu.basket_dwell_by_location bdl
JOIN spatial.locations l ON l.id = bdl.location_id
LEFT JOIN spatial.addresses addr ON addr.id = l.address_id
LEFT JOIN LATERAL (
    SELECT * FROM spatial.azure_addresses WHERE address_id = l.address_id LIMIT 1
) az ON true
WHERE COALESCE(NULLIF(az.latitude, ''), NULLIF(addr.latitude, '')) IS NOT NULL;
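
The COALESCE/NULLIF chain above exists because coordinates are stored as text, so a missing value may be either NULL or an empty string. A small Python equivalent of that fallback, for reference (function name and parameters are illustrative, not from the schema):

```python
from typing import Optional

def pick_coordinate(azure_value: Optional[str], address_value: Optional[str]) -> Optional[float]:
    """Prefer the Azure-geocoded value, fall back to the raw address value.
    Treats both None and '' as missing, like NULLIF(x, '') in the view."""
    for raw in (azure_value, address_value):
        if raw:  # skips None and empty string
            return float(raw)
    return None
```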

5. location_return_performance (Historical Return Rates)

CREATE OR REPLACE VIEW bbu.location_return_performance AS
SELECT
    drs.tenant_id,
    drs.location_id,
    l.name AS location_name,
    l.location_full_name,
    COUNT(DISTINCT drs.delivery_id) AS total_deliveries,
    SUM(drs.total_returnable) AS total_baskets_sent,
    SUM(drs.received_returnable) AS total_baskets_returned,
    ROUND(SUM(drs.received_returnable)::numeric / NULLIF(SUM(drs.total_returnable), 0) * 100, 1) AS return_rate_pct,
    SUM(drs.total_returnable) - SUM(drs.received_returnable) AS outstanding_baskets
FROM transport.delivery_return_summary drs
JOIN spatial.locations l ON l.id = drs.location_id
WHERE drs.total_returnable > 0
GROUP BY drs.tenant_id, drs.location_id, l.name, l.location_full_name;
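
The return-rate expression guards against division by zero with NULLIF, which yields NULL instead of an error when a location has sent nothing. In plain terms (a sketch mirroring the view's arithmetic):

```python
from typing import Optional

def return_rate_pct(returned: int, sent: int) -> Optional[float]:
    """Percentage of returnable baskets that came back.
    Returns None when nothing was sent, matching NULLIF(SUM(...), 0)."""
    if sent == 0:
        return None
    return round(returned / sent * 100, 1)
```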

Critical Files to Modify

  1. engines/bbu/src/Acsis.Dynaplex.Engines.Bbu.Database/Analytics/SupersetAnalyticsViewsSql.cs

    • Add 5 new view constants (basket_dwell_by_location, basket_dwell_detail, basket_dwell_by_timeframe, basket_location_map, location_return_performance)
    • Update CreateScripts list
    • Update DropSupersetAnalyticsViews with DROP statements
  2. Migration via /generate-migration

    • Context: BBU
    • Name: AddBasketDwellAnalyticsViews
  3. Superset Configuration (if needed)

    • Add MAPBOX_API_KEY to projects/bbu-rfid/resources/superset/superset_config.py
  4. Superset Datasets & Charts

    • Create 5 datasets via API
    • Create ~8-10 charts
    • Create "Basket Location Tracking" dashboard

Map Visualization (deck.gl with Mapbox)

Configuration Required:

# In superset_config.py
MAPBOX_API_KEY = os.getenv("MAPBOX_API_KEY", "")

Chart Definition:

  • viz_type: deck_scatter (circles) or deck_screengrid (heatmap)
  • Dataset: basket_location_map
  • Longitude column: longitude
  • Latitude column: latitude
  • Point size: based on basket_count
  • Color: gradient based on avg_dwell_days
  • Tooltip: location name, basket count, avg dwell days
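
The "gradient based on avg_dwell_days" bullet can be prototyped outside Superset to pick sensible thresholds. A hypothetical linear green-to-red ramp (the 60-day saturation point is illustrative, not from the source):

```python
def dwell_color(avg_dwell_days: float, max_days: float = 60.0) -> tuple:
    """Linearly interpolate from green (fresh) to red (stale), clamped at max_days.
    Returns an RGBA tuple as deck.gl-style color arrays expect."""
    t = min(max(avg_dwell_days / max_days, 0.0), 1.0)
    return (int(255 * t), int(255 * (1 - t)), 0, 255)
```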

Testing Queries

-- Verify dwell summary
SELECT * FROM bbu.basket_dwell_by_location ORDER BY basket_count DESC LIMIT 10;

-- Verify time buckets
SELECT * FROM bbu.basket_dwell_by_timeframe;

-- Verify map data has coordinates
SELECT COUNT(*) AS total, COUNT(latitude) AS with_coords FROM bbu.basket_location_map;

-- Verify return performance
SELECT * FROM bbu.location_return_performance ORDER BY return_rate_pct ASC LIMIT 10;

Implementation Steps

  1. Add SQL views to SupersetAnalyticsViewsSql.cs
  2. Run migration: /generate-migration with BBU context
  3. Configure Mapbox (if not already): Add API key to superset_config.py
  4. Create Superset datasets via API for each new view
  5. Create charts (KPIs, bars, pie, map, tables)
  6. Create dashboard combining all charts
  7. Test with real data