Early Access - The Dcycle CLI is currently available for enterprise customers. Contact us to learn more about access.

Logistics Data Automation

This guide shows transport companies and fleet operators how to automate daily uploads of trip data (viajes) and fuel consumption records (consumos) to Dcycle.

The Daily Upload Challenge

Fleet operations generate data continuously:
| Data Source               | Volume              | Frequency           |
|---------------------------|---------------------|---------------------|
| Trip records (viajes)     | 100-10,000+ per day | Real-time or daily  |
| Fuel recharges (consumos) | 50-500+ per day     | Daily               |
| Odometer readings         | Per vehicle         | With each recharge  |
Manual uploads quickly become unsustainable. Automation ensures:
  • Consistency: Same process every day
  • Timeliness: Data available for reporting
  • Accuracy: No human transcription errors

Quick Start: Daily Upload Script

#!/bin/bash
# daily_logistics_upload.sh
# Run this script daily, typically in the early morning

set -e

# Configuration
DATA_DIR="/data/exports/dcycle"
LOG_FILE="/var/log/dcycle/daily_upload.log"
DATE=$(date -d "yesterday" +%Y-%m-%d)

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

log "Starting daily upload for $DATE"

# Upload transport requests (viajes)
VIAJES_FILE="$DATA_DIR/viajes_$DATE.csv"
if [ -f "$VIAJES_FILE" ]; then
    log "Uploading viajes: $VIAJES_FILE"
    if dc logistics upload "$VIAJES_FILE" --type requests --yes; then
        log "✓ Viajes uploaded successfully"
        VIAJES_COUNT=$(wc -l < "$VIAJES_FILE")
        log "  Records: $((VIAJES_COUNT - 1))"  # Subtract header
    else
        log "✗ Viajes upload failed"
        exit 1
    fi
else
    log "⚠ No viajes file found for $DATE"
fi

# Upload fuel recharges (consumos)
CONSUMOS_FILE="$DATA_DIR/consumos_$DATE.csv"
if [ -f "$CONSUMOS_FILE" ]; then
    log "Uploading consumos: $CONSUMOS_FILE"
    if dc logistics upload "$CONSUMOS_FILE" --type recharges --yes; then
        log "✓ Consumos uploaded successfully"
        CONSUMOS_COUNT=$(wc -l < "$CONSUMOS_FILE")
        log "  Records: $((CONSUMOS_COUNT - 1))"
    else
        log "✗ Consumos upload failed"
        exit 1
    fi
else
    log "⚠ No consumos file found for $DATE"
fi

log "Daily upload complete"
Schedule with cron:
# Run at 5 AM every day
0 5 * * * /scripts/daily_logistics_upload.sh
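Cron runs with a minimal environment, so if dc is installed outside the default PATH, declare it in the crontab and redirect output to catch failures that occur before the script's own logging starts (paths shown are illustrative):
# Make sure cron can find the dc binary; capture anything cron itself sees
PATH=/usr/local/bin:/usr/bin:/bin
0 5 * * * /scripts/daily_logistics_upload.sh >> /var/log/dcycle/cron.log 2>&1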

CSV File Formats

Transport Requests (viajes)

date,vehicle_plate,origin,destination,distance_km,weight_kg,toc,client
2024-01-15,1234 ABC,Madrid,Barcelona,620,15000,general_cargo,ACME Corp
2024-01-15,1234 ABC,Barcelona,Valencia,350,12000,refrigerated,FreshFoods
2024-01-15,5678 DEF,Valencia,Madrid,350,8000,general_cargo,ACME Corp
2024-01-16,5678 DEF,Madrid,Sevilla,530,10000,bulk,BuildCo
| Column        | Required | Description               |
|---------------|----------|---------------------------|
| date          | Yes      | Trip date (YYYY-MM-DD)    |
| vehicle_plate | Yes      | Vehicle license plate     |
| origin        | Yes      | Starting location         |
| destination   | Yes      | End location              |
| distance_km   | No       | Distance in kilometers    |
| weight_kg     | No       | Cargo weight in kilograms |
| toc           | No       | Type of cargo             |
| client        | No       | Client identifier         |
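Fields that contain commas (client names, for example) must be wrapped in double quotes, per standard CSV conventions. A quick pre-flight sketch that flags rows with the wrong field count (it assumes no quoted fields with embedded commas; adjust the expected count if you omit optional columns):
# Flag any row that does not have exactly 8 comma-separated fields
awk -F',' 'NF != 8 { printf "line %d: expected 8 fields, found %d\n", NR, NF }' "$VIAJES_FILE"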

Fuel Recharges (consumos)

date,vehicle_plate,fuel_type,quantity,odometer,station
2024-01-15,1234 ABC,diesel,85.5,125000,Repsol Madrid Sur
2024-01-15,5678 DEF,diesel,120.0,89500,Cepsa Barcelona
2024-01-16,1234 ABC,diesel,92.0,125620,BP Valencia
| Column        | Required | Description                      |
|---------------|----------|----------------------------------|
| date          | Yes      | Refueling date (YYYY-MM-DD)      |
| vehicle_plate | Yes      | Vehicle license plate            |
| fuel_type     | Yes      | diesel, gasoline, electric, etc. |
| quantity      | Yes      | Liters (or kWh for electric)     |
| odometer      | No       | Odometer reading                 |
| station       | No       | Fuel station name                |
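Odometer readings also let you sanity-check consumption between refuels. In the sample above, 1234 ABC covered 125,620 - 125,000 = 620 km and then took on 92.0 L, roughly 14.8 L/100 km. A small awk sketch that computes this per vehicle (it assumes files are sorted by date and contain no quoted fields):
# Full-to-full estimate: fuel added at each stop covers the distance since the previous stop
awk -F',' 'FNR > 1 && $5 != "" {
    if (last[$2] && $5 > last[$2])
        printf "%s: %.1f L/100km over %d km\n", $2, $4 * 100 / ($5 - last[$2]), $5 - last[$2]
    last[$2] = $5
}' consumos_*.csv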

Integration with TMS/ERP

Export from TMS

Most Transport Management Systems can export to CSV. Set up automated exports:
#!/bin/bash
# export_from_tms.sh
# Example: Export via API from your TMS

DATE=$(date -d "yesterday" +%Y-%m-%d)
OUTPUT_DIR="/data/exports/dcycle"

# Example: Fetch from TMS API
curl -s "https://your-tms.com/api/trips?date=$DATE" \
    -H "Authorization: Bearer $TMS_API_KEY" \
    | jq -r '
        ["date","vehicle_plate","origin","destination","distance_km","weight_kg","toc","client"],
        (.[] | [.date, .vehicle, .from, .to, .distance, .weight, .cargo_type, .customer])
        | @csv
    ' > "$OUTPUT_DIR/viajes_$DATE.csv"

curl -s "https://your-tms.com/api/fuel?date=$DATE" \
    -H "Authorization: Bearer $TMS_API_KEY" \
    | jq -r '
        ["date","vehicle_plate","fuel_type","quantity","odometer","station"],
        (.[] | [.date, .vehicle, .fuel, .liters, .odometer, .station])
        | @csv
    ' > "$OUTPUT_DIR/consumos_$DATE.csv"

echo "Exported data for $DATE"

Database Export

If your data is in a database:
#!/bin/bash
# export_from_db.sh

DATE=$(date -d "yesterday" +%Y-%m-%d)
OUTPUT_DIR="/data/exports/dcycle"

# Export viajes
psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" -c "\copy (
    SELECT
        trip_date as date,
        vehicle_plate,
        origin_city as origin,
        dest_city as destination,
        distance_km,
        cargo_weight_kg as weight_kg,
        cargo_type as toc,
        client_name as client
    FROM trips
    WHERE trip_date = '$DATE'
) TO '$OUTPUT_DIR/viajes_$DATE.csv' WITH CSV HEADER"

# Export consumos
psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" -c "\copy (
    SELECT
        refuel_date as date,
        vehicle_plate,
        fuel_type,
        liters as quantity,
        odometer,
        station_name as station
    FROM fuel_logs
    WHERE refuel_date = '$DATE'
) TO '$OUTPUT_DIR/consumos_$DATE.csv' WITH CSV HEADER"
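To keep the database password out of environment variables, psql can also read credentials from a ~/.pgpass file, which must have mode 0600 or psql ignores it. The entry format is fixed; the values below are placeholders:
# ~/.pgpass - one line per connection: hostname:port:database:username:password
db.internal:5432:fleet:dcycle_export:your-password-here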

Complete Pipeline

A production-ready pipeline with all best practices:
#!/bin/bash
# logistics_pipeline.sh
# Complete daily logistics data pipeline

set -euo pipefail

# Configuration
CONFIG_DIR="/etc/dcycle"
DATA_DIR="/data/exports/dcycle"
LOG_DIR="/var/log/dcycle"
ARCHIVE_DIR="/data/archives/dcycle"
DATE=${1:-$(date -d "yesterday" +%Y-%m-%d)}

# Load environment
source "$CONFIG_DIR/credentials.env"

# Setup logging
LOG_FILE="$LOG_DIR/pipeline_$DATE.log"
exec > >(tee -a "$LOG_FILE") 2>&1

log() {
    echo "[$(date '+%H:%M:%S')] $1"
}

error_exit() {
    log "ERROR: $1"
    # Send alert
    curl -s -X POST "$SLACK_WEBHOOK" \
        -H 'Content-Type: application/json' \
        -d "{\"text\": \"❌ Logistics pipeline failed: $1\"}" || true
    exit 1
}

# Trap errors
trap 'error_exit "Script failed at line $LINENO"' ERR

log "═══════════════════════════════════════════════════════"
log "  LOGISTICS DATA PIPELINE - $DATE"
log "═══════════════════════════════════════════════════════"

# Step 1: Export from source systems
log ""
log "Step 1: Exporting from TMS..."
/scripts/export_from_tms.sh "$DATE" || error_exit "TMS export failed"

# Step 2: Validate data
log ""
log "Step 2: Validating data..."

VIAJES_FILE="$DATA_DIR/viajes_$DATE.csv"
CONSUMOS_FILE="$DATA_DIR/consumos_$DATE.csv"

if [ ! -f "$VIAJES_FILE" ]; then
    error_exit "Viajes file not found: $VIAJES_FILE"
fi

# Basic validation
VIAJES_LINES=$(wc -l < "$VIAJES_FILE")
if [ "$VIAJES_LINES" -lt 2 ]; then
    log "⚠ Warning: Viajes file has no data rows"
fi

# Check for required columns
if ! head -1 "$VIAJES_FILE" | grep -q "date"; then
    error_exit "Viajes file missing 'date' column"
fi

log "  ✓ Viajes: $((VIAJES_LINES - 1)) records"

if [ -f "$CONSUMOS_FILE" ]; then
    CONSUMOS_LINES=$(wc -l < "$CONSUMOS_FILE")
    log "  ✓ Consumos: $((CONSUMOS_LINES - 1)) records"
else
    log "  ⚠ No consumos file for $DATE"
fi

# Step 3: Upload to Dcycle
log ""
log "Step 3: Uploading to Dcycle..."

dc logistics upload "$VIAJES_FILE" --type requests --yes
log "  ✓ Viajes uploaded"

if [ -f "$CONSUMOS_FILE" ]; then
    dc logistics upload "$CONSUMOS_FILE" --type recharges --yes
    log "  ✓ Consumos uploaded"
fi

# Step 4: Verify upload
log ""
log "Step 4: Verifying upload..."

UPLOADED=$(dc logistics requests list --from "$DATE" --to "$DATE" --format json | jq length)
log "  Requests in Dcycle for $DATE: $UPLOADED"

if [ "$UPLOADED" -lt "$((VIAJES_LINES - 1))" ]; then
    log "  ⚠ Warning: Uploaded count ($UPLOADED) < file count ($((VIAJES_LINES - 1)))"
fi

# Step 5: Archive processed files
log ""
log "Step 5: Archiving files..."

mkdir -p "$ARCHIVE_DIR/$(date -d "$DATE" +%Y/%m)"
mv "$VIAJES_FILE" "$ARCHIVE_DIR/$(date -d "$DATE" +%Y/%m)/" || true
mv "$CONSUMOS_FILE" "$ARCHIVE_DIR/$(date -d "$DATE" +%Y/%m)/" 2>/dev/null || true

log "  ✓ Files archived"

# Step 6: Send success notification
log ""
log "Step 6: Sending notification..."

curl -s -X POST "$SLACK_WEBHOOK" \
    -H 'Content-Type: application/json' \
    -d "{
        \"blocks\": [
            {
                \"type\": \"section\",
                \"text\": {
                    \"type\": \"mrkdwn\",
                    \"text\": \"✅ *Logistics data uploaded for $DATE*\"
                }
            },
            {
                \"type\": \"section\",
                \"fields\": [
                    {\"type\": \"mrkdwn\", \"text\": \"*Viajes:* $((VIAJES_LINES - 1))\"},
                    {\"type\": \"mrkdwn\", \"text\": \"*Consumos:* ${CONSUMOS_LINES:-0}\"}
                ]
            }
        ]
    }" || true

log ""
log "═══════════════════════════════════════════════════════"
log "  PIPELINE COMPLETE"
log "═══════════════════════════════════════════════════════"

Handling Common Issues

Missing Vehicles

If uploads fail because vehicles aren’t registered:
# Check for unknown vehicles before upload
UNKNOWN=$(dc logistics upload "$VIAJES_FILE" --type requests --dry-run 2>&1 | grep "unknown vehicle" || true)

if [ -n "$UNKNOWN" ]; then
    log "Found unknown vehicles, registering..."
    # Extract the plate from each message; adjust the sed pattern to match
    # the CLI's actual output format (assumed here: "... unknown vehicle: 1234 ABC")
    echo "$UNKNOWN" | sed 's/.*unknown vehicle:[[:space:]]*//' | sort -u | while read -r plate; do
        dc vehicle create --plate "$plate" --type truck --fuel diesel --yes
    done
fi

Retry on Failure

upload_with_retry() {
    local file=$1
    local type=$2
    local attempts=0
    local max_attempts=3

    while [ $attempts -lt $max_attempts ]; do
        attempts=$((attempts + 1))  # safer than ((attempts++)), which exits a set -e script when attempts is 0
        log "Upload attempt $attempts/$max_attempts: $file"

        if dc logistics upload "$file" --type "$type" --yes 2>&1; then
            return 0
        fi

        if [ $attempts -lt $max_attempts ]; then
            log "  Failed, waiting 60s before retry..."
            sleep 60
        fi
    done

    return 1
}
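The direct upload calls in the daily script can then be swapped for the retrying version, for example:
upload_with_retry "$VIAJES_FILE" requests || exit 1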

Incremental Uploads

For real-time or hourly uploads, track what’s been uploaded:
# Track last uploaded timestamp
LAST_UPLOAD_FILE="/var/run/dcycle_last_upload"
LAST_UPLOAD=$(cat "$LAST_UPLOAD_FILE" 2>/dev/null || echo "1970-01-01T00:00:00")

# Capture the cutoff before exporting, so rows created while this
# script runs are picked up by the next run instead of being skipped
NOW=$(date -Iseconds)

# Export only records created since the last run
psql -c "
    SELECT * FROM trips
    WHERE created_at > '$LAST_UPLOAD'
      AND created_at <= '$NOW'
    ORDER BY created_at
" --csv > /tmp/new_trips.csv

# Upload if there's new data (more than just the header row)
if [ "$(wc -l < /tmp/new_trips.csv)" -gt 1 ]; then
    dc logistics upload /tmp/new_trips.csv --type requests --yes
    echo "$NOW" > "$LAST_UPLOAD_FILE"
fi
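If this runs hourly, guard against overlapping executions with flock from util-linux (a sketch; the lock path is arbitrary):
# Take an exclusive lock, or skip this run if a previous one is still going
exec 200>/var/run/dcycle_incremental.lock
flock -n 200 || { echo "Previous run still in progress, skipping"; exit 0; }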

Monitoring Dashboard

Create a simple monitoring view:
#!/bin/bash
# logistics_status.sh - Check upload status

echo "╔════════════════════════════════════════════════════════════╗"
echo "║           LOGISTICS UPLOAD STATUS                          ║"
echo "╚════════════════════════════════════════════════════════════╝"
echo ""

# Last 7 days
for i in {0..6}; do
    DATE=$(date -d "$i days ago" +%Y-%m-%d)
    DAY_NAME=$(date -d "$i days ago" +%A)

    # Get counts from Dcycle
    REQUESTS=$(dc logistics requests list --from "$DATE" --to "$DATE" --format json 2>/dev/null | jq length 2>/dev/null)
    RECHARGES=$(dc logistics recharges list --from "$DATE" --to "$DATE" --format json 2>/dev/null | jq length 2>/dev/null)
    REQUESTS=${REQUESTS:-?}   # empty output (e.g. a CLI error) shows as "?"
    RECHARGES=${RECHARGES:-?}

    # Status indicator
    if [ "$REQUESTS" = "?" ] || [ "$REQUESTS" = "0" ]; then
        STATUS="⚠️ "
    else
        STATUS="✅"
    fi

    printf "%s %-10s  %s  Requests: %4s  Recharges: %4s\n" "$STATUS" "$DAY_NAME" "$DATE" "$REQUESTS" "$RECHARGES"
done

Best Practices

Archive Processed Files

Move uploaded files to an archive directory and keep them for audit trails and reprocessing if needed.

Idempotent Uploads

Include unique identifiers in your data to handle duplicate uploads gracefully.
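At the pipeline level, the archive step doubles as a cheap idempotence check: if the day's file is already archived, that date has been processed and the run can be skipped. A sketch using the pipeline's own paths and log helper:
ARCHIVED="$ARCHIVE_DIR/$(date -d "$DATE" +%Y/%m)/viajes_$DATE.csv"
if [ -f "$ARCHIVED" ]; then
    log "viajes_$DATE.csv already processed, skipping"
    exit 0
fi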

Monitor Gaps

Set up alerts for days with zero uploads or unusually low counts.
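A minimal version of such an alert reuses the list command and Slack webhook already used by the pipeline (a sketch; tune the threshold to your fleet's volume):
#!/bin/bash
# alert_on_gaps.sh - warn if yesterday has no uploaded requests
DATE=$(date -d "yesterday" +%Y-%m-%d)
COUNT=$(dc logistics requests list --from "$DATE" --to "$DATE" --format json | jq length)

if [ "${COUNT:-0}" -eq 0 ]; then
    curl -s -X POST "$SLACK_WEBHOOK" \
        -H 'Content-Type: application/json' \
        -d "{\"text\": \"⚠️ No logistics requests uploaded for $DATE\"}"
fi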

Validate Before Upload

Check required fields and data formats before sending to Dcycle.
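A reusable pre-flight function along these lines can run before each upload (a sketch; extend the checks to whatever your data needs, and note the row check assumes no quoted fields with embedded commas):
validate_csv() {
    local file=$1; shift
    [ -f "$file" ] || { echo "Missing file: $file" >&2; return 1; }
    # Each remaining argument is a column name that must appear in the header
    for col in "$@"; do
        head -1 "$file" | grep -q "$col" || { echo "Missing column '$col' in $file" >&2; return 1; }
    done
    # Fail if any data row is missing the leading date field
    awk -F',' 'FNR > 1 && $1 == "" { bad++ } END { exit (bad > 0) }' "$file"
}

validate_csv "$VIAJES_FILE" date vehicle_plate origin destination || exit 1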

Next Steps