Augment

Development of a Smart Dunning System

For Augment
Started in March 2024

Summary

Industry: E-commerce (Subscription-Based)

Challenge:

Augment Eco required a system to optimize the retry process for failed transactions by predicting the most opportune moment for each retry attempt. This meant analyzing extensive customer and transaction data to determine optimal retry timing, improve success rates, and strengthen customer retention.

Description

Solution:

An intelligent dunning system was developed around a predictive model that forecasts the best time to retry failed transactions. The system combines customer-specific data, payment method details, and contextual time information to generate precise retry timing predictions. The supporting architecture covers data processing, model building and deployment, and continuous monitoring and improvement. This infrastructure underpins Augment Eco's goal of providing better e-mobility solutions and ensuring customer satisfaction.
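The kind of model described above can be sketched in PyTorch, the project's model framework. Everything here is illustrative: the class name, feature set, and layer sizes are assumptions, not Augment's actual implementation. The sketch scores each candidate retry hour by predicted success probability, combining a learned embedding of the payment type with cyclically encoded time features and a few customer statistics.

```python
import torch
import torch.nn as nn

class RetryTimingModel(nn.Module):
    """Illustrative sketch: scores the probability that a retry at a
    given hour/weekday succeeds, from payment-method and customer
    features. Names and dimensions are assumptions for this example."""

    def __init__(self, n_payment_types: int = 8, n_hidden: int = 32):
        super().__init__()
        # Learned embedding for the categorical payment type
        self.payment_emb = nn.Embedding(n_payment_types, 4)
        # 4 (payment embedding) + 2 (cyclical hour) + 2 (cyclical
        # weekday) + 3 (customer stats: amount, past success rate,
        # attempt number)
        self.net = nn.Sequential(
            nn.Linear(4 + 2 + 2 + 3, n_hidden),
            nn.ReLU(),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, payment_type, hour, weekday, customer_feats):
        # Encode hour-of-day and day-of-week cyclically, so that
        # 23:00 and 00:00 end up close together in feature space
        hour_feats = torch.stack(
            [torch.sin(2 * torch.pi * hour / 24),
             torch.cos(2 * torch.pi * hour / 24)], dim=-1)
        day_feats = torch.stack(
            [torch.sin(2 * torch.pi * weekday / 7),
             torch.cos(2 * torch.pi * weekday / 7)], dim=-1)
        x = torch.cat(
            [self.payment_emb(payment_type), hour_feats, day_feats,
             customer_feats], dim=-1)
        return torch.sigmoid(self.net(x)).squeeze(-1)

# Score all 24 candidate retry hours for one failed transaction and
# pick the most promising one (untrained weights, so scores are random)
model = RetryTimingModel()
hours = torch.arange(24, dtype=torch.float32)
scores = model(
    payment_type=torch.zeros(24, dtype=torch.long),   # e.g. credit card
    hour=hours,
    weekday=torch.full((24,), 2.0),                   # e.g. Wednesday
    customer_feats=torch.tensor([[49.99, 0.8, 2.0]]).repeat(24, 1),
)
best_hour = int(scores.argmax())
```

In production such a model would be trained on historical retry outcomes; at prediction time, the scheduler would evaluate candidate retry slots and queue the retry at the highest-scoring one.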

Key Features:

  • Personalized Prediction Models: Utilizes customer data such as usage patterns and transaction history, along with region-specific insights to tailor retry strategies.
  • Payment Method Analysis: Considers details such as payment type (credit card, debit card, bank transfer, etc.), processing entity, and gateway response codes to enhance retry success rates.
  • Time-Based Insights: Employs time component extraction for seasonality and rolling window statistics to enhance prediction accuracy.
  • Enhanced Billing Intelligence: Analyzes payment attempt details, amount, and currency to tailor retry strategies.
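The time-based features above can be sketched with Pandas, one of the project's data-processing tools. The attempt log and its column names below are invented for illustration; the point is the two techniques named in the list: time component extraction and a per-customer rolling window statistic.

```python
import pandas as pd

# Toy attempt log; the schema is illustrative, not the real one
attempts = pd.DataFrame({
    "customer_id": [1, 1, 1, 2, 2],
    "attempted_at": pd.to_datetime([
        "2024-03-01 09:15", "2024-03-03 22:40", "2024-03-05 10:05",
        "2024-03-02 14:30", "2024-03-04 08:00"]),
    "succeeded": [0, 0, 1, 1, 1],
})

# Time component extraction for seasonality features
attempts["hour"] = attempts["attempted_at"].dt.hour
attempts["weekday"] = attempts["attempted_at"].dt.dayofweek
attempts["month"] = attempts["attempted_at"].dt.month

# Rolling window statistic: each customer's success rate over their
# last 3 attempts, shifted so the current attempt is excluded
attempts = attempts.sort_values(["customer_id", "attempted_at"])
attempts["past_success_rate"] = (
    attempts.groupby("customer_id")["succeeded"]
    .transform(lambda s: s.shift()
               .rolling(window=3, min_periods=1).mean())
)
```

Shifting before rolling keeps the current attempt's outcome out of its own feature, which avoids target leakage when these columns feed the prediction model.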

Outcome:

The implementation of this smart dunning system is expected to significantly improve the success rate of transaction retries by employing precise, data-driven timing for each retry. This should result in enhanced revenue recovery for Augment Eco.

Technologies Used:

  • Data Processing: Apache Spark, Pandas
  • Model Framework: PyTorch
  • Artifact Storage: Amazon S3
  • CI/CD: GitHub Actions
  • Data Pipelines: Apache Airflow
  • Data and Model Versioning: DVC
  • Data Warehouse: Amazon Redshift
  • Online Feature Store: Amazon DynamoDB
  • Serving Framework: BentoML
  • Model Monitoring: Evidently

Technologies used

Airbyte, Airflow, AWS CDK, AWS Cloudformation, Data Pipelines, Data Science, Kubernetes, Machine Learning, Python, Typescript

Consultants involved

Iván Moreno, Daniel Arroyo