In the telecommunications sector, sending out offers to customers can be very time-critical: some customers, for example, need to be offered an extension of their data flatrate right after they have used up their capacity. Data Reply's client, a multinational telecommunications company, already had a system in place that dealt with these so-called "throttle campaigns" (customers receiving an offer to buy a data bucket for the remainder of the billing month).
The system had two main problems. First, it was very slow. This not only led to offers sometimes being sent out late; it also resulted in erroneous offers being sent to customers whose next billing cycle had already begun.
Second, it was based on a proprietary enterprise tool that cost €700,000 per year in license fees alone. So when Data Reply took over the project of replacing the old system with a state-of-the-art solution, the main goals were clear: the new system had to be fast and cost-efficient. The best approach was an event-driven, near-real-time system based on open-source software.
To maximize sales of the aforementioned data volume extensions, Data Reply had to identify the right kind of trigger information: it did not, for example, make sense to send offers to customers who had not used any of their data allowance yet. So a certain kind of trigger event was needed, in this case something like "the customer has used up 80% of the data volume included in their plan". However, this trigger event was just a start, as on its own it would miss out on some sales potential.
That is why Data Reply considered the overall context: questions like "How many days until the plan renews anyway?" or "How expensive is the customer's plan?" helped generate tailor-made offers. A student on a relatively cheap plan two days before renewal, for example, might be offered 1GB of data, whereas an often-travelling manager on a premium corporate plan might be offered 5GB.
The more context available, the better Data Reply could tailor offers to each customer's needs, making it more likely that customers would take up the options presented to them.
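The combination of a trigger event and contextual attributes can be sketched as a simple decision function. This is a minimal illustration only: the 80% threshold comes from the example above, but the plan tiers, renewal rule, and offer sizes are illustrative assumptions, not the client's actual business rules.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CustomerContext:
    plan_tier: str          # e.g. "basic" or "premium" (illustrative tiers)
    days_until_renewal: int
    usage_ratio: float      # fraction of the included data volume already used

def tailor_offer(ctx: CustomerContext) -> Optional[str]:
    """Return a tailored offer for this customer, or None if no trigger fires."""
    # Trigger event: the customer has used up 80% of the included data volume.
    if ctx.usage_ratio < 0.8:
        return None
    # Context: close to renewal on a cheap plan -> a small top-up is enough.
    if ctx.plan_tier == "basic" and ctx.days_until_renewal <= 2:
        return "1GB top-up"
    # Context: heavy user on a premium plan -> offer a larger bucket.
    if ctx.plan_tier == "premium":
        return "5GB top-up"
    return "2GB top-up"  # assumed default-sized offer for everyone else
```

For instance, `tailor_offer(CustomerContext("basic", 2, 0.85))` yields the small top-up, while the same usage on a premium plan yields the larger one.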
The contextual approach meant combining context data and trigger events from various source systems in a near-real-time decision engine. Data Reply developed the engine as a collection of microservices on top of Kafka and Kubernetes. Some services create the real-time context; others react to specific events.
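The decoupling this architecture buys can be illustrated with a minimal in-process stand-in for the Kafka layer: one handler enriches raw usage events with context and re-publishes them, and another independently reacts to the enriched events. In production each handler would be a separate microservice consuming from a real Kafka topic; the topic names, payload fields, and stubbed context lookup below are illustrative assumptions.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class MiniBus:
    """Toy publish/subscribe bus standing in for Kafka topics."""
    def __init__(self) -> None:
        self.handlers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self.handlers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self.handlers[topic]:
            handler(event)

bus = MiniBus()
offers_sent = []

def enrich(event: dict) -> None:
    # A context-building "service": in reality this would look up plan data;
    # here the lookup is stubbed with a fixed value.
    enriched = {**event, "plan_tier": "premium"}
    bus.publish("usage.enriched", enriched)

def send_offer(event: dict) -> None:
    # An event-reacting "service": fires on the 80%-usage trigger.
    if event["usage_ratio"] >= 0.8:
        offers_sent.append((event["customer_id"], event["plan_tier"]))

bus.subscribe("usage.raw", enrich)
bus.subscribe("usage.enriched", send_offer)
bus.publish("usage.raw", {"customer_id": 42, "usage_ratio": 0.85})
```

Because the two handlers only share topics, either one can be replaced or scaled without touching the other, which is the property the real Kafka-based engine relies on.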
The modular, microservice-based architecture was chosen to encourage the addition of new services over time: the system can evolve from basic triggers and trigger handling to more complex ones, as usage of the system itself enriches the future context. For example, the company can take into account how customers responded to offers made to them in the past.
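One hypothetical way such response history could feed back into the decision context is a cooldown rule, suppressing offers to customers who recently declined a similar one. The function, the 30-day window, and the data shape below are all assumptions for illustration, not the client's actual logic.

```python
from datetime import date, timedelta
from typing import Dict, Optional

def should_offer(customer_id: int,
                 declined_offers: Dict[int, date],
                 today: date,
                 cooldown_days: int = 30) -> bool:
    """Suppress offers to customers who declined a similar one recently.

    declined_offers maps a customer id to the date of their last decline.
    """
    last_declined: Optional[date] = declined_offers.get(customer_id)
    if last_declined is None:
        return True  # no recorded decline: the offer may go out
    return today - last_declined > timedelta(days=cooldown_days)
```

A customer who declined two weeks ago would be skipped, while one who declined two months ago (or never) would receive the offer again.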
The incremental approach at the core of the project paid off quickly. The expensive proprietary solution was fully decommissioned within a year, with no disruption to the throttle campaigns already in place. Thirty of these throttle campaigns are currently active, generating six-figure revenue every day.
The system has been so well received by the client that it sparked ideas about what else could be migrated from legacy implementations to Data Reply's flexible event-driven system. The newest addition is survey campaigns, allowing Data Reply's client to request quick feedback after a variety of trigger events (for example, a customer leaves one of the shops and is immediately sent a link to a survey about their service satisfaction). These surveys can drive business development and ensure that the client stays up to speed with customers' needs and wishes, enabling quick and informed decisions about where best to invest in the future.
Data Reply is the Reply group company offering a broad range of advanced analytics and AI-powered data services. We operate across industries and business functions, enabling our clients to achieve meaningful outcomes through the effective use of data. We have strong competences in Big Data Engineering, Data Science and IPA; we build Big Data platforms and implement ML and AI models in a manner that is repeatable, efficient, scalable, simple and yet secure. We support companies in combinatorial optimization processes with Quantum Computing techniques that enable engines with high computational performance.