Objective:
 
A global restaurant chain needed to see the big picture of its sales. Until then, each city generated its own sales report from manually retrieved data. In practice, this meant that business decisions in each city were made independently of the other locations. From a business perspective, this left no room for spotting differences or similarities, or for connecting the dots between locations. Two main needs emerged:
 

  • mapping out offer performance and sales numbers across all locations
  • drawing deeper business insights from the larger picture, not solely from a single location.

 
That is how the need emerged for an automated dashboard showing sales performance across all locations.
 
The main requirements for the solution were:
 

  • Show all transactional data split by location, offer, and period
  • Refresh automatically on a daily basis, with minimal effort
  • Offer a user-friendly design

 
Historically, each city with a restaurant had a sales manager responsible for retrieving all the sales data manually. Each city's report was downloaded separately, and then someone had to join them all together.
 
Implementation challenge:
 
Manually joining all the datasets, or downloading them separately and pulling everything into data analytics software, seemed easy to tackle. An automated solution should be able to join an entire month of data in just a couple of hours.
 
However, this turned out to be a demanding test: the software slowed down heavily, and sometimes froze outright, because of the large amount of data.
 
Solution:
 
First win before implementation: improved performance
 
Before moving on to the actual implementation, a key decision was to pull only a few weeks of data instead of the entire dataset. The audience of this report was interested only in recent data, so loading historical data was not helpful and would have hurt performance. The last four weeks of data were enough to generate a trend chart. This consistently improved the performance of the business intelligence software.
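To illustrate the windowing idea, here is a minimal SQL sketch. The table and column names (sales_transactions, transaction_date, location_id, offer_id, amount) are assumptions for illustration, not the client's actual schema, and the date syntax follows PostgreSQL; other dialects differ.

    -- Aggregate only the most recent four weeks of sales
    -- (hypothetical schema, PostgreSQL date syntax).
    SELECT
        location_id,
        offer_id,
        DATE_TRUNC('week', transaction_date) AS sales_week,
        SUM(amount)                          AS total_sales
    FROM sales_transactions
    -- Restricting the extract to a 28-day window keeps the dataset
    -- small enough for the BI tool to stay responsive.
    WHERE transaction_date >= CURRENT_DATE - INTERVAL '28 days'
    GROUP BY location_id, offer_id, DATE_TRUNC('week', transaction_date);

Because the window is relative to CURRENT_DATE, the same query keeps returning a rolling four weeks as the daily refresh runs.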
 
Actual implementation
 
The first step was data exploration, in order to get a better understanding of what each metric represents.
Additionally, as part of the initial set-up, we looked for the most efficient method of joining the datasets and identified the elements they had in common, which would serve as join keys.
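A simplified sketch of that consolidation step is below, assuming each city's extract lands in its own table with the same layout; the table names are hypothetical.

    -- Stack the per-city extracts into one dataset, tagging each row
    -- with its location (hypothetical table names).
    SELECT 'New York' AS location, transaction_date, offer_id, amount
    FROM sales_new_york
    UNION ALL
    SELECT 'London'   AS location, transaction_date, offer_id, amount
    FROM sales_london;
    -- The combined rows can then be joined to a shared offers table
    -- on the common key (offer_id) identified during exploration.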
We validated each step with the client, who worked closely with us and gave us an overview of how they wanted the final dashboard to look.
The next step was the automation itself: writing the SQL queries, which initially required multiple fixes because the data would not load once published to the servers. Unlike traditional implementations, for the first couple of weeks we compared the manual data against the automated output, just to double-check its accuracy.
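One way such a cross-check can be expressed is sketched below, assuming the manual and automated figures are loaded into two tables; the table names are hypothetical.

    -- Compare weekly totals from the manual extract against the
    -- automated pipeline (assumed table names, for illustration).
    SELECT
        m.sales_week,
        m.total_sales                 AS manual_total,
        a.total_sales                 AS automated_total,
        m.total_sales - a.total_sales AS difference
    FROM manual_weekly_totals m
    JOIN automated_weekly_totals a
      ON a.sales_week = m.sales_week
    WHERE m.total_sales <> a.total_sales;
    -- An empty result means the automated numbers match the manual ones.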
 
Dashboard design
How all this information would be visually structured and represented had to be discussed and agreed upon with the regional managers. The end result is the dashboard the client still uses to this day.
 
 
Results:

  • Time to generate sales insights: down to 1-2 hours per week using the automated dashboard, instead of the 6-8 hours it used to take to load the data manually.
  • Data quality and accuracy: no longer in question, leading to more in-depth insights and better business decisions. Today the client has access to an automated sales dashboard that shows which offer performed best in each location.
  • Immediate access to metrics such as the number of app and on-site orders, or total market sales.
About the author

Cognetik