TV broadcasting networks rely on digital behaviour and video data to make commercial decisions, understand users, and build good products, which is why data of the highest quality is essential for strategic business goals. But digital data is more volatile than other types of data, explains Robert Børlum-Bach, Head of Analytics Architecture at TV 2 Danmark. Robert talked to us about how they succeeded in establishing automated governance of enterprise analytics and marketing data at TV 2 to ensure streamlined data collection and processing. He will be sharing more details on the topic during his session at the Data Innovation Summit 2021.
Hyperight: Hi Robert, it’s our pleasure to welcome you as a speaker at the Data Innovation Summit 2021. As a kick-off, please tell us a bit more about yourself and how you ended up working in data analytics and data governance?
Robert Børlum-Bach: Thank you! I'm very much looking forward to the event.
Data analytics is probably one of the most challenging and fun areas to work in – always evolving, always complex, and always in need of sensemaking and communication.
My analytics journey started within digital marketing – specifically search engine optimization and search engine marketing (often grouped into earned and paid media, with owned as the third).
Working at the time as a consultant, it was (and still is) essential to report and analyse the performance of initiatives and campaigns. How should specific goals and KPIs be attributed to the different channels? How could we create the best media mix and the most bang for the buck? Which data should be collected to support the right decisions?
Answering these questions, the need for data became apparent – and the need for high quality and unbiased data became even more apparent. “If you can show me a digital marketing campaign report without any errors, I’ll give you a beer” – paraphrased quote from my old analytics mentor Steen Rasmussen. I don’t think he’s ever lost a bet.
How to implement a quality framework and method for data collection that is anchored commercially while respecting user privacy became the guiding star for the rest of my career.
Hyperight: Your Data Innovation Summit session will focus on automated governance of enterprise analytics and marketing data at TV 2 Danmark. What were the lessons learnt and, hopefully, the benefits of implementing such a framework?
Robert Børlum-Bach: As a broadcaster, streaming service, and news platform, we rely on digital behaviour and video data for navigating commercially, understanding users, and building good products. Many of the strategic key business indicators are based on the digital data we collect, so high data quality is of high importance.
Digital data is more volatile and “alive” than many other data types. It’s most often collected through the end-client – a browser, an app, or a video platform – and a lot can happen on the data’s journey. New technologies, new possibilities, new restrictions, and user mistrust can all have an impact on how the data is collected.
Time is valuable, and as a team of data specialists, we need to be proactive advisors for the business. That means freeing up time and resources from manual testing so we can stay ahead of external technological and legal changes, as well as the fragmented internal business requirements and the stakeholders consuming the data.
We applied the metaphor of a river to understand the data’s journey from the end-client or source system itself, to the processed data for analysis and activation.
Our challenge was that when there was a discrepancy or outage in the data (e.g., in a report or dashboard), it was difficult to identify where the underlying issue came from. Was it a trend or confounding anomaly (we saw new digital behaviour due to Covid)? A processing or transformation issue? Miscommunication between stakeholders? Or a technical issue in the source system?
Two “gates” were introduced (the dashed lines in the above illustration), one before data processing and one after data processing.
The first “gate” now includes automated testing of the clients and source systems, verifying that we set and send the correct cookies, parameters, and dimensions to our data collection platform. This replaced a heavy dependency on manual testing: we now have 300+ tests emulating user and click flows across platforms.
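To make this concrete, here is a minimal sketch in Python of the kind of check such a test might run on a captured analytics hit. The parameter and cookie names are purely illustrative assumptions, not TV 2's actual schema or tooling:

```python
# Hypothetical sketch: validate that a captured analytics hit contains
# the cookies and parameters we expect to send to the collection platform.
# REQUIRED_PARAMS / REQUIRED_COOKIES are illustrative, not TV 2's schema.

REQUIRED_PARAMS = {"page_name", "user_consent", "platform"}
REQUIRED_COOKIES = {"session_id"}

def validate_hit(params: dict, cookies: dict) -> list:
    """Return a list of human-readable issues; an empty list means the hit passes."""
    issues = []
    for name in sorted(REQUIRED_PARAMS - params.keys()):
        issues.append(f"missing parameter: {name}")
    for name in sorted(REQUIRED_COOKIES - cookies.keys()):
        issues.append(f"missing cookie: {name}")
    if "user_consent" in params and params["user_consent"] not in {"given", "denied"}:
        issues.append("user_consent has an unexpected value")
    return issues

# Example: a hit captured while emulating a click flow
hit_params = {"page_name": "frontpage", "user_consent": "given", "platform": "web"}
hit_cookies = {"session_id": "abc123"}
print(validate_hit(hit_params, hit_cookies))  # → [] (the hit passes)
```

In practice a browser-automation tool would drive the emulated click flow and intercept the outgoing request; the validation step itself stays as simple as the function above.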
If an issue is identified, it’s broadcast to a dedicated Slack channel, ready for action by the analytics specialist.
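As an illustration of how such an alert could be wired up, here is a small standard-library sketch. The webhook URL, test names, and message fields are hypothetical assumptions, not TV 2's actual integration:

```python
import json
from urllib import request

# Hypothetical sketch: broadcast a failed data test to Slack via an
# incoming webhook. All names below are illustrative.

def build_alert(test_name: str, platform: str, detail: str) -> dict:
    """Build a Slack incoming-webhook payload for a failed data test."""
    return {
        "text": f":rotating_light: Data test failed: {test_name}",
        "blocks": [
            {
                "type": "section",
                "text": {
                    "type": "mrkdwn",
                    "text": f"*{test_name}* failed on *{platform}*\n{detail}",
                },
            },
        ],
    }

def post_alert(webhook_url: str, payload: dict) -> None:
    """POST the payload to the Slack webhook (standard library only)."""
    req = request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)

payload = build_alert("frontpage_click_flow", "web", "parameter page_name missing")
print(payload["text"])  # → :rotating_light: Data test failed: frontpage_click_flow
```

Separating payload construction from delivery keeps the alert format testable without a network round trip.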
This, together with governance health checks and anomaly dashboards, makes the process of identifying where the data issue happened and what it consists of much more constructive.
The talk in October will dive deeper into the technical bits and pieces of the implementation and the framework of the automated testing solution.
Hyperight: What are some challenges you came across and how did you overcome them?
Robert Børlum-Bach: Technology and automation only bring value when you have clear processes and ownership of them. The challenge was to identify who’s responsible for the communication and escalation of a data outage, and to ensure there’s an underlying process to support this. You can have a lot of shiny flows and alerts, but if no action is connected and documented, you’re at an impasse. Inspired by the RACI framework, all data products have stakeholders with different ownerships and responsibilities, which supports the data journey.
The above illustration is a very simplified version of the framework we use for documenting roles and responsibilities in the data flow. The “links” can also be seen as the “gates” described above. The upstream flow often includes the escalation process – what happens when a data error is found and the contracts supporting this – while the downstream flow is often the requirements process from the business. Specific case examples of using this framework will also be presented at the talk (ad!).
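A RACI-style ownership map of this kind could be sketched as simple data. All product names, roles, and the escalation logic below are hypothetical illustrations, not TV 2's actual framework:

```python
# Hypothetical RACI-style ownership map for data products.
# Product names and roles are illustrative examples only.

RACI = {
    "streaming_dashboard": {
        "responsible": "analytics_specialist",   # fixes the issue
        "accountable": "head_of_analytics",      # owns the escalation
        "consulted": ["data_engineer"],          # helps diagnose
        "informed": ["marketing_team"],          # consumes the data
    },
}

def escalation_path(product: str) -> list:
    """Who to notify, in order, when a data error is found for a product."""
    entry = RACI[product]
    return [entry["responsible"], entry["accountable"]]

print(escalation_path("streaming_dashboard"))
# → ['analytics_specialist', 'head_of_analytics']
```

Keeping the map as data means the escalation process is documented in one place and can be looked up automatically when an alert fires.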
Hyperight: What are the trends we’ll see in data analytics and governance in the next couple of years?
Robert Børlum-Bach: With technological restrictions from e.g. Apple and legal requirements such as GDPR and ePrivacy, the focus for data collection is shifting from cookies and end-clients to server-side and first-party data. This only creates a greater and more urgent need for clear data governance and data responsibilities. The business needs defined processes for handling data outages – not a nice-to-have, but a need-to-have these days.
Having processes, stakeholder contracts, and governance automations that can support these changes will (looking at the crystal ball) be an important business capability. And hybrid roles spanning data architecture, legal (DPO), and business understanding will be highly sought after.
I hope these answers gave a glimpse of what my talk at DIS 2021 is all about. See you there!