There are 3 things I always see when starting a project with a new client:
A GTM container with serious tracking issues, several versions of important events, no naming conventions, modified by 3 different agencies over the past couple of years,
Zero documentation about how the analytics setup is supposed to work, and
An internal team with no clue how key metrics are tracked and how they relate to one another and to business goals.
I'm sure that in some cases there were good analytics agencies that did the setup based on a detailed measurement plan, but by the time we are brought in that plan is nowhere to be found and things have changed so much since implementation that it wouldn't be relevant anyway.
But sadly, in most cases, there simply is no documentation. If everyone in the company were a GTM and analytics expert then this wouldn't be a big deal; everyone could just look at how the tags and triggers are configured and understand everything they need to know. But even then, a moderately complex tracking implementation can become a maze of custom tags, dataLayer events, JS variables, and triggers that would take way too long to navigate even for those with the technical ability for it.
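To make the maze concrete, here's roughly what a single dataLayer event looks like in such a setup. This is a hypothetical sketch; the event and parameter names are invented for illustration, and a real container will have dozens of these, each with its own triggers and variables hanging off it:

```javascript
// Simulate the GTM dataLayer in a standalone script.
// In a browser this would be window.dataLayer.
const dataLayer = [];

// Hypothetical push for a completed signup; all names are illustrative.
dataLayer.push({
  event: "sign_up_completed", // the event name GTM triggers match on
  plan: "pro",                // custom parameter
  value: 49                   // custom parameter
});
```

Without documentation, the only way to learn what "sign_up_completed" means, where it fires, and which reports depend on it is to reverse-engineer the container.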
And what is the value of a $10k analytics setup that nobody understands? It's pretty close to zero. And if that setup is actually broken and people are making decisions based on bad data, the value is negative.
After seeing this situation so many times and not wanting our own analytics implementations to end up in such a sorry state, at Koalatative we decided to do something about it. We call it a data dictionary, but it's not an entirely new concept. It combines elements of a measurement plan, a goal tree map, and an event reference in the form of a living map of the analytics landscape from conception through implementation and maintenance.
Principles behind this:
Non-technical marketers should be able to easily understand which metrics fire at what time,
There should be progressive complexity so that more technical team members can dig into the details of event definitions and parameters as needed,
Metrics should be mapped to business goals, and have clearly defined relationships between them (think rollup goals and primary vs secondary metrics),
The metrics layer should be independent of the implementation details, defined in advance of implementation and continue to be relevant post-implementation.
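The principles above can be sketched as the shape of a single dictionary entry. This is a hypothetical example, not the actual template schema; the field and event names are ours, invented for illustration. The business-facing fields sit at the top level, and the implementation details are nested so a non-technical reader can stop before reaching them:

```javascript
// Hypothetical shape of one data dictionary entry, as a plain object.
const metric = {
  // Business layer: stable, readable by anyone.
  name: "Completed Signups",
  businessGoal: "Grow paid subscriptions", // what this metric rolls up to
  role: "primary",                         // primary vs secondary metric
  firesWhen: "User lands on the signup confirmation page",

  // Implementation layer: can change without touching the fields above.
  implementation: {
    event: "sign_up_completed",                       // GA4 event name (illustrative)
    parameters: { plan: "string", value: "number" },  // parameter names and types
    trigger: "GTM page view trigger on the signup confirmation page"
  }
};
```

The point of the split is that when the GTM trigger or event name changes, only the nested implementation layer needs updating; the business definition stays valid before, during, and after implementation.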
There's a very good article called Why Most Analytics Efforts Fail that inspired this idea and is definitely worth a read. It goes into more detail on the causes of data issues and provides a spreadsheet event tracking dictionary as a possible solution. But I'm not sold on a spreadsheet being the right format for this type of thing. For one, you need images, and putting images in spreadsheets is just wrong. But mainly, the UX of a spreadsheet doesn't help with the goal of making the data dictionary accessible. And the standalone nature of a sheet is too limiting. We wanted this to be something that stakeholders would actually look at, so it was important to have it be an integrated part of the CRO program management system, which meant that Airtable was the natural choice.
The main benefit of Airtable is being able to organize all the technical details like events and parameters in a logical way without cluttering up the main user-friendly Metrics tab, while still keeping everything linked together so you can follow the trail of relationships between all the moving parts.
There are some more specific benefits of using Airtable for this as well:
The metrics are linked to the experiment records, so when you add a test idea, you select the goals for the test, and then through the entire lifecycle of the experiment anyone can click through and see how those metrics are defined, where they fire, and how to find them in analytics.
The metrics are linked to the pages they are relevant to, so you can either start from the Pages tab and see all the metrics that fire on a particular page, or click through from the Metrics tab to the page record, which includes a link to the page itself.
It's really easy to create new views that filter / group / sort the metrics however you want. For example if you have a couple of different subdomains you can have a main view that groups them all together, and a separate view for each subdomain.
There's an "Other Tools" tab where you can add the metrics that need to be defined separately from your main analytics platform. An example of this would be a testing tool where you want to track the same metrics but they need their own definition, which may be slightly different from how that event is defined in GTM / GA4. You can start from the main Metrics tab, see how it's set up in GA4, and also click through to see how the same metric is defined in your testing tool.
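Sketched out, that "same metric, two definitions" idea looks something like this. Again, this is a hypothetical illustration with invented names, not the template's actual fields:

```javascript
// Hypothetical: one business metric with a per-tool implementation record,
// mirroring the Metrics tab linked to the "Other Tools" tab.
const completedSignups = {
  name: "Completed Signups",
  definitions: {
    // Main analytics platform: an event fired via GTM.
    ga4: {
      event: "sign_up_completed",
      trigger: "GTM page view trigger on the signup confirmation page"
    },
    // Testing tool: the same outcome, defined in that tool's own terms.
    testingTool: {
      goalName: "Signup Confirmation Pageview",
      goalType: "pageview URL match"
    }
  }
};
```

The linked records mean anyone comparing experiment results against GA4 numbers can see exactly where the two definitions differ, instead of discovering the mismatch mid-analysis.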
While Airtable is a solid option, this concept is not tied to any particular tool. That includes tag managers and analytics tools too; it's equally valuable no matter which one you are using, and as long as your analytics platform uses an event / parameter format, it'll work with the same setup.
However you decide to do it, making a data dictionary part of your process will make your life much easier. New hires or partner agencies will be able to instantly get up to speed on the tracking setup, everyone will be speaking the same language, and that language will be business-focused rather than parseltongue (if you don't get what I mean by that then a data dictionary is definitely for you).
If you've read this far you're probably wondering how to put this idea into practice. We want to make this as easy as possible for you, so we've incorporated the exact data dictionary setup we use with our clients into version 2 of our CRO program management Airtable template. A step-by-step guide is in the works to explain all the tabs and columns and walk you through the entire process of setting this up for your own site, so stay tuned!