If you're familiar with Adobe Analytics (or its newer cousin, Customer Journey Analytics), you probably know that changing a years-old implementation is not for the faint of heart. Most people just go with the “do the best you can with what you have” approach - and honestly, that’s understandable.
But I found myself in a situation where sticking with the legacy setup simply didn’t make sense anymore. The only real way forward was to rework the foundations. So, here’s a sneak peek into how that’s been going - and what I’ve learned along the way.
Have you ever read the Adobe Analytics setup guide? (Or any analytics setup guide, really.) They all start with the same golden rule: “Know what you want to measure” before figuring out how to measure it.
In Amplitude Analytics, this might mean defining a North Star metric. In Adobe Analytics, it’s about creating what’s called a Design Solutions Document (DSD) - basically, a living blueprint that connects business needs to the data layer and defines how each prop, eVar, and custom event is used in the codebase.
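To make that concrete, here’s a rough sketch of what a single DSD entry might look like if you kept it as structured data rather than (or alongside) a spreadsheet. Every field name and sample value below is invented for illustration - your columns will depend on your own governance needs:

```typescript
// Hypothetical shape of one DSD row, kept as structured data.
// All names and values here are invented for illustration.
interface DsdEntry {
  variable: string;          // Adobe Analytics slot, e.g. "eVar12" or "prop3"
  name: string;              // human-readable name shown in reports
  businessQuestion: string;  // the need this answers; if empty, don't collect it
  dataLayerPath: string;     // where the value comes from in the data layer
  exampleValue: string;      // concrete sample to anchor QA and debugging
  owner: string;             // who to ask when the definition drifts
  status: "active" | "deprecated" | "planned";
}

const searchTermEntry: DsdEntry = {
  variable: "eVar12",
  name: "Internal Search Term",
  businessQuestion: "What are visitors searching for on-site?",
  dataLayerPath: "digitalData.search.term",
  exampleValue: "summer tires",
  owner: "web analytics team",
  status: "active",
};
```

The `businessQuestion` field is the one doing the heavy lifting: if you can’t fill it in, that’s your data-minimization signal.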
It’s not glamorous work, but it’s absolutely critical if you want to follow best practices - especially in a GDPR-heavy world where data minimization (“if you don’t need it, don’t collect it”) is an actual legal requirement.
For me, this became the biggest challenge of the year: building a DSD that not only documents the current Adobe Analytics setup but also includes all the improvements and cleanups we’ve discussed so far. And let me tell you, it’s no small feat trying to reverse-engineer why something was implemented a certain way - especially when you’re dealing with a 5-year-old setup that you inherited two years after it was created.
I’ll skip the horror stories, but here’s the main takeaway: never skip the planning and need-assessment stage of an analytics implementation. It might feel tedious up front, but it saves years of confusion, redundant variables, and half-broken tracking logic later on.
Once I had a good handle on what we actually have and what we actually need, it was time to start removing the junk.
I began with the low-hanging fruit - variables with no data, no business purpose, and no realistic future use case. Then I moved on to the trickier stuff: the ones that still worked but didn’t make sense anymore in our current business or technical context.
Bit by bit, I stripped the setup down to what really matters.
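If you want to keep yourself honest during that kind of cleanup, a simple audit pass helps. Here’s a minimal sketch assuming you’ve exported occurrence counts per variable (say, over the last 12 months) into a plain array - the data, thresholds, and names are all made up:

```typescript
// Minimal audit pass over an exported variable-usage report.
// Assumes occurrence counts per variable were pulled into this
// shape from a long-window report; the data below is invented.
interface VariableUsage {
  variable: string;          // e.g. "eVar7"
  occurrences: number;       // hits recorded in the review window
  hasBusinessOwner: boolean; // does anyone still claim this variable?
}

function flagRemovalCandidates(report: VariableUsage[]): string[] {
  return report
    .filter((v) => v.occurrences === 0 || !v.hasBusinessOwner)
    .map((v) => v.variable);
}

const usage: VariableUsage[] = [
  { variable: "eVar7", occurrences: 0, hasBusinessOwner: false },      // dead: low-hanging fruit
  { variable: "prop15", occurrences: 42000, hasBusinessOwner: false }, // works, but nobody needs it
  { variable: "eVar12", occurrences: 98000, hasBusinessOwner: true },  // keep
];

console.log(flagRemovalCandidates(usage)); // ["eVar7", "prop15"]
```

The second case in that list is the tricky one I mentioned: a variable can collect plenty of data and still deserve removal if no one owns the question it answers.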
The result is a much leaner and cleaner Adobe Analytics implementation - easier to manage, easier to maintain, and far more transparent from a data quality standpoint. Sure, it means we’ve drifted even further from the main branch’s setup, but honestly, that’s fine. What we have now actually fits our ecosystem instead of trying to mimic someone else’s.
Up until recently, we had a pretty lazy habit when it came to Adobe Experience Platform Data Collection tags (formerly known as Launch). Whenever a new website popped up, we’d just grab an existing tag, tweak a few lines, and deploy it. The goal was consistency - but in reality, we were just copy-pasting technical debt.
This caused no end of problems. URL path obfuscation that worked for the main domain would completely break on smaller, made-to-order sites. Triggers would fail because different dev teams used different naming conventions or event listeners. One small change in a shared tag could break tracking across multiple unrelated websites overnight.
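To illustrate the first failure mode, here’s an invented version of the kind of shared path-obfuscation logic that bites you. It bakes in the main domain’s URL structure - a numeric ID in the third path segment - so on a site where that segment is a real page name, it silently masks actual data:

```typescript
// Invented example of shared tag logic that assumes one URL structure.
// On the main domain, paths look like /category/product/12345 and the
// numeric ID should be masked before the URL is sent to analytics.
function obfuscatePath(path: string): string {
  const segments = path.split("/").filter(Boolean);
  // Hard-coded assumption: the third segment is always a numeric ID.
  if (segments.length >= 3) {
    segments[2] = "***";
  }
  return "/" + segments.join("/");
}

console.log(obfuscatePath("/tires/summer/12345"));     // "/tires/summer/***"    (intended)
console.log(obfuscatePath("/campaign/summer/signup")); // "/campaign/summer/***" (broken: masked a real page name)
```

Multiply that by a dozen data elements and rules, and you get exactly the cross-site breakage described above.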
So I decided to stop the madness and go with a more modular approach: a different tag for each website.
Now, whenever I hear that a new domain or product site is coming, I start with a lightweight base tag and sit down with the devs to understand the framework and structure. From there, I build a tailored setup - unique data elements and events for that site, with only the core logic shared where it makes sense.
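Here’s one way the “lightweight base plus per-site tailoring” idea might look if you express the tag configuration as plain data - all the names, report suites, and data layer paths here are hypothetical:

```typescript
// Sketch of the "lightweight base tag + per-site tailoring" idea,
// expressed as plain config data. All names here are hypothetical.
interface SiteTagConfig {
  reportSuite: string;                   // where this site's data lands
  dataElements: Record<string, string>;  // data element -> data layer path
  events: string[];                      // events this site actually fires
}

// Core logic shared by every site: only what genuinely applies everywhere.
const baseConfig: Pick<SiteTagConfig, "dataElements" | "events"> = {
  dataElements: { pageName: "digitalData.page.name" },
  events: ["pageView"],
};

// Per-site setup built together with the devs, on top of the base.
function buildSiteConfig(
  reportSuite: string,
  extra: Partial<Pick<SiteTagConfig, "dataElements" | "events">>,
): SiteTagConfig {
  return {
    reportSuite,
    dataElements: { ...baseConfig.dataElements, ...extra.dataElements },
    events: [...baseConfig.events, ...(extra.events ?? [])],
  };
}

const campaignSite = buildSiteConfig("mycompany-campaigns", {
  dataElements: { formName: "digitalData.form.name" },
  events: ["formSubmit"],
});
```

The point of the design is that the shared part stays tiny and boring, so a change to it is rare and deliberate - everything site-specific lives with the site.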
The outcome? Tracking now reflects how each site actually works, there are fewer collisions between sites’ tracking, debugging is easier, and the data that ends up in dashboards is cleaner and more relevant. But even more importantly - I now understand why each variable exists and what it impacts.
Going against the status quo always raises questions - especially when it means tearing apart a “standardized” company-wide setup. It takes time, effort, and a fair bit of explaining to convince people that cleaning up and simplifying doesn’t mean losing data - it means making data useful again.
But the payoff is absolutely worth it. The implementation is now more compliant, more maintainable, and genuinely easier to understand. The business logic behind each variable is clear again, and if we ever decide to move away from Adobe Analytics entirely, the migration will be much smoother. What we have now is cleaner, future-proof, and, most importantly - actually fits how we work in the Baltics.
Sometimes, going back to basics and tracking less is the best thing you can do to make your data truly valuable.