What they don’t teach you in DTM school

This article assumes a solid general tag management knowledge base as well as DTM-specific knowledge. It will help take you from a “by the book” example implementation to the much messier real world.

There are some bonus non-DTM tips in here as well.

1. You cannot set events using both s.events in custom code and in the UI*

Let’s say you are setting s.events in custom code because you need to set events equal to currency amounts rather than just incrementing them, something the UI cannot do.

You might have something like (example order confirmation scenario):

s.events = "event59=" + _satellite.getVar('orderTotalProductDiscount') + ",event60=" + _satellite.getVar('orderLevelProductRevenue') + ",purchase";

Then, you might later come into the UI and add an event in the Events area. This would be a mistake: the custom code will override the UI, and the UI entry will be meaningless. You now have to add ALL events for the given rule in the custom code section.

*Update: you can, but it’s neither simple nor intuitive. See Jenn Kunz’s comment on this post for more.
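As a hedged illustration of the general idea (not necessarily the exact approach from that comment), one pattern is to append to s.events in custom code rather than overwrite it, so events set elsewhere survive:

// Event numbers here are placeholders from the earlier example.
var customEvents = "event59=" + _satellite.getVar('orderTotalProductDiscount') + ",event60=" + _satellite.getVar('orderLevelProductRevenue') + ",purchase";
// Append rather than overwrite, so anything already on s.events survives:
s.events = s.events ? s.events + "," + customEvents : customEvents;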

2. You must avoid special characters in product syntax merchandising eVars to avoid conflict with the product string

Merchandising eVars should not use any characters that are reserved because of their functions in the product string. Prime examples include the semicolon (used to separate the pieces of the product string), the pipe (used to separate merchandising eVars from each other), and the comma (used to end an iteration of the product string). This may seem like common sense, but these sorts of characters are remarkably common in data layers, particularly those serving a dual purpose for Google Analytics. Characters like the pipe are used to separate page hierarchies, for example (Level 1 | Level 2 | Level 3). I try to avoid the following characters to stay out of trouble:

*Avoid the below completely or replace with a space

& ; , + % @ # ! * () {} [] ' " =

*Replace | with /

*Avoid hidden or non-ASCII characters
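If you want to enforce these rules programmatically before a value reaches the product string, a minimal sketch follows (the function name and exact character list are my own; adjust them to your standards):

// Hypothetical helper: sanitize a value before using it in a merchandising eVar.
function sanitizeMerchEvar(value) {
  return String(value)
    .replace(/\|/g, '/')                      // replace pipes with slashes
    .replace(/[&;,+%@#!*(){}\[\]'"=]/g, ' ')  // replace reserved characters with spaces
    .replace(/[^\x20-\x7E]/g, '')             // drop hidden or non-ASCII characters
    .replace(/ +/g, ' ')                      // collapse repeated spaces
    .trim();
}

// Example: sanitizeMerchEvar('Level 1 | Level 2; Sale!') returns "Level 1 / Level 2 Sale"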

3. “Data element changed” has several limitations

To detect a change in a data element, DTM polls periodically for changes, which makes “data element changed” attractive as a rule condition. However, “periodically” is often not fast enough when the change happens while moving from one page to another. Alternatives that pick up changes more quickly are calling _satellite.track and setting up a direct call rule, or using a CSS selector in an event-based rule where it applies, provided you are fairly confident the selector is stable and will not change (keep tabs on it anyway). Both of these will be detected more quickly than “data element changed.”
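As a hedged sketch of the direct call approach (the rule name 'cart-updated' is a placeholder):

// In site code, fire a direct call at the exact moment the value changes -- no polling delay:
_satellite.track('cart-updated');

// In DTM, create a Direct Call Rule whose string matches 'cart-updated';
// any analytics logic attached to that rule then fires immediately.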

The second limitation is that “data element changed” is text-based, not time-based: it looks for a change in the contents of the data element, so if you push the same text to the data element again, it won’t be detected as a change. Let’s say that you do 2 of the same thing in a row, for example adding an item to the cart from a given page. If you have a data element showing the contents of your event array, and you are depending on it to change to trigger add-to-cart-related items, it will only trigger once for the 2 add to carts you did from the page if your data element just picks up “addToCart” each time (or whatever text is in the data element). A way around this is to append an incrementing number to the end of the text string (e.g. addToCart1, addToCart2) so it gets detected as a change. This is somewhat inelegant but gets the job done. I have had to use it when I had a structure I couldn’t modify and insufficient dev hours to make an ideal set of changes, for example.
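A hedged illustration of that counter workaround (the variable names are hypothetical):

// Append an incrementing counter so repeated identical actions still
// register as a "change" in the data element.
window.addToCartCount = (window.addToCartCount || 0) + 1;
window.lastCartAction = 'addToCart' + window.addToCartCount; // addToCart1, addToCart2, ...
// Point your data element at window.lastCartAction.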

4. There are more special characters to avoid in classification uploads than those listed in Adobe’s help documentation

OK, this one is not DTM-specific, but it’s still a “what they don’t tell you.”

*You cannot use single quotes around data in a classification file. You might be tempted to do this to keep leading 0’s from being stripped from a product ID, for example. However, single quotes surrounding cell data will cause issues with your upload, as confirmed first by my own frustration and then by client care. This is not documented anywhere I have seen, but it definitely happens. The solution is to edit the file in something other than Excel (a basic text editor, etc.) so the leading 0’s do not get cut off, or to change your Excel settings accordingly.

*This may be obvious but it happens all the time – avoid using the separator character in the file if at all possible (e.g. tabs within a cell in a tab-separated file). If you must, you can get around this by changing v:2.0 to v:2.1 in cell C1 of the classification template. You can also escape the special characters.

*Using tab (\t), carriage return (\r), new line (\n), double quote ("), caret (^), or pound (#) within a cell is a no-no unless you escape them. The pound symbol within a cell will get your data interpreted as a comment and ignored, which you don’t want, for example. (A conservative pre-upload cleanup is sketched after this list.)

*Something that doesn’t get repeated often enough: the first row under the header must be blank in the file; your data starts on the following row. If you don’t do this, your file won’t always upload.
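Since the exact escaping rules are easy to get wrong, here is a minimal sketch that takes the conservative route of stripping the problem characters instead (the function name is my own):

// Hypothetical pre-upload cleanup for a single classification cell value.
function cleanClassificationCell(value) {
  return String(value)
    .replace(/[\t\r\n]/g, ' ')  // turn tabs, carriage returns, and newlines into spaces
    .replace(/[\^"#]/g, '');    // drop caret, double quote, and pound entirely
}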

5. There are non-publicly documented differences in default allocation for the pages (s.pagename) report between different Adobe products

(Another non-DTM item, but very important to understand.)

  • Reports and Analytics: uses linear allocation
  • Workspace: only shows values set on the same hit where the pageName is set
  • Data Warehouse: uses linear allocation
  • Ad Hoc Analysis: can be set to default, last, or linear allocation

This means your data will very likely not match if you are looking at the Pages report for a given set of events in R&A vs. Workspace.

6. When you set up the Google Analytics tool in DTM, don’t choose “Google Analytics”

Because you might want to go back in time to before October 2012, it’s still possible to use ga.js rather than analytics.js. You really don’t want to, though. If you want a recognizable, modern GA with enhanced eCommerce capabilities, you will want to add the “Google Universal Analytics” tool. If you choose “Google Analytics” by mistake, enjoy your trip back to the early days of web analytics. Perhaps we can also add an option for Urchin, or Site Counter. I digress.

7. Common marketing tags that you would copy/paste as custom JavaScript or HTML in other tools must be rewritten if you want to run them as non-sequential JavaScript

There are many things that need to happen, above and beyond removing the surrounding script tags, for JS-based marketing tags to run successfully as non-sequential JavaScript in DTM. You will have to rewrite many common tags. Jenn Kunz has a great article on the specifics, with some good examples, here. The recent, uh, “launch” of Adobe Launch (the newest Adobe tag manager) offers hope for this scenario and many others mentioned above, and as it continues to evolve I hope to see these scenarios and more handled gracefully.
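As a hedged sketch of the general shape of such a rewrite (the vendor URL is a placeholder): non-sequential JS runs after the page has loaded, so document.write and inline script tags won’t work, and one common pattern is to inject the vendor script programmatically instead.

// Load the vendor's script without document.write or inline <script> tags:
var tag = document.createElement('script');
tag.src = 'https://vendor.example.com/pixel.js'; // placeholder URL
tag.async = true;
document.body.appendChild(tag);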

Testing Checkout Without Using Your Own Credit Card

Don’t use your personal credit card when test credit cards abound!

Proper revenue recording is a critical piece of analytics implementation and testing, but I’m always surprised how few vendors and analysts seem to be aware of the publicly available test credit card numbers that are available for use. I’ve heard of too many instances of people using their own (real!) cards and then cancelling the purchase, or even actually buying things from the site for test purposes. That’s fine if you actually want to make a purchase, but given the number of test scenarios required, it can become an expensive habit quickly.

Payment processors typically accept a standard suite of test credit cards that will allow an order to go through on the site, but that are flagged after the fact as test cards so the order is not fulfilled. If you are doing this on a production version of the site, be forewarned that this can impact your order-related metrics: it can artificially increase them in the analytics platform (provided your IP isn’t excluded in the platform you are validating in) and increase error rates in order management systems, since the order (as intended) won’t actually go through. Certain sites are also configured not to accept test credit cards in their production instance, so you may receive an error; however, the majority of sites will allow these to be used.

With those caveats out of the way, allow me to introduce you to the standard suite of test cards:

https://stripe.com/docs/testing

This is one of the best resources out there and includes many different scenarios and international cards as well.

The testing procedure is as follows:

  • Go through the normal process to add items to your cart and proceed to checkout
  • For the name, it is helpful to use “Test Test” or something similar so it’s clear this isn’t a real order
  • For email address, it’s ideal to use a valid one you have access to, so that you can receive the order confirmation email
  • For shipping/billing address and phone, use clearly phony data that still meets validation criteria (e.g. 123 Main Street in a given city and 212-555-5555 as the phone)
  • For the CVV (3-4 digit code), use any set of numbers
  • For the expiration date, use any future month/year combination

The site will inform you if there is any error confirming the order, but most times, you’ll get through. Make sure to have your debugger and/or console open and recording prior to purchase so you can see the analytics call come through. I’ll be sending this post as a reference next time I hear talk of someone using a personal card for analytics testing!

The Data Layer: A Primer

The data layer is a key part of most modern web analytics implementations, but it seems there are not many resources to explain it to a less technical audience, or perhaps to someone who is technical but new to working with it. Let’s start with the basics:

What is it? In the world of the W3C data layer (more on that in a minute), the data layer is a JSON object.

To break that statement down, W3C is the World Wide Web Consortium. They provide standards for the web that align architecture and design principles so that the web can work well and continue to grow. (This may seem unnecessary because it seems like the internet “just works”, but if you weren’t around for the wild west days where there were vast differences between what different browsers supported across the web, standards are most welcome and necessary.)

JSON is JavaScript Object Notation. JSON is a way to organize and structure data objects in a human-readable way. It answers the question, “What is this item called, and what is its value?” For example, to take a somewhat amusing example from the W3C’s full documentation, how easy would it be (without knowing anything about the data layer) to answer the question of which product we are viewing here?

digitalData.product[n].productInfo = {
  productID: "rog3000",
  productName: "Rogaine",
  description: "Hair Regrowth",
  productURL: "http://site.com/r.html",
  productImage: "http://site.com/rog300_large.png",
  productThumbnail: "http://site.com/rog300_thumb.png",
  manufacturer: "Pharma",
  size: "300ml"
};

Pretty simple to see, right? It’s easy to look at this and see the items and what they represent. Contrast this with something like s.eVar47 = "rog3000" (which is how the productID piece of this would look if it was hard-coded and assigned to Adobe Analytics conversion variable 47), and you can see how much more human-readable this format is.

Why use it? There are many advantages to the data layer; here are a few. The data layer provides a single place to funnel all the data about your website into an organized format that can later be referenced by analytics tools, marketing tags, and more. This helps ensure that you are passing the same data across tools, and gives you the flexibility to assign the product name to, say, variable 37 today but variable 40 tomorrow. (Not that I recommend bouncing around which variable you are using without a very good reason, but that’s one example.)

In the past, with hard-coded implementations, you would need to do things such as specify that it was a particular variable number associated with the product name value, and your development team would have to make that adjustment if things ever changed. Using a data layer also allows you to more easily send data to multiple analytics tools (for example if you have both an Adobe Analytics and a Google Analytics implementation running, they can both reference the same set of objects on the site).
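For instance, here is a minimal sketch of two tools reading the same data layer value (the eVar number and dimension index are placeholders):

// Both tools reference the same data layer object:
var productName = digitalData.product[0].productInfo.productName;

// Adobe Analytics:
s.eVar37 = productName;

// Google Universal Analytics (analytics.js):
ga('set', 'dimension5', productName);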

As websites evolve, tasks like updating a manual variable assignment get more cumbersome and error-prone, and the resulting inconsistencies generally end badly in terms of data quality. The data layer takes all of that out of the picture and provides an easy (and, again, human-readable) reference point.

Using the data layer with a tag management system (Google Tag Manager, Adobe’s Dynamic Tag Manager [DTM], Tealium, Signal, Ensighten, etc.) allows the analytics team to make adjustments like this without waiting on a formal release or involving a development team. It also allows the development team to keep the analytics data flow consistent during major redesigns of the site. The requirements are easier to see and more meaningful, versus less human-readable lines of code that have to be continually explained and defined. There’s plenty of defining that goes on when creating a data layer as well, but the starting point is much more understandable.

How are objects referred to within the data layer? The path to an object is referenced with dot notation. In the below example digitalData object, to refer to the pageType, I would refer to digitalData.page.category.pageType. This is the type of notation that a tag manager would use.

digitalData.page.category = {
  primaryCategory: "FAQ Pages",
  subCategory1: "ProductInfo",
  pageType: "FAQ"
};

Is it always called “digitalData”? There are many names for the data layer. The W3C standard name is digitalData, but tag managers like Tealium use utag_data, you will see dataLayer with Google Tag Manager, etc., plus you may see companies that use their own custom names.

Where can I see the data layer? You can open your browser’s console and enter the name of the data layer (see above for common ones), and you will get back a list of key/value pairs within a structure. Some of the data layer may also be visible in the page source depending on many factors, but the console is the most direct and accurate way to see the current values.

To get back the value of only a specific item, you can enter the dot notation format. In the example a few paragraphs up, if I were to enter digitalData.page.category.pageType in the console, I would get back “FAQ”.
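To make that concrete, against the example object above you could type the following in the console:

digitalData.page.category;          // returns the whole category object
digitalData.page.category.pageType; // returns "FAQ"

// Defensive access, in case part of the path is missing on some pages:
var pageType = digitalData &&
               digitalData.page &&
               digitalData.page.category &&
               digitalData.page.category.pageType;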

How does it fit into the analytics ecosystem? Here is a very basic example of the way the data layer can power an analytics implementation (taking into account the fact that I am not a graphic designer by any stretch of the imagination):

[Figure: digital data layer process flow]

Where do I go from here? Learn more about the specific analytics solutions you are working with and explore how the data layer is set up on your site. This training will vary significantly by tool.

I did not go into JSON arrays since this is a basic intro, but they are very important: they often hold the products being viewed (as one example), which matters for most web analytics solutions. It would be good to go through the whole JSON section on W3Schools.
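For a taste, a hedged sketch of what a W3C-style product array might look like (the second product is made up for illustration):

digitalData.product = [
  { productInfo: { productID: "rog3000", productName: "Rogaine" } },
  { productInfo: { productID: "shm1000", productName: "Shampoo" } }
];

// Access the second product's ID with an array index:
digitalData.product[1].productInfo.productID; // returns "shm1000"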

Hope you are able to take some newfound understanding of the process flow and use it to further your web analytics journey!

Update: if you’ve read this and are still asking the question, “Why can’t I just use CSS selectors for everything?”, please read this colorful response from Jim Gordon’s blog: https://jimalytics.com/implementation/data-layer-or-css-selectors/

“It’s just widgets!”: The Dangers of Oversimplification

You could be a web analytics hero with 100 implementations under your belt, a winner of Kaggle competitions, a creator of your own AI platform, and still be perceived as incompetent because of communication and wording issues.

A few years ago, I was presenting at an event and was speaking with people from many different companies beforehand. I was briefly going into the particulars of the business I was in at that time, and the person I was speaking with cut me off and said, “Yeah, it’s all just widgets. No matter what the industry, everyone is just trying to sell more widgets.” Well…not entirely.

The goal of most for-profit companies is to improve their revenue over time, granted. There might be a number of variations on that theme such as increasing market share, changing the product mix in a desired way, launching in a new geographic area, etc.

There are absolutely also non-profit entities whose true goal may lie within the “awareness” space, simply getting people to perform actions in keeping with public or personal health or read about a condition without a specific financial or donation goal in mind (and, of course, plenty that do have a donation goal in mind required to sustain the organization.)

However, the important thing here (and by “here,” I mean from a web analytics perspective) is that an understanding of general business structure and practices does not equate to a specific group or client’s confidence in your abilities.

The reason you are likely working on a web analytics project is that you have the expertise to do so, or are in the process of getting it, and the group or individual that asked you to do so does not have the time or expertise. The person you are doing it for is (generally) unlikely to evaluate you on your technical skill during a project unless something majorly breaks. So what do they base their perception on?

Often, it’s how accurately you translate their requirements back to them in language that makes sense, and your demeanor in doing so.

Let’s come back to the widgets. If I am in the healthcare field and have hired you to do an implementation for a patient portal, and I refer to the people on this site as “patients”, how will I feel if you say, “OK, you want more sign-up thingies, got it”? The language may not be quite as crude, but you can begin to form an idea of how this might come across. It’s important that the analytics professional reflect the requirements back in the language the business is actually using, for reasons of precision and mutual understanding. A happy consequence is that the group or client feels heard (reflective listening skills) and is more confident in your abilities (because you appear to listen and properly use terms they know).

Worth highlighting again: this all happens regardless of your actual skill level in analytics. You could be a web analytics hero with 100 implementations under your belt, a winner of Kaggle competitions, a creator of your own AI platform, and still be perceived as incompetent because of communication and wording issues. Perception is not based on your actual skill level since there is often no benchmark to compare you to in the world of the people you are assisting.

So: if you’re great at technical analytics or reporting, the way to get people to realize it is through picking up on proper wording and tone for who you’re working with. If you’re still learning a lot about analytics and reporting, the way to gain trust to take on larger projects is to create satisfaction on the ones you’re working on now through good communication skills. There is no downside!

The Analytics Bookmark List I Wish I’d Had When I Started

Save time and hold onto these!

This list is a distillation of helpful resources for bookmarking/saving. You can use this post as part of a new hire onboarding plan, or for yourself to keep organized and up to date on analytics happenings.

Analytics Platform Login Pages

Adobe Analytics Enterprise Login

Adobe Analytics Marketing Cloud Login

Google Analytics Login

Tag Manager Login Pages

Adobe Dynamic Tag Manager (DTM) Login

Google Tag Manager (GTM) Login

Tealium Login

Ensighten Login

Debugging Tools

Adobe Analytics Debugger: Useful for validating basic pageload-level props, eVars, events, products, and simple attributes such as “JavaScript enabled” status. Not as useful for reviewing pages with multiple calls, post-pageload activity, custom links, etc. (Bookmarklet – create a placeholder bookmark and enter the JavaScript code provided at this link as the “URL” of the bookmark and place on your bookmarks bar, then click once you are on a site with an implementation.)

Disruptive Debugger: Useful for validating Adobe DTM data element values and whether or not rules load; will pick up post-pageload activity. Data element values update as they come in – it will not record previous values, just the most current one. (This is a bookmarklet as well – see definition above.)

GTM Debugging Tools: article references multiple helpful tools, some GTM-specific, others more broad.

dataslayer Debugger: cross-platform debugger compatible with DTM, GTM, Tealium, and TagCommander; can monitor any W3C data layer.

WASP (Web Analytics Solution Profiler): WASP inspector is a Google Chrome extension that runs through the Chrome developer tools area. It detects tags across all platforms and offers more insight into the GTM data layer. (WASP profiler is their paid product, but the inspector is free.)

Charles Proxy: useful for debugging complex raw analytics calls and monitoring a continuous stream of activity with recording; can be used with mobile apps for live debugging. Requires a license after a 30-day free trial, and some up-front time to properly set up certificates if using with a mobile device.

Analytics Blogs and News

Blogs

Justin Cutroni (GA, thought leadership)

Avinash Kaushik (GA, thought leadership)

Simo Ahava (GTM)

Daniel Carlbom (GTM, GA)

Web Analytics Demystified (cross-platform strategy)

Jenn Kunz (Adobe DTM and data layer strategy)

Adobe blog posts related to analytics

Release Notes

Adobe Analytics/Experience Cloud Release Notes

Google Analytics Release Notes

Industry News (broader landscape)

Search Engine Watch (major marketing channels and analytics)

CMO.com (by Adobe)

eMarketer (research/benchmarking)

Econsultancy (market research)

Information Week (IT/big data)

CIO (business systems and trends)

Professional Associations

Digital Analytics Association

Virtues of Web Analytics: Persistence

Most quality solutions require first going through an amount of frustration that would cause many people to simply give up. Keep persisting, and by doing so, you’ll open the door to greater adventures.

For the intro to the Virtues of Web Analytics series, see the first post.

There are 2 parts of analytics that immediately come to mind when I think of persistence: getting the right people to care about the trends and tools that are important to you, and getting to the bottom of technical or analytical problems.

Neither one is for the faint of heart.

Part I: Getting the right people to care about the trends and tools that are important to you

Many people outside analytics view the entire field as a numbers game and analysts as sort of glorified bean-counters. Another view is that analytics teams speak in a sort of techno-babble that is not worth taking the time to understand, and doesn’t pertain to what other teams are doing. Others have no concept of the field or know just enough to be dangerous. A select few have previously done this work or are very knowledgeable about it. Our task here is to make connections with the rest of the business, regardless of their level of knowledge, in order to overcome misconceptions. One of the easiest ways to do this is to come to the table with a solution or an opportunity you’ve noticed that will directly benefit the group you are speaking with.

To arrive at what might help you connect with a group, you first have to know what is important to them. You might first have a meeting (or devote the initial part of a meeting) to understanding the goals and challenges of the group you’re connecting with, and devote the second meeting/part of the meeting to how your skills and tools can help give the group insight on the needs they express. However, this is not a “one and done” type of relationship. It’s important to keep up to date on how the group evolves their goals, and how they are using (or not using) what you bring to the table. A monthly or quarterly check-in is not a bad idea if you are trying to establish an ongoing relationship, even within a single company. This is an especially good idea if you are proposing sharing the cost of a tool that will benefit another group in addition to your own, or you are trying to get a group to adopt your analytical recommendations.

If you do have a proposal that involves someone else spending money on analytics tools within your company, build trust several months ahead of that ask; leading with it sours the relationship and doesn’t allow you to prove the value of what you do outside of that context. (In the event that you’re questioning the value of what you do: if you don’t trust your own skills or the quality of the analytics practice where you work, what can you do to change that? There are free and paid courses, professional associations, conferences, books, articles, podcasts, and more at your disposal. That’s a whole ’nother post.)

Part 2: Getting to the bottom of technical or analytical problems

The first section deals mainly with persistence in your interactions with others. This section is all about you. Any mildly experienced analytics pro regularly deals with difficult technical or analysis requests outside the neatly packaged world of ideal KPIs. This is the proving ground on which you can differentiate yourself, through both your actual solution and the way you act while you work through it.

Most quality solutions require first going through an amount of frustration that would cause many people to simply give up. Knowing at the beginning of the game that you will get to this point is helpful. Step back, breathe, take a walk, do something, but don’t let yourself get caught in a cycle of frustration and anger at yourself or the situation. There are many opportunities to practice mindfulness in this field. (Potentially also a whole ‘nother post.) Keep going through this cycle, which often feels like throwing yourself at a wall, over and over, until you start to happen upon something that works (and learn to laugh at yourself.)

Learning many of the major analytics tools in the first place requires a certain amount of “grit” and pushing through feeling less than intelligent at times. If you haven’t started on that journey yet, keep that in mind – don’t let it prevent you from starting, just remember it when you experience frustration, knowing that we all have.

You’ll notice that none of these ideas are quick fixes. Keep persisting, and by doing so, you’ll open the door to greater adventures.

Virtues of Web Analytics: Humility

Since we all can’t be up to date on everything all the time, there is huge benefit in listening to alternative perspectives to your own.

As with any field, analytics has a set of characteristics that can help a person develop into an increasingly better version of themselves. While these characteristics overlap with “virtues” you’ll see in other contexts, these posts are meant to highlight their specific relationship to professional development in this career field.

There seem to be 2 extremes that are too easy to fall into – the first being a self-deprecating beginner sort of mindset where one is unsure about everything and constantly second-guessing solutions, and the other being an overly self-assured mindset, coming either from lots of experience or from being a confident beginner who isn’t yet aware of the complexities of the field (for more on this, see the Dunning-Kruger effect). With sufficiently diverse experience, it’s easy to feel that you’ve “seen it all.” In many cases, you have seen a good slice of things, but you haven’t worked on every aspect of an implementation every day – that is simply not possible. No matter what your level of experience, the web analytics stack that surrounds you is constantly evolving, and it’s possible to lose sight of some of the basics or certain aspects of a platform as you narrow your focus for various projects.

This necessitates walking a line where you can be confident in your recommendations, yet open to new information and listening to differing opinions. Since we all can’t be up to date on everything all the time, there is huge benefit in listening to alternative perspectives to your own. That doesn’t mean they’re automatically valid, but allowing the space for them to be aired is supremely valuable, not least of all because it allows you to hone your active listening skills and refine how you can eloquently navigate a set of varied opinions to arrive at a definite project plan.

It can be hard to find exactly where this line is, and there are bound to be times when you veer a bit too far to one side or the other of the overconfidence/humility divide, but the process of figuring it out is key to continued personal progress. I certainly don’t have everything figured out, but I’ve seen this come up as a recurring theme that can have a big impact. Onward and upward!

The Precision/Exploration Continuum

First rule of analytics: you will have to make decisions with imperfect and incomplete information.

In the web analytics field, there is a constant tension between mathematical rigor and the “eureka” moment that comes unexpectedly – the hunch that leads down a path of many more questions that need to be explored.

Despite all the (necessary and laudable) effort put into precise implementation of the tools and rigorous statistical analysis of A/B testing, succeeding in this space requires both the ability to make the leap to a concrete suggestion from imperfect data and the judgment to know when the information you have is so incomplete as to be inadequate for the task.

In this vein, it can actually be a disadvantage in some ways to come from a very mathematically-focused education, since in the world of numbers it is possible to reach a state of perfect or near-perfect proof of something. In the real, messy, day-to-day world of business, however, many complicating factors are likely to arise, and “making all the data perfect” is unlikely to happen anywhere – there is nearly always room for improvement, or past issues that need to be accounted for. That’s the nature of placing analytics on top of live web data, which is prone to complications at the server, internet provider, browser, individual machine, site platform, back-end database, and a number of additional levels, not to mention implementations changing hands over time and human error.

To navigate this, I suggest:

  1.  Spend the first few years in rigorous study of the major analytics platforms, learning the “ideal” state.
  2.  Get in a place where you can look at many different implementations, to see how things play out in the real world across different scenarios.
  3.  Throughout, follow your hunches and support them with what data you can; when you just “know”, use that as fodder for an A/B test or similar that allows you to fully prove out your theory.
  4.  Don’t be paralyzed by too many small details of an implementation – break the problems or suggestions into major buckets.

Of course, it may not be possible to order your career in precisely this way, but the general idea is that in order to be nimble at the speed the business is nimble, you will have to make decisions with imperfect and incomplete information. The best way to develop your analytics intuition is spending time building a “vocabulary” of scenarios in your mind so that when you encounter a new one, you have a palette to paint from.

Happy analyzing!