...or the lifestyle of the top-level athlete.
Previously:
In our two previous articles, we discussed how to automate the generation of Tagging Plans with Data On Duty, and how to validate the automated deployment of tagging on pre-production and production web environments.
We will now tackle the monitoring of the site in the production environment.
It is during monitoring that we will define the Alerting & Notification process, which will be the subject of a separate article.
Why is monitoring essential?
Data quality monitoring ensures that the information collected is accurate, complete and relevant.
This not only enables us to make informed decisions, but also to optimize marketing campaigns, understand user behavior and continually improve website performance.
A web environment must be treated as a living entity in constant evolution, shaped by frequent updates to content, landing pages, tracking and campaigns, as well as by external interventions.
This is a key point, because even if validation has been done beforehand, it is not sufficient.
Monitoring, a validation operation repeated over time, is therefore fundamental.
It will ensure the quality of tracking (i.e. the data collected) throughout the site's lifecycle.
Key points of data quality monitoring:
- Monitoring data quality involves monitoring tracking.
- What do we need to monitor for optimum efficiency?
- When to monitor data quality and how often?
- How to assess data quality?
- Monitoring methodology
- Case study of an eCommerce site or equivalent
- Impact of monitoring on KPIs
- How does Data On Duty manage this entire process?
1) Data quality monitoring involves tagging monitoring
Tagging is the cornerstone of data collection in web environments, since tags collect data on users and their interactions.
It's essential to ensure that these tags are working properly.
Poor tag implementation can result in inaccurate, incomplete or totally missing data, compromising the integrity of analyses.
When we talk about how the tag works, there are several underlying elements:
- Is it present or absent?
- Is it unique or duplicated?
- How was it populated (trigger sequence, TMS, which TMS, hard-coded, timing, duration, etc.)?
- Which script(s) fired it?
- How many hits were sent?
- What kind of hits?
- With which variables?
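As a toy illustration of these checks (not Data On Duty's actual engine), here is a minimal Python sketch that classifies a tag's state from a list of captured hits; the hit structure and field names are assumptions made for the example:

```python
# Toy check of a tag's state from a list of captured network hits.
# The hit format (dicts with "tag" and "variables") is an illustrative
# assumption, not a real vendor or Data On Duty format.

def tag_status(hits, tag_name, required_vars):
    """Report presence, duplication, and missing variables for one tag."""
    matching = [h for h in hits if h.get("tag") == tag_name]
    report = {
        "present": len(matching) > 0,
        "duplicated": len(matching) > 1,
        "missing_vars": [],
    }
    if matching:
        sent = set(matching[0].get("variables", {}))
        report["missing_vars"] = sorted(set(required_vars) - sent)
    return report

hits = [{"tag": "page_view", "variables": {"page_name": "home", "lang": "en"}}]
print(tag_status(hits, "page_view", ["page_name", "lang", "currency"]))
# → {'present': True, 'duplicated': False, 'missing_vars': ['currency']}
```

A real monitoring run would apply this kind of classification to every tag, on every page and path, against the Tagging Plan.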
In other words, manual tagging validation on thousands of pages using a browser console or rules-based tools is mission impossible, not to mention across dozens (or hundreds) of visitor paths.
Likewise, statistical sampling is not reliable, since a section of the site (or a set of templates) may be unaffected by changes at the time of validation while the rest of the site is.
Finally, this validation will require manual reconciliation with heavy Tagging Plans that are not handled by the solutions available on the market.
Data On Duty makes the difference: Tagging Plans are centralized and versioned within the platform.
Validation and monitoring are then carried out automatically in real time against the Tagging Plan(s).
2) What do we need to monitor for optimum efficiency?
For effective monitoring, several elements need to be monitored:
- URLs: ensure that page URLs are correct and correspond to the pages visited by users
- Visitor paths: analyze high-converting visitor paths (and others) to ensure that interaction tags are triggered, including those triggered by mouse/trackpad actions (scroll, swipe, hover, etc.).
- DataLayer: check that the data is correctly populated.
- Tags: check the presence and correct activation of tags on every page and key event.
- Hits: make sure every hit is sent and received, and that it is unique.
- Variables and values: check that captured variables and their values match expectations and are not corrupted or missing.
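A simplified sketch of the DataLayer check, assuming a hypothetical expected schema (the key names and types below are illustrative, not a standard):

```python
# Illustrative check of a dataLayer snapshot against expected keys and value
# types. The schema (key names and types) is a hypothetical example.

EXPECTED = {
    "pageType": str,
    "pageName": str,
    "userLoggedIn": bool,
}

def datalayer_issues(datalayer):
    """List missing keys and keys whose values have the wrong type."""
    issues = []
    for key, expected_type in EXPECTED.items():
        if key not in datalayer:
            issues.append(f"missing: {key}")
        elif not isinstance(datalayer[key], expected_type):
            issues.append(f"wrong type: {key}")
    return issues

snapshot = {"pageType": "product", "pageName": "pdp-shoes", "userLoggedIn": "yes"}
print(datalayer_issues(snapshot))
# → ['wrong type: userLoggedIn']
```

In practice the expected schema would come from the Tagging Plan rather than being hard-coded.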
3) When to monitor data quality and how often?
Data quality must be monitored on an ongoing basis.
However, it is particularly crucial to do so:
- When new features or pages go live: any new implementation can introduce bugs or tagging problems.
- After major updates: updates can affect tracking scripts and site structure.
- Regularly, according to a defined schedule: regular monitoring enables anomalies to be detected and corrected quickly.
But this is the minimum service.
Monitoring frequency and depth are to web environments what training frequency and intensity are to top athletes.
Monitoring frequency can vary according to several factors:
- Frequency of functional updates
- Frequency of landing page publication
- Campaign frequency and volume
- Frequency of promotional operations
- Unauthorized interventions by other departments or service providers
- Live testing of tags and MarTech solutions on the production site, etc.
Our monitoring recommendations:
- Multi-product, multi-brand eCommerce sites with frequent marketing operations: daily monitoring
- Press & media sites with pay-per-click revenue: daily monitoring
- Single-brand eCommerce sites: weekly monitoring
- Web-to-Store eCommerce sites: weekly monitoring
- Journey-based sites such as banking, insurance, travel, leisure: weekly monitoring
- Non-commercial information sites: monthly monitoring
4) How can data quality be assessed?
To assess data quality, we use the Tagging Plan. It details all the tags that should be implemented, the data to be collected, and the variables and values expected. This document serves as the reference for comparing the data actually collected with the theoretical data.
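The comparison between collected data and the Tagging Plan reference can be sketched as follows; the plan structure here (a flat variable-to-expected-value mapping, with None meaning "any value accepted") is a deliberate simplification of a real Tagging Plan:

```python
# Sketch of reconciling collected variables against a Tagging Plan reference.
# A real Tagging Plan is far richer; here it is flattened to a dict mapping
# each variable to its expected value (None = any value accepted).

def compare_to_plan(plan, collected):
    """Return missing, mismatched and unexpected variables."""
    diff = {"missing": [], "mismatched": [], "unexpected": []}
    for var, expected in plan.items():
        if var not in collected:
            diff["missing"].append(var)
        elif expected is not None and collected[var] != expected:
            diff["mismatched"].append(var)
    diff["unexpected"] = sorted(set(collected) - set(plan))
    return diff

plan = {"currency": "EUR", "page_type": None}
collected = {"currency": "USD", "debug_flag": "1"}
print(compare_to_plan(plan, collected))
# → {'missing': ['page_type'], 'mismatched': ['currency'], 'unexpected': ['debug_flag']}
```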
It's easy to see that, given the task at hand, its level of depth, deployment and frequency, nothing is feasible without automation.
Data On Duty once again brings unique value in that everything is centralized and shared within the platform (Tagging Plans, Validation, Monitoring, Alerting), so not only is automation total, but processing is also carried out at very high speed, exhaustively if necessary, and accessible to all contributors.
5) Monitoring methodology
As we have seen, monitoring is a repetitive operation with varying degrees of frequency, depending on the nature of the site.
We will review three methods to determine the most effective monitoring model.
Manual testing
Manual testing can verify the initial implementation of tags and ensure that they collect data accurately. Here are some key steps for effective manual testing:
- Site navigation: browse the various pages of the site using different browsers to check that the tags are triggered as expected.
- Tag inspection: use browser development tools (such as Chrome's Element Inspector) to check the presence and correct operation of tags.
- Debugging tools: use browser extensions such as Google's Tag Assistant to analyze and validate tags and check data sent to tracking servers.
Problem: these tests are purely unit-level, they cannot be industrialized, and they quickly become a source of errors because they are exhausting to carry out. To be avoided.
Script testing
Scripted testing saves time and improves validation efficiency by regularly and systematically checking tags and collected data. Here are a few strategies for automating tests:
- Test scripts: develop scripts that simulate user interactions on the site and verify tag triggering and data collection.
- Test frameworks: use frameworks such as Selenium to generate tag tests on different pages and scenarios.
- Alerts: set up alerts to detect anomalies in collected data, such as sudden drops in traffic or missing data.
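The "alerts" strategy above can be sketched as a baseline comparison; the 50% threshold and the shape of the hit-count data are illustrative assumptions:

```python
# Sketch of a drop-detection alert: compare current hit counts per tag with a
# baseline period. The 50% threshold and the count format are illustrative.

def traffic_alerts(baseline, current, drop_threshold=0.5):
    """Flag tags whose hit count fell below (1 - drop_threshold) * baseline."""
    alerts = []
    for tag, base_count in baseline.items():
        if base_count and current.get(tag, 0) < base_count * (1 - drop_threshold):
            alerts.append(tag)
    return sorted(alerts)

baseline = {"page_view": 1000, "add_to_cart": 200, "purchase": 50}
today = {"page_view": 950, "add_to_cart": 180, "purchase": 10}
print(traffic_alerts(baseline, today))
# → ['purchase']
```

Note that a tag absent from the current period counts as zero hits, so a disappeared tag is flagged as well.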
Problem: these tests cannot be industrialized and must be constantly adapted. They cost resources, time and energy. To be avoided.
End-to-end automated testing
End-to-end automated testing requires total integration between tagging plans, page templates, visitor paths, the DataLayer, the tag population process, hit triggering, timing, variables and data.
So only a platform integrating all these processes and analysis tools will enable such results to be achieved.
- Mass analyses: these ensure that tracking is fully deployed across all sub-domains, sections and templates. In particular, we check the presence and activation of tags, hits, variables and data.
- Visitor pathways: these complement mass analyses, as they provide control over the interactions of multiple hits and dynamic data, following the visitor from a landing page to checkout via PLPs, PDPs, registration forms, selectors, etc.
- Alerting & notifications: their intelligent configuration, from the broadest spectrum (Tagging Plan) to the most restricted (a variable-value pair on one or more analyses/visitor paths), lets you consult only the points that need attention, without wasting time on multiple reports.
Problem: solutions based on rules engines or trigger sequences sometimes require long and tedious parameterization, lasting up to several months. What's more, maintaining these rules requires constant monitoring. So these classic solutions generate a lot of work.
Data On Duty eliminates all these problems by industrializing processes, automating them and sharing them across multiple brands, domains, sections, routes, when necessary.
What takes weeks or months with standard solutions, takes just a few hours or days with Data On Duty's generative model.
6) Case study on an eCommerce site or equivalent
Here are some data quality monitoring elements for this type of site:
- Page view tracking: ensure that page view tracking tags collect accurate data on the number of views and duration of consultation
- User engagement: analyze data collected on user engagement, such as clicks on internal links and CTAs
- Conversion tracking: ensure that conversion tracking tags are correctly implemented and collect accurate data on purchases made.
- Product page analysis: check that tags on product pages correctly collect data on product views, additions to cart and cart abandonments.
- Social shares: check that social share tracking tags collect data on shares made on different social platforms
- Optimizing marketing campaigns: ensure that campaign tracking tags collect accurate data on visitor origin and behavior on the site.
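One simple plausibility check for this kind of site is funnel consistency: an event deeper in the purchase funnel should never outnumber the event before it. A minimal sketch, with hypothetical event names:

```python
# Funnel-consistency sketch: counts deeper in the funnel should not exceed the
# step before them. The event names are hypothetical.

def funnel_inconsistencies(counts):
    """Return human-readable inconsistencies between consecutive funnel steps."""
    order = ["product_view", "add_to_cart", "purchase"]
    issues = []
    for upper, lower in zip(order, order[1:]):
        if counts.get(lower, 0) > counts.get(upper, 0):
            issues.append(f"{lower} > {upper}")
    return issues

print(funnel_inconsistencies({"product_view": 40, "add_to_cart": 120, "purchase": 8}))
# → ['add_to_cart > product_view']
```

An inconsistency like this usually signals a missing or duplicated tag on one of the funnel steps rather than real user behavior.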
7) Impact of Monitoring on KPIs
User Engagement
Monitoring data quality can also improve user engagement on a website:
- Behavioral analysis: quality data allows us to better understand how users interact with the site and what type of content, product or advertising interests them most.
- Content improvement & personalization: using the data collected, it is possible to optimize content or personalize the product offering to make the site more relevant to users.
- Increased time spent on the site: by improving the user experience and offering quality content, it's possible to increase the time spent on the site and build visitor loyalty.
Conversion rates
Monitoring data quality can have a direct impact on a website's conversion rate. Here's how:
- Identification of friction points: by collecting precise data on the user journey, it is possible to identify and optimize friction points in the conversion process.
- Experience personalization: quality data allows us to better understand user preferences and personalize experiences accordingly, thus increasing the chances of conversion.
- Landing page optimization: precise data on landing page performance enables us to optimize content and layout to improve conversion rates.
Marketing Campaign ROI
Campaign ROI can be significantly improved with quality data:
- Accurate attribution: reliable data enables better attribution of conversions to different marketing campaigns, helping to identify the most effective campaigns.
- Optimization of advertising spend: by analyzing campaign performance, it is possible to optimize advertising spend by investing more in high-performance campaigns, and adjusting or stopping those that aren't working.
- Segmentation and targeting: accurate data enables audiences to be segmented more finely and marketing campaigns to be targeted more precisely.
8) How does Data On Duty operate throughout this process?
The Data On Duty SaaS platform brings together all the processes and tools shared by all contributors involved in online data collection: business owners, digital analysts, digital performance managers, campaign managers, traffic acquisition, IT, data privacy managers...
Similarly, the platform is a centralized, shared repository for tagging and data, keeping a multi-year history of fully object-based Tagging Plans and their versions.
This also applies to mass analysis, visitor paths and alerts.
All this without having to enter a single line of code or rule, and of course without cookies, tags, extensions or installation.
Tagging plans are self-generating.
Validation and monitoring are carried out automatically and in real time for mass analyses and visitor journeys, in relation to Tagging Plans.
Checks can be run on a single element or exhaustively across the whole environment.
A Tagging Plan can be adapted in a few clicks to match another template or section, or simply to be applied to several countries at once.
In short, the Data On Duty platform is generative, automated, complete, integrated, shared and operates end-to-end: from design, to deployment, to operation, making it possible to achieve in hours to days what used to take weeks or months.
Conclusion
Data quality monitoring is essential to maintain the hygiene of your website.
It ensures that data-driven decisions are based on accurate and reliable information.
By regularly monitoring tags, user paths, the DataLayer, and hits, and by continually assessing compliance with the Tagging Plan, you can guarantee the validity, integrity, relevance and resilience of your data, and effectively optimize all your operations on your web environments.
With Data On Duty...
- Unleash the full potential of your web environments with seamless, automated end-to-end tag validation.
- Ensure a superior user experience while optimizing functionality.
- Accelerate time-to-market (TTM) by up to 90%.
- Proven ROI from the very first month of operation.
- Get ahead of the competition.