
  1. Checkout Extensibility upgrade – July 2024

    Executive Summary:

    Shopify is deprecating the checkout.liquid theme file for the Information, Shipping and Payment pages on 13th August 2024 (with other areas to follow in August 2025).

    Existing customisations and integrated checkout tracking on these pages will no longer function after this date. This will not impact how Shopify works, but any additional third-party tracking, and how that data is collected, will be affected. When a user rejects marketing or analytics cookies, Shopify will not fire any tracking in the checkout, whether it sets cookies or not. This has serious implications for any required tracking, and for tools such as Google Ads that use cookieless pings to model key metrics back into the platform.

    It’s imperative that you review your setup ASAP.

    What is the announcement?

    August 13, 2024 is the deadline to upgrade from checkout.liquid to Checkout Extensibility for the Information, Shipping, and Payment pages. Checkout.liquid is deprecated and has not been updated with new functionality since 2021. With ever-changing regulations worldwide, including the introduction of card industry rules under PCI DSS v4, checkout.liquid is less secure than Checkout Extensibility, and Shopify will discontinue support for it.

    Checkout extensibility overview

    Future of ecommerce overview  

    Why is checkout.liquid being deprecated?

    Customising the Shopify checkout via checkout.liquid is typically complex and time-intensive, requiring advanced coding knowledge. Checkout.liquid customisations are often impacted by upgrades and, in some cases, result in poor checkout performance and a substandard buyer experience. They can also be problematic for security reasons.

    Another factor behind this change: in August 2022, a conflict with Google Tag Manager in the theme code caused a 12-hour global Shopify checkout outage.

    Shopify has been seeking to ensure checkout stability and performance whilst enabling the meaningful addition of new capabilities and functionality. As such, it has invested in Checkout Extensibility, a suite of apps and tools that make it easier to customise the Shopify checkout and build bespoke checkout experiences. Checkout Extensibility is secure, code-free, app-based, upgrade-safe and higher-converting. It also integrates with Shop Pay, which means, for the first time ever, express checkout can be customised.

    Crucially, Checkout Extensibility replaces the need for checkout.liquid.

    How will checkout extensibility impact your tracking?

    Despite Shopify positioning this as ‘no change’ to how tracking works, it severely impacts your ability to pass data into media tracking tools (e.g. Google Ads, Meta and GA4).

    The guidance and the Shopify interface say that your pixels (where tracking scripts sit in Checkout Extensibility) will always run.

    However, this is incorrect: your tracking script will not run if a user rejects either marketing or analytics cookies. Blocking all tracking when a user rejects these cookies prevents technologies such as cookieless pings, which Google tools use to model key metrics when a user opts out of cookies. This is a global issue and will continue to affect tracking for everyone who has upgraded to Checkout Extensibility.
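
    For context, this is roughly what a tracking script looks like as a custom pixel in Checkout Extensibility. A minimal sketch, assuming Shopify’s documented Web Pixels API; the collection endpoint URL is a hypothetical placeholder. And, per the above, even this pixel will not run if the user rejects marketing or analytics cookies.

    ```typescript
    // Minimal sketch of a custom web pixel in Checkout Extensibility.
    // `analytics` is a global provided by Shopify's web pixel sandbox,
    // declared here so the snippet is self-contained.
    declare const analytics: {
      subscribe: (eventName: string, callback: (event: any) => void) => void;
    };

    analytics.subscribe('checkout_completed', (event) => {
      const checkout = event.data.checkout;
      // Forward the purchase to your own (placeholder) collection endpoint.
      fetch('https://example.com/collect', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          orderId: checkout.order?.id,
          value: checkout.totalPrice?.amount,
          currency: checkout.currencyCode,
        }),
        keepalive: true,
      });
    });
    ```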

    Why is Shopify doing this with tracking?

    It is unclear why Shopify isn’t allowing any cookieless tracking to fire from its platform. However, Shopify is moving to Checkout Extensibility for several reasons:

    Enhanced Security: The new framework is designed with better security features to protect both merchants and customers.

    Scalability: Checkout Extensibility allows for a more scalable solution, accommodating growing businesses and increased traffic more effectively.

    Improved Customization: The new system offers more robust and flexible customization options, enabling developers to create more sophisticated and tailored checkout experiences.

    Future-Proofing: By adopting modern technology standards, Shopify aims to future-proof its platform, ensuring it remains relevant and capable of supporting new features and improvements.

    What Does It Mean for You?

    You will need to review the following:

    Action Required: You need to migrate your checkout customizations from checkout.liquid to the new Checkout Extensibility framework before the August 13, 2024 deadline.

    Learning Curve: There will be a learning curve as you familiarize yourself with the new framework and its capabilities.

    Opportunities for Improvement: The new system provides opportunities to enhance and innovate your checkout process, potentially improving customer experience and conversion rates.

    Support and Resources: Shopify will likely offer support and resources to assist with the transition, including documentation, tutorials, and possibly migration tools.

    You’ll also need to QA your existing marketing tags to ensure everything works as expected.

    What to do next?

    • Audit how your tags are implemented on Checkout Extensibility
    • Review turning off the privacy API and using an alternative solution to continue to use cookieless pings for Google Marketing Platform
    • Review using OneTrust (or similar) alongside GTM to ensure this works as expected

    If you’re unsure of your current status or how to complete these steps feel free to email data@fabric-analytics.com and we can help.

    How can Fabric help?

    At Fabric Analytics we can provide different levels of support to ensure you’re fully ready for these new changes.

    As we work on a transparent, hourly-based model, reach out to us and we can support you with any of your needs.

    If you need support reviewing your third-party tracking as you move from a custom checkout.liquid instance to Checkout Extensibility, please contact us to find out more about how our team of experts can help you.

    As one of the most experienced tracking and data agencies in the ecosystem, with a first-class team of experts and developers, we are perfectly positioned to consult on your bespoke requirements and implement these.

    Developer Resources

    Here is a list of Shopify resources that verify the details of the transition from checkout.liquid to Checkout Extensibility:

    1. Shopify Developer Documentation: Checkout Extensibility
    2. Shopify Community: Checkout Extensibility Announcement
    3. Shopify Help Center: Migrating from checkout.liquid
    4. Shopify Plus Blog: Secure and Scalable Checkout
    5. Shopify Developers Blog: Customizing Checkout

    These URLs provide verification and further details about the transition, including the reasons behind it, the impacts of missing the deadline, and the benefits of the new Checkout Extensibility framework.

  2. Consent Mode v2 – what does it mean for you?

    Executive Summary:

    Google has unveiled a significant update, emphasising the need for strict compliance among website owners who use Google Marketing Products. This encompasses a range of tools such as Google Ads, GA4, Floodlights, and various other Google tags.

    This change is particularly crucial for businesses that engage with users in the European Economic Area (EEA) & United Kingdom, as adherence to these new guidelines will become mandatory for effective retargeting and audience development within these regions starting 7th March 2024.

    Taking action on these updates is critical; inaction will lead to a considerable reduction in data collection and in the efficacy of Google’s marketing tools for media buying, audience targeting, and retargeting within the European Economic Area and the United Kingdom.

    Starting 7th March 2024, inaction will significantly disrupt conversion tracking, audience building and remarketing.

    What is the announcement?

    Google announced the launch of Consent Mode v2, a significant update to its web and app advertising framework. This new version is designed to ensure compliance with the latest privacy regulations and address users’ expectations regarding online privacy. Consent Mode v2 introduces enhanced features for managing user consent, particularly in relation to Google’s advertising and analytics services.

    Consent Mode overview

    Consent Mode technical documentation

    What is Consent Mode?

    Consent Mode is a mechanism introduced by Google that allows tag management platforms to work alongside CMPs (consent management platforms) or gtag.js to respect a user’s privacy choices.

    It’s a method that ensures the consent signals you gather are automatically conveyed to Google’s web and app advertising networks. As a result, Google’s tags change their behaviour to accommodate these preferences.

    Google uses this data to enable conversion modelling to recover lost conversions.

    This allows marketers to boost the quality of bidding algorithms and measurement capabilities.

    What is new with Consent Mode v2?

    Consent Mode v2 adds two additional parameters, ad_user_data and ad_personalization, which are Google-specific and dedicated to audience building and remarketing use cases. Without these two additional parameters, it will be impossible to build targeted audiences, run personalised advertising on Google Ads or measure performance in the EEA and the United Kingdom.
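
    For reference, here is a minimal sketch of how those signals look in a direct gtag implementation. The default/update pattern and the four parameter names follow Google’s documented Consent Mode API; the choice of defaults shown here (deny everything until the CMP reports a choice) is an illustrative assumption, not a recommendation for your specific setup.

    ```typescript
    // `gtag` is defined by Google's gtag.js snippet on a real page;
    // declared here so the example is self-contained.
    declare function gtag(...args: unknown[]): void;

    // Set defaults BEFORE any Google tags load.
    gtag('consent', 'default', {
      ad_storage: 'denied',
      analytics_storage: 'denied',
      ad_user_data: 'denied',       // new in Consent Mode v2
      ad_personalization: 'denied', // new in Consent Mode v2
    });

    // Later, once the user accepts via your CMP banner:
    gtag('consent', 'update', {
      ad_storage: 'granted',
      analytics_storage: 'granted',
      ad_user_data: 'granted',
      ad_personalization: 'granted',
    });
    ```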

    There are two versions of Consent Mode:

    • “Advanced Consent Mode” covers cookieless pings. Even if consent is not granted, data is sent to Google.
    • “Basic Consent Mode” blocks tags from firing altogether when relevant consent is not granted.

    You will need to send the relevant consent signals if GA4 data is being used to feed Ads audiences through the GA4/Ads integration.

    Google Consent Mode v2 is fully operational. If it is not implemented, the negative effects for advertisers will appear from 7th March 2024.

    What does this mean for you?

    Without Consent Mode v2, no data about new EEA users will be captured by your advertising platforms (Google Ads, GA4, etc.) after March 2024. This will affect measurement and reporting in this region, along with your audience lists and remarketing, disabling the ability to run personalised advertising.

    Your bidding algorithms will run based on inaccurate and incomplete data, and your budget will be spent much less effectively.

    For example, if you’re running a Maximise Conversions campaign with a target CPA, it is important that conversions are measured as accurately as possible for the algorithm to function and bid effectively. When fewer conversions are registered (without Consent Mode v2), the strategy will under-evaluate some opportunities, leading to inaccurate bidding and budgets being used in less profitable ways.

    Which version of Consent Mode v2 should I implement?

    1. Recommended: for full Ads/audience/remarketing capabilities, you need to use “Advanced Consent Mode” with all four parameters in place.
    2. If you want to block tags from firing when consent is not granted while still retaining some Ads functionality, use “Basic Consent Mode”.
    3. You may also decline to use Consent Mode entirely, in which case Ads will be restricted: no conversion or audience features would be available. GA4 will continue to function normally, but the Ads integration will be constrained.

    How do I implement Consent Mode v2?

    The first step is to have a consent banner on your website that respects user choices. The easiest way to get started is to choose a Google-certified CMP partner; Fabric Analytics’ preferred CMP is OneTrust.

    Once you have a compliant consent banner in place, Consent Mode v2 can be implemented. Update the tagging infrastructure in your tag management system to reflect the Google Consent Mode v2 requirements. Fabric Analytics can offer Consent Mode implementations and all the support and documentation needed for you to start measuring effectively.

    What to do next?

    • Audit how your tags are implemented on your website (e.g. GTM, hard-coded)
    • If required, set up a CMP on your website (e.g. Cookiebot)
    • If required, set up your tag management system to reflect the Google Consent Mode v2 requirements

    Developer Resources

    • App
    • Offline
      • For manual data uploads (not via API/SDK), a Terms of Service consent attestation opt-in will be required (launching in the product UI in early Q1 2024).
  3. Unveiling the Big Mystery: Google BigQuery and its Fantabulous Benefits!

    Howdy, data enthusiasts! Today, we’re going to embark on a thrilling journey into the land of Google BigQuery, where data reigns supreme and insights flow like a chocolate fountain at Willy Wonka’s factory. Get ready to have your minds blown as we unravel the mysteries of this bewitching analytics tool and discover its glorious benefits!

    What on Earth is Google BigQuery?

    Imagine a vast repository of data, teeming with information from every corner of the digital universe. Now, take that colossal data warehouse, sprinkle some magic Google dust on it, and voila! You’ve got yourself Google BigQuery, the quintessential data analytics platform.

    Google BigQuery is a cloud-based, serverless data warehousing and analytics solution that lets you store, analyze, and glean insights from mind-bogglingly enormous datasets. It’s like having a supercomputer at your fingertips, eagerly waiting to crunch numbers and reveal hidden patterns. It’s the ultimate geek’s paradise!
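
    To make that concrete, here’s a minimal sketch of running a query from Node.js with Google’s official @google-cloud/bigquery client, against one of Google’s public sample datasets. It assumes your credentials are already configured (e.g. via Application Default Credentials).

    ```typescript
    // Minimal sketch: query a BigQuery public dataset and print the results.
    import { BigQuery } from '@google-cloud/bigquery';

    async function main(): Promise<void> {
      const bigquery = new BigQuery();
      const query = `
        SELECT name, SUM(number) AS total
        FROM \`bigquery-public-data.usa_names.usa_1910_2013\`
        GROUP BY name
        ORDER BY total DESC
        LIMIT 10`;

      // BigQuery does the heavy lifting; we just await the rows.
      const [rows] = await bigquery.query({ query });
      rows.forEach((row) => console.log(`${row.name}: ${row.total}`));
    }

    main().catch(console.error);
    ```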

    Benefits Galore – Prepare to Be Amazed!

    Lightning-Fast Queries: Have you ever waited for a slow query to finish, only to fall asleep and dream about bunnies? Well, BigQuery is here to save the day! With its supercharged processing power and distributed architecture, queries run faster than a cheetah on rollerblades. Say goodbye to impatience and hello to instantaneous results!

    No Infrastructure Hassles: Picture a world without worrying about servers, provisioning, or software updates. Sounds like a utopian paradise, right? BigQuery makes it a reality! It’s a serverless wonderland where you can focus solely on data analysis while Google takes care of the infrastructure behind the scenes. So sit back, relax, and let BigQuery do the heavy lifting.

    Scalability with a Capital ‘S’: Need to process a gazillion rows of data? No problemo! BigQuery can effortlessly handle petabytes of information, making it ideal for businesses with insatiable appetites for data. As your needs grow, BigQuery expands magically, like a bottomless bag of popcorn at the movies.

    Cost-Effective Data Storage: We all love a good bargain, and BigQuery delivers on that front too. With its storage model based on consumption, you only pay for the data you store. No need to worry about upfront costs or excess baggage. It’s like shopping for data warehousing on a 90% off sale—simply marvelous!

    Seamless Integration with Other Google Services: If you’re a fan of the Google ecosystem (and who isn’t?), BigQuery is your golden ticket. It plays harmoniously with other Google services like Google Cloud Storage, Data Studio, and even AI and machine learning tools. It’s like assembling the Avengers of data analytics—all your favorite tools working together for the greater good!

    Advanced Analytics: BigQuery is not just about querying—it’s a treasure trove of analytical prowess. With its built-in machine learning capabilities, you can level up your analysis game with predictive modeling, anomaly detection, and clustering. It’s like having a crystal ball that can reveal hidden patterns and insights from your data.

    So there you have it, dear readers—a whimsical tour through the enchanting land of Google BigQuery. With its blazing speed, scalability, cost efficiency, seamless integration, SQL magic, and advanced analytics capabilities, BigQuery is a formidable ally in your quest for data-driven enlightenment.

    So put on your analytical hats, grab your magnifying glasses, and embark on a data adventure with BigQuery. Embrace the quirks.

  4. The first 100 days of Fabric Analytics: Joy, Fear & an expense account MIA

    It’s been 100 days since we opened our doors at Fabric Analytics and, to paraphrase the old saying, “Time flies when you’re relentlessly upgrading analytics implementations for some of the highest performing teams in commerce.”

    The first 100 days is a period that’s been fetishised in recent times amongst the glossy ‘airport paperback’ genre of personal and professional development, but with our first trading quarter now behind us, it seems like an appropriate moment to take stock of how it’s all going. In the absence of a boss, I’m conducting my own self-review and sharing it with you, dear readers. So, let’s dive in and explore the journey of these initial months; what have been the big hitters?

    Personal attention gets better outcomes.
    One of the aspects I love most about running Fabric Analytics is the ability to provide personal attention to each client account. We have a team of seasoned professionals, with an average of more than 10 years of experience, who bring a wealth of knowledge and expertise. Our clients have often been underserved in the past, and we are determined to deliver a level of service that ensures they capture the full value they deserve.

    It’s rewarding to do things better.
    Vince Lombardi, the renowned American football coach, once said, “The quality of a person’s life is in direct proportion to their commitment to excellence, regardless of their chosen field of endeavor.” This quote resonates deeply with me as I navigate the challenges of managing a fledgling business in a category where ‘good enough’ is often the benchmark. While it can be intimidating to bear the responsibility of meeting the high expectations our clients have of us at Fabric, the opportunity to really move things forwards for their business brings a tremendous sense of purpose and energy to our work. I am driven to make a lasting impact and uphold the highest standards of excellence in our industry and I’ve been heartened by the hugely positive response to our work so far.

    Your support network makes you.
    Running a new business is not a solo endeavor! I am fortunate to have huge support from outside Fabric, from family and friends. Their belief in me and in what we are building has been invaluable. Building a thriving business requires a strong support system, and I am grateful to have it in abundance.

    Business admin is in equal parts tedious and fascinating.
    From payroll and pensions to office rental and dealing with HMRC, there’s been quite a learning curve! Whilst close to two hours on the phone to HMRC one week chasing a VAT gremlin wasn’t an obvious highlight, each task adds to the breadth of knowledge required to run a successful agency. While it can be overwhelming at times, I’m embracing these challenges as opportunities for growth and improvement.

    Financial accountability is more of a thrill than it sounds.
    As the MD, I have assumed the responsibility of the financial performance of the agency. This heightened level of accountability has certainly sharpened the mind! Whilst I’ve owned commercial performance for well over a decade, nothing makes you as deliberate in your decision making as being the one ultimately holding the purse strings and answering to stakeholders! There have been moments of nostalgia for the corporate expense account, but it pales in comparison to the thrill of building a financially sustainable business from the ground up.

    Focus on the future, learn from the past.
    The first 100 days have provided a glimpse into the joys and challenges of running a new business. We have made significant strides, but it seems almost indulgent to be dwelling on what’s behind us already, as our optimism and determination to propel our clients forward is just spooling up! Together with my exceptional team we are embracing the future with enthusiasm and determination, but I’m going to make sure we also make time occasionally to pause and look back at what we’ve done so that we’re always learning and we make sure to enjoy the journey.

    I’d like to thank our clients, our team and our partners for making this journey so special so far!

    Forwards!

  5. The Rise of Predictive Analytics: The Future of Business Intelligence

    Predictive analytics has emerged as one of the most powerful tools in the world of business intelligence. By leveraging large amounts of data and advanced algorithms, predictive analytics enables companies to make informed decisions and anticipate future outcomes with a high degree of accuracy. The rise of predictive analytics has been driven by several factors, including the proliferation of big data, the growth of cloud computing, and the increasing availability of advanced analytics tools.

    The benefits of predictive analytics are many, and they extend to a wide range of industries and applications. For example, in the retail industry, predictive analytics can be used to optimize inventory management, predict customer buying patterns, and personalize marketing campaigns. In the financial sector, predictive analytics can be used to detect fraud, manage risk, and optimize investment strategies. In the healthcare industry, predictive analytics can be used to improve patient outcomes and streamline operational processes.

    The future of business intelligence lies in predictive analytics, and companies that embrace this technology will have a significant competitive advantage. Predictive analytics is no longer the domain of data scientists and mathematicians. With the rise of cloud-based analytics tools, companies of all sizes can now leverage predictive analytics to gain a competitive edge. By leveraging the power of predictive analytics, companies can turn their data into actionable insights and drive growth and innovation.

    In conclusion, the rise of predictive analytics represents a major shift in the world of business intelligence. By harnessing the power of big data and advanced algorithms, companies can now make informed decisions and anticipate future outcomes with unprecedented accuracy. Whether you are a large corporation or a small business, the future of business intelligence lies in predictive analytics. Embrace this technology, and stay ahead of the curve!

  6. Taxonomy / Naming Conventions – foolproof guide

    Taxonomy / Naming Conventions… not the sexiest topic, but one of the most important.

    Taxonomy and governance on marketing data provide structure, consistency, and reliability to your data management and analysis efforts. They enhance data integrity, enable cross-channel insights, improve collaboration, and support efficient decision-making. By implementing these practices, you can leverage the full potential of your marketing data to drive growth and success.

    There are many benefits to clear taxonomy and naming conventions, including:

    • Consistency and Standardisation
    • Data Integrity and Quality
    • Data Integration and Cross-channel Insights
    • Scalability and Efficiency
    • Collaboration and Communication
    • Futureproofing and Adaptability

    Taxonomy naming conventions are an essential part of global reporting and consistency. We’ve developed a clear process that, when applied, supports governance and keeps naming consistent.

    Stage 1: Define Objectives and Scope

    • Define the objectives of your taxonomy and naming convention. What specific goals do you want to achieve with GA4?
    • Determine the scope of your taxonomy, considering the size and complexity of your business, and the number of entities you need to track.

    Stage 2: Identify Key Dimensions and Metrics

    • Identify the key dimensions and metrics that are relevant to your business. These could include customer segments, product categories, geographic regions, channels, campaigns, or any other data points you want to track.
    • Prioritise the dimensions and metrics based on their importance and relevance to your business goals.

    Stage 3: Establish Naming Conventions

    • Develop a standardised naming convention for each dimension and metric. Ensure that the conventions are consistent, clear, and easily understood by all stakeholders.
    • Consider including prefixes or codes to categorise different types of dimensions or metrics, which can help with organising and filtering data (a validation sketch follows this list).
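
    As an illustration, here is a hypothetical convention of the form channel_region_campaign_yyyymm, with a small check that a name conforms. The channel and region codes are invented for the example; yours would come from the conventions agreed in this stage.

    ```typescript
    // Hypothetical naming convention: channel_region_campaign_yyyymm,
    // e.g. "ppc_uk_summer-sale_202407". All segment values are invented.
    const CAMPAIGN_NAME = /^(ppc|social|email|display)_(uk|eu|us)_[a-z0-9-]+_\d{6}$/;

    function isValidCampaignName(name: string): boolean {
      return CAMPAIGN_NAME.test(name);
    }

    console.log(isValidCampaignName('ppc_uk_summer-sale_202407')); // true
    console.log(isValidCampaignName('Summer Sale!'));              // false
    ```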

    Stage 4: Document and Communicate the Taxonomy

    • Document the finalised taxonomy and naming convention in a comprehensive guide or document. Include definitions, examples, and guidelines to ensure consistent implementation across the organisation.
    • Communicate the taxonomy to all relevant stakeholders, including analysts, marketers, and developers. Conduct training sessions if necessary to ensure everyone understands the conventions and their importance.

    Stage 5: Implement and Validate

    • Implement the taxonomy and naming convention in GA4, making sure to follow the established guidelines and conventions.
    • Validate the implementation by monitoring and reviewing the data collected. Ensure that the data is organised and structured correctly and aligns with your predefined objectives and metrics.
    • Continuously review and refine the taxonomy as your business evolves, ensuring it remains relevant and effective.

    Developing a taxonomy and naming convention is an iterative process, and it may require adjustments based on feedback and changing business needs. Regularly revisit and update your taxonomy to keep it aligned with your evolving business requirements.

  7. Importance of a measurement workshop

    Are you unhappy with your GA4 configuration? Chances are it’s because you’ve got a ‘standard’ implementation rather than a custom one.

    A custom implementation is critical to getting the most from the technology, and that starts with a measurement workshop and the creation of a measurement plan.

    To help you out, we’ve detailed our process below.

    The sole aim of the measurement workshop is to agree exactly what your KPIs are and to align jointly on what needs to be tracked across the website.

    Our measurement discovery workshop is a collaborative session aimed at identifying key business goals, defining relevant metrics, and establishing a measurement framework.

    Our 4-step process ensures we get to an output that’s a shared document across Fabric and the client that will act as a blueprint for what needs to be tracked.

    Step 1: Establish Business Goals and Objectives

    • Discuss and document the primary business goals and objectives. This could include increasing website conversions, improving customer retention, driving revenue growth, or any other relevant goals.
    • Prioritise and finalise the key goals and objectives that will be the focus of the measurement framework.

    Step 2: Define Key Performance Indicators (KPIs)

    • Once the goals and objectives are established, facilitate a discussion to identify the most relevant and meaningful Key Performance Indicators (KPIs) for each goal.
    • Evaluate each proposed KPI based on its alignment with the goals, measurability, and ability to provide actionable insights.
    • Select a set of KPIs that best represent the progress and performance of the defined business goals.

    Step 3: Determine Data Sources and Measurement Methods

    • Define the measurement methods and tracking mechanisms needed to collect the necessary data for each KPI.
    • Document the agreed-upon data sources, measurement methods, and tracking requirements.

    Step 4: Develop a Measurement Plan

    Summarise the outcomes of the workshop by developing a comprehensive measurement plan. Document the business goals, associated KPIs, data sources, measurement methods, and tracking requirements for each KPI (a sketch of one plan entry follows the list below).

    • Define the reporting and analysis framework, including the frequency, format, and distribution of reports to stakeholders.
    • Assign responsibilities and establish a timeline for the implementation of the measurement plan.
    • Share the measurement plan with workshop participants and relevant stakeholders for feedback and alignment.
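
    As a purely illustrative sketch, a single measurement plan entry might be structured like this. Every field name below is invented; your plan would carry whatever fields the workshop agrees.

    ```typescript
    // Hypothetical shape of one measurement plan entry (all names invented).
    interface MeasurementPlanEntry {
      businessGoal: string;        // e.g. "Increase website conversions"
      kpi: string;                 // e.g. "Checkout conversion rate"
      dataSource: string;         // e.g. "GA4 via GTM"
      measurementMethod: string;   // e.g. "purchase events / sessions"
      trackingRequirement: string; // e.g. "tag on the order confirmation page"
      reportingFrequency: 'daily' | 'weekly' | 'monthly';
      owner: string;               // accountable stakeholder
    }
    ```
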
  8. Governance for implementing tagging and tracking – what you need to know!

    Here at Fabric Analytics we see some really simple things that can make an immediate improvement to data collection and quality. Most of those revolve around governance! We’ve written a quick top-tips article that focuses on the processes for getting the most robust data.

    First, why are GTM and governance important?

    1. Centralized Control: GTM provides a centralized platform to manage all tags and tracking codes on a website, allowing for streamlined control and coordination.
    2. Reduced Dependence on Developers: Non-technical users can easily implement and update tags through GTM’s user-friendly interface, reducing the reliance on developers for basic tracking needs.
    3. Version Control and Rollbacks: GTM allows for versioning of containers, enabling marketers to test and deploy changes without affecting the live site. This ensures a safety net for quick rollbacks if needed.
    4. Enhanced Data Security and Privacy: Establishes protocols for safeguarding sensitive information, reducing the likelihood of data breaches and unauthorized access.
    5. Data Quality Assurance: Ensures that data is accurate, reliable, and consistent, leading to higher confidence in decision-making and analysis.

    How should governance be applied to tag management?

    Step 1: Tag Creation and Configuration

    • Define Tag Requirements: Start by understanding the tagging requirements for your website or application. Identify the tags you need to implement, such as tracking tags, conversion tags, or third-party scripts.
    • Create Tags in GTM: Log in to your GTM account and create new tags based on the requirements. Configure the tag settings, such as triggering rules, variables, and tag-specific parameters.
    • Test Tag Functionality: Before deploying the tag, test its functionality within the GTM preview mode. Validate that the tag fires correctly and captures the desired data. Use the debug console and browser developer tools to inspect network requests and ensure the tag is functioning as intended.
    • Publishing Rights: Ensure that only the right people in your organisation can publish tags.

    Step 2: Deployment in GTM Container

    • Publish Container: Once you have verified the tag functionality, publish the GTM container that contains the newly created tags. This will make the tags available for deployment on your website or application.
    • Implement GTM Container on Website: Follow the instructions provided by GTM to implement the container code on your website or application. This involves placing the GTM container snippet in the head of your website’s pages.
    • Test Tag Deployment: After deploying the GTM container, perform a test run to ensure that the tags are firing correctly on your live website. Use browser extensions, such as the GTM Debug Mode or Google Tag Assistant, to validate tag firing and data collection.

    Step 3: Quality Assurance (QA) Testing

    • Create Test Scenarios: Define test scenarios that cover different user interactions and events on your website or application. This could include form submissions, button clicks, page views, and other relevant actions.
    • Execute Test Scenarios: Go through each test scenario, interact with the website or application as a user would, and verify that the tags fire and capture the expected data accurately. Use browser developer tools and GTM’s debug mode to review tag firing in real time (a console helper sketch follows this list).
    • Data Validation: Review the data captured by the tags in your analytics or tag management platform. Compare the captured data against the expected results to ensure accuracy and completeness.
    • Error Handling: During the QA testing, identify and address any errors or issues encountered. Debug and troubleshoot problematic tags or triggers to ensure smooth and accurate data collection.
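
    As a small aid for the QA steps above, here is a hypothetical console helper that wraps dataLayer.push so every value pushed is logged while you walk through your test scenarios. The wrapping trick is a common debugging pattern, not a GTM feature; run it from the browser console before interacting with the page.

    ```typescript
    // Hypothetical QA helper: log every dataLayer push in real time.
    const w = window as unknown as { dataLayer: unknown[] };
    w.dataLayer = w.dataLayer || [];

    const originalPush = w.dataLayer.push.bind(w.dataLayer);
    w.dataLayer.push = (...items: unknown[]): number => {
      items.forEach((item) => console.log('dataLayer push:', item));
      return originalPush(...items); // preserve GTM's normal behaviour
    };
    ```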

    Step 4: Documentation and Sign-off

    • Document QA Results: Create a comprehensive report that outlines the test scenarios, test results, and any issues encountered during the QA process. Document the steps taken to resolve any problems.
    • Share Results and Obtain Sign-off: Share the QA report with relevant stakeholders, such as marketing, analytics, or development teams. Obtain their sign-off to confirm that the tags are accurately implemented and functioning as expected.
    • Ongoing Monitoring and Maintenance: Regularly monitor the tags and their data collection to ensure ongoing accuracy. Perform periodic checks and updates as your website or application evolves.

    Following this process, you’ll be able to create, deploy, and complete QA for tags on Google Tag Manager, ensuring accurate and reliable data collection for your analytics and tracking needs.

  9. Google Analytics 4 vs Google Analytics 3 – key metric changes

    On 1st July, Google Analytics 4 will be the web analytics choice of many businesses as GA3 is deprecated. Since 2015, everyone has got used to ‘staple’ metrics for understanding performance: sessions, users, bounce rate and last-click attribution.

    Just when everyone felt comfortable, GA4 has changed all of these metrics…

    We’ve tried to give a simple explanation of each of the changes to make your life a lot easier.

    Sessions are changing

    Normally you’d see sessions higher in GA3 than in GA4.

    ⭐ There will be ‘overinflated’ traffic sources in GA3 vs GA4. GA3 starts a new session whenever the traffic source changes mid-visit; GA4 doesn’t. So… if a session starts from a click on a PPC ad and the user then re-enters the site in that same session via a voucher-code website: GA3 = 2 sessions, GA4 = 1 session.

    ⭐ But… the traffic source in GA4 will be allocated to the PPC ad, as it is pinned to the ‘session start’ traffic source.

    ⭐ Also, the funky ‘restart a session if it crosses over midnight’ behaviour is now gone (that’ll most likely be small volume!).

    Users are changing to active users

    Normally you’d see ‘users’ higher in GA3 than in GA4.

    ⭐ Active users are basically users who don’t leave your site after an interaction (it’s easier to think of these as non-bouncing users, really).

    ⭐ Google now applies ‘engaged sessions’ to users to generate active users. An engaged session means the user has triggered an event on your website (e.g. scrolling).

    ⭐ These will naturally record lower, as an active user is only counted if they trigger an event, versus being counted by default in GA3.

    ⭐ Also… if User-ID functionality is enabled, multiple ‘users’ will be counted as one and deduplicated in your interface reports.

    Bounce rate doesn’t exist – it’s engagement rate

    ⭐ Essentially, bounce rate in GA3 is the percentage of sessions that leave your site without doing anything else. So, bounce rate = single-page sessions / total sessions.

    ⭐ GA4 essentially flips the metric on its head with ‘engaged sessions’.

    ⭐ An engaged session is triggered by the visit lasting 10 seconds or longer (this can be changed to 60 seconds if needed), triggering a conversion event (assuming these are set up) or viewing 2 or more pages.

    ⭐ So, engagement rate = engaged sessions / total sessions. In short, engagement rate is the inverse of bounce rate (i.e. the % of those who interact vs those who leave).

    ⭐ A simple example should make it clearer… 10 users visit your site. 5 of them leave without clicking on anything^ and 5 of them click through to another page. GA3 bounce rate = 50%, GA4 engagement rate = 100%.

    ^if they spend more than 10 seconds on the site

    Attribution

    ⭐ GA4 uses data-driven attribution (DDA) by default, not last click.

    ⭐ DDA is an algorithmic model that takes into account each of the touchpoints observed for your website conversion events. It then does some modelling to assign credit to each channel.

    ⭐ GA4 collects 50+ touchpoints, ensuring that all of your marketing efforts are taken into account when assigning credit.

    ⭐ So in your reports you’re likely to see fractional conversions, like 1,456.67 transactions for paid search, as credit is split across the channels that drove the conversion.

    ⭐ This is likely to make notoriously last-click-focused channels like affiliates decline in credit, and upper-funnel channels like display gain credit.

    So in summary…

    ⭐ Sessions should go down

    ⭐ Users should go down

    ⭐ There’s a new metric for page and traffic performance (engagement rate)

    ⭐ Attribution is no longer last click

    Hope this helps!

  10. Creating a data dictionary – our 6 stage process

    Do you have a data dictionary internally? We find it’s a super useful document for governance, data consistency and continuity.

    With all the confusing metrics in GA4, internal acronyms and data silos, we highly recommend creating one! Here at Fabric we appreciate this isn’t the ‘coolest’ piece of work, but it’ll save so much time in the long term!

    Don’t have one? We’ve put together a bespoke 6 stage process to follow when creating a data dictionary for any business.

    Stage 1: Define the Purpose and Scope

    Clearly articulate the purpose and scope of the data dictionary. Determine the specific objectives and requirements that the data dictionary should fulfil. This includes identifying the intended audience, the types of data to be documented, and the level of detail required.

    Stage 2: Identify Key Stakeholders

    Identify the key stakeholders who will be involved in the creation and maintenance of the data dictionary. These stakeholders may include data analysts, database administrators, data owners, and subject matter experts. Engage with them to understand their needs and perspectives to ensure the data dictionary meets their requirements.

    Stage 3: Inventory Data Elements

    Conduct a thorough inventory of the data elements within the organisation or the specific project. Identify all relevant data sources, databases, tables, columns, and any other components that store or manipulate data. It is crucial to document the data elements accurately, including their names, descriptions, data types, and any constraints or relationships with other elements.

    Stage 4: Define Data Attributes and Relationships

    For each data element identified in the previous stage, define its attributes and relationships. Attributes may include things like data length, format, allowed values, and any other relevant characteristics. Identify relationships between data elements, such as primary keys, foreign keys, and dependencies, to capture the interconnectivity of the data.

    Stage 5: Document Metadata and Business Rules

    Enhance the data dictionary by documenting metadata and business rules associated with each data element. Metadata includes information such as the source of the data, data quality requirements, and data usage guidelines. Business rules define the validation and transformation rules that govern data elements. Documenting these aspects ensures a comprehensive understanding of the data and promotes consistency and accuracy in data management.
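
    Pulling Stages 3 to 5 together, a single entry might be shaped like the sketch below. Every field name here is invented for illustration; your dictionary would carry whatever attributes, metadata and rules you agree in the stages above.

    ```typescript
    // Hypothetical shape of one data dictionary entry (all names invented).
    interface DataDictionaryEntry {
      name: string;              // e.g. "order_value"
      description: string;       // plain-English definition
      dataType: 'string' | 'number' | 'boolean' | 'date';
      source: string;            // originating system or table (metadata)
      allowedValues?: string[];  // constraint, if the element is enumerated
      relatedTo?: string[];      // keys or dependent elements (relationships)
      businessRules?: string[];  // validation/transformation rules
      owner: string;             // accountable team or person
    }
    ```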

    Stage 6: Establish Maintenance Procedures

    Develop procedures for maintaining the data dictionary over time. Assign responsibilities to individuals or teams for updating and reviewing the dictionary regularly. Establish guidelines for incorporating changes, additions, and retirements of data elements. Ensure that the data dictionary remains up to date and relevant by incorporating it into the organisation’s data governance framework. Remember, this bespoke process can be tailored to the specific needs and requirements of your organisation or project.