This document provides everything you need to get your page set up with Evolv so that data about visitors and events flows through to reporting, both in Evolv platform dashboards and in Adobe Analytics.
It also covers how to configure Adobe so that there is an agreement between the data shown in the two platforms, and finally, some methods and examples to validate that agreement.
Note that the steps below start from scratch to set up the project and install the Experience Accelerator. The customer you are working with may already have EA set up; in that case, you can skip step 1.
Also, step 3 describes a mini A/A test with fake traffic to validate the alignment of data between Evolv and the customer’s Analytics platform. You should also keep a longer-running A/A test with real traffic up for at least a couple of days to look for any anomalies.
Steps to integrate and validate Adobe Analytics
Evolv dimension data is sent to Adobe through an s.tl() call (an on-click event), either when a visitor is evaluated for an experiment or at the time of a customer configured event click. By default, Evolv will send each GID, Ordinal, UID, and SID to Adobe eVars; however, Props can be configured as an alternative if needed.
Before you can begin collecting and validating Evolv events in Adobe Analytics, three open variables must be configured to ‘read’ in data. In this example, eVar3, eVar4, and eVar5 have been configured and enabled at the hit level for demonstration.
It is important to configure Evolv variables at the hit level because a user can be in more than one experiment at a time. To avoid overwriting data and to most closely align with data reporting in Evolv, expiring configured eVars at the hit level is recommended. If you choose to configure Props instead of eVars, then you won’t have to worry about this, as Props do not persist.
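As a rough sketch only (the exact field names, slot assignments, and call shape here are assumptions, not Evolv’s actual integration code), the hit-level mapping described above might look like this, using the three demo eVars:

```javascript
// Hypothetical sketch of the mapping, NOT Evolv's actual integration code.
// "s" stands in for the AppMeasurement tracker object; eVar3-eVar5 match the
// demo configuration above. Combining GID and Ordinal in one eVar is an
// assumption for illustration.
function sendEvolvHit(s, ids) {
  // ids: { gid: ..., ordinal: ..., uid: ..., sid: ... }
  s.eVar3 = ids.gid + ":" + ids.ordinal; // Experiment Group ID + Ordinal
  s.eVar4 = ids.uid;                     // Evolv User ID
  s.eVar5 = ids.sid;                     // Evolv Session ID
  s.linkTrackVars = "eVar3,eVar4,eVar5"; // only send the variables we set
  s.tl(true, "o", "evolvids: confirmed"); // "o" = custom (other) link
}
```

Because the values expire at the hit level, each s.tl() call carries a complete, self-contained set of IDs, which is what makes multi-experiment users reportable.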
See How to integrate Adobe Analytics to set up the integration with Evolv.
After the integration and configuration steps are complete, it is important to verify that visitor and conversion data are being collected as expected. We recommend doing this first in a lower, non-production environment.
There are two ways to do this: first, verify that data is being sent out of the client in the right way, and second, verify that the data is flowing into Adobe Analytics.
First, verify that data is being sent out of the client
The first thing you should do is debug your page using Chrome Developer Tools to see what data is being sent out from the client and back to Evolv. To debug the s.tl() call in your browser, open a page where an Evolv experiment has been configured, open your developer tools, and select Console. If you are being evaluated for a Candidate in an Evolv Experiment at that time, you will see GID, Ordinal (or CID, if configured), SID, and UID values being passed into the variables that you just set up in an s.tl() call, as indicated in the pe and pev2 values below.
As you can see, this particular call was a “confirmed” s.tl() call. pev2 is where we indicate whether the call is confirmed, contaminated, or a customer configured event.
Confirmation s.tl() Example:
Note again that GIDs and Ordinals (or CIDs, if configured) are not available for customer configured event clicks but are present for confirmed and contaminated calls. When a customer configured event fires, you will receive all of the same information as above, minus GIDs and Ordinals, along with the name of the customer configured event in the pev2 value, as illustrated below.
Customer Configured Event s.tl() Example (no GID or Ordinal present):
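If it helps while debugging, the pe and pev2 values can also be pulled straight out of a captured Adobe Analytics request URL. This is a hypothetical helper (the collection URL below is made up); for a custom link call, pe is "lnk_o" and pev2 carries the link name:

```javascript
// Hypothetical debugging helper: extract the pe / pev2 query parameters
// from a captured Adobe Analytics request URL to check the call type.
function inspectHit(requestUrl) {
  var params = new URL(requestUrl).searchParams;
  return { pe: params.get("pe"), pev2: params.get("pev2") };
}

// Example with a made-up collection URL:
// inspectHit("https://example.sc.omtrdc.net/b/ss/rsid/1?pe=lnk_o&pev2=evolvids:%20confirmed")
// → { pe: "lnk_o", pev2: "evolvids: confirmed" }
```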
After debugging in your browser, once hit data has been collected you can typically see it in Adobe Analytics within two hours, and in many cases within 60–90 minutes.
Second, verify that the data is flowing into Adobe Analytics
Next, validate that your integration with Adobe Analytics is configured correctly and that traffic is flowing through to GID and Ordinal values in Adobe Analytics, as expected.
To do this, create a visitor-level segment that includes visitors in a particular Experiment Group ID with a confirmed event AND excludes visitors in the same Experiment Group ID who have a contaminated event. See the example below.
Example Segment for Traffic Validation:
To validate that data is being collected within Analysis Workspace, simply populate a table with the three newly configured variables and “Custom Link Instances” as your metric for total clicks during your selected time period.
Example Initial Data Validation Report:
After data has been collected and you’re beginning to see reports populate, you’ll want to start interpreting and reporting on live experiment data. This section of the document is not so much a set of linear, actionable steps as it is a lesson in concepts and examples that you need to be familiar with and apply to your own situation.
First, let’s look at the taxonomy of Evolv variables:
Taxonomy of Evolv Variables
Custom Link Name: All Evolv click events will have “evolvids” as the Custom Link Name so that internal teams can easily distinguish Evolv events from others that are configured by other teams.
Following this value will be “confirmed”, “contaminated”, or a customer configured event name (e.g. add to cart) to indicate what type of call was made.
Three possible values are:
- evolvids: confirmed
- evolvids: contaminated
- evolvids: [customer configured event]
Note that when customer configured events occur, the Experiment Group ID and Ordinal values are not known to Evolv and therefore will not be sent on that hit. When reporting on goals or success metrics, you will need to build user-level segments as described below.
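The three link-name shapes above can be classified mechanically, which is handy when auditing captured hits. A minimal sketch (the function name is ours, not Evolv’s):

```javascript
// Sketch: classify an Evolv Custom Link Name into the three types above.
function classifyEvolvLink(linkName) {
  var m = /^evolvids:\s*(.+)$/.exec(linkName);
  if (!m) return null; // not an Evolv call at all
  var value = m[1];
  if (value === "confirmed" || value === "contaminated") return value;
  return "customer configured event"; // e.g. "add to cart"
}
```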
Experiment Group ID Variable (e.g. gid-dc35ad99-7460-4c3b-b931-db6af1c3c007)
Ordinal Variable (e.g. ordinal-1)
User ID Variable (e.g. uid-11396041_1599594857783)
Session ID Variable (e.g. sid-80487144_1599591514556)
Experiment ID Variable (not required): see Candidate ID Variable below.
Candidate ID Variable (not required): colon-separated Candidate and Experiment IDs (e.g. cid-d5bef4f45265:eid-7c0fc0794a)
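Following the prefixes in the examples above (gid-, ordinal-, uid-, sid-, and the colon-separated cid-/eid- pair), a small parser can split any of these values into its parts. This is an illustrative sketch, not an Evolv utility:

```javascript
// Sketch: split an Evolv variable value into its parts, based on the
// prefixes in the taxonomy examples above.
function parseEvolvValue(value) {
  // Candidate ID: colon-separated Candidate and Experiment IDs
  var cid = /^cid-([^:]+):eid-(.+)$/.exec(value);
  if (cid) return { type: "candidate", candidateId: cid[1], experimentId: cid[2] };
  // All other variables: a simple "<prefix>-<id>" shape
  var m = /^(gid|ordinal|uid|sid)-(.+)$/.exec(value);
  return m ? { type: m[1], id: m[2] } : null;
}
```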
How to use Evolv Variables for Reporting
Because users can be in multiple Ordinals at one time, Evolv variables are configured to only fire on a single hit. The reason for this is to avoid any rewriting of values (due to last-touch attribution) that may happen when a user experiences more than one Ordinal variation. By expiring values at the hit level, you will have the greatest reporting flexibility in Adobe Analytics and accuracy when comparing to experiment data in the Evolv platform.
There is a slight drawback to sending data in a Custom Link, and it is only problematic if not accounted for. If a user enters your digital property on a page where an experiment is running, the bounce rate metric for that page will be affected. Because Adobe defines bounce rate as the ratio of visits that contained exactly one hit to the total number of visits to that page, a Custom Link hit preceding the s.t() call will artificially decrease the bounce rate for the page where an experiment is running.
However, most organizations have alternative bounce metrics for different scenarios. A common alternative is the out-of-the-box metric called “Single Page Visits”, which is particularly helpful when a user reloads the page or fires link tracking calls. To use it as an alternative to bounce rate, simply divide Single Page Visits by Entries to the page.
Adobe Analytics Single Page Visit (Metric):
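The division described above is trivial, but a worked example makes the shape of the metric concrete:

```javascript
// Worked example of the alternative bounce calculation described above:
// Single Page Visits divided by Entries to the page.
function altBounceRate(singlePageVisits, entries) {
  return entries === 0 ? 0 : singlePageVisits / entries;
}

// e.g. 1,200 single-page visits out of 4,800 entries → 0.25 (25%)
```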
Because you are now onboarding Evolv technology and the integration with Adobe Analytics may impact some internal reporting teams, we recommend that the analytics CoE team communicate this change to all users and propose an alternative reporting method to affected, and potentially unaware, reporting teams.
If changing your metric is not an option, a custom event handler is an alternative.
Alternative Method for Sending Information to Adobe Analytics
As described in the documentation here, the integration code accepts a function called ‘customEventHandler’. If this is implemented, the Evolv integration will run this method instead of calling s.tl().
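A minimal sketch of what such a handler might look like. The configuration shape and the event fields shown here are assumptions for illustration; consult the integration documentation for the actual keys and payload:

```javascript
// Hedged sketch: a customEventHandler that routes Evolv events somewhere
// other than s.tl(). "forwarded" stands in for wherever your handler sends
// events (a data layer, a custom beacon, etc.). Field names are illustrative.
var forwarded = [];

var integrationConfig = {
  customEventHandler: function (event) {
    // event might carry type ("confirmed"/"contaminated"/event name) and IDs
    forwarded.push(event);
  }
};
```

With this in place, the Evolv integration would invoke your handler instead of firing its default s.tl() call, which also sidesteps the bounce-rate side effect described earlier.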
Reporting on Candidate Traffic
There are a few necessary steps for reporting: segmentation, building reports, and making sure that you perform appropriate reporting comparisons to control.
To report on specific Ordinal traffic, you must create user-level segments that include users in the Ordinal of interest while excluding those who have contaminated. See the example below. While contaminated users are fairly rare, if for some reason the Ordinal receives a sizable number of them, reporting in Adobe Analytics will differ from what is in Evolv. That’s because Evolv doesn’t count users who have contaminated and didn’t actually receive the Ordinal experience. For consistency and accuracy, remove them from your Adobe Analytics reporting as well.
How to remove contaminated users (example):
To report on these users, simply apply your segment(s) to a Freeform Table in Analysis Workspace and drag in the metrics you want to measure.
Let’s look at how to report on success metrics sent to Adobe Analytics by Evolv, as well as how to report on custom configured metrics configured through Adobe Analytics on the client-side.
How to Report on Evolv KPIs
In this instance, you can see that we have two segments with non-contaminated users in different Ordinals. To report on conversions for users in each of these Ordinal segments, simply create a hit-level segment and a calculated metric.
Example Calculated Metric:
How to Report on Client Configured KPIs
It’s likely that you will want to perform analysis with metrics not configured through Evolv. To do that, simply create a new goal and calculated metric using the same process, but with the metric you want to analyze. In this example, instead of a click event sent by Evolv, a client configured metric was used. That can be anything you want, as long as the data is available through Adobe Analytics.
How to Compare Candidate Data to Control
Reporting on Ordinal performance is sometimes misunderstood and done incorrectly. Because Evolv runs large-scale experiments, it is sometimes assumed that the way to analyze them is to force-rank Ordinal values by absolute conversion rate. Instead, the correct way is to always compare the relative performance of an individual Ordinal to control for the time period in which it was active.
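The comparison described above reduces to a relative-lift calculation over a shared time window, rather than a ranking of absolute rates. A worked example (the function name is ours):

```javascript
// Worked example of relative lift vs. control over the same time window,
// as described above. Rates are conversions divided by unique users.
function relativeLift(ordinalRate, controlRate) {
  return (ordinalRate - controlRate) / controlRate;
}

// e.g. an Ordinal at a 4.4% rate vs. control at 4.0% is a 10% relative lift
```

Ranking Ordinals by absolute conversion rate ignores that different Ordinals run over different periods with different traffic; relative lift against a concurrent control removes that seasonality.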
After your optimization is up and running and real visitor data is flowing through, there are a few situations that can arise that will lead to discrepancies between the two systems. Although there are many reasons why discrepancies between systems may occur, here are a few common ones:
Contaminated Events - If contaminated users in a particular Ordinal are not filtered out then user counts will be inflated because these users did not actually receive the rendered experience. Evolv does not include contaminated users in platform reporting for this reason.
Clearing User IDs - Although cookie clearing is rare, it is worth mentioning that it can contribute to data discrepancies by creating a new Adobe Analytics ID and/or a new Evolv User ID.
Traffic Filtering - If Adobe Analytics is configured to filter out internal IP traffic, for example, there will be a discrepancy between Adobe and Evolv, as Evolv does not filter out traffic except bots.
Visit Analysis - Because Evolv targets and analyzes data at the user level, it is a common mistake to look at visit-level conversions in Adobe Analytics (or any digital analytics platform). If you do, you’ll notice data discrepancies between systems. Evolv sends event data to Adobe Analytics when one of two things happens: the user matches the audience and the context predicate defined in the Evolv system, or a customer configured event (e.g. conversion, add to cart), also defined through Evolv, fires. Otherwise, events will not be sent to Adobe Analytics. So if User A enters an experiment on visit 1 and leaves the site without converting, neither Adobe Analytics nor Evolv would record a conversion for that user. If User A returns for a second visit but does not re-enter the same experiment (because, say, they clicked on an ad, entered elsewhere on the website, and then converted), Adobe Analytics would not attribute that conversion to an Evolv user, but Evolv would. For that reason, we recommend always creating user-level segments in Adobe Analytics when analyzing Evolv traffic.
Bounces - If users enter the site then bounce before the Evolv Adobe Analytics event fires, it is possible that Adobe Analytics counts that user in reporting but there is no record of the Evolv event for that user. Users that bounce before the page fully loads may cause discrepancies between systems.
Pre-Ordinal Conversions - Evolv will only count conversions for a user after they have entered an experiment. However, because we recommend using user-level segments in Adobe Analytics for the most accurate and comparable data, results will vary depending on your business. For example, suppose User A had two visits: on visit one, they visited your website and converted at 8 am without ever seeing an Evolv experiment; on visit two, they entered an Evolv experiment and did not convert. Evolv would not count that conversion, but Adobe Analytics would, as long as your analysis time frame included the pre-Ordinal visit that converted.
Beacons - When data is sent back to Evolv, we use the Beacon API, which has the advantage of not losing the call if the user leaves the page too quickly. If Adobe Analytics is not configured similarly, Evolv may receive events that Adobe Analytics drops.
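To illustrate why the Beacon API survives fast page exits: navigator.sendBeacon hands the request to the browser, which delivers it even if the user navigates away immediately, whereas an ordinary XHR can be cancelled on unload. This is a generic sketch, not Evolv's code; the endpoint URL is a placeholder, and the navigator object is passed in so the fallback path is visible:

```javascript
// Generic sketch of beacon-based event delivery (not Evolv's code).
// "nav" is the browser's navigator object; passing it in keeps the
// sketch testable. The endpoint URL is a placeholder.
function sendEvent(nav, payload) {
  var body = JSON.stringify(payload);
  if (nav && typeof nav.sendBeacon === "function") {
    // Queued by the browser; delivery survives page unload.
    return nav.sendBeacon("https://example.invalid/events", body);
  }
  // In a real client you would fall back to fetch/XHR here, accepting
  // that such calls can be dropped if the user leaves too quickly.
  return false;
}
```

In the browser you would call it as sendEvent(navigator, { ... }).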