Jaret is StatBid's mad data scientist, responsible for helping shape the company's bidding algorithms and charting a testing roadmap for StatBid and our clients. At StatBid Summit, Jaret put retargeting to the test by revealing the results of two display retargeting tests.
Jaret Keniston: OK. Hey, everybody. My name is Jaret, I'm a data scientist at StatBid. Before that, I spent about two and a half years working for destinationlighting.com, an online lighting fixture retailer, doing just kind of data science stuff. My background is in statistics and after [inaudible 00:00:25] I kind of got roped into the world of e-commerce, and ever since then I've just been trying to do as many mathy, nerdy things as possible in the e-commerce world.
So the topic I wanted to cover today is retargeting, also known as remarketing. I might use the terms interchangeably, but I mean the same thing. Most of you obviously know what it is, but just to get us on the same page real quick: it's targeted advertising aimed at users who have already visited your site and left without converting; here I'm talking specifically about display retargeting.
So with this channel, as with every marketing channel, one of the big problems, or at least one of the big questions we need to answer, is how do we value it? How do we know how much value a channel is driving? Is it worth the cost? Should I be investing more or less in it? That's proven an especially difficult problem with retargeting.
There are some options. You could try attribution; that's kind of the go-to for solving a lot of marketing valuation problems. You definitely, in most cases, don't want to take a marketing tool's self-reporting at face value. Tools like GA or other attribution services offer something more fair and divvy up that revenue. But it's still mostly a subjective thing. However the logic works, attribution is really just a guess, and it's still problematic.
So I want to go through a quick thought experiment about why I'm skeptical, or why I think we should be skeptical to begin with. Could we even design a better way to game last-click attribution than what retargeting already does? Your client lets you cookie all their users, you identify the ones that are just about to convert anyway [inaudible 00:01:48], you bombard them with ads everywhere they go, and if you get them to click on a few ads you get full credit for that sale. I think that's super unfair, and that's why I think we need to be skeptical about taking attribution at face value when it comes to retargeting.
However, if you talk to retargeting vendors, or any display vendors, they'll tell you just the opposite. They'll say, "We're actually getting short shrift from click-based attribution models." And they make a fairly compelling argument why. They'll talk to you about the view-through interaction: a user sees a retargeting ad or display ad, it works, and it makes them want to come back to the site. But they're smart enough not to just click on random banner ads they see around the internet, so they come back to your site through the address bar, a Google search, or something else. And when that happens, your display channel gets no credit in an attribution model. So, they say, that's short-changing them.
But, how much is that really happening? Obviously, I can see it happening but can we actually prove it? I'm a data person, I wanna actually be able to see it happening. And even if we can verify that these are happening, how do we put a value on them, a number on them? I'm going to try and answer that.
So, test number one: impression tracking. Let's see if we can verify that this view-through thing is even happening in the first place. I worked with a company called [inaudible 00:03:03] that does clickstream tracking, and they also do impression tracking. They have a pixel they send out to display vendors, and that pixel fires off a notice every time a vendor serves an impression to one of your users. That way you can track those impressions, time-stamped in your users' shopping paths, and you can know when they happen and whether they relate to visits to your site.
[inaudible 00:03:22].com was kind enough to share with me three months of that data, which included three months of impression tracking for their Google display remarketing campaign. Having that data, though, how do I even identify what a view-through event looks like? I can't read the intent of the user. So I tried to set it up through two chronological conditions that I think would be typical of what a view-through might look like.
One is that the user has been away from the site for an extended period of time, so they're not currently shopping there. And two is that, within very close proximity to being served an impression, they return to your site, but specifically [inaudible 00:03:53] not by clicking on the ad itself; they came back through another channel.
What reactivation means could differ based on how long you've been away, so I set up multiple windows to try and [inaudible 00:04:05], and I think this is pretty generous relative to what your definition of a view-through might be. For example: you've been away for more than an hour and you come back within five minutes of seeing an impression. Or, on the other end, you've been away for four days and you come back within four hours of seeing an impression.
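As a sketch, the two conditions above can be expressed as a small classifier. The window thresholds and names here are illustrative assumptions, not the exact cutoffs used in the actual analysis:

```python
from datetime import timedelta

# Hypothetical reactivation windows: (minimum time away,
# maximum gap between impression and return). Illustrative only.
WINDOWS = [
    (timedelta(hours=1), timedelta(minutes=5)),
    (timedelta(days=4), timedelta(hours=4)),
]

def is_view_through(last_visit, impression, return_visit, clicked_ad):
    """Flag a return visit as a view-through: the user was away for an
    extended period, was served an impression, then came back shortly
    afterward WITHOUT clicking the ad itself."""
    if clicked_ad:
        return False  # that's a click-through, credited normally
    away = impression - last_visit       # how long they'd been gone
    gap = return_visit - impression      # impression-to-return delay
    if gap < timedelta(0):
        return False  # returned before the impression was even served
    return any(away >= min_away and gap <= max_gap
               for min_away, max_gap in WINDOWS)
```

Running something like this over each user's time-stamped shopping path is one way to count non-click, impression-driven visits like the roughly 12,000 described next.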
So even using this generous definition, we found that over these three months there were about 97,000 to 98,000 regular old ad clicks, and alongside those, about 12,000 of these non-click, impression-driven visits. Even if I treated those as touches in the last-click model, they would account for about 19,000 dollars in revenue. Compared to what was already credited to ad clicks, that's an extra 18 percent in revenue. So maybe there's not that much we're missing with this whole view-through activity.
Okay, so that gives some insight into whether view-throughs are happening, but all of that was still based on some sort of last-touch model, which I still think is super subjective. I wanted something more concrete, more scientific, so I went back to the drawing board.
The A/B test. Let's treat retargeting like a clinical experiment. We can do this through cookie splitting, basically. Because the way retargeting works is that they cookie users when they're on your site, and then they get to follow them around, show them ads, and all that.
So through GTM we can set up, basically, a tag that randomly sorts users into two pools when they come to your site for the first time. And what we did here [inaudible 00:05:23] is we set it up so that Criteo only had access to one pool, one half of our users. The other half Criteo had no knowledge of, no access to, completely walled off. So then we have a control group and a group that's been exposed to Criteo, and we ran this for six-plus months, looking at results for the Criteo group versus the control group. Over these six months we spent about 13,000 dollars on [inaudible 00:05:47] retargeting. Users, obviously, should be about the same in each group. Sessions were higher, which is kind of what you're going for with retargeting. Purchases were higher too. Not falling for [inaudible 00:05:58] same slide again.
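The cookie split itself is simple to sketch. In the talk it was implemented as a GTM tag; the version below is just a hypothetical equivalent that hashes a cookie or user ID, so the same visitor always lands in the same pool without any extra state:

```python
import hashlib

def assign_pool(user_id: str) -> str:
    """Deterministically sort a first-time visitor into one of two pools.

    Hashing the (hypothetical) cookie/user ID gives a stable, roughly
    50/50 random split: the "exposed" pool is visible to the retargeting
    vendor, while the "control" pool is completely walled off.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return "exposed" if int(digest, 16) % 2 == 0 else "control"
```

Because the assignment is a pure function of the ID, a returning visitor keeps the same pool across sessions, which is what makes the six-month comparison between groups valid.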
And you can see here that conversion rate went up a little. AOV actually went up too, which was super surprising. In fact, the AOV rise was more statistically significant [inaudible 00:06:11] than the conversion rate increase, which was very, very surprising. Altogether, that was an increase of about 90,000 dollars for about 13,000 dollars in spend.
On a per-user level, revenue was about 19 cents per user higher in the Criteo group, against about three cents more per user in spend, which comes out to an ROI of about 677 percent. That's not bad at all. Deals [inaudible 00:06:32] in GA had Criteo over that time at about 123 percent [inaudible 00:06:36], which is pretty terrible. So that's a pretty big challenge to what you might have concluded just looking at traditional attribution models.
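The per-user ROI math works out roughly like this. The inputs below are rounded figures from the talk, so this sketch lands near, but not exactly at, the quoted ~677 percent:

```python
def incremental_roi(rev_per_user_exposed: float,
                    rev_per_user_control: float,
                    spend_per_user: float) -> float:
    """ROI (as a percentage) from a cookie-split test: value the channel
    by the revenue lift per exposed user over control, relative to the
    incremental spend per exposed user."""
    lift = rev_per_user_exposed - rev_per_user_control
    return lift / spend_per_user * 100.0

# Rounded figures: ~19 cents/user lift on ~3 cents/user spend.
# With these rounded inputs the result is in the low-600s percent,
# close to (not exactly) the ~677% from the talk's unrounded data.
roi = incremental_roi(0.19, 0.00, 0.03)
```

The point isn't the exact percentage; it's that the incremental, experiment-based ROI is several times the ~123 percent a click-based attribution report showed for the same channel.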
So, quick takeaways. Did we learn anything? How should we value display retargeting? One: view-throughs are probably a real thing. It makes sense, and we did see a lot of visits driven in close proximity to impressions that seemed to reactivate users. But even being super generous about it, that was maybe an extra 15 to 20 percent of traffic we were missing. So view-throughs aren't really what's making display poorly valued by attribution models. And, with or without view-throughs, the A/B test suggests we could be undervaluing retargeting by as much as four or five times with standard attribution models.
So clearly view-throughs aren't the whole story. I think there may be something more gradual and long-term in how retargeting works on people, keeping the product or brand in mind over the long run. But the A/B test is really the purer experimental form of this, and it revealed value that view-throughs alone couldn't account for.