Behavioral Analysis for Sites of Any Size

DIY Behavioral Analysis for your E-commerce Site

This presentation was recorded on February 24th, 2016, at eTail West.  The full text transcript of the presentation follows the post.

All too often, when retailers hear about "Behavioral Analysis", they think of big-data-esque stuff. Typical use cases are personalization or advanced segmentation--and those are really cool things. However, I'm of the opinion that they're a distraction from some major wins that retailers can gain even without expensive software, an army of PhDs, or their own private data centers. Instead, I show how I learned critical things about my own site's shortcomings using tools like Google Analytics, just by thinking about the problem a little differently.

I discuss A/B split testing, heat maps, and other user behavior-driven systems I used to make iterative improvements over five years in my previous role. I hope you find it interesting!

Have questions?  Drop me a line via the Contact form--I love to talk shop.

Up and to the right!

Full Transcript Follows

This is what I've been looking forward to the whole time because, as he mentioned, I do like talking shop and this is a unique type of situation. I spent more than five years at a company called PoolSupplyWorld, and during that time ... I started as an engineer, and we were eventually acquired by Leslie’s Pool Supplies, the national brick-and-mortar chain, where I was vice president of digital marketing for a couple of years before branching off.

What I'm going to do is take those five years of experience and show you how we learned what we learned. It's less about the insights I gleaned in 2010. Hopefully a lot of this looks obvious in hindsight, but the bumps along the road are where I want to tell the story. If you glean anything from that, you can start to learn in the same way we learned, which is what allowed us to grow the way we did.

To start with, behavioral analysis is sort of a buzzword. It means a couple of things in a couple of different contexts. The most common is using customer information to try to identify things about the value of the customers themselves: segmentation, personalization basically, or some sort of modeling or machine learning. I'm not going to be talking about that, because it doesn't work on a small enough scale. If you are a large company like Amazon, you have infinite traffic, infinite data, infinite resources, and an army of PhDs to do all of this. As I think the panel asked, “who's a small company?” I was about the range you should talk about, $10-20 million, and we could barely hit statistical significance in most of the things we did.

I'm going to show you our approach, which grew more technical over time, but at no point am I going to just be showing you statistics. When it comes down to it, if I want to try to beat Amazon at predictive modeling, behavior analysis kind of stuff that they're doing, they're going to win. I can't beat them, from a commodity retailer perspective, at their own game. If they have an army of PhDs, I'm going to take a completely different tack at it, and see if I can produce a customer experience that is sustainable, and gives a customer a reasonable and viable alternative to just buying everything on Amazon.

Instead of trying to learn something about the shoppers, I'm going to let the shoppers teach me about my own website and about my company. Instead of slicing the group into statistically less and less relevant segments, I'm going to aggregate it all back up and say, "Okay, I'm going to apply Pareto’s Law, or the 80/20 principle. Everybody's using the site; how do I make it awesome for the 80% group?" I'm not going to try to focus too hard on the fringe groups. Especially when you're small, you have to do that, because that's how you grow. Then you have the luxury of worrying about fringe cases.

Effectively, what we're doing is taking data and turning it into money, and I love that line; it's on my LinkedIn profile. But it's incomplete. There's a middle step between data and money: analysis and information. You turn data into information, and information is much easier to turn into money. So this is a five-year journey, mostly through mistakes we made, that gets us from point A to point B.

I wanted to contextualize how long ago 2010 was, because it doesn't feel that long. I was looking for pop culture references and news headlines and things like that. So I was looking at this, and thought… this doesn't do it at all. This feels like yesterday. Now, if this doesn't do it, this does. It may not look stylistically much different than Amazon does today, but if you look on the left, their primary focus was selling books. If Amazon was still just selling books, it would be a completely different ecosystem. This was the reality in 2010, when I started with PoolSupplyWorld.

There were a couple of things here that were starting to become industry standard practice. The navigation on the left was very, very common. They hadn't really gone all-in on the search yet, because their catalog hadn’t gotten so broad. You were already starting to see shoppers trained by Amazon to expect certain types of experiences.

The current team I'm working with, their site looked pretty modern for 2010. You see a couple of the things that were really, really popular and trendy at the time. They still have the side nav, which worked well. They have this slider, with things the marketers and merchandisers decided the customer needed to see, and everybody had the same thing. Even Google Shopping had that as a home page for a while. It looked like that, and when I started at PoolSupplyWorld, they were very proud of having just launched this. It's a website. I'll give them that. It was cutting edge for 2006. It was 2010, which is rough. As an engineer, I was looking at this going, "Ugh. It's bad." If I read the text to you--and I promise I won't read any more slides at you--PoolSupplyWorld wrote, "We carry all the well known brands in the Pool and Spa Supply Industry, as well as innovative new manufacturers. Our role is to," oh my gosh. Bruce Clay was responsible for that text. It was old. It was terrible. The title of this thing was, "Pool Supply World. Your online pool and spa resource center and more, ellipsis." No.

I was looking at this. I'm new to this company. I'm brand new. I had worked eCommerce a little bit before that, at what's now Build.com, and I was looking at this going, "Okay, what's the point of this huge banner section that everyone built?" Everybody is doing this slider. Is that a good thing? Does it do anything? The reality is that there is almost no chance that you, as a brand new customer coming to PoolSupplyWorld, are looking for an Intelliflo pump. You're looking for a $1,400 pump on a whim? Not likely. This is a waste of time. I was convinced of it. Because we had an eCommerce platform that was built from scratch, and because there were no real A/B testing solutions way back in the day, I built one myself. I baked it in. It allowed me to compare two different templates to each other. Simple. Dirt simple. It just dumped out the numbers, and I had to crunch them myself.
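The core of a dirt-simple template split like that is just deterministic bucketing plus a tally. This is an illustrative reconstruction, not the original code; the template names and the (session_id, converted) event format are assumptions:

```python
import hashlib

def assign_template(session_id: str, templates=("home_v1", "home_v2")) -> str:
    """Deterministically bucket a visitor: same session always sees the same template."""
    digest = hashlib.md5(session_id.encode()).hexdigest()
    return templates[int(digest, 16) % len(templates)]

def conversion_rates(events):
    """events: iterable of (session_id, converted). Returns template -> conversion rate."""
    stats = {}  # template -> (visitors, conversions)
    for session_id, converted in events:
        t = assign_template(session_id)
        visitors, conversions = stats.get(t, (0, 0))
        stats[t] = (visitors + 1, conversions + int(converted))
    return {t: conv / vis for t, (vis, conv) in stats.items()}
```

Crunching the dumped numbers then amounts to comparing the two rates, which is exactly the by-hand step described above.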

I took this, stripped out that entire bar, moved those four subcategories up, and went, “okay, cool”. They're not good, but at least they're relevant. They're redundant to the navigation, but at least they're relevant. Let's go. However, being the arrogant engineer, I quickly hacked the skin. Awesome. I just built a new testing platform. Launch. Okay. Here are the templates. Good. Launch. Um ... Lunch! Went to lunch. Here's what I actually launched on a live website, in the middle of our busy season. Oops.

I'm at lunch, halfway through a sandwich, about two miles from the office, and my phone is exploding. My boss, the CEO, is like, "Roy, you've got to get back here. You broke the site. The home page is crap. It's a disaster. You've got to get back here right now!" I'm like, "Oh, okay." Vroom. Oh. Yeah, okay. Fix it. Am I fired? Okay. I'm not fired yet.

But I'm like, “look, we have the data!” Let's see. As it turns out, I was looking at the conversion rate, and everybody who saw the home page during those maybe 45 minutes to an hour added up to a statistically significant sample size, because it was the middle of the busy season and everybody was hitting our site at that point. And I learned that none of the stuff on our home page mattered at all. The entire thing was useless. It wasn't just the stuff at the top; it was the stuff at the bottom, the middle, all of it. There was no difference between these two, and like I said, the math worked out: that was a waste of space. You'd think we'd stop doing that. We'll get there.

Now we have this A/B testing system online. All right. I'm not going to make that same mistake again. I'm going to test it a little bit before I launch it live. This was the product page. It's a little clunky. Simple but effective. It looks like it was designed by a programmer, because it was. What we were testing was the in-stock status. Back in the day, we had our own warehouse, but we drop-shipped a lot of stuff, so the expectation of shipping times would be different based on whether it was coming from our warehouse, where it would be coming lightning quick. Some very clever engineer, looks a lot like me, built the warehousing system. It was great. But the drop shippers were in a little Excel-spreadsheet, PDF-via-email-and-fax-machine kind of world.

We wanted to emphasize our own inventory, which sounds very obvious. But at the time, it was kind of a big deal to try to distinguish these two things. The two messages were supposed to be, "In stock. Order today, to your door by," a date that involved the estimated shipping time for that particular product, etc. The alternative was, "Out of stock." We wanted to try out of stock, backorder, in stock but not bright green, things like that, because the secondary thing was--switch back--it wasn't really out of stock, it was just drop-shipped. We needed to control that. Then we were just saying, "Ships in 2 to 12 business days," which is what the whole site used to say. It used to just say, "Oh, we do our own shipping," but nobody cares when you ship it; you care about when it arrives at your door, and we had tested that already, so now we were playing with details. This is what we actually launched before going to lunch. Subtle difference. Before, you saw a big difference that didn't matter. This is the opposite: a tiny difference that, no matter how reassuring that green text is, says, "Don't convert." It was statistically significant within an hour, out of season, because it was deployed across every single product page. That was the other end of the spectrum.
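The "statistically significant within an hour" claim is the kind of thing you can check by hand with a standard two-proportion z-test. A minimal sketch, with made-up conversion counts; the thresholds and numbers are illustrative, not the actual test data:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. bright-green "In stock" vs. plain "Ships in 2 to 12 business days":
z, p = two_proportion_z(120, 4000, 90, 4000)
```

With busy-season traffic spread across every product page, sample sizes like these accumulate fast, which is why the verdict arrived within the hour.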

Then this is like 2011-ish now. We're starting to refresh the brand. We have a new logo. We have a new designer on staff, finally. Everything is looking like Pinterest. We're going to look like Pinterest. As you can see, the navigation stuff is all very 2011. Tablets are going to take over the world. We should really be worried about tablets. We don't have the resources at this point to manage a mobile site by itself, so let's just make the desktop site not suck too badly on a tablet, then maybe it won't be so bad on the phone either. Phones were still a minority, at least for traffic.

We crammed all the navigation up into nice big fat-finger-friendly buttons at the top, and put these at-least-thematically-and-seasonally-relevant things on the home page. This is when there started to be more tools available, so we were playing with Visual Website Optimizer--there are other tools that do similar things--and click maps. You don't need these tools to get this insight; I'll show you examples in Google Analytics. 80% of everything I'm going to tell you, you can do with free tools nowadays. Using VWO, we learned that nobody was clicking anywhere. I didn't need to break the home page to know we had gone backwards. We were back to the same problem we had before. None of this mattered. They were hitting the navigation and the search, right?

What do we do? What do we do to make this more relevant to shoppers? Okay, well, we know our catalog. We know what shoppers need better than they do. Shoot. Let's just pick the things that we know they should buy, slap those into the side bar like Amazon (evo was doing that before we were), and I'm sure it will be wildly successful.

There are actually two wild assumptions here that are wrong. One, that we know what's best for the customer, and two, I was convinced that pictures get clicked on. Nope. The content of the home page continues to be utterly useless. Our most trafficked page is doing nothing for us. There's a little bit of activity on the side bar there, because these categories do matter, and since we buried everything in that allegedly tablet-friendly navigation at the top, we were surfacing something, right? Okay. That's incremental progress. This is a step in the right direction. But you can still do the same kind of thing. I promised a Google Analytics reference. This is the same information, but several years later. This is last week at evo: green means not that many clicks, and red means lots of clicks. There are featured and major sections here that I would love for them to explore and buy things from, but realistically that's not where they clicked. They want the snowboarding stuff, and you can see the same sort of thing here. There are perfectly good merchandising reasons to have the other stuff, but the UX challenge is identical to what it was back in 2011.

Back to PoolSupplyWorld. If only there was some way to know what they're actually looking for when they landed. If maybe there was some sort of eye tracking technology I could use to tell me what's going on. Maybe, if only there was some way ... Maybe a service like a live chat, push it on them or something. If I know what they're looking for, then I can merchandise to them.

One day it was like, "Oh, wait. They've been telling me all along." They're like a bounced user, plus-plus. A bounce goes away and tells you nothing. A person who hits your search is basically telling you the same thing: you failed to tell me what I needed to find out. I failed to learn the answer to my question. I failed to find the product I actually wanted. But then they go ahead and tell you what it was they wanted. They're practically doing your QA for you.

You can do this with Google Analytics. You couldn't back then, but you can now. It's under Behavior, then Site Search, then Pages; then you select the Start Page. In our case, we’re talking about the home page, but it's even more relevant for category views and things like that. You can get the list of queries that are being submitted. We took that information--this is the August-or-September period, around 2012. The winter covers, safety covers, and leaf nets are perfect products for that time of year, because people are starting to think about closing their pools. As you can see, we then used that same insight to make some of the merchandising decisions, but the cover pumps in the middle are an interesting one, because that type of product wasn't on my radar.

At this point, I was doing mostly marketing. I was still programming, but I thought I knew the catalog pretty well, because I had all the data. I had all the data all the time, but cover pumps would get lost in the wash. They were five or ten orders a week, which was nothing in season compared to other products on the site, but they converted really well. Not a lot of traffic, but they produced a small number of healthy orders. The conversion rate was spectacular. I just wasn't drawing enough interest in them. That cover pump opportunity was surfaced by looking at the search queries that were being submitted on the home page and then not converting.

I was looking through these keywords, going, "Cover pumps? Sump pumps? What are these for?" When you put a safety cover over a pool in the Northeast, snow and rain and stuff accumulate on top of it, and if it's a solid cover, it will start to sag, so you have to have a cover pump to keep the water off of the cover. They're also used for other winterizing things, but it was something that wasn't even on my radar, because I was looking at the top hundred SKUs all the time. This was much better. We felt confident about it. We brought buttons back, because I had learned my lesson. Finally, we're starting to see some action that looks like a competently designed web page. The navigation is actually getting hit in roughly the right order, and buttons are getting clicked. There's still action on the search bar; I think that's universally going to be true on a home page, because it's the least relevant shopping page on the entire site, because it's the least specific. Then the nice thing is that the pools category is a little less destroyed with clicks.

This is from about a month ago, long after I left. I left in September. You can see they've updated: pool cleaners, pool heaters, major equipment. This time of year, people are thinking about opening their pools--they're on the other end of the season--and starting to replace equipment that failed or was damaged in the winter, things like that. I would bet you dollars to doughnuts, including the ones on the wall [eTail had a wall of doughnuts just before this--you’ll have to attend to see what that means], that in another month or so you'll see opening kits and chemicals and things like that. The navigation is much better, and it's basically the same story, but it's been iterated upon again and again.

However, once you get to a certain point, you're looking at these keywords, and you're going to start seeing this happening. This is from evo’s home page. The queries that make this up are very diverse; the largest single keyword makes up less than 1% of searches. What do you do with this? There are two ways to look at it. First, it could mean that you've succeeded: there are no big glaring gaps in what your home page is trying to accomplish. The other thing you can do is start aggregating synonyms. You can do this in Excel--there are tricks you can do there. We were using a database, but effectively, you can take the keywords out of Google Analytics. In my case, I was taking things like pool cleaner, pool cleaners, and pool vacuum, which are effectively synonymous depending on where in the country you are, and treating those as a single term. If you start to aggregate those, you don't have to build a semantic engine that can do crazy language-interpretation AI stuff. You can just say, "Well, that one's really common. Add that to the list. That one's really common. Add that to the list," and in just an hour or two, you'll probably have 90% of use cases covered. You'll have a much better list.
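That hand-grown synonym rollup needs nothing fancier than a lookup table applied to the exported search terms. A sketch, where the synonym map and query counts are illustrative, not real export data:

```python
from collections import Counter

# Grow this table an hour at a time, most-common terms first --
# no semantic engine required.
SYNONYMS = {
    "pool cleaners": "pool cleaner",
    "pool vacuum": "pool cleaner",
    "pool vacuums": "pool cleaner",
    "winter covers": "winter cover",
}

def aggregate_queries(query_counts):
    """Collapse raw query counts (query -> hits) into canonical-term counts."""
    rolled = Counter()
    for query, count in query_counts.items():
        normalized = query.strip().lower()
        rolled[SYNONYMS.get(normalized, normalized)] += count
    return rolled
```

Sorting the rolled-up counter then gives you the much better list described above, with regional spelling variants counted as one term.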

The other thing you can do, if your home page looks like this, is go to the category page. The category pages are going to have more specific searches, because the conversation is different. This is the accessories page on evo. Now the searches have the context of the accessories conversation. People aren't looking for every single product or every single category in the entire website as often. It becomes narrower, and you can focus on it. When you do that synonym analysis on this kind of thing, it gets even better.

What's the next step? Product pages. This is a fairly heavily hit product page. The first two keywords made up a sixth of all the searches from there. Those two keywords likely represent things I can probably add: if they're product related, I could do a frequently-bought-together kind of plan; if they're information queries, I can probably add content to that page and make it stickier. They're telling you what they're looking for.

So far that's search behavior and click behavior, but there are other types of data. Product reviews. Most of us, if we're modern-day retailers, are collecting product reviews, either in house, like I was, or through a vendor. You think, "Okay, cool. My customers are telling me about my products, and the qualities of those products." Almost. That's not exactly what they're telling you.

These two things are robotic pool cleaners. They do the same job. The one on the left is a Polaris. It's a huge brand within that tiny niche. It's big. This thing is a work of art. Looks like an Aston Martin for a reason. It is hyper intelligent. It can do crazy patterns on your pool, no matter what shape your pool is, to make sure it hits everywhere. It has a remote control. It'll actually drive up out of the water and you can pick it up, so it feels lighter than it is if you're dragging it out of the pool. It's magnificent.

On the right is a Smartpool. This Smartpool is anything but. It weighs about four pounds. It's just this plastic toastery thing. When too much sand and silt gets into the bag that holds that stuff, the bag just stops letting water through. Then the thing just seizes and dies. Their solution was to put grommets in the bag, to let the water through. All the sand and silt goes with it, and it basically just ends up finger painting in mud at the bottom of your pool.

But the Polaris had a lower average rating. What? What am I looking at? Then it dawned on me: that's the difference. If I'm going to spend $1,400 on a robot that cleans my pool, I want that thing to get out of the pool and make me a latte afterwards. If I spend $350, I'm glad it picked up that leaf. The expectation is different. The context is different. It's not just a measurement of the direct quality of the product. All of a sudden I had insight about my own products. My customers had told me something I hadn't really thought of. I knew it was more expensive, but that it was expensive to the point where it would affect how you rate the product overall was mind-blowing. It's a really good product, and it's rated three stars, versus three and a half for the Smartpool. Ouch.

I'm saying, "Okay, how do we do our messaging?" Knowing that we know this thing about how these are perceived, we can push bargain messaging against the Smartpool. Not just because it's cheap, but because people expect things from that cheap product that it can deliver, and we're setting it up for success. Likewise, with the Polaris, we can play to its strengths in terms of it being a feature play.

Next, buying behavior--and this also applies to related ways of acting on interest; wish lists are spectacular for this kind of thing. Anything where people self-identify as being interested in a specific product. But I like buying. Short of calling me and shouting at me on the phone, it's the loudest signal I get from customers, because as soon as they give me their money, there is no larger endorsement for whatever path they took to get there. Money is the loudest thing.

Buying behavior is useful in some ways, and this is the most common. Everybody's seen this. This is straight from Amazon: frequently bought together. This phone case and this slightly different phone case. What's fun about this, that you don't really see until you sit back and think about it, is that if this product is related to this product, and you click to the next product, that's related to other products, and those are related to other products. All of those purchase behaviors overlap, and you can actually build a map of your entire catalog, as if it's an ecosystem, by recursively doing that same kind of analysis. This--it's a little dark, I apologize--is a diagram of an entire month of sales across my entire catalog. There's a bunch of little dots here on the right, and they're all tied to that big orange dot. They are specialty chemicals that are not needed for every single pool, but every single pool requires chlorine. That's the bread and eggs of the pool business. The big swath of larger dots down the middle are major equipment purchases. The large dots represent large volume and large AOV, therefore higher revenue.

Further out is another constellation, effectively, of spa products that are totally unrelated to the specialty pool products. You see parts and things like that on this as well. If you render this several times, no matter what other filters I put in place, because of the recursion--it's kind of like the Kevin Bacon thing--you end up with the same constellations showing up every time, and they’re semantic groups. The importance of that is that this is like a magic robotic merchandising machine. If a merchandiser were to look at a bunch of products, they could tell me these are all products in this similar use case, these are products that apply to these use cases, and if I asked them to list off all the major groups, they'd do a good-ish job. However, there are clusters in here that reveal different affinities than I would have thought of. Sure, you see brand affinities in here, if I were to label the brands and such, and that's obvious, but what I didn't expect to see was that there were affinities between products based on pool size and other variables I wasn't really thinking of.

Here's another way of rendering it. We have a bunch of big dots up above. This is actually narrowed to Florida; I used Florida as a root node. I drew the top 500 products that were really popular in Florida, and then I drew all of the products that were associated with those products, as in bought by the same customer--I was using email address. I can't do frequently bought together, because my average line items per order is 1.1, but if you have a pool last year, you have a pool this year. They come back. Our repeat customer rate was pretty good, so we do have all the products you bought over time, and can do the same kind of thing.

What was interesting here is this island down at the bottom, where they're all connected to the white dot but not connected to anything else. This visualization formed automatically, because these are orphans: people buy that, and that's the only thing they buy. Isn't that sad? Some of these are pretty big dots. You can't tell me I can't figure out that the K0400 is a suction-side pool cleaner. I know I can sell them another product that goes with that. I know the pool it goes to. I know everything about that. As a marketer, I can go, "Oh, man. I'm sorry, Kreepy Krauly Kruiser. I'll find you a friend!" and go get a better marketing angle for those products, because I'm under-utilizing their traffic.
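The customer-lifetime co-purchase graph behind those renderings can be sketched simply: link two SKUs whenever the same customer (keyed by email, as described) bought both at any point, then list the SKUs with no links as the "find them a friend" orphans. The order format here is an assumption for illustration:

```python
from collections import defaultdict
from itertools import combinations

def build_copurchase_graph(orders):
    """orders: iterable of (email, sku). Returns sku -> set of co-bought skus."""
    by_customer = defaultdict(set)
    for email, sku in orders:
        by_customer[email].add(sku)
    graph = defaultdict(set)
    for skus in by_customer.values():
        for a, b in combinations(sorted(skus), 2):
            graph[a].add(b)
            graph[b].add(a)
        for sku in skus:  # ensure single-purchase SKUs still appear as nodes
            graph.setdefault(sku, set())
    return graph

def orphans(graph):
    """SKUs only ever bought alone -- candidates for better merchandising angles."""
    return sorted(sku for sku, neighbors in graph.items() if not neighbors)
```

Recursively following neighbors from any root (a state, a category, a hub SKU) reproduces the constellation effect: connected clusters fall out of the data with no manual grouping.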

However, sometimes frequently-bought-together logic can go a bit astray. This is a harmless example of that: Ray-Ban sunglasses and Furbies. I don't get it, but it happens. If you are a fashion retailer, and you recommend two products that don't go together, and I buy them anyway because I'm illiterate on the topic, I just look like a fool. If you buy this pool filter, and this valve that looks very much like the correct valve, they will literally explode.

A customer did this. We were very sorry. We tried to help them. However, their sales information was now in my product recommendation engine, the one built on top of all those lovely pretty graphs you saw, and now I was recommending those products together. The second time these products exploded, we finally started to go, "Oh, wait. Those are the same SKUs. Something's wrong here," and by the time the third one exploded and we had replaced three pads of equipment in three different back yards, we finally said, "Wait. I think we need an incompatibility matrix," because it's not like car parts, where you say, "Okay, this goes with just that." It's basically, "This could go with anything, except that," and we managed to bake that back in. Customers were telling us things. Some of them were wrong. We needed to be able to tell the difference, or things would explode.
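An incompatibility matrix of this "anything except that" shape is just a denylist of SKU pairs filtered on top of whatever the recommendation engine scores. A minimal sketch; the SKU names are made up:

```python
# Denylist of unordered SKU pairs that must never be recommended together.
INCOMPATIBLE = {
    frozenset({"FILTER-123", "VALVE-2IN-WRONG"}),  # the exploding combination
}

def filter_recommendations(anchor_sku, candidates):
    """Drop any candidate SKU that is known-incompatible with the anchor product."""
    return [
        sku for sku in candidates
        if frozenset({anchor_sku, sku}) not in INCOMPATIBLE
    ]
```

Using `frozenset` pairs makes the check symmetric, so it doesn't matter which product is the anchor and which is the recommendation.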

This one is a little less violent: contact request source. I haven't figured out a better phrase for it, but effectively, imagine someone goes to your contact-us page. If you think of bouncing or searching on a page as a decree of your failure, contact-us is among the worst. Many reasons to contact your customer service agents are perfectly valid, and our customer care team was amazing. Our sales team was amazing. If you did have to contact PoolSupplyWorld, you were taken care of. It was never about the quality of the experience, but it was a condemnation of the quality of the page. At first, we were just like, "What are the topics? Is there any data out of the call center?" The call center doesn't have time for that. We were seasonal as crazy. If they're getting those questions, they're already working overtime out of their minds. They don't need to hear from some engineer who gets to go home as early as 7:30. It's not what they want to hear.

I'm like, "How do I reduce their pain, and increase the stickiness of the page--of the site, rather?" What I realized was that you can use all of the sources of traffic coming into the contact page to inform where you should focus your content efforts. Imagine these very subtle metaphoric arrows are sources of traffic from elsewhere on your site--everywhere someone was just before they looked at the contact page. Wouldn't you want to know what the big blue arrow is? Because you could start there.

For PoolSupplyWorld, the culprits were very technical products, which was a very two-sided story. (Now, don’t actually try to read this; it’s wildly irrelevant, but it's a good demonstration.) If we simplify a page like this too far, we reduce the shopper's confidence that this is the right product to do what they want. If we overwhelm them with it, we also reduce their confidence, and they don’t buy--both end up contacting us. The trick was trying to figure out a way to take a very complicated product, maximize confidence, identify situations where they should contact a professional on our team, and walk them through it. Many people would have been fine, but we scared them with this, and they went to the contact form.

The other end of the extreme is this. This is half of the catalog--we had a large parts catalog. There's not much to say about this gasket, except, "Well, what size is it? Does it replace the gasket I already have?" You don't need to say much. There's not much there to buy with, but we had really failed, and when we started looking at these contact-page traffic sources in aggregate, we saw the categories that were guilty. We started to put our content development efforts towards those categories, and that started to increase our conversion rate across them, and the number of distinct SKUs that were selling increased. We fattened up the long tail within our catalog, because we knew where we were attracting traffic but not getting the job done.

This is something you can do with Google Analytics. It's under Behavior, All Pages, with Previous Page Path as the secondary dimension. Previous Page Path is the most under-utilized, unsung hero of Google Analytics, because for anything in your account that is an indicator that something went wrong--they ran into trouble, or they couldn't find what they were looking for--you can use this to look back: where did they come from? Everybody is focused on exit pages: where did they go after this? That's important too, but people forget we can tell where they came from to begin with, and that is an amazing thing.
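If you have raw page-view logs, the same "previous page" idea can be reproduced outside of Google Analytics. A sketch, assuming you can get ordered path sequences per session (the log shape here is an assumption):

```python
from collections import Counter

def previous_pages(sessions, target="/contact-us"):
    """Count which page each session viewed immediately before `target`.

    sessions: dict of session_id -> ordered list of page paths.
    """
    counts = Counter()
    for path_sequence in sessions.values():
        # Walk consecutive (prev, current) pairs within the session.
        for prev, current in zip(path_sequence, path_sequence[1:]):
            if current == target:
                counts[prev] += 1
    return counts
```

Sorting the counter descending gives you the "big blue arrow" pages: where to focus content work first.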

This is evo’s. Most of this is not very interesting. In the first ten, you see pages that don't get a lot of traffic, so it's going to be biased that way. When you dig a little deeper, you start to see pages where it's apparent what's wrong. Throughout this entire section there are links saying "we have free stickers"--it's a brilliant email-list generation system, but we blew the wheels off of it and haven't gotten around to fixing it yet. I ran across this and went, "Oh." Even on this page where I'm telling you I'm going to give you a free sticker, the arrow literally points at me telling you to pound sand. This is five years after the first slide I showed you. It's the same thing. This iterative improvement is low-tech; it's just consistently applying the same methodology again and again to make the site less and less terrible over time, and you can grow from very small to a lot less small pretty quickly.

If you remember anything from this talk, and I know it's the last talk of the day, so thank you for coming:

  • Click behavior guides merchandising. If you don't have a UX team, and I very rarely had one, just look at what people are doing and it will become apparent.
  • Search behavior data is like being psychic. You can read the minds of the customer. They are frustrated with you for not having given them what they wanted to begin with. You can get in there.
  • Previous Page Path is wildly under-utilized. It's in there, and nobody really talks about it, but it's awesome.
  • Customers are play-testing your site all the time. That's a gaming metaphor, from video games. They are in there, trying out the experience you designed for them, and what they do--their reaction to your attempts to shape that experience--will tell you whether or not you succeeded. Sometimes you're measuring that with conversion rate, but you don't always have to use just that metric, and you may never hear me say that again, because dollars... I can't buy a beer with engagement. But there are leading indicators in here you can use to make decisions about your content strategy and where you're investing your resources, and that will do more for you than focusing myopically on conversion rate optimization.

Audience Question: What percentage of your tests succeeded?

Roy Steves: What percentage of my tests succeeded? 30%? We were of the opinion that if every test succeeded, we weren't really testing. That caused some heartburn with the ownership originally, but we would try wildly crazy things--intentionally try to shake things up. After that homepage test, we basically discovered that most things don't matter. For a long time, the color of the button didn't matter, the size of the button didn't matter... all this stuff that you would think, "Oh, I have an A/B testing system. I know. I'll try that"--it didn't matter. Very few things actually mattered. So we started doing really drastic things. That gave some people a little bit of anxiety, but the more drastic the difference between your test and your control, the faster the test goes, because the divergence in behavior reaches statistical significance between the two groups sooner. We were willing to do almost anything, as long as it wasn't profanity-ridden or something like that. As long as it was appropriate to the context, we'd do anything.
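The "drastic tests finish faster" point falls out of the sample-size math: the traffic you need shrinks with the square of the effect you hope to detect. A common rule of thumb for a conversion-rate test (roughly 80% power at 5% significance) is n ≈ 16·p(1−p)/Δ² visitors per arm, where Δ is the absolute difference in rate. A sketch:

```python
def visitors_per_arm(baseline_rate, relative_lift):
    """Rule-of-thumb sample size per arm for an A/B conversion test.

    n ~ 16 * p * (1 - p) / delta^2, where delta is the absolute
    difference in conversion rate to detect (approx. 80% power
    at 5% significance). Result is rounded to whole visitors.
    """
    p = baseline_rate
    delta = baseline_rate * relative_lift  # relative lift -> absolute delta
    return round(16 * p * (1 - p) / delta ** 2)

# A timid 5% lift on a 2% baseline vs. a drastic 20% lift:
print(visitors_per_arm(0.02, 0.05))  # 313600 visitors per arm
print(visitors_per_arm(0.02, 0.20))  # 19600 visitors per arm
```

A 4x larger effect cuts the required traffic 16x, which is why a $10-20M site can only afford to test changes big enough to move the needle.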

We tried a huge parallax pump-selling work of art, and it never worked. Then we went back and said, "Why don't we just double the size of the buy button?" That worked! We were willing to try anything, because if we weren't failing often enough, we weren't learning anything, and as you saw, failing fast is how I work--and how my team worked at the time--so we weren't afraid of it. It was definitely way less than a 50% success rate.

The other thing about that testing process--and this is a question the panel got that I liked--was: what happens if a test comes up insignificant? Most of our tests were coming out of the engineering team or the merchandising team, which put us at odds with the designers. The designer says, "This is pretty, and this is a good experience," and we're like, "Yeah, but we've got to try this, because it makes them buy the thing." If we're wrong and it's just a wash, or if the designer says, "We should update this horrifying thing that Roy designed to this prettier thing," and it's still a wash, then the tie goes to the designer. When in doubt, we iteratively got less hideous, and it made us much better over time.

Audience Question: Is your experience primarily B2C or B2B?

Roy Steves: 95% B2C. In B2B, it's a challenge of numbers, for sure. If you're playing a numbers game like that and you're doing lead-gen as a first step, as opposed to B2C merchandising, you end up doing the same math, but with additional conversion rates layered in: visitor to lead, lead to conversation, conversation to deal, things like that. It ends up being the same math, just multiplied through several times, and you're doing the same kind of thing: looking for what they're trying to find that they're not finding.
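The "same math, multiplied through" point is just that the overall rate is the product of each stage's rate, so one weak stage drags the whole funnel down. With hypothetical B2B numbers:

```python
from functools import reduce

def funnel_rate(stage_rates):
    """Overall conversion rate is the product of each stage's rate."""
    return reduce(lambda acc, rate: acc * rate, stage_rates, 1.0)

# Hypothetical funnel: visitor->lead, lead->conversation, conversation->deal
stages = [0.05, 0.40, 0.25]
overall = funnel_rate(stages)
print(f"{overall:.3%}")  # 0.500% of visitors become deals
```

Because the stages multiply, the same behavioral question applies at each one: which stage are people falling out of, and what were they looking for when they left?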

A great example in the SaaS space is when you don't have a pricing description, or worse--and friends of mine are guilty of this--you have a pricing tab that takes you to a page that doesn't describe pricing at all. It's the same kind of hard dead end. You don't necessarily have the advantage of a search bar that lets people tell you what's wrong with what you've done, but you can intuit it from bounces. They came to the home page, they clicked the pricing tab, and they're gone. It's the same kind of approach, just a different context.

Audience Question: On the $1,500 dollar cleaner, did you see benefit from video?

Roy Steves: Yeah. At that kind of price point, the manufacturer was also producing more content, and we could do a great job of using that content to change the positioning of the product. We tried different media displays for including those videos--for instance, embedding the video above the product description--and this might speak to the quality of the video coming out of those sources, including our own early internal videos, but when we first started doing video, it hurt conversion. It was significant at P99: bad idea. I'm not saying video is bad. Video is excellent; you can make a lot of money with video. But if you're using videos produced by a pool cleaner manufacturer, even a very expensive one, it might not be as good as what a car manufacturer or a shoe manufacturer puts out. It's a slow-moving industry. When we started doing our own, we spent a solid year making really terrible videos before we got better and started getting some traction.

Audience Question: Other than Contact Us, what other pages yielded insights?

Roy Steves: The next thing after the Contact Us one was sort of the inverse problem: we were drawing a lot of organic traffic to our blog, and the blog was not doing a great job of turning those visitors into shoppers. We used the search activity on the blog as the corollary, because people would land on "how do I find the right size pool pump?"--that's an important question, and we answered it--and then they'd just go away, or they'd search for "what's the best pool pump?" As soon as you see "how do I size it?" followed by "which one is best?", that's the difference between a research question and a buying question. Then we started trying to pump that traffic back into the store.

The other type of activity we were keeping an eye out for was live chat, so we had a live chat solution in place, and we had staff on it year-round, so I could look for the originating page for those live chat sessions as well, and it was much the same story.

I love talking shop. You have all probably picked up on that. If you have any more questions at all, feel free to grab me and I'd be happy to talk about any of this or eCommerce in general. I really appreciate your attention.

How do we deal with issues involving family safe content?

We sell lingerie and we come up against a lot of issues with ‘Family safe’ content, but there is no definitive answer as to what in particular is allowed and not allowed. Would be good to find an official response.

This might be a surprise, but Google uses humans to make a lot of these decisions, and the bad news is that they are subjective. It's a combination of human review by Google and shopper feedback. For your products, I would bet the only safe way to go is to use product photography where the items aren't on a model--but that looks like crap, right? You could also take a look at what your competitors are pushing, and see what is getting through versus what is getting kicked back at you. I doubt you'll get a firm answer, as I think Google is going with the "I know it when I see it" method of defining unsafe content.


Does product title optimization help Shopping Campaign performance?

Roy, have you seen much lift optimizing product titles for Google Shopping Campaigns?

Personally, I haven't seen that much lift from effort spent on product title optimization.

I know there are entire companies out there built around this type of optimization.  They claim incredible things, but I guess I don't get it. Google Shopping is a content democracy.  If you send Google a spectacular product title, or any other content for that matter (great photo!), then you're helping your competitors as much as yourself.  Google is abstracting the product data, and then simply dumping traffic off to whatever retailer the shopper clicks on.  This actually leads to all kinds of shopper confusion. When there's an error in a description, or the like, it's not clear to the shopper whether that data came from you. It was on Google, and they clicked through to you, right? While there are probably incremental gains to be had here, the safest baseline is the same as my SEO advice: be the most relevant result, and most things will shake out.

One area that absolutely makes sense to invest in is the overall quality of your product data and feed in general, which includes some aspect of product title optimization.  In my reviews and in working with clients, I see a lot of room for improvement in the underlying data submitted in the product feed.

First, start by working through your errors and warnings in Google Merchant Center.  Google does a decent job of prioritizing what you need to fix under the Diagnostics tab in Google Merchant Center.

In terms of the product title, are you already following Google's specification of a 150-character limit? Remember that typically only about 70 characters will display.

For products with variants (parent-child SKUs) are you using the common title for all of the variants and adding the variant attributes after the common title?

Are you following Google's editorial guidelines?

A properly constructed product title will typically include the following data:

  • Brand | Product Name | Attribute 1 | Attribute 2 | etc.
  • Levi's 501 Original Fit Blue 32x34

Additional information that I've seen that can help improve the relevancy depending on the product could be gender, model number or any other product descriptors that are commonly used to find a specific product.  For example:

  • Levi's 501 Men's Original Fit Stonewash Blue Jeans 31x34

While this type of product title optimization may not necessarily help your performance within Google Shopping Campaigns, it improves the user experience and relevancy for someone looking for this specific product.
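The construction rules above are mechanical enough to automate in a feed pipeline. A minimal sketch (the function name and inputs are hypothetical; the 150-character cap is Google's, with only ~70 typically displayed, so the most important attributes should come first):

```python
def build_title(brand, name, attributes, max_len=150):
    """Assemble a Shopping feed title: brand, product name, then attributes.

    Attributes (gender, fit, color, size, model number, ...) are appended
    in priority order until the character limit would be exceeded.
    """
    title = f"{brand} {name}"
    for attr in attributes:
        candidate = f"{title} {attr}"
        if len(candidate) > max_len:
            break  # stop before breaching the 150-character cap
        title = candidate
    return title

title = build_title(
    "Levi's", "501",
    ["Men's", "Original Fit", "Stonewash Blue", "Jeans", "31x34"],
)
print(title)
# Levi's 501 Men's Original Fit Stonewash Blue Jeans 31x34
```

For parent-child SKUs, the same helper covers the variant rule: pass the common title pieces first and the variant attributes (size, color) at the end of the list.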


What % of spend that didn't convert is typical?

We'd obviously love every dollar of paid search we spend to convert, but that's not realistic and at the same time we need to explore adding new keywords to the mix. What % of the overall portfolio of spend that doesn't convert is typical/healthy? How is this different for text ads vs. PLAs?

PLAs have the advantage of being able to change the taxonomy of the campaign based on available data. As long as you're hitting your ROAS goal, and are adjusting your product group tree to make sound decisions, this should be fairly self-correcting.

Text ads, however, are a bit of a different beast on this front.

The most difficult scenario here is a very large catalog of products that individually have very little traffic, but add up to a lot of business. If you're trying to target them individually with text ads (say, in some automated way), then you're going to have to spend huge amounts of time to make good decisions, and if 80% of those bids end up not paying off, you're in trouble. Therefore, I recommend starting with the highest-converting ads (low-funnel, high popularity, etc.). Get these producing a reliable ROAS.

From there, start moving toward the mid-tail, keeping an eye on your account's overall Ad spend.

How well your primary ads do versus your targets will determine how much you have to experiment with. For example, if your highest converting products are running at 10% under your COS limit once optimized, you have that 10% to work with. If your highest converting ads are running right at your COS target, however, then you won't be able to hit your COS target with anything that converts any lower.
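That headroom math can be made concrete. Blended COS across core and experimental spend has to stay at or under your target, which caps how much you can spend exploring. A sketch of the arithmetic, assuming COS = spend / revenue (the numbers are hypothetical):

```python
def max_experimental_spend(core_spend, core_cos, target_cos, exp_cos):
    """Max experimental spend (running at exp_cos) that keeps blended
    COS at or under target_cos, given core spend running at core_cos.

    Blended COS = total spend / total revenue, where revenue = spend / COS.
    Solving (core_spend + x) / (core_spend/core_cos + x/exp_cos) <= target_cos
    for x gives the headroom below.
    """
    core_revenue = core_spend / core_cos
    slack = target_cos * core_revenue - core_spend
    denom = 1 - target_cos / exp_cos
    if denom <= 0:
        # Experiments at or below the target COS never hurt the blend.
        return float("inf")
    return slack / denom

# Core ads: $10,000/mo at 9% COS against a 10% target;
# experimental ads expected to run around 15% COS.
print(round(max_experimental_spend(10_000, 0.09, 0.10, 0.15)))  # 3333
```

In other words, running the core 1 point under target buys roughly $3,333 of exploratory spend at 15% COS before the blended account drifts over the 10% line.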

Then, if you have a specific opportunity you want to pursue without going over budget, you have to cool off your bids on your core ads a bit, giving up some market share on those, to afford to explore for new potential core ads.

Have questions?  Drop me a line via the Contact form--I love to talk shop.

Up and to the right!