Q2 test & learns: quick wins and discoveries from our recent experiments

In this podcast episode, we delve into the world of digital advertising experimentation and share the quick wins and valuable insights we gained over the last quarter. Join us as we discuss the successful strategies we implemented, including the use of emojis, testing new ad types, measurement techniques, and targeting tools. Discover how our experimentation fueled growth and transformed our campaigns across multiple brands. Tune in to gain practical takeaways and learn how innovation and iteration can lead to significant successes in your own advertising efforts. If you’re looking to optimize your strategies and achieve quick wins, this episode is a must-listen!

Podcast Transcript

A good marketer is going to plan for H2 starting now. Or they're going to be a bad marketer, not get ahead of their shit, wait until the first half closes, and then have to figure out an H2 strategy that really doesn't go into effect until Q4 — and then all buying changes completely.

So, where do we want to start? I want to start with one of your experiments around MetaMatch, testing it as an audience enrichment and targeting tool. Yeah, this has been a fun one. MetaMatch, Metadata — they're all part of what I'll call the LinkedIn community, where everybody's talking to each other.

A lot of buzz on LinkedIn. So yeah, we had to try them. Historically we've used furio, we've used Clearbit for some stuff; MetaMatch has been a new tool. MetaMatch is Metadata's B2B targeting tool — they use 10 different data providers. [00:01:00] Their unique selling point is funny. We were talking about this just earlier today: ZoomInfo is great because they have a lot of data, but ZoomInfo thinks that we're dragging search.

So, when we look for data providers, the most important thing for us is that they have accurate, up-to-date information on people who fit the ideal customer profile of our customers. And I think MetaMatch has been a big win there. My initial goal with MetaMatch was that I really wanted to enable LinkedIn-esque targeting on Facebook.

I mean that’s in order to effectively run B2B ads on Facebook, like you need a data provider. Sorry, Facebook, your data’s crap shit. And you need one of these tools in order to be effective. In our initial launch, I will say I was not as impressed with the Facebook piece as I would’ve hoped.

I [00:02:00] mean, yes, it did achieve the goal of making Facebook a viable option, but it did not meet my expectations from a CPA perspective. It's something we're going to continue to test for the couple of clients we have it running for — it was profitable, it made sense, it was really good. Where I was much, much more impressed was with an A/B test that we were running, where we essentially took the

persona and ICP targeting of one of our clients. You can do this on LinkedIn already, right? You can build out job titles, roles, whatever. So we just took the exact targeting that we had in LinkedIn and built out the same targeting in MetaMatch. The thing that impressed me about MetaMatch is they did have some job titles that were more granular.

Think of this as a job-title targeting A/B test. There's a lot of overlap in these two audiences, with similar job titles, but MetaMatch [00:03:00] had much deeper granularity, which was awesome. And I think because of that — so, the results that we saw: these are standard lead gen campaigns focused on demos and trials, bottom-of-the-funnel stuff.

We had a 51% reduction in CPA comparing the MetaMatch audience to the standard job title targeting. MetaMatch is $20K a year, which seems pretty worth it at that scale — this client is spending over a hundred grand a month. It seems like a no-brainer at the end of the day with the efficiencies that came out of it.
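As a quick gut-check on that math — a minimal sketch, assuming a $200 baseline CPA (a made-up placeholder; the $20K license, the 51% reduction, and the spend levels are the figures from the conversation):

```python
# Rough effective-CPA math for an audience-enrichment tool at a fixed
# annual media budget. baseline_cpa is an assumed placeholder; the tool
# cost and CPA reduction are the figures discussed in the episode.

def effective_cpa(monthly_spend, baseline_cpa, cpa_reduction, annual_tool_cost):
    """Blended CPA once the tool's annual license is folded into total cost."""
    annual_spend = monthly_spend * 12
    improved_cpa = baseline_cpa * (1 - cpa_reduction)
    leads = annual_spend / improved_cpa
    return (annual_spend + annual_tool_cost) / leads

# At $100K/month of spend, the $20K license is ~1.7% overhead:
big = effective_cpa(monthly_spend=100_000, baseline_cpa=200,
                    cpa_reduction=0.51, annual_tool_cost=20_000)

# At $5K/month, the same license is a third of the whole media budget:
small = effective_cpa(monthly_spend=5_000, baseline_cpa=200,
                      cpa_reduction=0.51, annual_tool_cost=20_000)
```

The pattern generalizes: compare the blended CPA to your baseline CPA, and the tool pays for itself whenever the blended number stays comfortably below baseline.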

If you're only spending $5K a month on LinkedIn, yeah, probably not something you're going to want to pursue. But if you are investing heavily, there are some serious efficiencies that can be worked in there. So: some things to look out for, some things that we discovered.

This was a comparison over the first two months of data that we had. [00:04:00] I will say, this is just the nature of B2B: audiences are small. There were some of these audiences where we saw that initial awesome 51% reduction, but in months three and four we saw the results come back down. It was still good, it was still an improvement, but some of the smaller audiences just wore out pretty quickly. Now, that can easily be addressed on the client side with better, more interesting creative, stuff like that. But there were a couple of circumstances where I felt like, okay, we exhausted that audience pretty quickly.

You know, maybe we’ll let it rest for a couple months and then bring something back with a better, more interesting. Offer. But, and that was the other thing, we were also very bottom of the funnel focused. Where if you were doing like content based offers or more of a demand gen focused strategy I think that.[00:05:00]

you could get in front of the right people, which is the goal in any form of advertising. So, I don't know — I like it. Did you guys test out other audience layers beyond job titles? I know you just mentioned that job titles were obviously a much more granular targeting set compared to LinkedIn native, but what about the other parameters, like intent data or

tech stack that might be in place at the contact's company, things like that? Did you guys test any of those other nuances that are completely outside of LinkedIn's native features? Yeah, so we have some technographic targeting going, where you're targeting people who are users of the competitors.

Mixed results, honestly. I think that one's going to be industry by industry, based on what competitors they have. If you are, I dunno, competing with a giant — a company that's a hundred million plus ARR — I think you're going to have a solid data set to choose from. If [00:06:00] it's a smaller market, I do think you're going to struggle a bit more.

For example — trying to give a better picture — if you're targeting users of a large competitor, like, I dunno, Salesforce, that's probably a good one. People who are actively using Salesforce, who are in a specific industry; maybe you have a specific industry solution. You could layer those two on top of each other and probably get a fairly decent competitive set.

You know, if you’re doing, if you have a smaller, more niche audience I think it’s going to be a little bit of a struggle. Okay. Fair enough. I will say one thing, like the platform has some limitations too, based on what you pay, like you only get a certain amount of audiences. One of the things that we learned, metadata is probably going to hate me for saying it, but like a loophole, is that you can take their, essentially build out their job titles and then like layer in other things on top of that like [00:07:00] geo industry.

It’s just other native targeting that is built into LinkedIn so you can get the more granular audiences without building an additional audience and then going over your audience limit and having to pay for it. But. I think that, you should be experimenting with both of those. And at any point you shouldn’t be trying to work your way around tech costs.

You should be focused on results. And if you're getting the efficiencies that we've seen so far, it makes sense to up the plan. To me it's not rocket science, but overall I have been very impressed with the tool. I'm interested to see — we're only testing in, I'll call it, the SaaS space right now.

I’d be very curious to see how it expands to b2b. Beyond that. Like I said, with any data provider, it’s about how good the data is that fits your specific customers. Like when you’re shopping for your shopping MetaMatch, first see [00:08:00] Clearbit. Like first thing you should do is have them build out your car, your ideal customer profile, and make sure that they have the data that you’re looking for because.

Different tools are going to have different data. ZoomInfo has the largest data set but crappy quality. MetaMatch does really well in the tech space. Cognism is more international. There's a use for each one of these — we did a whole episode on this, so I don't think you have to go into depth.

I think you’re good on that front. Cool. Anyways, like the tool so far. Test it out if you got the budget for it. There you go. All right. What do you want to hit next? I don’t know. You want to talk about some branded search tests? I know we’ve done an episode on this one before too, but I think you got the answer for us of should you bid on branded search and will you just capture all of that traffic organically [00:09:00] anyways?

It’s always a fun. The answer is in all marketing. It does depend. However, there is a way in which you can go about determining what the true answer to that question is. And it does come back to,

Are you strong in your SEO? Do you have a strong presence in the search results?

Are you fully out there as far as your digital brand is concerned? That way, when you do run an experiment to try to determine whether branded search is worth the investment or not, you can effectively measure it — because you can only manage what you can measure.

So the experiment I'll talk about here was a FinTech company that was actually in two global markets, the US and the UK.

And while the intent of the experiment was to determine whether branded search was worth the investment or not, it actually led to some interesting [00:10:00] geo-level insights as well, as they relate to brand presence and brand awareness. So, just to set the stage here: it was a FinTech brand.

Two markets. Their SEO — because we were in charge of it — was solid, so we could effectively measure it. And we measured it through the lens of branded search impressions, looking in Google Search Console. Now, all of us know that Google Search Console doesn't provide fully transparent data. However, it provides good-enough data, which we've talked about before, James.

We're not looking for perfection, but using what the tools are good for and what we have at our disposal — and it was good enough to lend some insight here. So what we did is we created our tolerance levels and benchmarks, understanding our baseline level of branded search queries coming through Google Search Console.

That traffic was unaffected and unaided by paid media in many [00:11:00] regards. And then: what were our branded search campaigns driving and generating from an impressions, clicks, and click-through-rate perspective? Those were our baselines. And therefore, when the time came and we shut down branded search, we could actually look at the incremental lift we had on organic search, right?

What was the transference of traffic from one channel and platform — paid — over to another — organic? I want to have this in front of me as we talk through it, because I think it's important to really look at what we were trying to overcome and understand. In a time when we're all trying to do more with less, we're trying to make our budgets as efficient as possible and squeeze everything we can out of every single dollar, both short term and long term.

This experiment was set up to understand what would happen when we shut down branded search. To your point, James: do we capture that same amount of traffic? Does it still convert at the same rates? [00:12:00] And can we save the thousands of dollars we're currently spending on branded search and allocate that to other areas that are going to create more demand, more awareness, and more branded searches over the course of time?
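The measurement itself can be sketched in a few lines. These numbers are invented placeholders; in practice, the organic clicks would come from Google Search Console and the paid clicks from the ads platform:

```python
# Sketch of the recapture calculation for a branded-search pause test.
# Inputs would come from Google Search Console (organic branded clicks)
# and the ads platform (paid branded clicks); these numbers are invented.

def recapture_rate(baseline_organic_clicks, test_organic_clicks, paused_paid_clicks):
    """Fraction of paused paid branded clicks recovered by the organic lift."""
    organic_lift = test_organic_clicks - baseline_organic_clicks
    return organic_lift / paused_paid_clicks

# A UK-style result: organic absorbed essentially all the paused paid traffic.
uk_style = recapture_rate(baseline_organic_clicks=4_000,
                          test_organic_clicks=5_000,
                          paused_paid_clicks=1_000)

# A US-style result: only partial recapture, hinting at weaker brand presence.
us_style = recapture_rate(baseline_organic_clicks=4_000,
                          test_organic_clicks=4_400,
                          paused_paid_clicks=1_000)
```

A rate near 1.0 says organic recovered what paid was buying; a rate well below 1.0 says some of that paid traffic was genuinely incremental.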

So that was the hypothesis — our hypothesis ultimately rested on the belief that, yeah, when we shut this down, we're going to capture the majority of the traffic through organic. Now, what actually came out of this was that we saw a lift. Basically, we captured all of the traffic organically that we were previously capturing via paid — in the UK, I should say.

My apologies — in the UK, we saw enough of a lift that we got back to where we were. Now, in the US we saw a lift, but it was not as drastic a lift as we expected. We saw people searching, yes, but the amount of traffic, and therefore the click-through rate, did not increase by the same amount as in the UK. [00:13:00]

And when we started to think about this: the FinTech company we were doing this for was based out of the UK. Their European presence and European market were probably the strongest within the organization as far as driving revenue is concerned. The US was more or less a secondary market. Yes, they had a great presence, they had great Fortune 500 clients in the US market.

But from an overall branding perspective, it was not as strong as the UK. And therefore our assumption and hypothesis was that, while we were able to incrementally improve and save money in the US market, it wasn't as strong as the UK because of the brand presence and brand awareness side of things.

So we’re still actually going through this process of, how do we create a new level of demand, a new level of brand awareness within the US market to effectively increase this amount of branded. Presence that will inherently drive more organic traffic and not need the paid budget to support it like [00:14:00] it does today.

But in the UK, we basically came out of this saying: look, we just saved you $3,000 a month that was otherwise being spent on branded search. We're still converting at a great rate. We're still capturing the same amount of visibility, the same amount of traffic, that we would otherwise have needed branded search for.

So let’s take this $3,000 and devote it into other lead gen and demand gen possibilities. It’s that reallocation of budget that allows us to move the needle as far as the overall growth of the brand is perceived and. This is why I wanted to take it on home when it comes to this experiment, is the same experiment can be done for any organization.

It’s just setting up the necessary protocols, the baselines, and understanding what you’re looking to do. In this experiment, we let it run three months before we made a true ultimate decision as far as keeping branded paid search, paused for the UK market and bringing it back a little bit for the US market.

But we were doing weekly checks. We were asking: what is the [00:15:00] amount of traffic that we've recaptured via organic this week? And then, looking week over week, do we see it incrementally increasing? And therefore, when we look at that baseline versus the testing period, did we capture the entire amount,

or did we only capture a percentage? The other thing that I'll just throw out there is a variable that did not come into play with this client, but can certainly come into play depending on the business: seasonality. And that's why, when you look at that baseline, you want to take into account not just a month-over-month or week-over-week viewpoint, but really look at a baseline across those seasonal factors and influences.

In this case we used three months; maybe six months is more appropriate. And it doesn't mean you have to run the experiment for six months — look, if you've got data suggesting your hypothesis is correct, then make a move. You don't need to wait on it. I think the big thing there that you might have left out a little bit is competition.

What made me think of that is you have two completely different markets, and this obviously varies product to product, but [00:16:00] generally speaking, the US market is way more competitive in search than the European market. Sure. I did miss that, and I should have mentioned it, because when we were setting up this experiment, we confirmed and validated that there was no competition on branded search.

And in those weekly readouts and reports, we did that validation and confirmation each time, because in the experiment setup we had outlined with the client that if somebody does start bidding on us, look, we have to call this experiment — or really consider whether we're going to continue it, or bring back a level of branded paid search just to protect ourselves in a defensive stance.

So that is a good point that you brought up. It's interesting, because you read all the studies out there, right? And they say: oh, when you have the top paid listing and the top organic listing, generally there's a 15 to 20% overall lift. That's the standard stat that's always been given out there.

And we’ve run this test like a couple of times and mm-hmm. I don’t think that’s as true as you think it is. Like I, [00:17:00] so there was, there was a lift, so like when I look at actual paid search versus organic search, like organic search, clickthrough rate was low almost. It was hot. No, it was higher.

Really? Okay. Yeah,

Branded paid search was averaging, between both markets, I'll say a 20% click-through rate. Organic in the US market was actually 50%, and in the UK it was 34%. Now, when we looked at the experiment outcomes and looked at purely organic search, the US click-through rate increased by five percentage points — so, 50% to 55%.

The UK — which is why I was saying before that we saw the full capture there — increased from 34% to 56%. So it actually broke some of those standard norms and misconceptions about the European market and its aversion to clicking on ads. Those click-through rates actually suggested that UK users [00:18:00] were more apt to click on search ads, or to pay less attention to ads versus non-ads.

Now, again, it all comes back to the market, the vertical, the audience, things like that. But in each market, we definitely did see an increase in click through rate when we were looking at purely organic in that experimental phase.
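For clarity, here's the same comparison expressed as percentage-point lifts (the rates are the ones just quoted; the helper itself is only illustrative):

```python
# Percentage-point CTR lift from the organic-only experimental phase,
# using the click-through rates quoted in the episode.

def ctr_lift_points(before_rate, after_rate):
    """Change in click-through rate, in percentage points."""
    return round((after_rate - before_rate) * 100, 1)

us_lift = ctr_lift_points(0.50, 0.55)   # US organic CTR, baseline vs. test
uk_lift = ctr_lift_points(0.34, 0.56)   # UK organic CTR, baseline vs. test
```

The asymmetry (5 points in the US versus 22 in the UK) is the geo-level brand-strength signal the hosts describe.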

Yeah. I mean, after this experiment, like you said, the next step is: we saved this money.

Is it better spent somewhere else? Because, yes, with branded search, you were going to have gotten some of that anyway. And then the question is: are you going to get more if you invest that into building the brand? Or — I don't know, it could be anything — attending one more

booth, show, conference, something like that. We're not talking about, okay, let's throw it all back into advertising — I understand there are other marketing channels. You could throw that into an attribution tool and have better insight into what's working and what's not, and make better decisions.

There’s, yeah. This goes back to the episode when we talked about [00:19:00] maximizing media spend, when we were talking about, not looking at. Media spend versus agency fees versus tech tools as single light items. But looking at it as an aggregate, if we saved you $3,000, right? Where can that $3,000 be better utilized?

Is it more media spend? Is it the tech tools we talked about — the attribution tool you just mentioned? Creative? Creative was part of that episode too. Three grand a month buys you five Metadata licenses, something like that. Yeah, something like that. And then you can bank your 51% cost-per-acquisition savings and go to your boss like: look at this, check this out. Yeah, it's funny how all these things work together. So let's talk about the next one here — I feel like this one is a relatively new release by LinkedIn: business email validation.

Ah, yes. So this one is interesting — this is new. I feel like [00:20:00] Facebook needs to do this too, I'm just saying. It has always been a no-brainer, at least on LinkedIn, to shift from sending traffic to a landing page to using lead ads. Maybe that's an experiment we can give the data on at some other point.

I don’t have that in front of you, but I will tell you it is a no-brainer from a conversion rate standpoint to do lead ads rather than a landing page. Now the question that always comes up is lead quality Great thing about lead ads and the bad thing about lead ads is that it kind of auto-populates the form for you and all you have to do is click submit.

And that’s why the conversion rate is a lot easier. It keeps somebody on the platform. Somebody who’s on LinkedIn wants to stay on LinkedIn. They don’t want to go to your landing page because that has taken them off the site. LinkedIn also doesn’t want them to go to your landing page, so they’re probably less likely to serve you an ad that goes to a landing page rather than [00:21:00] a lead form.

So the big concern everybody has is that the leads are crap, because you're getting a Gmail address, and your sales team doesn't want to reach out to it because it doesn't look like a real person — it's a Gmail or Hotmail address that filled out an ebook form. Trust me, I get it. I don't want to reach out to those people either.

What LinkedIn released about a month ago is the ability to force somebody to use their business email through domain verification. So you can't use Gmail, you can't use Hotmail — you have to use your Dragon 360 email. The sales team is going to be super happy about this. But what do we know, right?

We know that what's going to happen is that you are going to get [00:22:00] fewer form submissions, right? You are now forcing a verification step that wasn't in place before. But what we don't know is how that actually affects results. So we ran this across a couple of clients, and here's what we saw.

We did an A/B test of the same ad with two separate forms — one required verification, one didn't. Same piece of content, same offer, same everything, right? So it's a true A/B test. And we saw a 61% decrease in conversion rate across all of these brands when we added the business email verification. So yeah, we saw way fewer leads come through.

A 61% decrease — I would consider that pretty bad. But then you have to look at the quality. Only 10% of the submissions on the form that did not require a business email [00:23:00] actually included a business email. That means 90% of those emails were personal addresses — versus the other side, where a hundred percent of them were business emails.

Why? Because we required it.
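Running the quality-versus-quantity math on those figures — the 100-submission baseline is an arbitrary scale factor; the 61% drop and the 10%-versus-100% business-email rates are the ones from the test:

```python
# Usable-lead math for the email-verification A/B test. The baseline
# submission count is an arbitrary scale factor; the 61% drop and the
# 10%-vs-100% business-email rates are the figures from the test.

def usable_leads(submissions, business_email_rate):
    """Leads the sales team will actually touch (business emails only)."""
    return submissions * business_email_rate

baseline_submissions = 100
verified_submissions = baseline_submissions * (1 - 0.61)   # 61% fewer forms

without_verification = usable_leads(baseline_submissions, 0.10)  # personal emails dominate
with_verification = usable_leads(verified_submissions, 1.00)     # every lead is a business email
# Verification cuts raw volume by 61% but roughly quadruples usable leads.
```

That's the counterintuitive result: the "worse" conversion rate produces several times more leads the sales team will actually work.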

So again, it comes down to quality versus quantity. In this test in particular, across the clients we ran it for, I do feel like the form verification was worth it, even with the 61% decrease. The down-funnel stuff is going to take a little bit longer to

reconcile: did this stuff actually turn into sales? These clients are B2B; they have a three-to-six-month sales cycle, so I can't give you that information yet. But I can tell you, yeah, there was a serious drop in conversion rate when you did it — 61% — but you had basically a hundred percent MQL rate, versus only 10% of the emails being business contacts.

If anything, I like being able to go through and immediately see what [00:24:00] the companies were and whether those companies fit our ICP. From an agency perspective, being able to see that in real time is pretty awesome, because you can basically take the email domains, map them against the ICP, and say: this is good.

Versus Joe Smith at hotmail.com, who may or may not work for a company that fits our ICP — and finding that out, or maybe never finding that out, because your sales team didn't even want to reach out to him at all, since he had a Hotmail email. In this situation, I think I'll take the verification. But everybody should test. Let me put it this way:

if you have a large sales team that just wants more leads — if you're struggling for leads as is, struggling for contacts as is — yeah, you probably should not do the business verification; I wouldn't do it. If you have a sales team that's literally throwing away [00:25:00] stuff they feel is irrelevant, I would jump on it immediately.

I feel like this also helps circumvent some of the struggles that we have with the feedback loops coming from the sales team or our clients back to us: what is the progression of stages? What are the audiences and accounts in pipeline that we can then nurture with first-party data, right?

I think it breaks down some of those barriers on our side, especially with the transparency, like you were saying, from that domain verification. It's not just the company name that's being submitted within the lead form — it's the email. It's giving us extra data points to create additional targeting criteria and first-party audience lists ourselves, within those nurturing campaign strategies that are necessary to push somebody through not just the pipeline, but ultimately into becoming a customer.

Yeah, you can do that stuff in HubSpot or Salesforce, but it's just not as clean — that's what I'm [00:26:00] saying. Yeah. You have to wait for the conversation to happen, whereas in real time I can go through and look at all of these companies, like I said, map them to the ideal customer profile, and figure out what I need to adjust.

Maybe these companies are too small, maybe these companies are too big — and you just make optimizations in real time, which is huge. That's exactly my point: it creates that speed of efficiency, and it reduces the unstructured data points that might exist in the CRM when you're trying to convert a personal Gmail address into a business email and need either an enrichment service or an SDR/BDR to reach out, figure out where this person actually works, and update their contact records.

What was it — we were talking about this with — oh, that study that somebody released about the amount of time it takes a BDR to [00:27:00] reach out to somebody. It was kind of scary. I feel like another one to see is: okay, once they reach out, how long does it take them to actually qualify or disqualify?

Because it’s okay, if it’s taking some of these companies like six days to reach out to somebody who submitted a form and then it’s taking them five days. Or longer to update the status of that lead in the crm? I dunno. You can see the problems. You know, I, not everybody, it’s funny, it was good at this, the startups are good at this because everything matters to them.

You get to a billion-dollar company and it's hard to look at. Throw in some seasonality factors in how those folks get commissioned, and you've got some sandbagging salespeople who might not update stuff for a couple of weeks. Yeah, we've seen that ourselves too. So anyway — that doesn't help us much on the paid media side, but being able to look at business emails does, right.

So

[00:28:00] One of the experiments I wanted to mention was the concept of holdout tests, and understanding the effect of ad exposure on the various elements of your marketing and advertising strategy. At a high level, a holdout test is simply taking a part of the market and holding it out from your marketing and advertising activities.

In its simplest sense, think about taking your list of target accounts and splitting it 50/50 or 75/25, where you're going to expose 75% of your market to advertising, and for the other 25% you're going to do your best not to expose it at all. So, inclusion and exclusion audiences, right? The hypothesis is that the more you expose the treated audience to ads, the more likely they [00:29:00] are to become customers of yours. Now, again, in a time when we're trying to really maximize our budgets and understand where to put our best dollars,

this is one of those experiments that's not hard to do, because you're just taking your target audiences and splitting them accordingly, then looking at the results: are the accounts that are not being exposed moving through the pipeline? Are they converting? Are they converting into customers?

And I think it’s, it can be done at so many levels. But it does depend on your data and the structure of your audiences. And it is a lot easier when you have first party data as your targeting method. If you’re using things out there like demographics and firmographics that are of third party nature. Native targeting based off of industry and titles within LinkedIn as an example

there. It’s really difficult to do compared to first party.

With, I think that’s what one, sorry, go ahead. I was going to say, I think this one’s like huge for abm. Absolutely. [00:30:00] Because, let’s be honest, those ABM metrics that we see in 6 cents or whatever, like lifted accounts and all this, like the problem with ab, like what this answers is like how many of those people would’ve, could become customers anyways.

That’s like the goal, right? So you have exposed group, non-exposed group, you have a certain percentage of them just that you expect to close anyways because they’re actively con in contact with sales, right? Yep. And you want to know if ads. Are the difference, and I’m sorry, but like the lifted account metrics in 6 cents and demand base are a load of bullshit to help 6 cents and demand base sell more stuff.

Those are great tools with great features, but the measurement of the effectiveness of their campaigns is a load of crap, right? This can help you understand: okay, what's actually happening with those accounts? Right. It also goes a long way in moving past the attribution [00:31:00] hurdles and conversations that historically block somebody from showing success, because you have to look at it from an omnichannel, big-picture viewpoint, with the understanding that the customer journey is

eight-plus touchpoints. It's not channel-specific, where all the credit goes to a specific search keyword or search campaign — no, that's just part of the customer journey. And to your point, that's why it's a great fit for ABM testing in particular, because of the nature of the strategy and the nature of the experiment.

The biggest thing here is you can't do it on search. You have to do it on social, on programmatic, on connected TV, things like that. For example, one of the holdouts that we ran was heavily weighted toward CTV, with programmatic and audio on top of that. And what did we see when we looked at the holdout?

We saw a 60% lift in conversion rate for the audiences that were treated — exposed to our ads. A 60% lift in conversion rate — that's pretty [00:32:00] significant by itself. Another one, going back to your ABM point: we took an account list, split it 75/25, and found that the accounts that were treated, or exposed to ads, were actually six

times more likely to turn into an opportunity for the client, and three times more likely to become customers, just by being exposed to our ads. That's a pretty significant lift. And when you're going around justifying expenses and investments and saying where we need to put our money, those are the types of experiments that go a long way in justifying the strategy you have in play.
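The underlying arithmetic is simple. Here's a minimal version with placeholder counts, shaped to reproduce a 60% lift like the CTV example (the counts themselves are invented):

```python
# Relative conversion-rate lift of the exposed group over the holdout.
# The counts are placeholders shaped to reproduce a 60% lift like the
# CTV/programmatic example; real inputs come from the CRM or ad platform.

def conversion_lift(exposed_conversions, exposed_size, holdout_conversions, holdout_size):
    """(exposed rate - holdout rate) / holdout rate."""
    exposed_rate = exposed_conversions / exposed_size
    holdout_rate = holdout_conversions / holdout_size
    return (exposed_rate - holdout_rate) / holdout_rate

# A 75/25 split of a 10,000-account list:
lift = conversion_lift(exposed_conversions=120, exposed_size=7_500,
                       holdout_conversions=25, holdout_size=2_500)
```

A lift of 0.6 means the exposed accounts converted 60% more often than the held-out accounts — the holdout's conversion rate is the "would have closed anyway" baseline.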

Yeah, you actually get some insight into how these channels are working together too, and what the incremental impact of a given channel is. Exactly, and it's an easy thing to do. Similar to the branded search one I was talking about before, it is an easy thing to do.

You just need to come up with the strategy and the best [00:33:00] way to do it, so you can measure the outcomes. But go into it with a hypothesis, so you can set the stage with that scientific approach. I'll tell you the best and biggest use case for this, and that's retargeting audiences. You're targeting people who've already visited your website, already engaged with content, already did whatever, right?

So yeah. The reason you're doing that is you want to amplify, you want to stay relevant, and you want to stay in front of them because they've shown intent. Now, you don't know what that level of intent is. And ad platforms, Facebook, LinkedIn, connected TV, whatever, are going to show view-through attribution.

They're drawing a correlation from "this person saw an ad" to "this person converted." Those are two separate paths. They're not saying that one caused the other; they're just saying that this percentage of people who saw the ad converted on the website, view-[00:34:00]through. They're not actually making the psychological connection that this ad caused this person to do this, because it's impossible to do that.

It's like, data does not prove psychology. Now, where the holdout test comes into play is it can tell you: the group that saw the ad converted at this percent, the audience that didn't see the ad converted at this percent, and the difference in those conversion rates is the value that that channel provided.

And honestly, it's probably more than that, because again, attribution isn't perfect. Certainly when you're trying to make a case for a specific channel, this is something you should be trying to run. Well, to build off of what you're saying with retargeting and our viewpoint of it being in the ad realm, the same exposure-and-holdout experiment can happen for email and nurturing processes.

Hold out a certain percentage of your market from your email, and see whether the email is actually [00:35:00] making a difference in moving somebody through the pipeline and converting them. If it's not, that doesn't mean email's a failure. Maybe it's your sequence, maybe it's your content. There are so many different variables that you can play around with in that holdout mentality across the various channels of your digital strategy.
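[Editor's note] The holdout arithmetic described above is simple enough to sketch in a few lines. This is a minimal Python illustration; the function name, the 75/25 split, and all the counts are invented for the example, not data from the episode.

```python
# Sketch of a holdout lift calculation: compare the conversion rate of the
# treated group (exposed to ads) against the holdout (withheld from ads).
# All numbers here are hypothetical, chosen for illustration.

def relative_lift(treated_conv, treated_n, holdout_conv, holdout_n):
    """Relative lift of the treated conversion rate over the holdout rate."""
    treated_rate = treated_conv / treated_n
    holdout_rate = holdout_conv / holdout_n
    return (treated_rate - holdout_rate) / holdout_rate

# e.g. an account list split 75/25 between treated and holdout groups
print(f"{relative_lift(120, 7500, 25, 2500):.0%}")  # 1.6% vs 1.0% -> prints 60%
```

The same arithmetic applies to the email holdout idea: withhold a slice of the list from the sends, then compare pipeline conversion between the two groups.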

For sure. All right, let's hit a fun one here before we start to close out today: emojis. You love emojis. So what are you seeing as far as performance and impact are concerned when you're using them? Oh, this one was a fun test. Probably one that we were honestly late to the game on.

You know what? We weren't late to the game on it at all. I finally took the time, and I'll tell you what, this was a pretty messy data analysis, if I'm being honest, trying to aggregate data across all the different accounts that we work with. We've been testing emojis in our ad copy for a long time now.

I took a [00:36:00] look at a couple of months' worth of data and looked at the engagement rate. I was really focusing on engagement, not conversion, so I'm going to make that very clear: the focus was on engagement, to see if we were getting more engagement from using emojis. And it was pretty significant.

Looking at all of the ads that had emojis versus all of the ads that didn't, we saw a 28% lift in engagement rate for ads that contained emojis compared to those that did not. In B2B, we talk about how everybody wants to be all uptight and professional, suit and tie, that kind of stuff.

You know, part of our passion is to make B2B more fun, and not everybody has the budget to do a really funny video, and not everybody has the right person to be the entertaining, TikTok-influencer type, right? [00:37:00] This is a really quick, easy win to just make yourself a little bit cooler than you are right now.

And it works. You know, maybe the next step here is to dive into some of the conversion metrics. I don't think I'm going to be able to pull that data across all brands in the same way that I can look at engagement, because everybody's goals are different. But it seems pretty obvious.
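[Editor's note] As a rough sketch of the kind of cross-account aggregation described above, here is a small Python example. The ad records and all numbers are invented (chosen so the comparison lands near the 28% figure mentioned); real data would come from each platform's reporting export.

```python
# Hypothetical emoji engagement comparison: pool engagements and impressions
# per ad, grouped by whether the ad copy contained emojis. Data is invented.

ads = [
    {"emoji": True,  "engagements": 320, "impressions": 10000},
    {"emoji": True,  "engagements": 280, "impressions": 9000},
    {"emoji": False, "engagements": 250, "impressions": 10000},
    {"emoji": False, "engagements": 230, "impressions": 9500},
]

def engagement_rate(group):
    """Pooled engagement rate: total engagements over total impressions."""
    total_eng = sum(a["engagements"] for a in group)
    total_imp = sum(a["impressions"] for a in group)
    return total_eng / total_imp

with_emoji = engagement_rate([a for a in ads if a["emoji"]])
without = engagement_rate([a for a in ads if not a["emoji"]])
print(f"lift: {(with_emoji - without) / without:.0%}")  # prints "lift: 28%"
```

Pooling totals (rather than averaging per-ad rates) keeps a low-impression ad from skewing the comparison, which matters when aggregating across accounts of very different sizes.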

I think the other thing to add on top of this is: you're mentioning a KPI as far as engagement rate is concerned, and what many marketers struggle to keep in mind is that those engagements can be triggers for additional audiences and targeting within various platforms. If you're increasing your engagement rate, you're getting more reactions, more shares, more comments, et cetera, and you can actually create audiences [00:38:00] based off of that criteria. You're therefore creating a larger audience of engaged, qualified individuals that you can then effectively retarget, nurture, and prospect further.

Yeah, it's funny, you're so right. Everybody thinks of retargeting as people who visited the website, right? That's what they think of.

Every retargeting audience that we build includes people who previously engaged with our ads. Engagement is super important. One, you want to know whether or not people like your stuff, right? So we talk about measurement when lead forms go away, right? This is not a lead gen campaign.

It's demand gen focused. We're giving you the content; we're not trying to capture your contact information. So how does the measurement change? We're not trying to capture email, but we still need to know whether or not people like our stuff. So engagement becomes the core focus. We want to see a ton of engagement on our ads: likes, comments, [00:39:00] shares, clicks through to the website, whatever.

All of those things are going to tell you whether or not your content is good.

And this gets into the whole demand gen thing. Another thing on this emoji piece: in LinkedIn ads, I think you can use up to 600 characters. One of the things I also strongly encourage is to use those 600 characters and say what you want to say.

Even if you're pushing somebody to an ebook, put the top five points that ebook is trying to make in the copy. Maybe use some emojis to make it more readable and visible. You're going to get higher engagement rates because you're actually giving them something of value while they're on the platform.

And like I said before, LinkedIn wants to keep people on the platform, and people want to stay on the platform. So you're just letting the customer do what they want to do and engage the way they want to engage. Give them the information they want. Make it readable, add emojis, make it more [00:40:00] fun.

It pays off in the end. Because like you said, say you were promoting an ebook or a webinar or some top-of-funnel piece of content or video, right? Then, based on the engagement on that, you're going to pitch-slap 'em with the old demo. Just gotta have a hand reach out of the monitor and slap you: here's your demo. So, one thing you brought up, before we bring this to a close, was taking those five points and throwing 'em in the post copy. You don't have any initial findings as far as document ads are concerned in front of you, do you? No, that's hit or miss, because it's dependent. Somebody made this quote: people don't not like webinars, they don't like bad webinars. I think the same thing goes for eBooks. People aren't tired of eBooks; people are tired of sales guides disguised as eBooks.

Yeah, okay. Fair enough. So to bring [00:41:00] this episode to a close, I'm curious what's on the horizon for us as far as our testing and innovation are concerned? Oh, you know what's on the horizon. You were supposed to launch 'em this morning. They're not available to us yet; I confirmed with LinkedIn support.

So LinkedIn thought leader ads are not available to everyone yet, but they are absolutely getting tested by us and multiple clients of ours. That one will be interesting because it's being rolled out in phases. Just to give some clarity on the situation: we've seen some rumors come out saying, oh yeah, today's the day LinkedIn officially rolls out thought leader ads. And I go to our account, I go to a couple of our clients' accounts.

I don't see shit, so I reach out to support. They're like, yeah, I don't know where you're getting that information from; it's a phased rollout. So I'm like, what the hell? They're like, maybe you should just check back in two weeks. So clearly the information we were seeing out there, from the various noise and loud voices in the [00:42:00] room on LinkedIn, was not accurate.

So it is definitely on the horizon for us as an experiment. Yeah, I feel left out if we don't have access to it yet. That's why I verified with multiple clients. The other one, if I could get somebody with the guts to do it: I want to get a Cameo thing going. Yes, I do. I want to get the slithery snake guy out there, and I want to get a couple Office characters.

I'm going to shout out metadata for this one for talking about it: a little event promotion strategy using celebrities. That seems like it would be a way to get a little bit more buzz around your boring webinar. Get some old D-listers on there, Ray Lewis, get 'em out there promoting a webinar on infrastructure software.

It's going to be pretty [00:43:00] freaking sweet. I think the other ones are just some continuations with our client base when it comes to holdouts and branded search. I know there are a couple branded search ones on the docket as we head into Q3 and beyond, to really understand whether there are better ways we can be maximizing our budget beyond what's running today.

And holdouts are a significant portion of that as well, especially in that ABM realm. I've got another one I want to maybe formalize some data around, and that's the meeting-booking tools. I've got some clients who are really pushing towards that now. I think everybody wants to do that.

With what we have going on right now, it's a "sales doesn't want to change" type of thing. Anecdotally for ourselves, when we implemented that, the level of junk coming through as far as submissions are concerned decreased drastically. I mean, there are still a few that filter through when [00:44:00] they book a call with me via Calendly; I can read through it and basically cancel on them, because I know it's not a true discovery call. But the amount of spammers and vloggers and link builders and everything else under the sun basically went away overnight. Fewer things to manage and cull through. They didn't go away entirely; they just still used the lead form. It's almost like the lead form is your spam folder now.

So we put Calendly into place, right? And there was a significant decrease in the overall amount of people submitting the form. We saw a decrease in junk submissions overall, even though the form still exists.
