[This is a transcript of Part 4 - Store Conversion on Steroids. Below is a link to the recording.]


Kausambi: One thing that I found very interesting when I was looking at your company is this whole — I forget what you call it — but it's like a whole library, almost a hub of strategies, that you have in place and that your customers can tap into. And I find that

really refreshing, honestly, because a lot of the things that teams end up needing to do are things that might have already been done by others, or with others. So it's usually great inspiration, but I don't usually see a lot of consulting teams or agencies actually put that out there.

So I felt that was very refreshing. What made you do that?

Justin: We're very fortunate that we get to see a whole bunch of data. We've conducted more than 15,000 experiments at this point in the history of the company — we've pretty much seen or done it all by now. And when you look at it from an optimization or improvement perspective, there are only so many ways to do things. I mean, there are only so many ways to slice and dice a demographic.

And if you're doing optimization from a true optimization standpoint, there are only so many ideas or concepts in marketing — nobody's reinventing the wheel. It's just a matter of figuring out which has the highest leverage, or what could have the biggest impact, and how that can apply to different brands based on basic demographics.

Because we know that certain demographics behave differently — females are going to shop differently than males, and different ages within that range shop differently too. And we use that as kind of our library. Our initial library, I think, is up to 82 concepts, ideas, and principles that we pull from and can then adapt, at least to get us the initial learnings we're looking for in how we approach optimization and growth. Because we're not —

we're not just spraying and hoping we see something stick. We're not rapid-fire testing. We're not using tricks, gimmicks, or tactics. We're making very incremental changes and adjustments to really understand the wants and needs of a visitor — their likes and dislikes, and what holds weight. Do they really care about social proof?

Okay, how much do they care about social proof? Do they care about those secondary payment options? How much do they care about security? All of those things play a role in different pages and flows on an e-commerce store, or what have you.

So yeah, we've just built that Rolodex to give us more of a productized, systematized approach to optimization. Like — okay, we know nine times out of ten this certain thing is going to have an impact; it's just a matter of what that impact actually is.

Kausambi: And have you open-sourced it, or is it something people have to come to you for to get access to?

Justin: We keep it pretty close, but at the same time it's no secret. We don't have it as an open-source repository or code base or anything.

We do have that internally, meaning we have short code. If you're going to run a shipping notification, or like a shipping-time progress bar type situation, there are only so many ways to code that. So we really try not to rewrite everything every single time.

So we kind of have a repository, because we're very developer-heavy internally.
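As a rough illustration of the kind of reusable snippet Justin describes — not Conversion Fanatics' actual code — the logic behind an "order by the cutoff to ship today" notification might be sketched like this. The 2 pm cutoff and the wording are assumptions for the example:

```python
from datetime import datetime, timedelta
from typing import Optional

CUTOFF_HOUR = 14  # assumed warehouse cutoff: 2 pm local time

def time_until_cutoff(now: datetime) -> Optional[timedelta]:
    """Time left before today's shipping cutoff, or None if it has passed."""
    cutoff = now.replace(hour=CUTOFF_HOUR, minute=0, second=0, microsecond=0)
    return cutoff - now if now < cutoff else None

def shipping_banner(now: datetime) -> str:
    """Render the countdown message a product page might show."""
    remaining = time_until_cutoff(now)
    if remaining is None:
        return "Order now to ship tomorrow"
    hours, seconds = divmod(int(remaining.total_seconds()), 3600)
    return f"Order within {hours}h {seconds // 60}m to ship today"

print(shipping_banner(datetime(2022, 5, 10, 11, 45)))
```

The point of keeping something like this in a shared repository is exactly what Justin says: there are only so many ways to write it, so write it once and adapt per client.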

Kausambi: All right. So we're just rounding up week one of "Store Conversions on Steroids" — a 9 days, 9 experts series.

And the fourth expert we have here is Justin. We've learned a bunch already — the last three days were super, a lot of learning. From Marty, we learned about the art of persuasion, and how to really keep the customer perspective — the customer — bang in the middle of all your decisions.

Think about that user's perspective before you take any decision. It sounded super interesting — it's something we all know, but need reminding of often. Then we spoke to Lorenzo, who continued the trend on customer perspective: he spoke about how to mine reviews to pull out nuggets of how customers think about your products, and how to use that across your store's journey, from navigation to search, et cetera.

Very, very interesting. And one interesting thing that came up with Lorenzo was: hey, a lot of times you might not have enough reviews, so you don't have to go deep-quantitative all the time — it can also be qualitative analysis. And then Chad continued on with Oliver, where he spoke about how, to run A/B tests on any kind of continual basis, you sometimes need to understand whether this is the right stage of your business to get into that.

Right. So a ton of great learning, and it's interesting how we're continuing the thread today. Today's topic with Justin is how to balance qualitative versus quantitative — and what to look out for — when you're planning conversion rate optimization for your store and your site. And we want to dive in now.

So welcome, Justin. Thank you so much for being here. A quick intro about yourself for our audience: what do you do, and why is it important for e-commerce companies?

Justin: I've been a marketer for absolutely ever — it seems like 21 years. I've been doing optimization as a service at Conversion Fanatics, where we just celebrated our eighth birthday as a company. That's all we do — we literally do optimization as a service all day. We primarily work with a bunch of amazing brands you've likely heard of, and some you probably haven't.

So we swim in data all day. Yeah.

Kausambi: And talking about that bunch of brands — I was reading on your site that Mike, the founder of ButcherBox, mentioned that one single A/B test they implemented along with you increased their monthly conversion rate by about 40%.

And I thought — I want that magic bullet! One single test that can do that? Give it to me, Justin. But jokes aside, tell us more about that. I thought that was very interesting.

Justin: Yeah. I mean, I wish there was one magic bullet every single time, but it's a combination of things that lead up to those bigger overall improvements.


I've got one running right now at a 94% increase on a mobile test. It's just a combination of how we approach it, both qualitatively and quantitatively, to really learn the behaviors of those visitors. And that leads to those bigger overall compounded improvements. Once in a while, you hit those home runs.

But it's a lot of base hits that lead up to those home runs. And the important thing I think people need to understand is that it's a process.

Optimization is a process. There's no magic one-size-fits-all split test that's going to make everything better.

And I think that evolving process helps brands truly grow and scale.

Case Studies

Kausambi: Yeah! The process itself — let's dive into that a little bit. Walk us through one example, maybe for ButcherBox, or, if you can speak about it, the exact one you just mentioned — getting a 94% uplift on mobile. What is the typical process that you follow?

And I guess, what is the process that someone can follow when they're planning their CRO strategy?

Justin: Yeah. So to me, CRO is more about figuring out what holds weight. As marketers, we often assume — we assume something's going to be better, or we see it on such-and-such site

and we think it's going to be good for my brand. But instead, we want to listen to what those visitors are truly telling us, and we use our experimentation to figure that out. So, do they care about social proof? Do they care about all of these aspects? Which type of shopping path do they care about?

Do they prefer a bestseller path, or do they choose to shop by collection? What really matters to those visitors? We use our experimentation to figure that out. And our process is really that we try to cover all of the key site areas — home, collections, product, cart, checkout (depending on the store; obviously only some Shopify plans let you test on checkout) — but we try to get good coverage of those key areas.

And then we measure not only primary metrics, like conversion rate to sale or revenue per visitor; we're also looking at: did they visit the cart? Did they visit a product page from the homepage? Did they click on a certain element? Did they actually add the product to cart?

We try to measure those secondary engagement metrics from a quantitative perspective, and that gives us feedback. It's like, okay, we did X and they responded really favorably — so how many different ways can we expand on that? We use that experimentation and those results to inform which direction the strategy needs to go.
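To make the primary-versus-secondary-metric split concrete, here's a minimal sketch of the metrics Justin lists, computed from per-variant event counts. The numbers are made up for illustration, not client data:

```python
# Hypothetical per-variant event counts -- illustrative numbers only.
variant = {
    "sessions": 5_000,
    "product_views": 2_600,
    "adds_to_cart": 700,
    "cart_views": 620,
    "orders": 150,
    "revenue": 9_750.0,
}

def funnel_metrics(v: dict) -> dict:
    """Primary and secondary engagement metrics, per session."""
    s = v["sessions"]
    return {
        # Primary metrics: what the test is ultimately judged on
        "conversion_rate": v["orders"] / s,
        "revenue_per_visitor": v["revenue"] / s,
        # Secondary engagement metrics: where in the funnel behavior shifts
        "product_view_rate": v["product_views"] / s,
        "add_to_cart_rate": v["adds_to_cart"] / s,
        "cart_view_rate": v["cart_views"] / s,
    }

for name, value in funnel_metrics(variant).items():
    print(f"{name}: {value:.3f}")
```

Comparing these per variant shows not just *whether* a change won, but *where* in the funnel the behavior changed — which is the feedback loop Justin is describing.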

Instead of, "hey, we spent a month analyzing data and doing user testing, and we found this and this and this" — and then we still have to start somewhere. So we take a little bit different approach, in that we just want to get testing as quickly as possible, because that's going to give us direct feedback

versus just guessing or assuming something is going to make that big impact. And then we support it with the qualitative side: heat maps, click maps, and exit polling on sites to ask open-ended questions — "what questions can we answer for you today?"

Or "what's holding you back from trying out our products?" — versus "hey, was this information helpful, yes or no?", which doesn't tell us anything. And that usually reveals very quickly a trend in the data of what those visitors' hang-ups are.

For example, we had one client a while back with a scented product and a male audience that had to choose a scent. The glaring question was "I don't know which scent to choose" — but the scent guide was on the page. They just weren't finding it; they weren't seeing it. So we made it up front, right in their face, front and center:

"it smells like this." And it boosted mobile conversion rate by, I think, 32-some-odd percent, just because we were able to answer that question. So we took not only a qualitative approach but then a quantitative one, testing 10 different ways we could position that scent, and ultimately came up with a winner.

So there's a kind of give and take in that process, but it's really just consistent, persistent experimentation, and using that to help evolve the marketing message. I used to be a direct-response guy, so it was very much go, go, go, sell, sell, sell. But we've had to evolve, especially working with more e-commerce brands — particularly now that tracking and attribution are not at their greatest.

In a big omni-channel environment, particularly the social world we live in, the experience really, really matters to visitors. So we look not only to improve conversion rate and all of the other key metrics, but to create a better experience for those visitors. If we do that in a way that they respond to favorably, based on our data,

then everything becomes more effective. We've seen cost per acquisition go down. We've seen customer support complaints go down. And then how do you factor in the word-of-mouth aspect that you can't necessarily quantify? You quantify what you can, but there are also some things you can't necessarily put on a spreadsheet that true optimization has an impact on.

And we've noticed that — there was a study done, I can't remember who did it, but customers are likely to spend something like 18% more money on average when they deem the experience to be pleasurable. So it's about setting that expectation.

That's kind of how we approach optimization as a whole: let the visitor lead the way, push the buttons in the right order, and really just incrementally change things that lead up to a bigger overall improvement — versus "oh, I need to redesign a product page today," or "I need to redesign a homepage."

I would rather take the incremental approach that leads up to the bigger overall changes. I don't know if that answered your question — I just rambled.

How to Build Your Experimentation Framework

Kausambi: I'm sure it did! Yes.

Interesting. What I'm hearing is that there's a discipline and a diligence through your whole experimentation process — the ideas are one side of it, but there's also the diligence of really running it and knowing which metrics matter. So how would you advise folks who are new to e-commerce — who've just joined e-commerce teams and really want to help their teams grow — or even earlier brands? And I don't just mean brands below a hundred K; even at, say, 5 million in GMV, you've found product-market fit, but now you want to scale, right?

You want to do things in a very disciplined and structured way. So how do you advise folks to actually build their experimentation and discipline muscle? And how do you balance when to go quantitative and when to pull in qualitative — how do you understand when to do which?

Justin: A couple of aspects to that. I see this a lot: everybody — particularly those sub, say, 5 million in revenue, I would say probably sub three — tends to gravitate towards tricks, gimmicks, tactics, and tries to be very tactical. If you're familiar with Ezra Firestone and his Boom by Cindy Joseph — and I'm friends with Ezra — I can't count how many times people use Boom as an example. It's like, "oh, he does X, Y, and Z." I literally had it two days ago: somebody said, "well, this is on their site." I'm like, yeah, we'll test it, but it doesn't work — it only works for Boom. So it's pulling things in without knowing that.

That's what got you to, say, 3 million, but what's going to take you to the next stage is really listening to your visitors. So I think quantitative is supported by qualitative.

So you kind of have to constantly be leveraging some qualitative feedback, but to make it effective for you as a business, you need to prioritize it. So many times you get so busy putting out fires, so busy getting pulled into inventory issues or whatever — the ad campaign isn't performing well today — instead of being very optimization- or split-test-focused, meaning you don't make decisions until you've tested them, instead of falling into the assumption trap. Once you change that thinking and that mindset deep in the DNA of your company, and it becomes more optimization-focused,

then everything becomes easier. We saw it even during the pandemic — the companies that embraced it, leaned into it, and let the data guide them saw tremendous growth. It's those that fought it that ended up going out of business or losing a big chunk of their revenue,

because they were so reactive. But the ones that already understood their visitors — that had been testing, had been optimizing — all they did was turn it a little bit, and everything became more effective. So really embrace the actual process of optimization in and of itself.

If you're early — say you're just getting started and you haven't found your message-market match yet — split testing and quantitative improvement are going to be so painful. You're not going to like that at all. It's going to take absolutely forever to run a split test, and you're not going to get the feedback you want. So I recommend people take a qualitative-only approach at that point: use the heat maps and click maps, use the polling, use the surveys — the things that are going to help you figure out what questions you're not answering. You might be selling five, ten products a day; that's not quite at a level of scale yet. That qualitative work is going to help get you to 20, 30 sales a day, and then you can get the quantitative side in place, and that's going to help you scale to a hundred.

Kausambi: Hmm. And what would be a typical ballpark? I'm pretty sure it changes across segments and industries, but what would be a ballpark traffic-to-conversion ratio — or is it a ratio at all, versus absolute traffic — that you look at to say, hey, now I want to put a real experimentation engine onto this store?

Justin: I don't base it on visitors and sessions. I never base it on that, because visitors don't pay the bills. I base it on conversions — whether that be a lead or a sale. It could be micro-commitments and micro-conversions, like add-to-cart rate and such, but I typically look for the number of sales conversions per day, and then I'll look at the average order value. Because obviously, if you only have 20 sales a day but your average order value is 600 bucks,

it's a different conversation than if you have 20 sales a day and your average order value is 30 — you have to optimize those differently, and the scale is different. So yeah, about 30 sales a day is usually where I start looking at more of a quantitative approach.

Because, contrary to popular belief, once you get to 30 sales a day, you've kind of figured out your message-market match. You've probably got some pieces in place; you're starting to build a brand identity. And then 30 to a hundred is not hard — you've done the hard part. It's just a matter of amplifying, as long as your infrastructure can hold.

We did this with a cosmetics company in the UK — took a qualitative approach first. They were doing 20, 25 sales a day. They ballooned up to 150, and then they broke. It absolutely hockey-sticked — went to like 350 sales a day — and they actually broke their infrastructure and couldn't fulfill.

So they actually had to back it down to a hundred. Good problems to have!
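Justin's rule of thumb can be summed up as a simple triage. The thresholds below are just the ballpark figures he mentions (~30 sales/day, with AOV shifting the conversation), not a hard rule:

```python
# Rough triage based on the ballpark thresholds from the conversation.
# The numbers are illustrative, not a prescription.

def recommend_approach(sales_per_day: int, aov: float) -> str:
    """Suggest a CRO starting point from daily sales volume and AOV."""
    if sales_per_day < 30:
        base = "qualitative only: heat maps, click maps, polls, surveys"
    else:
        base = "quantitative split testing, supported by qualitative feedback"
    # AOV changes the conversation: 20 sales at $600 is not 20 sales at $30.
    daily_revenue = sales_per_day * aov
    return f"{base} (~${daily_revenue:,.0f}/day at stake)"

print(recommend_approach(10, 30.0))
print(recommend_approach(40, 600.0))
```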

Kausambi: Any quick examples of some of the things you did for that cosmetics brand you're talking about?

Justin: So, we emphasized that they were natural. We really leaned into their manufacturing process and why they were different. We leaned into the shipping expectations. And then we supported it with a lot of testimonials over reviews — that was the big thing — and we just set the expectations for those visitors.

What's the money-back guarantee? What's the specific outcome? They were very feature-heavy when we went in, and we switched to a more benefit-heavy focus. And that was based on — you know, visitors didn't know whether it was going to work. The biggest thing was setting the outcome and the expectation of: does this natural product actually work?

It was a combination of a bunch of things, but those are the main ones. And I think that usually stems back to keeping the customer perspective. In their specific case, I assume people are more interested in the story and the benefits — what do I get out of it — versus just the ingredient list.

Right? That's one of the biggest shortcomings I see: a lot of people are so busy screaming about how awesome their product is, instead of asking, "what's in it for me as a consumer?" It's persuasion and marketing psychology 101. I don't care that it has the buttons —

I just want to know what the buttons are going to do for me. Same principles whether you're a SaaS business or e-commerce — I don't think it changes. It doesn't matter what you're selling; it's all the same.

Kausambi: So, I want to quickly jump into audience questions — I have so many, but in respect of time. The first question: are there any good tools you suggest for tracking qualitative and quantitative metrics in CRO?

Justin: There's a bunch! Our favorite split-testing tool is Convert.com — that's our go-to, though we support them all. My least favorite is Google Optimize, though obviously the price is right — I just had that conversation this morning. For a younger company that just needs simple stuff, it gets the job done.

Our favorite for the longest time on the qualitative side was Hotjar. They made some changes, and I'm actually looking at an alternative right now, since they got rid of their manual heat maps. My team's not very happy about it, because we leveraged that a lot. So I'm looking at a few alternatives there.

But that's it. I don't reinvent the wheel; I don't complicate it with fancy tracking. We use GA and basic Shopify integrations, if it's Shopify — obviously if you're on BigCommerce or WooCommerce or Magento or something like that, it's different. We try not to add too many things and too many additional variables into the metrics.

So keep the tools simple: just a basic split-testing tool and a basic qualitative feedback tool that allows for polling — exit polls, surveys, those kinds of things — like Hotjar.

Black Friday Cyber Monday Sales

Kausambi: Yeah, I love that. Keep it simple — they're just tools in the end, right? The second question: store traffic can fluctuate dramatically.

We know that, for instance, during BFCM it skyrockets, and conversion rates shoot up because of that buying behavior. Do you think looking at it in the short term makes sense?

Justin: So, looking at it from a time-of-year structure —

like Black Friday, Cyber Monday — all rules are out the window. You go into strict promotion mode then, so you can literally do whatever the heck you want. There are a lot of misconceptions there, thinking, "oh, if I discount, it's going to be a detriment to my brand." But look at any major corporation — they're all discounting during Black Friday, Cyber Monday.

So you optimize more towards a promotional-calendar mode: how you showcase your sale prices, your percentage savings versus your coupon codes versus your sale banners. You take that approach during a promotional period.

We'll have clients that run Black Friday, Cyber Monday; we'll have them run Mother's Day sales, and others periodically throughout the year, so we can leverage those learnings across the other promotional periods. But in terms of fluctuations, I try to optimize very holistically, meaning not get so bucketed into right now. I look more at long-term scalability and growth, versus "hey, I changed the button color and it raised or lowered the conversion rate." I want to understand why, for long-term growth. So I look at month-over-month or quarter-over-quarter trends, as well as what we're seeing from the actual quantitative and qualitative feedback.

The amount of traffic and visitors is going to change. Obviously you're going to have completely irrational buying decisions during Black Friday, Cyber Monday, versus what you might have now as we're approaching summer. So it's completely different in terms of how we approach it.

And that's another misconception: conversion rates in and of themselves are a relative number. You could turn on a traffic campaign and your conversion rate can dip. Does that mean there's a problem? Probably not — it just means you have a different set of eyeballs.

The algorithms are doing their thing; it's going to normalize, it's going to come back. It's just a matter of how much it goes down compared to how much you increased traffic, and then looking at that overall fluctuation. We've scaled companies 300% while their conversion rates stayed relatively the same.

When they turned on more traffic, it would drop, then it would normalize and go up a little, then drop again. It's just a matter of what that specific business's inputs and outputs are. But don't get so bucketed into "I need to optimize a certain way during Black Friday, Cyber Monday." If you're getting a big enough sample size, letting a test run long enough, and watching the trends day by day,

you're going to be fine. It doesn't matter. We've got clients that run a promotion every three days — something different every three days — and we just optimize around that infrastructure. We know the conversion rate is going to drop — or Kim Kardashian mentions one of them and it spikes.

We had a client that had that happen — millions and millions of eyeballs on it. That's going to artificially skew things. We just had one that was featured with American Airlines; they had like 500,000 extra visitors hit their site, and it converted like garbage, so it just artificially skewed the results.

And if you're looking at that on a day-over-day basis — particularly in Shopify stats — you're going to drive yourself crazy. So I look at the bigger picture.

Kausambi: Yeah — look at the trends and use them to move ahead the next time. Thank you. Rishi has a question right here.

When Is The Right Time to Focus on CRO?

Question: When should we focus on CRO?

Kausambi: I assume he's asking: right from the beginning, when you set up your business, or later down the road? Thank you, Rishi. So — always stay focused on optimization?

Justin: Always. It's a matter of which approach you take. Early on, it's just figuring out what questions you're not answering for your visitors.

If you think about it in those simplest of terms, everything becomes more effective, and you don't need to think about it in the truest sense of optimization — running split tests, backing them with supporting data, all of that. You just take it and try to answer those questions.

Those are the biggest things in how we approach it: just listen to the visitors' data. Beyond that, it doesn't need to be more complex. I simplify it down to answering the questions.

Kausambi: Yeah. And it doesn't matter — I think a lot of times when people think of CRO, they're thinking split tests, A/B this and that, and they're thinking of fonts.

Right. And I love that you're abstracting it out to the simple point of understanding why — and it doesn't matter how.

Justin: Yeah. Conversion rate optimization is actually less about the conversion rate than it is about understanding those visitors. And optimization is not the same as split testing.

Split testing is just the vehicle we use to prove or disprove whether we're right or not. All right — two more questions. Three more, actually.

Kausambi: All right.  

Running Experiments and Statistical Significance

Question: About a brand's approach to attracting and converting shoppers — do you suggest we always run experiments to statistical significance?

Justin: Okay, I knew that question was coming. Statistical significance is a very loaded metric. You can run a test for a million visitors and not get statistical significance, or confidence, on it, and you'll drive yourself nuts trying to get there. Sometimes it just won't — you might see 30, 40, 50% and then it just stops.

That just means the test is not high-impact or high-leverage enough to make that bigger overall impact, which means you probably need to cut your losses, learn what you can from it, and move on. So we look at tests from several different angles. Obviously, statistical confidence is a good one

whenever you can reach it, but we also look at the trend in the data, the sample size, and the secondary metrics as they pertain to the primaries — looking at it from a bigger-picture perspective. Because sometimes you can call a test without necessarily reaching statistical confidence.

It also depends on which tool you're using — some are very conservative, some more liberal, in how their algorithm calculates statistical confidence. So we'll pull the data out, extrapolate it, and actually plug it into a secondary calculation to see how close we actually are overall. There are multiple pieces to it beyond just running to confidence.

But I try to run to 95%, or at least 90%, whenever possible.
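For readers who want to see what "running to 95% confidence" means mechanically, here's a minimal two-proportion z-test — a standard textbook calculation, not the specific algorithm of any tool Justin mentions, and the visitor/conversion numbers are made up:

```python
import math

def ab_confidence(visitors_a: int, conversions_a: int,
                  visitors_b: int, conversions_b: int):
    """Two-sided two-proportion z-test; returns (relative lift, confidence)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via the error function; confidence = 1 - p-value
    confidence = 2 * (0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))) - 1
    lift = (p_b - p_a) / p_a
    return lift, confidence

lift, conf = ab_confidence(10_000, 300, 10_000, 360)
print(f"lift: {lift:.1%}, confidence: {conf:.1%}")
```

This also illustrates Justin's point about stalled tests: with small lifts or low traffic, `confidence` can hover below the 90–95% threshold indefinitely, no matter how long you wait.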

Kausambi: Alright. I hope that answers your question.

Data Stack for Experimentation

Question: What's the data stack you suggest a brand start with for experimentation?

Justin: Get your analytics in order first and foremost. Get enhanced e-commerce in play, so you're tracking shopping behavior, product behavior, product performance, and those kinds of things.

Shopping behavior is going to tell you the majority of what you need to know. Beyond that, if you're just starting, just a basic heatmap solution. I wouldn't over-complicate it — the next plugin isn't going to be the answer. There are so many Shopify apps out there, and people get button-click happy: "oh, so-and-so said this was a great app, let's plug it in." But chances are it's probably hurting you more than it's helping you, just because it slows down your site, or it does ten things and you only need one. So don't fall for the trap, particularly when you're first starting out, of needing all of the apps, or a special theme, or anything like that.

Literally, I tell people: start with the Debut theme and customize it based on the information your visitors need.

Kausambi: Yeah, exactly — either it's an app that does ten things and you need only one, or the opposite, which is a bigger problem: you have two different apps doing very similar things,

and then you're confused about the data. I've seen that so much — totally hear you. The final question — I think we already covered it, and we're also beyond time: what should an early e-commerce brand focus on, a qualitative or a quantitative approach? And I think we got the answer:

it's all about the answer to the question why — and even if it's qualitative, we're good.

Justin: Just start somewhere. I don't care what approach you take — just listen to your visitors. Don't make marketing decisions based on emotion, or because you heard something somewhere —

meaning you saw it on such-and-such site and they said it converted really well. It's probably not going to work for you. So be mindful of that. Your visitors are the holy grail — they're going to tell you exactly where you're screwing up, if you ask them the right way.

Kausambi: And with that.

thank you so much, Justin; that's a wrap on this week. Thanks, everyone who was here. We actually kept you quite a bit beyond the time limit we promised, but that just shows how excited everybody was to ask questions. Thank you all.

And yeah, that's it!