Text transcript

Fireside chat with Manny Medina: AI Agent-Fueled Monetization

Held February 11–13
Disclaimer: This transcript was created using AI
  • 338
    00:58:59.680 –> 00:59:06.709
    Julia Nimchinski: Jason is a GTM leader hailing from Amazon, Google, and Oracle, and all the greatest.

    339
    00:59:07.730 –> 00:59:14.469
    Julia Nimchinski: Manny, super excited to have you on the show. How have you been? What's new? What's up with AI monetization?

    340
    00:59:17.710 –> 00:59:19.509
    Manny Medina: Jason, do you wanna go first?

    341
    00:59:20.604 –> 00:59:22.939
    Jason Napieralski: No, no, you please.

  • 342
    00:59:24.260 –> 00:59:50.439
    Manny Medina: What's new? I mean, what is not new? Every day there is something new; it's actually kind of hard to keep up. I just noticed there was an article in the Wall Street Journal. I'm actually kind of late, so this is going to be rehashing something that happened on February 1st as opposed to 10 days ago: that the vendors that are selling agents or copilots have mispriced them,

    343
    00:59:50.640 –> 00:59:55.690
    Manny Medina: and they have failed to align their offerings to

    344
    00:59:55.880 –> 01:00:09.049
    Manny Medina: to delivering customer value, or aligning what they charge to what the customer is getting for it. So they're getting a lot of bites, a lot of POC pilots, and very little actual revenue traction. So even though, okay, so

    345
    01:00:09.410 –> 01:00:13.830
    Manny Medina: clearly, Microsoft is killing it because the numbers are going up and to the right, and

    346
    01:00:13.970 –> 01:00:16.589
    Manny Medina: their copilot sales are up.

  • 347
    01:00:17.130 –> 01:00:41.900
    Manny Medina: But you know, I saw that back in my day when I was working at General Motors, where there was this thing called stuffing the channel: when you put a lot of revenue into the channel pre-sale, based on bundling and on client access licenses that have everything stuffed into them. And then they just don't get used, and what happens is now you have a big churn event at the end of the year,

    348
    01:00:41.900 –> 01:00:52.679
    Manny Medina: so I feel like we're still sort of feeling out the elephant of this: what are agents, what do they do for you, how do you price them, how do you bring them to market, how do you add the value.

    349
    01:00:52.680 –> 01:01:09.519
    Manny Medina: Unless you take some risks, and unless you actually go and talk to your customers and ask them, how would you like this thing to be priced, how is it delivering value for you, and then custom-write that contract, it's going to be a slog. Trying to fit this into your old

  • 350
    01:01:09.700 –> 01:01:37.300
    Manny Medina: Cpq. And trying to make sure that it fits in one of your packages and like, make sure that your skews don’t break is nonsense like. At this point we should be experimenting with pricing, experimenting with messaging, and just going and getting closer to customers as opposed to like trying to scale. Anyway, I’m getting this all spun up in my rant. So I’m going to stop right here and let you go in with more sensical stuff.

    351
    01:01:38.010 –> 01:01:53.950
    Jason Napieralski: So, Manny, it's really nice to meet you, by the way; we haven't met before. And your platform, your software, is very impressive. I took a look at it, and I think I might want to use it myself. So you've done an amazing job. I think you've hit the nail on the head on kind of automating,

    352
    01:01:54.070 –> 01:02:05.659
    Jason Napieralski: helping automate the busy work away from salespeople, which I think is a really noble idea, and what we should all achieve, because Salesforce has destroyed our lives for way too long.

    353
    01:02:05.660 –> 01:02:06.190
    Manny Medina: That’s right.

  • 354
    01:02:06.190 –> 01:02:11.259
    Jason Napieralski: There's nothing I hate more than Salesforce and everything that it touches. It's like the Death Star to me.

    355
    01:02:11.992 –> 01:02:37.739
    Jason Napieralski: So, first of all, kudos to you. And just on my background: I've been around big tech for a long time; I've done a lot of tours in the different big tech companies. I've worked for Oracle, which does exactly what you described from a pricing perspective as their main go-to-market motion, which is basically dump all the software on the customers and then sue them for using it after the contract.

    356
    01:02:37.810 –> 01:02:46.010
    Jason Napieralski: And that’s how they make their model. But I think what you’re saying is really important, because I do believe that

    357
    01:02:46.290 –> 01:02:51.654
    Jason Napieralski: we in tech don't understand, especially in enterprise tech, which is kind of my expertise, that

  • 358
    01:02:52.180 –> 01:02:56.329
    Jason Napieralski: we're kind of screwed right now as far as the enterprise tech business goes.

    359
    01:02:58.170 –> 01:03:06.369
    Jason Napieralski: We haven't provided incremental value to the customer since 1995, since the Internet came out. The productivity is not there.

    360
    01:03:06.370 –> 01:03:07.279
    Manny Medina: 100%.

    361
    01:03:07.280 –> 01:03:08.340
    Jason Napieralski: Down, not up.

    362
    01:03:08.340 –> 01:03:09.050
    Manny Medina: Oh, yeah.

    363
    01:03:09.510 –> 01:03:35.899
    Jason Napieralski: And I believe that the pricing models are artificial. Basically, there's been a large club at the top of the enterprise market that has sold all their garbage to each other at high margins and high prices. The customers and the owners of the tech companies all go to the same clubs, all live in the same places, all live in Atherton on the same street, and

  • 364
    01:03:35.900 –> 01:03:37.290
    Manny Medina: They just.

    365
    01:03:37.510 –> 01:03:47.929
    Jason Napieralski: they just pat each other on the back and give each other giant contracts, and then deliver that to the users, who have no idea what the heck to do with this stuff.

    366
    01:03:48.620 –> 01:03:54.950
    Jason Napieralski: They have these old, outdated processes. It doesn't matter; they just drop it right down on them and say, good luck.

    367
    01:03:55.200 –> 01:04:22.639
    Jason Napieralski: And the pricing model is nonsensical, because it doesn't matter. It's a payola from one Atherton guy on the corner to the guy on the other side. So it's all been artificial, and it's been horrible for a really long time. AI is showing the complete lack of scalability of that process, that way of doing things, because AI is allowing you to look outside of the enterprise software suite, and matter of fact, requiring it,

    368
    01:04:22.670 –> 01:04:34.050
    Jason Napieralski: and them just tacking it on and adding a subscription price or an extra usage price is not something that, to your point earlier, people are going to see the value of.

    369
    01:04:34.410 –> 01:04:51.400
    Jason Napieralski: And the entire thing breaks because of my favorite equation. This is from Michael Hammer, kind of the founder of business process reengineering. My favorite quote is NT + OP = EOP: new technology plus old process equals expensive old process.

  • 370
    01:04:51.740 –> 01:05:17.419
    Jason Napieralski: And what I see happening in the market, with the numbers you're talking about: yeah, they're killing it; the AI stuff is killing it. But it's people with FOMO desperately trying to get as many tools into the system as possible. They're throwing spaghetti at the wall and hoping it sticks, and they're doing that NT + OP = EOP at scale. And AI just makes them screw it up faster

    371
    01:05:17.530 –> 01:05:44.940
    Jason Napieralski: and with more spectacular results. So I think you're right; just from the opening there, I think you're on to something, but I think it's a bigger case. And I made this comment on LinkedIn; I think I had thousands of impressions and 2 reactions to an article of mine basically saying that 70 to 80% of all enterprise software employees could go.

    372
    01:05:45.000 –> 01:06:04.670
    Jason Napieralski: And I think a lot of people are afraid to like that. But I think Elon has shown us the truth of this, and I do believe that with the AI revolution, they're not looking at it like, hey, I actually have an AI agent that can replace this guy, and I can fire this guy with this agent.

    373
    01:06:04.690 –> 01:06:17.649
    Jason Napieralski: What they're looking at is, hey, we are so inefficient, we make 80% margins on sales, we can cut as many people as we want, and we can just blame it on AI. That's what's really going on, rather than people being replaced by AI.

    374
    01:06:18.300 –> 01:06:23.949
    Manny Medina: 100%. I think when you double-click on what you just said, Jason, it's pretty powerful,

    375
    01:06:24.400 –> 01:06:26.610
    Manny Medina: because just take an FTE,

    376
    01:06:26.720 –> 01:06:40.500
    Manny Medina: and let's make the math simple: an FTE is $100,000, again just to keep the math simple. But for every $100,000 of FTE, you have another, I don't know, $100 per month of CRM,

    377
    01:06:40.720 –> 01:06:50.709
    Manny Medina: $50 per month of ERP, another $100 per month of payroll; and then, you know, they decide to buy Notion or whatever the hell tool is their pet thing,

  • 378
    01:06:50.710 –> 01:07:12.609
    Manny Medina: and then for every 6 FTEs, oh, you need a manager. So you've got a sixth of a manager; the manager is $200,000, and one sixth of $200,000 is like another, you know, $50,000 or so. And then, for every 60 of these guys, there is an HR business leader, and so on. And

    379
    01:07:12.610 –> 01:07:32.620
    Manny Medina: so what it turns out is: for $100,000 of a human being, you end up with almost $75,000 of additional cost between software and other people and stipends and whatever the hell, and it actually adds up to a lot more. So when you say, look, I'm going to take 30% of that role and just automate it, just make it AI,

    380
    01:07:32.830 –> 01:07:42.299
    Manny Medina: it's not just 30% of a role. It's that entire iceberg of nonsense software and process and blah blah blah that you bought that goes poof!

    381
    01:07:42.840 –> 01:07:46.020
    Manny Medina: Free cash flow, and who doesn’t like that?
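
    Manny's back-of-the-envelope overhead math can be sketched as a quick script. This is only a rough illustration using the figures mentioned in the conversation; the HR leader's $200,000 salary is an assumption, and the named items alone fall short of Manny's rounded-up $75,000 total, which includes the rest of the iceberg.

```python
# Rough sketch of the loaded-cost math above. Every figure is the
# illustrative number from the conversation, or a labeled assumption,
# not real benchmark data.

def overhead_per_fte(salary: float = 100_000) -> float:
    """Annual overhead riding on top of one $100k FTE."""
    software = (100 + 50 + 100) * 12   # CRM + ERP + payroll tooling, per month -> per year
    manager = 200_000 / 6              # 1/6 of a $200k manager shared by 6 FTEs
    hr = 200_000 / 60                  # 1/60 of an HR business leader (assumed $200k salary)
    return software + manager + hr

# The named items come to roughly $40k; stipends, other tools, and the
# rest of the iceberg are what push the speaker's estimate toward $75k.
print(f"${overhead_per_fte():,.0f}")
```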

    382
    01:07:46.550 –> 01:08:16.099
    Manny Medina: It's free money, and it's just all around you, you see what I mean? So what I'm seeing: I just spent months and months talking to AI agent companies, going around and finding buckets of people, because it's so easy to find a group of individuals and say, look, I'm not going to fire anybody, but I'm going to retain you and not expand, and free up all this cash, you know what I mean, from not having to do this nonsense work, and from not having managers and other things that actually drive up costs. So you're 100% right. And this train has left the station.

    383
    01:08:16.310 –> 01:08:31.769
    Jason Napieralski: Yeah, exactly. And there's a guy that I know who was really early on AI, who actually understands the large language models, how they're built, and how to program them. So he's a real AI guy, unlike the rest of us who say we know AI but basically just type prompts into chat.

    384
    01:08:32.550 –> 01:08:32.930
    Jason Napieralski: We.

    385
    01:08:32.930 –> 01:08:33.810
    Manny Medina: We're Monday morning...

    386
    01:08:33.819 –> 01:08:35.229
    Jason Napieralski: We don’t type.

    387
    01:08:37.000 –> 01:08:56.180
    Jason Napieralski: Anyway, this guy, his name is Jason Wells, really good guy; follow him on LinkedIn, he's got a lot of good things. He focuses on AI, but he focuses on the small problem to fix: automating the problem that is known, right? The one you're already doing RPA on, where you've got a repeatable process. That is where you focus your AI attention,

    388
    01:08:56.180 –> 01:09:19.699
    Jason Napieralski: not at this higher level kind of pseudo agi, which is artificial general intelligence, which is what we’re trying to. We’re talking about with this. These these artificial agents, right? And you know, he said, that and this really stuck with me closely, says anything you could outsource to a lower cost region. You know India or whatever Philippines you can. You can automate with AI,

    389
    01:09:20.200 –> 01:09:26.120
    Jason Napieralski: and that alone is a tremendous opportunity to cut massive amounts of staff

  • 390
    01:09:26.240 –> 01:09:55.220
    Jason Napieralski: and to be really simple about it: if you can explain something very simply with the steps, and you have your process around it, you could use AI to automate that. And I think that is the right use case; that's the right way to go. What I think is the wrong way to go, and a very wrong way to go, is to try to automate your frontline people who touch your customers with AI. That should be the very, very last thing you do. Once you've automated everything else and your CEO is an AGI, then you should start automating that,

    391
    01:09:55.410 –> 01:10:19.520
    Jason Napieralski: because what's happening as we try to do this is that it really sucks. It's stupid; these AI agents, 3 questions in, are completely befuddled, and you can tell it's AI right away. And I believe, and I've said this before on these things, that as we interact more with AI, we are all building antibodies

    392
    01:10:19.600 –> 01:10:35.829
    Jason Napieralski: towards it. We are starting to recognize what AI article looks like versus another one, but it’s deep. You can’t really point it out, maybe necessarily. But as we continue to get infected with AI, we’re going to start to build antibodies to defend against it.

    393
    01:10:36.070 –> 01:10:37.030
    Jason Napieralski: So

    394
    01:10:37.080 –> 01:10:59.030
    Jason Napieralski: when you’re when you’re talking about an agent, what you’re going to end up with, if you if you because the other on the other side, on the procurement side, they’re also using AI to make selections. So what what you’ll end up with in a very short order is your your genius, AI agent, talking to the procurement. AI agent and the 2 bots will talk together, and never a customer is touched.

    395
    01:10:59.030 –> 01:11:08.719
    Jason Napieralski: and maybe a purchase will be made, maybe not. But there'll be a lot of AI credits being sold as these idiot bots talk to each other about

    396
    01:11:08.720 –> 01:11:09.810
    Jason Napieralski: how to do things.

    397
    01:11:09.810 –> 01:11:14.039
    Manny Medina: So, Jason, where are you on the whole AI SDR craze?

    398
    01:11:14.040 –> 01:11:14.679
    Jason Napieralski: Exactly what I’m saying.

    399
    01:11:14.680 –> 01:11:16.500
    Manny Medina: Buying or selling? Or why? Like, are you...

    400
    01:11:16.500 –> 01:11:18.500
    Jason Napieralski: I think that

    401
    01:11:18.730 –> 01:11:26.409
    Jason Napieralski: if you remove, for one thing, Amazon taught me one thing that I’ve kept from them ever since, and that is customer obsession.

    402
    01:11:26.810 –> 01:11:38.790
    Jason Napieralski: and that if you focus on the customer and you work backwards from the customer, success will come, and I think they’ve proven that in droves and companies that have thought in that direction. Success has come, and

    403
    01:11:39.010 –> 01:11:50.620
    Jason Napieralski: where I think everybody’s being lost here is that when they’re deciding on how to use it implement AI like an AI Sdr is, they’re not working backwards from the customer. They’re working backwards from the Cfo.

    404
    01:11:50.810 –> 01:12:02.850
    Jason Napieralski: and that is the absolute worst place to work backwards from in order to get your customers to be delighted with your product. And I also believe that there’s a lot of interaction that takes place with your customer.

    405
    01:12:03.260 –> 01:12:16.979
    Jason Napieralski: one on one individual interaction, that you that is nonverbal, that is, that is subtle, that is, you know, just building a relationship that an AI. Sdr. Is just never going to do. It’s just never going to happen.

    406
    01:12:17.190 –> 01:12:36.270
    Jason Napieralski: And, by the way, one of the things is like, we all think AI is here. It’s not even AI. It’s not artificial intelligence, it’s machine learning and deep learning, which is, by the way, a dead end. It will never get to Agi in our current way. It’s not scalable. We can’t do it.

    407
    01:12:36.460 –> 01:12:40.849
    Jason Napieralski: and it has the awareness of a house cat less than awareness of a House cat.

    408
    01:12:41.160 –> 01:12:57.480
    Jason Napieralski: So if maybe if you’re selling a scam product like if you’re selling something you’re selling that you know, tech support for the Microsoft virus that you implanted on somebody, or you’re selling. You’re trying to get grandma to give you her retirement money. AI agents are great.

    409
    01:12:57.860 –> 01:13:13.310
    Jason Napieralski: but if you’re trying to make an enterprise sale. You lose tremendous amount of knowledge and ability for that and relationship that you could build with a human. And I think it’s I think it’s a long way away from being something that would be useful.

  • 410
    01:13:13.690 –> 01:13:15.229
    Jason Napieralski: I think the opposite will happen.

    411
    01:13:16.170 –> 01:13:18.552
    Manny Medina: Yeah, I think that you’re seeing

    412
    01:13:19.720 –> 01:13:26.890
    Manny Medina: So in my, in my experience, Jason, I think I think you’re seeing a lot of a lot of blowback from misuse. 100%. You are

    413
    01:13:28.300 –> 01:13:45.270
    Manny Medina: And yet, but you know what’s interesting. Okay? So I’ve been at this game for for a long time. So maybe I haven’t worked in all the greatest like you have like, it seems like you Oracle and Google and Amazon. So you got all the merit patches. I only I only work at Outreach for the last 15 years.

    414
    01:13:46.440 –> 01:13:48.879
    Jason Napieralski: I had to bleed for each one of those merit badges.

    415
    01:13:50.210 –> 01:14:07.580
    Manny Medina: But I tell you, because, you know, the SDR role was a small role when I started, right? And you have seen the back-and-forth of, you know, does the AE own their pipeline or do they not own their pipeline, who's responsible for PG,

    416
    01:14:07.960 –> 01:14:11.525
    Manny Medina: and and what I tell you is the following is that is that

    417
    01:14:12.910 –> 01:14:14.399
    Manny Medina: It’s a hard job.

    418
    01:14:15.040 –> 01:14:26.059
    Manny Medina: and it’s it’s hard to, you know the job is to call is to make, you know, 20 calls and get told no 15 times, you know, before you get a yes or or a maybe.

    419
    01:14:26.410 –> 01:14:27.709
    Manny Medina: and and

    420
    01:14:27.940 –> 01:14:34.789
    Manny Medina: but and I don’t think AI shortcuts that you see what I mean, because the reason somebody tells you no, it’s not because they don’t like you.

    421
    01:14:34.980 –> 01:14:45.039
    Manny Medina: it’s because it’s at the wrong time. It’s the wrong product. You’re the wrong person, you know. This is not my top of mind priority, you know. Bad product market, fit a number of things and.

    422
    01:14:45.040 –> 01:14:47.590
    Jason Napieralski: It's no for now, right? It's not no forever.

    423
    01:14:47.590 –> 01:15:13.599
    Manny Medina: 100% and like what? What? So the part I don’t like is that it’s taking these reps that these repetitions that every sales rep needs to do away from the rep, so it’s depriving them from the benefit of being told. No, you don’t become a sales. Rep a great sales, rep. If you don’t go through this, you see what I mean. So the AI Sdr, you know, concept.

    424
    01:15:13.670 –> 01:15:19.269
    Manny Medina: it’s sort of robbing a little bit. People from their childhood. You see what I mean from like the true experience.

    425
    01:15:19.290 –> 01:15:19.970
    Jason Napieralski: Yes.

    426
    01:15:19.970 –> 01:15:44.450
    Manny Medina: of developing their calluses, developing that mental toughness of how you turn a no around. When somebody tells you, I don't have time right now, you tell them, of course you don't, because you're busy; that's why I called, so that we can talk tomorrow about this thing that is really important to you, because I found out from talking to your peer that this particular pitch will resonate with you. You don't get that otherwise, you see what I mean?

    427
    01:15:44.450 –> 01:15:49.560
    Manny Medina: But now the AI Sdrs, I think, could do have a place in.

    428
    01:15:49.690 –> 01:15:51.150
    Manny Medina: you know, create kind of relevant.

    429
    01:15:51.150 –> 01:15:55.509
    Jason Napieralski: Pause right there. Can I pause you one second? I just want to add a couple of things before I forget, because then...

    430
    01:15:55.510 –> 01:15:56.249
    Manny Medina: Yeah, yeah, yeah, no.

  • 431
    01:15:56.250 –> 01:16:19.720
    Jason Napieralski: they'll fall right out of my mind. So I think you're absolutely on the no. One of the main things that I do whenever I come in as a consultant, or I'm running a new sales organization, is I train the reps to get me no's, and that is the metric that I track: go get me 5 no's today. I don't tell them to get me 5 sales today; I tell them to go get me 5 no's today.

    432
    01:16:19.780 –> 01:16:31.170
    Jason Napieralski: And what you find, and it's amazing, because I've done this side by side with salespeople millions of times at this point, is that it is really hard to get somebody to tell you flat-out no. You really have to push it.

    433
    01:16:31.170 –> 01:16:34.490
    Manny Medina: America is such a nice society, isn't it? Like, everybody...

    434
    01:16:34.490 –> 01:16:46.950
    Jason Napieralski: And everybody has this fallacy that they have something that they don't want to lose. So they're not fearless in asking questions; they have all this fear, all this, I'm going to lose this account,

    435
    01:16:47.080 –> 01:17:04.419
    Jason Napieralski: and that’s a simple factor of not having enough accounts that you’re working on right 1st of all, but getting the no is absolutely difficult, and the other thing I always train everybody on, which is the simplest thing in the world, and everybody should use it is that is the 5 whys.

    436
    01:17:04.590 –> 01:17:28.629
    Jason Napieralski: And that’s just simply asking why? 5 times. And you don’t just ask why? Why? Why? Why, why, although you could. And it actually works just by being a child. But you could be more, you know, subtle about it, and ask, why did you say this? Why did you do that? And that is something that is so far away from AI that I can’t even imagine that it would be something that could be, you know, because you’re trying to get to the truth. And you can watch the people’s face.

    437
    01:17:29.083 –> 01:17:42.219
    Jason Napieralski: You know, kind of change from the the protection right of like. I have my answers to the 1st 3. Wise. I’ve said it to the last 7 reps I’ve talked to I will say it again, and I will be do it. And then when you get to the 4th one, they’re like.

    438
    01:17:42.490 –> 01:18:03.410
    Jason Napieralski: and they start to feel. And now you’re talking about something, and then that is, when a trained rep, who’s had the reps? Who had the things will go into. What is the emotional pain that you’re doing? Because people buy on pain, they don’t buy on pleasure. And if you don’t uncover the emotional pain. All you’re doing is you’re you’re a line on a spreadsheet. You’re not going it. You’re not trying to solve their problems.

    439
    01:18:03.410 –> 01:18:05.730
    Manny Medina: Yeah, a hundred percent. And...

    440
    01:18:05.730 –> 01:18:08.909
    Jason Napieralski: Our automation will just run over all of them.

    441
    01:18:08.910 –> 01:18:22.609
    Manny Medina: 100%. And so my take on this, and again, I have a little bit of skin in the game because I'm a big shareholder in Outreach and I invented a lot of, you know, what is happening out there,

    442
    01:18:22.780 –> 01:18:45.989
    Manny Medina: is that it’s not a replacement. And we didn’t mean it to be a replacement like it was never meant to be a replacement. It’s like, you know, we invaded. We invented email sequences, and people immediately loaded up, like all their crap shit, into their sequencing, and then just let that thing rip and damage their main reputation, and they blame outreach for it, I’m like, Come on, guys like that didn’t work before.

    443
    01:18:46.090 –> 01:18:49.290
    Manny Medina: What the hell made you think that that’s gonna work? Now, you know.

    444
    01:18:49.290 –> 01:18:50.719
    Jason Napieralski: That’s OP, right?

    445
    01:18:50.720 –> 01:18:55.389
    Manny Medina: Mass messaging didn't work before; that shit won't work now, either.

    446
    01:18:55.530 –> 01:19:06.519
    Manny Medina: You see what I mean so like, and and it took us a while to get the right, you know, framework in place for everybody to do it. But AI is going through exactly the same thing. Yeah.

    447
    01:19:06.880 –> 01:19:07.520
    Manny Medina: is.

    448
    01:19:07.520 –> 01:19:13.640
    Jason Napieralski: The analogy I use for that, that I like, is, say, you know, of course everything is political now,

    449
    01:19:14.420 –> 01:19:15.989
    Jason Napieralski: right, like every every you go.

    450
    01:19:15.990 –> 01:19:17.760
    Manny Medina: Let her rip. Let her rip. Here.

    451
    01:19:17.760 –> 01:19:24.360
    Jason Napieralski: Making a political decision at this point, like, no matter what you do like, which sock do I put on first? st am IA right one or a left one. I don’t know.

    452
    01:19:24.360 –> 01:19:29.639
    Jason Napieralski: I’ll text her just everywhere, everywhere. Yes.

    453
    01:19:29.640 –> 01:19:45.539
    Jason Napieralski: So you know the way the analogy I look for the AI, for people like you’re just talking about, who take their broken processes and bad messages that haven’t been tested with real people, and they just blast it out at thousands of degrees. It’s like somebody having an political argument with you, and you don’t agree.

    454
    01:19:45.540 –> 01:20:10.760
    Jason Napieralski: And instead of like changing tact, all they do is start yelling the argument at you, which is what happens, and that is the equivalent of what’s happening with the AI messaging, which, already wrong, already bad, already poor, and instead of just like having it trickle out with people now they’re screaming it out. And if if you’ve ever been in a political argument, and I’m sure you have, because we live in now, and somebody starts yelling at you.

    455
    01:20:11.100 –> 01:20:22.609
    Jason Napieralski: That is not the path to getting you to convince to their side right. And then the same thing works with this. It’s like you cannot brute. Force your prospects into getting, you know, getting this like stuff going.

    456
    01:20:22.610 –> 01:20:28.216
    Manny Medina: 100%, a hundred percent. So let me ask a question that my man Dustin Brown asked here in the chat:

    457
    01:20:28.690 –> 01:20:32.190
    Manny Medina: what? How do you like? What happens when you’re you’re selling

    458
    01:20:32.420 –> 01:20:41.089
    Manny Medina: AI agents like this for a minute, like you are. You are in the business of selling back end, you know, back office AI agents. And you’re selling these things.

    459
    01:20:41.350 –> 01:20:47.359
    Manny Medina: What happens to sales? Comp, what happens to seed pricing what happens to the whole, to the whole construct of Saas.

    460
    01:20:47.650 –> 01:20:52.549
    Manny Medina: And and so this is my take, and I want to get your your reaction. Jason, is that

    461
    01:20:52.700 –> 01:21:04.979
    Manny Medina: I think in 5 years. The Sas model is just going to go away, and the reason it’s going to go away is that every Saas company will have an AI agent equivalent that is either done by themselves

    462
    01:21:05.120 –> 01:21:24.509
    Manny Medina: or done by somebody else, so either they stay in the game by disrupting the sales, coming up on an agent moving from off seats and on a per agent per success basis, or somebody else is going to come. Do it for them. And then what does the rep. You know. How does the rep compensation change in the same way that you sold seats? And now you’re going to sell agents.

    463
    01:21:24.920 –> 01:21:25.620
    Jason Napieralski: Right.

    464
    01:21:25.620 –> 01:21:26.440
    Manny Medina: No different.

    465
    01:21:26.900 –> 01:21:27.500
    Manny Medina: But we’re.

    466
    01:21:27.500 –> 01:21:44.769
    Jason Napieralski: So I I think I honestly believe that the Sas pricing model right now is completely and utterly broken, and needs to be thrown away completely that there is. There is 0 tie to customer success in 99% of the Sas software that’s sold out there today.

    467
    01:21:44.770 –> 01:21:45.530
    Manny Medina: 100%.

    468
    01:21:45.530 –> 01:22:05.490
    Jason Napieralski: And I truly believe that if you so, what does a company want to do? This is kind of the first.st So you take a step back. A company wants to make revenue by selling their products right? How do they sell their products? They sell their products with happy customers that solve their problems, the more that they provide value to their customer.

    469
    01:22:05.720 –> 01:22:10.909
    Jason Napieralski: the more that they can charge for their product. And the more people want their product right. It’s just simple, simple stuff, right?

    470
    01:22:11.500 –> 01:22:14.550
    Jason Napieralski: Unbelievably, that is completely forgotten by all.

    471
    01:22:16.120 –> 01:22:39.220
    Jason Napieralski: Not give a crap about any of that stuff like I said that sale is being made at the Yellowstone Club in Montana. It’s with with one CEO who knows the other, CEO. And it’s not about solving problems. It’s not about that, and so none of it. There’s no metrics set for it. There’s no there’s no skin in the game on either side, and if you have a broken old process. Salesforce says, Hell, yeah, that’s 3 years of consulting, too.

    472
    01:22:39.220 –> 01:22:40.300
    Manny Medina: Yeah, yeah, they’re not.

    473
    01:22:40.300 –> 01:22:43.939
    Manny Medina: Everybody wins. If they win, the only loser is you.

    474
    01:22:43.940 –> 01:22:50.459
    Jason Napieralski: Yes, exactly right. Who really loses is the person, the sucker, that has to use Salesforce every day.

    475
    01:22:50.460 –> 01:22:51.080
    Manny Medina: 100%.

    476
    01:22:51.080 –> 01:23:10.750
    Jason Napieralski: Have to go in there and enter the stuff, or whatever garbage software is sold. So I truly believe that incentives drive outcomes, right, and misaligned incentives drive misaligned outcomes. We look at our government and DOGE; we obviously are seeing a huge amount of misaligned incentives, right.

    477
    01:23:10.750 –> 01:23:11.430
    Manny Medina: Hmm.

    478
    01:23:11.430 –> 01:23:21.799
    Jason Napieralski: If we could align our software businesses with the outcomes of the customer. One is, we have to understand the customer enough to understand what their outcomes are.

    479
    01:23:21.980 –> 01:23:24.880
    Jason Napieralski: That alone is a huge sea change.

    480
    01:23:25.410 –> 01:23:26.660
    Jason Napieralski: We aren’t doing it.

    481
    01:23:26.660 –> 01:23:32.250
    Manny Medina: What have we been doing for the past 20 years, brother? Like, come on, isn't that the job,

    482
    01:23:32.250 –> 01:23:32.565
    Manny Medina: right?

    483
    01:23:32.880 –> 01:23:41.299
    Manny Medina: We got one thing going on, and that is to get into our customers' business and understand what's going on for them, and try to align our pricing and our product and our software to them. Now.

    484
    01:23:41.510 –> 01:24:07.300
    Jason Napieralski: Yes, and we haven't, and the opposite is true. And now I think it's getting even worse, and I think there will be a giant pushback. Because, by the way, I am launching my new consulting company. It's called the Alpha Initiative, and the idea of the consulting company is to bring people back to basics, to bring meritocracy back to companies, to bring innovation back, and to forget about all this noisy garbage that has cost companies, you know, tremendous amounts.

    485
    01:24:07.400 –> 01:24:15.629
    Jason Napieralski: So as I'm doing that, though, the point of saying this is a subtle promotion too. But the point of it is this:

    486
    01:24:17.690 –> 01:24:40.630
    Jason Napieralski: I'm looking at all the software for my company to start off with, and everything has this hidden AI charge in it. Everything. You subscribe for 40 bucks a month. Okay, that's reasonable, I can do that, I'm just starting up. And then you start to use it. Oh, use our new AI feature. All right, I type in one prompt. You've used one credit. You have 17 credits left.

    487
    01:24:40.640 –> 01:25:08.389
    Jason Napieralski: Buy the pro plus package for 3,000 more credits, and then you will get this. And I think people are already up to their eyeballs, and businesses even more so, in subscriptions to begin with, without this extra charge. I mean, the FOMO is going to wear off soon; they're going to realize, hey, wait a minute, this isn't really helping me. I don't need this extra charge, and the whole model of paying that extra 40 bucks plus this charge doesn't make any sense. It's not tied to any outcome.
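
The hidden-AI-charge pattern Jason describes can be sketched as a step function. Only the $40 base, the "17 credits left" after one prompt, and the 3,000-credit pack size come from the conversation; the pack price below is a hypothetical assumption for illustration:

```python
import math

# Subscription-plus-credits pricing, as described in the conversation.
# PACK_PRICE is a hypothetical assumption; the rest echo Jason's numbers.
BASE_FEE = 40.0        # monthly subscription
INCLUDED_CREDITS = 18  # one prompt used leaves "17 credits left"
PACK_CREDITS = 3000    # size of the "pro plus" pack
PACK_PRICE = 30.0      # hypothetical pack price

def monthly_cost(prompts_used: int) -> float:
    """Total monthly cost once usage exceeds the included credits."""
    if prompts_used <= INCLUDED_CREDITS:
        return BASE_FEE
    extra = prompts_used - INCLUDED_CREDITS
    packs = math.ceil(extra / PACK_CREDITS)  # whole packs only
    return BASE_FEE + packs * PACK_PRICE

print(monthly_cost(10))    # 40.0  -- within the included credits
print(monthly_cost(500))   # 70.0  -- one extra pack
```

The charge steps up with prompt volume, but nothing in the formula references a business outcome, which is exactly Jason's complaint.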

    488
    01:25:08.470 –> 01:25:18.380
    Jason Napieralski: So they’ve got like 15 different pricing models that they’re trying to cram down on the enterprise in order to buy this stuff, and none of it has any connection to the outcomes of their business.

    489
    01:25:18.530 –> 01:25:25.310
    Jason Napieralski: So without changing that we’re just rearranging deck chairs on the Titanic, and we’re trying to fleece our customer as much as we can.

    490
    01:25:25.790 –> 01:25:32.109
    Manny Medina: So what do you make of the argument that outcome-based pricing is too complicated for enterprise buyers?

    491
    01:25:32.390 –> 01:25:38.619
    Jason Napieralski: I'd say that somebody who says that has no understanding of their customer. They have no.

    492
    01:25:38.620 –> 01:25:39.839
    Manny Medina: Understanding it.

    493
    01:25:40.450 –> 01:25:43.119
    Jason Napieralski: They are admitting that they don't know who their customer is.

    494
    01:25:43.120 –> 01:25:44.239
    Manny Medina: A 100%. Yeah.

    495
    01:25:44.240 –> 01:26:13.090
    Jason Napieralski: It's a one-to-one relationship that you need to build with your customer. Yes, you need to scale it, but you need to understand that business. Now, the good news is that once you understand one industry, most businesses act much the same within that industry. So you don't have to constantly reinvent the wheel on defining outcomes; you can get to 80% per industry of what the outcomes may be, and then tweak it to price it out that way. But a lot of people... so the whole platform thing kind of ruined

    496
    01:26:13.150 –> 01:26:27.310
    Jason Napieralski: the whole idea of what a business is. So it's like, I'm building a platform, I'm not building... Then you can be absolved of talking about industries, you can be absolved of talking about business outcomes, all because I'm building a platform. It's a platform, so it could be anything, which means it's nothing.

    497
    01:26:27.390 –> 01:26:45.860
    Jason Napieralski: So the question is: how do you go back to industry? How do you understand your customer? How do you get deeply ingrained in what they need to do, and in what your software does to solve their problem? You have to understand their problem. And then it's fairly easy, and it's highly connected to the customer's metrics.

    498
    01:26:45.960 –> 01:26:47.470
    Jason Napieralski: Now their success metrics are like.

    499
    01:26:47.470 –> 01:26:57.970
    Manny Medina: Question from Eric Charles here in the chat. He says it's too hard, like it's not going to happen because it's too hard. So do we not do it because it's too hard, or do we do it because, hard as it is, it's the right thing to do?

    500
    01:26:58.180 –> 01:27:04.480
    Jason Napieralski: If you follow a leader, you follow the leader. That’s what I’m saying. If you want to be the leader, you have to do something different.

    501
    01:27:04.630 –> 01:27:17.549
    Jason Napieralski: And right now, you know, you've got a coordinated, basically a duopoly or triopoly of software companies that are in charge of everything, making crap. It's crap.

    502
    01:27:17.860 –> 01:27:31.300
    Jason Napieralski: So if you can disrupt it. If I could say to you, okay, don't sign that contract with Microsoft 365 for, you know, a hundred million dollars. Instead, pay me nothing, and just give me a percentage on every sale you make

    503
    01:27:31.520 –> 01:27:37.960
    Jason Napieralski: but using our software, or something simple like that. It changes the game completely. But.

    504
    01:27:37.960 –> 01:27:43.100
    Manny Medina: Imagine: your cost goes up, and their CFO is happy. When has that happened before? Right?

    505
    01:27:43.100 –> 01:27:43.630
    Jason Napieralski: Right.

    506
    01:27:43.630 –> 01:27:48.019
    Manny Medina: Because you're so aligned that you're both winning, even if you're paying more.

    507
    01:27:48.200 –> 01:28:11.209
    Jason Napieralski: This is how I sold cloud. I sold cloud to startups all day long, saying that the reason cloud is great for you is that if you get one more customer, it costs you one more unit of compute, so the two go together: as long as your margin is the same and you don't have a leaky bucket, your cost will rise with your

    508
    01:28:11.514 –> 01:28:11.820
    Manny Medina: Consumption!

    509
    01:28:11.820 –> 01:28:26.739
    Jason Napieralski: With your money. And that's the angle to look at: how do you price it? So I always say there should be a 10x rule. If you're selling something to somebody, particularly enterprise, there should be a 10x rule: the value I'm providing you is 10x what your problem is.

    510
    01:28:27.140 –> 01:28:46.139
    Jason Napieralski: and if you can do that and align the pricing there, it not only aligns the incentives of your entire company towards the customer’s success, but it aligns the incentives of the customers to their success, because now they’re looking at, and you would just be shocked how far away. Everyone has gone from customer obsession.
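
Jason's 10x rule can be written down as a quick pricing check. A sketch with hypothetical dollar figures, reading the rule as "the value delivered should be ten times the price charged":

```python
# Sketch of the "10x rule": deliver ten times the value you charge for.
# All dollar figures are hypothetical.
TEN_X = 10.0

def max_price(annual_value_delivered: float) -> float:
    """Highest price that still leaves the customer a 10x return."""
    return annual_value_delivered / TEN_X

def passes_10x_rule(price: float, annual_value_delivered: float) -> bool:
    """True if the deal delivers at least 10x the price charged."""
    return annual_value_delivered >= TEN_X * price

# If the software removes a $2M/year problem, charge at most $200k.
print(max_price(2_000_000))                  # 200000.0
print(passes_10x_rule(150_000, 2_000_000))   # True
print(passes_10x_rule(500_000, 2_000_000))   # False
```

Priced this way, the vendor's revenue only grows when the customer's measured value grows, which is the alignment both speakers are arguing for.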

    511
    01:28:46.250 –> 01:28:59.109
    Jason Napieralski: The private equity companies and the CFOs run these companies without any understanding of what their customers are doing. In the end, they're just looking for the big payout or the debt buyout.

    512
    01:28:59.110 –> 01:29:03.907
    Manny Medina: 100%. Alright, brother, it's been so good to talk to you, Jason. So good to meet you.

    513
    01:29:04.160 –> 01:29:05.119
    Jason Napieralski: Yeah. Pleasure.

    514
    01:29:05.120 –> 01:29:15.559
    Manny Medina: And that was a good, real conversation. I wish we had more of those. And Julia back to you. Appreciate you putting us together. That was that was awesome.

    515
    01:29:16.250 –> 01:29:23.050
    Julia Nimchinski: Thank you so much, Manny Jason, our community is raving. This was really good. Let’s do a promotional minute.

    516
    01:29:23.656 –> 01:29:28.049
    Julia Nimchinski: How can we support you? Jason, you mentioned the consultancy, and Manny,

    517
    01:29:28.420 –> 01:29:28.840
    Jason Napieralski: Yeah.

    518
    01:29:28.840 –> 01:29:30.870
    Julia Nimchinski: and Manny, your company, yeah.

    519
    01:29:30.870 –> 01:29:57.620
    Jason Napieralski: Yeah. So, starting a company, I need help right now. I'm still starting it, but I need help, because one of the biggest problems I've identified with businesses right now is the HR Gestapo that's in charge of most businesses. And I know enough about it, having, you know, suffered under their thumb. But I want some insiders that are willing to help me dismantle this giant

    520
    01:29:57.979 –> 01:30:01.210
    Jason Napieralski: format, so I need some HR pros that

    521
    01:30:01.210 –> 01:30:14.840
    Jason Napieralski: that hate the way things are to help me fix it. So that’s what I’m looking for right now, and it’s just getting started. And if you have interest in improving your culture or innovation and and bringing back meritocracy. Please reach out to me, and we can discuss it.

    522
    01:30:16.090 –> 01:30:20.830
    Manny Medina: Jason, you have an awesome pitch. I don’t know who wouldn’t take you up on that like.

    523
    01:30:20.830 –> 01:30:23.979
    Jason Napieralski: Let's hope. Maybe you could be the first client.

    524
    01:30:23.980 –> 01:30:30.780
    Manny Medina: Well, I'm not an executive anymore, but if I was, I would definitely be texting you right now to have a side conversation.

    525
    01:30:31.290 –> 01:30:40.110
    Manny Medina: Me, I'm working on a new project right now. I appreciate everyone joining, and for now, what you can do

    526
    01:30:40.160 –> 01:31:07.880
    Manny Medina: is just check out outreach. We got a whole new lineup of products that hopefully deliver value for you. We’re very proud that we began with elevating the role of that person was in charge of pipeline. So come in. We have a lot more in store for you. We can replace a bunch of tools in one single solution that does the job, and it doesn’t charge you a ton of money. So we’re excited about what we got going at outreach in our AI offerings.

    527
    01:31:08.680 –> 01:31:18.180
    Manny Medina: But thank you all, and thank you, Julia and Justin, for putting us together. And Jason, excited for your entrepreneurship journey. It seems like you're just getting started.

    528
    01:31:18.180 –> 01:31:19.029
    Jason Napieralski: Get started, yeah.

    529
    01:31:19.030 –> 01:31:24.249
    Manny Medina: On the way up. But, you know, that's what you signed up for. So good luck and enjoy it.

    530
    01:31:24.420 –> 01:31:27.269
    Jason Napieralski: Thank you. Thank you. Thanks everybody for joining.

    531
    01:31:27.270 –> 01:31:27.950
    Manny Medina: Thank you.
