Text transcript

VC Roundtable — AI-Native Network Effects and Defensibility

AI Summit held on Sept 16–18
Disclaimer: This transcript was created using AI
    1692
    04:33:59.509 –> 04:34:10.709
    Julia Nimchinski: Welcome to the show, Kathleen Estreich, partner at Pair VC, and we’re in for a real treat here at VC Roundtable, AI Native Network Effects and Defensibility.

    1693
    04:34:10.959 –> 04:34:19.409
    Julia Nimchinski: Snowflake, NFX, Redpoint Ventures, and Scale Venture Partners. Wow. Welcome to the show! How are you doing?

    1694
    04:34:20.070 –> 04:34:21.540
    Kathleen Estreich: Good, how are you doing?

    1695
    04:34:21.930 –> 04:34:23.609
    Julia Nimchinski: Super excited for this!

    1696
    04:34:24.180 –> 04:34:27.559
    Kathleen Estreich: Likewise. Should be fun. Hopefully controversial.

    1697
    04:34:27.759 –> 04:34:30.070
    Kathleen Estreich: That’s my ask to the panelists.

    1698
    04:34:32.529 –> 04:34:56.229
    Kathleen Estreich: Cool, thanks so much, everyone, for joining. I’m Kathleen, I’m a partner at Pair VC. We’re an early-stage generalist fund focused on pre-seed and seed investing. On the pre-seed side, we run an accelerator called PairX (applications for the next cohort are open through October 8th), and on the seed side, we lead or co-lead rounds.

    1699
    04:34:56.249 –> 04:35:14.949
    Kathleen Estreich: I’m super excited to have these panelists with us today. Today, we’re going to talk about defensibility in AI, and how you kind of think about moats, particularly from a VC perspective. Before we dive in, I think it’d be great for everyone to give a very quick intro, and maybe

    1700
    04:35:14.949 –> 04:35:22.649
    Kathleen Estreich: as part of your intro, give your definition of defensibility in this AI-native world.

    1701
    04:35:22.649 –> 04:35:24.269
    Kathleen Estreich: So, Jeremy, you want to go first?

    1702
    04:35:25.230 –> 04:35:37.979
    jeremyk: Sure, thanks, Kathleen. I’m Jeremy Kaufman, partner at Scale Venture Partners, focused on Series A and B in, you know, vertical AI and application-level AI companies. I…

    1703
    04:35:38.130 –> 04:35:41.080
    jeremyk: My word would probably be future workflows.

    1704
    04:35:43.779 –> 04:35:45.139
    Kathleen Estreich: Valerie, do you want to go?

    1705
    04:35:46.529 –> 04:35:57.979
    Valerie Osband Mahoney: Yeah, I’ll go. Hi everyone, I’m Valerie, I’m with NFX, we’re a generalist seed fund. NFX does technically stand for network effects, so as we talk about network effects, that is what we work on.

    1706
    04:35:57.979 –> 04:36:03.669
    Valerie Osband Mahoney: I mean, I think, with defensibility in AI, I really think it is, like…

    1707
    04:36:03.869 –> 04:36:23.669
    Valerie Osband Mahoney: how… your ability to manage a sustained competitive advantage through either your data, your talent, your model architectures, that people can’t actually just come and replicate today. It’s maybe long-term, but I think it’s more that sustained competitive advantage, so you’re always at least, sort of, at least a half or one step ahead of others.

    1708
    04:36:26.770 –> 04:36:27.460
    Kathleen Estreich: Cool.

    1709
    04:36:27.800 –> 04:36:30.910
    Kathleen Estreich: Either… Madhan or Jacob?

    1710
    04:36:31.790 –> 04:36:35.280
    Madhan Arumugam Ramakrishnan: I can go next. Hello everyone, my name is Madhan, I run our

    1711
    04:36:35.400 –> 04:36:43.030
    Madhan Arumugam Ramakrishnan: field engineering organizations for Snowflake. I work with customers small and large across, all the industries, and…

    1712
    04:36:43.380 –> 04:36:57.659
    Madhan Arumugam Ramakrishnan: To your question, Kathleen, I would say, you know, establishing some predictability on margin creation with AI, that would be what would, you know, create more defensibility.

    1713
    04:36:59.890 –> 04:37:15.290
    Jacob Effron: Awesome, and I’m Jacob, I’m a managing director over at Redpoint, lead a lot of our AI app investing, and, you know, I think of defensibility as things you can build when you get to kind of scale and ubiquity within organizations, but really the first chapter being a race to get to that.

    1714
    04:37:17.410 –> 04:37:24.699
    Kathleen Estreich: Awesome. I’m excited to have you all here, and dive into this, I think, pretty timely topic.

    1715
    04:37:24.830 –> 04:37:29.119
    Kathleen Estreich: I would say, I guess my first question is maybe

    1716
    04:37:29.169 –> 04:37:53.019
    Kathleen Estreich: each of you can walk through maybe a recent company that you either invested in or saw that had what you would consider some sort of moat or, defensibility that got you excited enough to potentially write a check. So maybe walk through, kind of, how you thought about that from an investment lens, and or from Radin, I know at Snowflake, you guys see a bunch of companies, so…

    1717
    04:37:53.020 –> 04:38:02.949
    Kathleen Estreich: Would love to hear, kind of, how… how you think about that as well. So, a company that you recently underwrote, and the moat or defensibility that they had that got you excited enough to invest.

    1718
    04:38:06.880 –> 04:38:08.289
    Valerie Osband Mahoney: Anyone want to go first?

    1719
    04:38:10.759 –> 04:38:25.699
    Valerie Osband Mahoney: I’m happy to go. So we have a company, maybe one we’ve announced recently, it’s called Neon Health, and it’s building, sort of, a vertical AI automation platform in the pharmaceutical services industry. And I think for us, where we see the moat here, I mean,

    1720
    04:38:26.289 –> 04:38:43.349
    Valerie Osband Mahoney: it is combining a lot of the next-gen technology of, sort of, voice AI and, sort of, how you’re thinking about, like, scaling these operations, but I think what… how they’re getting defensibility is really their go-to-market. They have a team of really, deep experts and connections, sort of, within that pharmaceutical industry, and I do think, in some ways.

    1721
    04:38:43.639 –> 04:38:53.219
    Valerie Osband Mahoney: I think in some of these vertical areas, maybe a hot take or not, there’s always, like, probably going to be 20 or 30 of them, always. And I do think that first mover advantage is probably even more important now, because

    1722
    04:38:53.339 –> 04:38:55.319
    Valerie Osband Mahoney: A lot of people are going to be able to have

    1723
    04:38:55.649 –> 04:39:15.389
    Valerie Osband Mahoney: 10x or 20x better product, and I think for them, they really found that sort of wedge just really at the beginning of when the real developments, kind of back in October last year, happened in voice AI, really going after a market that’s really large, and then for them, in this industry, really showing how they can really accelerate the revenue sales. And now that becomes quickly that go-to platform, so I think

    1724
    04:39:15.389 –> 04:39:20.479
    Valerie Osband Mahoney: Both… there’s more market to grab, but also, like, you gotta grab it straight away, otherwise…

    1725
    04:39:20.679 –> 04:39:23.109
    Valerie Osband Mahoney: You’re never going to be able to grab onto it again.

    1726
    04:39:23.320 –> 04:39:37.139
    Kathleen Estreich: On that note, do you think that first mover is sustainable? Like, how do you think of the fast follow? To your point, like, there’s 50 companies in every category at this point, so if you see something that maybe already has that first mover.

    1727
    04:39:37.560 –> 04:39:44.270
    Kathleen Estreich: You know, you’re evaluating that, do you… like, how defensible long-term is that, or do they need to use that to then compound other things?

    1728
    04:39:44.540 –> 04:39:54.790
    Valerie Osband Mahoney: I think, good question. I think, to me, at least how I’m seeing it, it really depends on the industry. I would say there’s certain industries where I feel, for example, like law, where I do think Harvey actually

    1729
    04:39:54.790 –> 04:40:08.209
    Valerie Osband Mahoney: I think you actually have an almost better advantage being a second mover here, because they’ve done a lot of the legwork of kind of overcoming this, like, old industry who would, like, never buy AI, and then, like, deal with that, and the time went into that. Whereas now, because I’ve kind of done…

    1730
    04:40:08.210 –> 04:40:19.320
    Valerie Osband Mahoney: a lot of the legwork and gone there now, like, it opens up a lot of other applications. And maybe that means you’re not going on corporate counsel, or external counsel, maybe you’ve got in-house counsel, or juries, or litigators, or all of that, but I think it’s sort of…

    1731
    04:40:19.320 –> 04:40:35.979
    Valerie Osband Mahoney: being a follow there is really helpful, because… and I think I see this in the financial services industry as well, being a first mover can be really hard, because you get bogged down in the bureaucracy, and then following in those, yeah, more regulated industries. There’s… others have already opened the door for you, and you can… especially from a

    1732
    04:40:36.520 –> 04:40:43.480
    Valerie Osband Mahoney: having rapid growth aligned with VC incentives and where you want to go, you’re going to be able to hit that quicker, because you’re not going to spend 3 years

    1733
    04:40:43.880 –> 04:40:47.170
    Valerie Osband Mahoney: You’re less likely to spend 3 years in a bureaucracy triangle.

    1734
    04:40:47.510 –> 04:41:05.699
    Kathleen Estreich: Yeah, I would tend to agree on the legal tech front. I’ve been pretty excited diving in there, and it’s like, Harvey paved the way, and now there’s so much greenfield, because, you know, folks are already adopting at least that, and they’re ready for some more purpose-built tools in combination.

    1735
    04:41:06.010 –> 04:41:07.959
    Kathleen Estreich: Jacob, do you have one in mind?

    1736
    04:41:08.630 –> 04:41:31.619
    Jacob Effron: Yeah, no, I recently led an investment in a company called Augment in the AI logistics space, and I think that’s an example of just a really unique founding team, right? The CEO, Harish, and one of his co-founders, Justin, had just lived in the logistics industry for the last, like, 15, 20 years, and so I think, as you think about lots of people going after this problem, just really deep understanding of, like, what the

    1737
    04:41:31.620 –> 04:41:41.570
    Jacob Effron: needs are within that space than married with, like, a really strong AI team. And so, I think a lot about, you know, in some of these crowded spaces, just finding a really unique team that you can back,

    1738
    04:41:41.570 –> 04:41:49.520
    Jacob Effron: And over time, you’re kind of betting on that team to continue to compound and ship quickly and discover these problems, but ultimately, you know, starting with that, I think, is really helpful.

    1739
    04:41:50.080 –> 04:42:04.279
    Kathleen Estreich: How do you think about, the deep domain expertise versus these sort of, like, AI-native founders in their early 20s who kind of just, you know, come into these industries? Like, do you or Redpoint have kind of a perspective around

    1740
    04:42:04.430 –> 04:42:06.400
    Kathleen Estreich: You know, either of those profiles.

    1741
    04:42:06.400 –> 04:42:24.559
    Jacob Effron: Yeah, I mean, I’ve invested in both, and I think I have immense respect for both profiles, I guess. I think the strong part about the people that are really deep in an industry is obviously, you know, both knowing out of the gate what the right things to build are, but then I think also, once you get to some level of scale.

    1742
    04:42:24.560 –> 04:42:37.430
    Jacob Effron: knowing what, like, the, you know, the Act 2, Act 3s, Act 4s, and having that kind of coherent vision from the beginning, and being able to, you know, meet with the CEOs of the largest enterprises in that space, and bring them on that journey.

    1743
    04:42:37.430 –> 04:42:52.470
    Jacob Effron: But obviously, so much of… I think the topic of this panel is defensibility. I’d say, like, I think there’s very little defensibility in these early innings of AI, and so, so much of what is defensible today is just velocity, right? And shipping really fast, and ultimately, these spaces can be learned, and customers

    1744
    04:42:52.470 –> 04:43:08.449
    Jacob Effron: I think uniquely in this AI era are willing to invite people in to, you know, talk to the CEOs of large banks and law firms and hospitals in ways that, in the past, maybe only the domain experts would have been able to do. And so, I guess maybe an unfair cop-out to your question, but I think both profiles are really interesting.

    1745
    04:43:08.840 –> 04:43:09.660
    Kathleen Estreich: Yeah.

    1746
    04:43:10.280 –> 04:43:13.819
    Kathleen Estreich: Cool. Madhan, do you have an example of a company?

    1747
    04:43:13.970 –> 04:43:31.380
    Madhan Arumugam Ramakrishnan: Yeah, I mean, obviously, we work with a lot of customers and partners. The one that comes to mind recently, it’s actually something that our CEO highlighted recently, is a company called Simon Data. They focus on marketing, and they… their evolution to Simon.ai

    1748
    04:43:31.380 –> 04:43:42.409
    Madhan Arumugam Ramakrishnan: you know, which is basically a context personalization for marketers at scale. The reason I’m highlighting that is, look, as much as there’s so much disruption about

    1749
    04:43:42.460 –> 04:43:51.970
    Madhan Arumugam Ramakrishnan: you know, reimagining, you know, how you do marketing or finance or, you know, vertical AI or physical AI, being able to transform your expertise

    1750
    04:43:51.970 –> 04:44:08.729
    Madhan Arumugam Ramakrishnan: to something relevant is equally important, and that also gives you hope that, you know, expertise today, you know, can be transformed to expertists in the Gen AI era. And as, you know, someone else said, like, it’s really about

    1751
    04:44:08.870 –> 04:44:14.039
    Madhan Arumugam Ramakrishnan: You know, velocity, but both, the scale and speed of diffusion.

    1752
    04:44:14.120 –> 04:44:30.339
    Madhan Arumugam Ramakrishnan: you know, I think that’s what I think Gen AI has given us, is being able to immediately kind of make all this awesomeness from product, through Gen AI be available to a lot more customers than ever before, but also at the same depth of

    1753
    04:44:30.340 –> 04:44:45.120
    Madhan Arumugam Ramakrishnan: customization and personalization, you know, that is unique. Like, we didn’t have that, right? Like, deeply customized marketer persona experiences was hard to get at scale. Now, you don’t have to sacrifice one for the other. Customization and scale is possible.

    1754
    04:44:48.080 –> 04:44:49.110
    Kathleen Estreich: Jeremy.

    1755
    04:44:50.680 –> 04:45:09.869
    jeremyk: I would just say, one of the funny trends that I feel like we’ve been seeing in a lot of these vertical voice deals, I mean, you know, I did one in the healthcare space recently, is almost the cartoonization of the founder. Like, because the buyer is just as confused as the VCs are as to why they’re supposed to pick one of the 20,

    1756
    04:45:09.870 –> 04:45:10.960
    jeremyk: I mean, it’s just a…

    1757
    04:45:10.960 –> 04:45:20.030
    jeremyk: funny to literally see the founders, like, dressing up in the clothes, or like, you know, just taking on the identity of, you know, I did something in healthcare.

    1758
    04:45:20.030 –> 04:45:34.730
    jeremyk: where the founder was ex-Carbon Health, and it was… I tell the story of Carbon Health. I mean, Jacob, I know you guys are investors in a construction tech company, where if you, go on LinkedIn, you know, the founder’s literally wearing a construction hat. Like, you’re…

    1759
    04:45:34.730 –> 04:45:48.419
    jeremyk: I don’t think the VCs have the answer, but the funny thing is I don’t think the customers have the answer either, and therefore people are just trying strange hacks to get attention, and I… it’s just such a… I don’t even know what the right word is. It’s like co-splaying…

    1760
    04:45:48.420 –> 04:45:50.880
    jeremyk: You know, professional, almost.

    1761
    04:45:52.620 –> 04:45:56.399
    Kathleen Estreich: Does it work? I mean, how far do those things take you?

    1762
    04:45:56.910 –> 04:46:03.589
    jeremyk: I think it… I think it allows you to get to the first inning. Like, I feel like the odd thing here is that…

    1763
    04:46:04.020 –> 04:46:15.060
    jeremyk: You know, we actually had a… this is a bizarre conversation to have at a board meeting at a medical scribe company, but we were actually talking about, you know, where do you allocate engineering resources? And…

    1764
    04:46:15.060 –> 04:46:34.280
    jeremyk: you know, we had a legitimate conversation as, because the foundation model improvement is going to have such a big effect on the company, the marginal hours spent on eking out marginal gains before the next model drops is less important than changing, possibly, the underlying infrastructure. So therefore, it’s almost like the product has to work.

    1765
    04:46:34.310 –> 04:46:46.239
    jeremyk: But the way to get in the conversation is to have, you know, be known as the thing. And I think if you are completely… the product doesn’t work, eventually it catches up to you.

    1766
    04:46:46.240 –> 04:46:54.890
    jeremyk: But, you know, it’s just… it’s amazing to watch how so many of the winners have sold ahead of the product capabilities. And…

    1767
    04:46:55.100 –> 04:46:59.030
    jeremyk: Yeah, that goes against a lot of traditional speak.

    1768
    04:46:59.880 –> 04:47:14.259
    Kathleen Estreich: Yes and no. I feel like some big enterprises have vaporware conferences where they, you know, announce a lot of stuff on stage, and then nothing actually comes of it, because they need stuff to announce. That’s been going on, at least from my experience, the

    1769
    04:47:14.260 –> 04:47:14.640
    jeremyk: Yeah.

    1770
    04:47:14.640 –> 04:47:18.210
    Kathleen Estreich: entire time I’ve been in tech. But I think that…

    1771
    04:47:18.290 –> 04:47:39.300
    Kathleen Estreich: potentially the time horizon by which you can then build it, because people are building a lot faster, is shortened, so you’re a little bit, less at risk, but although I don’t know if companies ever were that at risk, because Enterprise is just… I think on the enterprise, people just move quite slow anyways, so by the time they’re actually ready to implement it, the product is probably there.

    1772
    04:47:39.360 –> 04:47:43.429
    Kathleen Estreich: But, yeah, how much do you think of a founder as…

    1773
    04:47:43.760 –> 04:48:07.020
    Kathleen Estreich: I’ve been asked this, actually, quite frequently, and I’m curious to get all of your takes of, like, the founder as the sort of, and you kind of mentioned this, Jeremy, and Jacob, you in other ways, the founder as sort of a moat in terms of, like, their distribution, their voice in the market. There’s obviously some very interesting marketing tactics going on with some companies out there, that I’m sure we’ve all seen.

    1774
    04:48:07.020 –> 04:48:09.009
    Kathleen Estreich: But how much does the founders

    1775
    04:48:09.010 –> 04:48:16.850
    Kathleen Estreich: reach and distribution matter to you in terms of moat, and how defensible is that, from… from where you all sit?

    1776
    04:48:21.470 –> 04:48:28.830
    jeremyk: I mean, I think somebody wrote a really good article on it, the Levered Beta article, which was basically, like,

    1777
    04:48:29.560 –> 04:48:47.889
    jeremyk: because there’s a whole lot of folks just sitting on top of ChatGPT, the founder’s network and the founder’s buzz starts the conversation. I mean, I think… I mean, basically the point of this article was, it’s okay to get a little bit ahead of the model capabilities.

    1778
    04:48:48.460 –> 04:49:03.140
    jeremyk: But you also have to know, you know, how quickly the model’s gonna improve. So if you’re claiming, you know, it’s gonna do something that it can’t do for 3 years, and you’re sitting out there making claims, that’s gonna catch up to you.

    1779
    04:49:03.660 –> 04:49:20.050
    jeremyk: So yeah, I think we’re all seeing levels of marketing, and I think the… you have to be in the middle. If you don’t make the claims, people don’t know who you are, but if you make things that are so over-the-top and disprovable.

    1780
    04:49:20.210 –> 04:49:22.910
    jeremyk: Like, you know, the buyer’s gonna react.

    1781
    04:49:25.660 –> 04:49:38.720
    Jacob Effron: Yeah, I mean, I think great founding teams are a huge part of early defensibility, right? If I think about what a lot of these best AI apps have done, I think of them as kind of partners to the, you know, C-suite or functional leaders of these companies, and

    1782
    04:49:38.720 –> 04:49:52.089
    Jacob Effron: They’re partners both in, you know, realizing whatever the capabilities of these models are today, but also, you know, these companies are picking partners to basically go on this 5-10 year journey that is gonna be this massive improvement in models over that time.

    1783
    04:49:52.090 –> 04:49:55.260
    Jacob Effron: And so, I think part of this is… is…

    1784
    04:49:55.260 –> 04:50:11.730
    Jacob Effron: the company and the founders having this trusted brand such that it’s a big decision for a hospital exec, or a law firm exec, or a bank exec to say, hey, this is the partner we’re going to work with, and they’re going to help me navigate this all, and obviously you don’t want to fall behind your competitors in doing that, and so I think

    1785
    04:50:11.730 –> 04:50:22.460
    Jacob Effron: you know, the way that teams can build that kind of trust, you know, to the extent they can do that with brand equity they built around the company or the founding team, I think is hugely helpful to the businesses.

    1786
    04:50:24.270 –> 04:50:39.019
    Madhan Arumugam Ramakrishnan: Yeah, I agree. I think with so much disruption happening almost every day, I was talking to a founder the other day, and they were like, yeah, we got disrupted by, you know, Nano Banana, or, you know, you name it, right? Something that the big model companies are working on.

    1787
    04:50:39.020 –> 04:50:45.020
    Madhan Arumugam Ramakrishnan: But what matters, I feel, is, you know, the trust element, or their ability to adapt.

    1788
    04:50:45.020 –> 04:50:58.519
    Madhan Arumugam Ramakrishnan: that the founding team can display and demonstrate and say, you know, we are ready for disruption, and we’ll adapt and pivot. And so that comes from, you know, their ability to kind of show, you know, either in their experience or

    1789
    04:50:58.670 –> 04:51:08.319
    Madhan Arumugam Ramakrishnan: in, you know, lived experience in front of all the disruption, how they pivot and how they adjust and change their value prop. So, absolutely, I think it has some equity and value.

    1790
    04:51:10.670 –> 04:51:26.660
    Valerie Osband Mahoney: I would say here, I mean, it’s always been useful to be, like, a thought leader in some space, whichever way you’ve done that. I think some of the earlier Gen Z founders are sort of redefining how to do that, and sort of, you see that even in media, you know, traditional media, sort of,

    1791
    04:51:26.660 –> 04:51:43.640
    Valerie Osband Mahoney: being upended by a lot of, sort of, new podcasts and new ways of talents across industries, and I think that just same thing is happening within technology as well, where people realize that, okay, maybe I don’t have 20 years of experience selling in this industry, so how can I overcome that without getting that is…

    1792
    04:51:44.120 –> 04:52:00.309
    Valerie Osband Mahoney: becoming a thought leader in that space. Actually, people are redefining, I think, more open on that, so I think it is an advantage. I think we talk, you know, what’s that balance between people working and doing that, and I think that’s hard, of, like, how do you go do that? But I think we’ve seen in our portfolio, it’s very clearly, like.

    1793
    04:52:00.310 –> 04:52:05.090
    Valerie Osband Mahoney: opened a lot of doors, for people, I think especially in some

    1794
    04:52:05.460 –> 04:52:20.470
    Valerie Osband Mahoney: harder verticals, we, you know, we do some deep tech and, like, space investments, and I think some of those industries with, like, long… much longer sales cycles and development, and just sort of research timelines as well, where staying top of mind in that conversation, because, hey, it’s going to take a little while to launch, you’re not

    1795
    04:52:20.880 –> 04:52:37.849
    Valerie Osband Mahoney: you’re not gonna get 100 million ARR in 6 months, that’s just, like, not what’s happening, because you’re, like, building your rocket, and I think that can help people, like, keep you top of mind and in that conversation. So, I think it’s good, but I think we always encourage… I don’t know what the right time should be, but it… it should be a small percentage, you know.

    1796
    04:52:38.320 –> 04:52:52.560
    Kathleen Estreich: I always tell founders that ask me, like, do I need to be out there? I’m like, if it’s authentic to you to be out there, and you have something to say, then be out there. If you… if it’s not authentic to you to do market… you know, do marketing or stuff that is…

    1797
    04:52:52.620 –> 04:53:16.619
    Kathleen Estreich: not your style, then it’s gonna come off as inauthentic, and so I think for every founder, they need to figure out what is their unique way of reaching their audience, whether that is being very out there, doing more targeted things, whatever that is, to sort of help you reach the people that you’re actually trying to reach, and there’s many ways to do that, but if it’s inauthentic to the founder, you can just spot it so easily that it actually…

    1798
    04:53:16.620 –> 04:53:20.170
    Kathleen Estreich: Does a disservice versus actually helping the company.

    1799
    04:53:20.170 –> 04:53:43.870
    Kathleen Estreich: I want to switch gears a little bit to kind of the types of, sort of, moats that you all are seeing and thinking about. In terms of, like, I’m seeing a lot of companies that are building, kind of, these orchestration layers on top of some of these, like, systems of record, across a bunch of different verticals and industries. I’m curious, sort of, how you all think about, in this

    1800
    04:53:43.870 –> 04:54:01.169
    Kathleen Estreich: moment in time in this AI world of, like, orchestration layers versus trying to be more of that kind of full-stack product, and how you think that sort of plays out, what’s defensible about an orchestration layer or not. I’m actually curious,

    1801
    04:54:01.170 –> 04:54:05.729
    Kathleen Estreich: Madam, for you to kick off, because Snowflake obviously sits kind of in the center of a lot of this.

    1802
    04:54:05.960 –> 04:54:11.390
    Madhan Arumugam Ramakrishnan: Yeah, no, it’s a really good question. At Snowflake, you know, we are super aware that

    1803
    04:54:11.460 –> 04:54:26.300
    Madhan Arumugam Ramakrishnan: We’re not, you know, the source of the data. You know, a lot of the systems of record, you know, you think about a CRM, or think about, you know, a finance, you know, SAP or a Workday, I mean, all these systems are outside of us, so we know

    1804
    04:54:26.300 –> 04:54:42.720
    Madhan Arumugam Ramakrishnan: that, you know, we don’t originate data, but we end up being kind of the destination of a lot of the goal-layer data, as we’d like to call it. So we see it both ways, right? A lot of customers tell us, you know, data is the fuel, and we understand that, and we advocate that for all things AI.

    1805
    04:54:42.720 –> 04:54:57.349
    Madhan Arumugam Ramakrishnan: And because of that, you know, they want to keep all their application workflows, agentic, you know, innovation as close as possible to data, because this is their asset, this is their product, even. So what that means is we have to build connectors

    1806
    04:54:57.350 –> 04:55:04.009
    Madhan Arumugam Ramakrishnan: for everything, every system of record. But we also understand that, like, there’ll be different, you know.

    1807
    04:55:04.010 –> 04:55:13.999
    Madhan Arumugam Ramakrishnan: entry points. You know, every company that has any sort of systems of engagement or systems of record want to do the same thing. They don’t want the data to be taken out.

    1808
    04:55:14.000 –> 04:55:34.550
    Madhan Arumugam Ramakrishnan: They want all the processing and the workflow to kind of happen in their user experience. So, and this is why things like the model context protocol or agent-to-agent protocol will be kind of the semantic, you know, interchange, if you will, of the future. You have to participate in that, like, so all the systems of record will compete to kind of have

    1809
    04:55:34.550 –> 04:55:50.570
    Madhan Arumugam Ramakrishnan: you know, their customers’ experience in their surface area, but at the same time, everybody knows that you’ll have to match data. Like, you know, if I’m a healthcare provider, you know, I want Workday data for getting information about people, so I can understand, you know, different dynamics. So.

    1810
    04:55:50.610 –> 04:55:58.940
    Madhan Arumugam Ramakrishnan: So we see that all the time, and so our point of view is, you know, participate in both, right? Like, it’s, you know, it’s too early

    1811
    04:55:58.940 –> 04:56:12.529
    Madhan Arumugam Ramakrishnan: to kind of have strong opinions on this will win, alright? Like, optimize for optionality. You know, that’s our take, and making sure that we show up headless if need to be, and we show up with a great user experience if people want to operate on our surface area.

    1812
    04:56:14.970 –> 04:56:27.099
    Kathleen Estreich: Makes sense. What’s an opinion… you mentioned it’s so early, like, what’s an opinion that maybe you or Snowflake had a year ago that’s changed on this, given how fast things are naturally changing right now?

    1813
    04:56:27.260 –> 04:56:32.929
    Madhan Arumugam Ramakrishnan: I think the rate at which things are moving into production

    1814
    04:56:33.050 –> 04:56:38.519
    Madhan Arumugam Ramakrishnan: that definitely has changed, right? Like, you know, last year was a lot of POCs, tire kicking.

    1815
    04:56:38.520 –> 04:56:44.079
    Madhan Arumugam Ramakrishnan: now we have material, regulated industry use cases, right? Like, you know, they’re, you know,

    1816
    04:56:44.080 –> 04:56:57.280
    Madhan Arumugam Ramakrishnan: These are… these are at scale. These have real meaningful impact on the bottom line, the top line, margin creation. So that’s definitely a change. You know, you’ll see, kind of, all other vendors talking about that as well.

    1817
    04:56:57.280 –> 04:57:14.100
    Madhan Arumugam Ramakrishnan: And that’s because there is a lot more durability on what to expect from the, you know, the foundational labs, you know, working on various models. And then there’s also real clarity on where there are games. There’s a bunch of use cases. For example, I run our

    1818
    04:57:14.100 –> 04:57:37.879
    Madhan Arumugam Ramakrishnan: support engineering team. The support space is ripe for, you know, efficiencies and efficiency gains. There’s so many players there. And I talked to a bunch of the startups in this space to see, you know, how I can create and create efficiency. So that’s a no-brainer, right? Like, this is kind of the next step in creating some autonomous AI-based workflows that will create a lot more, you know, improvements in customer experience.

    1819
    04:57:37.880 –> 04:57:49.409
    Madhan Arumugam Ramakrishnan: case deflection, you name it, right? Like, so these are problems that we’ve known. Again, the combination of scale, accuracy, and speed allows us to kind of get faster into these sort of production

    1820
    04:57:49.410 –> 04:58:03.379
    Madhan Arumugam Ramakrishnan: mode. So I would say that would be the biggest thing. And I do think, like, you know, especially in enterprise, you know, arguably, as you talked about, even though things are slow, they’re very sticky, right? You know, once they kind of move into this mode, like, they build

    1821
    04:58:03.380 –> 04:58:12.210
    Madhan Arumugam Ramakrishnan: An entire periphery of, you know, gestures in the enterprise that are fine-tuned to that particular process, or business process, or business workflow, so…

    1822
    04:58:12.210 –> 04:58:16.119
    Madhan Arumugam Ramakrishnan: So we’re seeing that, right? Like, very sticky workflows starting to happen on AI.

    1823
    04:58:16.580 –> 04:58:20.909
    Kathleen Estreich: Have you brought in a new tool for your team, on the support side?

    1824
    04:58:21.050 –> 04:58:39.879
    Madhan Arumugam Ramakrishnan: Oh yeah, yeah, you know, we talked to all the vendors, and we built our own on top of Snowflake Cortex, you know, because obviously we run on Snowflake ourselves. We also use all the coding agents that are out there, you know, the Cursors of the world, you know, GitHub Copilot, and all of that. Tremendous productivity gains. I mean, of course, as someone said in the previous podcast,

    1825
    04:58:39.880 –> 04:58:58.210
    Madhan Arumugam Ramakrishnan: you have to be in the edge of learning, you know, to be able to have the license and the right to say, hey, this is material, and this is, you know, not as material. So yeah, so absolutely, we, all the way up to our CEO, are, you know, big, big users of pretty much all the AI tools out there.

    1826
    04:59:00.030 –> 04:59:00.760
    Kathleen Estreich: Cool.

    1827
    04:59:01.860 –> 04:59:08.590
    Kathleen Estreich: Anyone else have any thoughts on, sort of, that orchestration layer and how you think about defensibility there?

    1828
    04:59:10.920 –> 04:59:30.030
    jeremyk: It’s just interesting to note, I mean, compare and contrast, I feel like, in the last couple months, two hot categories that kind of fall under there: the ERP replacement, and then what’s going on with everything in modules of the EHR. And, I don’t know, a year… I mean, right now,

    1829
    04:59:30.180 –> 04:59:33.270
    jeremyk: Everyone makes fun of, you know, the…

    1830
    04:59:33.350 –> 04:59:56.029
    jeremyk: you know, how many rounds can happen in ERP replacement, but just even conceptually, if you had asked me a year ago, like, what’s the probability of people will be willing to rip and replace their ERP versus their CRM versus their EHR, it would not have been obvious to me that the clear answer was rip and replace your, you know, your ERP.

    1831
    04:59:56.030 –> 05:00:08.469
    jeremyk: I… I’m intrigued by that, and how that seems to differ. I mean, in healthcare, a lot of people have made the assumption, build scaffolding around the core system, bundle more, build yourself into the core system.

    1832
    05:00:08.470 –> 05:00:24.249
    jeremyk: Why in the ERP space, a lot of people made that same assumption, and then 2 or 3 businesses made the assumption, rip and replace. I don’t have a good perspective on why that seems to be happening differently in different core systems of record, but I think that’s really interesting.

    1833
    05:00:27.260 –> 05:00:52.249
    Jacob Effron: Yeah, I mean, I think those two markets probably, you know, actually represent the difference in those opportunities, right? Where the nice thing about the ERP market is there are companies that are always growing into needing one for the first time, and so there’s this whole, like, you know, greenfield to go into. Whereas in the EHR space, most of the time, I mean, sometimes there are new practices that are started, but by government mandate, most people have something in place already. So I think the rip and replace in all these spaces is still hard, but in some

    1834
    05:00:52.250 –> 05:00:57.579
    Jacob Effron: of the other ones, you know, there’s new needs for system of records, which I think, does create an opportunity.

    1835
    05:00:57.840 –> 05:00:58.410
    jeremyk: Yeah.

    1836
    05:01:00.890 –> 05:01:09.859
    Madhan Arumugam Ramakrishnan: Maybe the one additional point I would add is, you know, there’s a real push, especially in this sort of, you know, data-as-fuel-for-AI era,

    1837
    05:01:09.860 –> 05:01:19.419
    Madhan Arumugam Ramakrishnan: for minimizing data movement, right? Like, obviously, there is economic advantages for the vendors having systems of engagement, systems of record, to kind of keep the data where it is.

    1838
    05:01:19.420 –> 05:01:36.340
    Madhan Arumugam Ramakrishnan: But also, it’s costly, it opens up that attack surface, accuracy goes down. So, you know, like, even, like, if you look at, you know, our announcements this week with Workday and Microsoft, a lot of this is bidirectional read-write access with, you know, no data duplication or movement.

    1839
    05:01:36.340 –> 05:01:45.530
    Madhan Arumugam Ramakrishnan: That’s becoming a real trend. There are open formats, industry standards, even in each vertical that’s developing, because they realize that the moment you’re moving.

    1840
    05:01:45.530 –> 05:01:56.420
    Madhan Arumugam Ramakrishnan: you know, your access control is off the books, and, you know, you can’t control accuracy or access to data, so that’s, again, something that we are seeing. And my expectation is, even in this rip and replace

    1841
    05:01:56.420 –> 05:01:57.460
    Madhan Arumugam Ramakrishnan: context.

    1842
    05:01:57.460 –> 05:02:07.509
    Madhan Arumugam Ramakrishnan: you know, people are going to kind of worry a lot about governance of this data, because, like I said, data is the product these days, and that kind of shows you a differentiation with your competitors.

    1843
    05:02:11.710 –> 05:02:13.970
    Kathleen Estreich: Anything else to add, Valerie?

    1844
    05:02:15.450 –> 05:02:17.159
    Valerie Osband Mahoney: Honestly, not really at all.

    1845
    05:02:18.470 –> 05:02:23.109
    Kathleen Estreich: All good. One other question I wa-

    1846
    05:02:23.130 –> 05:02:41.330
    Kathleen Estreich: I’d love to hear, particularly from… from the three investors here, when you think about, kind of, when… when founders are pitching you about, hey, here’s our, like, here’s our moat, here’s… here’s, like, how we’re defensible, here’s how we’re different from everyone else, what are some of the misconceptions that you’ve seen founders…

    1847
    05:02:41.370 –> 05:02:48.910
    Kathleen Estreich: Have around what makes, like, a, you know, a true moat, or areas where you’ve, you know, kind of…

    1848
    05:02:48.990 –> 05:02:57.050
    Kathleen Estreich: Come out of that conversation a little more skeptical, where founders perceived, you know, something that maybe, from where you sit, was a little bit off.

    1849
    05:02:58.480 –> 05:03:03.959
    Valerie Osband Mahoney: I personally, I do think it’s been changing over the last 6 months or so, but…

    1850
    05:03:04.150 –> 05:03:15.579
    Valerie Osband Mahoney: I just… I think we here at NFX have always believed this, but that there’s really… rarely a case you’re really going to win because of that technical moat that goes there. I think everyone’s…

    1851
    05:03:15.580 –> 05:03:27.480
    Valerie Osband Mahoney: you know, was doing the rounds earlier this year, everyone’s, like, laughing at, like, DocuSign. And, like, oh my god, how is this such a big company? Look, I can just, like, vibe code it in 20 seconds, or whatever. You know, it goes on that, it’s so easy. And the reality is, I think that’s always

    1852
    05:03:27.480 –> 05:03:46.099
    Valerie Osband Mahoney: often been the case for a lot of businesses. It’s not, okay, maybe you couldn’t do it in a week, but you could take a team and do it in 3 or 4 months, and the business didn’t win because the product was impossible to copy. It didn’t win just because no one else could go do that. Obviously, there’s some examples of that, and there’s, you know, different industries. And so I think people who are…

    1853
    05:03:46.630 –> 05:03:51.789
    Valerie Osband Mahoney: you know, we win because we’re X and X the best at this, and I’m like, okay, well, maybe you’re the best at that right now.

    1854
    05:03:51.790 –> 05:04:10.869
    Valerie Osband Mahoney: But how’s that really going to go and scale? And, like, yeah, what is your distribution? I think under-indexing on, like, how I’m really going to get go-to-market quickly. And if anything now, I think we prefer to work with teams who have a really good gauge and path and action plan and ambition, you know, in their go-to-market.

    1855
    05:04:11.190 –> 05:04:25.260
    Valerie Osband Mahoney: And I think, you know, before that, you could maybe… you’d have longer to grow into that, because you would need to build the technology, and it would have longer to build the technology before someone else could build it, and I think now, I’m like, as soon as this goes out in public, someone could copy it, so…

    1856
    05:04:25.260 –> 05:04:26.520
    Kathleen Estreich: Get ahead, gonna win.

    1857
    05:04:28.150 –> 05:04:30.599
    Kathleen Estreich: Yeah, the first… that quote…

    1858
    05:04:30.600 –> 05:04:53.600
    Kathleen Estreich: that we know we’ve all seen, like, first-time founders think about products, second-time founders think about distribution. I think the distribution matters so much more now than… I would argue it always mattered, but people didn’t acknowledge it, but I think it matters so much in the early days, too, much earlier, to your point, than maybe it used to.

    1859
    05:04:53.600 –> 05:04:56.729
    Valerie Osband Mahoney: And I think it… I do think it always mattered, but I think people’s…

    1860
    05:04:56.730 –> 05:05:15.479
    Valerie Osband Mahoney: bias or personal ego would go, well, we won because we are the smartest people in the room, and we have the best ever technology. You know, it’s like, it’s a nicer way to say that than to be like, I’m really good at sales. I don’t know the reasons those are so important. I’m an engineer by background, I think I see Madame shaking his head. It’s really easy for engineers to be like, I just won because I’m so good at coding.

    1861
    05:05:15.500 –> 05:05:25.320
    Valerie Osband Mahoney: And now that’s kind of ripped away, you’re like, oh wait, a lot of people are… Claude’s really good, you know, for a lot of people, so wait, you’ve actually got to have to own into… I’m actually quite good at sales as well, you know?

    1862
    05:05:28.090 –> 05:05:33.200
    Kathleen Estreich: Jacob, I saw you laughing. I feel like you have maybe a good anecdote here.

    1863
    05:05:33.610 –> 05:05:46.979
    Jacob Effron: No, no, I mean, obviously a lot of that, I think really resonates, and I feel like, you know, look, the reality is, in the early days of these businesses, I think getting to scale and distribution is so important, and usually.

    1864
    05:05:47.120 –> 05:05:58.400
    Jacob Effron: you know, what I look for is, okay, when this business is at scale, is there an interesting way to, you know, to build some products on top of that, or defensibility on top of that? And I think

    1865
    05:05:58.400 –> 05:06:19.619
    Jacob Effron: at the stage where we look, it’s never… you’re never gonna know exactly what that is, but it’s kind of at least some ideas of where you might go. And obviously, these businesses, it’s AI, everything changes every month, like, these unfold quite quickly, but the feeling that, God, if this business really can do what they say they’re gonna do over the next few years, that leaves themselves… that leaves them in a really interesting position to go build this out further.

    1866
    05:06:19.770 –> 05:06:27.469
    Jacob Effron: But I don’t think it’s ever, like, a perfect plan for 5 years that is then executed exactly to that plan, at least in my experience.

    1867
    05:06:30.580 –> 05:06:38.010
    jeremyk: I would just add, it’s so interesting how rare true data network effects are. Like, when we look at our companies.

    1868
    05:06:38.190 –> 05:06:42.950
    jeremyk: you know, truly, our only AI companies that have this is…

    1869
    05:06:43.100 –> 05:07:01.600
    jeremyk: Forder and SoCure, which are identity and fraud prevention, and that’s true machine learning. That’s not foundation model-based reasoning. And it’s just interesting that those were businesses we invested in in 2016 and 2017, and everyone talks about data network effects, and…

    1870
    05:07:01.750 –> 05:07:06.010
    jeremyk: I can’t honestly say, in the LLM era.

    1871
    05:07:06.710 –> 05:07:15.500
    jeremyk: who I feel like we have good examples of that. The two canonical examples I would cite are almost pre-LLM. So yeah.

    1872
    05:07:16.110 –> 05:07:19.579
    Kathleen Estreich: Does anyone have an example of one in the last two years?

    1873
    05:07:21.670 –> 05:07:33.890
    Madhan Arumugam Ramakrishnan: I mean, I would say, you know, just as the data hyperscaler that Snowflake is, you know, we… we have seen several data clouds, even, not just data networks.

    1874
    05:07:34.020 –> 05:07:44.450
    Madhan Arumugam Ramakrishnan: You know, if you think financial services, you know, Snowflake is the distribution standard for a lot of the regulatory data, right? You know, companies close their book with us.

    1875
    05:07:44.560 –> 05:07:51.510
    Madhan Arumugam Ramakrishnan: And then monetize, right? Especially downstream distribution. But, you know, I don’t know if…

    1876
    05:07:51.760 –> 05:07:57.109
    Madhan Arumugam Ramakrishnan: you know, AI made it faster or easier, especially with the scale and speed of diffusion.

    1877
    05:07:57.250 –> 05:08:14.930
    Madhan Arumugam Ramakrishnan: But what strikes me is, the ability to kind of tap into that, right? Like, basically, you know, when you’re thinking about, you know, differentiating or creating differenceability, how well are you leveraging the ecosystem to create the network effects, right? So it’s not just your product, as you said.

    1878
    05:08:14.930 –> 05:08:22.739
    Madhan Arumugam Ramakrishnan: It is, of course, the distribution, which is the monetization potential, but are we, you know, creating friends who can sell the product where we are not?

    1879
    05:08:22.840 –> 05:08:41.339
    Madhan Arumugam Ramakrishnan: present. I think that’s key, so I… in an intentional ecosystem play, I think about a company like Glean, you know, how embedded they are with all the systems of record, you know, and being able to kind of go create, you know, network effects, you know, so, you know, when you use Glean.

    1880
    05:08:41.340 –> 05:08:51.730
    Madhan Arumugam Ramakrishnan: you’ll immediately feel the sense that, like, oh, this is connected to pretty much everything in my enterprise, you know, specific to me. And so I think the importance of creating ecosystems

    1881
    05:08:51.860 –> 05:08:57.959
    Madhan Arumugam Ramakrishnan: as a foundation for creating data networks, or any sort of network effects, I think is even more key.

    1882
    05:08:58.960 –> 05:09:03.550
    Kathleen Estreich: How do you think about ecosystem as a moat? Like, how defensible is ecosystem as a moat?

    1883
    05:09:07.070 –> 05:09:07.980
    Kathleen Estreich: Anyone… I mean…

    1884
    05:09:07.980 –> 05:09:17.670
    Madhan Arumugam Ramakrishnan: it depends on how you pick it, right? Like, you gotta, you know, both look at, you know, disruptors and incumbents. There are obviously…

    1885
    05:09:17.670 –> 05:09:30.359
    Madhan Arumugam Ramakrishnan: especially in enterprise, you know, for each of the verticals, there are kind of the big, massive systems of record. You look at market share, you’ll pretty much know, there are top 3 players, and then there are a huge amount of disruptors coming.

    1886
    05:09:30.360 –> 05:09:43.070
    Madhan Arumugam Ramakrishnan: you have to, you know, you have to kind of entertain both. And, you know, you have to be good in picking winners, and, you know, if you’re not picking winners, you know, be able to adapt faster and, you know, create network effects.

    1887
    05:09:44.610 –> 05:09:59.149
    Kathleen Estreich: Has anyone seen a startup, maybe in the last 2 to 3 years, who’s effectively… Like, I guess Glean is maybe the last 5 years? Maybe my time horizons are off, but anyone maybe in the last 2 years, where they…

    1888
    05:09:59.400 –> 05:10:03.880
    Kathleen Estreich: A company has effectively leveraged the ecosystem for…

    1889
    05:10:03.990 –> 05:10:12.830
    Kathleen Estreich: You know, driving distribution and emote around them? Like Are there any good examples?

    1890
    05:10:13.410 –> 05:10:16.260
    jeremyk: I mean, there’s a few businesses, maybe…

    1891
    05:10:16.430 –> 05:10:20.049
    jeremyk: Case Text, which was acquired in Intercom.

    1892
    05:10:20.080 –> 05:10:42.479
    jeremyk: like, if you peel back the layers of the grapefruit, or sorry, the onion, like, what have these businesses done? I mean, in the case of Intercom, they had some existing customers on the old Help Desk product. So it’s interesting to see you build a new customer support agent, and suddenly you have some base of customers to sell them that product. I mean.

    1893
    05:10:42.480 –> 05:10:49.179
    jeremyk: Thylon, I think, is actually trying to do the same thing, kind of ground up. It’s combining the help desk

    1894
    05:10:49.180 –> 05:10:52.620
    jeremyk: with the agent. But I feel like Intercom was interesting because

    1895
    05:10:52.620 –> 05:10:57.979
    jeremyk: They had an old base, at least to leverage. Case text, you know.

    1896
    05:10:58.110 –> 05:11:10.919
    jeremyk: that was a business that was ultimately acquired, but they had thousands of pre-existing customers using the old world product. So maybe those are two… we had thousands of native captive customers to go sell.

    1897
    05:11:11.250 –> 05:11:32.319
    Kathleen Estreich: Yeah, they kind of disrupted themselves, versus someone coming in and being the AI native, which is a risky move, right? It’s like, are you cannibalizing your old product to move to the new one? Like, how do you make that decision? But I think there’s a subset of companies who haven’t made that decision and that pivot as quickly, and I think those companies are just…

    1898
    05:11:32.390 –> 05:11:43.730
    Kathleen Estreich: kind of, you know, in a really tough position if they don’t make that call in the next 6 months, because they’re just gonna get disrupted by this next maze of companies. And so, I think it’s,

    1899
    05:11:43.730 –> 05:11:59.710
    Kathleen Estreich: we’re kind of on the precipice of, I think, a set of companies started in, like, you know, the 2021-2022 era, who now these companies who have been started in 2024-ish are… can kind of leapfrog them, which is, you know, such a risk, so…

    1900
    05:11:59.710 –> 05:12:05.639
    Kathleen Estreich: Is anyone else seeing that? Am I the only one? I’m seeing some heads nodding. Anyone else have anything to add to that?

    1901
    05:12:10.480 –> 05:12:13.100
    Kathleen Estreich: Nothing. Okay.

    1902
    05:12:14.660 –> 05:12:24.399
    Jacob Effron: I was gonna say, I mean, I think the fun thing about AI is that it all changes so fast that I’m sure there’ll be a wave of companies started in a year or two that are doing things differently than the companies that were started a year or two ago, right?

    1903
    05:12:24.400 –> 05:12:35.240
    Jacob Effron: I mean, even look to voice, right? Probably two years ago, a far less dominant modality than it is today, and there’s probably all sorts of folks, starting voice-first takes on some of these spaces today.

    1904
    05:12:36.400 –> 05:12:53.400
    Kathleen Estreich: On that front, like, how… are there any new, sort of, shifts like that? You know, voice, obviously, is a new sort of paradigm. Any new, sort of, platform shifts that… that you’re seeing that you’re kind of excited about, that folks are… are…

    1905
    05:12:53.400 –> 05:13:00.960
    Kathleen Estreich: you know, building technology on, that AI has sort of opened up new greenfield opportunity. I mean, I think there’s… a lot of us are probably seeing the fact that

    1906
    05:13:00.960 –> 05:13:19.599
    Kathleen Estreich: A lot of companies are now able to go into the enterprise and capture not just the software spend, but the services spend as well, and that’s, like, it creates an interesting opportunity to kind of redefine how technology is bought, within an enterprise, but curious if there are other kind of macro shifts that you’re seeing where you think.

    1907
    05:13:19.730 –> 05:13:37.850
    Kathleen Estreich: in the next 6 to 12 months, you’re excited to see some companies kind of take advantage of that opportunity. Because you could argue, like, the voice stuff actually probably came because of the, you know, the move to mobile and everyone being, you know, everyone having a smartphone and basically a computer in your pocket, such that, you know, there’s a lot more that you could do. So that shift

    1908
    05:13:38.020 –> 05:13:49.999
    Kathleen Estreich: you know, plus AI able to take advantage of that, but curious if there are other sort of major platform shifts or behaviors that you’re seeing that you think are going to kind of blow up in the next 6 to 12 months.

    1909
    05:13:51.550 –> 05:14:00.030
    Valerie Osband Mahoney: I think for us, what I’m excited about, I think, is in these sort of quote-unquote legacy industries, but I would say they’re typically non-software industries. I actually think

    1910
    05:14:00.500 –> 05:14:16.010
    Valerie Osband Mahoney: why, you know, they… I would… I like an anecdote, they all pretty much came online and sort of got software, and then nothing really was actually that 10x better for them. They’re like, you know what? Am I really gonna go rewrite this or do all this stuff again? Like, I’m online, I don’t even want to deal with this. And I think that’s why, and they’re like.

    1911
    05:14:16.290 –> 05:14:36.549
    Valerie Osband Mahoney: mobile app era, cloud era, maybe they went on cloud, but, like, 2014 to 2022, like, nothing really changed that much in manufacturing or logistics or anything like that, in terms of, really, the technology platforms they use. And I think AI is really unlocking that sort of proof points of, hey, we can really be so much better, we can really, like, change what you’re doing, and really

    1912
    05:14:36.550 –> 05:14:42.349
    Valerie Osband Mahoney: it’s almost like they’ve sort of skipped a wave by, like, going out of it, and are now kind of going to go adopt that. So I’m very bullish on those.

    1913
    05:14:42.590 –> 05:14:53.379
    Valerie Osband Mahoney: Industries, like, finally be able to, like, actually have the, like, software spend and associated company value created in those industries that we haven’t really seen in the last 10 years that much.

    1914
    05:14:55.070 –> 05:15:10.830
    Jacob Effron: Yeah, I mean, to build on Valerie’s point, I think, you know, in addition to finally being able to provide the 10x value that some of these industries haven’t had, you can also just, you know, you don’t need people necessarily to adopt a new piece of technology, right? You can be like, great, you know, give us your emails or texts or whatever you’re using, and we can sit on top of that. And so I think

    1915
    05:15:10.830 –> 05:15:32.260
    Jacob Effron: the implementations are also a lot smoother. And one thing I’m super excited about, it might not be 6-12 months, but I do think there’s a lot of really interesting stuff happening in the robotics space right now, and it feels like that space is really starting to get a ton of interesting progress. I mean, we’ve all seen the demo videos of, like, a hand moving, like, a grapefruit onto a plate or something, but, like, I really do think

    1916
    05:15:32.260 –> 05:15:39.450
    Jacob Effron: you know, it feels like we're starting to move from demo videos to real-world use cases, which is really exciting.

    1917
    05:15:39.960 –> 05:15:42.870
    Kathleen Estreich: What are you most excited for a robot to be able to do?

    1918
    05:15:43.170 –> 05:15:54.070
    Jacob Effron: Oh, like, ugh, so many things. Definitely laundry, though. I mean, it’s funny, because everyone uses that as the demo, but it deeply resonates. Now that I’ve seen some of the demos actually work, it makes it that much more painful each time I have to do it.

    1919
    05:15:56.740 –> 05:16:07.069
    jeremyk: I thought the Perceptron model that came out today was very interesting. I mean, talk about a foundation model, you can now query the foundation model and ask

    1920
    05:16:07.070 –> 05:16:26.840
    jeremyk: you know, I mean, it’s recognizing sequences of actions in, you know, physical world space, which is not something you’ve been able to do before. I mean, you can ask it, you know, what time did the player throw the basketball? I think that was what the demo was showing, and yeah, that’s a big innovation on the computer vision and action side of things.
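
    [Editor's note: a minimal sketch of the kind of query interface described here, where a video foundation model is asked when an action happened in a clip. The class, method, and event names below are illustrative assumptions, not the actual Perceptron API.]

        # Hypothetical sketch: querying a video-action foundation model.
        # In a real system the ActionEvent records would come from the
        # model's analysis of the clip; here they are hard-coded.
        from dataclasses import dataclass

        @dataclass
        class ActionEvent:
            label: str          # e.g. "player throws basketball"
            timestamp_s: float  # seconds into the clip

        class VideoActionModel:
            """Stand-in for a model that recognizes action sequences."""
            def __init__(self, events: list[ActionEvent]):
                self._events = events

            def query(self, action: str) -> list[float]:
                """Return timestamps where the described action occurs."""
                return [e.timestamp_s for e in self._events
                        if action in e.label]

        # "What time did the player throw the basketball?"
        clip = VideoActionModel([ActionEvent("player dribbles", 3.1),
                                 ActionEvent("player throws basketball", 12.4)])
        print(clip.query("throws basketball"))  # -> [12.4]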

    1921
    05:16:28.300 –> 05:16:39.959
    Madhan Arumugam Ramakrishnan: Yeah, I would say a couple of things from my point of view. You know, physical AI, as many called out here, that's interesting; it has really interesting applications that we weren't able to do at the scale and

    1922
    05:16:39.960 –> 05:16:51.080
    Madhan Arumugam Ramakrishnan: accuracy as before. But, you know, the first thing that's interesting to me is just the access to huge corpuses of unstructured data

    1923
    05:16:51.170 –> 05:16:56.690
    Madhan Arumugam Ramakrishnan: that most enterprises, or many companies, create. You know, we all create a whole bunch of

    1924
    05:16:56.700 –> 05:17:16.420
    Madhan Arumugam Ramakrishnan: data as we activate our online presence, that can now suddenly get accessed by AI without a whole bunch of data management, and create super hyper-personalization, right, on what we want to do. So that's really interesting. I mean, I almost feel like

    1925
    05:17:16.440 –> 05:17:29.350
    Madhan Arumugam Ramakrishnan: 90% of the world's data that was not accessible has come online, thanks to AI. So that's gonna unlock a whole bunch of distribution models and mechanisms. The second thing that I'm seeing is

    1926
    05:17:29.790 –> 05:17:46.859
    Madhan Arumugam Ramakrishnan: this business model disruption, where per-seat-based businesses are now getting into, hey, it needs to be consumption-based. You know, real margin compression and pressure to create value, to justify it, even in the era of, you know, paying for

    1927
    05:17:46.860 –> 05:18:02.930
    Madhan Arumugam Ramakrishnan: your Perplexity license or whatever at a per-seat model. So that is an interesting change that's happening, especially as companies are trying to really differentiate themselves from others. There's a bunch of business model bravery that I'm starting to see happen. Those would be the two that I would call out.
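
    [Editor's note: a toy sketch of the first point above, that corpuses of unstructured text can now be made queryable with very little data management. The hash-based embed() below is a stand-in; a real system would call an embedding model instead.]

        # Toy retrieval over unstructured text: embeddings + cosine similarity.
        import hashlib
        import numpy as np

        def embed(text: str, dim: int = 64) -> np.ndarray:
            """Toy bag-of-words embedding: hash each token into a bucket."""
            v = np.zeros(dim)
            for tok in text.lower().split():
                v[int(hashlib.md5(tok.encode()).hexdigest(), 16) % dim] += 1.0
            n = np.linalg.norm(v)
            return v / n if n else v

        docs = ["ticket: customer cannot log in, login page times out",
                "meeting notes: Q3 roadmap planning",
                "email: vendor invoice for office supplies"]
        doc_vecs = np.stack([embed(d) for d in docs])

        query = "customer login problems"
        scores = doc_vecs @ embed(query)     # cosine similarity (unit vectors)
        print(docs[int(np.argmax(scores))])  # expected: the login ticket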

    1928
    05:18:03.320 –> 05:18:24.310
    Kathleen Estreich: I like that phrase, business model bravery. I might steal that. That was gonna be my next set of questions, around pricing. It's changing quite a bit at this point: the way that people value different products, how people price things, from per-seat to usage to outcome-based, to services

    1929
    05:18:24.600 –> 05:18:40.109
    Kathleen Estreich: plus software spend. When you're talking to companies, how do you make sure it doesn't just become a race to the bottom? How do you think about pricing as a lever for these businesses and

    1930
    05:18:40.130 –> 05:18:53.400
    Kathleen Estreich: these unique business models? And how are you thinking about that when you're evaluating companies: is pricing something defensible, or, if it works, will other people just copy it? How do you think of pricing as that lever

    1931
    05:18:53.490 –> 05:19:00.809
    Kathleen Estreich: in the way that you're evaluating companies, and in ensuring that the value they're delivering, they're actually capturing?

    1932
    05:19:05.010 –> 05:19:10.610
    Valerie Osband Mahoney: I mean, I think we're seeing a shift (we've actually recently done some work on pricing here), like, a shift

    1933
    05:19:10.670 –> 05:19:18.259
    Valerie Osband Mahoney: more to the kind of outcome-based pricing that you're seeing. I think most people want that more, especially as you drive toward tokens, and even

    1934
    05:19:18.300 –> 05:19:24.380
    Valerie Osband Mahoney: you as an actual business making sure you make enough money on top of the margin of what it's creating. So kind of driving

    1935
    05:19:24.380 –> 05:19:43.069
    Valerie Osband Mahoney: a kind of… yeah, I think we're seeing that hybrid: kind of per-seat, but also per-outcome, and usually a lower per-seat versus fixed. So maybe that reinvents what ARR means here, when you're not having everything all fixed and up front, and it's just going to be X ACV. But I think that's been the approach that's resonated most with

    1936
    05:19:43.620 –> 05:19:51.770
    Valerie Osband Mahoney: the folks that are buying this software right now. And that especially goes into the shift of the services, where people are taking some of that services-based revenue:

    1937
    05:19:51.990 –> 05:19:56.969
    Valerie Osband Mahoney: you're effectively folding some of that services-based revenue into your model as well.
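
    [Editor's note: a minimal sketch of the hybrid per-seat-plus-outcome pricing described above, with a margin check against model (token) costs. All fees, rates, and numbers are illustrative assumptions.]

        # Hybrid pricing: a lower per-seat base fee plus an outcome-based fee,
        # checked against token costs so the vendor keeps a healthy margin.
        def monthly_invoice(seats: int, outcomes: int, tokens_used: int,
                            seat_fee: float = 20.0,     # lower than fixed-seat SaaS
                            outcome_fee: float = 2.50,  # per resolved outcome
                            token_cost_per_1k: float = 0.01,
                            min_margin: float = 0.60) -> dict:
            revenue = seats * seat_fee + outcomes * outcome_fee
            cost = tokens_used / 1000 * token_cost_per_1k
            margin = (revenue - cost) / revenue if revenue else 0.0
            return {"revenue": revenue, "model_cost": cost,
                    "margin": round(margin, 3), "margin_ok": margin >= min_margin}

        # 10 seats, 400 resolved outcomes, 30M tokens consumed in the month
        print(monthly_invoice(seats=10, outcomes=400, tokens_used=30_000_000))
        # -> {'revenue': 1200.0, 'model_cost': 300.0, 'margin': 0.75, 'margin_ok': True}

    One design note: because revenue now floats with outcomes, a contract's "ARR" becomes more of a forecast than a guarantee, which is the reinvention of ARR alluded to above.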

    1938
    05:20:01.090 –> 05:20:04.090
    Kathleen Estreich: Craziest pricing model you’ve seen, if any.

    1939
    05:20:05.600 –> 05:20:22.409
    Madhan Arumugam Ramakrishnan: I mean, to reflect back on the business model bravery trend that you're talking about, you have to lean on, you know, forward-deployed engineers, right? I mean, Palantir started it pre-Gen-AI, but the idea behind that is

    1940
    05:20:22.570 –> 05:20:34.720
    Madhan Arumugam Ramakrishnan: something that required strategic patience: deploying engineers, effectively part of your product team, for a couple of years to completely build custom software with an enterprise,

    1941
    05:20:34.740 –> 05:20:52.090
    Madhan Arumugam Ramakrishnan: and then, you know, not thinking about repeatability, right? It kind of defies all the logic you would apply to scale, but now that is becoming such a pattern, right? Snowflake is obviously doing this as well, but every model company is doing this because of

    1942
    05:20:52.190 –> 05:21:04.220
    Madhan Arumugam Ramakrishnan: the high cost of failure, frankly, from a customer point of view, and the ROI expectations, right? Like, a lot of people are saying, hey, what is the ROI? You know, I know this is a new normal.

    1943
    05:21:04.230 –> 05:21:15.729
    Madhan Arumugam Ramakrishnan: For me, the interesting thing there, and why this won't be a race to the bottom, is that there is innovation happening so fast that

    1944
    05:21:15.820 –> 05:21:27.939
    Madhan Arumugam Ramakrishnan: I think a lot of the enterprises have a hard time just keeping up, and they want real ROI. So when the people creating IP or margin are able to actually go deliver

    1945
    05:21:27.940 –> 05:21:36.210
    Madhan Arumugam Ramakrishnan: closer to customers, it gives them a bit better ROI, but also more trust that, hey, this is not some money that I'm spending on

    1946
    05:21:36.210 –> 05:21:43.879
    Madhan Arumugam Ramakrishnan: whatever the business model, without any defined outcomes. There is basically more skin in the game from the people creating margin.

    1947
    05:21:46.470 –> 05:22:04.919
    Kathleen Estreich: Yeah, the number of pitches that I've seen where, how are you going to go to market? Well, we're going to do the forward-deployed engineer model. And I'm like, what does that mean to you? Because I think people misuse it and use it in many different contexts. I have also spoken to some people from Palantir,

    1948
    05:22:04.920 –> 05:22:23.449
    Kathleen Estreich: some founders from Palantir who were pretty early, and I was like, why is everyone now becoming a forward-deployed engineer company? And they were like, well, I think a lot of them missed the point that, yes, they went and built this custom software, but they had a very specific strategy where anything that they built was then fed back into the core product. And so, I think that only scales

    1949
    05:22:23.450 –> 05:22:34.790
    Kathleen Estreich: if you then have that feedback loop back in; otherwise, you know, you're kind of a glorified consulting firm, or I guess a more efficient consulting firm.

    1950
    05:22:34.990 –> 05:22:36.509
    Kathleen Estreich: But, yeah,

    1951
    05:22:36.810 –> 05:22:50.930
    Kathleen Estreich: that's my take, I guess, on what I've seen there. Maybe in the last couple of minutes that we have together, it'd be interesting to hear your

    1952
    05:22:51.570 –> 05:22:57.430
    Kathleen Estreich: predictions on what the next 12 months look like, from

    1953
    05:22:57.430 –> 05:23:16.760
    Kathleen Estreich: you know, the startup ecosystem versus these foundation models. They've obviously gone into a lot of these horizontal use cases; any other horizontal use cases you think they might enter in the next 6 to 12 months? And then the opportunities that you see in some of these more vertical areas, across

    1954
    05:23:16.760 –> 05:23:18.760
    Kathleen Estreich: the different ecosystems.

    1955
    05:23:22.780 –> 05:23:33.550
    jeremyk: I've been waiting for somebody to do a good job automating Excel work. It was cool to see the Anthropic demo. Clearly a bunch of

    1956
    05:23:33.550 –> 05:23:46.069
    jeremyk: you know, newer startups are being funded to work on this problem. You know, I’ve been trying the kind of PLG-style ones that let you tinker around, and I honestly haven’t…

    1957
    05:23:46.560 –> 05:24:06.449
    jeremyk: I haven't found the ones that I've tried super useful for my use cases. But when you think about the highest-value buckets of knowledge work, allowing any business person to use a spreadsheet seems like a high-value task, and I haven't fully felt like I've

    1958
    05:24:06.500 –> 05:24:09.989
    jeremyk: found something that’s good for me yet, but that’s something I’m watching.

    1959
    05:24:10.310 –> 05:24:26.060
    Kathleen Estreich: Here's the question on that: does that live within Excel, because Excel is so entrenched, or does it live somewhere else? Does it live in, like, a new form of Excel, or does it live within Excel with just a layer on top? I think that's my big open question there.

    1960
    05:24:26.810 –> 05:24:36.900
    jeremyk: I have the same question. I've seen three startups with three different opinions. One has a GUI on top of Excel, one's just in Excel.

    1961
    05:24:37.090 –> 05:24:53.459
    jeremyk: It's interesting, right? I mean, if you make the analogy to law, there's a whole bunch of these contract editing tools that live on top of Word. They didn't actually build the UI. And in the lawyer case, people liked that it lived on top of Word,

    1962
    05:24:54.270 –> 05:25:00.569
    jeremyk: but in the lawyer case, there was no general tool that could do it. The Excel case is a little bit different.
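
    [Editor's note: a rough sketch of the "layer on top of Excel" approach being compared here: read the user's existing workbook and hand it to a model, rather than asking them to adopt a new tool. The openpyxl calls are real; ask_model() and the file contents are hypothetical stand-ins.]

        # Sit on top of Excel: read the workbook as-is, prompt a model with it.
        from openpyxl import Workbook, load_workbook

        # Build a tiny demo workbook so the sketch is self-contained.
        wb = Workbook()
        for row in [["deal", "stage", "close_quarter"],
                    ["Acme", "negotiation", "Q4"],
                    ["Globex", "closed won", "Q3"]]:
            wb.active.append(row)
        wb.save("pipeline_demo.xlsx")

        def sheet_to_text(path: str) -> str:
            """Flatten the active worksheet into tab-separated text for a prompt."""
            ws = load_workbook(path, read_only=True).active
            return "\n".join("\t".join("" if c is None else str(c) for c in r)
                             for r in ws.iter_rows(values_only=True))

        def ask_model(prompt: str) -> str:
            # Stand-in for whatever LLM client the product uses; swap in a
            # real API call here. A canned reply keeps the sketch runnable.
            return "(model answer would appear here)"

        answer = ask_model("Given this spreadsheet:\n"
                           + sheet_to_text("pipeline_demo.xlsx")
                           + "\nWhich deals slipped to Q4?")

    The design question above is then just where this loop runs: inside Excel as an add-in, or in a separate app that treats the .xlsx file as the system of record.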

    1963
    05:25:05.450 –> 05:25:10.910
    Madhan Arumugam Ramakrishnan: I mean, I would say, from my point of view, what I’m looking forward to seeing is

    1964
    05:25:11.220 –> 05:25:17.940
    Madhan Arumugam Ramakrishnan: you know, in healthcare, right? There's a lot of physician burnout, and that problem's not going away. We

    1965
    05:25:18.380 –> 05:25:26.310
    Madhan Arumugam Ramakrishnan: have fewer physicians year over year, and we have an aging population, at least in this country, and certainly globally.

    1966
    05:25:26.360 –> 05:25:43.350
    Madhan Arumugam Ramakrishnan: It seems like a place where, even if we can't eliminate all of the inefficiencies, we can at least improve them and make incremental change. I think this is an amazing problem to target with AI, and I know a lot of the healthcare

    1967
    05:25:43.350 –> 05:25:48.200
    Madhan Arumugam Ramakrishnan: vertical AI companies are chipping at that, you know, whether it is

    1968
    05:25:48.340 –> 05:25:58.730
    Madhan Arumugam Ramakrishnan: you know, transcripting, you know, autonomously, you know, what the doctor-patient thinks, within the boundary of, you know, regulatory governance. But I really feel like that’s…

    1969
    05:25:58.840 –> 05:26:01.759
    Madhan Arumugam Ramakrishnan: That’ll be beneficial, you know, at least in the next year or so.

    1970
    05:26:05.700 –> 05:26:08.780
    Jacob Effron: Yeah, I think, you know… Art, go for it, please.

    1971
    05:26:11.130 –> 05:26:28.279
    Jacob Effron: All right, I'll go. I think finance and accounting seems like a category that's pretty ripe for this next wave of models, and then also go-to-market. I'm not sure anyone has quite cracked solutions there, but as the models keep getting better, it's certainly a really interesting area.

    1972
    05:26:28.660 –> 05:26:46.430
    Kathleen Estreich: I think I've seen, like, 100-plus AI marketing products in the last 2 years. It is interesting on that front: there's a lot more tooling out there, but I think they're all gonna converge, so the question is, who's gonna be the winner?

    1973
    05:26:47.930 –> 05:27:04.260
    Valerie Osband Mahoney: Yeah, on my side, I'll say: what I hope to see is… you know, we invest across all industries, including consumer, and I really want to see people coming in with new paradigms of how we should interact and work with technology.

    1974
    05:27:04.290 –> 05:27:23.160
    Valerie Osband Mahoney: And that’s, I think, where the younger founders have an advantage, because they’ve effectively grown up with AI. So they’re looking at the world from this sort of AI-first principle of, like, okay, well, I have all this thing, how am I going to build this? Where sometimes, kind of, more entrenched or experienced founders will kind of look at it just like the marginal improvement from where we are already. So I think I really want to see things that hopefully are

    1975
    05:27:23.160 –> 05:27:27.490
    Valerie Osband Mahoney: actually a real step change in how we use technology, versus just

    1976
    05:27:27.800 –> 05:27:30.589
    Valerie Osband Mahoney: a little bit faster way of doing what we already do.

    1977
    05:27:30.870 –> 05:27:48.129
    Kathleen Estreich: Yeah, I agree with that, Valerie: the incremental versus the step change. I'm excited, hopefully in the next 6-12 months, to see more of that. I think that concludes our panel. Thank you all, very insightful. I'll turn it over to Julia to close it out.

    1978
    05:27:48.640 –> 05:28:01.189
    Julia Nimchinski: Phenomenal panel, thank you so much. We actually have one question from the community, which is super interesting, and I'm curious to hear your thoughts: what blind spots would you say today's AI-native startups can't see?

    1979
    05:28:03.560 –> 05:28:05.219
    Julia Nimchinski: Kathleen, let’s start with you.

    1980
    05:28:05.220 –> 05:28:16.650
    Kathleen Estreich: Oh, what can’t they see? I mean, I don’t know, startups can’t see a lot of things because they’re so early and moving so fast. I think…

    1981
    05:28:16.820 –> 05:28:20.780
    Kathleen Estreich: That’s a good question. I think generally they can’t see…

    1982
    05:28:21.250 –> 05:28:29.429
    Kathleen Estreich: I think everything is just moving very quickly, and so sometimes it’s like, you feel like you get there, and then things change, and so…

    1983
    05:28:29.510 –> 05:28:46.680
    Kathleen Estreich: I think they can’t see what the future teams look like. I think the way that companies are being built right now are pretty different than they were even 6 to 12 months ago, and so I think we’re still in the early days of, like, who is the team that you need? And I think some of these functions that used to be

    1984
    05:28:46.680 –> 05:28:52.020
    Kathleen Estreich: kind of, more standalone roles are actually getting more,

    1985
    05:28:52.220 –> 05:29:17.170
    Kathleen Estreich: integrated, so on the go-to-market side, it’s like you had sales, marketing, and customer success as pretty distinct teams, and I think going forward, you might have more generalist people who can actually use the tooling and coordinate across all those things, because it never actually made sense to me that you had all these different teams with different tools and systems when it was one record of people, across that. So, I think maybe the way that they build their company

    1986
    05:29:17.170 –> 05:29:22.139
    Kathleen Estreich: internally is actually gonna probably change, and I’m not sure that we know what that’s actually gonna look like.

    1987
    05:29:23.650 –> 05:29:31.330
    Madhan Arumugam Ramakrishnan: Yeah, I would agree with that. In fact, Kathleen, I think the Clay folks called it the GTM engineer of the future, right? Which is a blurring of

    1988
    05:29:31.350 –> 05:29:43.149
    Madhan Arumugam Ramakrishnan: the various functions, and it’s also a new profile of people coming out of school these days, right? Like, very savvy, you can call them AI, you know, bilingual in some ways.

    1989
    05:29:43.150 –> 05:29:55.819
    Madhan Arumugam Ramakrishnan: And, you know, they don’t process information in the same way that perhaps, you know, many of us in the panel do, so I think, you know, just thinking about the workforce of the future, I think, is…

    1990
    05:29:55.820 –> 05:30:01.150
    Madhan Arumugam Ramakrishnan: Not just from a go-to-market point of view, but across the company that you want to build is going to be very interesting.

    1991
    05:30:05.780 –> 05:30:09.510
    Julia Nimchinski: Valerie, Jacob, Jeremy, what are your thoughts?

    1992
    05:30:14.210 –> 05:30:17.289
    Jacob Effron: Yeah, the only thing I’d add is I think the,

    1993
    05:30:17.560 –> 05:30:25.730
    Jacob Effron: I think the reality is that everybody has blind spots in this current wave. Things are just moving so fast, and so, you know, I don’t think…

    1994
    05:30:25.730 –> 05:30:45.910
    Jacob Effron: anyone quite knows what the model capabilities will be in 6, 12 months, what the org of the future is, and so, you know, there’s… given that there’s not an answer out there, I think the ability to just be really adaptable and move quickly and change things as new information comes up, but the fact that there are blind spots is certainly not, not a problem. I think we’re all operating with, with them to some extent.

    1995
    05:30:47.650 –> 05:30:57.810
    Valerie Osband Mahoney: Yeah, that's what I encourage people: there will always be blind spots. I mean, yeah, be on top of stuff, understand where your market is. I think now,

    1996
    05:30:57.990 –> 05:31:05.219
    Valerie Osband Mahoney: you know, sometimes you go, like, oh, go head to town, I don’t really care what other people are building. I think, actually, in this market of building people building so quickly, it is actually good to

    1997
    05:31:05.350 –> 05:31:23.610
    Valerie Osband Mahoney: really understand where other people are at in that build cycle. I’ve talked to some companies, they don’t know half the companies in this space, and so I think that’s helpful when you’re thinking about go-to-market, and just, like, again, you can use AI to help you be up to speed with what’s happening, but yeah, know that there’s blindfold, that’s not a negative thing, just, like, be a scholar, you know, of your industry.

    1998
    05:31:27.220 –> 05:31:35.139
    jeremyk: Just be able to adapt to the foundation models. I think we're all watching what they can do, and the speed at which they can get better; it's just so interesting.

    1999
    05:31:36.570 –> 05:31:51.570
    Julia Nimchinski: Thank you so much again, and that wraps up our AGENTIC Distribution Summit. Thank you to the incredible speakers, panelists, demo leaders, and all of you watching, and we'll be back on October 23rd with AI practice sessions.

    2000
    05:31:51.570 –> 05:32:01.130
    Julia Nimchinski: You can actually book one-on-one sessions, just consultation and, Mentorship programs on our marketplace.

    2001
    05:32:01.160 –> 05:32:11.919
    Julia Nimchinski: Unfortunately not with all of the panelists that you are currently seeing on the show, but very hopeful that you’ll create a profile, and yeah, see you soon. Thank you so much!
