What does it really take to build, ship, and scale AI fast? Today on Next Gen Builders, Francois is joined by Daniel Sternberg, former Head of Data at Notion and Gusto, who shares lessons learned from the development of Notion AI and the evolving role of data leadership in AI-first organizations.
Daniel offers an inside look at Notion AI’s rapid development, revealing how a CEO-driven push transformed into a revenue-generating product just a month after ChatGPT launched. He highlights the importance of balancing excitement about these new AI tools with a clear understanding of their limitations. Daniel also encourages organizations to adopt AI broadly, not just within specialized teams.
Daniel & Francois also explore the evolving role of data leadership in SaaS, emphasizing why data teams need to move beyond being simple service providers. Whether you’re leading AI projects or building data-driven products, Daniel’s insights provide valuable guidance for leaders navigating the fast-changing AI landscape.
—
Guest Bio
Daniel Sternberg is a seasoned data science and AI executive. He currently advises startups as a fractional data and AI executive, helping organizations build high-performing data teams and leverage AI for product and operational excellence. Previously, he served as Head of Data at Notion, where he guided the company through a period of rapid user growth and pioneered the launch of advanced AI-powered features.
His leadership was instrumental in scaling Notion’s data infrastructure and integrating generative AI capabilities into the platform, supporting both product innovation and business expansion. His career is marked by a commitment to data-driven decision-making, technical leadership, and the thoughtful integration of AI into everyday tools.
—
Guest Quote
"There's this idea that AI features can be prioritized and shipped like any other product feature, but there's an added dimension: does it actually work? You don't know if it works until you build it. So you should never promise a ship date until you're confident in the quality. The tech has to prove itself first, and only then should you align marketing or product engineering behind it." – Daniel Sternberg
—
Time Stamps
00:00 Episode Start
02:00 The role of a SaaS data leader
07:40 How the data function will evolve
16:40 Building Notion AI
20:40 Launching with a flood of interest
22:50 Don't quit while you're ahead
24:45 The power of a CEO Founder
27:30 Prioritizing speed and flexibility
31:10 Lessons learned from the Notion AI launch
27:20 Democratizing AI responsibility
40:55 Daniel's "Oh Sh*t Moment"
—
Links
—
Transcript
0:00:00.3 Daniel Sternberg: You should push the technology to the frontier of what it's capable of, but you need to be realistic about what that frontier is. You can tell a high-level story to the company about how transformative you think this technology will be, but be clear-eyed about what its capabilities are right now.
[music]
0:00:21.8 Francois Ajenstat: This is Next Gen Builders, the show for the growth and product leaders of tomorrow. There's no doubt that AI products are tech's hottest commodity. But what does it really take to build, deliver and scale AI fast? Today we're going to go behind the scenes of Notion AI, which launched only one month after ChatGPT. We'll explore how the team went from an idea to beta in record time. We'll talk about the challenges they overcame along the way and how to turn data into a revenue engine. Joining us today to talk through all of it is Daniel Sternberg, fractional data and AI executive and advisor and former head of data at Notion and Gusto. Welcome Daniel.
0:01:15.2 Daniel Sternberg: Thanks for having me. It's fun to be here.
0:01:17.3 Francois Ajenstat: So great to have you here. I mean that is quite the title that you have right now. That's impressive.
0:01:23.0 Daniel Sternberg: Well, we'll see. It's been fun. It's been a couple months just trying to help out some companies and work on a few different types of problems in data and AI actually and some cases the merger of the two. So yeah, it's been a fun experience so far, but excited to also talk more about past experiences too.
0:01:40.9 Francois Ajenstat: I love it. Well, you've been a data leader for many, many years. What does that really mean? As a data leader, what's your main focus in an organization?
0:01:50.5 Daniel Sternberg: Throughout my career, especially at Notion and at Gusto, there were two flavors, or two hats, that I generally wore in my work. One of those, and often this was two-thirds or more of the role, since these were mostly SaaS businesses, was just helping the company use data to make decisions across the board. That's business decisions on the go-to-market side, helping structure and understand the financials of the business, and, where a lot of my time has actually been spent, supporting product analytics functions: how we design product experiences, how we measure whether they're working for our users, and the growth side within the product as well, especially at a company like Notion, which is very product-led-growth heavy. So that's one big bucket, where I probably spent roughly two-thirds of my time. And then the other area is how we can actually use data to more directly power parts of the product, or optimize the product in more automated ways.
0:03:07.7 Daniel Sternberg: And just to give a little more of an example: at Gusto, in the time I was there (now they're doing a bunch of AI stuff, because aren't we all?), a lot of that looked like automating some of the manual backend processes for risk and fraud. We'd have to onboard new companies onto the platform, and one of the first things I worked on there from a machine learning standpoint was trying to automate that process, so that most new companies signing up on Gusto could just get onboarded automatically: we could check everything out, see they were legitimate, run that through a model, and approve them. Then a smaller percentage would go to human review. Before that, every single company was going to human review. So, just speeding things up. Also things like helping optimize growth: scoring every single new lead coming in for likelihood of conversion and LTV, things like that, which we and our marketing teams could use to make things more efficient as well.
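[Editor's note: a minimal sketch of the "model-first, human-review fallback" routing pattern Daniel describes. The threshold, score semantics, and function names are illustrative assumptions, not Gusto's actual system.]

```python
# Hypothetical sketch: route new signups through a risk model so that
# clearly low-risk companies are approved automatically, and only the
# remainder is escalated to the (previously universal) human-review queue.

def route_new_company(risk_score: float, auto_approve_threshold: float = 0.05) -> str:
    """Route a new signup based on a fraud/risk model's output.

    risk_score: model-estimated probability that the company is not legitimate
    (illustrative; any calibrated risk model could produce it).
    """
    if risk_score < auto_approve_threshold:
        return "auto_approve"
    return "human_review"

# Most signups score low and skip the manual queue; only a small
# percentage is escalated, which was the point of the automation.
scores = [0.01, 0.02, 0.40, 0.03]
queue = [route_new_company(s) for s in scores]
```

The value is in the routing split itself: the model does not have to be perfect, it only has to be confident enough on the easy majority of cases to shrink the human queue.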
0:04:06.2 Daniel Sternberg: So that was what it looked like at Gusto. At Notion it was a little different, in that it was still a lot of the analytical work. There actually wasn't much machine learning happening in the product when I joined. But I happened to join at a moment, a few months in, where our AI efforts took off quite a bit, and that became a bigger focus of my role as well. It's generally been a mix of those two, and I think that split looks different depending on the nature of the business and product you're working with. Some products really need data to work at a fundamental level, and some don't, and at the latter, a head-of-data role is going to look a lot more like helping with internal decision making.
0:04:43.9 Francois Ajenstat: Well, it's interesting you say that, because when you talk to data folks, there are two kinds of people. There are those that think of data as a service center: they're internally helping the business, and they own the data governance and the data infrastructure. You sound like you're on the other side, where you're using data to power growth, using data to build better products and better outcomes. Is that deliberate, because that's what you're excited about? Or is it that the companies you've gravitated to have that focus on data?
0:05:17.8 Daniel Sternberg: Yeah, to be honest with you, in some cases. Less so at Notion, because AI was just going to transform the productivity space, and we kind of knew that within a few months of starting to play around with LLMs, especially in a product that's so text heavy. So less there, where I think that was very clear. But at Gusto, you might argue I was dragging the company kicking and screaming to some of this. Well, not kicking and screaming, that's an overstatement. But I had to push. I had to find areas where there was an exciting opportunity and really push on those areas. To be honest, nonetheless, two-thirds of my role was probably more on the internal decision-making side at both companies. There were periods where that mix looked different, but if I look across the board... I think I said this at a fireside chat last year: as I've done this type of role for a number of years, I've come to terms with the fact that at most companies, the internal decision-making part really is a service role.
0:06:22.6 Daniel Sternberg: And I distinguish that idea from whether we run our org like a service org, centralized, with tickets coming in and those sorts of things. Generally speaking, I don't. Maybe the data infrastructure side is a purely centralized function. But the data science or analytics function, and even to some extent the analytics-engineering and data-engineering functions doing the data transformation and building datasets, those tend to run in some embedded or semi-embedded way. You have people partnering directly with product teams, for example, if they're working on the product side, or working closely with specific marketing functions. But my role as a leader, at the end of the day, looks a lot like a service function, in the sense that my success depends on the opinions of all of the stakeholders I have across the organization, in a purer way than for a product leader. Obviously that's the case for them too; there are still people weighing in on how you're doing. But for them it's mostly about whether you executed on the roadmap successfully and whether the strategy made sense, versus "hey, do I have what I need to do my job?", which is really the big part of how a data leader gets evaluated.
0:07:40.2 Francois Ajenstat: Right. Do you think that this is changing in the world of AI where there is no AI without data? And so does the role of a data leader change as a result of that?
0:07:51.3 Daniel Sternberg: I think how it changes depends on the organization, but it's almost certainly going to change. I've been thinking about this a lot recently: there are actually two different historical trends that I think are influencing where data in organizations is going. One of them is AI, which you just mentioned, and I'll talk more about it in a second. The other is that a lot of the historical growth of the data function over the last 10 or 15 years, which is roughly when I was entering the industry in 2011, 2012, coincided with zero-interest-rate times. So I actually think we got a little ahead of our skis in how we ran things during that period, when the money was easy and all these organizations, in tech at least, were growing headcount like crazy. We saw these armies of analysts and data scientists scale up quite a bit during that time. And to be honest, I think data science, as an example, was hit harder than average.
0:09:03.3 Daniel Sternberg: It was hit harder during the layoff periods of 2023 and early 2024, in part because of this, because it's seen as a support function for the organization. And that's happening at the same time that these AI capabilities are growing. So on the analytics side, I distinguish between "you're going to need data for your AI features," which I'll talk about in a sec, and "how does AI transform how the internal data function works." My intuition right now is actually not that there will never be a need for analysts or any of these roles because the AI will be an oracle you just ask questions of and it will magically answer all of them. The part of the function around structuring the data and making sense of it is still very important, because AI tools can't just jump in and work with data that has not been governed, documented, and made sense of, in the same way that I can't... Yeah, go ahead.
0:10:10.4 Francois Ajenstat: Is that adding context and semantics basically to the data?
0:10:13.8 Daniel Sternberg: Exactly. Exactly. So it's adding context and semantics to the data, structuring it, helping build pipelines. I'm sure you can use AI to speed up all of those processes. But a lot of that is: okay, I have this data from some set of applications and different data sources, and how do I turn it into something that's useful from a business standpoint? That requires interacting with the business and with people. Someone has to take the messiness of a startup, where nothing is documented and it's all tribal knowledge, and turn that into something structured. That's very hard for AI to do on its own; it can probably help and augment the data engineer, or whoever is doing it. Where things do shift, though, is that the idea that we're going to build large embedded analytics functions in an organization is going to change. Not to say there won't be lots of specialized roles and places where you do need specialized talent over time, but in the rate at which you scale that, there's an opportunity to do it a couple clicks less, and a couple clicks more slowly, than we did in 2020 and 2021.
0:11:27.1 Daniel Sternberg: And candidly, I was doing it too at that point in time. But I think there's an opportunity, through getting the semantics in place and all of that, to set things up in a way where self-service in data can become more of a real thing. So that's that side of the house and how I think it changes.
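[Editor's note: an illustrative sketch of the kind of semantic layer being described here: a governed mapping from business terms to documented definitions that a self-service user or an AI tool can resolve, instead of guessing at raw tables. The metric name, SQL, and schema are invented for the example.]

```python
# Hypothetical semantic layer: each metric has a documented, governed
# definition, so "what counts as active?" is answered once by the data team
# rather than re-derived (differently) by every PM or AI assistant.

SEMANTIC_LAYER = {
    "weekly_active_workspaces": {
        "description": "Workspaces with at least one edit event in the trailing 7 days.",
        "sql": (
            "SELECT COUNT(DISTINCT workspace_id) FROM events "
            "WHERE event_type = 'edit' AND ts >= now() - INTERVAL '7 days'"
        ),
        "owner": "data-eng",
    },
}

def resolve_metric(name: str) -> dict:
    """Return the governed definition for a metric, or fail loudly.

    Forcing lookups through this layer is what gives a self-service query
    (human or AI) the context and semantics the raw tables lack.
    """
    if name not in SEMANTIC_LAYER:
        raise KeyError(f"'{name}' has no governed definition; ask the data team")
    return SEMANTIC_LAYER[name]
```

The design choice is that an ungoverned metric raises an error rather than silently improvising a definition, which is exactly the failure mode of pointing an AI tool at undocumented tables.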
0:11:45.1 Francois Ajenstat: Interesting. And when you think about the people, or the kind of work people do, do you see data scientists doing higher-end, deeper strategic work, with less of the reporting and the basics, which should be self-service? Or does the work just change from building models to putting together workflows and analyzing the results? Is it the work that's changing, or the people and the skills required?
0:12:16.8 Daniel Sternberg: Yeah. And again, I should note my caveat to all of this: this is my current working hypothesis and narrative, which will make contact with the real world, and we'll see what happens. But I think it's a little bit of both. I do think there will continue to be a need for specialized strategic work from data scientists in a bunch of areas. One of the harder areas is... well, there are a lot of things that are pretty straightforward. I've worked at a couple of SaaS businesses, and at a certain point you kind of know how they work. If I need to build the financial reporting and metrics, there are going to be nuances for a given company, but at a high level I know what core metrics I'm going to need. I can write those out for you in an hour or two for the most part, and then you'll tweak them a bit based on the specifics of the company; exactly how subscriptions run, or something like that, will differ from company to company. But products look a lot different, and products are very idiosyncratic from company to company.
0:13:14.9 Daniel Sternberg: There are classes of them, but my experience at Gusto did not help me understand Notion as a product at all; it was just very different. So you do, on some level, need people to develop expertise on that part and be able to do really strategic analytical work. At the same time, it cuts both ways. Historically, what that would mean is: I'm going to hire a data scientist or a product analyst for every product team, or every two product teams, or whatever gearing ratio you come up with. And then what you find is that they're spending a lot of their time on lots of random ad hoc cuts for PMs, and that is a very expensive use of that type of talent. So what I'm imagining happening over time is that you're going to have people who develop deep expertise in a product and business, for questions like one we had at Notion: what is the right activation metric for an early user, one that is predictive of downstream free-to-paid conversion, which may not happen for many months?
0:14:22.5 Daniel Sternberg: You're going to want an expert to look into that, look across all of these engagement metrics, think about what suites of engagement metrics we need to have, standardize and govern that over time, and build the Notion version of it, which is going to be different than at any other company. So you need someone to think through that. But then there's the other 60% to 80% of each of those people's time, where I honestly think that with the right investment on the data engineering side, plus AI functionality to help us self-serve, we can get to a place of: once the data is structured for me, I can go myself, and if they haven't thought of that cut yet, that's fine; there's a semantic layer or something that will help me figure it out, and I don't need to worry about it as much. I do think AI can help augment that part for sure.
0:15:11.2 Francois Ajenstat: Absolutely. And one of the things I've seen is actually great product managers, even great designers are the ones that understand the data because the data can actually help them understand their users and where the actual problems are in the products they're building. And interestingly, if you actually give that task to somebody else, then that understanding doesn't actually happen. Right, you need that depth.
0:15:37.5 Daniel Sternberg: Exactly. Right. And in the current world, or the world I had at Notion, which I think became en vogue from 2016 up until things went downhill a little bit, the way you get there is that they need to sit with that PM, become really tight with them, build a lot of trust over months to years, and really, really understand the product. The best people on my teams were the ones who had this combination of amazing communication skills and a product-oriented mindset, where you'd think this person could be a PM, but they're a data expert: someone who could build those really strong relationships and communicate their findings super clearly. And that created a really nice feedback loop. But again, that's a really special person. We should keep hiring those people, and we will, but my hope is that we'll take a bunch of other things off their plate.
0:16:35.8 Francois Ajenstat: Those are some great attributes. Let me switch over and talk about your last experience at Notion. You guys built Notion AI, which I think is absolutely amazing, and the adoption and the impact of that has been just breathtaking to see. Can you bring us into the early days of that? What was the genesis of it? Did it start from the data side, or did it come from somewhere else? Where did the data come in to make it happen?
0:17:06.2 Daniel Sternberg: Yeah... For sure.
0:17:08.6 Francois Ajenstat: Bring us in.
0:17:09.9 Daniel Sternberg: Yeah, for sure. So I take absolutely no credit for the very earliest days of Notion AI. I helped with some things later on, and I'll talk more about that in a sec. But Notion AI really originated with Simon, one of the co-founders of Notion, who has been a hands-on technical IC at the company for the past bunch of years (I honestly don't remember how many, because it predates me). He had been getting really interested in some of the advances we were starting to see in AI, even a little prior to ChatGPT and the big wave of everybody investing in AI. Around the summer of 2022, when I joined the company, he was playing around with various image models and other things like that. Then, through the founders at Notion, we got access to an early version of GPT-4, a preview version before it was released. And the lore (you can find videos of the founders talking about this, and it's true) is that there was a company trip to Cancun that we were all on for a few days, which was really exciting.
0:18:13.3 Daniel Sternberg: People who worked at Notion came from all over the world. During that trip, Ivan and Simon had already become really excited about what AI, and LLMs in particular, could do for a product like Notion, given how text-heavy it is; text is the core content of the product. So they actually stayed on a couple days extra, went back into founder mode, startup mode, and built a demo in a development version of Notion. They shared a video of it with the whole company when we got back, and it was really motivating: hey, we want to work on this. The next thing that happened, and the first thing I really got involved in besides having some conversations with them that week, was that we were going to borrow a few people and get a tiger team together to work on this. We needed someone from this team, someone from that team, someone from the team that built Docs, because it was going to be the generative writing experience first.
0:19:11.9 Daniel Sternberg: There was also some interest in doing something that would turn into Notion Q&A, which is RAG in Notion. So I wanted to borrow one person from our data platform team for that effort, plus someone from our search team, which had just moved out of my org, though I was still close to that team as well. They were going to work together on getting this Q&A RAG experience going, while some folks on our Docs team, product engineers plus Simon, were working on the initial product.
0:19:38.0 Francois Ajenstat: And this happened because it seemed like a great idea and you guys wanted to surge with a small team?
0:19:44.6 Daniel Sternberg: Correct. Small team, let them be disruptive; we don't know where this is going quite yet. I think Simon had some ideas about exactly what he wanted built initially out of the gate. So: let's just try to do this as fast as we can, because it was also becoming clear at that moment that we were not the only ones getting access to an early version of GPT-4, and people were starting to work on this. And Notion prides itself on being a company that's fast to market. So this wasn't really a request, it was a demand, but that's totally fine, I was on board. Just borrow this person, borrow a couple people, so it's a really small team. We also started hiring a couple of ML engineers. Honestly, the founders were running the process at that point in time; this was a founder-led initiative. Then we hired a couple more people and launched a waitlist, I think within a few weeks after ChatGPT came out, because we were already working on it before. Some people got access out of the gate, but we got millions of people on that waitlist before we went to GA, the generally available version of the product, in February of that year, which is also when we launched the revenue-generating part of it, the actual add-on.
0:20:57.2 Francois Ajenstat: Did you guys share a demo of what it was to get the excitement going? Was there a clear value use case? Or was it just that AI was hot, and anything with the word AI on it, everybody wanted to jump on?
0:21:09.7 Daniel Sternberg: No, I think there was actually some really good marketing around it, and it was clear from some of the examples we were showing and trying to create. First off, stepping back: getting the basic generative writing is not that hard, let's be real. The basic capability was just a wrapper around GPT, right? The harder part is how you introduce that, from a UI standpoint, into a product like Notion out of the gate. So a big part of it was: what are the canned prompts we can give you that are useful? How do we invoke "I want to use AI"? There were actually some controversial decisions made there as well.
0:21:53.8 Francois Ajenstat: Sounds spicy.
0:21:54.6 Daniel Sternberg: Yeah. And how much do we push it on people is a big topic, obviously, that a lot of companies have dealt with. One of the great things about Notion is that, from a content and brand team standpoint, we do a really good job. There are other areas of marketing that could be better at that company, but that is an area that's been really good. So creating excitement around it was not just "hey, it's AI." It was: here's a video of how you can use it, and let's focus on some things we know people in tech are doing, because this is the early adopter period. Okay, look, we're going to generate this PRD from scratch; here are some good personal use cases as well. And so what happened was... I can't go into numbers, but when we went to GA in February, we had let a lot of people off the waitlist already, we had the add-on purchase, and we really generated a lot of revenue right away. Those first few months were pretty crazy. It was pretty impressive to see.
0:22:54.0 Daniel Sternberg: And meanwhile, we had already been working for a few months by that point on the second set of AI features we were going to launch, and then the third set. That kept us ahead, I think, of a lot of other companies. The history of AI at Notion has on some level been: we are following the same trends everyone else is following, but because we're scrappy about how we approach problems, and I can share more about what I mean by that in a sec, we just tended to be ahead in market of most of our competitors. I'm sure they were working on the same things we were, because the trends are pretty common in the space; everybody's excited about agents this year, so they're working on agents, and a year and a half ago everybody was putting RAG into their products. But some of the tactical ways the company worked internally put us in a position where we could get these things out in front of actual people, with a product feature that works, in market, three to six months before most other companies.
0:24:00.1 Francois Ajenstat: I mean, the speed at which this was delivered, and in general how that company delivers capabilities, is impressive. Is that a cultural value? Is there something in the water that makes people feel more comfortable shipping fast? What does Notion do differently that other people could learn from?
0:24:21.1 Daniel Sternberg: Sure. So I'll state two things. First off, this is most apparent in AI at Notion; that was my experience, at least. There are other areas where you have to move slower for various reasons. For your enterprise customers, things look a little different, though AI is a bit of an exception even there, because everybody's got some AI budget they're working with. But it's not the same everywhere. I would say it's really two things, one of which was specific to AI to some degree, and one of which is more general. The general point is that this is the job of the CEO founder, and of founders in general, at really good companies. This became very clear to me in my role over time, and it may be the most useful advice for people at my level of an organization: don't get mad when the CEO asks, why can't we ship it faster? Why can't we do it faster? That is literally their job. Yes, it can be stressful. Yes, it happens in situations where it annoys you, where you feel like, oh no, we really are putting everything we can against this. It is their job to push on that.
0:25:27.3 Daniel Sternberg: Hopefully they don't push you so hard that you break, but it is their job. And it is our job to do our best to accomplish it, to use it as a jumping-off point, not to say "here are the 15 reasons we can't go faster," but "here's a way we could, if you're okay with cutting scope in this way; here are the tradeoffs; here are some ways we could approach it." And sometimes, to understand the question behind the question from the CEO founder. I had a really good example of this when we were building Q&A. Actually, we had built it already: we had RAG inside of Notion, we had this massive waitlist, and we needed to onboard millions of workspaces, embed all of their content into a vector database, and get that all set up. Why is this going to take us two or three months? How can we do it faster? Eventually, you have to understand the question behind the question.
0:26:19.8 Daniel Sternberg: And I was like, this is going to be really hard, here are the technical constraints, et cetera. But the real question wasn't "how can we embed them in the vector database faster." It was "how can we give them access to this awesome product feature." Eventually we decided, hey, we can do this for all of our paid workspaces over a slightly faster time period. But then they were asking: how can we get every single workspace embedded so they can try this feature? How are we going to do that for the millions upon millions of free workspaces? And we had a shift of thinking: wait, they don't care about embeddings. We ended up taking a totally different approach. We tested it out and learned that most of these workspaces are very small, and it's good enough to just use the normal Elasticsearch backend we use for search and power the RAG experience on top of that. It actually doesn't perform much differently.
0:27:12.0 Daniel Sternberg: So we could just turn it on tomorrow, because all we're doing is the normal generative completion stuff on top of Elasticsearch. And so that was a moment where it's like, oh, right, they're trying to get to this product outcome. Let's just figure out how we can get there and play around with what our different options are. So that's one way. The other thing, honestly, is that Notion is just very pragmatic about where we invest a lot technically and where we don't. One of the things I noticed when we were working on this, and it's also related to the Q&A RAG experience, I use that example a lot because it crossed our data platform and data teams and our AI team more, since it relied on a lot more data infrastructure than just doing completions on top of prompts. When we were working on that and would talk to other companies and people we knew in the industry, it felt like a lot of companies were asking, hey, what is our vector database strategy? Are there interesting open source things we could work with? They were really thinking a lot about that side of the house. And we think much more about how to get a product experience that works well and feels good, and we will find a vendor that's good enough for right now and start working with them.
0:28:28.9 Daniel Sternberg: And if that doesn't work over time, we'll figure something else out. We're going to focus a lot more of our attention on what the product experience is and what the quality of the AI functionality is, and do whatever is fast to get things out into market. There's also a privilege that comes with working at a pretty high margin organization, where we can afford to do things in a way that's cash-flow-wise or otherwise inefficient for a period of time and then come back and optimize later. But we think in AI, being first to market, or really early to market, is an advantage for two reasons. One, you build buzz. Two, you get more shots on goal in terms of understanding what works for users and what doesn't. So we really focus a lot on dogfooding internally, really rapidly. Everyone's using Notion all day long, so we can test these things internally, then get some pilot customers out there who are in a program with you, who you're giving early access to consistently, and who have a strong relationship with you. They're often startups, or even other companies whose products we used internally. So some of our data partners actually were early users of some of these features. We'd get feedback from them and then just get it out into market fast and iterate. And so that's sort of how we approach it.
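The approach Daniel describes, skipping the vector database for small workspaces and powering the RAG experience with the ordinary keyword-search backend, can be sketched roughly as below. This is a hypothetical illustration, not Notion's code: the in-memory `keyword_search` stands in for an Elasticsearch match query, and all names are made up.

```python
# Sketch: RAG without a vector database. For small document sets, plain
# keyword retrieval is often "good enough" to feed a generative completion,
# so the same prompt-building step works whether retrieval is lexical or
# embedding-based. keyword_search() is a toy stand-in for Elasticsearch.

def keyword_search(index, query, k=3):
    """Rank documents by simple term overlap with the query (a crude
    stand-in for an Elasticsearch match query with BM25 scoring)."""
    terms = set(query.lower().split())
    scored = []
    for doc_id, text in index.items():
        overlap = len(terms & set(text.lower().split()))
        if overlap:
            scored.append((overlap, doc_id, text))
    scored.sort(reverse=True)
    return [(doc_id, text) for _, doc_id, text in scored[:k]]

def build_rag_prompt(index, question, k=3):
    """Retrieve top-k passages and pack them into a completion prompt,
    exactly as one would with vector-based retrieval."""
    hits = keyword_search(index, question, k)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in hits)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

# A tiny "workspace" of documents.
workspace = {
    "doc1": "Our vacation policy allows 20 days of paid time off per year",
    "doc2": "The quarterly roadmap focuses on search quality",
    "doc3": "Expense reports are due by the fifth of each month",
}
prompt = build_rag_prompt(workspace, "what is the vacation policy")
```

The design point is that only the retrieval step changes between the two architectures; the completion step downstream is identical, which is why swapping in the existing search backend could "just turn it on tomorrow."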
0:29:47.5 Francois Ajenstat: That's great. And it does sound like there is a culture element of having a learning mindset as opposed to making it perfect. You're willing, not to cut corners, but to get something out early in order to learn and iterate, iterate, iterate. And then once you've figured out the ICP, then you can optimize.
0:30:10.6 Daniel Sternberg: Exactly. And the nuance there is that you might want to build an amazing generalized assistant that helps do all of your workflows inside of the product. And it may be the case that that's really... Either the current state of the AI technology or models doesn't enable it or you haven't figured out how to do it yet. But that's okay. Find three or four use cases that you can get going really well and figure those out even internally with your pilot customers and then get that to work really well and then put it out in the market and see what happens.
0:30:41.7 Francois Ajenstat: Yeah, that's great advice, especially now as the technology is moving so fast, you can't get to perfection, but you can really get those use cases nailed and just iterate and improve those.
0:30:50.9 Daniel Sternberg: Exactly.
0:30:52.3 Francois Ajenstat: So if we wind the conversation back to the Notion AI launch, you guys had this massive wait list, this incredible enthusiasm and hype around what you guys built. If you could go back in time at the beginning of the project, knowing then what you know now, are there some things you would have done differently?
0:31:13.0 Daniel Sternberg: Yeah, I mean, there are some small things, and then maybe I can talk about some bigger ones. On the smaller side, and this is also more tactical, one of the learnings I had with the Q&A experience in particular was that we had this moment where, at launch, all of the people who were already paying us for AI got access to it, but we really couldn't offer it to others to try yet. And it's important to give people an opportunity to try these product features, especially because for a lot of people, these user experiences are very new and novel, and they don't know how to think about them, so they may not be willing to buy them up front unless they're an early AI adopter. Our AI team was its own standalone team. We built it that way explicitly because, as I mentioned earlier, we kind of wanted it to be disruptive. Even a year in, when we had a real team with a manager and a bunch of ICs, and eventually multiple managers, we built it to be full stack. We would move some data platform people to that team because we knew there were data platform problems to work on.
0:32:13.7 Daniel Sternberg: We'd have product engineers on the team, we'd have AI and ML folks on the team. We were a little slow to get the core data platform team involved in figuring out how we scale this out. And what we quickly realized once we were in this moment of, oh my gosh, we have this thing out in the market and we honestly don't have much confidence yet in how we're going to scale it, is that actually the best use of these folks with the specialized knowledge on the AI team is to go and build the next thing. They don't need to worry about all the problems of how to perfectly scale this over time. They have that skill set, but that's not their remit at this point in time. So one small thing is I wish we'd brought that data platform team in sooner, because once they got really invested in the space, over the course of about a month they made a lot of progress really fast, but we could have front-loaded that much more. So that's small.
0:33:10.2 Daniel Sternberg: I think the bigger thing is that we, a few different times, overestimated what was possible to build with the state of the tech at the time. RAG was not quite ready to actually work correctly in the first moments we were trying it. And so I think that is a learning as well, which is: you should push the technology to the frontier of what it's capable of, but you need to be realistic about what that frontier is. And you can tell a high-level story to the company about how transformative you think this technology will be, but be clear-eyed about what the capabilities are right now. I think we could have, at various points, done a better job of that, because when you over-promise, even internally, it can create a perception of snake oil if you're not careful. It was very clear to me that this technology will be transformative, at a minimum in the way that it's going to change how software looks. Whether or not we're heading toward this AGI future in the next three to five years, and I'm not going to offer an opinion on that, I know that there's going to be a lot of change because of this tech, and it's perfect for this type of product.
0:34:32.2 Daniel Sternberg: So what can we do with it now? And when the new thing comes out, what new capabilities does that unlock that we didn't used to have? So I think we could have done a better job there. And then the last thing I would say, which is really important, is a fundamental difference with AI features, one they share with the machine learning stuff I used to work on before AI was hot. When you're a product manager or an eng manager and you're trying to cost all of the different things you could build and prioritize them, maybe you have a spreadsheet with that list on it and how valuable each item is.
0:35:06.3 Francois Ajenstat: You create your two by two.
0:35:07.2 Daniel Sternberg: Yeah, exactly. I'm making my two by two. There's an added dimension for all of these product features, which is: does it work? Can it work? Does the technology allow me to do this right now? And you should never set a ship date in stone for one of these features, the only exception being when you've done essentially the same thing before and it's just a slightly different use case, mostly product work and a little bit of prompting. But anytime you're doing one of these new frontier capabilities, don't tell marketing a ship date until you know the quality is there. You should be putting more of your eggs in the AI/ML engineer basket: getting the prompts right, figuring out if this thing is actually capable of being built and working well, and getting some really rough internal version of that to a really good spot, both before you promise anyone anything around a ship date and a marketing moment and all of those sorts of things, and before you invest a ton of product engineering time on it. I saw that mistake get made a couple of times, and it is fundamentally true that you don't know if it works until you try to build it with AI. And so until you know you can actually make it work at a certain level of quality, you shouldn't promise any of those things.
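The extra prioritization dimension Daniel names can be made concrete with a small sketch. This is purely illustrative, not anything from Notion: the field names and the "proven/unproven" statuses are hypothetical labels for the outcome of a rough internal prompt spike.

```python
# Sketch: prioritization with a feasibility gate. Alongside the usual
# value/effort two-by-two, each AI feature carries a "does it work?"
# status, and only features whose quality bar was hit in a rough
# internal prototype are eligible for a committed ship date.

from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    value: int        # estimated user value, 1-5
    effort: int       # estimated build cost, 1-5
    feasibility: str  # "proven", "unproven", or "failed" after a prompt spike

def can_commit_ship_date(feature: Feature) -> bool:
    """Gate: no marketing date until the spike showed the quality is there."""
    return feature.feasibility == "proven"

backlog = [
    Feature("summarize page", value=4, effort=2, feasibility="proven"),
    Feature("autonomous agent", value=5, effort=5, feasibility="unproven"),
]
committable = [f.name for f in backlog if can_commit_ship_date(f)]
```

The point of the gate is ordering, not scoring: the cheap prompting spike runs before product engineering investment and before any date is promised, exactly as described above.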
0:36:34.6 Francois Ajenstat: It's so true. I mean, I'm living it right now as I'm building the next AI features. Because of all the hype around AI, it seems it can do everything. But at the same time, you also have to be grounded in reality, in what's possible and also what is valuable. And I do find that sometimes there are the realists and the zealots, or maybe the skeptics and the zealots. In this area that's moving so fast, I do think the zealots should actually be the ones trying this out and hitting the walls, because they don't see those walls. They're just creating and figuring it out without the limitations of today.
0:37:20.5 Daniel Sternberg: For sure. And the thing you have to be really careful about in an organization is that you can create a culture where you've got this AI team full of zealots, and you've got a lot of people on basic product teams who are getting really skeptical, and the incentives get really messed up. And so that's something you have to be really careful about.
0:37:38.2 Francois Ajenstat: Absolutely. You can't make it the AI team's problem to do AI. It's everybody's responsibility.
0:37:44.9 Daniel Sternberg: And it's everybody's responsibility over time. That was also a really tough challenge, right? We knew that over time we needed to start embedding more of that within teams. There would always be an AI team, but we had different opinions about how that team would evolve. Would they become mostly a platform? Would you have certain expert talent on it while a lot of the product engineering sits outside? We thought it was important early on to keep it all together in one team, because the mandate was: be disruptive for now. But eventually, that's going to make your product look really weird if you're not careful. That's one of those shipping-your-org-chart types of situations. Oh, here's some AI slop thrown on top of stuff. I'm not saying we did that, but that's always a risk. And so that part is hard as well. But going back to my job, one of the things I tried to do was maintain a view somewhere in the middle of that continuum.
0:38:39.7 Daniel Sternberg: I was like, there are enough AI zealots at this company. This is a tricky thing to do. From some of the founders on to others on the AI team and elsewhere, there were definitely some zealots. And there were definitely a lot of skeptics in the organization as well. If you actually asked people, you could get a pretty honest read, probably not with me directly, but I'd get reads from people on how people really felt. And I thought it was really important that I didn't position myself as a pure zealot. I'm really excited about this technology, and I think there's a ton we can do with it. Zealots, you may be right about where this is all going over time, but I'm going to focus on what frontier capabilities we can build right now. And I trust the zealots to keep me up to date on where this is going: oh, there are these three new things I heard about that I think are going to push the boundaries here. I'm like, that's awesome, go check it out. But try to maintain a good relationship across the organization, because you know that over time the way this works is going to change. Maybe the AI team rolls up to me for now, and that's okay.
0:39:51.8 Daniel Sternberg: But for this all to work out, that probably shouldn't be the case long-term. Probably some of this does need to happen on other teams, and we do need the whole company to work on AI. So that role is going to shift over time. And you need to maintain a good balance there, so that even the people who are a little bit skeptical can feel safe to voice skepticism, and also feel like when they see some cool capability, they can get excited about it without that being, oh, I've given in to the AI side. No, it's: this technology is cool and I can do some interesting things with it. Let's figure out what those are.
0:40:23.4 Francois Ajenstat: That's great. And definitely, you need that learning mindset to help you break through and push things forward. All right, last question, because I think we could go on for hours on these AI topics. Thinking about your career, you've built a lot of teams, a lot of products, a lot of data initiatives. If you think back, is there a notable oh-shit moment, where you look back and you're like, oh shit, what happened?
0:40:55.0 Daniel Sternberg: Yeah, I go back to a kind of personal and professional growth moment that I had in 2018. This was well before I'd built or scaled teams meaningfully at all. I'd been a manager for a few years, first of a data science team at the first startup I worked at. Then I joined Gusto and became a kind of IC data scientist while also managing the data engineering team, which was pretty small at that point. Then we spun data science out, and I was just going to focus on building data science. It had been about a year into that. And candidly, at that point in time, and to be fair, this was only about seven years ago now, I did not know how to hire talent well at all. I also probably still had a lot of imposter syndrome at that moment. I did have a lot of buy-in from my IC work; people knew I was pretty capable as a data scientist, and I was pretty good at working with the business and product folks.
0:42:13.1 Daniel Sternberg: And so I had gotten a mandate to really start to build the team. I had two people on my data science team, and we were going to grow that to three, then five, in the next few months, and then next year we'd see where it went from there. But I had just had my first child a few months earlier, and I was about to go back on the second half of my paternity leave. And two things happened at the same time. I had this one really senior hire I was super excited about, who was going to be my first meaningfully senior hire on the team, and this person seemed like they were going to accept and then backed out at the last second. And then within a week or two, the one other hire I had successfully made on this team told me he was leaving. And I was about to go back on parental leave.
0:42:57.2 Daniel Sternberg: And I was looking at this, and I basically went to my boss, who was the chief product officer and co-founder at Gusto, and I was just real about it. I said, look, honestly, this is a hard moment. I'm about to go back on parental leave, and I look at where I'm at right now, and I feel like, yeah, I've done some great work here. In retrospect, I could give you a very different take looking back: I'd honestly learned a lot during that period through being a player-coach IC, and I'm really glad I was a mix of manager and IC for four or five years, versus just scaling a team three years into my career. But at the time I was like, look, I feel like I'm actually further behind where I was in my last job. I'm two years in here, and this is really hard. And I don't know how I'm going to feel, because I'm about to head out and kind of leave things in the lurch. And his advice to me was just, look, I hear you, I get it. Take your time off. And he had a young kid too, so he appreciated this. Take your time off and see how you feel when you come back.
0:44:00.6 Daniel Sternberg: But no expectations set or anything like that. Just see how you feel when you come back. I think he was willing to just roll the dice in that situation. And what happened when I came back was I did two things, with some luck involved too. The first was asking, how am I going to deal with all these needs people are asking for, when it's literally me and one other person, who I'm very grateful for and who I had lunch with last week? I was like, okay, I'm going to write a memo. And I'm basically going to say: here are the things we're going to focus on, here's where we're trying to get to. I need to spend most of my time on hiring. Here's my prioritization; very little is above the line. If you disagree with anything in this, come talk to me about it. We can shift things around, but this is the total number of things we can do. It was a little service-org-y, but there was no way around it in that moment. And then the second thing I did was just go insane on recruiting and interviewing.
0:44:53.6 Daniel Sternberg: I didn't have enough people to do most phases of the interview process, because I just had one other data scientist and maybe a PM helping me do cross-functional interviews. So I did three interviews per person in that process. I think I was the number one interviewer at the company. And I also just did some tactically interesting things.
0:45:17.3 Daniel Sternberg: I found one amazing hire and got really lucky on that person. And then a couple other things kind of fell into my lap. There was a great referral who I took a chance on. There was someone I had almost hired a year earlier who I went back to and found. And I scrappily built a team from there, focusing a lot on people who were really good at the communication side, which I mentioned earlier. And that team got off the ground and running. So there's a lot of luck in there too. But that's a real moment that always stands out to me from a personal standpoint, because it was the first moment, I think, since I started working in tech where I realized, oh, things aren't always going to be up and to the right for me in my career, and you're going to have to figure out a different way. And these pivot points are actually really helpful in forcing you to reframe how you're approaching problems. So much of your time, there's momentum and you go with the momentum, and then there are these points where you can't go with the momentum. And that was one of them.
0:46:20.4 Francois Ajenstat: That is good perspective, and good to reflect on. It probably also changed how you hired in the future as a result.
0:46:31.3 Daniel Sternberg: For sure.
0:46:31.6 Francois Ajenstat: And build stronger teams.
0:46:33.2 Daniel Sternberg: Definitely. Yeah, it was definitely one of those moments. One of the things I learned from it was that your gut actually matters a lot in hiring as a leader. And it's interesting to help other managers learn that: you have a weird feeling about this situation, let's figure out why and dig into it. Or you feel good about this person even though they didn't meet X, Y, or Z expectations in parts of the interview; are those the top-priority ones for you, if you feel really good about these other areas? So it's about leaning into that in the right ways. Not to say go with your gut and its biases, but if you have a gut reaction, let's investigate it and figure out why.
0:47:11.0 Francois Ajenstat: Absolutely. Well, Daniel, so great to chat with you. Thank you for joining the podcast. And thank you all for listening to Next Gen Builders. Look out for our next episode wherever you get your podcasts. And please don't forget to subscribe.
[music]