
Ignite UX: How to Avoid Building a Product No One Will Use with Bill Albert | Ep270

Episode 270 of the Ignite Podcast

Most startups don’t fail because they can’t build.

They fail because they build the wrong thing.

That’s the core problem Bill Albert has spent decades trying to solve.

He’s worked across academia and industry, led global customer experience at Mach49, and now runs Greenlight Idea Lab, where he helps companies validate ideas before they burn time and capital. His focus is simple: reduce the risk of building products nobody wants.

This matters more now than ever.

You can ship a product in days. AI tools cut development time to near zero. But that speed creates a new problem. You can go very fast in the wrong direction.

Here’s how Bill thinks about avoiding that.


Most founders validate the wrong thing

A common pattern shows up in early-stage startups.

Founders say they’ve “talked to customers.” They feel confident. They start building.

Then the product launches—and nothing happens.

The issue is not effort. It’s what they validated.

There are two separate questions:

  1. Is this a real problem worth solving?

  2. Will people actually use or pay for this solution?

Most teams jump straight to the second question without answering the first.

They assume the problem exists. They focus on features, design, and speed.

Bill sees this constantly. Products that are easy to use. Clean UI. Thoughtful flows.

But nobody cares.


The difference between interest and demand

One of the biggest traps in customer discovery is mistaking positive feedback for real demand.

People will tell you your idea is great. They’ll say they would use it. They might even ask to be notified when it launches.

None of that matters.

What matters is behavior.

Bill looks for increasing levels of commitment:

  • “Send me an email when it’s ready”

  • “I’ll join a demo”

  • “I’ll bring my team”

  • “I’ll sign an agreement”

  • “Here’s my credit card”

Each step requires more risk from the user. More time. More reputation. More money.

That’s where signal lives.

If people won’t move up that ladder, you don’t have strong demand yet.
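The ladder reads naturally as an ordered scale. As a rough illustration (not from the episode; the action names and prospects are invented), you could rank prospects by the highest rung they have actually reached:

```python
# Hypothetical sketch of Bill's "ladder of commitment": rank prospects
# by the strongest behavior they have demonstrated. All names are made up.
COMMITMENT_LADDER = [
    "email_signup",      # "Send me an email when it's ready"
    "joined_demo",       # "I'll join a demo"
    "brought_team",      # "I'll bring my team"
    "signed_agreement",  # "I'll sign an agreement"
    "paid",              # "Here's my credit card"
]

def commitment_level(actions: set[str]) -> int:
    """Return the highest rung (1-5) a prospect has reached, 0 if none."""
    level = 0
    for rung, action in enumerate(COMMITMENT_LADDER, start=1):
        if action in actions:
            level = rung
    return level

prospects = {
    "alice": {"email_signup", "joined_demo"},
    "bob": {"email_signup"},
    "carol": {"email_signup", "joined_demo", "paid"},
}

# Strongest signal first: the prospect who paid outranks warm interest.
ranked = sorted(prospects, key=lambda p: commitment_level(prospects[p]), reverse=True)
```

The point of the ordering is that verbal enthusiasm ("great idea!") scores zero; only demonstrated behavior moves a prospect up.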


Why “talking to users” is often useless

Many founders rely on small sets of conversations. Friends, early contacts, warm intros.

That creates bias in three ways:

  • They’re talking to the wrong audience

  • They’re asking leading questions

  • They’re looking for confirmation, not truth

Even worse, they ignore negative feedback.

Bill describes a common scenario. A founder hears 58 minutes of hesitation and doubt. Then two minutes of mild enthusiasm.

They leave convinced the product is a hit.

This is human nature. But it’s dangerous.

Real validation requires pressure.

You need to create situations where users can easily reject your idea. If it survives that, then you might have something.


The biggest mistake: falling in love with the solution

Founders are wired to build.

That’s the problem.

They start with an idea. They imagine the product. They picture the outcome.

Then they try to prove it’s right.

Bill pushes the opposite approach.

Start with the problem. Stay there longer than you want to.

Test whether the problem is painful, frequent, and worth solving.

He uses exercises like forcing users to “spend” a fixed budget across different problems. The ones that attract the most “spend” are the ones that matter most.

This forces prioritization. It removes vague answers.

If your problem isn’t winning that competition, it’s not strong enough.
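The spend exercise is easy to tally. A minimal sketch, with invented problem names and allocations, assuming each participant must spend the full budget:

```python
from collections import Counter

# Hypothetical sketch of the "spend" exercise: each participant allocates
# a fixed $100 virtual budget across candidate problems; the problem
# attracting the most total spend is the strongest pain. Data is made up.
BUDGET = 100

allocations = [
    {"slow_reporting": 60, "manual_data_entry": 30, "training_gaps": 10},
    {"slow_reporting": 70, "manual_data_entry": 20, "training_gaps": 10},
    {"slow_reporting": 40, "manual_data_entry": 50, "training_gaps": 10},
]

totals = Counter()
for person in allocations:
    # Forcing the full budget to be spent is what removes vague answers.
    assert sum(person.values()) == BUDGET
    totals.update(person)

top_problem, top_spend = totals.most_common(1)[0]
```

If no single problem clearly wins the lion's share, that is itself a signal: none of the hypothesized pains is strong enough yet.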


Speed is not your advantage anymore

AI has changed how fast teams can build.

What used to take a month now takes a day. Sometimes an hour.

This creates the illusion that you should just launch and learn.

Bill disagrees.

Fast iteration is useful. But skipping validation upfront leads to wasted cycles.

You end up building multiple products, hoping one works. That costs time, money, and credibility.

His focus is different:

Time to first revenue.

Not time to launch.

Not time to prototype.

Revenue is proof that someone values what you built.

Everything else is a guess.


What real product-market fit looks like

Many founders rely on soft signals.

Signups. Engagement. Positive feedback.

Bill looks for something stronger.

One simple benchmark is the Sean Ellis test:

Ask users: How would you feel if this product disappeared?

If more than 40% say “very disappointed,” you’re on the right track.

But even that is not enough.

You still need behavioral proof:

  • Are people paying?

  • Are they committing time and resources?

  • Are they bringing others in?

The closer you get to real-world commitment, the more confident you can be.
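Scoring the Sean Ellis test is straightforward. A hypothetical sketch with made-up responses, using the 40% benchmark mentioned above:

```python
# Hypothetical sketch of scoring the Sean Ellis test: the share of
# respondents answering "very disappointed" to "How would you feel if
# this product disappeared?". Responses below are invented.
responses = [
    "very disappointed", "somewhat disappointed", "very disappointed",
    "not disappointed", "very disappointed", "somewhat disappointed",
    "very disappointed", "very disappointed", "not disappointed",
    "very disappointed",
]

share = responses.count("very disappointed") / len(responses)

# The commonly cited threshold: above 40% "very disappointed" is a
# product-market-fit signal, though, as noted above, not sufficient alone.
passes_benchmark = share > 0.40
```

Note that only the top answer counts; "somewhat disappointed" is treated as a miss, which is what keeps the benchmark strict.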


Where AI helps—and where it hurts

AI is powerful for product discovery.

It can help you:

  • Understand a new domain quickly

  • Generate interview questions

  • Synthesize qualitative data

  • Identify patterns faster

But there’s a line.

Bill avoids synthetic users and AI-generated personas.

Why?

They tend to be overly positive. They don’t push back. They don’t reflect real behavior.

Until AI can replicate true human decision-making under risk, it can’t replace real users.

Use AI to move faster. Not to replace validation.


A better way to evaluate startup ideas

If you’re building or investing, Bill suggests a simple framework:

  1. What does the product do?

  2. Who is it for?

  3. What problem does it solve?

  4. What evidence proves this problem matters?

  5. What proof shows your solution works?

If any of these answers are vague, that’s a red flag.

Strong teams can explain all five clearly. And back them with data.
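As a toy illustration of applying the framework, you could screen pitch answers for vagueness; the "too short" heuristic here is purely illustrative, not Bill's method:

```python
# Hypothetical sketch: flag vague answers in the five-question
# idea-evaluation framework. "Vague" is approximated by a toy
# word-count heuristic, for illustration only.
QUESTIONS = [
    "What does the product do?",
    "Who is it for?",
    "What problem does it solve?",
    "What evidence proves this problem matters?",
    "What proof shows your solution works?",
]

def red_flags(answers: dict[str, str], min_words: int = 5) -> list[str]:
    """Return the questions whose answers are missing or look vague."""
    flags = []
    for question in QUESTIONS:
        answer = answers.get(question, "").strip()
        if len(answer.split()) < min_words:
            flags.append(question)
    return flags
```

In practice the real test is evidence quality, not answer length, but the structure (every question must have a concrete answer) is the useful part.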


The takeaway

You don’t need months of research.

You don’t need large budgets.

But you do need discipline.

Spend a few weeks validating the problem. Test demand with real signals. Look for commitment, not compliments.

It’s cheap insurance.

Because once you start building, every mistake compounds.

And in a world where building is easy, knowing what to build is everything.

👂🎧 Watch, listen, and follow on your favorite platform: https://tr.ee/S2ayrbx_fL

🙏 Join the conversation on your favorite social network: https://linktr.ee/theignitepodcast

Chapters:

00:01 Introduction to Bill Albert

00:30 Bill’s Background and Academic Roots

02:50 Transition from Academia to Industry

03:30 The Problem of Building Products Nobody Wants

05:08 Joining Mach49 and Focus on Product Validation

07:05 Early UX Research in Japan

09:26 Measuring UX and Industry Gaps

11:07 UX Research Misconceptions on Sample Size

12:30 Shift from Usability to Design and Brand

12:50 Common Mistakes in Early-Stage Product Development

14:31 Validating Problems vs Solutions

17:11 Why Talking to Customers Isn’t Enough

18:40 Stress Testing Product Ideas

18:48 Framework for Knowing When a Product Is Ready

19:57 Common Product Failure Patterns

21:23 Evaluating Startups as an Investor

23:06 Market Trends and AI Impact

25:50 Favorite Tools and AI in Research

29:35 AI’s Role in Product Discovery

30:47 Merging Roles: UX, PM, Engineering

31:57 AI: Easier or More Dangerous for Discovery

32:09 Speed vs Insight in Product Development

33:13 Faster Iteration Cycles in Startups

34:18 Future of Product Development and AI



Transcript

Brian Bell (00:01:01): Hey everyone, welcome back to the Ignite Podcast. Today we’re thrilled to have Bill Albert on the mic. He is the founder of Greenlight Idea Lab, a product validation and UX research expert who has spent decades helping companies de-risk innovation, previously leading global customer experience at Mach 49 and authoring one of the foundational books on measuring user experience and product validation. Thanks for coming on the pod, Bill.

Bill Albert (00:01:24): Yeah, it’s my pleasure. Thanks for having me.

Brian Bell (00:01:26): Yeah, so I’d love to start with your origin story. What’s your background?

Bill Albert (00:01:29): So my background's probably a little bit unusual, and when I tell it, just to give a caveat, it makes perfect sense to me but may not make sense to other people. So here we go. In college I studied geography, and I went all the way through my PhD. For my research I was really focused on spatial cognition: how people navigate in real life and in virtual environments. When I was finishing up, I started a postdoc looking at the design of navigation systems in cars. At that time they were only in Japan; they weren't in the US yet. And it really got me thinking about design and cognition and how people process information. At that point there's a fork in the road for people at that stage: academia or industry. I went into industry, but I always wanted to keep a connection to academia. I started working on a UX team in 1999 and learned a ton about design and usability, really core foundational skills. Then I jumped over to Fidelity Investments for about seven years, running a research team working on a lot of enterprise applications, both B2B and B2C, and really learned a ton.
At that point I met what became my mentor, Tom Tullis, and we ended up doing a lot of stuff together, writing books and research, and it was very transformative in terms of my life trajectory. After Fidelity, an opportunity opened up to head up the User Experience Center at Bentley University. It's a business school, but they had a UX center there that operates like a consultancy, or you could think of it as a teaching hospital for people in UX. We had a staff and graduate students working on real client engagements, and that was wonderful. I really enjoyed it and did it for a long time; being in an academic setting while doing really practical, grounded research was very important to me. Then, probably in 2021, I was seeing so many products where we could make them easy to use, but no one would want to use them. And that really got me thinking.

Brian Bell: Yeah, if you can make the most beautifully designed product, it's perfect in every way, but nobody cares.

Bill Albert: Exactly, and that's a big problem. So I started trying to answer this question: should we build it? Because I saw a lot of wasted money building products that were never going to be successful.

Brian Bell (00:04:17): That's the most painful thing as a product manager, as a new product manager, where you're like, yeah, we're going to build this. I remember I built this beautiful map-based report back when I was in mobile ads over a decade ago. It won internal product of the quarter, or the year, and everybody was high-fiving me. Such a great product, and nobody used it.

Bill Albert (00:04:37): Yeah, exactly.

Brian Bell (00:04:38): Nobody cared.

Bill Albert (00:04:39): So what happened was, as I was starting to think, okay, I'm ready for a new challenge, an opportunity came along to head up the global CX team at Mach 49, which does corporate venture building. That really gave me extra energy, a kick in my step, in a career focused on early product validation. My background in UX was really about measurement; I'm a data person. As you mentioned, the book I wrote on measuring UX fit nicely into this new focus on product validation. I was there for a few years, left, and started up Greenlight Idea Lab, where I've been running this agency for about the last year and a half, really focused on product validation for venture studios, corporate ventures, and product innovation teams.

Brian Bell (00:05:28): Makes sense. Yeah, I mean, they definitely have a need for someone like you. Circle back to Japan really quick. What was it about Japan as you studied UX coming out of Japan that inspired you? Because it strikes me, especially in the late 90s, early 2000s, they were just like ahead on so many dimensions as a producer of UX. And not so much anymore, but definitely back then they were.

Bill Albert (00:05:50): Yeah, I mean, it was really interesting. I went to Japan through a fellowship that had been organized between the Japanese government and Toyota, the car company, to bring new PhDs over to Japan to work on technology, whatever that means. In Boston, where I'm based, I was doing my postdoc at Nissan R&D, so Nissan hosted me, but I was being paid through the Japanese government, and they sort of didn't care what I worked on. The cool thing was that at Nissan they hadn't really done any research. We're building these navigation systems, but we have no idea how to design them in terms of the experience, the UI. How do people process that information? What's too much information? What's not enough? We take all that for granted now, but back then people just didn't know. At the Nissan plant where I was working, they had a driving simulator up on a hydraulic lift with six degrees of freedom, literally this million-dollar machine that was essentially only used to show executives. It was basically the world's most expensive video game: jump in the driving simulator for a couple of minutes, wow, that's cool, and then you get out. I got there and said, this is an amazing opportunity for us to do research that no one else has. So we started working with that as a way to understand cognition during driving. We used eye tracking, all this cool technology. So they had the technology to do the research, but they really weren't research-minded like I was, so it was a bit of a clash, at least for the first few months. Then they got used to me and what I wanted to do, so it worked out.

Brian Bell: I lived in Japan 20 years ago, actually, when I washed out of Wall Street, and what struck me about living there and visiting since is just how intentionally everything is designed in their society. They really think everything through, from end to end. Even the food packaging is very innovative: how they wrap food and how you unpack it. Every little thing is really thought through.

Bill Albert: Yeah, even details like the speed at which you wrap or unwrap something. Like you said, it's intentional; there's a process to doing it, and in the U.S. we just rip it apart. So you're totally right, they think about design very deeply, in a very different way. Though I feel like some of it is more surface level, and they're not going deep into UX. Actually, my Measuring UX book was translated into Japanese, and I went there on a little book tour trying to say, let's build more rigor, let's start measuring experience, and it was a totally new concept at that time, around 2010.

Brian Bell: That's interesting.

Brian Bell (00:08:57): Yeah. So you spent decades measuring user experience. What’s a widely accepted UX practice that you think is fundamentally flawed?

Bill Albert (00:09:06): Oh boy, there's a lot to choose from. I'm going to pick one thing that's a little bit more nuanced. For people that do UX research, oftentimes we're trying to find pain points or usability issues as somebody's using a product. And we can do that with a very small sample size, 5, 8, 10 people, because what we're looking for is problem detection. We're just trying to identify problems, not measure opinions or preferences, where we would need hundreds of people to get something reliable. What I see is that people who either do research or are consumers of UX research conflate the two. They say, how can you only test with 10 people and be sure that this is a problem? Well, our goal, our outcome, is very different. And sometimes researchers do the opposite: they'll ask 10 people, do you like the red button or the blue button? That's nowhere close to being reliable. It's a preference, and we need a much larger sample size. So it's an overall misuse or misunderstanding of sample sizes relative to the research question.

Brian Bell (00:10:21): Yeah, I think that’s really insightful. So looking back at your career, what did you believe about users or product design, UX design that you’ve since completely reversed?

Bill Albert (00:10:32): Oh boy. I think the one thing, and it's partly a product of when I started in this field, let's say 1999, early 2000s: we were really focused on just usability. Can people complete the tasks? We were less focused on the design of it. The design didn't really matter that much, or the branding. As we've figured out more about how to design good experiences, or at least make them more usable, I've really come around to seeing the power of visual design and brand, particularly how that affects the overall experience. That's something where I've really shifted my attitude, probably in the last 10 years, about how important that is on top of the more basic UX stuff that we're trying to do.

Brian Bell (00:11:33): Yeah. So back to, you know, you’re helping a lot of venture studios do product discovery or product validation. What do you think a lot of early stage companies and venture studios, people building experiences get wrong?

Bill Albert (00:11:48): In some ways this is an easy one for me, but it's so fundamental. There's a well-known quote; I think it's "fall in love with the problem, not the solution." And I see founders and senior leadership falling in love with the solution: hey, I've got a great idea, I think there's going to be a real market for this, so let's start designing and building it, and launch and learn. It's great to be excited, but oftentimes they have done nothing about validating the pain, the problem that they're solving. I see that over and over and over again, and then what we end up with is a product in search of a problem. It's really common, and probably, to me, one of the biggest reasons why startups fail or just run out of runway: they overspend capital early on, they have to go back to the drawing board, and that brings a whole host of problems. But it's always more fun to start building than digging into these fundamental questions. It's very enticing to build, especially when you've got a great idea, but you've got to pause that and validate the problem.

Brian Bell (00:13:04): it’s kind of like the old adage you know measure twice cut once you really really should make sure what you’re building people actually want so how do you how do you know when people actually you know have the problem or really want the solution

Bill Albert (00:13:19): Yeah, it's a really interesting point you just made, Brian. Those are two different things. One is: how do you know this is a problem worth solving? Because it could be a problem but be what we'd call a paper cut; it's really not that big a deal. The only way we can truly validate that it's a problem worth solving is by collecting a lot of data from the target users. We do things like a spend exercise. You have a hundred dollars, euros, whatever, to solve problems. We have four of our hypothesized pains, you add two or three more, and we ask: how would you spend that virtual money to solve them? We can see if there's one pain that's getting the lion's share of the money, and there are other research techniques to hone in on the big pain. Once we've done that, then there's the question you asked: how do we know people will want it? Then we're looking at the value proposition and the actual product solution, and for that we start to go from what people say to what they do. We're looking for a behavioral demonstration of demand. For example, going from "send me an email when the product is available," to "let me sign up for a demo," to "a demo with my boss," to "let me sign an MOU," to "here's my credit card." Those are the rungs of commitment we look for as we develop the prototype or product: an increase in the level of commitment, whether it's the time, the attention, the reputation, or the money people are willing to put forth. That's how we validate as the product is being developed. So it's two different validations, in that order: the problem, then the idea and value proposition, then the product solution.

Brian Bell (00:15:22): So founders often say they’ve talked to customers, right? They’ve validated it. Why is that almost always insufficient?

Bill Albert (00:15:27): Well, most of the time it's not sufficient. Sometimes it's friends and family. They're always going to tell you they love it.

Brian Bell (00:15:31): Mom loves my product.

Bill Albert (00:15:33): Absolutely. She always will, no matter what the product is, guaranteed.

Brian Bell (00:15:37): My mom thinks I’m the best basketball player on the court.

Bill Albert (00:15:41): I hear you. But oftentimes they're not asking the right questions in the right way. They're not talking to the right customer profile. Sometimes they only talk to buyers and not users, or vice versa. It's research done very quickly, and oftentimes really a confirmation of what they already believe. The way I, or we at Greenlight, approach validation is that we really want to put it through a stress test. We want to pressure test it as best we can. We want to give people as many different ways as possible to tell us no, they don't like it, they're not going to use it. And if, even after that, they're saying, yes, I have to have it, then we know we've got something. So we try to attack validation from different angles, different ways of stress testing it.

Brian Bell (00:16:34): Yeah. So what is that framework for knowing when a product is actually ready for market versus just still kind of in that illusion territory?

Bill Albert (00:16:42): Yeah, going back to what I said a few minutes ago, the way we know is when we're seeing those behavioral signals. One of the more common measurements for product validation is the Sean Ellis test. For those that aren't aware, it's one simple question: how would you feel if this product weren't available, if you had it and I took it away from you? If more than 40% of people say they would be very disappointed, that's at least a signal that you have something. Now, I think that's a great way to start, but it's not sufficient. We need to look at which behavioral signals matter to that product, to your company, and make sure we're hitting those targets. That's really when we know we have something there's going to be true customer demand for. Until then, it's not quite an illusion, but it's a lot of assumptions.

Brian Bell (00:17:41): So you’ve seen hundreds of products over the years. What’s the most common failure pattern look like? And what does success look like?

Bill Albert (00:17:49): There are products that have already been developed that aren't trying to do anything new; they're being improved somehow. For those types of products, the biggest mistakes or failures come from not understanding what customers ultimately care about and how it's going to make their lives better. For a new product, especially in an innovation space, it's really about figuring out: what is the problem that people care about, what are the blockers, and how can I remove those blockers? When you're able to do that, you're going to be successful. That, to me, is the difference between success and failure. People get so enamored with new technology and AI, but ultimately it's about solving a problem. I know I'm harping on that, but it is so fundamental to product validation.

Brian Bell (00:18:47): So put your investor, your VC hat on for a second and you’re evaluating early stage startups. How would you evaluate that they’ve validated that this is a problem?

Bill Albert (00:18:57): Well, whenever I'm learning about a new product, the first thing is: tell me about the product. What does it do? I'm trying to understand the value proposition. Then I want to know who it's for: what is your ideal customer profile? Then I want to know what problem you're solving, and I want to understand something about the context of use. If they're not able to give a really crisp, clear explanation of the problem, I start to wonder a little bit. Then I want to know what data they have to support it: what is the evidence that this is a problem worth solving? We get into the weeds around the data and the methodology; that's sort of my thing anyway, so I like going down there. What I'm really looking for is for you to convince me, in many different ways, that there's strong evidence that the problem you're going after really matters to people. And then the idea: once you know the problem, there's the idea, the value proposition. There are a lot of different ideas to solve the same problem, and we want evidence that you have the right idea to solve it.

Brian Bell (00:20:11): So what are you seeing in the market? You know, it’s 2026 now. Like, what are you seeing out there that’s shifting?

Bill Albert (00:20:18): Well, I mean, obviously it's AI. Everything is AI; try to find a product that doesn't have AI somehow incorporated into it. In the old days we'd sometimes joke and say IA, because that's information architecture, but that's another story. What we're seeing now with so many products, through vibe coding, is everyone saying, hey, we can build this really quickly, let's just launch it out there, launch and learn, we'll be able to make tweaks. And on the surface...

Brian Bell: Well, it used to take you a month; can it take you an hour with Claude Code? Just get it out there and see if people use it?

Bill Albert: Yeah, yeah, I can see where that would be very enticing. I'm not a big fan of it, because people think the goal is to launch a product. For me, the goal is really time to first revenue: how long is it going to take you to get to first revenue? And I contend, I strongly believe, that when we do that validation up front we're saving time; we can get to first revenue faster than if we're just throwing spaghetti at the wall, building it, launching it, and looking for behavioral signals. You start to hurt your reputation if you're throwing a bunch of different things out there and keep going back to the drawing board. Or you could launch multiple products and see what sticks, but even then, to me, that seems like an inefficient way of doing things. So that's one practice I'm seeing that I'm very wary of. The other thing is using AI in ways that don't always make sense. When we look at a product that is an AI-based solution, we need to look at it not just as "is it solving a problem that matters?" We need to look at it through other lenses, like trust and transparency. Is that solution better than what you could get on your own, or through some other mechanism? So it's a slightly different way of stress testing those AI-based solutions.

Brian Bell (00:22:24): Yeah, that’s really cool. So what are some of your favorite tools that you’re using on a day to day basis?

Bill Albert (00:22:29): Yeah, so in terms of AI, the way I see it, it's awfully powerful and lets us run faster. It ramps us up on a domain. We can develop interview guides. It helps us identify customer profiles for recruiting. It helps us with qualitative data synthesis. A lot of things that help us just run much faster, which is great. But what we don't use it for is creating synthetic personas or running synthetic interviews.

Brian Bell: A lot of people I know believe those are crap. I believe that too; maybe at some point they'll be good enough, but we're not there yet. We do have a portfolio company that does this. They use AI and synthetic audiences to test, say, if you're a brand manager at Procter & Gamble, or you make makeup for Sephora, what are the people in your ICP going to think about this, synthetically. I think it could be helpful to inform things a little. There's another product in our portfolio, I don't know if you've come across them, Conveyo. They just raised a pretty big round. They automate qualitative research through video and voice interviews, so it basically helps you do qualitative research at scale with AI, where previously you would have hired a whole team and gone out and done it manually.

Bill Albert: Yeah, that's awesome, and I'm all for it. Until my agent is selling your agent something and the agent is the buyer, a human is making the decision, so it has to be grounded in some kind of human experience, human behavior, human preferences. Use AI to capture qualitative data at scale, to synthesize, to pull out key patterns; it's wonderful for that. But when I've seen demonstrations of synthetic interviews, everything is always so positive, and it's hard to develop personas that are essentially real people. Everyone's sort of sycophantic, right? The way AIs are trained is with reinforcement learning, so you're basically giving them a little treat when they do a good job, and so they're always trying to please. I literally have it in my custom instructions for all my AIs: please give it to me straight, do not sugarcoat things, be honest and objective. I will say, one of the tools we've built, and we're almost ready to release, is a rapid assessment tool using AI that walks people through a series of about 10 questions to understand, just as a sniff test: is this product viable, and where are the gaps? What are you missing? What do you need to do next to get to a much higher level of reliability in the assessment? It gives people a jumpstart into thinking: all right, this seems like a good idea, but these are the things we need to focus on.

Brian Bell (00:25:36): There is also, I’ve heard from Boots on the Ground, that there’s sort of this merging happening between the product manager, the UX person and the engineering manager. Have you seen this on the ground?

Bill Albert (00:25:48): Oh yeah, I have. People who are trained in UX are usually designers or researchers, sometimes strategy people, and for decades we had a way of doing things, running these projects on different products, and that all made sense. Then AI comes along and it can do so much. Now UX researchers or designers are able to almost be like a maestro, organizing everything together. So that role between researcher, designer, and product manager is definitely merging into somebody who understands all three and how they interact to deliver product quickly. And people who aren't able to make that pivot, who are too focused on "this is what I do, I'm only a researcher," are going to have a tough time.

Brian Bell: Right. Creative destruction from AI. Does AI make product discovery easier or more dangerous?

Bill Albert: Both. It makes it easier because it gives us a jump on whom we should talk to, the different types of questions we might want to ask, and the context in which people are working, stuff like that. It's more dangerous because if we start going down the road of synthetic interviews, we can easily jump to the wrong conclusions. We're not going to get the depth, the level of insight. It's powerful; use it in the right way is the bottom line.

Brian Bell (00:27:30): So if AI can generate products faster, does that make user insights more valuable or less valuable?

Bill Albert (00:27:36): I would say if it can generate products faster, it makes time to insights faster. The more I’m putting out there, the more data I’m getting, so we can get more insights. But really the measure should be: are we getting to those milestones? How quickly can we get through the different stage gates to actually launch a product and start to grow and scale? Like I said, time to first revenue, that’s what we care about. Insights are only as good as your ability to leverage them to get to a successful product launch. So AI probably means more insights. Does it give you better or new insights? Sure, it might, but it’s got to be done in conjunction with some kind of human data.

Brian Bell (00:28:28): Yeah, I think we’re seeing the collapse of time scales to validation, because I see it in my portfolio. I see founders building faster than ever, and I see them getting traction faster than ever as well, because they can be really responsive to customers and synthesize a lot of data very quickly. We used to have this joke in product management that the quarterly roadmap was really a yearly roadmap: what we thought we’d get done in a quarter took a year. Now what you think you can get done in a quarter, you get done in a month, or a couple weeks, maybe one sprint. You’re just building faster than ever because you can generate all this code faster than ever, and it’s driving a lot of iteration. What do you see out there in the future? If you look five or 10 years out, what are you excited about?

Bill Albert (00:29:18): You know, I’m famous for being a really bad predictor. I can give you examples: I convinced my roommate not to invest in Starbucks the night before they went public. With navigation systems, I said, this will never make sense in the US, we all know where we’re going, and we know how that played out. So I am very leery of predictions I make. I am a data person, so where things are going, honestly, I have no clue. I know things will be different and I know things will change. That’s about the only thing I can say.

Brian Bell (00:29:49): I think in the future you’ll probably be able to go to your superintelligent AI assistant and say, hey, I have this hypothesis, I really want to build something in XYZ area. What kind of problems are people having in that area that they can’t solve themselves, that we can build? I think that will be the one-person unicorn of the future: one person orchestrating lots of different AIs to do lots of different things.

Bill Albert (00:30:16): Yeah.

Brian Bell (00:30:17): That’s the vision.

Bill Albert (00:30:18): I mean, maybe we’ll have AI agents designing and creating products for other agents, and they’ll already know what they want and what’s going to work.

Brian Bell (00:30:26): Yeah. Any takeaways for the founders listening out there? Any final words of advice? There are lots of founders that listen to this.

Bill Albert (00:30:34): Yeah, I would say don’t fall in love with your solution. Focus on solving a problem that matters, where people will say, I have to have this product, don’t take it away from me. You’ve got to look for that: the emotion in the voice, behavioral signals, everything that tells you it’s going to work. Validate upfront; just spend those extra few weeks. It’s cheap insurance to really de-risk your innovation. Ultimately, that’s what we’re talking about. It’s not going off for a year and collecting tons of data. It can be done quickly and cheaply to dramatically increase your likelihood of product success and growth and scale.

Brian Bell (00:31:20): Speaking of falling in love with the problem, what do you see as a best practice for running a backlog these days? Back in the day when I used to run backlogs, it was, of course, a spreadsheet, and then maybe a weighted matrix of pain and effort to prioritize it, things like that. What do you see as some best practices these days?

Bill Albert (00:31:41): Well, it really depends on where you are in the product development process.

Brian Bell (00:31:47): Let’s say you’re looking at value propositions, right?

Bill Albert (00:31:50): So you know that this is a problem. A lot of times people will hone in on one or two value props, but in terms of the backlog, it’s really about testing a lot of different value propositions that are wildly different from one another. Having all that data in a central spot, being able to organize it, being able to do apples-to-apples comparisons to see which ideas are the strongest and understand why, and then moving over into a product solution and seeing what really resonates, what the metrics are that matter most, things like that.
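The weighted pain-and-effort matrix Brian mentions can be sketched as a simple score. This is an illustrative sketch only, not anything described in the episode: the backlog items, the weight, and the pain-over-effort formula are all assumptions.

```python
# Hypothetical sketch of a weighted pain/effort prioritization matrix:
# higher pain raises an item's priority, higher effort lowers it.

def priority_score(pain: float, effort: float, pain_weight: float = 2.0) -> float:
    """Weighted pain divided by effort; both on the same 1-10 scale."""
    return (pain_weight * pain) / effort

# Example backlog (items and scores are made up for illustration).
backlog = [
    {"item": "Fix onboarding drop-off", "pain": 9, "effort": 3},
    {"item": "Dark mode",               "pain": 4, "effort": 5},
    {"item": "CSV export",              "pain": 7, "effort": 2},
]

# Rank highest-priority first.
ranked = sorted(
    backlog,
    key=lambda r: priority_score(r["pain"], r["effort"]),
    reverse=True,
)

for r in ranked:
    print(f'{r["item"]}: {priority_score(r["pain"], r["effort"]):.1f}')
```

The point of keeping it this simple is the same one Bill makes about the data: the formula matters less than having every candidate scored in one central place so the comparison is apples to apples.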

Brian Bell (00:32:25): Let’s wrap up some rapid fire questions. What’s one product you think succeeded for the wrong reasons?

Bill Albert (00:32:30): I’m not sure this is going to be rapid fire. A product that succeeded was originally designed to take soot off walls in the 1930s. Twenty years after that, they discovered that kids were playing with it. A kindergarten teacher gave it to some kids, they took out the chemicals, put in color, and that’s Play-Doh.

Brian Bell (00:32:51): amazing what’s a product you were sure would work but failed what did you miss I

Bill Albert (00:32:57): cannot honestly there’s no product trained to be super skeptical and so there’s

Brian Bell (00:33:02): nothing that wow that’s good I have one that map-based report you know there’s probably others if I wrap my brain around it but what’s the most misleading metric founders rely on when validating ideas

Bill Albert (00:33:09): I think people focus too much on product launch, treating "we launched the product" as success, instead of metrics more tied to their actual success, whether it’s that first revenue or hitting certain financial goals, things like that. That’s usually more on more established product innovation teams. The other is focusing too much on speed, running so fast that they’re ultimately wasting time.

Brian Bell (00:33:40): They can be running really fast in the wrong direction. Getting nowhere fast, I think, is the phrase. What’s one question every founder should ask users but almost never does? We covered a couple of variations of that in our discussion, but yeah.

Bill Albert (00:33:52): Yeah, sure. I will say founders often don’t deeply appreciate the context of use. They’re used to just talking to people on computers. They’re not going out into the field; they’re not understanding how somebody might use their product in the wild. That can give you a lot of important insights.

Brian Bell (00:34:08): How do you know a product or startup has product market fit?

Bill Albert (00:34:13): Your level of confidence or certainty should be moving up as you go from pain to idea to solution. Once you’re getting into more high-fidelity prototypes, you should be quite confident, because you have those behavioral signals. That’s what you need at that point. That’s the only way you’re really going to be very, very confident.

Brian Bell (00:34:41): And I’ll ask that same question in a different way. What’s a belief about product market fit that you strongly disagree with?

Bill Albert (00:34:49): Yeah, I would probably go back to speed. I am all for working fast, but I think people are just moving too quickly, and they’re not weighing the right things in the right way. They’re not looking at the data that’s going to be most predictive of success.

Brian Bell (00:35:10): Love that. And what is the data that’s most predictive of success, in your experience?

Bill Albert (00:35:13): Well, it’s going to depend on the product. If it’s an enterprise application, it could be a letter of intent, an MOU, or signing some other type of agreement that once the product comes out, they’re going to purchase it. For B2C it could be payments, people paying in advance for something. Those are usually the strongest signals, as opposed to anything people are telling you. They’re going to tell you they love it no matter what, but when you ask for their credit card, it’s a different story.

Brian Bell (00:35:45): You’ve probably dealt with this many times in your multi-decade career, but how do you guard against the executive HIPPO, the highest paid person’s opinion in the room, deciding what to build next, or the seagull who flies in, shits over everything, and flies out? How have you counteracted that, and how should the product managers and UX researchers listening to this counteract it in their corporate lives?

Bill Albert (00:36:10): Sure. I think there are really two ways to counteract that. One is to be this objective, Switzerland-type person and say, it’s not me, I just want to share with you the data that we have. We have 50 people saying they wouldn’t use it, or whatever it is.

Brian Bell (00:36:29): Yeah. We surveyed 100 people and 80 said, hell yeah, I need this. Or 80 people said this would be really painful if it was taken away, or whatever the data is.

Bill Albert (00:36:40): Yep, I am a conduit of the data. The second way, a longer-term play that I really believe in, is relationships: building relationships with senior leadership. Get to know them, have lunch with them, let them understand what you do, and especially the value that you deliver. Once they’ve seen you demonstrate that value, you build more and more trust. They’re more likely to bring you into those conversations and less likely to be that HIPPO or seagull. So build relationships. A lot of what we do, whether it’s UX, CX, or product, is ultimately about relationships.

Brian Bell (00:37:25): So you wrote a book, Measuring the User Experience. What would you rewrite today if you had to release a new edition?

Bill Albert (00:37:32): Well, funny you ask, because I just finished the fourth edition a few months ago. It’s not out yet; I think maybe early fall. For that edition, we included a lot of material around AI, obviously. I even hate to say this, but if sometime down the road there were a fifth edition: the book is really focused on digital products, but a lot of the concepts, a lot of the questions we’re asking, apply to services too. There’s a whole field of service design that’s really emerging, and I would like to ask, what are some different ways we can measure a service experience? Think about your journey through the airport, from when you step into the airport until you get to your destination: all those different touch points, all those different channels. As a field, we’re not really looking at that in a systematic, rigorous way.

Brian Bell (00:38:28): Some people are. But I’m just finishing the first edition of a forthcoming book. I’m not ready to announce it yet, but any advice as I shop it around to publishers, or maybe just upload it to Amazon? What are some takeaways from going through this process?

Bill Albert (00:38:44): I mean, the way we did it early on, going back to Tom Tullis, who was a big mentor for me: I walked into his office one day and said, hey, I have this idea, there’s no book on measuring UX. The whole quant UX thing hadn’t come yet; that was just not a thing back in 2006. To me it was kind of a selfish exercise, just to make sense of all this. He loved the idea, and he happened to be meeting with our publisher, Elsevier, and the acquisitions editor, and pitched the idea. She said, oh wow, I really like this. So we wrote a sample chapter, an outline, all that stuff. For us it was fairly easy and seamless, but I know a lot of people don’t have that experience. I would say if you really want it broadly distributed, find a potential publisher, write one or two sample chapters, and ask them.

Brian Bell: What if the book’s finished?

Bill Albert: Well, there you go, the book’s ready. Then shop it around and see.

Brian Bell: Yeah, I have the proposal and I’m starting to shop it around right now. We’re meeting with publishers this week, and I just wondered if you had any advice.

Bill Albert: Look at the terms carefully. Oftentimes there’s a step: for us, I think it was the first 5,000 books we got some percentage, and then when we hit 7,500 and 10,000 you get a higher percentage the more you sell. Make sure that’s fair.

Brian Bell: Yeah. Well, maybe I’ll reach out when I get the proposal, if I’m lucky enough to get one.

Bill Albert: The hard part is over; you’ve written the book.

Brian Bell: Yeah, that’s the hard part, right? I’m going into the copy editing phase now, which I’m not excited about. It was fun to create the book, and now I have to refine it.

Bill Albert: A lot of publishers have copy editors.

Brian Bell: Yeah, if I get it published, they’ll just take it and copy edit it, right? Give me lots of comments. That’s why I’m waiting to see if I can get somebody to pick it up; then I don’t have to pay a copy editor, which I will do if I need to. What’s the biggest lie founders tell themselves during customer discovery?

Bill Albert: That people love our product, that they can’t live without it. Again, it’s falling in love with the product: how could other people not love it like we do? A lot of founders are blind to that.

Brian Bell: A lot of founders are blind. As a VC, I see thousands of companies every year, and a big problem I see is: I just don’t see anybody needing this. The metrics clearly say nobody needs it, I don’t think it’s a big problem, and clearly your metrics show that. It’s the founder gaslighting themselves into thinking this is the best thing ever and everyone needs it. Like, let me show you a picture of my baby, isn’t he or she the cutest thing you’ve ever seen? Literally, like no baby could be ugly.

Bill Albert: And it’s hard, because even when people tell you, you don’t always hear it. They’ll spend 58 minutes telling you, in many different ways, why they’re not interested, why it’s not for them, or why now is not the right time. And then there are two minutes where they say, yeah, this is really cool, I really like it. And that’s what founders latch on to: those two minutes instead of the 58.

Brian Bell: That’s human nature. We have to be optimists, right?

Bill Albert: Yeah, we do, and it’s good that we are, I suppose. But that’s why having an objective third party, running it through the wringer, whatever you want to call it, matters: to make sure that, back to that first question, should we build it, the answer is definitively yes, there is going to be demand for it. That’s what I’m trying to do: save companies money by not building the wrong product and help them achieve their...

Brian Bell (00:42:42): And then building it correctly. What’s an industry where UX is still massively underdeveloped?

Bill Albert (00:42:47): I don’t think about it by industry; I think about it more in markets. There are some markets where I just don’t see UX being valued. What are those markets? Probably Asia, Africa, South America, outside of the US and Europe.

Brian Bell: That makes sense. They’re a little bit behind generationally.

Bill Albert: Yeah, exactly. Sometimes they just haven’t come around to seeing that what you’re selling is an experience. Usually it’s an experience. Some people call this the experience economy, right? Starbucks is not selling coffee. They’re selling an experience that happens to be around coffee. It’s not just a cup of coffee. The faster people recognize that, the more it’s going to shift their mindset.

Brian Bell (00:43:46): Yeah. Tell me about your favorite product or experience.

Bill Albert (00:43:49): My favorite product. That’s...

Brian Bell (00:43:52): It’s an interview question, but I thought it was appropriate here.

Bill Albert (00:43:56): I mean, there are different apps I use, but I’m very transactional. I’m not a person who gets excited about technology; I’m excited about the experience around it.

Brian Bell: Was there a time you were wowed, where you just used something and thought, wow, this is amazing?

Bill Albert: There’s an app called AllTrails. If anyone likes hiking or being in the outdoors, it has literally every trail in every park and every place, certainly in the US and abroad as well. It tells you everything: it shows you the trail profile, all these reviews, when to go. And when you’re out there walking, you have your phone and can see where you are and how much more elevation you have. It’s really handy. It’s really opened us up to exploring more around where we live, and it’s a technology that checks a lot of my boxes around an experience that matters a lot to me.

Brian Bell (00:44:55): And last question: what do you want your legacy to be?

Bill Albert (00:44:59): You know, when I think about my career, there have really been two big themes. For about the first 20-something years, it was measurement and data. Then about four or five years ago, it pivoted toward product validation, and that’s really what I’m focused on now. One certainly feeds into the other. In terms of legacy: one, I’ve done a lot of mentoring. I’ve probably mentored over 100 graduate students, and I feel really grateful for that opportunity and for all the people I’ve helped in their careers. The second part of the legacy would be helping move our field forward, helping with the whole quant UX movement through our books and our research. And now I want to help make product validation a more rigorous, scientific, reliable way of de-risking. So ultimately, it’s about the people, and it’s about helping organizations as well.

Brian Bell: Really enjoyed talking to you, Bill.

Brian Bell (00:46:06): Where can folks find you online?

Bill Albert (00:46:07): Greenlightidealab.com. Please reach out; I’d love to hear from you. Thanks so much, really enjoyed it. Thank you.
