
Experiencing Data w/ Brian T. O’Neill (UX for AI Products, Analytics SAAS and Data Product Management) (Brian T. O’Neill from Designing for Analytics)

Explore every episode of Experiencing Data w/ Brian T. O’Neill (UX for AI Products, Analytics SAAS and Data Product Management)

Dive into the complete episode list for Experiencing Data w/ Brian T. O’Neill (UX for AI Products, Analytics SAAS and Data Product Management). Each episode is cataloged with detailed descriptions, making it easy to find and explore specific topics. Keep track of all episodes from your favorite podcast and never miss a moment of insightful content.


Pub. Date | Title | Duration
22 Aug 2023 | 124 - The PiCAA Framework: My Method to Generate ML/AI Use Cases from a UX Perspective | 00:21:51

In this episode, I give an overview of my PiCAA Framework, which I shared during my keynote talk at Netguru’s annual conference, Burning Minds. The framework helps with brainstorming machine learning use cases, or reverse engineering them starting with the tactic. Throughout the episode, I give context on the preliminary types of work and preparation you and your team would want to do before implementing PiCAA, the process and potential pitfalls you may run into, and the end results that make it a beneficial tool to experiment with.

 

Highlights/ Skip to:

  • Where/ how you might implement the PiCAA Framework (1:22)
  • Focusing on the human part of your ideas (5:04)
  • Keynote excerpt outlining the PiCAA Framework (7:28)
  • Closing a PiCAA workshop by exploring what could go wrong (18:03)
Links
21 Feb 2023 | 111 - Designing and Monetizing Data Products Like a Startup with Yuval Gonczarowski | 00:33:15

Today I’m chatting with Yuval Gonczarowski, Founder & CEO of the startup Akooda. Yuval is a self-described “socially capable nerd” who has learned how to understand and meet the needs of his customers outside of a purely data-driven lens. Yuval describes how Akooda is able to solve a universal data challenge for leaders who don’t have complete visibility into how their teams are working, and also explains why it’s important that Akooda provide those data insights without bias. Yuval and I also explore why it’s so challenging to find great product leaders, and his rule for getting useful feedback from customers and stakeholders.

 

Highlights/ Skip to:

  • Yuval describes what Akooda does (00:35)
  • The types of technical skills Yuval had to move away from to adopt better leadership capabilities within a startup (02:15)
  • Yuval explains how Akooda solves what he sees as a universal data problem for anyone in management positions (04:15)
  • How Akooda goes about designing for multiple user types (personas) (06:29)
  • Yuval describes how using Akooda internally (dogfooding!) helps inform their design strategy for various use cases (09:09)
  • The different strategies Akooda employs to ensure they receive honest and valuable feedback from their customers (11:08)
  • Yuval explains the three sales cycles that Akooda goes through to ensure their product is properly adapted to both their buyers and the end users of their tool (15:37)
  • How Yuval learned the importance of providing data-driven insights without a bias of whether the results are good or bad (18:22)
  • Yuval describes his core leadership values and why he feels a product can never be simple enough (24:22)
  • The biggest learnings Yuval had when building Akooda and what he’d do differently if he had to start from scratch (28:18)
  • Why Yuval feels being the first Head of Product that reports to a CEO is both a very difficult position to be in and a very hard hire to get right (29:16)
Quotes from Today’s Episode
  • “Re: moving from a technical to product role: My first inclination would be straight up talk about the how, but that’s not necessarily my job anymore. We want to talk about the why and how does the customer perceive things, how do they look at things, how would they experience this new feature? And in a sense, [that’s] my biggest change in the way I see the world.” — Yuval Gonczarowski (03:01)
  • “We are a very data-driven organization. Part of it is our DNA, my own background. When you first start a company and you’re into your first handful of customers, a lot of decisions have to be made based on gut feelings, sort of hypotheses, scenarios… I’ve lived through this pain.” — Yuval Gonczarowski (09:43)

 

  • “I don’t believe I will get honest feedback from a customer if I don’t hurt their pocket. If you want honest feedback [from customers], you got to charge.” — Yuval Gonczarowski (11:38)
  • “Engineering is the most expensive resource we have. Whenever we allocate engineering resources, they have to be something the customer is going to use.” – Yuval Gonczarowski (13:04)

 

  • When selling a data product: “If you don’t build the right collateral and the right approach and mindset to the fact that it’s not enough when the contract is signed, it’s actually these three sales cycles of making sure that customer adoption is done properly, then you haven’t finished selling. Contract is step one, installation is step two, usage is step three. Until step three is done, [you] haven’t really sold the product.” — Yuval Gonczarowski (16:59)

 

  • “By definition, all products are too complex. And it’s always tempting to add another button, another feature, another toggle. Let’s see what we can remove to make it easier.” – Yuval Gonczarowski (26:35)
Links
19 Sep 2023 | 126 - Designing a Product for Making Better Data Products with Anthony Deighton | 00:47:38

Today I’m joined by Anthony Deighton, General Manager of Data Products at Tamr. Throughout our conversation, Anthony unpacks his definition of a data product and we discuss whether or not he feels that Tamr itself is actually a data product. Anthony shares his views on why it’s so critical to focus on solving for customer needs and not simply the newest and shiniest technology. We also discuss the challenges that come with building a product that’s designed to facilitate the creation of better internal data products, as well as where we are in this new wave of data product management, and the evolution of the role.

 

Highlights/ Skip to:

  • I introduce Anthony, General Manager of Data Products at Tamr, and the topics we’ll be discussing today (00:37)
  • Anthony shares his observations on how BI analytics are an inch deep and a mile wide due to the data that’s being input (02:31)
  • Tamr’s focus on data products and how that reflects in Anthony’s recent job change from Chief Product Officer to General Manager of Data Products (04:35)
  • Anthony’s definition of a data product (07:42)
  • Anthony and I explore whether he feels that decision support is necessary for a data product (13:48)
  • Whether or not Anthony feels that Tamr qualifies as a data product (17:08)
  • Anthony speaks to the importance of focusing on outcomes and benefits as opposed to endlessly knitting together features and products (19:42)
  • The challenges Anthony sees with metrics like Propensity to Churn (21:56)
  • How Anthony thinks about design in a product like Tamr (30:43)
  • Anthony shares how data science at Tamr is a tool in his toolkit and not viewed as a “fourth” leg of the product triad/stool (36:01)
  • Anthony’s views on where we are in the evolution of the DPM role (41:25)
  • What Anthony would do differently if he could start over at Tamr knowing what he knows now (43:43)
Links
06 Aug 2024 | 149 - What the Data Says About Why So Many Data Science and AI Initiatives Are Still Failing to Produce Value with Evan Shellshear | 00:50:18

Guess what? Data science and AI initiatives are still failing here in 2024—despite widespread awareness. Is that news? Candidly, you’ll hear me tell Evan Shellshear—author of the new book Why Data Science Projects Fail: The Harsh Realities of Implementing AI and Analytics—how much I originally didn’t want to cover this story on my podcast, because it’s not news! However, what is news is what the data behind Evan’s findings says—and guess what? It’s not the technology.

 

In our chat, Evan shares why he wanted to take a human approach to understanding the root cause of multiple organizations’ failures, and how this approach highlighted the disconnect between data scientists and decision-makers. He explains the human factors at play, such as poor problem surfacing and organizational culture challenges—and how these human-centered design skills are rarely taught or offered to data scientists. The conversation delves into why these failures are more prevalent in data science compared to other fields, attributing it to the complexity and scale of data-related problems. We also discuss how analytically mature companies can mitigate these issues through strategic approaches and stakeholder buy-in. Join us as we dig into these critical insights for improving data science project outcomes.

 

 

Highlights/ Skip to:
  • (4:45) Why are data science projects still failing?
  • (9:17) Why is the disconnect between data scientists and decision-makers so pronounced relative to, say, engineering?
  • (13:08) Why are data scientists not getting enough training for real-world problems?
  • (16:18) What the data says about failure rates for mature data teams vs. immature data teams
  • (19:39) How to change people’s opinions so they value data more
  • (25:16) What happens at the stage where the beneficiaries of data don’t actually see the benefits?
  • (31:09) What are the skills needed to prevent a repeating pattern of creating data products that customers ignore?
  • (37:10) Where do more mature organizations find non-technical help to complement their data science and AI teams?
  • (41:44) Are executives and directors aware of the skills needed to level up their data science and AI teams?

 

Quotes from Today’s Episode
  • “People know this stuff. It’s not news anymore. And so, the reason why we needed this was really to dig in. And exactly like you did, like, keeping that list of articles is brilliant, and knowing what’s causing the failures and what’s leading to these issues still arising is really important. But at some point, we need to approach this in a scientific fashion, and we need to unpack this, and we need to really delve into the details beyond just the headlines and the articles themselves. And start collating and analyzing this to properly figure out what’s going wrong, and what do we need to do about it to fix it once and for all so you can stop your endless collection, and the AI Incident Database that now has over 3500 entries. It can hang its hat and say, ‘I’ve done my job. It’s time to move on. We’re not failing as we used to.’” - Evan Shellshear (3:01)
  • "What we did is we took a number of different studies, and we split companies into what we saw as being analytically mature—and this is a common, well-known thing; there are many maturity frameworks exist across data, across AI, across all different areas—and what we call analytically immature, so those companies that probably aren’t there yet. And what we wanted to draw a distinction is okay, we say 80% of projects fail, or whatever the exact number is, but for who? And for what stage and for what capability? And so, what we then went and did is we were able to take our data and look at which failures are common for analytically immature organizations, and which failures are common for analytically mature organizations, and then we’re able to understand, okay, in the market, how many organizations do we think are analytically mature versus analytically immature, and then we were able to take that 80% failure rate and establish it. For analytically mature companies, the failure rate is probably more like 40%. For analytically immature companies, it’s over 90%, right? And so, you’re exactly right: organizations can do something about it, and they can build capabilities in to mitigate this. So definitely, it can be reduced. Definitely, it can be brought down. You might say, 40% is still too high, but it proves that by bringing in these procedures, you’re completely correct, that it can be reduced.” - Evan Shellshear (14:28)
  • "What happens with the data science person, however, is typically they’re seen as a cost center—typically, not always; nowadays, that dialog is changing—and what they need to do is find partners across the other parts of the business. So, they’re going to go into the supply chain team, they’ll go into the merchandising team, they’ll go into the banking team, they’ll go into the other teams, and they’re going to find their supporters and winners there, and they’re going to probably build out from there. So, the first step would likely be, if you’re a big enough organization that you’re not having that strategy the executive level is to find your friends—and there will be some of the organization who support this data strategy—and get some wins for them.” - Evan Shellshear (24:38)
  • “It’s not like there’s this box you put one in the other in. Because, like success and failure, there’s a continuum. And companies as they move along that continuum, just like you said, this year, we failed on the lack of executive buy-in, so let’s fix that problem. Next year, we fail on not having the right resources, so we fix that problem. And you move along that continuum, and you build it up. And at some point as you’re going on, that failure rate is dropping, and you’re getting towards that end of the scale where you’ve got those really capable companies that live, eat, and breathe data science and analytics, and so have to have these to be able to survive, otherwise a simple company evolution would have wiped them out, and they wouldn’t exist if they didn’t have that capability, if that’s their core thing.” - Evan Shellshear (18:56)
  • “Nothing else could be correct, right? This subjective intuition and all this stuff, it’s never going to be as good as the data. And so, what happens is, is you, often as a data scientist—and I’ve been subjected to this myself—come in with this arrogance, this kind of data-driven arrogance, right? And it’s not a good thing. It puts up barriers, it creates issues, it separates you from the people.” - Evan Shellshear (27:38)
  • "Knowing that you’re going to have to go on that journey from day one, you can’t jump from level zero to level five. That’s what all these data maturity models are about, right? You can’t jump from level zero data maturity to level five overnight. You really need to take those steps and build it up.” - Evan Shellshear (45:21)
  • "What we’re talking about, it’s not new. It’s just old wine in a new skin, and we’re just presenting it for the data science age." - Evan Shellshear (48:15)

 

Links
10 Jan 2023 | 108 - Google Cloud’s Bruno Aziza on What Makes a Good Customer-Obsessed Data Product Manager | 00:50:43

Today I’m chatting with Bruno Aziza, Head of Data & Analytics at Google Cloud. Bruno leads a team of outbound product managers in charge of BigQuery, Dataproc, Dataflow, and Looker, and we dive deep into what Bruno looks for in terms of skills for these leaders. Bruno describes the three patterns of operational alignment he’s observed in data product management, as well as why he feels ownership and customer obsession are two of the most important qualities a good product manager can have. Bruno and I also dive into how to effectively abstract the core problem you’re solving, as well as how to determine whether a problem might be solved in a better way.

 

Highlights / Skip to:

  • Bruno introduces himself and explains how he created his “CarCast” podcast (00:45)
  • Bruno describes his role at Google, the product managers he leads, and the specific Google Cloud products in his portfolio (02:36)
  • What Bruno feels are the most important attributes to look for in a good data product manager (03:59)
  • Bruno details how a good product manager focuses on not only the core problem, but how the problem is currently solved and whether or not that’s acceptable (07:20)
  • What effective abstracting the problem looks like in Bruno’s view and why he positions product management as a way to help users move forward in their career (12:38)
  • Why Bruno sees extracting value from data as the number one pain point for data teams and their respective companies (17:55)
  • Bruno gives his definition of a data product (21:42)
  • The three patterns Bruno has observed of operational alignment when it comes to data product management (27:57)
  • Bruno explains the best practices he’s seen for cross-team goal setting and problem-framing (35:30)

 

Quotes from Today’s Episode

 

  • “What’s happening in the industry is really interesting. For people that are running data teams today and listening to us, the makeup of their teams is starting to look more like what we do [in] product management.” — Bruno Aziza (04:29)
  • “The problem is the problem, so focus on the problem, decompose the problem, look at the frictions that are acceptable, look at the frictions that are not acceptable, and look at how by assembling a solution, you can make it most seamless for the individual to go out and get the job done.” – Bruno Aziza (11:28)

 

  • “As a product manager, yes, we’re in the business of software, but in fact, I think you’re in the career management business. Your job is to make sure that whatever your customer’s job is that you’re making it so much easier that they, in fact, get so much more done, and by doing so they will get promoted, get the next job.” – Bruno Aziza (15:41)

 

  • “I think that is the task of any technology company, of any product manager that’s helping these technology companies: don’t be building a product that’s looking for a problem. Just start with the problem back and solution from that. Just make sure you understand the problem very well.” – Bruno Aziza (19:52)

 

  • “If you’re a data product manager today, you look at your data estate and you ask yourself, ‘What am I building to save money? When am I building to make money?’ If you can do both, that’s absolutely awesome. And so, the data product is an asset that has been built repeatedly by a team and generates value out of data.” – Bruno Aziza (23:12)

 

  • “[Machine learning is] hard because multiple teams have to work together, right? You got your business analyst over here, you’ve got your data scientists over there, they’re not even the same team. And so, sometimes you’re struggling with just the human aspect of it.” – Bruno Aziza (30:30)

 

  • “As a data leader, an IT leader, you got to think about those soft ways to accomplish the stuff that’s binary, that’s the hard [stuff], right? I always joke, the hard stuff is the soft stuff for people like us because we think about data, we think about logic, we think, ‘Okay if it makes sense, it will be implemented.’ For most of us, getting stuff done is through people. And people are emotional, how can you express the feeling of achieving that goal in emotional value?” – Bruno Aziza (37:36)

 

Links
11 Jan 2022 | 082 - What the 2021 $1M Squirrel AI Award Winner Wants You To Know About Designing Interpretable Machine Learning Solutions w/ Cynthia Rudin | 00:37:55
Episode Description

As the conversation around AI continues, Professor Cynthia Rudin, Computer Scientist and Director of the Prediction Analysis Lab at Duke University, is here to discuss interpretable machine learning and her incredible work in this complex and evolving field. Notably, she is the most recent (2021) recipient of the $1M Squirrel AI Award for her work on making machine learning more interpretable to users and ultimately more beneficial to humanity.

In this episode, we explore the distinction between explainable and interpretable machine learning and how black boxes aren’t necessarily “better” than more interpretable models. Cynthia offers up real-world examples to illustrate her perspective on the role of humans and AI, and shares takeaways from her previous work, which ranges from predicting criminal recidivism to predicting manhole cover explosions in NYC (yes!). I loved this chat with her because, for one, Cynthia has strong, heavily informed opinions from her concentrated work in this area, and secondly, because she is thinking about both the end users of ML applications as well as the humans who are “out of the loop” but nonetheless impacted by the decisions made by the users of these AI systems.

In this episode, we cover:

  • Background on the Squirrel AI Award – and Cynthia unpacks the differences between Explainable and Interpretable ML. (00:46)
  • Using real-world examples, Cynthia demonstrates why black boxes should be replaced. (04:49)
  • Cynthia’s work on the New York City power grid project, exploding manhole covers, and why it was the messiest dataset she had ever seen. (08:20)
  • A look at the future of machine learning and the value of human interaction as it moves into the next frontier. (15:52)
  • Cynthia’s thoughts on collecting end-user feedback and keeping humans in the loop. (21:46)
  • The current problems Cynthia and her team are exploring—the Rashomon set, optimal sparse decision trees, sparse linear models, causal inference, and more. (32:33)
Quotes from Today’s Episode
  • “I’ve been trying to help humanity my whole life with AI, right? But it’s not something I tried to earn because there was no award like this in the field while I was trying to do all of this work. But I was just totally amazed, and honored, and humbled that they chose me.”- Cynthia Rudin on receiving the AAAI Squirrel AI Award. (@cynthiarudin) (1:03)
  • “Instead of trying to replace the black boxes with inherently interpretable models, they were just trying to explain the black box. And when you do this, there's a whole slew of problems with it. First of all, the explanations are not very accurate—they often mislead you. Then you also have problems where the explanation methods are giving more authority to the black box, rather than telling you to replace them.”- Cynthia Rudin (@cynthiarudin) (03:25)
  • “Accuracy at all costs assumes that you have a static dataset and you’re just trying to get as high accuracy as you can on that dataset. [...] But that is not the way we do data science. In data science, if you look at a standard knowledge discovery process, [...] after you run your machine learning technique, you’re supposed to interpret the results and use that information to go back and edit your data and your evaluation metric. And you update your whole process and your whole pipeline based on what you learned. So when people say things like, ‘Accuracy at all costs,’ I’m like, ‘Okay. Well, if you want accuracy for your whole pipeline, maybe you would actually be better off designing a model you can understand.’”- Cynthia Rudin (@cynthiarudin) (11:31)
  • “When people talk about the accuracy-interpretability trade-off, it just makes no sense to me because it’s like, no, it’s actually reversed, right? If you can actually understand what this model is doing, you can troubleshoot it better, and you can get overall better accuracy.“- Cynthia Rudin (@cynthiarudin) (13:59)
  • “Humans and machines obviously do very different things, right? Humans are really good at having a systems-level way of thinking about problems. They can look at a patient and see things that are not in the database and make decisions based on that information, but no human can calculate probabilities really accurately in their heads from large databases. That’s why we use machine learning. So, the goal is to try to use machine learning for what it does best and use the human for what it does best. But if you have a black box, then you’ve effectively cut that off because the human has to basically just trust the black box. They can’t question the reasoning process of it because they don’t know it.”- Cynthia Rudin (@cynthiarudin) (17:42)
  • “Interpretability is not always equated with sparsity. You really have to think about what interpretability means for each domain and design the model to that domain, for that particular user.”- Cynthia Rudin (@cynthiarudin) (19:33)
  • “I think there's sometimes this perception that there's the truth from the data, and then there's everything else that people want to believe about whatever it says.”- Brian T. O’Neill (@rhythmspice) (23:51)
  • “Surveys have their place, but there's a lot of issues with how we design surveys to get information back. And what you said is a great example, which is 7 out of 7 people said, ‘this is a serious event.’ But then you find out that they all said serious for a different reason—and there's a qualitative aspect to that. […] The survey is not going to tell us if we should be capturing some of that information if we don't know to ask a question about that.”- Brian T. O’Neill (@rhythmspice) (28:56)
Links
06 Feb 2024 | 136 - Navigating the Politics of UX Research and Data Product Design with Caroline Zimmerman | 00:44:16

This week I’m chatting with Caroline Zimmerman, Director of Data Products and Strategy at Profusion. Caroline shares her journey through the school of hard knocks that led to her discovery that incorporating more extensive UX research into the data product design process improves outcomes. We explore the complicated nature of discovering and building a better design process, how to engage end users so they actually make time for research, and why understanding how to navigate interdepartmental politics is necessary in the world of data and product design. Caroline reveals the pivotal moment that changed her approach to data product design, as well as her learnings from evolving data products alongside users as their needs and business strategies change. Lastly, we explore what the future of data product leadership looks like, and Caroline shares why there’s never been a better time to work in data.

Highlights/ Skip to:

  • Intros and Caroline describes how she learned crucial lessons on building data products the hard way (00:36)
  • The fundamental moment that helped Caroline to realize that she needed to find a different way to uncover user needs (03:51)
  • How working with great UX researchers influenced Caroline’s approach to building data products (08:31)
  • Why Caroline feels that exploring the ‘why’ is foundational to designing a data product that gets adopted (10:25)
  • Caroline’s experience building a data model for a client and what she learned from that experience when the client’s business model changed (14:34)
  • How Caroline addresses the challenge of end users not making time for user research (18:00)
  • A high-level overview of the UX research process when Caroline’s team starts working with a new client (22:28)
  • The biggest challenges that Caroline faces as a Director of Data Products, and why data products require the ability to navigate company politics and interests (29:58)
  • Caroline describes the nuances of working with different stakeholder personas (35:15)
  • Why data teams need to embrace a more human-led approach to designing data products and focus less on metrics and the technical aspects (38:10)
  • Caroline’s closing thoughts on what she’d like to share with other data leaders and how you can connect with her (40:48)
Quotes from Today’s Episode
  • “When I was first starting out, I thought that you could essentially take notes on what someone was asking for, go off and build it to their exact specs, and be successful. And it turns out that you can build something to exact specs and suffer from poor adoption and just not be solving problems because I did it as a wish fulfillment, laundry-list exercise rather than really thinking through user needs.” — Caroline Zimmerman (01:11)
  • “People want a thing. They’re paying for a thing, right? And so, just really having that reflex to try to gently come back to that why and spending sufficient time exploring it before going into solution build, even when people are under a lot of deadline pressure and are paying you to deliver a thing [is the most important element of designing a data product].” – Caroline Zimmerman (11:53)
  • “A data product evolves because user needs change, business models change, and business priorities change, and we need to evolve with it. It’s not like you got it right once, and then you’re good for life. At all.” – Caroline Zimmerman (17:48)
  • “I continue to have lots to learn about stakeholder management and understanding the interplay between what the organization needs to be successful, but also, organizations are made up of people with personal interests, and you need to understand both.” – Caroline Zimmerman (30:18)
  • “Data products are built in a political context. And just being aware of that context is important.” – Caroline Zimmerman (32:33)
  • “I think that data, maybe more than any other function, is transversal. I think data brings up politics because, especially with larger organizations, there are those departmental and team silos. And the whole thing about data is it cuts through those because it touches all the different teams. It touches all the different processes. And so in order to build great data products, you have to be navigating that political context to understand how to get things done transversely in organizations where most stuff gets done vertically.” – Caroline Zimmerman (34:37)
  • “Data leadership positions are data product expertise roles. And I think that often it’s been more technical people that have advanced into those roles. If you follow the LinkedIn-verse in data, it’s very much on every data leader’s mind at the moment:  how do you articulate benefits to your CEO and your board and try to do that before it’s too late? So, I’d say that’s really the main thing and that there’s just never been a better time to be a data product person.” – Caroline Zimmerman (37:16)
Links
10 Dec 2024 | 158 - From Resistance to Reliance: Designing Data Products for Non-Believers with Anna Jacobson of Operator Collective | 00:43:41

After getting started in construction management, Anna Jacobson traded in the hard hat for the world of data products and operations at a VC company. Anna, who has a structural engineering undergrad and a masters in data science, is also a Founding Member of the Data Product Leadership Community (DPLC). However, her work with data products is more “accidental” and is just part of her responsibility at Operator Collective. Nonetheless, Anna had a lot to share about building data products, dashboards, and insights for users—including resistant ones! 

 

 

That resistance is precisely what I wanted to talk to her about in this episode: how does Anna get somebody to adopt a data product to which they may be apathetic, if not completely resistant?

 

 

At the end of the episode, Anna gives us a sneak peek at what she’s planning to talk about in our final 2024 live DPLC group discussion coming up on 12/18/2024.

We covered:
  • (1:17) Anna's background and how she got involved with data products
  • (3:32) The ways Anna applied her experiences working in construction management to her current work with data products at a VC firm
  • (5:32) Explaining one of the main data products she works on at Operator Collective
  • (9:55) How Anna defines success for her data products
  • (15:21) The process of designing data products for "non-believers"
  • (21:08) How to think about "super users" and their feedback on a data product
  • (27:11) How a company's cultural problems can be a blocker for product adoption
  • (38:21) A preview of what you can expect from Anna's talk and live group discussion in the DPLC
  • (40:24) Closing thoughts from Anna
  • (42:54) Where you can find more from Anna
Quotes from Today’s Episode
  • “People working with data products are always thinking about how to [gain user adoption of their product]... I can’t think of a single one where [all users] were immediately on board. There’s a lot to unpack in what it takes to get non-believers on board, and it’s something that none of us ever get any training on. You just learn through experience, and it’s not something that most people took a class on in college. All of the social science around what we do gets really passed over for all the technical stuff. It takes thinking through and understanding where different [users] are coming from, and [understanding] that my perspective alone is not enough to make it happen.” - Anna Jacobson (16:00)
  • ​​“If you only bring together the super users and don’t try to get feedback from the average user, you are missing the perspective of the person who isn’t passionate about the product. A non-believer is someone who is just over capacity. They may be very hard-working, they may be very smart, but they just don’t have the bandwidth for new things. That’s something that has to be overcome when you’re putting a new product into place.” - Anna Jacobson (22:35)
  • “If a company can’t find budget to support [a data product], that’s a cultural decision. It’s not a financial decision. They find the money for the things that they care about. Solving the technology challenge is pretty easy, but you have to have a company that’s motivated to do that. If you want to implement something new, be it a data product or any change in an organization, identifying the cultural barriers and figuring out how to bring [people in an organization] on board is the crux of it. The money and the technology can be found.” - Anna Jacobson (27:58)
  • “I think people are actually very bad at explaining what they want, and asking people what they want is not helpful. If you ask people what they want to do, then I think you have a shot at being able to build a product that does [what they want]. The executive sponsors typically have a very different perspective on what the product [should be] than the users do. If all of your information is getting filtered through the executive sponsor, you’re probably not getting the full picture” - Anna Jacobson (31:45)
  • “You want to define what the opportunity is, the problem, the solution, and you want to talk about costs and benefits. You want to align [the data product] with corporate strategy, and those things are fairly easy to map out. But as you get down to the user, what they want to know is, ‘How is this going to make my life easier? How is this going to make [my job] faster? How is it going to result in better outcomes?’ They may have an interest in how it aligns with corporate strategy, but that’s not what’s going to motivate them. It’s really just easier, faster, better.” - Anna Jacobson (35:00)

 

 

Links Referenced

LinkedIn: https://www.linkedin.com/in/anna-ching-jacobson/

DPLC (Data Product Leadership Community): https://designingforanalytics.com/community

14 Dec 2021 | 080 – How to Measure the Impact of Data Products…and Anything Else with Forecasting and Measurement Expert Doug Hubbard | 00:46:00

Finding it hard to know the value your data products bring to the business or your end users? Do you struggle to understand the impact your data science, analytics, or product team is having on the people they serve?

Many times, the challenge comes down to figuring out WHAT to measure, and HOW. Clients, users, and customers often don’t even know what the right success or progress metrics are, let alone how to quantify them. Learning how to measure what might seem impossible is a highly valuable skill for leaders who want to track their progress with data—but it’s not all black and white. It’s not always about “more data,” and measurement is also not about “the finite, right answer.” Analytical minds: get ready to embrace subjectivity and uncertainty in this episode!

In this insightful chat, Doug and I explore examples from his book, How to Measure Anything, and we discuss its applicability to the world of data and data products. From defining trust to identifying cognitive biases in qualitative research, Doug shares how he views the world in ways that we can actually measure. We also discuss the relationship between data and uncertainty, forecasting, and why people who are trying to measure something usually believe they have a lot less data than they really do. 

In this episode, we cover:
  • A discussion about measurement, defining “trust”, and why it is important to collect data in a systematic way. (01:35)
  • Doug explores “concept, object and methods of measurement” - and why most people have more data than they realize when investigating questions. (09:29)
  • Why asking the right questions is more important than “needing to be the expert” - and a look at cognitive biases. (16:46)
  • The Dunning-Kruger effect and how it applies to the way people measure outcomes - and a discussion of progress metrics vs. success metrics and the illusion of cognition. (25:13)
  • How one of the challenges with machine learning also creates valuable skepticism - and the three criteria for experience to convert into learning. (35:35)
Quotes from Today’s Episode
  • “Often things like trustworthiness or collaboration, or innovation, or any—all the squishy stuff, they sound hard to measure because they’re actually an umbrella term that bundles a bunch of different things together, and you have to unpack it to figure out what it is you’re talking about. The beginning of all scientific inquiry is to figure out what your terms mean; what question are you even asking?”- Doug Hubbard (@hdr_frm) (02:33)

  • “Another interesting phenomenon about measurement in general and uncertainty is that it’s in the cases where you have a lot of uncertainty that you don’t need many data points to greatly reduce it. [People] might assume that if [they] have a lot of uncertainty about something, that [they are] going to need a lot of data to offset that uncertainty. Mathematically speaking, just the opposite is true. The more uncertainty you have, the bigger uncertainty reduction you get from the first observation. In other words, if, you know almost nothing, almost anything will tell you something. That’s the way to think of it.”- Doug Hubbard (@hdr_frm) (07:05)

  • “I think one of the big takeaways there that I want my audience to hear is that if we start thinking about when we’re building these solutions, particularly analytics and decision support applications, instead of thinking about it as we’re trying to give the perfect answer here, or the model needs to be as accurate as possible, changing the framing to be, ‘if we went from something like a wild-ass guess, to maybe my experience and my intuition, to some level of data, what we’re doing here is we’re chipping away at the uncertainty, right?’ We’re not trying to go from zero to 100. Zero to 20 may be a substantial improvement if we can just get rid of some of that uncertainty, because no solution will ever predict the future perfectly, so let’s just try to reduce some of that uncertainty.”- Brian T. O’Neill (@rhythmspice) (08:40)

  • “So, this is really important: [...] you have more data than you think, and you need less than you think. People just throw up their hands far too quickly when it comes to measurement problems. They just say, ‘Well, we don’t have enough data for that.’ Well, did you look? Tell me how much time you spent actually thinking about the problem or did you just give up too soon? [...] Assume there is a way to measure it, and the constraint is that you just haven’t thought of it yet. ”- Doug Hubbard (@hdr_frm) (15:37)
  • “I think people routinely believe they have a lot less data than they really do. They tend to believe that each situation is more unique than it really is [to the point] that you can’t extrapolate anything from prior observations. If that were really true, your experience means nothing.”- Doug Hubbard (@hdr_frm) (29:42)

  • “When you have a lot of uncertainty, that’s exactly when you don’t need a lot of data to reduce it significantly. That’s the general rule of thumb here. [...] If what we’re trying to improve upon is just the subjective judgment of the stakeholders, all the research today—and by the way, here’s another area where there’s tons of data—there’s literally hundreds of studies where naive statistical models are compared to human experts […] and the consistent finding is that even naive statistical models outperform human experts in a surprising variety of fields.”- Doug Hubbard (@hdr_frm) (32:50)
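A worked example of Doug’s “you have more data than you think, and you need less than you think” point, taken from his book How to Measure Anything (a sketch we’ve added for illustration; the derivation isn’t walked through in the episode): his “Rule of Five” says the median of any population falls between the smallest and largest values of a random sample of five with 93.75% confidence. Each independent draw has a 1/2 chance of landing above the median, so

\[ P(\text{all five on one side of the median}) = 2 \times \left(\tfrac{1}{2}\right)^{5} = \tfrac{1}{16} = 6.25\%. \]

In other words, when uncertainty is high, even five observations deliver a large reduction, exactly the spirit of the (07:05) and (15:37) quotes above.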
Links Referenced

 

28 Dec 2021 | 081 - The Cultural and $ Benefits of Human-Centered AI in the Enterprise: Digging Into BCG/MIT Sloan’s AI Research w/ François Candelon | 00:36:45
Episode Description

The relationship between humans and artificial intelligence has been an intricate topic of conversation across many industries. François Candelon, Global Director at Boston Consulting Group Henderson Institute, has been a significant contributor to that conversation, most notably through an annual research initiative that BCG and MIT Sloan Management Review have been conducting about AI in the enterprise. In this episode, we’re digging particularly into the findings of the 2020 and 2021 studies that were just published at the time of this recording. 

Through these yearly findings, the study has shown that organizations with the most competitive advantage are the ones that are focused on effectively designing AI-driven applications around the humans in the loop. As these organizations continue to generate value with AI, the gap between them and companies that do not embrace AI has only increased. To close this gap, companies will have to learn to design trustworthy AI applications that actually get used, produce value, and are designed around mutual learning between the technology and users. François claims that a “human plus AI” approach—what former Experiencing Data guest Ben Shneiderman calls HCAI (see Ep. 062)—can create organizational learning, trust, and improved productivity.

In this episode, we cover:

  • How the Henderson Institute is conducting its multi-year study with MIT Sloan Management Review. (00:43)
  • The core findings of the 2020 study, what the 10/20/70 rule is, how François uses it to determine a company’s level of successful deployment of AI, and specific examples of what leading companies are doing in terms of user experience around AI. (03:08)
  • The core findings of the 2021 study, and how mutual learning between human and machine (i.e. the experience of learning from and contributing to ML applications) increases the success rate of AI deployments. (07:53)
  • The AI driving license for CxOs: A discussion about the gap between C-suite and data scientists and why it’s critical for teams to be agile and integrate both capabilities. (14:44)
  • Why companies should embed AI as the core of their operating process. (22:07)
  • François’ perspective on leveraging AI and why it is meant to solve problems and impact cultural change. (29:28)
Quotes from Today’s Episode
  • “What makes the real difference is when you have what we call organizational learning, which means that at the same time you learn from AI as an individual, as a human, AI will learn from you. And this is relatively easy to understand because as we’re in a world, which is always more uncertain, the rate of learning, the ability for an organization to learn, is one of the most important competitive advantages.”- François Candelon (04:58)
  • “When there is an additional effectiveness linked to AI, people will feel more comfortable, will feel augmented, not replaced, and then they will trust AI. As they trust, they are ready to have additional use cases implemented and therefore you are entering into a virtuous cycle.”- François Candelon (08:06)
  • “If you try to optimize human plus AI and build on their respective capabilities—humans are much better at dealing with ambiguity and AI deals with large amounts of data. If you’re able to combine both, then you’re in a situation to be ready to create a source of competitive advantage.”- François Candelon (09:36)
  • “I think that’s largely the point of my show and what I’m trying to focus on is to talk to the people who do want to go beyond the technical work. Building technically right, effectively wrong solutions is something nobody needs, and at some point, not only is it not good for your career, but you might find it more rewarding to work on things that actually matter, that get used, that go into the world, that produce value. It’s more personally gratifying, not just for the business, but yourself.”- Brian T. O’Neill (@rhythmspice) (20:55)
  • “Making sure that AI becomes the core of your operating process and your operating model [is] very important. I think that very often companies ask themselves, ‘how could AI help me optimize my process?’ I believe that they should now move—or at least the most advanced—are now moving to, ‘how should I make sure that I redesign my process to get the full potential of AI, to bring AI at the core of my operating model?’”- François Candelon (24:40)
  • “AI is a way to solve problems, not an objective in itself. So, this is why when I used to say we are an AI-enabled or an AI-powered company, it shows a capability. It shows a way of thinking and the ability to deal with the foundational capabilities of AI. It’s not something else. And this is why—for the data scientists that will be open to better understanding business—they will learn a lot, and it will be very enlightening to be able to solve these issues and to solve these problems.”- François Candelon (30:51)
  • “The human in the loops matter, folks. For now at least, we’re still here. It’s not all machines running machines. So, you have to figure out the human-machine interaction. It’s not going away, and so when you’re ready, it’s time to face that we need to design for the human in the loop, and we need to think about the last mile, and we need to think about change, adoption, and all the human factors that go into the solution, as well as the technologies.”- Brian T. O’Neill (@rhythmspice) (35:35)
Links 
31 May 2022 | 092 - How to measure data product value from a UX and business lens (and how not to do it) | 00:34:46

Today I’m talking about how to measure data product value from a user experience and business lens, and where leaders sometimes get it wrong. Today’s first question comes from my recent talk at the Data Summit conference, where an attendee asked how UX design fits into agile data product development. Additionally, a subscriber to my Insights mailing list recently asked how to measure adoption, utilization, and satisfaction of data products. So, we’ll jump into that juicy topic as well. Answering these inquiries also got me on a related tangent about the UX challenges associated with abstracting your platform to support multiple, but often theoretical, user needs—and the importance of collaboration to ensure your whole team is operating from the same set of assumptions or definitions about success. I conclude the episode with the concept of “game framing” as a way to conceptualize these ideas at a high level.

 

Key topics and cues in this episode include: 

  • An overview of the questions I received (0:45)
  • Measuring change once you’ve established a benchmark (7:45) 
  • The challenges of working in abstractions (abstracting your platform to facilitate theoretical future user needs) (10:48)
  • The value of having shared definitions and understanding the needs of different stakeholders/users/customers (14:36)
  • The importance of starting from the “last mile” (19:59)
  • The difference between success metrics and progress metrics (24:31)
  • How measuring feelings can be critical to measuring success (29:27)
  • “Game framing” as a way to understand tracking progress and success (31:22)
Quotes from Today’s Episode
  • “Once you’ve got your benchmark in place for a data product, it’s going to be much easier to measure what the change is because you’ll know where you’re starting from.” - Brian (7:45)
  • “When you’re deploying technology that’s supposed to improve people’s lives so that you can get some promise of business value downstream, this is not a generic exercise. You have to go out and do the work to understand the status quo and what the pain is right now from the user's perspective.” - Brian (8:46)
  • “That user perspective—perception even—is all that matters if you want to get to business value. The user experience is the perceived quality, usability, and utility of the data product.” - Brian (13:07)
  • “A data product leader’s job should be to own the problem and not just the delivery of data product features, applications or technology outputs.” - Brian (26:13)
  • “What are we keeping score of? Different stakeholders are playing different games so it’s really important for the data product team not to impose their scoring system (definition of success) onto the customers, or the users, or the stakeholders.” - Brian (32:05)
  • “We always want to abstract once we have a really good understanding of what people do, as it’s easier to create more user-centered abstractions that will actually answer real data questions later on.” - Brian (33:34)
Links
17 Oct 2023 | 128 - Data Products for Dummies and The Importance of Data Product Management with Vishal Singh of Starburst | 00:53:01

Today I’m joined by Vishal Singh, Head of Data Products at Starburst and co-author of the newly published e-book, Data Products for Dummies. Throughout our conversation, Vishal explains how the variations in definitions for a data product actually led to the creation of the e-book, and we discuss the differences between our two definitions. Vishal gives a detailed description of how he believes Data Product Managers should be conducting their discovery and gathering feedback from end users, and how his team evaluates whether their data products are truly successful and user-friendly.

 

Highlights/ Skip to:

  • I introduce Vishal, the Head of Data Products at Starburst and contributor to the e-book Data Products for Dummies (00:37)
  • Vishal describes how his customers at Starburst all had a common problem, but differing definitions of a data product, which led to the creation of his e-book (01:15)
  • Vishal shares his one-sentence definition of a data product (02:50)
  • How Vishal’s definition of a data product differs from mine, and we both expand on the possibilities between the two (05:33)
  • The tactics Vishal uses to gather useful feedback and ensure the data products he develops are valuable for end users (07:48)
  • Why Vishal finds it difficult to get one-on-one feedback from users during the iteration phase of data product development (11:07)
  • The danger of sunk cost bias in the iteration phase of data product development (13:10)
  • Vishal describes how he views the role of a DPM when it comes to doing effective initial discovery (15:27)
  • How Vishal structures his teams and their interactions with each other and their end users (21:34)
  • Vishal’s thoughts on how design affects both data scientists and end users (24:16)
  • How DPMs at Starburst evaluate if the data product design is user-friendly (28:45)
  • Vishal’s views on where Designers are valuable in the data product development process (35:00)
  • Vishal and I discuss the importance of ensuring your products truly solve your user’s problems (44:44)
  • Where you can learn more about Vishal’s upcoming events and the e-book, Data Products for Dummies (49:48)
Links
05 Apr 2022 | 088 - Doing UX Research for Data Products and The Magic of Qualitative User Feedback with Mike Oren, Head of Design Research at Klaviyo | 00:42:26

Mike Oren, Head of Design Research at Klaviyo, joins today’s episode to discuss how we do UX research for data products—and why qualitative research matters. Mike and I recently met in Lou Rosenfeld’s Quant vs. Qual group, which is for people interested in both qualitative and quantitative methods for conducting user research. Mike goes into the details on how Klaviyo and his teams are identifying what customers need through research, how they use data to get to that point, what data scientists and non-UX professionals need to know about conducting UX research, and some tips for getting started quickly. He also explains how Klaviyo’s data scientists—not just the UX team—are directly involved in talking to users to develop an understanding of their problem space.

Klaviyo is a communications platform that allows customers to personalize email and text messages powered by data. In this episode, Mike talks about how to ask research questions to get at what customers actually need. Mike also offers some excellent “getting started” techniques for conducting interviews (qualitative research), the kinds of things to be aware of and avoid when interviewing users, and some examples of the types of findings you might learn. He also gives us some examples of how these research insights become features or solutions in the product, and how they interpret whether their design choices are actually useful and usable once a customer interacts with them. I really enjoyed Mike’s take on designing data-driven solutions, his ideas on data literacy (for both designers and users), and hearing about the types of dinner conversations he has with his wife, who is an economist ;-). Check out our conversation for Mike’s take on the relevance of research for data products and user experience.

 

In this episode, we cover:

  • Using “small data” such as qualitative user feedback to improve UX and data products—and the #1 way qualitative data beats quantitative data (01:45)
  • Mike explains what Klaviyo is, and gives an example of how they use qualitative information to inform the design of this communications product (03:38)
  • Mike discusses Klaviyo data scientists doing research and their methods for conducting research with their customers (09:45)
  • Mike’s tips on what to avoid when you’re conducting research so you get objective, useful feedback on your data product (12:45)
  • Why dashboards are Mike’s pet peeve (17:45)
  • Mike’s thoughts about data illiteracy, how much design needs to accommodate it, and how design can help with it (22:36)
  • How Mike conveys the research to other teams that help mitigate risk (32:00)
  • Life with an economist! (36:00)
  • What the UX and design community needs to know about data (38:30)

 

Quotes from Today’s Episode
  • “I actually tell my team never to do any qualitative research around preferences…Preferences are usually something that you’re not going to get a reliable enough sample from if you’re just getting it qualitatively, just because preferences do tend to vary a lot from individual to individual; there’s lots of other factors.” - Mike (@mikeoren) (03:05)
  • “[Discussing a product design choice influenced by research findings]: Three options gave [the customers a] feeling of more control. In terms of what actual options they wanted, two options was really the most practical, but the thing was that we weren’t really answering the main question that they had, which was what was going to happen with their data if they restarted the test with a new algorithm that was being used. That was something that we wouldn’t have been able to identify if we were only looking at the quantitative data or only surveying them; we had to get them to voice their concerns about it.” - Mike (@mikeoren) (07:00)
  • “When people create dashboards, they stick everything on there. If a stakeholder within the organization asked for a piece of data, that goes on the dashboard. If one time a piece of information was needed with other pieces of information that are already on the dashboard, that now gets added to the dashboard. And so you end up with dashboards that just have all these different things on them…you no longer have a clear line of signal.” - Mike (@mikeoren) (17:50)
  • “Part of the experience we need to talk about when we talk about experiencing data is that the experience can happen in additional vehicles besides a dashboard: a text message, an email notification—there’s other ways to experience the effects of good, intelligent data product work. Pushing the right information at the right time instead of all the information all the time.” - Brian (@rhythmspice) (20:00)
  • “[Data illiteracy is] everyone’s problem. Depending upon what type of data we’re talking about, and what that product is doing, if an organization is truly trying to make data-driven decisions, but then they haven’t trained their leaders to understand the data in the right way, then they’re not actually making data-driven decisions; they’re really making instinctual decisions, or they’re pretending that they’re using the data.” - Mike (@mikeoren) (23:50)
  • “Sometimes statistical significance doesn’t matter to your end-users. More often than not, organizations aren’t looking for 95% significance. Usually, 80% is actually good enough for most business decisions. Depending upon the cost of getting a high level of confidence, they might not even really value that additional 15% significance.” - Mike (@mikeoren) (31:06)
11 Jul 2023 | 121 - How Sainsbury’s Head of Data Products for Analytics and ML Designs for User Adoption with Peter Everill | 00:39:40

Today I’m chatting with Peter Everill, who is the Head of Data Products for Analytics and ML at the UK grocery brand, Sainsbury’s. Peter is also a founding member of the Data Product Leadership Community. Peter shares insights on why his team spends so much time conducting discovery work with users, and how that leads to higher adoption and, in turn, business value. Peter also gives us his in-depth definition of a data product, including the three components of a data product and the four types of data products he’s encountered. He also shares the 8-step product management methodology that his team uses to develop data products that truly deliver value to end users. Peter also shares the #1 resource he would invest in right now to make things better for his team and their work.

Highlights/ Skip to:

 

  • I introduce Peter, who I met through the Data Product Leadership Community (00:37)
  • What the data team structure at Sainsbury’s looks like and how Peter wound up working there (01:54)
  • Peter shares the 8-step product management methodology that has been developed by his team and where in that process he spends most of his time (04:54)
  • How involved the users are in Peter’s process when it comes to developing data products (06:13)
  • How Peter was able to ensure that enough time is taken on discovery throughout the design process (10:03)
  • Who on Peter’s team is doing the core user research for product development (14:52)
  • Peter shares the three things that he feels make data product teams successful (17:09)
  • How Peter defines a data product, including the three components of a data product and the four types of data products (18:34)
  • Peter and I discuss the importance of spending time in discovery (24:25)
  • Peter explains why he measures reach and impact as metrics of success when looking at implementation (26:18)
  • How Peter solves for the gap when handing off a product to the end users to implement and adopt (29:20)
  • How Peter hires for data product management roles and what he looks for in a candidate (33:31)
  • Peter talks about what roles or skills he’d be looking for if he was to add a new person to his team (37:26)
Quotes from Today’s Episode
  • “I’m a big believer that the majority of analytics in its simplest form is improving business processes and decisions. A big part of our discovery work is that we align to business areas, business divisions, or business processes, and we spend time in that discovery space actually mapping the business process. What is the goal of this process? Ultimately, how does it support the P&L?” — Peter Everill (12:29)
  • “There’s three things that are successful for any organization that will make this work and make it stick. The first is defining what you mean by a data product. The second is the role of a data product manager in the organization and really being clear what it is that they do and what they don’t do. … And the third thing is their methodology, from discovery through to delivery. The more work you put upfront defining those and getting everyone trained and clear on that, I think the quicker you’ll get to an organization that’s really clear about what it’s delivering, how it delivers, and who does what.” – Peter Everill (17:31)
  • “The important way that data and analytics can help an organization firstly is understanding how that organization is performing. And essentially, performance is how well processes and decisions within the organization are being executed, and the impact that has on the P&L.” – Peter Everill (20:24)
  • “The great majority of organizations don’t allocate that percentage [20-25%] of time to discovery; they are jumping straight into solution. And also, this is where organizations typically then actually just migrate what already exists from, maybe, legacy service into a shiny new cloud platform, which might be good from a defensive data strategy point of view, but doesn’t offer new net value—apart from speed, security and et cetera of the cloud. Ultimately, this is why analytics organizations aren’t generally delivering value to organizations.” – Peter Everill (25:37)
  • “The only time that value is delivered is from a user taking action. So, the two metrics that we really focus on with all four data products [are] reach [and impact].” – Peter Everill (27:44)
  • “In terms of benefits realization, that is owned by the business unit. Because ultimately, you’re asking them to take the action. And if they do, it’s their part of the P&L that’s improving because they own the business, they own the performance. So, you really need to get them engaged on the release, and for them to have the superusers, the champions of the product, and be driving voice of the release just as much as the product team.” – Peter Everill (30:30)
  • On hiring DPMs: “Are [candidates] showing the aptitude, do they understand what the role is, rather than the experience? I think data and analytics and machine learning product management is a relatively new role. You can’t go on LinkedIn necessarily, and be exhausted with a number of candidates that have got years and years of data and analytics product management.” – Peter Everill (36:40)
Links
28 Jun 2022 | 094 - The Multi-Million Dollar Impact of Data Product Management and UX with Vijay Yadav of Merck | 00:46:02

Today I sit down with Vijay Yadav, head of the data science team at Merck Manufacturing Division. Vijay begins by relating his own path to adopting a data product and UX-driven approach to applied data science, and our chat quickly turns to the ever-present challenge of user adoption. Vijay discusses his process of designing data products with customers, as well as the impact that building user trust has on delivering business value. We go on to talk about what metrics can be used to quantify adoption and downstream value, and then Vijay discusses the financial impact he has seen at Merck using this user-oriented perspective. While we didn’t see eye to eye on everything, Vijay was able to show how focusing on the last mile UX has had a multi-million dollar impact on Merck. The conversation concludes with Vijay’s words of advice for other data science directors looking to get started with a design and user-centered approach to building data products that achieve adoption and have measurable impact.

 

In our chat, we covered Vijay’s design process, metrics, business value, and more: 

 

  • Vijay shares how he came to approach data science with a data product management approach and how UX fits in (1:52)
  • We discuss overcoming the challenge of user adoption by understanding user thinking and behavior (6:00)
  • We talk about the potential problems and solutions when users self-diagnose their technology needs (10:23)
  • Vijay delves into what his process of designing with a customer looks like (17:36)
  • We discuss the impact “solving on the human level” has on delivering real world benefits and building user trust (21:57)
  • Vijay talks about measuring user adoption and quantifying downstream value—and Brian discusses his concerns about tool usage metrics as means of doing this (25:35)
  • Brian and Vijay discuss the multi-million dollar financial and business impact Vijay has seen at Merck using a more UX-driven approach to data product development (31:45)
  • Vijay shares insight on what steps a head of data science might wish to take to get started implementing a data product and UX approach to creating ML and analytics applications that actually get used (36:46)
Quotes from Today’s Episode
  • “They will adopt your solution if you are giving them everything they need so they don’t have to go look for a workaround.” - Vijay (4:22)
  • “It’s really important that you not only capture the requirements, you capture the thinking of the user, how the user will behave if they see a certain way, how they will navigate, things of that nature.” - Vijay (7:48)
  • “When you’re developing a data product, you want to be making sure that you’re taking the holistic view of the problem that can be solved, and the different group of people that we need to address. And, you engage them, right?” - Vijay (8:52)
  • “When you’re designing in low fidelity, it allows you to design with users because you don’t spend all this time building the wrong thing upfront, at which point it’s really expensive in time and money to go and change it.” - Brian (17:11)
  • “People are the ones who make things happen, right? You have all the technology, everything else looks good, you have the data, but the people are the ones who are going to make things happen.” - Vijay (38:47)
  • “You want to make sure that you [have] a strong team and motivated team to deliver. And the human spirit is something, you cannot believe how stretchable it is. If the people are motivated, [and even if] you have less resources and less technology, they will still achieve [your goals].” - Vijay (42:41)
  • “You’re trying to minimize any type of imposition on [the user], and make it obvious why your data product is better—without disruption. That’s really the key to the adoption piece: showing how it is going to be better for them in a way they can feel and perceive. Because if they don’t feel it, then it’s just another hoop to jump through, right?” - Brian (43:56)
Resources and Links:

LinkedIn: https://www.linkedin.com/in/vijyadav/

25 Jun 2024 | 146 - (Rebroadcast) Beyond Data Science - Why Human-Centered AI Needs Design with Ben Shneiderman | 00:42:07

Ben Shneiderman is a leading figure in the field of human-computer interaction (HCI). Having founded one of the oldest HCI research centers in the country at the University of Maryland in 1983, Shneiderman has been intently studying the design of computer technology and its use by humans. Currently, Ben is a Distinguished University Professor in the Department of Computer Science at the University of Maryland and is working on a new book on human-centered artificial intelligence.

 

 

I’m so excited to welcome this expert from the field of UX and design to today’s episode of Experiencing Data! Ben and I talked a lot about the complex intersection of human-centered design and AI systems.

 

 

In our chat, we covered:

  • Ben's career studying human-computer interaction and computer science. (0:30)
  • 'Building a culture of safety': Creating and designing ‘safe, reliable and trustworthy’ AI systems. (3:55)
  • 'Like zoning boards': Why Ben thinks we need independent oversight of privately created AI. (12:56)
  • 'There’s no such thing as an autonomous device': Designing human control into AI systems. (18:16)
  • A/B testing, usability testing and controlled experiments: The power of research in designing good user experiences. (21:08)
  • Designing ‘comprehensible, predictable, and controllable’ user interfaces for explainable AI systems and why [explainable] XAI matters. (30:34)
  • Ben's upcoming book on human-centered AI. (35:55)

 

 

Resources and Links:  

 

Quotes from Today’s Episode

The world of AI has certainly grown and blossomed — it’s the hot topic everywhere you go. It’s the hot topic among businesses around the world — governments are launching agencies to monitor AI and are also making regulatory moves and rules. … People want explainable AI; they want responsible AI; they want safe, reliable, and trustworthy AI. They want a lot of things, but they’re not always sure how to get them. The world of human-computer interaction has a long history of giving people what they want, and what they need. That blending seems like a natural way for AI to grow and to accommodate the needs of real people who have real problems. And not only the methods for studying the users, but the rules, the principles, the guidelines for making it happen. So, that’s where the action is. Of course, what we really want from AI is to make our world a better place, and that’s a tall order, but we start by talking about the things that matter — the human values: human rights, access to justice, and the dignity of every person. We want to support individual goals, a person’s sense of self-efficacy — they can do what they need to in the world, their creativity, their responsibility, and their social connections; they want to reach out to people. So, those are the sort of high aspirational goals that become the hard work of figuring out how to build it. And that’s where we want to go. - Ben (2:05)

 

The software engineering teams creating AI systems have got real work to do. They need the right kind of workflows, engineering patterns, and Agile development methods that will work for AI. The AI world is different because it’s not just programming, but it also involves the use of data that’s used for training. The key distinction is that the data that drives the AI has to be the appropriate data, it has to be unbiased, it has to be fair, it has to be appropriate to the task at hand. And many people and many companies are coming to grips with how to manage that. This has become controversial, let’s say, in issues like granting parole, or mortgages, or hiring people. There was a controversy that Amazon ran into when its hiring algorithm favored men rather than women. There’s been bias in facial recognition algorithms, which were less accurate with people of color. That’s led to some real problems in the real world. And that’s where we have to make sure we do a much better job and the tools of human-computer interaction are very effective in building these better systems in testing and evaluating. - Ben (6:10)

 

 

Every company will tell you, “We do a really good job in checking out our AI systems.” That’s great. We want every company to do a really good job. But we also want independent oversight of somebody who’s outside the company — someone who knows the field, who’s looked at systems at other companies, and who can bring ideas and bring understanding of the dangers as well. These systems operate in an adversarial environment — there are malicious actors out there who are causing trouble. You need to understand what the dangers and threats are to the use of your system. You need to understand where the biases come from, what dangers are there, and where the software has failed in other places. You may know what happens in your company, but you can benefit by learning what happens outside your company, and that’s where independent oversight from accounting companies, from governmental regulators, and from other independent groups is so valuable. - Ben (15:04)

 

 

There’s no such thing as an autonomous device. Someone owns it; somebody’s responsible for it; someone starts it; someone stops it; someone fixes it; someone notices when it’s performing poorly. … Responsibility is a pretty key factor here. So, if there’s something going on, if a manager is deciding to use some AI system, what they need is a control panel, let them know: what’s happening? What’s it doing? What’s going wrong and what’s going right? That kind of supervisory autonomy is what I talk about, not full machine autonomy that’s hidden away and you never see it because that’s just head-in-the-sand thinking. What you want to do is expose the operation of a system, and where possible, give the stakeholders who are responsible for performance the right kind of control panel and the right kind of data. … Feedback is the breakfast of champions. And companies know that. They want to be able to measure the success stories, and they want to know their failures, so they can reduce them. The continuous improvement mantra is alive and well. We do want to keep tracking what’s going on and make sure it gets better. Every quarter. - Ben (19:41)

 

 

Google has had some issues regarding hiring in the AI research area, and so has Facebook with elections and the way that algorithms tend to become echo chambers. These companies — and this is not through heavy research — probably have the heaviest investment of user experience professionals within data science organizations. They have UX, ML-UX people, UX for AI people, they’re at the cutting edge. I see a lot more generalist designers in most other companies. Most of them are rather unfamiliar with any of this or what the ramifications are on the design work that they’re doing. But even these largest companies that have, probably, the biggest penetration into the most number of people out there are getting some of this really important stuff wrong. - Brian (26:36)

 

Explainability is a competitive advantage for an AI system. People will gravitate towards systems that they understand, that they feel in control of, that are predictable. So, the big discussion about explainable AI focuses on what’s usually called post-hoc explanations, and the Shapley, and LIME, and other methods are usually tied to the post-hoc approach. That is, you use an AI model, you get a result and you say, “What happened?” Why was I denied a parole, or a mortgage, or a job? At that point, you want to get an explanation. Now, that idea is appealing, but I’m afraid I haven’t seen too many success stories of that working. … I’ve been diving through this for years now, and I’ve been looking for examples of good user interfaces of post-hoc explanations. It took me a long time till I found one. The culture of AI model-building would be much bolstered by an infusion of thinking about what the user interface will be for these explanations. And even the DARPA’s XAI—Explainable AI—project, which has 11 projects within it—has not really grappled with this in a good way about designing what it’s going to look like. Show it to me. … There is another way. And the strategy is basically prevention. Let’s prevent the user from getting confused and so they don’t have to request an explanation. We walk them along, let the user walk through the step—this is like Amazon checkout process, seven-step process—and you know what’s happened in each step, you can go back, you can explore, you can change things in each part of it. It’s also what TurboTax does so well, in really complicated situations, and walks you through it. … You want to have a comprehensible, predictable, and controllable user interface that makes sense as you walk through each step. - Ben (31:13)

31 Oct 2023 | 129 - Why We Stopped, Deleted 18 Months of ML Work, and Shifted to a Data Product Mindset at Coolblue | 00:35:21

Today I’m joined by Marnix van de Stolpe, Product Owner at Coolblue in the area of data science. Throughout our conversation, Marnix shares the story of how he joined a data science team whose solution was focused on delivering a data science metric rather than solving a clear customer problem. We discuss how Marnix came to the difficult decision to throw out 18 months of data science work, what it was like to switch to a human-centered, product approach, and the challenges that came with it. Marnix shares the impact this decision had on his team and the stakeholders involved, as well as the impact on his personal career and the advice he would give to others who find themselves in the same position. Marnix is also a Founding Member of the Data Product Leadership Community and will be going much more into the details and his experience live on Zoom on November 16 @ 2pm ET for members.

 

Highlights/ Skip to:

  • I introduce Marnix, Product Owner at Coolblue and one of the original members of the Data Product Leadership Community (00:35)
  • Marnix describes what Coolblue does and his role there (01:20)
  • Why and how Marnix decided to throw away 18 months of machine learning work (02:51)
  • How Marnix determined that the KPI (metric) being created wasn’t enough to deliver a valuable product (07:56)
  • Marnix describes the conversation with his data science team on mapping the solution back to the desired outcome (11:57)
  • What the culture is like at Coolblue now when developing data products (17:17)
  • Marnix’s advice for data product managers who are coming into an environment where existing work is not tied to a desired outcome (18:43)
  • Marnix and I discuss why data literacy is not the solution to making more impactful data products (21:00)
  • The impact that Marnix’s human-centered approach to data product development has had on the stakeholders at Coolblue (24:54)
  • Marnix shares the ultimate outcome of the product his team was developing to measure product returns (31:05)
  • How you can get in touch with Marnix (33:45)
Links
12 Jul 2022 | 095 - Increasing Adoption of Data Products Through Design Training: My Interview from TDWI Munich | 00:16:50

Today I am bringing you a recording of a live interview I did at the TDWI Munich conference for data leaders, and this episode is a bit unique as I’m in the “guest” seat being interviewed by the VP of TDWI Europe, Christoph Kreutz. 

Christoph wanted me to explain the new workshop I was giving later that day, which focuses on helping leaders increase user adoption of data products through design. In our chat, I explained the three main areas I pulled out of my full 4-week seminar to create this new ½-day workshop as well as the hands-on practice that participants would be engaging in. The three focal points for the workshop were: measuring usability via usability studies, identifying the unarticulated needs of stakeholders and users, and sketching in low fidelity to avoid overcommitting to solutions that users won’t value. 

Christoph also asks about the format of the workshop, and I explain how I believe data leaders will best learn design by doing it. As such, the new workshop was designed to use small group activities, role-playing scenarios, peer review…and minimal lecture! After discussing the differences between the abbreviated workshop and my full 4-week seminar, we talk about my consulting and training business “Designing for Analytics,” and conclude with a fun conversation about music and my other career as a professional musician. 

In a hurry? Skip to: 

  • I summarize the new workshop version of “Designing Human-Centered Data Products” I was premiering at TDWI (4:18)
  • We talk about the format of my workshop (7:32)
  • Christoph and I discuss future opportunities for people to participate in this workshop (9:37)
  • I explain the format of the main 4-week seminar versus the new half-day workshop (10:14)
  • We talk about one on one coaching (12:22)
  • I discuss my background, including my formal music training and my other career as a professional musician (14:03)
Quotes from Today’s Episode
  • “We spend a lot of time building outputs and infrastructure and pipelines and data engineering and generating stuff, but not always generating outcomes. Users only care about how does this make my life better, my job better, my job easier? How do I look better? How do I get a promotion? How do I make the company more money? Whatever those goals are. And there’s a gap there sometimes, between the things that we ship and delivering these outcomes.” (4:36)
  • “In order to run a usability study on a data product, you have to come up with some type of learning goals and some kind of scenarios that you’re going to give to a user and ask them to go show me how you would do x using the data thing that we built for you.” (5:54)
  • “The reality is most data users and stakeholders aren’t designers and they’re not thinking about the user’s workflow and how a solution fits into their job. They don’t have that context. So, how do we get the really important requirements out of a user or stakeholder’s head? I teach techniques from qualitative UX interviewing, sales, and even hostage negotiation to get unarticulated needs out of people’s head.” (6:41)
  • “How do we work in low fidelity to get data leaders on the same page with a stakeholder or a user? How do we design with users instead of for them? Because most of the time, when we communicate visually, it starts to click (or you’ll know it’s not clicking!)” (7:05)
  • “There’s no right or wrong [in the workshop]. [The workshop] is really about the practice of using these design methods and not the final output that comes out of the end of it.” (8:14)
  • “You learn design by doing design so I really like to get data people going by trying it instead of talking about trying it. More design doing and less design thinking!” (8:40)
  • “The tricky thing [for most of my training clients], [and perhaps this is true with any type of adult education] is, ‘Yeah, I get the concept of what Brian’s talking about, but, how do I apply these design techniques to my situation? I work in this really weird domain, or on this particularly hard data space.’ Working on an exercise or real project, together, in small groups, is how I like to start making the conceptual idea of design into a tangible tool for data leaders.” (12:26)
Links
19 Oct 2021 | 076 - How Bedrock’s “Data by Design” Mantra Helps Them Build Human-Centered Solutions with Jesús Templado | 00:43:38

Why do we need or care about design in the work of data science? Jesús Templado, Managing Director at Bedrock, is here to tell us about how Bedrock executes their mantra, “data by design.” 

 

Bedrock has found ways to bring to their clients a design-driven, human-centered approach by utilizing a “hybrid model” to synthesize technical possibilities with human needs. In this episode, we explore Bedrock’s vision for how to achieve this synthesis as part of the firm’s DNA, and how Bedrock adopted their vision to make data more approachable with the client being central to their design efforts. Jesús also discusses a time when he championed making “data by design” a successful strategy with a large chain of hotels, and he offers insight on how making clients feel validated and heard plays a part.

 

In our chat, we also covered: 

  • “Data by design” and how Bedrock implements this design-driven approach. (00:43)
  • Bedrock’s vision for how they support their clients and why design has always been part of their DNA. (08:53)
  • Jesús shares a time when he successfully implemented a design process for a large chain of hotels, and some of the challenges that came with that approach. (14:47)
  • The importance of making clients feel heard by dedicating time to research and UX and how the team navigates conversations about risk with customers. (24:12)
  • More on the client experience and how Bedrock covers a large spectrum of areas to ensure that they deliver a product that makes sense for the customer. (33:01)
  • Jesús’ opinion on why companies should consider change management when building products and systems - and a look at the Data Stand-Up podcast (35:42)
Quotes from Today’s Episode

“Many people in corporations don’t have the technical background to understand the possibilities when it comes to analyzing or using data. So, bringing a design-based framework, such as design thinking, is really important for all of the work that we do for our clients.” - Jesús Templado (2:33)

 

“We’ve mentioned “data by design” before as our mantra; we very much prefer building long-lasting relationships based on [understanding] our clients' business and their strategic goals. We then design and ideate an implementation roadmap with them and then based on that, we tackle different periods for building different models. But we build the models because we understand what’s going to bring us an outcome for the business—not because the business brings us in to deliver only a model for the sake of predicting what the weather is going to be in two weeks.”- Jesús Templado (14:07)

 

“I think as consultants and people in service, it’s always nice to make friends. And, I like when I can call a client a friend, but I feel like I’m really here to help them deliver a better future state [...] And the road may be bumpy, especially if design is a new thing. And it is often new in the context of data science and analytics projects.” - Brian T. O’Neill (@rhythmspice) (26:49)

 

“When we do data science [...] that’s a means to an end. We do believe it’s important that the client understands the reasoning behind everything that we do and build, but at the end of the day, it’s about understanding that business problem, understanding the challenge that the company is facing, knowing what the expected outcome is, and knowing how you will deliver or predict that outcome to be used for something meaningful and relevant for the business.”- Jesús Templado (33:06)

 

“The appetite for innovation is high, but a lot of the companies that want to do it are more concerned about risk. Risk and innovation are at opposite ends of the spectrum. And so, if you want to be innovative, by definition—you’re signing up for failure on the way to success. [...] It’s about embracing an iterative process, it’s about getting feedback along the way, it’s about knowing that we don’t know everything, and we’re signing up for that ambiguity along the way to something better.”- Brian T. O’Neill (@rhythmspice) (38:20)

 

Links Referenced
16 May 2023 | 117 - Phil Harvey, Co-Author of “Data: A Guide to Humans,” on the Non-Technical Skills Needed to Produce Valuable AI Solutions | 00:39:39

Today I’m chatting with Phil Harvey, co-author of Data: A Guide to Humans and a technology professional with 23 years of experience working with AI and startups. In his book, Phil describes his philosophy of how empathy leads to more successful outcomes in data product development and the journey he took to arrive at this perspective. But what does empathy mean, and how do you measure its success? Phil and I dig into those questions, and Phil explains why he feels cognitive empathy is a learnable skill that one can develop and apply. Phil describes some leading indicators that empathy is needed on a data team, as well as leading indicators that a more empathetic approach to product development is working. While I use the term “design” or “UX” to describe a lot of what Phil is talking about, Phil actually has some strong opinions about UX and shares those on this episode. Phil also reveals why he decided to write Data: A Guide to Humans and some of the experiences that helped shape the book’s philosophy. 

Highlights/ Skip to:

  • Phil introduces himself and explains how he landed on the name for his book (00:54) 
  • How Phil met his co-author, Noelia Jimenez Martinez, and the reason they started writing Data: A Guide to Humans (02:31)
  • Phil unpacks his understanding of how he defines empathy, why it leads to success on AI projects, and what success means to him (03:54)
  • Phil walks through a couple scenarios where empathy for users and stakeholders was lacking and the impacts it had (07:53)
  • The work Phil has done internally to get comfortable doing the non-technical work required to make ML/AI/data products successful  (13:45)
  • Phil describes some indicators that data teams can look for to know their design strategy is working (17:10)
  • How Phil sees the methodology in his book relating to the world of UX (user experience) design (21:49)
  • Phil walks through what an abstract concept like “empathy” means to him in his work and how it can be learned and applied as a practical skill (29:00)
Quotes from Today’s Episode
  • “If you take success in itself, this is about achieving your intended outcomes. And if you do that with empathy, your outcomes will be aligned to the needs of the people the outcomes are for. Your outcomes will be accepted by stakeholders because they’ll understand them.” — Phil Harvey (05:05)
  • “Where there’s people not discussing and not considering the needs and feelings of others, you start to get this breakdown, data quality issues, all that.” – Phil Harvey (11:10)
  • “I wanted to write code; I didn’t want to deal with people. And you feel when you can do technical things, whether it’s machine-learning or these things, you end up with the ‘I’ve got a hammer and now everything looks like a nail problem.’ But you also have the [attitude] that my programming will solve everything.” – Phil Harvey (14:48)
  • “This is what startup-land really taught me—you can’t do everything. It’s very easy to think that you can and then burn yourself out. You need a team of people.” – Phil Harvey (15:09)
  • “Let’s listen to the users. Let’s bring that perspective in as opposed to thinking about aligning the two perspectives. Because any product is a change. You don’t ride a horse then jump in a car and expect the car to work like the horse.” – Phil Harvey (22:41)
  • “Let’s say you’re a leader in this space. … Listen out carefully for who’s complaining about who’s not listening to them. That’s a first early signal that there’s work to be done from an empathy perspective.” – Phil Harvey (25:00)
  • “The perspective of the book that Noelia and I have written is that empathy—and cognitive empathy particularly—is also a learnable skill. There are concrete and real things you can practice and do to improve in those skills.” – Phil Harvey (29:09)
Links
17 May 2022 | 091 - How Brazil’s Biggest Fiber Company, Oi, Leverages Design To Create Useful Data Products with Sr. Exec. Design Manager, João Critis | 00:31:24

Today I talked with João Critis from Oi. Oi is a Brazilian telecommunications company that is a pioneer in convergent broadband services, pay TV, and local and long-distance voice transmission. They operate the largest fiber optics network in Brazil which reaches remote areas to promote digital inclusion of the population. João manages a design team at Oi that is responsible for the front end of data products including dashboards, reports, and all things data visualization.  We begin by discussing João’s role leading a team of data designers. João then explains what data products actually are, and who makes up his team’s users and customers. João goes on to discuss user adoption challenges at Oi and the methods they use to uncover what users need in the last mile. He then explains the specific challenges his team has faced, particularly with middle management, and how his team builds credibility with senior leadership. In conclusion, João reflects on the value of empathy in the design process. 

 

In this episode, João shares:  

  • What a data product actually is (4:48)
  • The research process used by his data teams to build journey maps for clients (7:31)
  • User adoption challenges for Oi (15:27)
  • His answer to the question “how do you decide which mouths to feed?” (16:56)
  • The unique challenges of middle management in delivering useful data products (20:33)
  • The importance of empathy in innovation (25:23)
  • What data scientists need to learn about design and vice versa (27:55)

 

Quotes from Today’s Episode

  • “We put the final user in the center of our process. We [conduct] workshops involving co-creation and prototyping, and we test how people work with data.” - João (8:22)
  • "My first responsibility here is value generation. So, if you have to take two or three steps back, another brainstorm, rethink, and rebuild something that works…. [well], this is very common for us.” - João (19:28)
  • “If you don’t make an impact on the individuals, you’re not going to make an impact on the business. Because as you said, if they don’t use any of the outputs we make, then they really aren’t solutions and no value is created.” - Brian (25:07)
  • “It’s really important to do what we call primary research where you’re directly interfacing as much as possible with the horse’s mouth, no third parties, no second parties. You’ve really got to develop that empathy.” - Brian (25:23)
  • “When we are designing some system or screen or other digital artifact, [we have to understand] this is not only digital, but a product. We have to understand people, how people interact with systems, with computers, and how people interact with visual presentations.” - João (28:16)
Links
23 Aug 2022 | 098 - Why Emilie Schario Wants You to Run Your Data Team Like a Product Team | 00:39:41

Today I’m chatting with Emilie Schario, a Data Strategist in Residence at Amplify Partners. Emilie thinks data teams should operate like product teams. But what led her to that conclusion, and how has she put the idea into practice? Emilie answers those questions and more, delving into what kind of pushback and hiccups someone can expect when switching from being data-driven to product-driven and sharing advice for data scientists and analytics leaders.

 

Highlights / Skip to:

 

  • Answering the question “whose job is it” (5:18)
  • Understanding and solving problems instead of just building features people ask for (9:05)
  • Emilie explains what Amplify Partners is and talks about her work experience and how it fuels her perspectives on data teams (11:04)
  • Emilie and I talk about the definition of data product (13:00)
  • Emilie talks about her approach to building and training a data team (14:40)
  • We talk about UX designers and how they fit into Emilie’s data teams (18:40)
  • Emilie talks about the book and blog “Storytelling with Data” (21:00)
  • We discuss the push back you can expect when trying to switch a team from being data driven to being product driven (23:18)
  • What hiccups can people expect when switching to a product driven model (30:36)
  • Emilie’s advice for data scientists and analytics leaders (35:50)
  • Emilie explains what Locally Optimistic is (37:34)

 

Quotes from Today’s Episode
  • “Our thesis is…we need to understand the problems we’re solving before we start building solutions, instead of just building the things people are asking for.” — Emilie (2:23)
  • “I’ve seen this approach of flipping the ask on its head—understanding the problem you’re trying to solve—work and be more successful at helping drive impact instead of just letting your data team fall into this widget builder service trap.” — Emilie (4:43)
  • “If your answer to any problem to me is, ‘That’s not my job,’ then I don’t want you working for me because that’s not what we’re here for. Your job is whatever the problem in front of you that needs to be solved.” — Emilie (7:14)
  • “I don’t care if you have all of the data in the world and the most talented machine learning engineers and you’ve got the ability to do the coolest new algorithm fancy thing. If it doesn’t drive business impact, it doesn’t matter.” — Emilie (7:52)
  • “Data is not just a thing that anyone can do. It’s not just about throwing numbers in a spreadsheet anymore. It’s about driving business impact. But part of how we drive business impact with data is making it accessible. And accessible isn’t just giving people the numbers, it’s also communicating with it effectively, and UX is a huge piece of how we do that.” — Emilie (19:57)
  • “There are no null choices in design. Someone is deciding what some other human—a customer, a client, an internal stakeholder—is going to use, whether it’s a React app, or a Power BI dashboard, or a spreadsheet dump, or whatever it is, right? There will be an experience that is created, whether it is intentionally created or not.” — Brian (20:28)
  • “People will think design is just putting in colors that match together, like, or spinning the color wheel and seeing what lands. You know, there’s so much more to it. And it is an expertise; it is a domain that you have to develop.” — Emilie (34:58)

Links Referenced:
26 Dec 2023 | 133 - New Experiencing Data Interviews Coming in January 2024 | 00:02:33

Today I am sharing some highlights for 2023 from the podcast, and also letting you all know I’ll be taking a break from the podcast for the rest of December, but I’ll be back with a new episode on January 9th, 2024. I’ve also got two links to share with you—details inside!

 

Transcript

Greetings everyone - I’m taking a little break from Experiencing Data over December of 2023, but I’ll be back in January with more interviews and insights on leveraging UX design and product management to create indispensable data products, machine learning apps, and decision support tools. 

Experiencing Data turned this year five years old back in November, with over 130 episodes to date! I still can’t believe it’s been going that long and how far we’ve come. 

Some highlights for me in 2023 included launching the Data Product Leadership Community, finding out that the show is now in the top 2% of all podcasts worldwide according to ListenNotes, and most of all, hearing from you that the podcast, and my writing, and the guests that  I have brought on are having an impact on your work, your careers, and hopefully the lives of your customers, users, and stakeholders as well! 

So, for now, I’ve got just two links for you:

If you’re wondering how to:

  • support the show yourself with a really fast review on Apple Podcasts,
  • record a quick audio question for me to answer on the show,
  • or join my free Insights mailing list, where I share my bi-weekly ideas and thoughts and 1-page summaries of every episode of Experiencing Data.

…just head over to designingforanalytics.com/podcast and you’ll get links to all those things there.

And secondly, if you need help increasing customer adoption, delight, the business value, or the usability of your analytics and machine learning applications in 2024, I invite you to set up a free discovery call with me 1 on 1. 

You bring the questions, I’ll bring my ears, and by the end of the call, I’ll give you my best advice on how to move forward with your situation – whether it’s working with me or not. To schedule one of those free discovery calls, visit designingforanalytics.com/go

And finally, there will be some news coming out next year with the show, as well as my business, so I hope you’ll hop on the mailing list and stay tuned, that’s probably the best place to do that. And if you celebrate holidays in December and January, I hope they’re safe, enjoyable, and rejuvenating. Until 2024, stay tuned right here - and in the words of the great Arnold Schwarzenegger, I’ll be back.

13 Dec 2022 | 106 - Ideaflow: Applying the Practice of Design and Innovation to Internal Data Products w/ Jeremy Utley | 00:44:14

Today I’m chatting with former-analyst-turned-design-educator Jeremy Utley of the Stanford d.school and co-author of Ideaflow. Jeremy reveals the psychology behind great innovation, and the importance of creating psychological safety for a team to generate what they may view as bad ideas. Jeremy speaks to the critical collision of unrelated frames of reference when problem-solving, as well as why creativity is actually more of a numbers game than awaiting that singular stroke of genius. Listen as Jeremy gives real-world examples of how to practice and measure (!) your innovation efforts and apply them to data products.

 

Highlights/ Skip to:

 

  • Jeremy explains the methodology of thinking he’s adopted after moving from highly analytical roles to the role he’s in now (01:38)
  • The approach Jeremy takes to the existential challenge of balancing innovation with efficiency (03:54)
  • Brian shares a story of a creative breakthrough he had recently and Jeremy uses that to highlight how innovation often comes in a way contrary to normalcy and professionalism (09:37)
  • Why Jeremy feels innovation and creativity demand multiple attempts at finding solutions (16:13)
  • How to take an innovation-forward approach like the ones Jeremy has described when working on internal tool development (19:33)
  • Jeremy’s advice for accelerating working through bad ideas to get to the good ideas (25:18)
  • The approach Jeremy takes to generate a large volume of ideas, rather than focusing only on “good” ideas, including a real-life example (31:54)
  • Jeremy’s beliefs on the importance of creating psychological safety to promote innovation and creative problem-solving (35:11)
Quotes from Today’s Episode
  • “I’m in spreadsheets every day to this day, but I recognize that there’s a time and place when that’s the tool that’s needed, and then specifically, there’s a time and a place where that’s not going to help me and the answer is not going to be found in the spreadsheet.” – Jeremy Utley (03:13)
  • “There’s the question of, ‘Are we doing it right?’ And then there’s a different question, which is, ‘Are we doing the right “it”?’ And I think a lot of us tend to fixate on, ‘Are we doing it right?’ And we have an ability to perfectly optimize that which should not be done.” – Jeremy Utley (05:05)
  • “I think a vendetta that I have is against this wrong placement of—this exaltation of efficiency is the end-all, be-all. Innovation is not efficient. And the question is not how can I be efficient. It’s what is effective. And effectiveness, oftentimes when it comes to innovation and breaking through, doesn’t feel efficient.” – Jeremy Utley (09:17)
  • “The way the brain works, we actually understand it. The way breakthroughs work, we actually understand them. The difficulty is it challenges our definitions of efficiency and professionalism and all of these things.” – Jeremy Utley (15:13)
  • “What’s the a priori probability that any solution is the right solution? Or any idea is a good idea? It’s exceptionally low. You have to be exceptionally arrogant to think that most of your ideas are good. They’re not. That’s fine, we don’t mind because then what’s efficient is actually to generate a lot.” – Jeremy Utley (26:20)
  • “If you don’t learn that nothing happens when the ball hits the floor, you can never learn how to juggle. And to me, it’s a really good metaphor. The teams that don’t learn nothing happens when they have a bad idea. Literally, the world does not end. They don’t get fired. They don’t get ridiculed. Now, if they do get fired or ridiculed, that’s a leadership problem.” – Jeremy Utley (35:59)
  • “[The following] is an essential question for a team leader to ask: Do people on my team have the freedom, at least with me, to share what they truly fear could be an incredibly stupid idea?” – Jeremy Utley (41:52)

 

Links
29 Oct 2024 | 155 - Understanding Human Engagement Risk When Designing AI and GenAI User Experiences | 00:55:33

The relationship between AI and ethics is both developing and delicate. On one hand, the GenAI advancements to date are impressive. On the other, extreme care needs to be taken as this tech continues to quickly become more commonplace in our lives. In today’s episode, Ovetta Sampson and I examine the crossroads ahead for designing AI and GenAI user experiences.

 

 

While professionals and the general public are eager to embrace new products, recent breakthroughs, etc., we still need to have some guardrails in place. If we don’t, data can easily get mishandled, and people could get hurt. Ovetta possesses firsthand experience working on these issues as they sprout up. We look at who should be on a team designing an AI UX, exploring the risks associated with GenAI, ethics, and what we need to be thinking about going forward.

 

 

Highlights/ Skip to:
  • (1:48) Ovetta's background and what she brings to Google’s Core ML group
  • (6:03) How Ovetta and her team work with data scientists and engineers deep in the stack
  • (9:09)  How AI is changing the front-end of applications
  • (12:46) The type of people you should seek out to design your AI and LLM UXs
  • (16:15) Explaining why we’re only at the very start of major GenAI breakthroughs
  • (22:34) How GenAI tools will alter the roles and responsibilities of designers, developers, and product teams
  • (31:11) The potential harms of carelessly deploying GenAI technology
  • (42:09) Defining acceptable levels of risk when using GenAI in real-world applications
  • (53:16) Closing thoughts from Ovetta and where you can find her

 

 

Quotes from Today’s Episode
  • “If artificial intelligence is just another technology, why would we build entire policies and frameworks around it? The reason why we do that is because we realize there are some real thorny ethical issues [surrounding AI]. Who owns that data? Where does it come from? Data is created by people, and all people create data. That’s why companies have strong legal, compliance, and regulatory policies around [AI], how it’s built, and how it engages with people. Think about having a toddler and then training the toddler on everything in the Library of Congress and on the internet. Do you release that toddler into the world without guardrails? Probably not.” - Ovetta Sampson (10:03)
  • “[When building a team] you should look for a diverse thinker who focuses on the limitations of this technology- not its capability. You need someone who understands that the end destination of that technology is an engagement with a human being.  You need somebody who understands how they engage with machines and digital products. You need that person to be passionate about testing various ways that relationships can evolve. When we go from execution on code to machine learning, we make a shift from [human] agency to a shared-agency relationship. The user and machine both have decision-making power. That’s the paradigm shift that [designers] need to understand. You want somebody who can keep that duality in their head as they’re testing product design.” - Ovetta Sampson (13:45)
  • “We’re in for a huge taxonomy change. There are words that mean very specific definitions today. Software engineer. Designer. Technically skilled. Digital. Art. Craft. AI is changing all that. It’s changing what it means to be a software engineer. Machine learning used to be the purview of data scientists only, but with GenAI, all of that is baked into Gemini. So, now you start at a checkpoint, and you’re like, all right, let’s go make an API, right? So, the skills, the understanding, the knowledge, the taxonomy even, how we talk about these things, how do we talk about the machine who speaks to us, talks to us, who could create a podcast out of just voice memos?” - Ovetta Sampson (24:16)
  • “We have to be very intentional [when building AI tools], and that’s the kind of folks you want on teams. [Designers] have to go and play scary scenarios. We have to do that. No designer wants to be “Negative Nancy,” but this technology has huge potential to harm. It has harmed. If we don’t have the skill sets to recognize, document, and minimize harm, that needs to be part of our skill set.  If we’re not looking out for the humans, then who actually is?” - Ovetta Sampson (32:10)
  • “[Research shows] things happen to our brain when we’re exposed to artificial intelligence… there are real human engagement risks that are an opportunity for design. When you’re designing a self-driving car, you can’t just let the person go to sleep unless the car is fully [automated] and every other car on the road is self-driving. If there are humans behind the wheel, you need to have a feedback loop system—something that’s going to happen [in case] the algorithm is wrong. If you don’t have that designed, there’s going to be a large human engagement risk that a car is going to run over somebody who’s [for example] pushing a bike up a hill. [...] Why? The car could not calculate the right speed and pace of a person pushing their bike. It had the speed and pace of a person walking, the speed and pace of a person on a bike, but not the two together. Algorithms will be wrong, right?” - Ovetta Sampson (39:42)
  • “Model goodness used to be the purview of companies and the data scientists. Think about the first search engines. Their model goodness was [about] 77%. That’s good, right? And then people started seeing photos of apes when [they] typed in ‘black people.’ Companies have to get used to going to their customers in a wide spectrum and asking them when their [models or apps] are right and wrong. They can’t take on that burden themselves anymore. Having ethically sourced data input and variables is hard work. If you’re going to use this technology, you need to put into place the governance that needs to be there.” - Ovetta Sampson (44:08)
14 Nov 2023 | 130 - Nick Zervoudis on Data Product Management, UX Design Training and Overcoming Imposter Syndrome | 00:48:56

Today I’m joined by Nick Zervoudis, Data Product Manager at CKDelta. As we dive into his career and background, Nick shares insights into his approach when it comes to developing both internal and external data products. Nick explains why he feels that a software engineering approach is the best way to develop a product that could have multiple applications, as well as the unique way his team is structured to best handle the needs of both internal and external customers. He also talks about the UX design course he took, how that affected his data product work and research with users, and his thoughts on dashboard design. We discuss common themes he’s observed when data product teams get it wrong, and how he manages feelings of imposter syndrome in his career as a DPM. 

Highlights/ Skip to:

  • I introduce Nick, who is a Data Product Manager at CKDelta (00:35)
  • Nick’s mindset around data products and how his early career in consulting shaped his approach (01:30)
  • How Nick defines a data product and why he focuses more on the process rather than the end product (03:59)
  • The types of data products that Nick has helped design and his work on both internal and external projects at CKDelta (07:57)
  • The similarities and differences of working with internal versus external stakeholders (12:37)
  • Nick dives into the details of the data products he has built and how they feed into complex use cases (14:21)
  • The role that Nick plays in the Delta Power SaaS application and how the CKDelta team is structured around that product (17:14)
  • Where Nick sees data products going wrong and how he’s found value in filling those gaps (23:30)
  • Nick’s view on how a digital-first mindset affects the scalability of data products (26:15)
  • Why Nick is often heavily involved in the design element of data product development and the course he took that helped shape his design work (28:55)
  • The imposter syndrome that Nick has experienced when implementing this new strategy to data product design (36:51)
  • Why Nick feels that figuring things out yourself is an inherent part of the DPM role (44:53)
  • Nick shares the origins and information on the London Data Product Management meetup (46:08)
Quotes from Today’s Episode
  • “What I’m always trying to do is see, how can we best balance the customer’s need to get exactly the data point or insight that they’re after to the business need. ... There’s that constant tug of war between customization and standardization that I have the joy of adjudicating. I think it’s quite fun.” — Nick Zervoudis (16:40)
  • “I’ve had times where I was hired, told, 'You’re going to be the product manager for this data product that we have,' as if it’s already, to some extent built and maybe the challenge is scaling it or bringing it to more customers or improving it, and then within a couple of weeks of starting to peek under the hood, realizing that this thing that is being branded a product is actually a bunch of projects hiding under a trench coat.” — Nick Zervoudis (24:04)
  • “If I just speak to five users because they’re the users, they’ll give me the insight I need. […] Even when you have a massive product with a huge user base, people face the same issues.” — Nick Zervoudis (33:49)
  • “For me, it’s more about making sure that you’re bringing that more software engineering way of building things, but also, before you do that, knowing that your users' needs are going to [be varied]. So, it’s a combination of both, are we building the right thing—in other words, a product that’s flexible enough to meet the different needs of different users—but also, are we building it in the right way?” – Nick Zervoudis (27:51)
  • “It’s not to say I’m the only person thinking about [UX design], but very often, I’m the one driving it.” – Nick Zervoudis (30:55)
  • “You’re never going to be as good at the thing your colleague does because their job almost certainly is to be a specialist: they’re an architect, they’re a designer, they’re a developer, they’re a salesperson, whereas your job [as a DPM] is to just understand it enough that you can then pass information across other people.” – Nick Zervoudis (41:12)
  • “Every time I feel like an imposter, good. I need to embrace that, because I need to be working with people that understand something better than me. If I’m not, then maybe something’s gone wrong there. That’s how I’ve actually embraced imposter syndrome.” – Nick Zervoudis (41:35)
Links
22 Mar 2022 | 087 - How Data Product Management and UX Integrate with Data Scientists at Albertsons Companies to Improve the Grocery Shopping Experience | 00:37:36

For Danielle Crop, the Chief Data Officer of Albertsons, drawing distinctions between “digital” and “data” only limits an organization’s ability to create useful products. One of the reasons I asked Danielle on the show is her background as a CDO and former SVP of digital at AMEX, where she also managed product and design groups. My theory is that data leaders who have been exposed to the worlds of software product and UX design tend to approach their data product work differently, and that’s what we dug into in this episode.

It didn’t take long for Danielle to share how she pushes her data science team to collaborate with business product managers for a “cross-functional, collaborative” end result. This also means getting the team to understand what their models are personalizing, and how customers experience the data products they use. In short, for her, it is about getting the data team to focus on “outcomes” vs. “outputs.” Scaling some of the data science and ML modeling work at Albertsons is a big challenge, and we talked about one of the big use cases she is trying to enable for customers, as well as one “real-life” non-digital experience that her team’s data science efforts are behind. The big takeaway for me was hearing how a CDO like Danielle is putting customer experience and the company’s brand at the center of her team’s data product work, as opposed to solely focusing on ML model development, dashboard/BI creation, and seeing data as a raw ingredient that lives in a vacuum isolated from people.

 

In this episode, we cover:

  • Danielle’s take on the “D” in CDO: is the distinction between “digital” and “data” even relevant, especially for a food and drug retailer? (01:25)
  • The role of data product management and design in her org and how UX (i.e. shopper experience) is influenced by and considered in her team’s data science work (06:05)
  • How Danielle’s team thinks about “customers,” particularly in the context of internal stakeholders vs. grocery shoppers (10:20)
  • Danielle’s current and future plans for bringing her data team into stores to better understand shoppers and customers (11:11)
  • How Danielle’s data team works with the digital shopper experience team (12:02) 
  • “Outputs” versus “Outcomes”  for product managers, data science teams, and data products (16:30)
  • Building customer loyalty, in-store personalization, and long term brand interaction with data science at Albertsons (20:40)
  • How Danielle and her team at Albertsons measure the success of their data products (24:04)
  • Finding the problems, building the solutions, and connecting the data to the non-technical side of the company (29:11)

 

Quotes from Today’s Episode
  • “Data always comes from somewhere, right? It always has a source. And in our modern world, most of that source is some sort of digital software. So, to distinguish your data from its source is not very smart as a data scientist. You need to understand your data very well, where it came from, how it was developed, and software is a massive source of data. [As a CDO], I think it’s not important to distinguish between [data and digital]. It is important to distinguish between roles and responsibilities, you need different skills for these different areas, but to create an artificial silo between them doesn’t make a whole lot of sense to me.”- Danielle  (03:00)
  • “Product managers need to understand what the customer wants, what the business needs, how to pass that along to data scientists, and data scientists need to understand how that’s affecting business outcomes. That’s how I see this all working. And it depends on what type of models they’re customizing and building, right? Are they building personalization models that are going to be a digital asset? Are they building automation models that will go directly to some sort of operational activity in the store? What are they trying to solve?” - Danielle (06:30)
  • “In a company that sells products—groceries—to individuals, personalization is a huge opportunity. How do we make that experience, both in-digital and in-store, more relevant to the customer, more sticky and build loyalty with those customers? That’s the core problem, but underneath that is you got to build a lot of models that help personalize that experience. When you start talking about building a lot of different models, you need scale.”  - Danielle (9:24)
  • “[Customer interaction in the store] is a true big data problem, right, because you need to use the WiFi devices, et cetera, that you have in store that are pinging the devices at all times, and it’s a massive amount of data. Trying to weed through that and find the important signals that help us to actually drive that type of personalized experience is challenging. No one’s gotten there yet. I hope that we’ll be the first.” - Danielle (19:50)
  • “I can imagine a checkout clerk who doesn’t want to talk to the customer, despite a data-driven suggestion appearing on the clerk’s monitor as to how to personalize a given customer interaction. The recommendation suggested to the clerk may be accurate from a data science point of view, but if the clerk doesn’t actually act on it, then the data product didn’t provide any value. When I train people in my seminar, I try to get them thinking about that last mile. It may not be data science work, and maybe you have a big enough org where that clerk/customer experience is someone else’s responsibility, but being aware that this is a fault point and having a cross-team perspective is key.” - Brian @rhythmspice (24:50)
  • “We’re going through a moment in time in which trust in data is shaky. What I’d like people to understand and know on a broader philosophical level, is that in order to be able to understand data and use it to make decisions, you have to know its source. You have to understand its source. You have to understand the incentives around that source of data….you have to look at the data from the perspective of what it means and what the incentives were for creating it, and then analyze it, and then give an output. And fortunately, most statisticians, most data scientists, most people in most fields that I know, are incredibly motivated to be ethical and accurate in the information that they’re putting out.” - Danielle (34:15)
13 Jul 2021 | 069 - The Role of Creativity and Product Thinking in Data Monetization with ‘Infonomics’ Author Doug Laney | 00:34:09

Doug Laney is the preeminent expert in the field of infonomics — and it’s not just because he literally wrote the book on it. 

 

As the Data & Analytics Strategy Innovation Fellow at consulting firm West Monroe, Doug helps businesses use infonomics to measure the economic value of their data and monetize it. He is also a visiting professor at the University of Illinois at Urbana-Champaign, where he teaches classes on analytics and infonomics. 

 

On this episode of Experiencing Data, Doug and I talk about his book Infonomics, the many different ways that businesses can monetize data, the role of creativity and product management in producing innovative data products, and the ever-evolving role of the Chief Data Officer.

 

In our chat, we covered: 

 

  • Why Doug's book Infonomics argues that measuring data for its value potential is key to effectively managing and monetizing it. (2:21)
  • A 'regenerative asset': Innovative methods for deploying and monetizing data — and the differences between direct, indirect, and inverted data monetization. (5:10)
  • The responsibilities of a Chief Data Officer (CDO) — and how taking a product management approach to data can generate additional value. (13:28)
  • Why Doug believes that a 'lack of vision and leadership' is partly behind organizations’ hesitancy around data monetization efforts. (17:10)
  • ‘A pretty unique skill’: The importance of bringing in people with experience creating and marketing data products when monetizing data. (19:10)
  • Insurance and torrenting: Creative ways companies have leveraged their data to generate additional value. (24:27)
  • Ethical data monetization: Why Doug believes consumers must receive a benefit when organizations leverage their data for profit. (27:14)
  • The data monetization workshops Doug runs for businesses looking to generate new value streams from their data. (29:42)
Quotes from Today’s Episode

“Many organizations [endure] a vicious cycle of not measuring [their data], and therefore not managing, and therefore not monetizing their data as well as they can. The idea behind my book Infonomics is, flip that. I’ll just start with measuring your data, understanding what you have, its quality characteristics, and its value potential. But vision is important as well, and so that’s where we start with monetization, and thinking more broadly about the ways to generate measurable economic benefits from data.” - Doug (4:13)

 

“A lot of people will compare data to oil and say that ‘Data is the new oil.’ But you can only use a drop of oil one way at a time. When you consume a drop of oil, it creates heat and energy and pollution, and when you use a drop of oil, it doesn’t generate more oil. Data is very different. It has unique economic qualities that economists would call a non-rivalrous, non-depleting, and regenerative asset.” - Doug (7:52)

 

“The Chief Data Officer (CDO) role has come on strong in organizations that really want to manage their data as an actual asset, ensure that it is accounted for as generating value and is being managed and controlled effectively. Most CDOs play both offense and defense in controlling and governing data on one side and in enabling it on the other side to drive more business value.”- Doug (14:17)

 

“The more successful teams that I read about and I see tend to be of a mixed skill set, they’re cross-functional; there’s a space for creativity and learning, there’s a concept of experimentation that’s happening there.” - Brian (19:10)

 

“Companies that become more data-driven have a market-to-book value that’s nearly two times higher than the market average. And companies that make the bulk of their revenue by selling data products or derivative data have a market-to-book value that’s nearly three times the market average. So, there's a really compelling reason to do this. It’s just that not a lot of executives are really comfortable with it. Data continues to be something that’s really amorphous and they don’t really have their heads around.” - Doug (21:38)

 

“There’s got to be a benefit to the consumer in the way that you use their data. And that benefit has to be clear, and defined, and ideally measured for them, that we’re able to reduce the price of this product that you use because we’re able to share your data, even if it’s anonymously; this reduces the price of your product.” - Doug (28:24)

 

Links referenced
28 May 2024 | 144 - The Data Product Debate: Essential Tech or Excessive Effort? with Shashank Garg, CEO of Infocepts (Promoted Episode) | 00:52:38

Welcome to another curated, Promoted Episode of Experiencing Data! 

In episode 144, Shashank Garg, Co-Founder and CEO of Infocepts, joins me to explore whether all this discussion of data products out on the web actually has substance and is worth the perceived extra effort. Do we always need to take a product approach for ML and analytics initiatives? Shashank dives into how Infocepts approaches the creation of data solutions that are designed to be actionable within specific business workflows—and as I often do, I started out by asking Shashank how he and Infocepts define the term “data product.” We discuss a few real-world applications Infocepts has built, and the measurable impact of these data products—as well as some of the challenges they’ve faced that your team might face as well. Skill sets also came up; who does design? Who takes ownership of the product/value side? And of course, we touch a bit on GenAI.

 

 

Highlights/ Skip to

  • Shashank gives his definition of data products  (01:24)
  • We tackle the challenges of user adoption in data products (04:29)
  • We discuss the crucial role of integrating actionable insights into data products for enhanced decision-making (05:47)
  • Shashank shares insights on the evolution of data products from concept to practical integration (10:35)
  • We explore the challenges and strategies in designing user-centric data products (12:30)
  • I ask Shashank about typical environments and challenges when starting new data product consultations (15:57)
  • Shashank explains how Infocepts incorporates AI into their data solutions (18:55)
  • We discuss the importance of understanding user personas and engaging with actual users (25:06)
  • Shashank describes the roles involved in data product development’s ideation and brainstorming stages (32:20)
  • The issue of proxy users not truly representing end-users in data product design is examined (35:47)
  • We consider how organizations are adopting a product-oriented approach to their data strategies (39:48)
  • Shashank and I delve into the implications of GenAI and other AI technologies on product orientation and user adoption (43:47)
  • Closing thoughts (51:00)

 

 

Quotes from Today’s Episode

  • “Data products, at least to us at Infocepts, refers to a way of thinking about and organizing your data in a way so that it drives consumption, and most importantly, actions.” - Shashank Garg (1:44)
  • “The way I see it is [that] the role of a DPM (data product manager)—whether they have the title or not—is benefits creation. You need to be responsible for benefits, not for outputs. The outputs have to create benefits or it doesn’t count. Game over.” - Brian O’Neill (10:07)
  • “We talk about bridging the gap between the worlds of business and analytics... There’s a huge gap between the perception of users and the tech leaders who are producing it.” - Shashank Garg (17:37)
  • “IT leaders often limit their roles to provisioning their secure data, and then they rely on businesses to be able to generate insights and take actions. Sometimes this handoff works, and sometimes it doesn’t because of quality governance.” - Shashank Garg  (23:02)
  • “Data is the kind of field where people can react very, very quickly to what’s wrong.”  - Shashank Garg (29:44)
  • “It’s much easier to get to a good prototype if we know what the inputs to a prototype are, which include data about the people who are going to use the solution, their usage scenarios, use cases, attitudes, beliefs…all these kinds of things.” - Brian O’Neill (31:49)
  • “For data, you need a separate person, and then for designing, you need a separate person, and for analysis, you need a separate person—the more you can combine, I don’t think you can create super-humans who can do all three, four disciplines, but at least two disciplines and can appreciate the third one that makes it easier.” - Shashank Garg (39:20)
  • “When we think of AI, we’re all talking about multiple different delivery methods here. I think AI is starting to become GenAI to a lot of non-data people. It’s like their—everything is GenAI.” -  Brian O'Neill (43:48)

 

 

Links

18 May 2021 | 065 - Balancing Human Intuition and Machine Intelligence with Salesforce Director of Product Management Pavan Tuvu | 00:30:50

I once saw a discussion on LinkedIn about a fraud detection model that had been built but never used. The model worked — it was expensive — but it simply didn’t get used because the humans in the loop were not incentivized to use it.

 

It was on this very thread that I first met Salesforce Director of Product Management Pavan Tuvu, who chimed in about a similar experience he went through. When I heard about his experience, I asked him if he would share it with you, and he agreed. So, today on the Experiencing Data podcast, I’m excited to have Pavan on to talk about some lessons he learned while designing ad-spend software that utilized advanced analytics — and the role of the humans in the loop. We discussed: 

 

  • Pavan's role as Director of Product Management at Salesforce and how he works to make data easier to use for teams. (0:40)
  • Pavan's work protecting large-dollar advertising accounts from bad actors by designing a ML system that predicts and caps ad spending. (6:10)
  • 'Human override of the machine': How Pavan addressed concerns that its advertising security system would incorrectly police legitimate large-dollar ad spends. (12:22)
  • How the advertising security model Pavan worked on learned from human feedback. (24:49)
  • How leading with "why" when designing data products will lead to a better understanding of what customers need to solve. (29:05)
02 Nov 2021 | 077 - Productizing Analytics for Performing Arts Organizations with AMS Analytics CPO Jordan Gross Richmond | 00:42:35

Even in the performing arts world, data and analytics is serving a purpose. Jordan Gross Richmond is the Chief Product Officer at AMS Analytics, where they provide benchmarking and performance reporting to performing arts organizations. As many of you know, I’m also a musician who tours and performs in the performing arts market and so I was curious to hear how data plays a role “off the stage” within these organizations. In particular, I wanted to know how Jordan designed the interfaces for AMS Analytics’s product, and what’s unique (or not!) about using data to manage arts organizations.

Jordan also talks about the beginnings of AMS and their relationship with leaders in the performing arts industry and the “birth of benchmarking” in this space. From an almost manual process in the beginning, AMS now has a SaaS platform that allows performing arts centers to see the data that helps drive their organizations. Given that many performing arts centers are non-profit organizations, I also asked Jordan about how these organizations balance their artistic mission against the colder, harder facts of data such as ticket sales, revenue, and “the competition.”  

In this episode, we also cover:

  • How the AMS platform helps leaders manage their performing arts centers and the evolution of the AMS business model. (01:10)
  • Benchmarking as a measure of success in the performing arts industry and the “two buckets of context” AMS focuses on. (06:00)
  • Strategies for measuring intangible success and how performing arts data is about more than just the number of seats filled at concerts and shows. (15:48)
  • The relationships between AMS and its customers, their organizational structure, and how AMS has shaped it into a useful SaaS product. (26:27)
  • The role of users in designing the solution and soliciting feedback and what Jordan means when he says he “focuses on the problems, and not the solutions” in his role as Chief Product Officer. (35:38)
Quotes from Today’s Episode
  • “I think [AMS] is a one-of-a-kind thing, and what it does now is it provides what I consider to be a steering wheel for these leaders. It’s not the kind of thing that’s going to help anybody figure out what to do tomorrow; it’s more about what’s going on in a year from now and in five years from now. And I think the need for this particular vision comes from the evolution in the business model in general of the performing arts and the cultural arts in America.”- Jordan Gross Richmond (@the1jordangross) (03:07)
  • “No one metric can solve everything. It’s a one-to-one relationship in terms of data model to analytical point. So, we have to be really careful that we don't think that just because there's a lot of charts on the screen, we must be able to answer all of our [customers'] questions.”- Jordan Gross Richmond (@the1jordangross) (18:18)
  • “We are absolutely a product-led organization, which essentially means that the solutions are built into the product, and the relationship with the clients and the relationship with future clients is actually all engineered into the product itself. And so I never want to create anything in a black box. Nobody benefits from a feature that nobody cares about.”- Jordan Gross Richmond (@the1jordangross) (29:16)
  • “This is an evolution that's driven not by the technology itself, but [...] by the key stakeholders amongst this community. And we found that to be really successful. In terms of product line growth, when you listen to your users and they feel heard, the sky's the limit. Because at that point, they have buy-in, so you have a real relationship. ”- Jordan Gross Richmond (@the1jordangross) (31:11)
  • “Successful product leaders don't focus on the solutions. We focus on the problems. And that's where I like to stay, because sometimes we kind of get into lots of proposals. My role in these meetings is often to help identify the problem and make sure we're all solving the same problem because we can get off pretty easily on a solution that sounds sexy [or] interesting, but if we're not careful, we might be solving a problem that doesn't even exist.”- Jordan Gross Richmond (@the1jordangross) (35:09)
  • “It’s about starting with the customer’s problems and working backwards from that. I think that you have to start with the problem space that they're in, and then you do the best job you can with the data that's available. [...] So, I love the fact that you're having these working groups. Sometimes we call these design partners in the design world, and I think that kind of regular interaction and exposure, especially early and as frequently as possible, is a great habit.”- Brian T. O’Neill (@rhythmspice) (40:26)
Links Referenced

https://www.ams-analytics.com/

08 Mar 2022 | 086 - CED: My UX Framework for Designing Analytics Tools That Drive Decision Making | 00:27:57

Today, I’m flying solo in order to introduce you to CED: my three-part UX framework for designing your ML / predictive / prescriptive analytics UI around trust, engagement, and indispensability. Why this, why now? I have had several people tell me that this has been incredibly helpful to them in designing useful, usable analytics tools and decision support applications. 

 

I have written about the CED framework before at the following link:

 

https://designingforanalytics.com/ced

 

There you will find an example of the framework put into a real-world context. In this episode, I wanted to add some extra color to what is discussed in the article. If you’re an individual contributor, the best part is that you don’t have to be a professional designer to begin applying this to your own data products. And for leaders of teams, you can use the ideas in CED as a “checklist” when trying to audit your team’s solutions in the design phase—before it’s too late or expensive to make meaningful changes to the solutions. 

CED is definitely easier to implement if you understand the basics of human-centered design, including research, problem finding and definition, journey mapping, consulting, facilitation, etc. If you need a step-by-step method to develop these foundational skills, my training program, Designing Human-Centered Data Products, might help. It comes in two formats: a Self-Guided Video Course and a bi-annual Instructor-Led Seminar.

Quotes from Today’s Episode
  • “‘How do we visualize the data?’ is the wrong starting question for designing a useful decision support application. That makes all kinds of assumptions that we have the right information, that we know what the users' goals and downstream decisions are, and we know how our solution will make a positive change in the customer or users’ life.”- Brian (@rhythmspice) (02:07)
  • “The CED is a UX framework for designing analytics tools that drive decision-making. Three letters, three parts: Conclusions (C), Evidence (E), and Data (D). The tough pill for some technical leaders to swallow is that the application, tool or product they are making may need to present what I call a ‘conclusion’—or if you prefer, an ‘opinion.’ Why? Because many users do not want an ‘exploratory’ tool—even when they say they do. They often need an insight to start with, before exploration time becomes valuable.” - Brian (@rhythmspice) (04:00)
  • “CED requires you to do customer and user research to understand what the meaningful changes, insights, and things that people want or need actually are. Well designed ‘Conclusions’—when experienced in an analytics tool using the CED framework—often manifest themselves as insights such as unexpected changes, confirmation of expected changes, meaningful change versus meaningful benchmarks, scoring how KPIs track to predefined and meaningful ranges, actionable recommendations, and next best actions. Sometimes these Conclusions are best experienced as charts and visualizations, but not always—and this is why visualizing the data rarely is the right place to begin designing the UX.” - Brian (@rhythmspice) (08:54)
  • “If I see another analytics tool that promises ‘actionable insights’ but is primarily experienced as a collection of gigantic data tables with 10, 20, or 30+ columns of data to parse, your design is almost certainly going to frustrate, if not alienate, your users. Not because all table UIs are bad, but because you’ve put a gigantic tool-time tax on the user, forcing them to derive what the meaningful conclusions should be.”   - Brian (@rhythmspice) (20:20)
18 Oct 2022 | 102 - CDO Spotlight: The Non-Technical Roles Data Science and Analytics Teams Need to Drive Adoption of Data Products w/ Iván Herrero Bartolomé | 00:35:05

Today I’m chatting with Iván Herrero Bartolomé, Chief Data Officer at Grupo Intercorp. Iván describes how he was prompted to write his new article in CDO Magazine, “CDOs, Let’s Get Out of Our Comfort Zone” as he recognized the importance of driving cultural change within organizations in order to optimize the use of data. Listen in to find out how Iván is leveraging the role of the analytics translator to drive this cultural shift, as well as the challenges and benefits he sees data leaders encounter as they move from tactical to strategic objectives. Iván also reveals the number one piece of advice he’d give CDOs who are struggling with adoption. 

Highlights / Skip to:

  • Iván explains what prompted him to write his new article, “CDOs, Let’s Get Out of Our Comfort Zone” (01:08)
  • What Iván feels is necessary for data leaders to close the gap between data and the rest of the business and why (03:44)
  • Iván dives into who he feels really owns delivery of value when taking on new data science and analytics projects (09:50)
  • How Iván’s team went from managing technical projects that often didn’t make it to production to working on strategic projects that almost always make it to production (13:06)
  • The framework Iván has developed to upskill technical and business roles to be effective data / analytics translators (16:32)
  • The challenge Iván sees data leaders face as they move from setting and measuring tactical goals to moving towards strategic goals and initiatives (24:12)
  • Iván explains how the C-Suite’s attitude impacts the cross-functional role of data & analytics leadership (28:55)
  • The number one piece of advice Iván would give new CDO’s struggling with low adoption of their data products and solutions (31:45)
Quotes from Today’s Episode
  • “We’re going to do all our best to ensure that [...] everything that is expected from us is done in the best possible way. But that’s not going to be enough. We need a sponsorship and we need someone accountable for the project and someone who will be pushing and enabling the use of the solution once we are gone. Because we cannot stay forever in every company.” – Iván Herrero Bartolomé (10:52)
  • “We are trying to upskill people from the business to become data translators, but that’s going to take time. Especially what we try to do is to take product owners and give them a high-level immersion on the state-of-the-art and the possibilities that data analytics bring to the table. But as we can’t rely on our companies having this kind of talent and these data translators, they are one of the profiles that we bring in for every project that we work on.” – Iván Herrero Bartolomé (13:51)
  • “There’s a lot to do, not just between data and analytics and the other areas of the company, but aligning the incentives of all the organization towards the same goals in a way that there’s no friction between the goals of the different areas, the people, [...]  and the final goals of the organization. – Iván Herrero Bartolomé (23:13)
  • “Deciding which goals are you going to be co-responsible for, I think that is a sophisticated process that it’s not mastered by many companies nowadays. That probably is one of the main blockers keeping data analytics areas working far from their business counterparts” – Iván Herrero Bartolomé (26:05)
  • “When the C-suite looks at data and analytics, if they think these are just technical skills, then the data analytics team are just going to behave as technical people. And many, many data analytics teams are set up as part of the IT organization. So, I think it all begins somehow with how the C-suite of our companies look at us.” – Iván Herrero Bartolomé (28:55)
  • “For me, [digital] means much more than the technical development of solutions; it should also be part of the transformation of the company, both in how companies develop relationships with their customers, but also inside how every process in the companies becomes more nimble and can react faster to the changes in the market.” – Iván Herrero Bartolomé (30:49)
  • “When you feel that everyone else [is] not doing what you think they should be doing, think twice about whether it is they who are not doing what they should be doing or if it’s something that you are not doing properly.” – Iván Herrero Bartolomé (31:45)
Links
19 Mar 2024 | 139 - Monetizing SAAS Analytics and The Challenges of Designing a Successful Embedded BI Product (Promoted Episode) | 00:51:02

This week on Experiencing Data, something new, as promised at the beginning of the year. Today, I’m exploring the world of embedded analytics with Zalak Trivedi from Sigma Computing—and this is also the first approved Promoted Episode on the podcast. In today’s episode, Zalak shares his journey as the product lead for Sigma’s embedded analytics and reporting solution, which seeks to accelerate and simplify the deployment of decision support dashboards to SAAS companies’ customers. Right there, we have the first challenge that Zalak was willing to dig into with me: designing a platform UX when there are multiple stakeholder and user types. In Sigma’s case, this means Sigma’s buyers, the developers who work at these SAAS companies to integrate Sigma into their products, and then the actual customers of these SAAS companies who will be the final end users of the resulting dashboards. We also discuss the challenges of creating products that serve both beginners and experts, and how AI is being used in the BI industry.

 

Highlights/ Skip to:

  • I introduce Zalak Trivedi from Sigma Computing onto the show (03:15)
  • Zalak shares his journey leading the vision for embedded analytics at Sigma and explains what Sigma looks like when implemented into a customer’s SAAS product. (03:54)
  • Zalak and I discuss the challenge of integrating Sigma's analytics into various companies' software, since they need to account for a variety of stakeholders. (09:53)
  • We explore Sigma's team approach to user experience with product management, design, and technical writing (15:14)
  • Zalak reveals how Sigma leverages telemetry to understand and improve user interactions with their products (19:54)
  • Zalak outlines why Sigma is a faster and more supportive alternative to building your own analytics (27:21)
  • We cover data monetization, specifically looking at how SAAS companies can monetize analytics and insights (32:05)
  • Zalak highlights how Sigma is integrating AI into their BI solution (36:15)
  • Zalak shares his customers’ current pain points and interests (40:25) 
  • We wrap up with final thoughts and ways to connect with Zalak and learn more about Sigma (49:41) 
Quotes from Today’s Episode
  • "Something I’m really excited about personally that we are working on is [moving] beyond analytics to help customers build entire data applications within Sigma. This is something we are really excited about as a company, and marching towards [achieving] this year." - Zalak Trivedi (04:04)
  • “The whole point of an embedded analytics application is that it should look and feel exactly like the application it’s embedded in, and the workflow should be seamless.” - Zalak Trivedi (09:29) 
  • “We [at Sigma] had to switch the way that we were thinking about personas. It was not just about the analysts or the data teams; it was more about how do we give the right tools to the [SAAS] product managers and developers to embed Sigma into their product.” - Zalak Trivedi (11:30) 
  • “You can’t not have a design, and you can’t not have a user experience. There’s always an experience with every tool, solution, product that we use, whether it emerged organically as a byproduct, or it was intentionally created through knowledge data... it was intentional” - Brian O’Neill (14:52) 
  • “If we find that [in] certain user experiences, people are tripping up, and they’re not able to complete an entire workflow, we flag that, and then we work with the product managers, or [with] our customers essentially, and figure out how we can actually simplify these experiences.” - Zalak Trivedi (20:54)
  • “We were able to convince many small to medium businesses and startups to sign up with Sigma. The success they experienced after embedding Sigma was tremendous. Many of our customers managed to monetize their existing data within weeks, or at most, a couple of months, with lean development teams of two to three developers and a few business-side personnel, generating seven-figure income streams from that.” - Zalak Trivedi (32:05)
  • “At Sigma, our stance is, let’s not just add AI for the sake of adding AI. Let’s really identify [where] in the entire user journey does the intelligence really lie, and where are the different friction points, and let’s enhance those experiences.” - Zalak Trivedi (37:38) 
  • “Every time [we at Sigma Computing] think about a new feature or functionality, we have to ensure it works for both the first-degree persona and the second-degree persona, and consider how it will be viewed by these different personas, because that is not the primary persona for which the foundation of the product was built." - Zalak Trivedi (48:08)
Links

Sigma Computing: https://sigmacomputing.com

Email: zalak@sigmacomputing.com 

LinkedIn: https://www.linkedin.com/in/trivedizalak/

Sigma Computing Embedded: https://sigmacomputing.com/embedded

About Promoted Episodes on Experiencing Data: https://designingforanalytics.com/promoted

27 Dec 2022 | 107 - Tom Davenport on Data Product Management and the Impact of a Product Orientation on Enterprise Data Science and ML Initiatives | 00:42:52

Today I’m chatting with returning guest Tom Davenport, who is a Distinguished Professor at Babson College, a Visiting Professor at Oxford, a Research Fellow at MIT, and a Senior Advisor to Deloitte’s AI practice. He is also the author of three new books (!) on AI and in this episode, we’re discussing the role of product orientation in enterprise data science teams, the skills required, what he’s seeing in the wild in terms of teams adopting this approach, and the value it can create. Back in episode 26, Tom was a guest on my show and he gave the data science/analytics industry an approximate “2 out of 10” rating in terms of its ability to generate value with data. So, naturally, I asked him for an update on that rating, and he kindly obliged. How are you all doing? Listen in to find out!

Highlights / Skip to:

  • Tom provides an updated rating (between 1-10) as to how well he thinks data science and analytics teams are doing these days at creating economic value (00:44)
  • Why Tom believes that “motivation is not enough for data science work” (03:06)
  • Tom provides his definition of what data products are and some opinions on other industry definitions (04:22)
  • How Tom views the rise of taking a product approach to data roles and why data products must be tied to value (07:55)
  • Tom explains why he feels top down executive support is needed to drive a product orientation (11:51)
  • Brian and Tom discuss how they feel companies should prioritize true data products versus more informal AI efforts (16:26)
  • The trends Tom sees in the companies and teams that are implementing a data product orientation (19:18)
  • Brian and Tom discuss the models they typically see for data teams and their key components (23:18)
  • Tom explains the value and necessity of data product management (34:49)
  • Tom describes his three new books (39:00)
Quotes from Today’s Episode
  • “Data science in general, I think has been focused heavily on motivation to fit lines and curves to data points, and that particular motivation certainly isn’t enough in that even if you create a good model that fits the data, it doesn’t mean at all that is going to produce any economic value.” – Tom Davenport  (03:05)
  • “If data scientists don’t worry about deployment, then they’re not going to be in their jobs for terribly long because they’re not providing any value to their organizations.” – Tom Davenport (13:25)
  • “Product also means you got to market this thing if it’s going to be successful. You just can’t assume because it’s a brilliant algorithm with capturing a lot of area under the curve that it’s somehow going to be great for your company.” – Tom Davenport (19:04)

 

  • “[PM is] a hard thing, even for people in non-technical roles, because product management has always been a sort of ‘minister without portfolio’ sort of job, and you know, influence without formal authority, where you are responsible for a lot of things happening, but the people don’t report to you, generally.” – Tom Davenport (22:03)

 

  • “This collaboration between a human being making a decision and an AI system that might in some cases come up with a different decision but can’t explain itself, that’s a really tough thing to do [well].” – Tom Davenport (28:04)
  • “This idea that we’re going to use externally-sourced systems for ML is not likely to succeed in many cases because, you know, those vendors didn’t work closely with everybody in your organization” – Tom Davenport (30:21)

 

  • “I think it’s unlikely that [organizational gaps] are going to be successfully addressed by merging everybody together in one organization. I think that’s what product managers do is they try to address those gaps in the organization and develop a process that makes coordination at least possible, if not true, all the time.” – Tom Davenport (36:49)
Links
18 Apr 2023 | 115 - Applying a Product and UX-Driven Approach to Building Stuart’s Data Platform with Osian Jones | 00:45:19

Today I’m chatting with Osian Jones, Head of Product for the Data Platform at Stuart. Osian describes how impact and ROI can be difficult metrics to measure in a data platform, and how the team at Stuart has sought to answer this challenge. He also reveals how user experience is intrinsically linked to adoption and the technical problems that data platforms seek to solve. Throughout our conversation, Osian shares a holistic overview of what it was like to design a data platform from scratch, the lessons he’s learned along the way, and the advice he’d give to other data product managers taking on similar projects. 

Highlights/ Skip to:

  • Osian describes his role at Stuart (01:36)
  • Brian and Osian explore the importance of creating an intentional user experience strategy (04:29)
  • Osian explains how having a clear mission enables him to create parameters to measure product success (11:44)
  • How Stuart developed the KPIs for their data platform (17:09)
  • Osian gives his take on the pros and cons of how data departments are handled with regard to company oversight (21:23)
  • Brian and Osian discuss how vital it is to listen to your end users rather than relying on analytics alone to measure adoption (26:50)
  • Osian reveals how he and his team went about designing their platform (31:33)
  • What Osian learned from building out the platform and what he would change if he had to tackle a data product like this all over again (36:34)
Quotes from Today’s Episode
  • “Analytics has been treated very much as a technical problem, and very much so on the data platform side, which is more on the infrastructure and the tooling to enable analytics to take place. And so, viewing that purely as a technical problem left us at odds in a way, compared to [teams that had] a product leader, where the user was the focus [and] the user experience was very much driving a lot of what was [on the] roadmap.” — Osian Jones (03:15)
  • “Whenever we get this question of what’s the impact? What’s the value? How does it impact our company top line? How does it impact our company OKRs? This is when we start to panic sometimes, as data platform leaders because that’s an answer that’s really challenging for us, simply because we are mostly enablers for analytics teams who are themselves enablers. It’s almost like there’s two different degrees away from the direct impact that your team can have.” — Osian Jones (12:45)
  • “We have to start with a very clear mission. And our mission is to empower everyone to make the best data-driven decisions as fast as possible. And so, hidden within there, that’s a function of reducing time to insight, it’s also about maximizing trust and obviously minimizing costs.” — Osian Jones (13:48)
  • “We can track [metrics like reliability, incidents, time to resolution, etc.], but also there is a perception aspect to that as well. We can’t underestimate the importance of listening to our users and qualitative data.” — Osian Jones (30:16)
  • “These were questions that I felt that I naturally had to ask myself as a product manager. … Understanding who our users are, what they are trying to do with data and what is the current state of our data platform—so those were the three main things that I really wanted to get to the heart of, and connecting those three things together.” – Osian Jones (35:29)
  • “The advice that I would give to anyone who is taking on the role of a leader of a data platform or a similar role is, you can easily get overwhelmed by just so many different use cases. And so, I would really encourage [leaders] to avoid that.” – Osian Jones (37:57)
  • “Really look at your data platform from an end-user perspective and almost think of it as if you were to put the data platform on a supermarket shelf, what would that look like? And so, for each of the different components, how would you market that in a single one-liner in terms of what can this do for me?” – Osian Jones (39:22)
Links
02 May 2023 | 116 - 10 Reasons Your Customers Don’t Make Time for Your Data Product Initiatives + A Big Update on the Data Product Leadership Community (DPLC) | 00:45:56

Do you ever find it hard to get the requirements, problems, or needs out of your customers, stakeholders, or users when creating a data product? This week I’m coming to you solo to share reasons your stakeholders, users, or customers may not be making time for your discovery efforts. I’ve outlined 10 reasons, and delve into those in the first part of this episode. 

 

In part two, I am going to share a big update about the Data Product Leadership Community (DPLC) I’m hoping to launch in June 2023. I have created a Google Doc outlining how v1 of the community will work as well as 6 specific benefits that I hope you’ll be able to achieve in the first year of participating. However, I need your feedback to know if this is shaping up into the community you want to join. As such, at the end of this episode, I’ll ask you to head over to the Google Doc and leave a comment. To get the document link, just add your email address to the DPLC announcement list at http://designingforanalytics.com/community and you’ll get a confirmation email back with the link. 

Links
29 Jun 2021 | 068 - Why User Adoption of Enterprise Data Products Continues to Lag with International Institute for Analytics Executive VP Drew Smith | 00:52:08

Drew Smith knows how much value data analytics can add to a business when done right.

 

Having worked at the IKEA Group for 17 years, Drew helped the company become more data-driven, implementing successful strategies for data analytics and governance across multiple areas of the company. 

 

Now, Drew serves as the Executive Vice President of the Analytics Leadership Consortium at the International Institute for Analytics, where he helps Fortune 1000 companies successfully leverage analytics and data science. 

 

On this episode of Experiencing Data, Drew and I talk a lot about the factors contributing to low adoption rates of data products, how product and design-thinking approaches can help, and the value of proper one-on-one research with customers.

 

In our chat, we covered:

  • 'It’s bad and getting worse': Drew's take on the factors behind low adoption of data products. (1:08)
  • Decentralizing data analytics: How understanding a user's business problems by including them in the design process can lead to increased adoption of data products. (6:22)
  • The importance for business leaders to have a conceptual understanding of the algorithms used in decision-making data products. (14:43)
  • Why data analysts need to focus more on providing business value with the models they create. (18:14)
  • Looking for restless curiosity in new hires for data teams — and the importance of nurturing new learning through training. (22:19)
  • The value of spending one-on-one time with end-users to research their decision-making process before creating a data product. (27:00)
  • User-informed data products: The benefits of design and product-thinking approaches when creating data analytics solutions. (33:04)
  • How Drew's view of data analytics has changed over 15 years in the field. (45:34)

 

Quotes from Today’s Episode

“I think as we [decentralize analytics back to functional areas] — as firms keep all of the good parts of centralizing, and pitch out the stuff that doesn’t work — I think we’ll start to see some changes [when it comes to the adoption of data products.]” - Drew (10:07)

 

“I think data people need to accept that the outcome is not the model — the outcome is a business performance which is measurable, material, and worth the change.” - Drew (21:52)

 

“We talk about the concept of outcomes over outputs a lot on this podcast, and it’s really about understanding what is the downstream [effect] that emerges from the thing I made. Nobody really wants the thing you made; they just want the result of the thing you made. We have to explore what that is earlier in the process — and asking, “Why?” is very important.” - Brian (22:21)

 

“I have often said that my favorite people in the room, wherever I am, aren’t the smartest, it’s the most curious.” - Drew (23:55)

 

“For engineers and people that make things, it’s a lot more fun to make stuff that gets used. Just at the simplest level, the fact that someone cared and it didn’t just get shelved, and especially when you spent half your year on this thing, and your performance review is tied to it, it’s just more enjoyable to work on it when someone’s happy with the outcome.” - Brian (33:04)

 

“Product thinking starts with the assumption that ‘this is a good product,’ it’s usable and it’s making our business better, but it’s not finished. It’s a continuous loop. It’s feeding back in data through its exhaust. The user is using it — maybe even in ways I didn’t imagine — and those ways are better than I imagined, or worse than I imagined, or different than I imagined, but they inform the product.” - Drew (36:35)

 

Links Referenced

Analytics Leadership Consortium: https://iianalytics.com/services/analytics-leadership-consortium

08 Aug 2023 | 123 - Learnings From the CDOIQ Symposium and How Data Product Definitions are Evolving with Brian T. O’Neill | 00:27:17

Today I’m wrapping up my observations from the CDOIQ Symposium and sharing what’s new in the world of data. I was only able to attend a handful of sessions, but they were primarily ones tied to the topic of data products, which, of course, brings us to “What’s a data product?” During this episode, I cover some of what I’ve been hearing about the definition of this word, and I also share my revised v2 definition. I also walk through some of the questions that CDOs and fellow attendees were asking at the sessions I went to and a few reactions to those questions. Finally, I announce an exciting development on the launch of the Data Product Leadership Community.

 

Highlights/ Skip to:

 

  • Brian introduces the topic for this episode, including his wrap-up of the CDOIQ Symposium (00:29)
  • The general impressions Brian heard at the Symposium, including a focus on people & culture and an emphasis on data products (01:51)
  • The three main areas the definition of a data product covers according to Brian’s observations (04:43)
  • Brian describes how companies are looking for successful data product development models to follow and explores where new Data Product Managers are coming from (07:17)
  • A methodology that Brian feels leads to a successful data product team (10:14)
  • How Brian feels digital-native folks see the world of data products differently (11:29)
  • The topics of Data Mesh and Human-Centered Design and how they came up in two presentations at the CDOIQ Symposium (13:24)
  • The rarity of design and UX being talked about at data conferences, and why Brian feels that is the case (15:24)
  • Brian’s current definition of a data product and how it’s evolved from his V1 definition (18:43)
  • Brian lists the main questions that were being asked at CDOIQ sessions he attended around data products (22:19)
  • Where to find answers to many of the questions being asked about data products and an update on the Data Product Leadership Community that he will launch in August 2023 (24:28)
Quotes from Today’s Episode
  • “I think generally what’s happening is the technology continues to evolve, I think it generally continues to get easier, and all of the people and cultural parts and the change management and all of that, that problem just persists no matter what. And so, I guess the question is, what are we going to do about it?” — Brian T. O’Neill (03:11)
  • “The feeling I got from the questions [at the CDOIQ Symposium], … and particularly the ones that were talking about the role of data product management and the value of these things was, it’s like they’re looking for a recipe to follow.” — Brian T. O’Neill (07:17)
  • “My guess is people are just kind of reading up about it, self-training a bit, and trying to learn how to do product on their own. I think that’s how you learn how to do stuff is largely through trial and error. You can read books, you can do all that stuff, but beginning to do it is part of it.” — Brian T. O’Neill (08:57)
  • “I think the most important thing is that data is a raw ingredient here; it’s a foundation piece for the solution that we’re going to make that’s so good, someone might pay to use it or trade something of value to use it. And as long as that’s intact, I think you’re kind of checking the box as to whether it’s a data product.” — Brian T. O’Neill (12:13)

 

  • “I also would say on the data mesh topic, the feeling I got from people who had been to this conference before was that was quite a hyped thing the last couple years. Now, it was not talked about as much, but I think now they’re actually seeing some examples of this working.” — Brian T. O’Neill (16:25)

 

  • “My current v2 definition right now is, ‘A data product is a managed, end-to-end software solution that organizes, refines, or transforms data to solve a problem that’s so important customers would pay for it or exchange something of value to use it.’” — Brian T. O’Neill (19:47)

 

  • “We know [the product is] of value because someone was willing to pay for it or exchange their time or switch from their old way of doing things to the new way because it has that inherent benefit baked in. That’s really the most important part here that I think any data product manager should fully be aligned with.” — Brian T. O’Neill (21:35)

 

Links
27 Jul 2021 | 070 - Fighting Fire with ML, the AI Incident Database, and Why Design Matters in AI-Driven Software with Sean McGregor | 00:34:38

As much as AI has the ability to change the world in very positive ways, it also can be incredibly destructive. Sean McGregor knows this well, as he is currently developing the Partnership on AI’s AI Incident Database, a searchable collection of news articles that covers questionable use, failures, and other incidents that affect people when AI solutions are poorly designed.  

 

On this episode of Experiencing Data, Sean takes us through his notable work around using machine learning in the domain of fire suppression, and how human-centered design is critical to ensuring these decision support solutions are actually used and trusted by the users. We also covered the social implications of new decision-making tools leveraging AI, and:

 

  • Sean's focus on ensuring his models and interfaces were interpretable by users when designing his fire-suppression system and why this was important. (0:51)
  • How Sean built his fire suppression model so that different stakeholders can optimize the system for their unique purposes. (8:44)
  • The social implications of new decision-making tools. (11:17)
  • Tailoring to the needs of 'high-investment' and 'low-investment' people when designing visual analytics. (14:58)
  • The AI Incident Database: Preventing future AI deployment harm by collecting and displaying examples of the unintended and negative consequences of AI. (18:20)
  • How human-centered design could prevent many incidents of harmful AI deployment — and how it could also fall short. (22:13)
  • 'It's worth the time and effort': How taking time to agree on key objectives for a data product with stakeholders can lead to greater adoption. (30:24)
Quotes from Today’s Episode

“As soon as you enter into the decision-making space, you’re really tearing at the social fabric in a way that hasn’t been done before. And that’s where analytics and the systems we’re talking about right now are really critical because that is the middle point that we have to meet in and to find those points of compromise.” - Sean (12:28)

 

“I think that a lot of times, unfortunately, the assumption [in data science is], ‘Well if you don’t understand it, that’s not my problem. That’s your problem, and you need to learn it.’ But my feeling is, ‘Well, do you want your work to matter or not? Because if no one’s using it, then it effectively doesn’t exist.’” - Brian (17:41)

 

“[The AI Incident Database is] a collection of largely news articles [about] bad things that have happened from AI [so we can] try and prevent history from repeating itself, and [understand] more of [the] unintended and bad consequences from AI....” - Sean (19:44)

 

“Human-centered design will prevent a great many of the incidents [of AI deployment harm] that have and are being ingested in the database. It’s not a hundred percent thing. Even in human-centered design, there’s going to be an absence of imagination, or at least an inadequacy of imagination for how these things go wrong because intelligent systems — as they are currently constituted — are just tremendously bad at the open-world, open-set problem.” - Sean (22:21)

 

“It’s worth the time and effort to work with the people that are going to be the proponents of the system in the organization — the ones that assure adoption — to kind of move them through the wireframes and examples and things that at the end of the engineering effort you believe are going to be possible. … Sometimes you have to know the nature of the data and what inferences can be delivered on the basis of it, but really not jumping into the principal engineering effort until you adopt and agree to what the target is. [This] is incredibly important and very often overlooked.” - Sean (31:36)

“The things that we’re working on in these technological spaces are incredibly impactful, and you are incredibly powerful in the way that you’re influencing the world in a way that has never, on an individual basis, been so true. And please take that responsibility seriously and make the world a better place through your efforts in the development of these systems. This is right at the crucible for that whole process.” - Sean (33:09)

 

Links Referenced

Twitter: https://twitter.com/seanmcgregor

15 Jun 2021067 - Why Roche Diagnostics’ BI and Data Science Teams Are Adopting Human-Centered Design and UX featuring Omar Khawaja00:35:11

On today’s episode of Experiencing Data, I’m so excited to have Omar Khawaja on to talk about how his team is integrating user-centered design into data science, BI and analytics at Roche Diagnostics. 

 

In this episode, Omar and I have a great discussion about techniques for creating more user-centered data products that produce value — as well as how taking such an approach can lead to needed change management on how data is used and interpreted.

 

In our chat, we covered: 

  • What Omar is responsible for in his role as Head of BI & Analytics at Roche Diagnostics — and why a human-centered design approach to data analytics is important to him. (0:57)
  • Understanding the end-user's needs: Techniques for creating more user-centric products — and the challenges of taking on such an approach. (6:10)
  • Dissecting 'data culture': Why Omar believes greater implementation of data-driven decision-making begins with IT 'demonstrating' the approach's benefits. (9:31)
  • Understanding user personas: How Roche is delivering better outcomes for medical patients by bringing analytical insights to life. (15:19)
  • How human-centered design yields early 'actionable insights' that can lead to needed change management on how data is used and interpreted. (22:12)
  • The journey of learning: Why 'it's everybody's job' to be focused on user experience — and how field research can help determine an end-user's needs. (27:26)
  • Omar's love of cricket and the statistics collected about the sport! (31:23)

 

Resources and Links:

Quotes from Today’s Episode

“I’ve been in the area of data and analytics since two decades ago, and out of my own learning — and I’ve learned it the hard way — at the end of the day, whether we are doing these projects or products, they have to be used by the people. The human factor naturally comes in.” - Omar (2:27)

 

“Especially when we’re talking about enterprise software, and some of these more complex solutions, we don’t really want people noticing the design to begin with. We just want it to feel valuable, and intuitive, and useful right out of the box, right from the start.” - Brian (4:08)

 

“When we are doing interviews with [end-users] as part of the whole user experience [process], you learn to understand what’s being said in between the lines, and then you learn how to ask the right questions. Those exploratory questions really help you understand: What is the real need?” - Omar (8:46)

 

“People are talking about data-driven [cultures], data-informed [cultures] — but at the end of the day, it has to start by demonstrating what change we want. ... Can we practice what we are trying to preach? Am I demonstrating that with my team when I’m making decisions in my day-to-day life? How do I use the data? IT is very good at asking our business colleagues and sometimes fellow IT colleagues to use various enterprise IT and business tools. Are we using, ourselves, those tools nicely?” - Omar (11:33)

 

“We focus a lot on what’s technically possible, but to me, there’s often a gap between the human need and what the data can actually support. And the bigger that gap is, the less chance things get used. The more we can try to close that gap when we get into the implementation stage, the more successful we probably will be with getting people to care and to actually use these solutions.” - Brian (22:20)

 

“When we are working in the area of data and analytics, I think it’s super important to know how this data and insights will be used — which requires an element of putting yourself in the user’s shoes. In the case of an enterprise setup, it’s important for me to understand the end-user in different roles and personas: What they are doing and how their job is. [This involves] sitting with them, visiting them, visiting the labs, visiting the factory floors, sitting with the finance team, and learning what they do in the system. These are the places where you have your learning.” - Omar (29:09)

14 Jun 2022093 - Why Agile Alone Won’t Increase Adoption of Your Enterprise Data Products00:47:16
Episode Description

In one of my past memos to my list subscribers, I addressed some questions about agile and data products. Today, I expound on each of these and share some observations from my consulting work. In some enterprise orgs, mostly outside of the software industry, agile is still new and perceived as a panacea. In reality, it can just become a factory for shipping features and outputs faster–with positive outcomes and business value being mostly absent. To increase the adoption of enterprise data products that have humans in the loop, it’s great to have agility in mind, but poor technology shipped faster isn’t going to serve your customers any better than what you’re doing now. 

 

Here are the 10 reflections I’ll dive into on this episode: 

  1. You can't project manage your way out of a [data] product problem. 
  2. The more you try to deploy agile at scale, take the trainings, and hire special "agilists", the more you're going to tend to measure success by how well you followed the Agile process.
  3. Agile is great for software engineering, but nobody really wants "software engineering" given to them. They do care about the perceived reality of your data product.
  4. Run from anyone that tells you that you shouldn't ever do any design, user research, or UX work "up front" because "that is waterfall." 
  5. Everybody else is also doing modified scrum (or modified _______).
  6. Marty Cagan talks about this a lot, but in short: while the PM (product managers) may own the backlog and priorities, what’s more important is that these PMs “own the problem” space as opposed to owning features or being solution-centered. 
  7. Before Agile can thrive, you will need strong senior leadership buy-in if you're going to do outcome-driven data product work.
  8. There's a huge promise in the word "agile." You've been warned. 
  9. If you don't have a plan for how you'll do discovery work, defining clear problem sets and success metrics, and understanding customers' feelings, pains, needs, and wants, and the like, Agile won't deliver much improvement for data products (probably).
  10. Getting comfortable with shipping half-right, half-quality, half-done is hard. 

 

Quotes from Today’s Episode 
  • “You can get lost in following the process and thinking that as long as we do that, we’re going to end up with a great data product at the end.” - Brian (3:16)
  • “The other way to define clear success criteria for data products and hold yourself accountable to those on the user and business side is to really understand what does a positive outcome look like? How would we measure it?” - Brian (5:26)
  • “The most important thing is to know that the user experience is the perceived reality of the technology that you built. Their experience is the only reality that matters.” - Brian (9:22)
  • “Do the right amount of planning work upfront, have a strategy in place, make sure the team understands it collectively, and then you can do the engineering using agile.” - Brian (18:15)
  • “If you don’t have a plan for how you’ll do discovery work, defining clear problem sets and success metrics, and understanding customers’ feelings, pains, needs, wants, and all of that, then agile will not deliver increased adoption of your data products.” - Brian (36:07)
Links:
14 Nov 2024156 - The Challenges of Bringing UX Design and Data Science Together to Make Successful Pharma Data Products with Jeremy Forman00:41:37

Jeremy Forman joins us to open up about the hurdles and successes that come with building data products for pharmaceutical companies. Although he’s new to Pfizer, Jeremy has years of experience leading data teams at organizations like Seagen and the Bill and Melinda Gates Foundation. He currently serves in a more specialized role in Pfizer’s R&D department, building AI and analytical data products for scientists and researchers.

 

 

Jeremy gave us a good look at his team makeup, and in particular, how his data product analysts and UX designers work with pharmaceutical scientists and domain experts to build data-driven solutions. We talked a good deal about how and when UX design plays a role in Pfizer’s data products, including a GenAI-based application they recently launched internally.

 

 

Highlights/ Skip to:
  • (1:26) Jeremy's background in analytics and transition into working for Pfizer
  • (2:42) Building an effective AI analytics and data team for pharma R&D
  • (5:20) How Pfizer finds data products managers
  • (8:03) Jeremy's philosophy behind building data products and how he adapts it to Pfizer
  • (12:32) The moment Jeremy heard a Pfizer end-user use product management research language and why it mattered
  • (13:55) How Jeremy's technical team members work with UX designers
  • (18:00) The challenges that come with producing data products in the medical field
  • (23:02) How to justify spending the budget on UX design for data products
  • (24:59) The results we've seen having UX design work on AI / GenAI products
  • (25:53) What Jeremy learned at the Bill & Melinda Gates Foundation with regards to UX and its impact on him now
  • (28:22) Managing the "rough dance" between data science and UX
  • (33:22) Breaking down Jeremy's GenAI application demo from CDIOQ
  • (36:02) What would Jeremy prioritize right now if his team got additional funding
  • (38:48) Advice Jeremy would have given himself 10 years ago
  • (40:46) Where you can find more from Jeremy

 

 

Quotes from Today’s Episode
  • “We have stream-aligned squads focused on specific areas such as regulatory, safety and quality, or oncology research. That’s so we can create functional career pathing and limit context switching and fragmentation. They can become experts in their particular area and build a culture within that small team. It’s difficult to build good [pharma] data products. You need to understand the domain you’re supporting. You can’t take somebody with a financial background and put them in an Omics situation. It just doesn’t work. And we have a lot of the scars, and the failures to prove that.” - Jeremy Forman (4:12)
  • “You have to have the product mindset to deliver the value and the promise of AI data analytics. I think small, independent, autonomous, empowered squads with a product leader is the only way that you can iterate fast enough with [pharma data products].” - Jeremy Forman (8:46)
  • “The biggest challenge is when we say data products. It means a lot of different things to a lot of different people, and it’s difficult to articulate what a data product is. Is it a view in a database? Is it a table? Is it a query? We’re all talking about it in different terms, and nobody’s actually delivering data products.” - Jeremy Forman (10:53)
  • “I think when we’re talking about [data products] there’s some type of data asset that has value to an end-user, versus a report or an algorithm. I think it’s even hard for UX people to really understand how to think about an actual data product. I think it’s hard for people to conceptualize, how do we do design around that? It’s one of the areas I think I’ve seen the biggest challenges, and I think some of the areas we’ve learned the most. If you build a data product [and] it’s not accurate, and people are getting results that are incomplete… people will abandon it quickly.” - Jeremy Forman (15:56)
  • “ I think that UX design and AI development or data science work is a magical partnership, but they often don’t know how to work with each other. That’s been a challenge, but I think investing in that has been critical to us. Even though we’ve had struggles… I think we’ve also done a good job of understanding the [user] experience and impact that we want to have. The prototype we shared [at CDIOQ] is driven by user experience and trying to get information in the hands of the research organization to understand some portfolio types of decisions that have been made in the past. And it’s been really successful.” - Jeremy Forman (24:59)
  • “If you’re having technology conversations with your business users, and you’re focused only on the technology output, you’re just building reports. [After we adopted a human-centered design approach], it was talking [with end-users] about outcomes, value, and adoption. Having that resource transformed the conversation, and I felt like our quality went up. I felt like our output went down, but our impact went up. [End-users] loved the tools, and that wasn’t what was happening before… I credit a lot of that to the human-centered design team.” - Jeremy Forman (26:39)
  • “When you’re thinking about automation through machine learning or building algorithms for [clinical trial analysis], it becomes a harder dance between data scientists and human-centered design. I think there’s a lack of appreciation and understanding of what UX can do. Human-centered design is an empathy-driven understanding of users’ experience, their work, their workflow, and the challenges they have. I don’t think there’s an appreciation of that skill set.” - Jeremy Forman (29:20)
  • “Are people excited about it? Is there value? Are we hearing positive things? Do they want us to continue? That’s really how I’ve been judging success. Is it saving people time, and do they want to continue to use it? They want to continue to invest in it. They want to take their time as end-users, to help with testing, helping to refine it. Those are the indicators. We’re not generating revenue, so what does the adoption look like? Are people excited about it? Are they telling friends? Do they want more? When I hear that the ten people [who were initial users] are happy and that they think it should be rolled out to the whole broader audience, I think that’s a good sign.” - Jeremy Forman (35:19)

 

Links Referenced

LinkedIn: https://www.linkedin.com/in/jeremy-forman-6b982710/

10 Jul 2024147 - UI/UX Design Considerations for LLMs in Enterprise Applications (Part 1)00:25:46

Let’s talk about design for AI (which more and more, I’m agreeing means GenAI to those outside the data space). The hype around GenAI and LLMs—particularly as it relates to dropping these in as features into a software application or product—seems to me, at this time, to largely be driven by FOMO rather than real value. In this “part 1” episode, I look at the importance of solid user experience design and outcome-oriented thinking when deploying LLMs into enterprise products. Challenges with immature AI UIs, the role of context, the constant game of understanding what accuracy means (and how much this matters), and the potential impact on human workers are also examined. Through a hypothetical scenario, I illustrate the complexities of using LLMs in practical applications, stressing the need for careful consideration of benchmarks and the acceptance of GenAI's risks. 

 

 

I also want to note that LLMs are a very immature space in terms of UI/UX design—even if the foundation models continue to mature at a rapid pace. As such, this episode is more about the questions and mindset I would be considering when integrating LLMs into enterprise software more than a suggestion of “best practices.” 

 

 

Highlights/ Skip to:

  • (1:15) Currently, many LLM feature initiatives seem to be mostly driven by FOMO
  • (2:45) UX Considerations for LLM-enhanced enterprise applications 
  • (5:14) Challenges with LLM UIs / user interfaces
  • (7:24) Measuring improvement in UX outcomes with LLMs
  • (10:36) Accuracy in LLMs and its relevance in enterprise software 
  • (11:28) Illustrating key considerations for implementing an LLM-based feature
  • (19:00) Leadership and context in AI deployment
  • (19:27) Determining UX benchmarks for using LLMs
  • (20:14) The dynamic nature of LLM hallucinations and how we design for the unknown
  • (21:16) Closing thoughts on Part 1 of designing for AI and LLMs

 

 

Quotes from Today’s Episode

  • “While many product teams continue to race to deploy some sort of GenAI and especially LLMs into their products—particularly this is in the tech sector for commercial software companies—the general sense I’m getting is that this is still more about FOMO than anything else.” - Brian T. O’Neill (2:07)
  • “No matter what the technology is, a good user experience design foundation starts with not doing any harm, and hopefully going beyond usable to be delightful. And adding LLM capabilities into a solution is really no different. So, we still need to have outcome-oriented thinking on both our product and design teams when deploying LLM capabilities into a solution. This is a cornerstone of good product work.” - Brian T. O’Neill (3:03)
  • “So, challenges with LLM UIs and UXs, right, user interfaces and experiences, the most obvious challenge to me right now with large language model interfaces is that while we’ve given users tremendous flexibility in the form of a Google search-like interface, we’ve also in many cases, limited the UX of these interactions to a text conversation with a machine. We’re back to the CLI in some ways.” - Brian T. O’Neill (5:14)
  • “Before and after we insert an LLM into a user’s workflow, we need to know what an improvement in their life or work actually means.”- Brian T. O’Neill (7:24)
  • "If it would take the machine a few seconds to process a result versus what might take a day for a worker, what’s the role and purpose of that worker going forward? I think these are all considerations that need to be made, particularly if you’re concerned about adoption, which a lot of data product leaders are." - Brian T. O’Neill (10:17)
  • “So, there’s no right or wrong answer here. These are all range questions, and they’re leadership questions, and context really matters. They are important to ask, particularly when we have this risk of reacting to incorrect information that looks plausible and believable because of how these LLMs tend to respond to us with a positive sheen much of the time.” - Brian T. O’Neill (19:00)

 

Links

03 May 2022090 - Michelle Carney’s Mission With MLUX: Bringing UX and Machine Learning Together00:31:43

Michelle Carney began her career in the worlds of neuroscience and machine learning, where she worked on the original Python Notebooks. As she fine-tuned ML models and started to notice discrepancies in the human experience of using them, her interest turned toward UX. Michelle discusses how her work today as a UX researcher at Google impacts her work with teams leveraging ML in their applications. She explains how her interest in the crossover of ML and UX led her to start MLUX, a series of meet-up events where professionals from both data science and design can connect and share methods and ideas. MLUX now hosts meet-ups in several locations as well as virtually.

 

Our conversation begins with Michelle explaining how she teaches data scientists to integrate UX into the development of their products. In her course at the Stanford School of Design (d.school), Designing Machine Learning, Michelle uses the IDEO Design Kit and walks students through some of the unlearning that data scientists need to do when approaching their work from a UX perspective. We also discussed what UX designers need to know about designing for ML/AI. Michelle talks about how model interpretability is a facet of UX design and why model accuracy isn’t always the most important element of an ML application. She ends the conversation with an emphasis on the need for more interdisciplinary voices in the fields of ML and AI.

 

Skip to a topic here:

  • Michelle talks about what drove her career shift from machine learning and neuroscience to user experience (1:15)
  • Michelle explains what MLUX is (4:40)
  • How to get ML teams on board with the importance of user experience (6:54)
  • Michelle discusses the “unlearning” data scientists might have to do as they reconsider ML from a UX perspective (9:15)
  • Brian and Michelle talk about the importance of considering the UX from the beginning of model development (10:45)
  • Michelle expounds on different ways to measure the effectiveness of user experience (15:10)
  • Brian and Michelle talk about what is driving the increase in the need for designers on ML teams (19:59)
  • Michelle explains the role of design around model interpretability and explainability (24:44)

 

Quotes from Today’s Episode
  • “The first step to business value is the hurdle of adoption. A user has to be willing to try—and care—before you ever will get to business value.” - Brian O’Neill (13:01)
  • “There’s so much talk about business value and there’s very little talk about adoption. I think providing value to the end-user is the gateway to getting any business value. If you’re building anything that has a human in the loop that’s not fully automated, you can’t get to business value if you don’t get through the first gate of adoption.” - Brian O’Neill (13:17)
  • “I think that designers who are able to design for ambiguity are going to be the ones that tackle a lot of this AI and ML stuff.” - Michelle Carney (19:43)
  • “That’s something that we have to think about with our ML models. We’re coming into this user’s life where there’s a lot of other things going on and our model is not their top priority, so we should design it so that it fits into their ecosystem.” - Michelle Carney (3:27)
  • “If we aren’t thinking about privacy and ethics and explainability and usability from the beginning, then it’s not going to be embedded into our products. If we just treat usability of our ML models as a checkbox, then it just plays the role of a compliance function.” - Michelle Carney (11:52)
  • “I don’t think you need to know ML or machine learning in order to design for ML and machine learning. You don’t need to understand how to build a model, you need to understand what the model does. You need to understand what the inputs and the outputs are.” - Michelle Carney (18:45)
Links
09 Aug 2022097 - Why Regions Bank’s CDAO, Manav Misra, Implemented a Product-Oriented Approach to Designing Data Products00:35:22

Today, I chat with Manav Misra, Chief Data and Analytics Officer at Regions Bank. I begin by asking Manav what it was like to come in and implement a user-focused mentality at Regions, driven by his experience in the software industry. Manav details his approach, which included developing a new data product partner role and using effective communication to gradually gain trust and cooperation from all the players on his team. 

 

Manav then talks about how, over time, he solidified a formal framework for his team to be trained to use this approach and how his hiring is influenced by a product orientation. We also discuss his definition of data product at Regions, which I find to be one of the best I’ve heard to date. Today, Regions Bank’s data products are delivering tens of millions of dollars in additional revenue to the bank. Given those results, I also dig into the role of design and designers to better understand who is actually doing the designing of Regions’ data products to make them so successful. Later, I ask Manav what it’s like when designers and data professionals work on the same team and how UX and data visualization design are handled at the bank.

 

Towards the end, Manav shares what he has learned from his time at Regions and what he would implement in a new organization if starting over. He also expounds on the importance of empowering his team to ask customers the right questions and how a true client/stakeholder partnership has led to Manav’s most successful data products.

 

Highlights / Skip to:

 

  • Brief history of decision science and how it influenced the way data science and analytics work has been done (and unfortunately still is in many orgs) (1:47)
  • Manav’s philosophy and methods for changing the data science culture at Regions Bank to being product and user-driven (5:19)
  • Manav talks about the size of his team and the data product role within the team as well as what he had to do to convince leadership to buy in to the necessity of the data product partner role (10:54)
  • Quantifying and measuring the value of data products at Regions and some of his results (which include tens of millions of dollars in additional revenue) (13:05)
  • What’s a “data product” at Regions? Manav shares his definition (13:44)
  • Who does the designing of data products at Regions? (17:00)
  • The challenges and benefits of having a team comprised of both designers and data scientists (20:10)
  • Lessons Manav has learned from building his team and culture at Regions (23:09)
  • How Manav coaches his team and gives them the confidence to ask the right questions (27:17)
  • How true partnership has led to Manav’s most successful data products (31:46)

 

Quotes from Today’s Episode
  • Re: how traditional, non-product oriented enterprises do data work: “As younger people come out of data science programs…that [old] culture is changing. The folks coming into this world now are looking to make an impact and then they want to see what this can do in the real world.” — Manav 

 

  • On the role of the Data Product Partner: “We brought in people that had both business knowledge as well as the technical knowledge, so with a combination of both they could talk to the ‘Internal customers,’ of our data products, but they could also talk to the data scientists and our developers and communicate in both directions in order to form that bridge between the two.” — Manav

 

  • “There are products that are delivering tens of millions of dollars in terms of additional revenue, or stopping fraud, or any of those kinds of things that the products are designed to address, they’re delivering and over-delivering on the business cases that we created.” — Manav 

 

  • “The way we define a data product is this: an end-to-end software solution to a problem that the business has. It leverages data and advanced analytics heavily in order to deliver that solution.” — Manav 

 

  • “The deployment and operationalization is simply part of the solution. They are not something that we do after; they’re something that we design in from the start of the solution.” — Brian 

 

  • “Design is a team sport. And even if you don’t have a titled designer doing the work, if someone is going to use the solution that you made, whether it’s a dashboard, or report, or an email, or notification, or an application, or whatever, there is a design, whether you put intention behind it or not.” — Brian

 

  • “As you look at interactive components in your data product, which are, you know, allowing people to ask questions and then get answers, you really have to think through what that interaction will look like, what’s the best way for them to get to the right answers and be able to use that in their decision-making.” — Manav 

 

  • “I have really instilled in my team that tools will come and go, technologies will come and go, [and so] you’ll have to have that mindset of constantly learning new things, being able to adapt and take on new ideas and incorporate them in how we do things.” — Manav
Links
27 Jun 2023120 - The Portfolio Mindset: Data Product Management and Design with Nadiem von Heydebrand (Part 2)00:41:35

Today I’m continuing my conversation with Nadiem von Heydebrand, CEO of Mindfuel. In the conclusion of this special 2-part episode, Nadiem and I discuss the role of a Data Product Manager in depth. Nadiem reveals which fields data product managers are currently coming from, and how a new data product manager with a non-technical background can set themselves up for success in this new role. He also walks through his portfolio approach to data product management, and how to prioritize use cases when taking on a data product management role. Toward the end, Nadiem also shares personal examples of how he’s employed these strategies, why he feels it’s so important for engineers to be able to see and understand the impact of their work, and best practices around developing a data product team. 

Highlights / Skip to:

  • Brian introduces Nadiem and gives context for why the conversation with Nadiem led to a two-part episode (00:35)
  • Nadiem summarizes his thoughts on data product management and adds context on which fields he sees data product managers currently coming from (01:46)
  • Nadiem’s take on whether job listings for data product manager roles still have too many technical requirements (04:27)
  • Why some non-technical people fail when they transition to a data product manager role and the ways Nadiem feels they can bolster their chances of success (07:09)
  • Brian and Nadiem talk about their views on functional data product team models and the process for developing a data product as a team (10:11)
  • When Nadiem feels it makes sense to hire a data product manager and adopt a portfolio view of your data products (16:22)
  • Nadiem’s view on how to prioritize projects as a new data product manager (19:48)
  • Nadiem shares a story of when he took on an interim role as a head of data and how he employed the portfolio strategies he recommends (24:54)
  • How Nadiem evaluates perceived usability of a data product when picking use cases (27:28)
  • Nadiem explains why understanding go-to-market strategy is so critical as a data product manager (30:00)
  • Brian and Nadiem discuss the importance of today’s engineering teams understanding the value and impact of their work (32:09)
  • How Nadiem and his team came up with the idea to develop a SaaS product for data product managers (34:40)
Quotes from Today’s Episode
  • “So, data product management [...] is a combination of different capabilities [...]  [including] product management, design, data science, and machine learning. We covered this in viability, desirability, feasibility, and datability. So, these are four dimensions [that] you combine [...] together to become a data product manager.” — Nadiem von Heydebrand (02:34)

 

  • “There is no education for data product management today, there’s no university degree. ... So, there’s nobody out there—from my perspective—who really has all the four dimensions from day one. It’s more like an evolution: you’re coming from one of the [parallel business] domains or from one of the [parallel business] fields and then you extend your skill set over time.” — Nadiem von Heydebrand (03:04)
  • “If a product manager has very good communication skills and is able to break down the needs in a proper way or in a good understandable way to its tech lead, or its engineering lead or data science lead, then I think it works out super well. If this bridge is missing, then it becomes a little bit tricky because then the distance between the product manager and the development team is too far.” – Nadiem von Heydebrand (09:10)

 

  • “I think every data leader out there has an Excel spreadsheet or a list of prioritized use cases or the most relevant use cases for the business strategy… You can think about this list as a portfolio. You know, some of these use cases are super valuable; some of these use cases maybe will not work out, and you have to identify those which are bringing real return on investment when you put effort in there.” – Nadiem von Heydebrand (19:01)

 

  • “I’m not a magician for data product management. I just focused on a very strategic view on my portfolio and tried to identify those cases and those data products where I can believe I can easily develop them, I have a high degree of adoption with my lines of business, and I can truly measure the added revenue and the impact.” – Nadiem von Heydebrand (26:31)

 

  • “As a true data product manager, from my point of view, you are someone who is empathetic for the lines of businesses, to understand what their underlying needs and what the problems are. At the same time, you are a business person. You try to optimize the portfolio for your own needs, because you have business goals coming from your leadership team, from your head of data, or even from the person above, the CTO, CIO, even CEO. So, you want to make sure that your value contribution is always transparent, and visible, measurable, tangible.” – Nadiem von Heydebrand (29:20)

 

  • “If we look into classical product management, I mean, the product manager has to understand how to market and how to go to the market. And it’s this exactly the same situation with data product managers within your organization. You are as successful as your product performs in the market. This is how you measure yourself as a data product manager. This is how you define success for yourself.” – Nadiem von Heydebrand (30:58)
Links
24 Jan 2023109 - The Role of Product Management and Design in Turning ML/AI into a Valuable Business with Bob Mason from Argon Ventures00:32:43

Today I’m chatting with Bob Mason, Managing Partner at Argon Ventures. Bob is a VC who seeks out early-stage founders in the ML/AI space and helps them inform their go-to-market, product, and design strategies. In this episode, Bob reveals what he looks for in early-stage data and intelligence startups who are trying to leverage ML/AI. He goes on to explain why it’s important to identify what your strengths are and what you enjoy doing so you can surround yourself with the right team. Bob also shares valuable insight into how to earn trust with potential customers as an early-stage startup, how design impacts a product’s success, and his strategy for differentiating yourself and creating a valuable product outside of the ubiquitous “platform play.” 

 

Highlights/ Skip to:

  • Bob explains why and how Argon Ventures focuses their investments in intelligent industry companies (00:53)
  • Brian and Bob discuss the importance of prioritizing go-to-market strategy over technology (03:42)
  • How Bob views the career progression from data science to product management, and the ways in which his own career has paralleled that journey (07:21)
  • The role customer adoption and user experience play for Bob and the companies he invests in, both pre-investment and post-investment (11:10)
  • Brian and Bob discuss the design capabilities of different teams and why Bob feels it’s something leaders need to keep top of mind (15:25)
  • Bob explains his recommendation to seek out quick wins for AI companies who can’t expect customers to wait for an ROI (19:09)
  • The importance Bob sees in identifying early adopters during a sales cycle for early-stage startups (21:34)
  • Bob describes how being customer-centric allows start-ups to build trust, garner quick wins, and inform their product strategy (23:42)
  • Bob and Brian dive into Bob’s belief that solving intrinsic business problems by vertical increases a start-up’s chance of success substantially over “the platform play” (27:29)
  • Bob gives insight into product trends he believes are going to be extremely impactful in the near future (29:05)
Quotes from Today’s Episode
  • “In a former life, I was a software engineer, founder, and CTO myself, so I have to watch myself to not just geek out on the technology itself because the most important element when you’re determining if you want to move forward with investment or not, is this: is there a real problem here to be solved or is this technology in search of a problem?” — Bob Mason (01:51)
  • “User-centric research is really valuable, particularly at the earliest stages. If you’re just off by a degree or two, several years down the road, that can be a really material roadblock that you hit. And so, starting off on the right foot, I think is super, super valuable.” – Bob Mason (06:12)

 

  • “I don’t think the technical folks in an early-stage startup absolve themselves of not being really intimately involved with their go-to-market and who they’re ultimately creating value for.” – Bob Mason (07:07)

 

  • “When we’re making an investment decision, startups don’t generally have any customers, and so we don’t necessarily use the signal of long-term customer adoption as a driver for our initial investment decision. But it’s very much top of mind after investment and as we’re trying to build and bring the first version of the product to market. Being very thoughtful and mindful of sort of customer experience and long-term adoption is absolutely critical.” – Bob Mason (11:23)

 

  • “If you’re a scientist, the way you’re presenting both raw data and sort of summaries of data could be quite different than if you’re working with a business analyst that’s a few years out of college with a liberal arts degree. How you interpret results and then present those results, I think, is actually a very interesting design problem.” – Bob Mason (18:40)

 

  • “I think initially, a lot of early AI startups just kind of assumed that customers would be patient and let the system run, [waiting] 3, 6, 9, 12 months [to get this] magical ROI, and that’s just not how people (buyers) operate.” – Bob Mason (21:00)

 

  • “Re: platform plays: Obviously, you could still create a tremendous platform that’s very broad, but we think if you focus on the business problem of that particular vertical or domain, that actually creates a really powerful wedge so you can increase your value proposition. You could always increase the breadth of a platform over time. But if you’re not solving that intrinsic problem at the very beginning, you may never get the chance to survive.” – Bob Mason (28:24)
Links
21 Mar 2023113 - Turning the Weather into an Indispensable Data Product for Businesses with Cole Swain, VP Product at tomorrow.io00:38:53

Today I’m chatting with Cole Swain, VP of Product at Tomorrow.io. Tomorrow.io is an untraditional weather company that creates data products to deliver relevant business insights to their customers. Together, Cole and I explore the challenges and opportunities that come with building an untraditional data product. Cole describes some of the practical strategies he’s developed for collecting and implementing qualitative data from customers, as well as why he feels rapport-building with users is a critical skill for product managers. Cole also reveals how scientists are part of the fold when developing products at Tomorrow.io, and the impact that their product has on decision-making across multiple industries. 

Highlights/ Skip to:

  • Cole describes what Tomorrow.io does (00:56)
  • The types of companies that purchase Tomorrow.io and how they’re using the products (03:45)
  • Cole explains how Tomorrow.io developed practical strategies for helping customers get the insights they need from their products (06:10)
  • The challenges Cole has encountered trying to design a good user experience for an untraditional data product (11:08)
  • Cole describes a time when a Tomorrow.io product didn’t get adopted, and how he and the team pivoted successfully (13:01)
  • The impacts and outcomes of decisions made by customers using products from Tomorrow.io (15:16)
  • Cole describes the value of understanding your active users and what skills and attributes he feels make a great product manager (20:11)
  • Cole explains the challenges of being horizontally positioned rather than operating within an [industry] vertical (23:53)
  • The different functions that are involved in developing Tomorrow.io (28:08)
  • What keeps Cole up at night as the VP of Product for Tomorrow.io (33:47)
  • Cole explains what he would do differently if he could come into his role from the beginning all over again (36:14)
Quotes from Today’s Episode
  • “[Customers aren't] just going to listen to that objective summary and go do the action. It really has to be supplied with a tremendous amount of information around it in a concise way. ... The assumption upfront was just, if we give you a recommendation, you’ll be able to go ahead and go do that. But it’s just not the case.” – Cole Swain (13:40)
  • “The first challenge is designing this product in a way that you can communicate that value really fast. Because everybody who signs up for new product, they’re very lazy at the beginning. You have to motivate them to be able to realize that, hey, this is something that you can actually harness to change the way that you operate around the weather.” – Cole Swain (11:46)
  • “People kind of overestimate at times the validity of even just real-time data. So, how do you create an experience that’s intuitive enough to be decision support and create confidence that this tool is different for them, while still having the empathy with the user, that this is still just a forecast in itself; you have to make your own decisions around it.” – Cole Swain (12:43)
  • “What we often find in weather is that the bigger decisions aren’t made in silos. People don’t feel confident to make it on their own and they require a team to be able to come in because they know the unpredictability of the scenarios and they feel that they need to be able to have partners or comrades in the situation that are in it together with them.” – Cole Swain (17:24)
  • “To me, there’s two super key capabilities or strengths in being a successful product manager. It’s pattern recognition and it’s the ability to create fast rapport with a customer: in your first conversation with a customer, within five minutes of talking with them, connect with them.” – Cole Swain (22:06)
  • “[It’s] not about ‘how can we deliver the best value singularly to a particular client,’ but ‘how can we recognize the patterns that rise the tide for all of our customers?’ And it might sound obvious that that’s something that you need to do, but it’s so easy to teeter into the direction of building something unique for a particular vertical.” – Cole Swain (25:41)
  • “Our sales team is just always finding new use cases. And we have to continue to say no and we have to continue to be disciplined in this arena. But I’d be lying to tell you if that didn’t keep me up at night when I hear about this opportunity of this solution we could build, and I know it can be done in a matter of X amount of time. But the risk of doing that is just too high, sometimes.” – Cole Swain (35:42)
Links
20 Apr 2021063 - Beyond Compliance: Designing Data Products With Data Privacy As a UX Benefit with The Data Diva (Debbie Reynolds)00:35:30

Debbie Reynolds is known as “The Data Diva” — and for good reason. 

 

In addition to being founder, CEO and chief data privacy officer of her own successful consulting firm, Debbie was named to the Global Top 20 CyberRisk Communicators by The European Risk Policy Institute in 2020. She’s also written a few books, such as The GDPR Challenge: Privacy, Technology, and Compliance In An Age of Accelerating Change, as well as articles for other publications.

 

If you are building data products, especially customer-facing software, you’ll want to tune into this episode. Debbie and I had an awesome discussion about data privacy from the lens of user experience instead of the typical angle we are all used to: legal compliance. While collecting user data can enable better user experiences, we can also break a customer’s trust if we don’t request access properly.

 

In our chat, we covered:

  • 'Humans are using your product': What it means to be a 'data steward' when building software. (0:27)
  • 'Privacy by design': The importance for software creators to think about privacy throughout the entire product creation process. (4:32)
  • The different laws (and lack thereof) regarding data privacy — and the importance to think about a product's potential harm during the design process. (6:58)
  • The importance of having 'diversity at all levels' when building data products. (16:41)
  • The role of transparency in data collection. (19:41)
  • Fostering a positive and collaborative relationship between a product or service’s designers, product owners, and legal compliance experts. (24:55)
  • The future of data monetization and how it relates to privacy. (29:18)

 

Resources and Links:

Quotes from Today’s Episode

When it comes to your product, humans are using it. Regardless of whether the users are internal or external — what I tell people is to put themselves in the shoes of someone who’s using this and think about what you would want to have done with your information or with your rights. Putting it in that context, I think, helps people think and get out of their head about it. Obviously there’s a lot of skill and a lot of experience that it takes to build these products and think about them in technical ways. But I also try to tell people that when you’re dealing with data and you’re building products, you’re a data steward. The data belongs to someone else, and you’re holding it for them, or you’re allowing them to either have access to it or leverage it in some way. So, think about yourself and what you would think you would want done with your information. - Debbie (3:28)

 

Privacy by design is looking at the fundamental levels of how people are creating things, and having them think about privacy as they’re doing that creation. When that happens, then privacy is not a difficult thing at the end. Privacy really isn’t something you could tack on at the end of something; it’s something that becomes harder if it’s not baked in. So, being able to think about those things throughout the process makes it easier. We’re seeing situations now where consumers are starting to vote with their feet — if they feel like a tool or a process isn’t respecting their privacy rights, they want to be able to choose other things. So, I think that’s just the way of the world. .... It may be a situation where you’re going to lose customers or market share if you’re not thinking about the rights of individuals. - Debbie (5:20)

 

I think diversity at all levels is important when it comes to data privacy, such as diversity in skill sets, points of view, and regional differences. … I think people in the EU — because privacy is a fundamental human right — feel about it differently than we do here in the US where our privacy rights don’t really kick in unless it’s a transaction. ...  The parallel I say is that people in Europe feel about privacy like we feel about freedom of speech here — it’s just very deeply ingrained in the way that they do things. And a lot of the time, when we’re building products, we don’t want to be collecting data or doing something in ways that would harm the way people feel about your product. So, you definitely have to be respectful of those different kinds of regimes and the way they handle data. … I’ll give you a biased example that someone had showed me, which was really interesting. There was a soap dispenser that was created where you put your hand underneath and then the soap comes out. It’s supposed to be a motion detection thing. And this particular one would not work on people of color. I guess whatever sensor they created, it didn’t have that color in the spectrum of what they thought would be used for detection or whatever. And so those are problems that happen a lot if you don’t have diverse people looking at these products. Because you — as a person that is creating products — you really want the most people possible to be able to use your products. I think there is an imperative on the economic side to make sure these products can work for everyone. - Debbie (17:31)

 

Transparency is the wave of the future, I think, because so many privacy laws have it. Almost any privacy law you think of has transparency in it, some way, shape, or form. So, if you’re not trying to be transparent with the people that you’re dealing with, or potential customers, you’re going to end up in trouble. - Debbie (24:35) 

 

In my experience, while I worked with lawyers in the digital product design space — and it was heaviest when I worked at a financial institution — I watched how the legal and risk department basically crippled stuff constantly. And I say “cripple” because the feeling that I got was there’s a line between adhering to the law and then also—some of this is a gray area, like disclosure. Or, if we show this chart that has this information, is that construed as advice? I understand there’s a lot of legal regulation there. My feeling was, there’s got to be a better way for compliance departments and lawyers that genuinely want to do the right thing in their work to understand how to work with product design, digital design teams, especially ones using data in interesting ways. How do you work with compliance and legal when we’re designing digital products that use data so that it’s a team effort, and it’s not just like, “I’m going to cover every last edge because that’s what I’m here to do is to stop anything that could potentially get us sued.” There is a cost to that. There’s an innovation cost to that. It’s easier, though, to look at the lawyer and say, “Well, I guess they know the law better, so they’re always going to win that argument.” I think there’s a potential risk there. - Brian (25:01)

 

Trust is so important. A lot of times in our space, we think about it with machine learning, and AI, and trusting the model predictions and all this kind of stuff, but trust is a brand attribute as well and it’s part of the reason I think design is important because the designers tend to be the most empathetic and user-centered of the bunch. That’s what we’re often there to do is to keep that part in check because we can do almost anything these days with the tech and the data, and some of it’s like, “Should we do this?” And if we do do it, how do we do it so we’re on brand, and the trust is built, and all these other factors go into that user experience. - Brian (34:21)

24 Dec 2024159 - Uncorking Customer Insights: How Data Products Revealed Hidden Gems in Liquor & Hospitality Retail00:40:47

Today, I’m talking to Andy Sutton, GM of Data and AI at Endeavour Group, Australia's largest liquor and hospitality company. In this episode, Andy—who is also a member of the Data Product Leadership Community (DPLC)—shares his journey from traditional, functional analytics to a product-led approach that drives their mission to leverage data and personalization to build the “Spotify for wines.” This shift has greatly transformed how Endeavour’s digital and data teams work together, and Andy explains how their advanced analytics work has paid off in terms of the company’s value and profitability.

 

 

You’ll learn about the often overlooked importance of relationships in a data-driven world, and why Andy insists on understanding how users do their jobs in the wild (with and without your product(s) in hand). Earlier this year, Andy also gave the DPLC community a deeper look at how they brew data products at EDG, and that recording is available to our members in the archive.

 

We covered:
  • What it was like at EDG before Andy started adopting a producty approach to data products and how things have now changed (1:52)
  • The moment that caused Andy to change how his team was building analytics solutions (3:42)
  • The amount of financial value that Andy's increased with his scaling team as a result of their data product work (5:19)
  • How Andy and Endeavour use personalization to help build “the Spotify of wine” (9:15)
  • What the team under Andy required in order to make the transition to being product-led (10:27)
  • The successes seen by Endeavour through the digital and data teams’ working relationship (14:04)
  • What data product management looks like for Andy’s team (18:45)
  • How Andy and his team find solutions for bridging the adoption gap (20:53)
  • The importance of exposure time to end users for the adoption of a data product (23:43)
  • How talking to the pub staff at EDG’s bars and restaurants helps his team build better data products (27:04)
  • What Andy loves about working for Endeavour Group (32:25)
  • What Andy would change if he could rewind back to 2022 and do it all over (34:55)
  • Final thoughts (38:25)

 

 

Quotes from Today’s Episode
  • “I think the biggest thing is the value we unlock in terms of incremental dollars, right? I’ve not worked in analytics team before where we’ve been able to deliver a measurable value…. So, we’re actually—in theory—we’re becoming a profit center for the organization, not just a cost center. And so, there’s kind of one key metric. The second one, we do measure the voice of the team and how engaged our team are, and that’s on an upward trend since we moved to the new operating model, too. We also measure [a type of] “voice of partner” score [and] get something like a 4.1 out of 5 on that scale. Those are probably the three biggest ones: we’re putting value in, and we’re delivering products, I guess, our internal team wants to use, and we are building an enthused team at the same time.” - Andy Sutton (16:18)
  • “You can put an [unfinished] product in front of an end customer, and they will give you quality feedback that you can then iterate on quickly. You can do that with an internal team, but you’ll lose credibility. Internal teams hold their analytics colleagues to a higher standard than the external customers. We’re trying to change how people do their roles. People feel very passionate about the roles they do, and how they do them, and what they bring to that role. We’re trying to build some of that into products. It requires probably more design consideration than I’d anticipated, and we’re still bringing in more designers to help us move closer to the start line.” - Andy Sutton (19:25)
  • “ [Customer research] is becoming critical in terms of the products we’re building. You’re building a product, a set of products, or a process for an operations team. In our context, an operations team can mean a team of people who run a pub. It’s not just about convincing me, my product managers, or my data scientists that you need research; we want to take some of the resources out of running that bar for a period of time because we want to spend time with [the pub staff] watching, understanding, and researching. We’ve learned some of these things along the way… we’ve earned the trust, we’ve earned that seat at the table, and so we can have those conversations. It’s not trivial to get people to say, ‘I’ll give you a day-long workshop, or give you my team off of running a restaurant and a bar for the day so that they can spend time with you, and so you can understand our processes.’” -  Andy Sutton (24:42)
  • “ I think what is very particular to pubs is the importance of the interaction between the customer and the person serving the customer. [Pubs] are about the connections between the staff and the customer, and you don’t get any of that if you’re just looking at things from a pure data perspective… You don’t see the [relationships between pub staff and customer] in the [data], so how do you capture some of that in your product? It’s about understanding the context of the data, not just the data itself.” - Andy Sutton (28:15)
  • “Every winery, every wine grower, every wine has got a story. These conversations [and relationships] are almost natural in our business. Our CEO started work on the shop floor in one of our stores 30 years ago. That kind of relationship stuff percolates through the organization. Having these conversations around the customer and internal stakeholders in the context of data feels a lot easier because storytelling and relationships are the way we get things done. An analytics team may get frustrated with people who can’t understand data, but it’s [the analytics team’s job] to help bridge that gap.” - Andy Sutton (32:34)

Links Referenced
08 Feb 2022084 - The Messy Truth of Designing and Building a Successful Analytics SAAS Product featuring Jonathan Kay (CEO, Apptopia)00:39:56

Building a SAAS business that focuses more on building a research tool than a data product is how Jonathan Kay, CEO and Co-Founder of Apptopia, frames his company’s work. Jonathan and I worked together when Apptopia pivoted from its prior business into a mobile intelligence platform for brands. Part of the reason I wanted to have Jonathan talk to you all is because I knew that he would strip away all the easy-to-see shine and varnish from their success and get really candid about what worked…and what hasn’t…during their journey to turn a data product into a successful SAAS business. So get ready: Jonathan is going to reveal the very curvy line that Apptopia has taken to get where they are today.

 

In this episode, Jonathan also describes one of the core product design frameworks that Apptopia is currently using to help deliver actionable insights to their customers. For Jonathan, Apptopia’s research-centric approach changes the ways in which their customers can interact with data and is helping eliminate the lull between “the why” and “the actioning” with data.

 

Here are some of the key parts of the interview:

  • An introduction to Apptopia and how they serve brands in the world of mobile app data (00:36)
  • The current UX gaps that Apptopia is working to fill (03:32)
  • How Apptopia balances flexibility with ease-of-use (06:22)
  • How Apptopia establishes the boundaries of its product when it’s just one part of a user’s overall workflow (10:06)
  • The challenge of “low use, low trust” and getting “non-data” people to act (13:45)
  • Developing strong conclusions and opinions and presenting them to customers (18:10)
  • How Apptopia’s product design process has evolved when working with data, particularly at the UI level (21:30)
  • The relationship between Apptopia’s buyer, versus the users of the product and how they balance the two (24:45)
  • Jonathan’s advice for hiring good data product design and management staff (29:45)
  • How data fits into Jonathan’s own decision making as CEO of Apptopia (33:21)
  • Jonathan’s advice for emerging data product leaders (36:30)
Quotes from Today’s Episode

 

  • “I want to just give you some props on the work that you guys have done and seeing where it's gone from when we worked together. The word grit, I think, is the word that I most associate with you and Eli [former CEO, co-founder] from those times. It felt very genuine that you believed in your mission and you had a long-term vision for it.” - Brian T. O’Neill (@rhythmspice) (02:08)
  • “A research tool gives you the ability to create an input, which might be, ‘I want to see how Netflix is performing.’ And then it gives you a bunch of data. And it gives you good user experience that allows you to look for the answer to the question that’s in your head, but you need to start with a question. You need to know how to manipulate the tool. It requires a huge amount of experience and understanding of the data consumer in order to actually get the answer to the question. For me, that feels like a miss because I think the amount of people who need and can benefit from data, and the amount of people who know how to instrument the tools to get the answers from the data—well, I think there’s a huge disconnect in those numbers. And just like when I take my car to get service, I expected the car mechanic knows exactly what the hell is going on in there, right? Like, our obligation as a data provider should be to help people get closer to the answer. And I think we still have some room to go in order to get there.” - Jonathan Kay (@JonathanCKay) (04:54)
  • “You need to present someone the what, the why, etc.—then the research component [of your data product] is valuable. And so it’s not that having a research tool isn’t valuable. It’s just, you can’t have the whole thing be that. You need to give them the what and the why first.” - Jonathan Kay (@JonathanCKay) (08:45)
  • “You can't put equal resources into everything. Knowing the boundaries of your data product is important, but it's a hard thing to know sometimes where to draw those. A leader has to ask, ‘am I getting outside of my sweet spot? Is this outside of the mission?’ Figuring out the right boundaries goes back to customer research.” - Brian T. O’Neill (@rhythmspice) (12:54)
  • “What would I have done differently if I was starting Apptopia today? I would have invested into the quality of the data earlier. I let the product design move me into the clouds a little bit, because sometimes you're designing a product and you're designing visuals, but we were doing it without real data. One of the biggest things that I've learned over a lot of mistakes over a long period of time, is that we've got to incorporate real data in the design process.” - Jonathan Kay (@JonathanCKay) (20:09)
  • “We work with one of the biggest food manufacturer distributors in the world, and they were choosing between us and our biggest competitor, and what they essentially did was [say] “I need to put this report together every two weeks. I used your competitor’s platform during a trial and your platform during the trial, and I was able to do it two hours faster in your platform, so I chose you—because all the other checkboxes were equal.” However, at the end of the day, if we could get two hours a week back by using your tool, saving time and saving money and making better decisions, they’re all equal ROI contributors.” - Jonathan Kay on UX (@JonathanCKay) (27:23)
  • “In terms of our product design and management hires, we're typically looking for people who have not worked at one company for 10 years. We've actually found a couple phenomenal designers that went from running their own consulting company to wanting to join full time. That was kind of a big win because one of them had a huge breadth of experience working with a bunch of different products in a bunch of different spaces.”- Jonathan Kay (@JonathanCKay) (30:34)
  • “In terms of how I use data when making decisions for Apptopia, here’s an example. If you break our business down into different personas, my understanding one time was that one of our personas was more stagnant. The data, however, did not support that. And so we're having a resource planning meeting, and I'm saying, ‘let's pull back resources a little bit,’ but [my team is] showing me data that says my assumption on that customer segment is actually incorrect. I think entrepreneurs and passionate people need data more because we have so much conviction in our decisions—and because of that, I'm more likely to make bad decisions. Theoretically good entrepreneurs should have good instincts, and you need to trust those, but what I’m saying is, you also need to check those. It's okay to make sure that your instinct is correct, right? And one of the ways that I’ve gotten more mature is by forcing people to show me data to either back up my decision in either direction and being comfortable being wrong. And I am wrong at least half of the time with those things!” - Jonathan Kay (@JonathanCKay) (34:09)
20 Feb 2024137 - Immature Data, Immature Clients: When Are Data Products the Right Approach? feat. Data Product Architect, Karen Meppen00:44:50

This week, I'm chatting with Karen Meppen, a founding member of the Data Product Leadership Community and a Data Product Architect and Client Services Director at Hakkoda. Today, we're tackling the difficult topic of developing data products in situations where a product-oriented culture and data infrastructures may still be emerging or “at odds” with a human-centered approach. Karen brings extensive experience and a strong point of view on how to effectively navigate the early stages of data maturity. Together we look at the major hurdles that businesses encounter when trying to properly exploit data products, as well as the necessity of leadership support and strategy alignment in these initiatives. Karen's insights offer a roadmap for those seeking to adopt a product and UX-driven methodology when significant tech or cultural hurdles may exist.

Highlights/ Skip to:

  • I introduce Karen Meppen and the challenges of dealing with data products in places where the data and tech aren't quite there yet (00:00)
  • Karen shares her thoughts on what it's like working with "immature data" (02:27)
  • Karen breaks down what a data product actually is (04:20)
  • Karen and I discuss why having executive buy-in is crucial for moving forward with data products (07:48)
  • The sometimes fuzzy definition of "data products." (12:09)
  • Karen defines “shadow data teams” and explains how they sometimes conflict with tech teams (17:35)
  • How Karen identifies the nature of each team to overcome common hurdles of connecting tech teams with business units (18:47)
  • How she navigates conversations with tech leaders who think they already understand the requirements of business users (22:48)
  • Using design prototypes and design reviews with different teams to make sure everyone is on the same page about UX (24:00)
  • Karen shares stories from earlier in her career that led her to embrace human-centered design to ensure data products actually meet user needs (28:29)
  • We reflect on our chat about UX, data products, and the “producty” approach to ML and analytics solutions (42:11) 
Quotes from Today’s Episode
  • "It’s not really fair to get really excited about what we hear about or see on LinkedIn, at conferences, etc. We get excited about the shiny things, and then want to go straight to it when [our] organization [may not be] ready to do that, for a lot of reasons." - Karen Meppen (03:00)
  • "If you do not have support from leadership and this is not something [they are] passionate about, you probably aren’t a great candidate for pursuing data products as a way of working." - Karen Meppen (08:30)
  • "Requirements are just friendly lies." - Karen, quoting Brian about how data teams need to interpret stakeholder requests  (13:27)
  • "The greatest challenge that we have in technology is not technology, it’s the people, and understanding how we’re using the technology to meet our needs." - Karen Meppen (24:04)
  • "You can’t automate something that you haven’t defined. For example, if you don’t have clarity on your tagging approach for your PII, or just the nature of all the metadata that you’re capturing for your data assets and what it means or how it’s handled—to make it good, then how could you possibly automate any of this that hasn’t been defined?" - Karen Meppen (38:35)
  • "Nothing upsets an end-user more than lifting-and-shifting an existing report with the same problems it had in a new solution that now they’ve never used before." - Karen Meppen (40:13)
  • “Early maturity may look different in many ways depending upon the nature of [the] business you’re doing, the structure of your data team, and how it interacts with folks.” (42:46)
Links 
26 Nov 2024157 - How this materials science SAAS company brings PM+UX+data science together to help materials scientists accelerate R&D00:34:58

R&D for materials-based products can be expensive, because improving a product’s materials takes a lot of experimentation that historically has been slow to execute. In traditional labs, you might change one variable, re-run your experiment, and see if the data shows improvements in your desired attributes (e.g. strength, shininess, texture/feel, power retention, temperature, stability, etc.). However, today, there is a way to leverage machine learning and AI to reduce the number of experiments a material scientist needs to run to gain the improvements they seek. Materials scientists spend a lot of time in the lab—away from a computer screen—so how do you design a desirable informatics SAAS that actually works, and fits into the workflow of these end users?    
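The episode doesn’t get into the specific algorithms MaterialsZone uses, but the pattern Ori describes (letting a model trained on past experiments suggest which parameter combination to try next) is commonly implemented as a surrogate-model loop. Below is a minimal sketch of that general idea, assuming scikit-learn; every dataset and parameter in it is invented for illustration:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    rng = np.random.default_rng(0)

    # Invented lab history: each row is one formulation (three process
    # parameters scaled to 0-1); y is the measured property to maximize.
    X_tried = rng.random((12, 3))
    y_tried = -np.sum((X_tried - 0.6) ** 2, axis=1)  # stand-in for lab results

    # Candidate formulations the lab could run next.
    X_candidates = rng.random((500, 3))

    # Surrogate model: learn the parameter -> property relationship
    # from the experiments run so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_tried, y_tried)

    # Upper-confidence-bound acquisition: prefer candidates predicted to
    # score well (mu) or that the model is still uncertain about (sigma).
    mu, sigma = gp.predict(X_candidates, return_std=True)
    next_idx = int(np.argmax(mu + 1.96 * sigma))
    print("Suggested next formulation:", np.round(X_candidates[next_idx], 3))

Each new lab result gets appended to the training set and the loop repeats; the model, rather than one-variable-at-a-time intuition, decides which experiment is worth running next.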

As the Chief Product Officer at MaterialsZone, Ori Yudilevich came on Experiencing Data with me to talk about this challenge and how his PM, UX, and data science teams work together to produce a SAAS product that makes the benefits of materials informatics so valuable that materials scientists depend on their solution to be time and cost-efficient with their R&D efforts.   

We covered:
  • (0:45) Explaining what Ori does at MaterialsZone and who their product serves
  • (2:28) How Ori and his team help make material science testing more efficient through their SAAS product
  • (9:37) How they design a UX that can work across various scientific domains
  • (14:08) How “doing product” at MaterialsZone matured over the past five years
  • (17:01) Explaining the "Wizard of Oz" product development technique
  • (21:09) The importance of integrating UX designers into the "Wizard of Oz"
  • (23:52) The challenges MaterialsZone faces when trying to get users to adopt their product
  • (32:42) Advice Ori would've given himself five years ago
  • (33:53) Where you can find more from MaterialsZone and Ori

Quotes from Today’s Episode
  • “The fascinating thing about materials science is that you have this variety of domains, but all of these things follow the same process. One of the problems [consumer goods companies] face is that they have to do lengthy testing of their products. This is something you can use machine learning to shorten. [Product research] is an iterative process that typically takes a long time. Using your data effectively and using machine learning to predict what can happen, what’s better to try out, and what will reduce costs can accelerate time to market.” - Ori Yudilevich (3:47)
  • “The difference [in time spent testing a product] can be up to 70% [i.e. you can run 70% fewer experiments using ML.]  That [also] means 70% less resources you’re using. Under the ‘old system’ of trial and error, you were just trying out a lot of things. The human mind cannot process a large number of parameters at once, so [a materials scientist] would just start playing only with [one parameter at a time]. You’ll have many experiments where you just try to optimize [for] one parameter, but then you might have 20, 30, or 100 more [to test]. Using machine learning, you can change a lot of parameters at once. The model can learn what has the most effect, what has a positive effect, and what has a negative effect. The differences can be really huge.” - Ori Yudilevich (5:50)
  • “Once you go deeper into a use case, you see that there are a lot of differences. The types of raw materials, the data structure, the quantity of data, etc. For example, with batteries, you have lots of data because you can test hundreds all at once. Whereas with something like ceramics, you don’t try so many [experiments]. You just can’t. It’s much slower. You can’t do so many [experiments] in parallel. You have much less data. Your models are different, and your data structure is different. But there’s also quite a lot of commonality because you’re storing the data. In the end, you have each domain, some raw materials, formulations, tests that you’re doing, and different statistical plots that are very common.” - Ori Yudilevich (11:24)
  • “We’ll typically do what we call the ‘Wizard of Oz’ technique. You simulate as if you have a feature, but you’re actually working for your client behind the scenes. You tell them [the simulated feature] is what you’re doing, but then measure [the client’s response] to understand if there’s any point in further developing that feature. Once you validate it, have enough data, and know where the feature is going, then you’ll start designing it and releasing it in incremental stages. We’ve made a lot of progress in how we discover opportunities and how we build something iteratively to make sure that we’re always going in the right direction” - Ori Yudilevich (15:56)
  • “The main problem we’re encountering is changing the mindset of users. Our users are not people who sit in front of a computer. These are researchers who work in [a materials science] lab. The challenge [we have] is getting people to use the platform more. To see it’s worth [their time] to look at some insights, and run the machine learning models. We’re always looking for ways to make that transition faster… and I think the key is making [the user experience] just fun, easy, and intuitive.” - Ori Yudilevich (24:17)
  • “Even if you make [the user experience] extremely smooth, if [users] don’t see what they get out of it, they’re still not going to [adopt your product] just for the sake of doing it. What we find is if this [product] can actually make them work faster or develop better products—that gets them interested. If you’re adopting these advanced tools, it makes you a better researcher and worker. People who [adopt those tools] grow faster. They become leaders in their team, and they slowly drag the others in.” - Ori Yudilevich (26:55)
  • “Some of [MaterialsZone’s] most valuable employees are the people who have been users. Our product manager is a materials scientist. I’m not a material scientist, and it’s hard to imagine being that person in the lab. What I think is correct turns out to be completely wrong because I just don’t know what it’s like. Having [material scientists] who’ve made the transition to software and data science? You can’t replace that.” - Ori Yudilevich (31:32)

Links Referenced

Website: https://www.materials.zone

LinkedIn: https://www.linkedin.com/in/oriyudilevich/

Email: ori@materials.zone

25 Jul 2023122 - Listener Questions Answered: Conducting Effective Discovery for Data Products with Brian T. O’Neill00:33:46

Today I’m answering a question that was submitted to the show by listener Will Angel, who asks how he can prioritize and scale effective discovery throughout the data product development process. Throughout this episode, I explain why discovery work is a process that should be taking place throughout the lifecycle of a project, rather than a defined period at the start of the project. I also emphasize the value of understanding the benefit users will see from the product as the main goal, and how to streamline the effectiveness of the discovery process. 

Highlights/ Skip to:

  • Brian introduces today’s topic, Discovery with Data Products, with a listener question (00:28)
  • Why Brian sees discovery work as something that is ongoing throughout the lifecycle of a project (01:53)
  • Brian tackles the first question of how to avoid getting killed by the process overhead of discovery and prioritization (03:38)
  • Brian discusses his take on the question, “What are the ultimate business and user benefits that the beneficiaries hope to get from the product?” (06:02)
  • The value Brian sees in stating anti-goals and anti-personas (07:47)
  • How creative work is valuable despite the discomfort of not being execution-oriented (09:35)
  • Why customer and stakeholder research activities need to be ongoing efforts (11:20)
  • The two modes of design that Brian uses and their distinct purposes (15:09)
  • Brian explains why a clear strategy is critical to proper prioritization (19:36)
  • Why doing a few things really well usually beats out delivering a bunch of features and products that don’t get used (23:24)
  • Brian on why saying “no” can be a gift when used correctly (27:18)
  • How you can join the Data Product Leadership Community for more dialog like this and how to submit your own questions to the show (32:25)
Quotes from Today’s Episode
  • “Discovery work, to me, is something that largely happens up front at the beginning of a project, but it doesn’t end at the beginning of the project or product initiative, or whatever it is that you’re working on. Instead, I think discovery is a continual thing that’s going on all the time.” — Brian T. O’Neill (01:57)
  • “As tooling gets easier and easier and we need to stand up less infrastructure and basic pipelining in order to get from nothing to something, I think more of the work simply does become the discovery part of the work. And that is always going to feel somewhat inefficient because by definition it is.” — Brian T. O’Neill (04:48)
  • “Measuring [project management metrics] does not tell us whether or not the product is going to be valuable. It just tells us how fast are we writing the code and doing execution against something that may or may not actually have any value to the business at all.” — Brian T. O’Neill (07:33)
  • “How would you measure an improvement in the beneficiaries' lives? Because if you can improve their life in some way—and this often means me at work— the business value is likely to follow there.” — Brian T. O’Neill (18:42)
  • “Without a clear strategy, you’re not going to be able to do prioritization work efficiently because you don’t know what success looks like.” — Brian T. O’Neill (19:49)
  • “Doing a few things really well probably beats delivering a lot of stuff that doesn’t get used. There’s little point in a portfolio of data products that is really wide, but it’s very shallow in terms of value.” — Brian T. O’Neill (23:27)
  • “Anytime you’re going to be changing behavior or major workflows, the non-technical costs and work increase. And we have to figure out, ‘How are we going to market this and evangelize it and make people see the value of it?’ These types of behavior changes are really hard to implement and they need to be figured out during the design of the solution — not afterwards.” — Brian T. O’Neill (26:25)
Links
26 Jul 2022096 - Why Chad Sanderson, Head of Product for Convoy’s Data Platform, is a Champion of Data UX00:37:36

Today I chat with Chad Sanderson, Head of Product for Convoy’s data platform. I begin by having Chad explain why he calls himself a “data UX champion” and what inspired his interest in UX. Coming from a non-UX background, Chad explains how he came to develop a strategy for addressing the UX pain points at Convoy—a digital freight network. They “use technology to make freight more efficient, reducing costs for some of the nation’s largest brands, increasing earnings for carriers, and eliminating carbon emissions from our planet.” We also get into the metrics of success that Convoy uses to measure UX and why Chad is so heavily focused on user workflow when making the platform user-centered.

 

Later, Chad shares his definition of a data product, and how his experience with building software products has overlapped with data products. He also shares what he thinks is different about creating data products vs. traditional software products. Chad then explains Convoy’s approach to prototyping and the value of partnering with users in the design process. We wrap up by discussing how UX work gets accomplished on Chad’s team, given it doesn’t include any titled UX professionals. 

 

Highlights:

  • Chad explains how he became a data UX champion and what prompted him to care about UX (1:23)
  • Chad talks about his strategy for beginning to address the UX issues at Convoy (4:42)
  • How Convoy measures UX improvement (9:19)
  • Chad talks about troubleshooting user workflows and its relevance to design (15:28)
  • Chad explains what Convoy is and the makeup of his data platform team (21:00)
  • What is a data product? Chad gives his definition and the similarities and differences between building software versus data products (23:21)
  • Chad talks about using low fidelity work and prototypes to optimize solutions and resources in the long run (27:49)
  • We talk about the value of partnering with users in the design process (30:37)
  • Chad talks about the distribution of UX labor on his team (32:15)
Quotes from Today’s Episode

 

Re: user research: "The best content that you get from people is when they are really thinking about what to say next; you sort of get into a free-flowing exchange of ideas. So it’s important to find the topic where someone can just talk at length without really filtering themselves. And I find a good place to start with that is to just talk about their problems. What are the painful things that you’ve experienced in data in the last month or in the last week?" - Chad 

 

Re: UX research: "I often recommend asking users to show you something they were working on recently, particularly when they were having a problem accomplishing their goal. It’s a really good way to surface UX issues because the frustration is probably fresh." - Brian

 

Re: user feedback, “One of the really great pieces of advice that I got is, if you’re getting a lot of negative feedback, this is actually a sign that people care. And if people care about what you’ve built, then it’s better than overbuilding from the beginning.” - Chad

 

“What we found [in our research around workflow], though, sometimes counterintuitively, is that the steps that are the easiest and simplest for a customer to do that I think most people would look at and say, ‘Okay, it’s pretty low ROI to invest in some automated solution or a product in this space,’ are sometimes the most important things that you can [address in your data product] because of the impacts that it has downstream.” - Chad 

 

Re: user feedback, “The amazing thing about building data products, and I guess any internal products is that 100% of your customers sit ten feet away from you. [...] When you can talk to 100% of [your users], you are truly going to understand [...] every single persona. And that is tremendously effective for creating compelling narratives about why we need to build a particular thing.” - Chad 

 

“If we can get people to really believe that this data product is going to solve the problem, then usually, we like to turn those people into advocates and evangelists within the company, and part of their job is to go out and convince other people about why this thing can solve the problem.” - Chad 

 

Links:
01 Oct 2024153 - What Impressed Me About How John Felushko Does Product and UX at the Analytics SAAS Company, LabStats00:57:31

In today’s episode, I’m joined by John Felushko, a product manager at LabStats who impressed me after we recently had a 1x1 call together. John and his team have developed a successful product that helps universities track and optimize their software and hardware usage so schools make smart investments. However, John also shares how culture and value are very tied together—and why their product isn’t a fit for every school, and every country. John shares how important customer relationships are, how his team designs great analytics user experiences, how they do user research, and what he learned making high-end winter sports products that’s relevant to leading a SAAS analytics product. Combined with John’s background in history and the political economy of finance, John paints some very colorful stories about what they’re getting right—and how they’ve course corrected over the years at LabStats.

   

Highlights/ Skip to:

  • (0:46) What is the LabStats product 
  • (2:59) Orienting analytics around customer value instead of IT/data
  • (5:51) "Producer of Persistently Profitable Product Process"
  • (11:22) How they make product adjustments based on previous failures
  • (15:55) Why a lack of cultural understanding caused LabStats to fail internationally
  • (18:43) Quantifying value beyond dollars and cents
  • (25:23) How John is able to work so closely with his customers without barriers
  • (30:24) Who makes up the LabStats product research team
  • (35:04) How strong customer relationships help inform the UX design process
  • (38:29) Getting senior management to accept that you can't regularly and accurately predict when you’ll be feature-complete and ship
  • (43:51) Where John learned his skills as a successful product manager
  • (47:20) Where you can go to cultivate the non-technical skills to help you become a better SAAS analytics product leader
  • (51:00) What advice would John Felushko have given himself 10 years ago?
  • (56:19) Where you can find more from John Felushko

 

Quotes from Today’s Episode
  • “The product process is [essentially] really nothing more than the scientific method applied to business. Every product is an experiment - it has a hypothesis about a problem it solves. At LabStats [we have a process] where we go out and clearly articulate the problem. We clearly identify who the customers are, and who are [people at other colleges] having that problem. Incrementally and as inexpensively as possible, [we] test our solutions against those specific customers. The success rate [of testing solutions by cross-referencing with other customers] has been extremely high.” - John Felushko (6:46)
  • “One of the failures I see in Americans is that we don’t realize how much culture matters. Americans have this bias to believe that whatever is valuable in my culture is valuable in other cultures. Value is entirely culturally determined and subjective. Value isn’t a number on a spreadsheet. [LabStats positioned our product] as something that helps you save money and be financially efficient. In French government culture, financial efficiency is not a top priority. Spending government money on things like education is seen as a positive good. The more money you can spend on it, the better. So, the whole message of financial efficiency wasn’t going to work in that market.” - John Felushko (16:35)
  • “What I’m really selling with data products is confidence. I’m selling assurance. I’m selling an emotion. Before I was a product manager, I spent about ten years in outdoor retail, selling backpacks and boots. What I learned from that is you’re always selling emotion, at every level. If you can articulate the ROI, the real value is that the buyer has confidence they bought the right thing.” - John Felushko (20:29)
  • “[LabStats] has three massive, multi-million dollar horror stories in our past where we [spent] millions of dollars in development work for no results. No ROI. Horror stories are what shape people’s values more than anything else. Avoiding negative outcomes is what people avoid more than anything else. [It’s important to] tell those stories and perpetuate those [lessons] through the culture of your organization. These are the times we screwed up, and this is what we learned from it—do you want to screw up like that again because we learned not to do that.” - John Felushko (38:45)
  • “There’s an old description of a product manager, like, ‘Oh, they come across as the smartest person in the room.’ Well, how do you become that person? Expand your view, and expand the amount of information you consume as widely as possible. That’s so important to UX design and thinking about what went wrong. Why are some customers super happy and some customers not? What is the difference between those two groups of people? Is it culture? Is it time? Is it mental ability? Is it the size of the screen they’re looking at my product on? What variables can I define and rule out, and what data sources do I have to answer all those questions? It’s just the normal product manager thing—constant curiosity.” -John Felushko (48:04)
16 Apr 2024141 - How They’re Adopting a Producty Approach to Data Products at RBC with Duncan Milne00:43:49

In this week's episode of Experiencing Data, I'm joined by Duncan Milne, a Director, Data Investment & Product Management at the Royal Bank of Canada (RBC). Today, Duncan (who is also a member of the DPLC) gives a preview of his upcoming webinar on April 24, 2024, entitled “Is that Data Product Worth Building? Estimating Economic Value…Before You Build It!” Duncan shares his experience of implementing a product mindset within RBC's Chief Data Office, and he explains some of the challenges, successes, and insights gained along the way. He emphasizes the critical role of understanding user needs and evaluating the economic impact of data products—before they are built. Duncan was gracious enough to let us peek inside and see a transformation that is currently in progress, and I'm excited to check out his webinar this month!
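Duncan saves the details of his framework for the webinar, but to illustrate the kind of pre-build math he advocates, a back-of-the-envelope expected-value check for a candidate data product might look like this sketch (every number in it is invented):

    # All inputs are invented figures for one hypothetical data product.
    analysts_affected = 40        # users whose workflow the product touches
    hours_saved_per_week = 1.5    # time saved per user, per week
    loaded_hourly_cost = 90       # fully loaded cost of an analyst hour, USD
    adoption_probability = 0.6    # odds users actually change behavior
    build_and_run_cost = 250_000  # first-year build + operating estimate, USD

    annual_gross_value = (analysts_affected * hours_saved_per_week
                          * 48 * loaded_hourly_cost)  # assume 48 working weeks
    expected_value = adoption_probability * annual_gross_value - build_and_run_cost
    print(f"Expected first-year value: ${expected_value:,.0f}")  # $-94,480

A negative result like this one is exactly the point of doing the math early: it argues for rescoping or walking away before the build starts, not after.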

Highlights/ Skip to:

  • I introduce Duncan Milne from RBC (00:00)
  • Duncan outlines the Chief Data Office's function at RBC (01:01)
  • We discuss data products and how they are used to improve business processes (04:05)
  • The genesis behind RBC's move towards a product-centric approach in handling data, highlighting initial challenges and strategies for fostering a product mindset (07:26)
  • Duncan discusses developing a framework to guide the lifecycle of data products at RBC (09:29)
  • Duncan addresses initial resistance and adaptation strategies for engaging teams in a new product-centric methodology (12:04)
  • The scaling challenges of applying a product mindset across a large organization like RBC (22:02)
  • Insights into the framework for evaluating and prioritizing data product ideas based on their desirability, usability, feasibility, and viability. (26:30)
  • Measuring success and value in data product management (30:45)
  • Duncan explores process mapping challenges in banking (34:13)
  • Duncan shares creating specialized training for data product management at RBC (36:39)
  • Duncan offers advice and closing thoughts on data product management (41:38)
Quotes from Today’s Episode
  • “We think about data products as anything that solves a problem using data... it's helping someone do something they already do or want to do faster and better using data." - Duncan Milne (04:29)
  • “The transition to data product management involves overcoming initial resistance by demonstrating the tangible value of this approach." - Duncan Milne (08:38)
  • "You have to want to show up and do this kind of work [adopting a product mindset in data product management]…even if you do a product the right way, it doesn’t always work, right? The thing you make may not be desirable, it may not be as usable as it needs to be. It can be technically right and still fail. It’s not a guarantee, it’s just a better way of working.” - Brian T. O’Neill (15:03)
  • “[Product management]... it's like baking versus cooking. Baking is a science... cooking is much more flexible. It’s about... did we produce a benefit for users? Did we produce an economic benefit? ...It’s a multivariate problem... a lot of it is experimentation and figuring out what works." - Brian T. O'Neill (23:03)
  • "The easy thing to measure [in product management] is did you follow the process or not? That is not the point of product management at all. It's about delivering benefits to the stakeholders and to the customer." - Brian O'Neill (25:16)
  • “Data product is not something that is set in stone... You can leverage learnings from a more traditional product approach, but don’t be afraid to improvise." - Duncan Milne (41:38)
  • “Data products are fundamentally different from digital products, so even the traditional approach to product management in that space doesn’t necessarily work within the data products construct.” - Duncan Milne (41:55)
  • “There is no textbook for data product management; the field is still being developed…don’t be afraid to create your own answer if what exists out there doesn’t necessarily work within your context.”- Duncan Milne (42:17)
Links
13 Jun 2023119 - Skills vs. Roles: Data Product Management and Design with Nadiem von Heydebrand (Part 1)00:37:12

The conversation with my next guest was going so deep and so well…it became a two-part episode! Today I’m chatting with Nadiem von Heydebrand, CEO of Mindfuel. Nadiem’s career journey led him from data science to data product management, and in this first part, we will focus on the skills of data product management (DPM), including design. In part 2, we jump more into Nadiem’s take on the role of the DPM. Nadiem gives actionable insights into the realities of data product management, from the challenges of actually being able to talk to your end users, to focusing on the problems and unarticulated needs of your users rather than solutions. Nadiem and I also discuss how data product managers oversee a portfolio of initiatives, and why it’s important to view that portfolio as a series of investments. Nadiem also emphasizes the value of having designers on a data team, and why he hopes we see more designers in the industry.

Highlights/ Skip to:

  • Brian introduces Nadiem and his background going from data science to data product management (00:36)
  • Nadiem gives not only his definition of a data product, but also his related definitions of ‘data as product,’ ‘data as information,’ and ‘data as a model’ products (02:19)
  • Nadiem outlines the skill set and activities he finds most valuable in a data product manager (05:15)
  • How a data organization typically functions and the challenges a data team faces to prove their value (11:20)
  • Brian and Nadiem discuss the challenges and realities of being able to do discovery with the end users of data products (17:42)
  • Nadiem outlines how a portfolio of data initiatives has a certain investment attached to it and why it’s important to generate a good result from those investments (21:30)
  • Why Nadiem wants to see more designers in the data product space and the problems designers solve for data teams (25:37)
  • Nadiem shares a story about a time when he wished he had a designer to convert the expressed needs of the business into the true need of the customer (30:10)
  • The value of solving for the unarticulated needs of your product users, and Nadiem shares how focusing on problems rather than solutions helped him (32:32)
  • Nadiem shares how you can connect with him and find out more about his company, Mindfuel (36:07)
Quotes from Today’s Episode
  • “The product mindset already says it quite well. When you look into classical product management, you have something called the viability, the desirability, the feasibility—so these are three very classic dimensions of product management—and the fourth dimension, [which] we at Mindfuel define for ourselves and for applications, is the datability.” — Nadiem von Heydebrand (06:51)
  • “We can only prove our [data team’s] value if we unlock business opportunities in their [clients’] lines of businesses. So, our value contribution is indirect. And measuring indirect value contribution is very difficult in organizations.” — Nadiem von Heydebrand (11:57)
  • “Whenever we think about data and analytics, we put a lot of investment and efforts in the delivery piece. I saw a study once where it said 3% of investments go into discovery and 90% of investments go into delivery and the rest is operations and a little bit overhead and all around. So, we have to balance and we have to do proper discovery to understand what problem do we want to solve.” — Nadiem von Heydebrand (13:59)
  • “The best initiatives I delivered in my career, and also now within Mindfuel, are the ones where we try to build an end responsibility from the lines of businesses, among the product managers, to PO, the product owner, and then the delivery team.” – Nadiem von Heydebrand (17:00)
  • “As a consultant, I typically think in solutions. And when we founded Mindfuel, my co-founder forced me to avoid talking about the solution for an entire ten months. So, in whatever meeting we were sitting, I was not allowed to talk about the solution, but only about the problem space.”  – Nadiem von Heydebrand (34:12)
  • “In scaled organizations, data product managers, they typically run a portfolio of data products, and each single product can be seen a little bit like from an investment point of view, this is where we putting our money in, so that’s the reason why we also have to prioritize the right use cases or product initiatives because typically we have limited resources, either it is investment money, people, resources or our time.” – Nadiem von Heydebrand (24:02)
  • “Unfortunately, we don’t see enough designers in data organizations yet. So, I would love to have more design people around me in the data organizations, not only from a delivery perspective, having people building amazing dashboards, but also, like, truly helping me in this kind of discovery space.” – Nadiem von Heydebrand (26:28)
Links
06 Sep 2022099 - Don’t Boil the Ocean: How to Generate Business Value Early With Your Data Products with Jon Cooke, CTO of Dataception00:48:28

Today I’m sitting down with Jon Cooke, founder and CTO of Dataception, to learn his definition of a data product and his views on generating business value with your data products. In our conversation, Jon explains his philosophy on data products and where design and UX fit in. We also review his conceptual model for data products (which he calls the data product pyramid), and discuss how, together, these concepts allow teams to more quickly ship working solutions that actually produce value.

 

Highlights/ Skip to:

  • Jon’s definition of a data product (1:19) 
  • Brian explains how UX research and design planning can and should influence data architecture—so that last mile solutions are useful and usable (9:47)
  • The four characteristics of a data product in Jon’s model (16:16)
  • The idea of products having a lifecycle with direct business/customer interaction/feedback (17:15)
  • Understanding Jon’s data product pyramid (19:30)
  • The challenges when customers/users don’t know what they want from data product teams - and who should be doing the work to surface requirements (24:44)
  • Mitigating risk and the importance of having management buy-in when adopting a product-driven approach (33:23)
  • Does the data product pyramid account for UX? (35:02)
  • What needs to change in an org model that produces data products that aren’t delivering good last mile UXs (39:20)

 

Quotes from Today’s Episode
  • “A data product is something that specifically solves a business problem, a piece of analytics, data use case, a pipeline, datasets, dashboard, that type that solves a business use case, and has a customer, and [has] a product lifecycle to it.” - Jon (2:15)

 

  • “I’m a fan of any definition that includes some type of deployment and use by some human being. That’s the end of the cycle, because the idea of a product is a good that has been made, theoretically, for sale.” - Brian (5:50)

 

  • “We don’t build a lot of stuff around cloud anymore. We just don’t build it from scratch. It’s like, you know, we don’t generate our own electricity, we don’t mill our own flour. You know, the cloud—there’s a bunch of composable services, which I basically pull together to build my application, whatever it is. We need to apply that thinking all the way through the stack, fundamentally.” - Jon (13:06)

 

  • “It’s not a data science problem, it’s not a business problem, it’s not a technology problem, it’s not a data engineering problem, it’s an everyone problem. And I advocate small, multidisciplinary teams, which have a business value person in it, have an SME, have a data scientist, have a data architect, have a data engineer, as a small pod that goes in and answer those questions.” - Jon (26:28)

 

  • “The idea is that you’re actually building the data products, which are the back-end, but you’re actually then also doing UX alongside that, you know? You’re doing it in tandem.” - Jon (37:36)

 

  • “Feasibility is one of the legs of the stool. There has to be market need, and your market just may be the sales team, but there needs to be some promise of value there that this person is really responsible for at the end of the day, is this data product going to create value or not?” - Brian (42:35)

 

  • “The thing about data products is sometimes you don’t know how feasible it is until you actually look at the data…You’ve got to do what we call data archaeology. You got to go and find the data, you got to brush it off, and you’re looking at and go, ‘Is it complete?’” - Jon (44:02)
Links Referenced:
11 Jun 2024145 - Data Product Success: Adopting a Customer-Centric Approach With Malcolm Hawker, Head of Data Management at Profisee00:53:09

Wait, I’m talking to a head of data management at a tech company? Why!? Well, today I'm joined by Malcolm Hawker to get his perspective around data products and what he’s seeing out in the wild as Head of Data Management at Profisee. Why Malcolm? Malcolm was a head of product in prior roles, and for several years, I’ve enjoyed Malcolm’s musings on LinkedIn about the value of a product-oriented approach to ML and analytics. We had a chance to meet at CDOIQ in 2023 as well and he went on my “need to do an episode” list!

 

According to Malcolm, empathy is the secret to addressing key UX questions that ensure adoption and business value. He also emphasizes the need for data experts to develop business skills so that they're seen as equals by their customers. During our chat, Malcolm stresses the benefits of a product- and customer-centric approach to data products and what data professionals can learn by approaching problem-solving with a product orientation.

 

Highlights/ Skip to:
  • Malcolm’s definition of a data product (2:10)
  • Understanding your customers’ needs is the first step toward quantifying the benefits of your data product (6:34)
  • How product makers can gain access to users to build more successful products (11:36) 
  • Answering the UX question to get past the adoption stage and provide business value (16:03)
  • Data experts must develop business expertise if they want to be seen as equals by potential customers (20:07)
  • What people really mean by “data culture” (23:02)
  • Malcolm’s data product journey and his changing perspective (32:05)
  • Using empathy to provide a better UX in design and data (39:24)
  • Avoiding the death of data science by becoming more product-driven (46:23)
  • Where the majority of data professionals currently land on their view of product management for data products (48:15)
Quotes from Today’s Episode
  • “My definition of a data product is something that is built by a data and analytics team that solves a specific customer problem that the customer would otherwise be willing to pay for. That’s it.” - Malcolm Hawker (3:42)
  • “You need to observe how your customer uses data to make better decisions, optimize a business process, or to mitigate business risk. You need to know how your customers operate at a very, very intimate level, arguably, as well as they know how their business processes operate.” - Malcolm Hawker (7:36)
  • “So, be a problem solver. Be collaborative. Be somebody who is eager to help make your customers’ lives easier. You hear "no" when people think that you’re a burden. You start to hear more “yeses” when people think that you are actually invested in helping make their lives easier.” - Malcolm Hawker (12:42)
  • “We [data professionals] put data on a pedestal. We develop this mindset that the data matters more—as much or maybe even more than the business processes, and that is not true. We would not exist if it were not for the business. Hard stop.” - Malcolm Hawker (17:07)
  • “I hate to say it, I think a lot of this data stuff should kind of feel invisible in that way, too. It’s like this invisible ally that you’re not thinking about the dashboard; you just access the information as part of your natural workflow when you need insights on making a decision, or a status check that you’re on track with whatever your goal was. You’re not really going out of mode.” - Brian O’Neill (24:59)
  • “But you know, data people are basically librarians. We want to put things into classifications that are logical and work forwards and backwards, right? And in the product world, sometimes they just don’t, where you can have something be a product and be a material to a subsequent product.” - Malcolm Hawker (37:57)
  • “So, the broader point here is just more of a mindset shift. And you know, maybe these things aren’t necessarily a bad thing, but how do we become a little more product- and customer-driven so that we avoid situations where everybody thinks what we’re doing is a time waster?” - Malcolm Hawker (48:00)
Links
05 Sep 2023125 - Human-Centered XAI: Moving from Algorithms to Explainable ML UX with Microsoft Researcher Vera Liao00:44:42

Today I’m joined by Vera Liao, Principal Researcher at Microsoft. Vera is a part of the FATE (Fairness, Accountability, Transparency, and Ethics of AI) group, and her research centers around the ethics, explainability, and interpretability of AI products. She is particularly focused on how designers design for explainability. Throughout our conversation, we focus on the importance of taking a human-centered approach to rendering model explainability within a UI, and why incorporating users during the design process informs the data science work and leads to better outcomes. Vera also shares some research on why example-based explanations tend to outperform [model] feature-based explanations, and why traditional XAI methods like LIME and SHAP aren’t the solution to every explainability problem a user may have.
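To make the two explanation styles concrete, here is a small sketch contrasting a feature-based explanation (the style LIME and SHAP produce) with a simple example-based one built from nearest neighbors in the training data. The model, dataset, and library choices are invented for illustration and are not from Vera’s research:

    import numpy as np
    import shap  # pip install shap
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neighbors import NearestNeighbors

    X, y = load_breast_cancer(return_X_y=True)
    model = RandomForestClassifier(random_state=0).fit(X, y)
    instance = X[:1]  # the single prediction we want to explain

    # Feature-based explanation: which input features pushed this
    # prediction up or down.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(instance)
    print("SHAP attribution array shape:", np.shape(shap_values))

    # Example-based explanation: similar past cases with known outcomes,
    # the style Vera's research found tends to outperform feature-based ones.
    nn = NearestNeighbors(n_neighbors=3).fit(X)
    _, neighbor_idx = nn.kneighbors(instance)
    print("Similar training cases:", neighbor_idx[0],
          "with labels:", y[neighbor_idx[0]])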

 

Highlights/ Skip to:

  • I introduce Vera, who is Principal Researcher at Microsoft and whose research mainly focuses on the ethics, explainability, and interpretability of AI (00:35)
  • Vera expands on her view that explainability should be at the core of ML applications (02:36)
  • An example of the non-human approach to explainability that Vera is advocating against (05:35)
  • Vera shares where practitioners can start the process of responsible AI (09:32)
  • Why Vera advocates for doing qualitative research in tandem with model work in order to improve outcomes (13:51)
  • I summarize the slides I saw in Vera’s deck on Human-Centered XAI and Vera expands on my understanding (16:06)
  • Vera’s success criteria for explainability (19:45)
  • The various applications of AI explainability that Vera has seen evolve over the years (21:52)
  • Why Vera is a proponent of example-based explanations over model feature-based ones (26:15)
  • Strategies Vera recommends for getting feedback from users to determine what the right explainability experience might be (32:07)
  • The research trends Vera would most like to see technical practitioners apply to their work (36:47)
  • Summary of the four-step process Vera outlines for Question-Driven XAI design (39:14)

 

Links
04 May 2021064 - How AI Shapes the Products of Startups in MIT’s “Tough Tech” Venture Fund, The Engine feat. General Partner, Reed Sturtevant00:28:59

Reed Sturtevant sees a lot of untapped potential in “tough tech.”

As a General Partner at The Engine, a venture capital firm launched by MIT, Reed and his colleagues invest in companies with breakthrough technology that, if successful, could positively transform the world.

 

It’s been about 15 years since I last caught up with Reed—who was CTO at a startup we worked at together—so I’m so excited to welcome him on this episode of Experiencing Data! Reed and I talked about AI and how some of the portfolio companies in his fund are using data to produce better products, solutions, and inventions to tackle some of the world’s toughest challenges.

 

In our chat, we covered:

  • How Reed's venture capital firm, The Engine, is investing in technology driven businesses focused on making positive social impacts. (0:28)
  • The challenges that technical PhDs and postdocs face when transitioning from academia to entrepreneurship. (2:22)
  • Focusing on a greater mission: The importance of self-examining whether an invention would be a good business. (5:16)
  • How one technology business invested in by The Engine, The Routing Company, is leveraging AI and data to optimize public transportation and bridge service gaps. (9:05)
  • Understanding and solving a problem: Using ‘design exercises’ to find successful market fits for existing technological solutions. (16:53)
  • Solutions first, problems second: Why asking the right questions is key to mapping a technological solution back to a problem in the market. (19:31)
  • Understanding and articulating a product’s value to potential buyers. (22:54)
  • How the go-to-market strategies of software companies have changed over the last few decades. (26:16)
Resources and Links:
Quotes from Today’s Episode

There have been a couple of times while working at The Engine when I’ve taken it as a sign of maturity when a team self-examines whether their invention is actually the right way to build a business. - Reed (5:59)

 

For some of the data scientists I know, particularly with AI, executive teams can mandate AI without really understanding the problem they want to solve. It actually pushes the problem discovery onto the solution people — but they’re not always the ones trained to go find the problems. - Brian (19:42)

 

You can keep hitting people over the head with a product, or you can go figure out what people care about and determine how you can slide your solution into something they care about. ... You don’t know that until you go out and talk to them, listen, and get into their world. And I think that’s still something that’s not happening a lot with data teams. - Brian (24:45)

 

I think there really is a maturity among even the early stage teams now, where they can have a shelf full of techniques that they can just pick and choose from in terms of how to build a product, how to put it in front of people, and how to have the [user] experience be a gentle on-ramp. - Reed, on startups (27:29)

23 Jul 2024148 - UI/UX Design Considerations for LLMs in Enterprise Applications (Part 2)00:26:36

Ready for more ideas about UX for AI and LLM applications in enterprise environments? In part 2 of my topic on UX considerations for LLMs, I explore how an LLM might be used for a fictitious use case at an insurance company—specifically, to help internal tools teams get rapid access to primary qualitative user research. (Yes, it’s a little “meta”, and I’m also trying to nudge you with this hypothetical example—no secret!) ;-) My goal with these episodes is to share questions you might want to ask yourself such that any use of an LLM is actually contributing to a positive UX outcome. Join me as I cover the implications for design, the importance of foundational data quality, the balance between creative inspiration and factual accuracy, and the never-ending discussion of how we might handle hallucinations and errors posing as “facts”—all with a UX angle. At the end, I also share a personal story where I used an LLM to help me do some shopping for my favorite product: TRIP INSURANCE! (NOT!)
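For anyone who wants to picture the plumbing behind a tool like this, the use case is typically built as retrieval-augmented generation: embed the research repository, retrieve the snippets most relevant to a question, and constrain the model to answer only from them. Here is a bare-bones sketch assuming the OpenAI Python client; every snippet, participant ID, model choice, and prompt is invented for illustration:

    import numpy as np
    from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

    client = OpenAI()

    # Invented repository: snippets from past user-research interviews.
    snippets = [
        "P4 (claims adjuster): the policy lookup takes six clicks too many.",
        "P9 (underwriter): I export everything to Excel because the filters reset.",
        "P2 (agent): I never trust the risk score without seeing its inputs.",
    ]

    def embed(texts):
        resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
        return np.array([d.embedding for d in resp.data])

    snippet_vecs = embed(snippets)
    question = "What frustrates internal users about the policy tools?"
    q_vec = embed([question])[0]

    # Retrieve the research snippets most relevant to the question.
    scores = snippet_vecs @ q_vec / (
        np.linalg.norm(snippet_vecs, axis=1) * np.linalg.norm(q_vec))
    top = [snippets[i] for i in np.argsort(scores)[::-1][:2]]

    # Constrain the model to the retrieved research so the UI can show
    # sources instead of unsupported claims.
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer only from the excerpts, cite "
             "participant IDs, and say so if the excerpts don't cover it."},
            {"role": "user", "content": f"Excerpts: {top}\nQuestion: {question}"},
        ],
    )
    print(reply.choices[0].message.content)

The system prompt is where the episode’s UX concerns show up in code: forcing citations and an explicit “not covered” path is one way to keep hallucinations from posing as research findings.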

Highlights/ Skip to:
  • (1:05) I introduce a hypothetical internal LLM tool and what the goal of the tool is for the team who would use it
  • (5:31) Improving access to primary research findings for better UX 
  • (10:19) What “quality data” means in a UX context
  • (12:18) When LLM accuracy maybe doesn’t matter as much
  • (14:03) How AI and LLMs are opening the door for fresh visioning work
  • (15:38) Brian’s overall take on LLMs inside enterprise software as of right now
  • (18:56) Final thoughts on UX design for LLMs, particularly in the enterprise
  • (20:25) My inspiration for these 2 episodes—and how I had to use ChatGPT to help me complete a purchase on a website that could have integrated this capability right into their website

Quotes from Today’s Episode
  • “If we accept that the goal of most product and user experience research is to accelerate the production of quality services, products, and experiences, the question is whether or not using an LLM for these types of questions is moving the needle in that direction at all. And secondly, are the potential downsides like hallucinations and occasional fabricated findings, is that all worth it? So, this is a design for AI problem.” - Brian T. O’Neill (8:09)
  • “What’s in our data? Can the right people change it when the LLM is wrong? The data product managers and AI leaders reading this or listening know that the not-so-secret path to the best AI is in the foundational data that the models are trained on. But what does the word *quality* mean from a product standpoint and a risk reduction one, as seen from an end-users’ perspective? Somebody who’s trying to get work done? This is a different type of quality measurement.” - Brian T. O’Neill (10:40)
  • “When we think about fact retrieval use cases in particular, how easily can product teams—internal or otherwise—and end-users understand the confidence of responses? When responses are wrong, how easily, if at all, can users and product teams update the model’s responses? Errors in large language models may be a significant design consideration when we design probabilistic solutions, and we no longer control what exactly our products and software are going to show to users. If bad UX can include leading people down the wrong path unknowingly, then AI is kind of like the team on the other side of the tug of war that we’re playing.” - Brian T. O’Neill (11:22)
  • “As somebody who writes a lot for my consulting business, and composes music in another, one of the hardest parts for creators can be the zero-to-one problem of getting started—the blank page—and this is a place where I think LLMs have great potential. But it also means we need to do the proper research to understand our audience, and when or where they’re doing truly generative or creative work—such that we can take a generative UX to the next level that goes beyond delivering banal and obviously derivative content.” - Brian T. O’Neill (13:31)
  • “One thing I actually like about the hype, investment, and excitement around GenAI and LLMs in the enterprise is that there is an opportunity for organizations here to do some fresh visioning work. And this is a place that designers and user experience professionals can help data teams as we bring design into the AI space.” - Brian T. O’Neill (14:04)
  • “If there was ever a time to do some new visioning work, I think now is one of those times. However, we need highly skilled design leaders to help facilitate this in order for this to be effective. Part of that skill is knowing who to include in exercises like this, and in my perspective, one of those people, for sure, should be somebody who understands the data science side as well, not just the engineering perspective. And as I posited in the seminar that I teach, the AI and analytical data product teams probably need a fourth member. It’s a quartet and not a trio. And that quartet includes a data expert, as well as that engineering lead.” - Brian T. O’Neill (14:38)

 

 

Links
09 Jan 2024134 - What Sanjeev Mohan Learned Co-Authoring “Data Products for Dummies”00:46:52

In this episode, I’m chatting with former Gartner analyst Sanjeev Mohan, who is the co-author of Data Products for Dummies. Throughout our conversation, Sanjeev shares his expertise on the evolution of data products, and what he’s seen as a result of implementing practices that prioritize solving for use cases and business value. Sanjeev also shares a new approach to structuring organizations to best implement ownership and accountability of data product outcomes. Sanjeev and I also explore the common challenges of product adoption and who is responsible for user experience. I purposefully had Sanjeev on the show because I think we have pretty different perspectives from which we see the data product space.

Highlights/ Skip to:

  • I introduce Sanjeev Mohan, co-author of Data Products for Dummies (00:39)
  • Sanjeev expands more on the concept of writing a “for Dummies” book (00:53)
  • Sanjeev shares his definition of a data product, including both a technical and a business definition (01:59)
  • Why Sanjeev believes organizational changes and accountability are the keys to preventing the acceleration of shipping data products with little to no tangible value (05:45)
  • How Sanjeev recommends getting buy-in for data product ownership from other departments in an organization (11:05)
  • Sanjeev and I explore adoption challenges and the topic of user experience (13:23)
  • Sanjeev explains what role is responsible for user experience and design (19:03)
  • Who should be responsible for defining the metrics that determine business value (28:58)
  • Sanjeev shares some case studies of companies who have adopted this approach to data products and their outcomes (30:29)
  • Where companies are finding data product managers currently (34:19)
  • Sanjeev expands on his perspective regarding the importance of prioritizing business value and use cases (40:52)
  • Where listeners can get Data Products for Dummies, and learn more about Sanjeev’s work (44:33)
Quotes from Today’s Episode
  • “You may slap a label of data product on an existing artifact; it does not make it a data product because there’s no sense of accountability. In a data product, because they are following product management best practices, there must be a data product owner or a data product manager. There’s a single person [responsible for the result].” — Sanjeev Mohan (09:31)
  • “I haven’t even mentioned the word data mesh because data mesh and data products, they don’t always have to go hand-in-hand. I can build data products, but I don’t need to go into the—do all of data mesh principles.” – Sanjeev Mohan (26:45)
  • “We need to have the right organization, we need to have a set of processes, and then we need a simplified technology which is standardized across different teams. So, this way, we have the benefit of reusing the same technology. Maybe it is Snowflake for storage, DBT for modeling, and so on. And the idea is that different teams should have the ability to bring their own analytical engine.” – Sanjeev Mohan (27:58)
  • “Generative AI, right now as we are recording, is still in a prototyping phase. Maybe in 2024, it’ll go heavy-duty production. We are not in prototyping phase for data products for a lot of companies. They’ve already been experimenting for a year or two, and now they’re actually using them in production. So, we’ve crossed that tipping point for data products.” – Sanjeev Mohan (33:15)
  • “Low adoption is a problem that’s not just limited to data products. How long have we had data catalogs, but they have low adoption. So, it’s a common problem.” – Sanjeev Mohan (39:10)
  • “That emphasis on technology first is a wrong approach. I tell people that I’m sorry to burst your bubble, but there are no technology projects, there are only business projects. Technology is an enabler. You don’t do technology for the sake of technology; you have to serve a business cause, so let’s start with that and keep that front and center.” – Sanjeev Mohan (43:03)
Links
30 Nov 2021079 - How Sisu’s CPO, Berit Hoffmann, Is Approaching the Design of Their Analytics Product…and the UX Mistakes She Won’t Make Again00:36:02

Berit Hoffmann, Chief Product Officer at Sisu, tackles design from a customer-centric perspective with a focus on finding problems at their source and enabling decision making. However, she had to learn some lessons the hard way along the road, and in this episode, we dig into those experiences and what she’s now doing differently in her current role as a CPO.

In particular, Berit reflects on her “ivory tower design” experience at a past startup called Bebop. During that time, she quickly realized the importance of engaging with customer needs and building intuitive, simple solutions for complex problems. Berit also discusses the Double Diamond Process, how it shapes her own decision-making, and the various ways she carries out her work at Sisu.

 

In this episode, we also cover:

  • How Berit’s “ivory tower design experience” at Bebop taught her the importance of dedicating time to focus on the customer. (01:31)
  • What Berit looked for as she researched Sisu prior to joining - and how she and Peter Bailis, Founder and CEO, share the same philosophy on what a product’s user experience should look like. (03:57)
  • Berit discusses the Double Diamond Process and the life cycle of designing a project - and shares her take on designing for decision making. (10:17)
  • Sisu’s shift from answering the why to the what - and how they approach user testing using product as a metric layer. (19:10)
  • Berit explores the tension that can arise when designing a decision support tool. (31:03)
Quotes from Today’s Episode
  • “I kind of learned the hard way, the importance of spending that time with customers upfront and really digging into understanding what problems are most challenging for them. Those are the problems to solve, not the ones that you as a product manager or as a designer think are most important. It is a lesson I carry forward with me in terms of how I approach anything I'm going to work on now. The sooner I can get it in front of users, the sooner I can get feedback and really validate or invalidate my assumptions, the better because they're probably going to tell me why I'm wrong.”- Berit Hoffmann (03:15)

 

  • “As a designer and product thinker, the problem finding is almost more important than the solutioning because the solution is easy when you really understand the need. It's not hard to come up with good solutions when the need is so clear, which you can only get through conversation, inquiry, shadowing, and similar research and design methods.” - Brian T. O’Neill (@rhythmspice) (10:54)

 

  • “Decision-making is a human process. There's no world in which you're going to spit out an answer and say, ‘just go do it.’ Software is always going to be missing the rich context and expertise that humans have about their business and the context in which they're making the decision. So, what that says to me is inherently, decision-making is also going to be an iterative process. [...] What I think technology can do is it can automate and accelerate a lot of the manual repetitive steps in the analysis that are taking up a bunch of time today. Especially as data is getting exponentially more complex and multi-dimensional.”- Berit Hoffmann  (17:44)

 

  • “When we talk to people about solving problems, 9 out of 10 people say they would add something to whatever it is that you're making to make it better. So often, when designers think about modernism, it is very much about ‘what can I take away that will help it make it better?’ And, I think this gets lost. The tendency with data, when you think about how much we're collecting and the scale of it, is that adding it is always going to make it better and it doesn't make it better all the time. It can slow things down and cause noise. It can make people ask even more questions. When in reality, the goal is to make a decision.”- Brian T. O’Neill (@rhythmspice) (30:11)

 

  • “I’m trying to resist the urge to get industry-specific or metric specific in any of the kind of baseline functionality in the product. And instead, say that we can experiment in a lightweight way in terms of outside of the product, health content, guidance on best practices, etc. That is going to be a constant tension because the types of decisions that you enact and the types of questions you're digging into are really different depending on whether you're a massive hotel chain compared to a quick-service restaurant compared to a B2B SAAS company. The personas and the questions are so different. So that's a tension that I think is really interesting when you think about the decision-making workflow and who those stakeholders are.”- Berit Hoffmann (32:05)
Links Referenced
15 Nov 2022104 - Surfacing the Unarticulated Needs of Users and Stakeholders through Effective Listening00:44:12

Today I’m chatting with Indi Young, independent qualitative data scientist and author of Time to Listen. Indi explains how it is possible to gather and analyze qualitative data in a way that is meaningful to the desired future state of your users, and that learning how to listen and not just interview users is much like learning to ride a bicycle. Listen (!) to find out why pushing back is a necessary part of the design research process, how to build an internal sensor that allows you to truly uncover the nuggets of information that are critical to your projects, and the importance of understanding thought processes to prevent harmful outcomes.

 

Highlights/ Skip to:

  • Indi introduces her perspective on analyzing qualitative data sets (00:51)
  • Indi’s motivation for working in design research and the importance of being able to capture and understand patterns to prevent harmful outcomes (05:09)
  • The process Indi goes through for problem framing and understanding a user’s desired future state (11:11)
  • Indi explains how to listen effectively in order to understand the thinking style of potential end users (15:42)
  • Why Indi feels pushing back on problems within projects is a vital part of taking responsibility and her recommendations for doing so effectively (21:45)
  • The importance Indi sees in building up a sensor in order to be able to detect nuggets clients give you for their upcoming projects (28:25)
  • The difference in techniques Indi observes between an interview, a listening session, and a survey (33:13)
  • Indi describes her published books and reveals which one she’d recommend listeners start with (37:34)
Quotes from Today’s Episode
  • “A lot of qualitative data is not trusted, mainly because the people who are doing the not trusting have encountered bad qualitative data.” — Indi Young (03:23)
  • “When you’re learning to ride a bike, when you’re learning to decide what knowledge is needed, you’re probably going to burn through a bunch of money, making knowledge that never gets used. So, that’s when you start to learn, ‘I need to frame this better, and to frame it, I can’t do it by myself.’” – Indi Young (11:57)
  • “What you want to do is get beyond the exterior and get to the interior, which is where somebody tells you what actually went through their mind when they did that thing in the past, not what’s going through their mind right now. And that’s a very important distinction.” – Indi Young (20:28)
  • “Re: dealing with stakeholders: You’re not doing your job if you don’t push back. You built up a lot of experience, you got hired, they hired you and your thinking and your experience, and if what went through your mind is, like, ‘This is wrong,’ but you don’t act on it, then they should not pay you a salary.” – Indi Young (22:45)
  • “I’ve seen a lot of people leave their perfectly promising career because it was too hard to get to the point of accepting that you have to network, that I’m not going to be that one-in-a-million person who’s the brilliant person with a brilliant idea and get my just rewards that way.” – Indi Young (25:13)
  • “What’s really interesting about a listening session is that it doesn’t—aside from building this sensor and learning what the techniques are for helping a person get to their interior cognition rather than that exterior … to get past that into the inner thinking, the emotional reactions, and the guiding principles, aside from the sensor and those techniques, there’s not much to it.” – Indi Young (32:45)
  • “And once you start building that [sensor], and this idea of just having one generative question about the purpose—because the whole thing is framed by the purpose—there you go. Get started. You have to practice. So, it’s like riding a bike. Go for it. You won’t have those sensors at first, but you’ll start to learn how to build them.” – Indi Young (36:41)
Links Referenced:
07 Sep 2021073 - Addressing the Functional and Emotional Needs of Users When Designing Data Products with Param Venkataraman00:37:41
Episode Description

Simply put, data products help users make better decisions and solve problems with information. But how effective can data products be if designers don’t take the time to explore the complete needs of users?

To Param Venkataraman, Chief Design Officer at Fractal Analytics, having an understanding of the “human dimension” of a problem is crucial to creating data solutions that create impact.

On this episode of Experiencing Data, Param and I talk more about his concept of ‘attractive non-conscious design,’ the core skills of a professional designer, and why Fractal has a C-suite design officer and is making large investments in UX.

 

In our chat, we covered:

  • Param's role as Chief Design Officer at Fractal Analytics, and the company's sharp focus on the 'human dimension' of enterprise data products. (2:04)
  • 'Attractive non-conscious design': Creating easy-to-use, 'delightful' data products that help end-users make better decisions by focusing on their needs. (5:32)
  • The importance of understanding the 'emotional need' of users when designing enterprise data products. (9:07)
  • Why designers as well as data science and analytics teams should focus more on the emotional and human element when building data products. (16:15)
  • 'The next version of design': Why and how Param believes the classic design thinking model must adapt to the 'post-data science world.' (21:39)
  • The core competencies of a professional designer and how it relates to data products. (25:59)
  • Why non-designers should learn the principles of good design — and how Fractal’s internal Phi Design System helps frame problems from the perspective of a data product's end-user, leading to better solutions. (27:51)
  • Why Param believes the coming together of design and data still needs time to mature. (33:40)
Quotes from Today’s Episode

“When you look at analytics and the AI space … there is so much that is about how do you use ... machine learning … [or] any other analytics technology or solutions — and how do you make better effective decisions? That’s at the heart of it, which is how do we make better decisions?” - Param Venkataraman (@onwardparam) (6:23) 

 

“[When it comes to business software,] most of it should be invisible; you shouldn’t really notice it. And if you’re starting to notice it, you’re probably drawing attention to the wrong thing because you’re taking people out of flow.” - Brian O’Neill (@rhythmspice) (8:57)

 

“Design is kind of messy … there’s sort of a process ... but it’s not always linear, and we don’t always start at step zero. … You might come into something that’s halfway done and the first thing we do is run a usability study on a competitor’s thing, or on what we have now, and then we go back to step two, and then we go to five. It’s not serial, and it’s kind of messy, and that’s normal.” - Brian O’Neill (@rhythmspice) (16:18)

 

“Just like design is iterative, data science also is very iterative. There’s the idea of hypothesis, and there’s an idea of building and experimenting, and then you sort of learn and your algorithm learns, and then you get better and better at it.” - Param Venkataraman (@onwardparam) (18:05)

 

“The world of data science is not used to thinking in terms of emotion, experience, and the so-called softer aspects of things, which in my opinion, is not actually the softer; it’s actually the hardest part. It’s harder to dimensionalize emotion, experience, and behavior, which is … extremely complex, extremely layered, [and] extremely unpredictable. … I think the more we can bring those two worlds together, the world of evidence, the world of data, the world of quantitative information with the qualitative, emotional, and experiential, I think that’s where the magic is.” - Param Venkataraman (@onwardparam) (21:02)

 

“I think the coming together of design and data is... a new thing. It’s unprecedented. It’s a bit like how the internet was a new thing back in the mid ’90s. We were all astounded by it, we didn’t know what to do with it, and everybody was just fascinated with it. And we just knew that it’s going to change the world in some way. … Design and data will take some time to mature, and what’s more important is to go into it with an open mind and experiment. And I’m saying this for both designers as well as data scientists, to try and see how the right model might evolve as we experiment and learn.” - Param Venkataraman (@onwardparam) (33:58)

Links Referenced
03 Oct 2023127 - On the Road to Adopting a “Producty” Approach to Data Products at the UK’s Care Quality Commission with Jonathan Cairns-Terry00:36:55

Today I’m joined by Jonathan Cairns-Terry, who is the Head of Insight Products at the Care Quality Commission, the regulator of health and social care in England. Jonathan recently joined their data team and is working to transform its approach to be more product-led and user-centric. Throughout our conversation, Jonathan shares valuable insights into what the first year of that type of shift looks like, why it’s important to focus on outcomes, and how he measures progress. Jonathan and I explore the signals that told Jonathan it was time for his team to invest in a designer, the benefits he’s gotten from UX research on his team, and the recent successes that Jonathan’s team is seeing as a result of implementing this approach. Jonathan is also a Founding Member of the Data Product Leadership Community, and we discuss his upcoming webinar for the group on Oct 12, 2023.

 

Highlights/ Skip to:

  • I introduce Jonathan, who is the Head of Insight Products at the Care Quality Commission in the UK (00:37)
  • How Jonathan went from being a “maths person” to being a “product person” (01:02)
  • Who uses the data products that Jonathan makes at the Care Quality Commission (02:44)
  • Jonathan describes the recent transition towards a product focus (03:45)
  • How Jonathan expresses and measures the benefit and purpose of a product-led orientation, and how the team has embraced the transformation (07:08)
  • The nuance between evaluating outcomes and measuring outputs in a product-led approach, and how UX research has impacted Jonathan’s team (12:53)
  • What signals Jonathan received that told him it’s time to hire a designer (17:05)
  • How Jonathan’s team approaches shadowing users (21:20)
  • Some of the recent successes of the product-led approach Jonathan is implementing on his team (25:28)
  • What Jonathan would change if he had to start the process of moving to outcomes over outputs with his team all over again (30:04)
  • Get the full scoop on the topics discussed in this episode on October 12, 2023 when Jonathan presents his deep-dive webinar to the Data Product Leadership Community. Available to members only. Apply today.

Links

30 Apr 2024142 - Live Webinar Recording: My UI/UX Design Audit of a New Podcast Analytics Service w/ Chris Hill (CEO, Humblepod)00:50:56

Welcome to a special edition of Experiencing Data. This episode is the audio capture from a live Crowdcast video webinar I gave on April 26th, 2024, where I conducted a mini UI/UX design audit of a new podcast analytics service that Chris Hill, CEO of Humblepod, is working on to help podcast hosts grow their shows. Humblepod is also the team behind the scenes of Experiencing Data, and Chris had asked me to take a look at his new “Listener Lifecycle” tool to see if we could find ways to improve the UX and visualizations in the tool, how we might productize this MVP in the future, and how improving the tool’s design might help Chris show his prospective podcast clients how their listener data could help them grow their listenership and “true fans.” On a personal note, it was fun to talk to Chris on the show given that we speak every week: Humblepod has been my trusted resource for audio mixing, transcription, and show-note summarizing for probably over 100 of the most recent episodes of Experiencing Data. It was also fun to do a “live recording” with an audience—and we did answer questions in the full video version. (If you missed the invite, join my Insights mailing list to get notified of future free webinars).

 

To watch the full audio and video recording on Crowdcast, free, head over to: https://www.crowdcast.io/c/podcast-analytics-ui-ux-design

Highlights/ Skip to:
  • Chris talks about using data to improve podcasts and his approach to podcast numbers (03:06)
  • Chris introduces the Listener Lifecycle model which informed the dashboard design (08:17)
  • Chris and I discuss the importance of labeling and terminology in analytics UIs (11:00)
  • We discuss designing for practical use of analytics dashboards to provide actionable insights (17:05)
  • We discuss the challenges podcast hosts face in understanding and utilizing data effectively and how design might help (21:44)
  • I discuss how my CED UX framework for advanced analytics applications helps to facilitate actionable insights (24:37)
  • I highlight the importance of presenting data effectively and in a way that centers to user needs (28:50)
  • I discuss challenges users may have with podcast rankings and the reliability of data sources (34:24)
  • Chris and I discuss tailoring data reports to meet the specific needs of clients (37:14)
Quotes from Today’s Episode
  • “The irony for me as someone who has a podcast about machine learning and analytics and design is that I basically never look at my analytics.” - Brian O’Neill (01:14)
  • “The problem that I have found in podcasting is that the number that everybody uses to gauge whether a podcast is good or not is the download number…But there’s a lot of other factors in a podcast that can tell you how successful it’s going to be…where you can pull levers to…grow your show, or engage more with an audience.” - Chris Hill (03:20)
  • “I have a framework for user experience design for analytics called CED, which stands for Conclusions, Evidence, Data… The basic idea is really simple: lead your analytic service with conclusions.”- Brian O’Neill (24:37)
  • “Where the eyes glaze over is when tools are mostly about evidence generators, and we just give everybody the evidence, but there’s no actual analysis about how [this is] helping me improve my life or my business. It’s just evidence. I need someone to put that together.” - Brian O’Neill (25:23)
  • “Sometimes the data doesn’t provide enough of a conclusion about what to do…This is where your opinion starts to matter” - Brian O’Neill (26:07)
  • “It sounds like a benefit, but drilling down for most people into analytics stuff is usually a tax unless you’re an analyst.” - Brian O’Neill (27:39)
  • “Where’s the source of this data, and who decided what these numbers are? Because so much of this stuff…is not shared. As someone who’s in this space, it’s not even that it’s confusing. It’s more like, you got to distill this down for me.” - Brian O’Neill (34:57)
  • “Your clients are probably going to glaze over at this level of data because it’s not helping them make any decision about what to change.”- Brian O’Neill (37:53)
Links
01 Jun 2021066 - How Alison Magyari Used Design Thinking to Transform Eaton’s Business Analytics Approach to Creating Data Products00:30:28

Earlier this year, the always informative Women in Analytics Conference took place online. I didn’t go — but a blog post about one of the conference’s presentations on the International Institute of Analytics’ website caught my attention.

 

The post highlighted key points from a talk called Design Thinking in Analytics that was given at the conference by Alison Magyari, an IT Manager at Eaton. In her presentation, Alison explains the four design steps she utilizes when starting a new project — as well as what “design thinking” means to her.

 

Human-centered design is one of the main themes of Experiencing Data, so given Alison’s talk about tapping into the emotional state of customers to create better designed data products, I knew she would be a great guest. In this episode, Alison and I have a great discussion about building a “design thinking mindset” — as well as the importance of keeping the design process flexible.

 

In our chat, we covered: 

  • How Alison employs design thinking in her role at Eaton to better understand the 'voice of the customer.' (0:28)
  • Same frustrations, no excitement, little use: The factors that led to Alison's pursuit of a design thinking mindset when building data products at Eaton. (3:35)
  • Alleviating the 'pain points' with design thinking: The importance of understanding how a data tool makes users feel. (10:24)
  • How Eaton's business analysts (and end users) take ownership of the design process — and the challenges Alison faced in building a team of business analysts committed to design thinking. (15:51)
  • 'It's not one size fits all': The benefits of keeping the design process flexible — and why curiosity and empathy are traits of successful designers. (21:06)
  • 'Pay me now or pay me later': How Alison dealt with pushback to spending more time and resources on design — and how she dealt with skepticism from business users. (24:09)

 

Resources and Links:

 

Quotes from Today’s Episode

“In IT, it’s really interesting how sometimes we get caught up in just looking at the technology for what it is, and we forget that the technology is there to serve our business partners.” - Alison (2:00)

 

“You can give people exactly what they asked for, but if you’re designing solutions and data-driven products with someone, and they’re really for somebody else, you actually have to dig in to figure out the unarticulated needs. And they may not know how to invite you in to ask for that. They may not even know how they’re going to make a decision with data about something. So, you could say, “Well, you’re not prepared to talk to us yet.” Or, you can be part of helping them work it out: how will you make a decision with this information? Let us be part of that problem-finding exercise with you, not just the solution part. Because you can fail if you just give people what they asked for, so it’s best to be part of the problem finding, not just the solving.” - Brian (8:42)

 

“During our design process, we noted down what the sentiment of our users was while they were using our data product. …  Our users so appreciated when we would mirror back to them our observations about what they were feeling, and we were right about it. I mean, they were much more open to talking to us. They were much more open and they shared exactly what they were feeling.” - Alison (12:51)

 

“In our case, we did have the business analyst team really own the design process. Towards the end, we were the champions for it, but then our business users really took ownership, which I was proud of. They realized that if they didn’t embrace this, that they were going to have to deal with the same pain points for years to come. They didn’t want to deal with that, so they were really good partners in taking ownership at the end of the day.” - Alison (16:56)

 

“The way you learn how to do design is by doing it. … The second thing is that you don’t have to do all of it to get some value out of it. You could just do prototyping, you could do usability evaluation, you could do ‘what if’ analyses. You can do a little of one thing and probably get some value out of that fairly early, and it’s fairly safe. And then over time, you can learn other techniques. Eventually, you will have a library of techniques that you can apply. It’s a mindset; it’s really about changing the mind. It’s heads, not hands, as I sometimes say: it’s not really about hands. It’s about how we think and approach problem-solving.” - Brian (20:16)

“I think everybody can do design, but I think the ones that have been incredibly successful at it have a natural curiosity. They don’t just stop with the first answer that they get. They want to know, “If I were doing this job, would I be satisfied with compiling a 50-column spreadsheet every single day in my life?” Probably not. It’s curiosity and empathy — if you have those traits naturally, then design is just kind of a better fit.” - Alison (23:15)

28 Nov 2023131 - 15 Ways to Increase User Adoption of Data Products (Without Handcuffs, Threats and Mandates) with Brian T. O’Neill00:36:57

This week I’m covering Part 1 of the 15 Ways to Increase User Adoption of Data Products, which is based on an article I wrote for subscribers of my mailing list. Throughout this episode, I describe why focusing on empathy, outcomes, and user experience leads to not only better data products, but also better business outcomes. The focus of this episode is to show you that it’s completely possible to take a human-centered approach to data product development without mandating behavioral changes, and to show how this approach benefits not just end users, but also the businesses and employees creating these data products. 

 

Highlights/ Skip to:

  • Design behavior change into the data product. (05:34)
  • Establish a weekly habit of exposing technical and non-technical members of the data team directly to end users of solutions - no gatekeepers allowed. (08:12)
  • Change funding models to fund problems, not specific solutions, so that your data product teams are invested in solving real problems. (13:30)
  • Hold teams accountable for writing down and agreeing to the intended benefits and outcomes for both users and business stakeholders. Reject projects that have vague outcomes defined. (16:49)
  • Approach the creation of data products as “user experiences” instead of a “thing” that is being built that has different quality attributes. (20:16)
  • If the team is tasked with being “innovative,” leaders need to understand the innoficiency problem, shortened iterations, and the importance of generating a volume of ideas (bad and good) before committing to a final direction. (23:08)
  • Co-design solutions with [not for!] end users in low, throw-away fidelity, refining success criteria for usability and utility as the solution evolves. Embrace the idea that research/design/build/test is not a linear process. (28:13)
  • Test (validate) solutions with users early, before committing to releasing them, but with a pre-commitment to react to the insights you get back from the test. (31:50)

Links:

14 May 2024143 - The (5) Top Reasons AI/ML and Analytics SAAS Product Leaders Come to Me For UI/UX Design Help00:50:01

Welcome back! In today's solo episode, I share the top five struggles that enterprise SAAS leaders have in the analytics/insight/decision-support space that most frequently lead them to think they have a UI/UX design problem that has to be addressed. A lot of today's episode will talk about "slow creep": unaddressed design problems that gradually build up over time and begin to impact both UX and your revenue negatively. I will also share 20 UI and UX design problems I often see (even if clients do not!) that, when left unaddressed, may create sales friction, adoption problems, churn, or unhappy end users. If you work at a software company or are directly monetizing an ML or analytical data product, this episode is for you!

Highlights/ Skip to:

  • I discuss how specific UI/UX design problems can significantly impact business performance (02:51)
  • I discuss five common reasons why enterprise software leaders typically reach out for help (04:39)
  • The 20 common symptoms I've observed in client engagements that indicate the need for professional UI/UX intervention or training (13:22)
  • The dangers of adding too many features or customization and how it can overwhelm users (16:00)
  • The issues of integrating AI into user interfaces and UXs without proper design thinking (30:08)
  • I encourage listeners to apply the insights shared to improve their data products (48:02)
Quotes from Today’s Episode
  • “One of the problems with bad design is that some of it we can see and some of it we can't — unless you know what you're looking for." - Brian O’Neill (02:23)
  • “Design is usually not top of mind for an enterprise software product, especially one in the machine learning and analytics space. However, if you have human users, even enterprise ones, their tolerance for bad software is much lower today than in the past.” - Brian O’Neill (13:04)
  • “Early on when you're trying to get product market fit, you can't be everything for everyone. You need to be an A+ experience for the person you're trying to satisfy.” -Brian O’Neill (15:39)
  • “Often when I see customization, it is mostly used as a crutch for not making real product strategy and design decisions.”  - Brian O’Neill (16:04) 
  • "Customization of data and dashboard products may be more of a tax than a benefit. In the marketing copy, customization sounds like a benefit...until you actually go in and try to do it. It puts the mental effort to design a good solution on the user." - Brian O’Neill (16:26)
  • “We need to think strategically when implementing Gen AI or just AI in general into the product UX because it won’t automatically help drive sales or increase business value.” - Brian O’Neill (20:50) 
  • “A lot of times our analytics and machine learning tools… are insight decision support products. They’re supposed to be rooted in facts and data, but when it comes to designing these products, there’s not a whole lot of data and facts that are actually informing the product design choices.” - Brian O’Neill (30:37)
  • “If your IP is that special, but also complex, it needs the proper UI/UX design treatment so that the value can be surfaced in such a way someone is willing to pay for it if not also find it indispensable and delightful.” - Brian O’Neill (45:02)
Links
05 Mar 2024138 - VC Spotlight: The Impact of AI on SAAS and Data/Developer Products in 2024 w/ Ellen Chisa of BoldStart Ventures00:33:05

In this episode of Experiencing Data, I speak with Ellen Chisa, Partner at BoldStart Ventures, about what she’s seeing in the venture capital space around AI-driven products and companies—particularly with all the new GenAI capabilities that have emerged in the last year. Ellen and I first met when we were both engaged in travel tech startups in Boston over a decade ago, so it was great to get her current perspective now that she’s on the “other side” of products and companies, working as a VC. Ellen draws on her experience in product management and design to discuss how AI could democratize software creation and streamline backend coding, design integration, and analytics. We also delve into her work at Dark and the future prospects for developer tools and SaaS platforms. Given Ellen’s background in product management, human-centered design, and now VC, I thought she would have a lot to share—and she did!

Highlights/ Skip to:
  • I introduce the show and my guest, Ellen Chisa (00:00)
  • Ellen discusses her transition from product and design to venture capital with BoldStart Ventures. (01:15)
  • Ellen notes a shift from initial AI prototypes to more refined products, focusing on building and testing with minimal data. (03:22)
  • Ellen mentions BoldStart Ventures' focus on early-stage companies providing developer and data tooling for businesses.  (07:00)
  • Ellen discusses what she learned from her time at Dark and Lola about narrowing target user groups for technology products (11:54)
  • Ellen's Insights into the importance of user experience is in product design and the process venture capitalists endure to make sure it meets user needs (15:50)
  • Ellen gives us her take on the impact of AI on creating new opportunities for data tools and engineering solutions (20:00)
  • Ellen and I explore the future of user interfaces, and how AI tools could enhance UI/UX for end users. (25:28)
  • Closing remarks and the best way to find Ellen online (32:07)
Quotes from Today’s Episode
  • “It's a really interesting time in the venture market because on top of the Gen AI wave, we obviously had the macroeconomic shift. And so we've seen a lot of people are saying the companies that come out now are going to be great companies because they're a little bit more capital-constrained from the beginning, typically, and they'll grow more thoughtfully and really be thinking about how do they build an efficient business.”- Ellen Chisa (03:22)
  • “We have this big technological shift around AI-enabled companies, and I think one of the things I’ve seen is, if you think back to a year ago, we saw a lot of early prototyping, and so there were like a couple of use cases that came up again and again.”- Ellen Chisa (3:42)
  • “I don't think I've heard many pitches from founders who consider themselves data scientists first. We definitely get some from ML engineers and people who think about data architecture, for sure.”- Ellen Chisa (05:06)
  • “I still prefer GUI interfaces to voice or text usually, but I think that might be an uncanny valley sort of thing where if you think of people who didn’t have technology growing up, they’re more comfortable with the more human interaction, and then you get, like, a chunk of people who are digital natives who prefer it.”- Ellen Chisa (24:51)
  • [Citing some excellent Boston-area restaurants!] “The Arc browser just shipped a bunch of new functionality, where instead of opening a bunch of tabs, you can say, “Open the recipe pages for Oleana and Sarma,” and it just opens both of them, and so it’s like multiple search queries at once.” - Ellen Chisa (27:22)
  • “The AI wave of technology biases towards people who already have products [in the market] and have existing datasets, and so I think everyone [at tech companies] is getting this big, top-down mandate from their executive team, like, ‘Oh, hey, you have to do something with AI now.’”- Ellen Chisa (28:37)
  • “I think it’s hard to really grasp what an LLM is until you do a fair amount of experimentation on your own. The experience of asking ChatGPT a simple search question compared to the experience of trying to train it to do something specific for you are quite different experiences. Even beyond that, there’s a tool called superwhisper that I like that you can take audio content and end up with transcripts, but you can give it prompts to change your transcripts as you’re going. So, you can record something, and it will give you a different output if you say you’re recording an email compared to [if] you’re recording a journal entry compared to [if] you’re recording the transcript for a podcast.”- Ellen Chisa (30:11)
Links
24 Aug 2021072 - How to Get Stakeholders to Reveal What They Really Need From a Data Product with Cindy Dishmey Montgomery00:38:49
Episode Description

How do you extract the real, unarticulated needs from a stakeholder or user who comes to you asking for AI, a specific app feature, or a dashboard? 

On this episode of Experiencing Data, Cindy Dishmey Montgomery, Head of Data Strategy for Global Real Assets at Morgan Stanley, was gracious enough to let me put her on the spot and simulate a conversation between a data product leader and customer.

I played the customer, and she did a great job helping me think differently about what I was asking her to produce for me — so that I would be getting an outcome in the end, and not just an output. We didn’t practice or plan this exercise, it just happened — and she handled it like a pro! I wasn’t surprised; her product and user-first approach told me that she had a lot to share with you, and indeed she did!  

A computer scientist by training, Cindy has worked in data, analytics and BI roles at other major companies, such as Revantage, a Blackstone real estate portfolio company, and Goldman Sachs. Cindy was also named one of the 2021 Notable Women on Wall Street by Crain’s New York Business.

Cindy and I also talked about the “T” framework she uses to achieve high-level business goals, as well as the importance for data teams to build trust with end-users.

 

In our chat, we covered:

  • Bringing product management strategies to the creation of data products to build adoption and drive value. (0:56)
  • Why the first data hire when building an internal data product should be a senior leader who is comfortable with pushing back. (3:54)
  • The "T" Framework: How Cindy, as Head of Data Strategy, Global Real Assets at Morgan Stanley, works to achieve high-level business goals. (8:48)
  • How building trust with internal stakeholders by creating valuable and smaller data products is key to eventually working on bigger data projects. (12:38)
  • How data's role in business is still not fully understood. (18:17)
  • The importance for data teams to understand a stakeholder's business problem and also design a data product solution in collaboration with them. (24:13)
  • 'Where's the why': Cindy and Brian roleplay as a data product manager and a customer, respectively, and simulate how to successfully identify a customer’s problem and also open them up to new solutions. (28:01)
  • The benefits of a data product management role — and why 'everyone should understand product.' (33:49)
Quotes from Today’s Episode

“There’s just so many good constructs in the product management world that we have not yet really brought very close to the data world. We tend to start with the skill sets, and the tools, and the ML/AI … all the buzzwords. [...] But brass tacks: when you have a happy set of consumers of your data products, you’re creating real value.” - Cindy Dishmey Montgomery (1:55)

 

“The path to value lies through adoption and adoption lies through giving people something that actually helps them do their work, which means you need to understand what the problem space is, and that may not be written down anywhere because they’re voicing the need as a solution.” - Brian O’Neill (@rhythmspice) (4:07)

 

“I think our data community tends to over-promise and under-deliver as a way to get the interest, which it’s actually quite successful when you have this notion of, ‘If you build AI, profit will come.’ But that is a really, really hard promise to make and keep.” - Cindy Dishmey Montgomery (12:14)

 

“[Creating a data product for a stakeholder is] definitely something where you have to be close to the business problem and design it together. … The struggle is making sure organizations know when the right time and what the right first hire is to start that process.” - Cindy Dishmey Montgomery (23:58)

 

“The temporal aspect of design is something that’s often missing. We talk a lot about the artifacts: the Excel sheet, the dashboard, the thing, and not always about when the thing is used.” - Brian O’Neill (@rhythmspice) (27:27)

“Everyone should understand product. And even just creating the language of product is very helpful in creating a center of gravity for everyone. It’s where we invest time, it’s how it’s meant to connect to a certain piece of value in the business strategy. It’s a really great forcing mechanism to create an environment where everyone thinks in terms of value. And the thing that helps us get to value, that’s the data product.” - Cindy Dishmey Montgomery (34:22)

Links Referenced
25 Jan 2022083 -Why Bob Goodman Thinks Product Management and Design Must Dance Together to Create “Experience Layers” for Data Products00:33:08

Design takes many forms and shapes. It is an art, a science, and a method for problem solving. For Bob Goodman, a product management and design executive, the way to view design is as a story and a narrative that conveys the solution to the customer. As a former journalist with 20 years of experience in consumer and enterprise software, Bob has a unique perspective on enabling end-user decision making with data. Having worked in both product management and UX, Bob shapes the narrative on approaching product management and product design as parts of a whole, and we talked about how data products fit into this model. Bob also shares why he believes design and product need to be under the same umbrella to prevent organizational failures. We also discussed the challenges and complexities that come with delivering data-driven insights to end users when ML and analytics are behind the scenes.

  • An overview of Bob’s recent work as an SVP of product management - and why design, UX and product management were unified. (00:47)
  • Bob’s thoughts on centralizing the company data model - and how this data and storytelling are integral to the design process. (06:10)
  • How product managers and data scientists can gain perspective on their work. (12:22)
  • Bob describes a recent dashboard and analytics product, and how customers were involved in its creation. (18:30)
  • How “being wrong” is a method of learning - and a look at what Bob calls the “spotlight challenge.” (23:04)
  • Why productizing data science is challenging. (30:14)
  • Bob’s advice for making trusted data products. (33:46)
Quotes from Today’s Episode
  • “[I think of] product management and product design as a unified function. How do those work together? There’s that Steve Jobs quote that we all know and love, that design is not just what it looks like but it’s also how it works, and when you think of it that way, kind of end-to-end, you start to see product management and product design as very unified.”- Bob Goodman (@bob_goodman) (01:34)
  • “I have definitely experienced that some people see product management and design and UX as quite separate [...] And this has been a fascinating discovery because I think as a hybrid person, I didn’t necessarily draw those distinctions. [...] From a product and design standpoint, I personally was often used to, especially in startup contexts, starting with the data that we had to work with [...] and saying, ‘Oh, this is our object model, and this is where we have context, [...] and this is the end-to-end workflow.’ And I think it’s an evolution of the industry that there’s been more and more specialization, [and] training, and it’s maybe added some barriers that didn’t exist between these disciplines [in the past].”- Bob Goodman (@bob_goodman) (03:30)
  • “So many projects tend to fail because no one can really define what good means at the beginning. The strategy is not clear, the problem set is not clear. If you have a data team that thinks the job is to surface the insights from this data, a designer is thinking about the users’ discrete tasks, feelings, and objectives. They are not there to look at the data set; they are there to answer a question and inform a decision. For example, the objective is not to look at sleep data; it may be to understand, ‘am I’m getting enough rest?’”- Brian T. O’Neill (@rhythmspice) (08:22)
  • “I imagine that when one is fascinated by data, it might be natural to presume that everyone will share this equal fascination with a sort of sleuthing or discovery. And then it’s not the case; it’s TL;DR. And so, often users want the headline, or they even need the kind of headline news to start at a glance. And so this is where this idea of storytelling with data comes in, and some of the research [that helps us] understand the mindset that consumers come to the table with.”- Bob Goodman (@bob_goodman) (09:51)

 

  • “You were talking about this technologist’s idea of being ‘not user right, but it’s data right.’ I call this technically right, effectively wrong. This is not an infrequent thing that I hear about where the analysis might be sound, or the visualization might technically be the right thing for a certain type of audience. The difference is, are we designing for decision-making or are we designing to display the data that does tell some story, whether or not it informs the human decision-making that we’re trying to support? The latter is what most analytics solutions should strive to be”- Brian T. O’Neill (@rhythmspice) (16:11)
  • “We were working to have a really unified approach and data strategy, and to deliver on that in the best possible way for our clients and our end-users [...]. There are many solutions for custom reports, and drill-downs and data extracts, and we have all manner of data tooling. But in the part that we’re really productizing with an experience layer on top, we’re definitely optimizing on the meaningful part versus the display side [which] maybe is a little bit of a ‘less is more’ type of approach.”- Bob Goodman (@bob_goodman) (17:25)

 

  • “Delivering insights is simply the topic that we’re starting with, which is just as a user, as a reader, especially a business reader, ‘how much can I intake? And what do I need to make sense of it?’ How declarative can you be, responsibly and appropriately, to bring the meaning and the insights forward? There might be a line that’s too much.”- Bob Goodman (@bob_goodman) (33:02)
Links Referenced
10 Aug 2021071 - The ROI of UX Research and How It Applies to Data Products with Bill Albert00:45:30

There are many benefits in talking with end users and stakeholders about their needs and pain points before designing a data product. 

 

Just take it from Bill Albert, executive director of the Bentley University User Experience Center, author of Measuring the User Experience, and my guest for this week’s episode of Experiencing Data. With a career spanning more than 20 years in user experience research, design, and strategy, Bill has some great insights on how UX research is pivotal to designing a useful data product, the different types of customer research, and how many users you need to talk to to get useful info.

 

In our chat, we covered:

  • How UX research techniques can help increase adoption of data products. (1:12)
  • Conducting 'upfront research': Why talking to end users and stakeholders early on is crucial to designing a more valuable data product. (8:17)
  • 'A participatory design process': How data scientists should conduct research with stakeholders before and during the designing of a data product. (14:57)
  • How to determine sample sizes in user experience research -- and when to use qualitative vs. quantitative techniques. (17:52)
  • How end user research and design improvements helped Boston Children's Hospital drastically increase the number of recurring donations. (24:38)
  • How a person's worldview and experiences can shape how they interpret data. (32:38)
  • The value of collecting metrics that reflect the success and usage of a data product. (38:11)
Quotes from Today’s Episode

“Teams are constantly putting out dashboards and analytics applications — and now it’s machine learning and AI— and a whole lot of it never gets used because it hits all kinds of human walls in the deployment part.” - Brian (3:39)

 

“Dare to be simple. It’s important to understand giving [people exactly what they] want, and nothing more. That’s largely a reflection of organizational maturity; making those tough decisions and not throwing out every single possible feature [and] function that somebody might want at some point.” - Bill (7:50)

 

“As researchers, we need to more deeply understand the user needs and see what we’re not observing in the lab [and what] we can’t see through our analytics. There’s so much more out there that we can be doing to help move the experience forward and improve that in a substantial way.” - Bill (10:15)

 

“You need to do the upfront research; you need to talk to stakeholders and the end users as early as possible. And we’ve known about this for decades, that you will get way more value and come up with a better design, better product, the earlier you talk to people.” - Bill (13:25)

 

“Our research methods don’t change because what we’re trying to understand is technology-agnostic. It doesn’t matter whether it’s a toaster or a mobile phone — the questions that we’re trying to understand of how people are using this, how can we make this a better experience, those are constant.” - Bill (30:11)

 

“I think, what’s called model interpretability sometimes or explainable AI, I am seeing a change in the market in terms of more focus on explainability, less on model accuracy at all costs, which often likes to use advanced techniques like deep learning, which are essentially black box techniques right now. And the cost associated with black box is, ‘I don’t know how you came up with this and I’m really leery to trust it.’” - Brian (31:56)

Resources and Links:
16 Nov 2021078 - From Data to Product: What is Data Product Management and Why Do We Need It with Eric Weber00:40:46

Eric Weber, Head of Data Product at Yelp, has spent his career developing a product-minded approach to producing data-driven solutions that actually deliver value. For Eric, developing a data product mindset is still quite new, and today we’re digging into all things “data product management” and why thinking of data with a product mindset matters.

In our conversation, Eric defines what data products are and explains the value that data product managers can bring to their companies. Eric’s ethos of centering on empathy, balanced equally with technical credibility, is central to his perspectives on data product management. We also discussed how Eric is bringing all of this to bear at Yelp and the various ways they’re tackling their customers’ data product needs.

In this episode, we also cover:

  • What is a data product and why do we need data product management? (01:34)
  • Why successful data product managers carry two important traits - empathy and technical credibility. (10:47)
  • A discussion about the levels of problem-solving maturity, the challenge behind delivering solutions, and where product managers can be the most effective during the process. (16:54)
  • A look at Yelp’s customer research strategy and what they are focusing on to optimize the user experience. (21:28)
  • How Yelp’s product strategy is influenced by classes of problems – and Yelp’s layers of experimentation. (27:38)
  • Eric reflects on unlearning and talks about his newsletter, From Data to Product. (34:36)
Quotes from Today’s Episode
  • “Data products bring companies a way to think about the long-term viability and sustainability of their data investments. [...] And part of that is creating things that are sustainable, that have a strategy, that have a customer in mind. And a lot of these things people do - maybe they don't call it out explicitly, but this is a packaging that I think focuses us in the right places rather than hoping for the best.”-  Eric Weber (@edweber1) (02:43)
  • “My hypothesis right now is that by introducing [product management] as a role, you create a vision for our product that is not just tied to a person, it's not just tied to a moment in time of the company. It's something where you can actually have another product manager come in and understand where things are headed. I think that is really the key to seeing the 10 to 20-year sustainability, other than crossing your fingers and hoping that one person stays for a long time, which is kind of a tough bet in this environment.”- Eric Weber (@edweber1) (07:27)
  • “My background is in design and one of the things that I have to work on a lot with my clients and with data scientists in particular, is getting out of the head of wanting to work on “the thing” and learning how to fall in love with the customer's problem and their need. And this whole idea of empathy, not being a squishy thing, but do you want your work to matter? Or, do you just write code or work on models all day long and you don't care if it ships and makes a difference? I think good product-minded people care a lot about that outcome. So, this output versus outcome thing is a mindset change that has to happen.”- Brian T. O’Neill (@rhythmspice) (10:56)
  • “The question about whether you focus on internal development or external buying often goes back to, what is your business trying to do? And how much is this going to cost us over time? And it's fascinating because I want [anyone listening] to come across [the data product] field as an area in motion. It's probably going to look pretty different a year from now, which I find pretty awesome and fascinating myself.”- Eric Weber (@edweber1) (27:02)
  • “If you don't have a deep understanding of what your customer is trying to do and are able to abstract it to some general class of problem, you're probably going to end up building a solution that's too narrow and not sustainable because it will solve something in the short term. But, what if you have to re-architect the whole thing? That's where it becomes really expensive and where having a product strategy pays off.”- Eric Weber (@edweber1) (31:28)
  • “I've had to unlearn that idea that I need to create a definitive framework of what someone does. I just need to be able to put on different lenses. [For example] if I'm talking to design today, these are probably the things that they're going to be focused on and concerned about. If I'm talking to our executive team, this is probably how they're going to break this problem down and look at it. So, I think it's not necessarily dropping certain frameworks, it's being able to understand that some of them are useful in certain scenarios and they're not in others. And that ability is something that I think has created this chance for me to look at the data product from different spaces and think about why it might be valuable.”- Eric Weber (@edweber1) (35:54)
Links

 

01 Nov 2022 | 103 - Helping Pediatric Cardiac Surgeons Make Better Decisions with ML featuring Eugenio Zuccarelli of MIT Media Lab | 00:42:33

Today I’m chatting with Eugenio Zuccarelli, Research Scientist at MIT Media Lab and Manager of Data Science at CVS. Eugenio explains how he has created multiple algorithms designed to help shape decisions made in life-or-death situations, such as pediatric cardiac surgery and during the COVID-19 pandemic. Eugenio shares the lessons he’s learned about building trust in data when the stakes are life and death. Listen and learn how culture can affect adoption of decision support and ML tools, the impact the delivery of information has on a user’s ability to understand and use data, and why Eugenio feels that design is more important than the inner workings of ML algorithms.

 

Highlights/ Skip to:

  • Eugenio explains why he decided to work on machine learning models for cardiologists and healthcare workers involved in the COVID-19 pandemic (01:53) 
  • The workflow surgeons would use when incorporating the predictive algorithm and application Eugenio helped develop (04:12)
  • The question Eugenio’s predictive algorithm helps surgeons answer when evaluating whether to use various pediatric cardiac surgical procedures (06:37)
  • The path Eugenio took to build trust with experienced surgeons and drive product adoption, and the role UX played (09:42)
  • Eugenio’s approach to identifying key problems and finding solutions using data (14:50)
  • How Eugenio has tracked value delivery and adoption success for a tool that relies on more than just accurate data & predictions, but also surgical skill and patient case complexity (22:26)
  • The design process Eugenio started early on to optimize user experience and adoption (28:40)
  • Eugenio’s key takeaways from a different project that helped government agencies predict what resources would be needed in which areas during the COVID-19 pandemic (34:45)
Quotes from Today’s Episode
  • “So many people today are developing machine-learning models, but I truly find the most difficult parts to be basically everything around machine learning … culture, people, stakeholders, products, and so on.” — Eugenio Zuccarelli (01:56)
  • “Developing machine-learning components, clean data, developing the machine-learning pipeline, those were the easy steps. The difficult ones were gaining trust, as you said, and developing something that was useful. And talking about trust, it’s especially tricky in the healthcare industry.” — Eugenio Zuccarelli (10:42)

 

  • “Because this tennis match, this ping-pong match between what can be done and what’s [the] problem [...] thankfully, we know, of course, it is not really the route to go. We don’t want to develop technology for the sake of it.” — Eugenio Zuccarelli (14:49)

 

  • “We put so much effort on the machine-learning side and then the user experience is so key, it’s probably even more important than the inner workings.” — Eugenio Zuccarelli (29:22)

 

  • “It was interesting to see exactly how the doctor is really focused on their job and doing it as well as they can, not really too interested in fancy [...] solutions, and so we were really able to not focus too much on appearance or fancy components, but more on usability and readability.” — Eugenio Zuccarelli (33:45)

 

  • “People’s ability to trust data, and how this varies from a lot of different entities, organizations, countries, [etc.] This really makes everything tricky. And of course, when you have a pandemic, this acts as a catalyst and enhances all of these cultural components.” — Eugenio Zuccarelli (35:59)

 

  • “I think [design success] boils down to delivery. You can package the same information in different ways [so that] it actually answers their questions in the ways that they’re familiar with.” — Eugenio Zuccarelli (37:42)
Links
07 Mar 2023 | 112 - Solving for Common Pitfalls When Developing a Data Strategy featuring Samir Sharma, CEO of datazuum | 00:35:18

Today I’m chatting with Samir Sharma, CEO of datazuum. Samir is passionate about developing data strategies that drive business outcomes, and shares valuable insights into how problem framing and research can be done effectively from both the data and business sides. Samir also provides his definition of a data strategy and explains why it can be complicated to uncover whose job it is to create one. Throughout the conversation, Samir and I uncover the value of including different perspectives when implementing a data strategy and discuss solutions to various communication barriers. Of course, dashboards and data products popped up in this episode as well!

 

Highlights/ Skip to:

  • How Samir defines a data strategy and whose job it is to create one (01:39)
  • The challenges Samir sees when trying to uncover and understand a company’s existing data strategy (03:39)
  • The problem with the problem statements that Samir commonly encounters (08:37)
  • Samir unpacks the communication challenges that lead to negative business outcomes when developing data products (14:05)
  • An example of how improving research and problem framing solved a problem for Samir’s first big client (24:33)
  • How speaking in a language your users understand can open the door to more exciting and valuable projects (31:08)
Quotes from Today’s Episode
  • “I don’t think business teams really care how you do it. If you can get an outcome—even if it’s quick and dirty. We’re not supposed to be doing these things for months on end. We’re supposed to be iterating quickly to start to show that result and add value and then building on top of that to show more value, more results.” — Samir Sharma (07:29)
  • “Language is so important for business teams and technical teams and data teams to actually be able to speak a common language which has common business constructs. Why are organizations trying to train 20,000 people on data literacy, when they’ve got a ten-person data team? Why not just teach the ten people in the data team business language?” — Samir Sharma (10:52)

 

  • “I will continuously talk about processes because there’s not enough done actually understanding processes and how data is an event that occurs when a process is kicked off. … If you don’t understand the process and how data is enabling that process, or how data is being generated and the trigger points, then you’re just building something without really understanding where I need to fit that product in or where I need to fit that workflow in.” – Samir Sharma (11:46)

 

  • “But I start with asking clear questions about if I built you this dashboard, what is the decision you’re going to make off the back of it? Nine times out of ten, that question isn’t asked, if I build you this widget on this dashboard, what decision or action are you going to make or take? And how is that going to be linked back to the map that strategic objective? And if you can ask that question, you can build with purpose.” – Samir Sharma (19:27)

 

  • “You show [users] a bit of value, you show them what they’ve been dying to have, you give them a little bit extra in that so they can really optimize their decisions, and suddenly, you’ve got both sides now speaking a language that is really based on business outcomes and results.” – Samir Sharma (32:38)

 

  •  “If the people in that conversation are the developers on one side, the business team, and they’re starting to see a new narrative, even the developers will start to say, “Oh! Now, I know exactly why I’m doing this. Now, I know why I’m building it.” So, they’re also starting to learn about the business, about what impacts sales, and maybe how marketing then intertwines into that. It’s important that that is done, but not enough time has been taken on that approach.” – Samir Sharma (24:05)
  • “The thing for me is, business teams don’t know what they don’t know, right? Most of the time, they’re asking a question. If I was on the data team and I’d already built a dashboard that would [answer that question], then I haven’t built it properly in the first instance. What I’ve done is I’ve built it for the beauty and the visualization instead of what I would class as the ugliness and impact that I need.” – Samir Sharma (17:05)
Links
23 Jan 2024 | 135 - “No Time for That:” Enabling Effective Data Product UX Research in Product-Immature Organizations | 00:52:47

This week, I’m chatting with Steve Portigal, who is the Principal of Portigal Consulting and the Author of Interviewing Users. We discuss the changes that prompted him to release a second edition of his book 10 years after its initial publication, and dive into the best practices that any team can implement to start unlocking the value of data product UX research. Steve explains that the key to making time for user research is knowing what business value you’re after, not simply having a list of research questions. We then role-play through some in-depth examples of real-life experiences we’ve seen from both end users and leadership when it comes to implementing a user research strategy. Throughout our conversation, we come back to the idea that even taking imperfect action towards doing user research can lead to increased data product adoption and business value.

Highlights/ Skip to:

  • I introduce Steve Portigal, Principal of Portigal Consulting and Author of Interviewing Users (00:38)
  • What changes caused Steve to release a second edition of his book (00:58)
  • Steve and I discuss the importance of understanding how to conduct effective user research (03:44)
  • Steve explains why it’s crucial to understand that the business challenge and the research questions are two different things (08:16)
  • Steve and I role-play a common scenario that comes up in user research, and Steve explains an optimal research workflow (11:50)
  • The importance of provocation in performing user research (21:02)
  • How Steve would handle a situation where a member of leadership is preventing research being done with end users (24:23)
  • Why a consultative approach is valuable when getting buy-in for conducting user research (35:04)
  • Steve shares some of the major benefits of taking imperfect action towards starting user research (36:59)
  • The impact and value even easy wins in user research can have (42:54)
  • Steve describes the exploratory nature of user research and how to maximize the chance of finding the most valuable insights (46:57)
  • Where you can connect with Steve and get a copy of v2 of his book, Interviewing Users (49:35)
Quotes from Today’s Episode
  • “If you don’t know what you’re doing, and you don’t know what you should be investing effort-wise, that’s the inexperience in the approach. If you don’t know how to plan, what should we be trying to solve in this research? What are we trying to learn? What are we going to do with it in the organization? Who should we be talking to? How do we find them? What do we ask them? And then a really good one: how do we make sense of that information so that it has impact that we can take away?” — Steve Portigal (07:15)
  • “What do people get [from user research]? I think the chance for a team to align around something that comes in from the outside.” – Steve Portigal (41:36)
  • On the impact user research can have if teams embrace it: “They had a product that did a thing that no one [understood], and they had to change the product, but also change how they talked about it, change how they built it, and change how they packaged it. And that was a really dramatic turnaround. And it came out of our research, but [mostly] because they really leaned into making use of this stuff.” – Steve Portigal (42:35)
  • "If we knew all the questions to ask, we would just write a survey, right? It’s a lower time commitment from the participant to do that. But we’re trying to get at what we don’t know that we don’t know. For some of us, that’s fun!" – Steve Portigal (48:36)
Links

 

05 Oct 2021 | 075 - How CDW is Integrating Design Into Its Data Science and Analytics Teams with Prasad Vadlamani | 00:42:11

How do we get the most breadth out of design and designers when building data products? One way is to have designers leading the charge when it comes to creating data products that must be useful, usable, and valuable.

 

For this episode, Prasad Vadlamani, CDW’s Director of Data Science and Advanced Analytics, joins us for a chat about how they are making design a larger focus in how they create useful, usable data products. Prasad talks about the importance of making technology—including AI-driven solutions—human-centered, and how CDW tries to keep the end user in mind.

Prasad and I also discuss his perspectives on how to build designers into a data product team and how to successfully navigate the grey areas between different domains of expertise. When this is done well, the entire team can play to each other’s strengths to create a more robust product. We also discuss the role a UI-free user experience plays in some data products, some differences between external and internally facing solutions, and some of Prasad’s valuable takeaways that have helped shape the way he thinks design, data science, and analytics can collaborate.

 

In our chat, we covered: 

 

  • Prasad’s first introduction to designers and how he leverages the disciplines of design and product in his data science and analytics work (1:09)
  • The terminology behind product manager and designer and how these functions play a role in an enterprise AI team (5:18)
  • How teams can use their wide range of competencies to their advantage (8:52)
  • A look at one UI-less experience and the value of the “invisible interface” (14:58)
  • Understanding the model development process and why the model takes up only a small percentage of the effort required to successfully bring a data product to end users (20:52)
  • The differences between building an internal vs. external product, what to consider, and Prasad’s “customer zero” approach (29:17)
  • Expectations Prasad sets with customers (stakeholders) about the life expectancy of data products when they are in their early stage of development (35:02)
22 Feb 2022 | 085 - Dr. William D. Báez on the Journey and ROI of Integrating UX Design into Machine Learning and Analytics Solutions | 00:44:42

Why design matters in data products is a question that, at first glance, may not be easily answered until you see users try to use ML models and analytics to make decisions. For Bill Báez, a data scientist and VP of Strategy at Ascend Innovations, the realization that design and UX matter in this context grew over the course of a few years. Bill’s origins in the Air Force, and his transition to Ascend Innovations, instilled lessons about the importance of using design thinking with both clients and users.

 

After observing solutions built in total isolation with zero empathy and knowledge of how they were being perceived in the wild, Bill realized the critical need to bring developers “upstairs” to actually observe the people using the solutions that were being built. 

 

Currently, Ascend Innovations’ consulting is primarily rooted in healthcare and community services, and in this episode, Bill provides some real-world examples where their machine learning and analytics solutions were informed by approaching the problems from a human-centered design perspective. Bill also dives into where he is on his journey to integrate his UX and data science teams at Ascend so they can create better value for their clients and their clients’ constituents.

Highlights in this episode include:

  • What caused Bill to notice design for the first time and its importance in data products (03:12)
  • Bridging the gap between data science, UX, and the client’s needs at Ascend (08:07)
  • How to deal with the “presenting problem” and work with feedback (16:00)
  • Bill’s advice for getting designers, UX, and clients on the same page based on his experience to date (23:56)
  • How Bill provides unity for his UX and data science teams (32:40)
  • The effects of UX in medicine (41:00)
Quotes from Today’s Episode
  • “My journey into Design Thinking started in earnest when I started at Ascend, but I didn’t really have the terminology to use. For example, Design Thinking and UX were actually terms I was not personally aware of until last summer. But now that I know and have been exposed to it and have learned more about it, I realize I’ve been doing a lot of that type of work in earnest since 2018.” - Bill (03:37)
  • “Ascend Innovations has always been product-focused, although again, services is our main line of business. As we started hiring a more dedicated UX team, people who’ve been doing this for their whole career, it really helped me to understand what I had experienced prior to coming to Ascend. Part of the time I was here at Ascend that UX framework and that Design Thinking lens, it really brings a lot more firepower to what data science is trying to achieve at the end of the day.” - Bill (08:29)
  • “Clients were surprised that we were asking such rudimentary questions.  They’ll say ‘Well, we’ve already talked about that,’ or, ‘It should be obvious.’ or ‘Well, why are you asking me such a simple question?’ And we had to explain to them that we wanted to start at the bottom to move to the top. We don’t want to start somewhere midway and get the top. We want to make sure that we are all in alignment with what we’re trying to do, so we want to establish that baseline of understanding. So, we’re going to start off asking very simple questions and work our way up from there...” - Bill (21:09)
  • “We’re building a thing, but the thing only has value if it creates a change in the world. The world being, in the mind of the stakeholder, in the minds of the users, maybe some third parties that are affected by that stuff, but it’s the change that matters. So what is the better state we want in the future for our client or for our customers and users? That’s the thing we’re trying to create. Not the thing; the change from the thing is what we want, and getting to that is the hard part.” - Brian (@rhythmspice) (26:33)
  • “This is a gift that you’re giving to [stakeholders] to save time, to save money, to avoid building something that will never get used and will not provide value to them. You do need to push back against this and if they say no, that’s fine. Paint the picture of the risk, though, by not doing design. It’s very easy for us to build a ML model. It’s hard for us to build a model that someone will actually use to make the world better. And in this case, it’s healthcare or support, intervention support for addicts. “Do you really want a model, or do you want an improvement in the lives of these addicts? That’s ultimately where we’re going with this, and if we don’t do this, the risk of us pushing out an output that doesn’t get used is high. So, design is a gift, not a tax...” - Brian (@rhythmspice) (34:34)
  • “I’d say to anybody out there right now who’s currently working on data science efforts: the sooner you get your people comfortable with the idea of doing Design Thinking, get them implemented into the projects that are currently going on. [...] I think that will be a real game-changer for your data scientists and your organization as a whole...” - Bill  (42:19)

 

06 Apr 2021 | 062 - Why Ben Shneiderman is Writing a Book on the Importance of Designing Human-Centered AI | 00:38:28

Ben Shneiderman is a leading figure in the field of human-computer interaction (HCI). 

Having founded one of the oldest HCI research centers in the country at the University of Maryland in 1983, Shneiderman has been intently studying the design of computer technology and its use by humans. Currently, Ben is a Distinguished University Professor in the Department of Computer Science at the University of Maryland and is working on a new book on human-centered artificial intelligence.

 

I’m so excited to welcome this expert from the field of UX and design to today’s episode of Experiencing Data! Ben and I talked a lot about the complex intersection of human-centered design and AI systems.

 

In our chat, we covered:

  • Ben's career studying human-computer interaction and computer science. (0:30)
  • 'Building a culture of safety': Creating and designing ‘safe, reliable and trustworthy’ AI systems. (3:55)
  • 'Like zoning boards': Why Ben thinks we need independent oversight of privately created AI. (12:56)
  • 'There’s no such thing as an autonomous device': Designing human control into AI systems. (18:16)
  • A/B testing, usability testing and controlled experiments: The power of research in designing good user experiences. (21:08)
  • Designing ‘comprehensible, predictable, and controllable’ user interfaces for explainable AI systems and why [explainable] XAI matters. (30:34)
  • Ben's upcoming book on human-centered AI. (35:55)
Resources and Links:

 

Quotes from Today’s Episode

The world of AI has certainly grown and blossomed — it’s the hot topic everywhere you go. It’s the hot topic among businesses around the world — governments are launching agencies to monitor AI and are also making regulatory moves and rules. … People want explainable AI; they want responsible AI; they want safe, reliable, and trustworthy AI. They want a lot of things, but they’re not always sure how to get them. The world of human-computer interaction has a long history of giving people what they want, and what they need. That blending seems like a natural way for AI to grow and to accommodate the needs of real people who have real problems. And not only the methods for studying the users, but the rules, the principles, the guidelines for making it happen. So, that’s where the action is. Of course, what we really want from AI is to make our world a better place, and that’s a tall order, but we start by talking about the things that matter — the human values: human rights, access to justice, and the dignity of every person. We want to support individual goals, a person’s sense of self-efficacy — they can do what they need to in the world, their creativity, their responsibility, and their social connections; they want to reach out to people. So, those are the sort of high aspirational goals that become the hard work of figuring out how to build it. And that’s where we want to go. - Ben (2:05)

 

The software engineering teams creating AI systems have got real work to do. They need the right kind of workflows, engineering patterns, and Agile development methods that will work for AI. The AI world is different because it’s not just programming, but it also involves the use of data that’s used for training. The key distinction is that the data that drives the AI has to be the appropriate data, it has to be unbiased, it has to be fair, it has to be appropriate to the task at hand. And many people and many companies are coming to grips with how to manage that. This has become controversial, let’s say, in issues like granting parole, or mortgages, or hiring people. There was a controversy that Amazon ran into when its hiring algorithm favored men rather than women. There’s been bias in facial recognition algorithms, which were less accurate with people of color. That’s led to some real problems in the real world. And that’s where we have to make sure we do a much better job and the tools of human-computer interaction are very effective in building these better systems in testing and evaluating. - Ben (6:10)

 

Every company will tell you, “We do a really good job in checking out our AI systems.” That’s great. We want every company to do a really good job. But we also want independent oversight of somebody who’s outside the company — someone who knows the field, who’s looked at systems at other companies, and who can bring ideas and bring understanding of the dangers as well. These systems operate in an adversarial environment — there are malicious actors out there who are causing trouble. You need to understand what the dangers and threats are to the use of your system. You need to understand where the biases come from, what dangers are there, and where the software has failed in other places. You may know what happens in your company, but you can benefit by learning what happens outside your company, and that’s where independent oversight from accounting companies, from governmental regulators, and from other independent groups is so valuable. - Ben (15:04)

 

There’s no such thing as an autonomous device. Someone owns it; somebody’s responsible for it; someone starts it; someone stops it; someone fixes it; someone notices when it’s performing poorly. … Responsibility is a pretty key factor here. So, if there’s something going on, if a manager is deciding to use some AI system, what they need is a control panel, let them know: what’s happening? What’s it doing? What’s going wrong and what’s going right? That kind of supervisory autonomy is what I talk about, not full machine autonomy that’s hidden away and you never see it because that’s just head-in-the-sand thinking. What you want to do is expose the operation of a system, and where possible, give the stakeholders who are responsible for performance the right kind of control panel and the right kind of data. … Feedback is the breakfast of champions. And companies know that. They want to be able to measure the success stories, and they want to know their failures, so they can reduce them. The continuous improvement mantra is alive and well. We do want to keep tracking what’s going on and make sure it gets better. Every quarter. - Ben (19:41)

 

Google has had some issues regarding hiring in the AI research area, and so has Facebook with elections and the way that algorithms tend to become echo chambers. These companies — and this is not through heavy research — probably have the heaviest investment of user experience professionals within data science organizations. They have UX, ML-UX people, UX for AI people, they’re at the cutting edge. I see a lot more generalist designers in most other companies. Most of them are rather unfamiliar with any of this or what the ramifications are on the design work that they’re doing. But even these largest companies that have, probably, the biggest penetration into the most number of people out there are getting some of this really important stuff wrong. - Brian (26:36)

Explainability is a competitive advantage for an AI system. People will gravitate towards systems that they understand, that they feel in control of, that are predictable. So, the big discussion about explainable AI focuses on what’s usually called post-hoc explanations, and the Shapley, and LIME, and other methods are usually tied to the post-hoc approach. That is, you use an AI model, you get a result and you say, “What happened?” Why was I denied a parole, or a mortgage, or a job? At that point, you want to get an explanation. Now, that idea is appealing, but I’m afraid I haven’t seen too many success stories of that working. … I’ve been diving through this for years now, and I’ve been looking for examples of good user interfaces of post-hoc explanations. It took me a long time till I found one. The culture of AI model-building would be much bolstered by an infusion of thinking about what the user interface will be for these explanations. And even DARPA’s XAI (Explainable AI) project, which has 11 projects within it, has not really grappled with this in a good way about designing what it’s going to look like. Show it to me. … There is another way. And the strategy is basically prevention. Let’s prevent the user from getting confused and so they don’t have to request an explanation. We walk them along, let the user walk through the step—this is like Amazon’s checkout process, a seven-step process—and you know what’s happened in each step, you can go back, you can explore, you can change things in each part of it. It’s also what TurboTax does so well, in really complicated situations, and walks you through it. … You want to have a comprehensible, predictable, and controllable user interface that makes sense as you walk through each step. - Ben (31:13)

17 Sep 2024 | 152 - 10 Reasons Not to Get Professional UX Design Help for Your Enterprise AI or SAAS Analytics Product | 00:53:00

In today’s episode, I’m going to perhaps work myself out of some consulting engagements, but hey, that’s ok! True consulting is about service—not PPT decks with strategies and tiers of people attached to rate cards. Specifically today, I decided to reframe a topic and approach it from the opposite/negative side. So, instead of telling you when the right time is to get UX design help for your enterprise SAAS analytics or AI product(s), today I’m going to tell you when you should NOT get help! 

 

Reframing this was really fun and made me think a lot as I recorded the episode. Some of these reasons aren’t necessarily representative of what I believe, but rather what I’ve heard from clients and prospects over 25 years—what they believe. For each of these, I’m also giving a counterargument, so hopefully, you get both sides of the coin. 

 

Finally, analytical thinkers, especially data product managers it seems, often want to quantify all forms of value they produce in hard monetary units—and so in this episode, I’m also going to talk about other forms of value that products can create that are worth paying for—and how mushy things like “feelings” might just come into play ;-)  Ready?

 

 

Highlights/ Skip to:
  • (1:52) Going for short, easy wins
  • (4:29) When you think you have good design sense/taste 
  • (7:09) The impending changes coming with GenAI
  • (11:27) Concerns about "dumbing down" or oversimplifying technical analytics solutions that need to be powerful and flexible
  • (15:36) Agile and process FTW?
  • (18:59) UX design for and with platform products
  • (21:14) The risk of involving designers who don’t understand data, analytics, AI, or your complex domain considerations 
  • (30:09) Designing after the ML models have been trained—and it’s too late to go back 
  • (34:59) Not tapping professional design help when your user base is small, and you have routine access and exposure to them
  • (40:01) Explaining the value of UX design investments to your stakeholders when you don’t 100% control the budget or decisions 

 

Quotes from Today’s Episode
  • “It is true that most impactful design often creates more product and engineering work because humans are messy. While there sometimes are these magic, small GUI-type changes that have big impact downstream, the big picture value of UX can be lost if you’re simply assigning low-level GUI improvement tasks and hoping to see a big product win. It always comes back to the game you’re playing inside your team: are you working to produce UX and business outcomes or shipping outputs on time? ” (3:18)
  • “If you’re building something that needs to generate revenue, there has to be a sense of trust and belief in the solution. We’ve all seen the challenges of this with LLMs, [when] you’re unable to get it to respond in a way that makes you feel confident that it understood the query to begin with. And then you start to have all these questions about, ‘Is the answer not in there,’ or ‘Am I not prompting it correctly?’ If you think that most of this is just a technical data science problem, then don’t bother to invest in UX design work…” (9:52)
  • “Design is about, at a minimum, making it useful and usable, if not delightful. In order to do that, we need to understand the people that are going to use it. What would an improvement to this person’s life look like? Simplifying and dumbing things down is not always the answer. There are tools and solutions that need to be complex, flexible, and/or provide a lot of power – especially in an enterprise context. Working with a designer who solely insists on simplifying everything at all costs regardless of your stated business outcome goals is a red flag—and a reason not to invest in UX design—at least with them!” (12:28)
  • “I think what an analytics product manager [or] an AI product manager needs to accept is there are other ways to measure the value of UX design’s contribution to your product and to your organization. Let’s say that you have a mission-critical internal data product, it’s used by the most senior executives in the organization, and you and your team made their day, or their month, or their quarter. You saved their job. You made them feel like a hero. What is the value of giving them that experience and making them feel like those things… What is that worth when a key customer or colleague feels like you have their back with this solution you created? Ideas that spread, win, and if these people are spreading your idea, your product, or your solution… there’s a lot of value in that.” (43:33)
  • “Let’s think about value in non-financial terms. Terms like feelings. We buy insurance all the time. We’re spending money on something that most likely will have zero economic value this year because we’re actually trying not to have to file claims. Yet this industry does very well because the feeling of security matters. That feeling is worth something to a lot of people. The value of feeling secure is something greater than whatever the cost of the insurance plan. If your solution can build feelings of confidence and security, what is that worth? Does “hard to measure precisely” necessarily mean “low value”?” (47:26)
21 Jan 2025 | 161 - Designing and Selling Enterprise AI Products [Worth Paying For] | 00:34:00

With GenAI and LLMs comes great potential to delight and damage customer relationships—both during the sale, and in the UI/UX. However, are B2B AI product teams actually producing real outcomes, on the business side and the UX side, such that customers find these products easy to buy, trustworthy and indispensable? 

 

What is changing with customer problems as a result of LLM and GenAI technologies becoming more readily available to implement into B2B software? Anything?

 

Is your current product or feature development being driven by the fact you might be able to now solve it with AI? The “AI-first” team sounds like it’s cutting edge, but is that really determining what a customer will actually buy from you? 

 

Today I want to talk to you about the interplay of GenAI, customer trust (both user and buyer trust), and the role of UX in products using probabilistic technology.  

 

These thoughts are based on my own perceptions as a “user” of AI “solutions,” (quotes intentional!), conversations with prospects and clients at my company (Designing for Analytics), as well as the bright minds I mentor over at the MIT Sandbox innovation fund. I also wrote an article about this subject if you’d rather read an abridged version of my thoughts.

 

Highlights/ Skip to:

  • AI and LLM-Powered Products Do Not Turn Customer Problems into “Now” and “Expensive” Problems (4:03)
  • Trust and Transparency in the Sale and the Product UX: Handling LLM Hallucinations (Confabulations) and Designing for Model Interpretability (9:44)
  • Selling AI Products to Customers Who Aren’t Users (13:28)
  • How LLM Hallucinations and Model Interpretability Impact User Trust of Your Product (16:10)
  • Probabilistic UIs and LLMs Don’t Negate the Need to Design for Outcomes (22:48)
  • How AI Changes (or Doesn’t) Our Benchmark Use Cases and UX Outcomes (28:41)
  • Closing Thoughts (32:36)
Quotes from Today’s Episode
  • “Putting AI or GenAI into a product does not change the urgency or the depth of a particular customer problem; it just changes the solution space. Technology shifts in the last ten years have enabled founders to come up with all sorts of novel ways to leverage traditional machine learning, symbolic AI, and LLMs to create new products and disrupt established products; however, it would be foolish to ignore these developments as a product leader. All this technology does is change the possible solutions you can create. It does not change your customer situation, problem, or pain, either in the depth, or severity, or frequency. In fact, it might actually cause some new problems. I feel like most teams spend a lot more time living in the solution space than they do in the problem space. Fall in love with the problem and love that problem regardless of how the solution space may continue to change.” (4:51)
  • “Narrowly targeted, specialized AI products are going to beat solutions trying to solve problems for multiple buyers and customers. If you’re building a narrow, specific product for a narrow, specific audience, one of the things you have on your side is a solution focused on a specific domain used by people who have specific domain experience. You may not need a trillion-parameter LLM to provide significant value to your customer. AI products that have a more specific focus and address a very narrow ICP I believe are more likely to succeed than those trying to serve too many use cases—especially when GenAI is being leveraged to deliver the value. I think this can be true even for platform products as well. Narrowing the audience you want to serve also narrows the scope of the product, which in turn should increase the value that you bring to that audience—in part because you probably will have fewer trust, usability, and utility problems resulting from trying to leverage a model for a wide range of use cases.” (17:18)
  • “Probabilistic UIs and LLMs are going to create big problems for product teams, particularly if they lack a set of guiding benchmark use cases. I talk a lot about benchmark use cases as a core design principle in data-rich enterprise products. Why? Because a lot of B2B and enterprise products fall into the game of ‘adding more stuff over time.’ ‘Add it so you can sell it.’ As products and software companies begin to mature, you start having product owners and PMs attached to specific technologies or parts of a product. Figuring out how to improve the customer’s experience over time against the most critical problems and needs they have is a harder game to play than simply adding more stuff — especially if you have no benchmark use cases to hold you accountable. It’s hard to make the product indispensable if it’s trying to do 100 things for 100 people.” (22:48)
  • “Product is a hard game, and design and UX is by far not the only aspect of product that we need to get right. A lot of designers don’t understand this, and they think if they just nail design and UX, then everything else solves itself. The reason the design and experience part is hard is that it’s tied to behavior change– especially if you are ‘disrupting’ an industry, incumbent tool, application, or product. You are in the behavior-change game, and it’s really hard to get it right. But when you get it right, it can be really amazing and transformative.” (28:01)
  • “If your AI product is trying to do a wide variety of things for a wide variety of personas, it’s going to be harder to determine appropriate benchmarks and UX outcomes to measure and design against. Given LLM hallucinations, the increased problem of trust, model drift problems, etc., your AI product has to actually innovate in a way that is both meaningful and observable to the customer. It doesn’t matter what your AI is trying to “fix.” If they can’t see what the benefit is to them personally, it doesn’t really matter if technically you’ve done something in a new and novel way. They’re just not going to care because that question of what’s in it for me is always sitting behind, in their brain, whether it’s stated out loud or not.” (29:32)

 

Links
29 Nov 2022 | 105 - Defining “Data Product” the Producty Way and the Non-technical Skills ML/AI Product Managers Need | 00:41:53

Today I’m discussing something we’ve been talking about a lot on the podcast recently - the definition of a “data product.” While my definition is still a work in progress, I think it’s worth putting out into the world at this point to get more feedback. In addition to sharing my definition of data products (as defined the “producty” way), in today’s episode I also discuss some of the non-technical skills that data product managers (DPMs) in the ML and AI space need if they want to achieve good user adoption of their solutions. I’ll also share my thoughts on whether data scientists can make good data product managers, what a DPM can do to better understand your users and stakeholders, and how product and UX design factor into this role.

Highlights/ Skip to:

  • I introduce my reasons for sharing my definition of a data product (0:46)
  • My definition of a data product (7:26)
  • Thinking the “producty” way (8:14)
  • My thoughts on necessary skills for data PMs (in particular, AI & machine learning product management) (12:21)
  • How data scientists can become good data product managers (DPMs) by taking off the data science hat (13:42)
  • Understanding the role of UX design within the context of DPM (16:37)
  • Crafting your sales and marketing strategies to emphasize the value of your product to the people who can use or purchase it (23:07)
  • How to build a team that will help you increase adoption of your data product (30:01)
  • How to build relationships with stakeholders/customers that allow you to find the right solutions for them (33:47)
  • Letting go of a technical identity to develop a new identity as a DPM who can lead a team to build a product that actually gets used (36:32)
Quotes from Today’s Episode
  • “This is what’s missing in some of the other definitions that I see around data products  [...] they’re not talking about it from the customer of the data product lens. And that orientation sums up all of the work that I’m doing and trying to get you to do as well, which is to put the people at the center of the work that you’re doing and not the data science, engineering, tech, or design. I want you to put the people at the center.” (6:12)
  • “A data product is a data-driven, end-to-end, human-in-the-loop decision support solution that’s so valuable, users would potentially pay to use it.” (7:26)
  • “I want to plunge all the way in and say, ‘if you want to do this kind of work, then you need to be thinking the product-y way.’ And this means inherently letting go of some of the data science-y way of thinking and the data-first kinds of ways of thinking.” (11:46)
  • “I’ve read in a few places that data scientists don’t make for good data product managers. [While it may be true that they’re more introverted,] I don’t think that necessarily means that there’s an inherent problem with data scientists becoming good data product managers. I think the main challenge will be—and this is the same thing for almost any career transitioning into product management—is knowing when to let go of your former identity and wear the right hat at the right time.” (14:24)
  • “Make better things for people that will improve their life and their outcomes and the business value will follow if you’ve properly aligned those two things together.” (17:21)
  • “The big message here is this: there is always a design and experience, whether it is an API, or a platform, a dashboard, a full application, etc. Since there are no null design choices, how much are you going to intentionally shape that UX, or just pray that it comes out good on the other end? Prayer is not really a reliable strategy.  If you want to routinely do this work right, you need to put intention behind it.” (22:33) 
  • “Relationship building is a must, and this is where applying user experience research can be very useful—not just for users, but also with stakeholders. It’s learning how to ask really good questions and learning the feelings, emotions, and reasons why people ask your team to build the thing that they’ve asked for. Learning how to dig into that is really important.” (26:26)
Links
03 Sep 2024 | 151 - Monetizing SAAS Analytics and The Challenges of Designing a Successful Embedded BI Product (Promoted Episode) | 00:49:57

Due to a technical glitch that ended up unpublishing this episode right after it was originally released, Episode 151 is a replay of my conversation with Zalak Trivedi from this past March. Please enjoy our chat if you missed it the first time around!

 

Thanks,

Brian

 

Links

Original Episode: https://designingforanalytics.com/resources/episodes/139-monetizing-saas-analytics-and-the-challenges-of-designing-a-successful-embedded-bi-product-promoted-episode/ 

Sigma Computing: https://sigmacomputing.com

Email: zalak@sigmacomputing.com 

LinkedIn: https://www.linkedin.com/in/trivedizalak/

Sigma Computing Embedded: https://sigmacomputing.com/embedded

About Promoted Episodes on Experiencing Data: https://designingforanalytics.com/promoted

30 May 2023 | 118 - Attracting Talent and Landing a Role in Data Product Management with Kyle Winterbottom | 00:49:23

Today I’m chatting with Kyle Winterbottom, who is the owner of Orbition Group and an advisor/recruiter for companies that are hiring top talent in the data industry. Kyle and I discuss whether the concept of data products has meaningful value to companies, or if it’s in a hype cycle of sorts. Kyle then shares his views on what sets the idea of data products apart from other trends, the well-paid opportunities he sees opening up for product leaders in the data industry, and why he feels being able to increase user adoption and quantify the business impact of your work is also relevant to a candidate’s ability to negotiate higher pay. Kyle and I also discuss the strange tendency for companies to mistakenly prioritize technical skills for these roles, the overall job market for data product leaders, average compensation numbers, and what companies can do to attract this talent.

Highlights/ Skip to:

  • Kyle introduces himself and his company, Orbition Group (01:02)
  • Why Brian invited Kyle on the show to discuss the recruitment of technical talent for data & analytics teams (02:00)
  • Kyle shares what’s causing companies to build out data product teams (04:49)
  • The reason why viewing data as a product seems to be driving better adoption in Kyle’s view (07:22)
  • Does Kyle feel that the concept of data products is mostly hype or meaningful? (11:26)
  • The different levels of maturity Kyle sees in organizations that are approaching him for help hiring data product talent, and how soft skills are often overlooked (15:37)
  • Kyle’s views on who is successfully landing data product manager roles and how that’s starting to change (23:20)
  • Kyle’s observations on the salary bands for data product manager roles and the kind of money people can make in this space (25:41)
  • Brian and Kyle discuss how the skills of DPMs can help these leaders improve earning potential (30:30)
  • Kyle’s observations and advice to companies seeking to improve the data product talent they attract (38:12)
  • How listeners can learn more about Kyle and Orbition Group (47:55)
Quotes from Today’s Episode
  • “I think data products, obviously, there’s starting to get a bit of hype around it, which I’ve got no doubt will start to lead organizations to look down that route, just because they see and hear about other organizations doing it. ... [but] what it’s helping organizations to do is to drive adoption.” — Kyle Winterbottom (05:45)
  • “I think we’re at a point now where it’s becoming more and more clear, day by day, week by week, the there’s more to [the data industry] than just the building of stuff.” – Kyle Winterbottom (12:56)
  • “The whole soft skills piece is becoming absolutely integral because it’s become—you know, it’s night and day now, between the people that are really investing in themselves in that area and how quickly they’re progressing in their career because of that. But yeah, most organizations don’t even think about that.” – Kyle Winterbottom (18:49)
  • “I think nine times out of ten, most businesses overestimate the importance of the technical stuff practically in every role. … Even data analysts, data scientists, all they’re bothered about is the tech stack that they’ve used, [but] there’s a lot more to it than just the tech that they use.” – Kyle Winterbottom (22:56)
  • “There’s probably a big opportunity for really good product people to move into the data space because it’s going to be well paid with lots of opportunity. [It’s] quite an interesting space.” – Kyle Winterbottom (24:05)
  • “As soon as you get to a point where if you can help to drive adoption and then you can quantify the commercial benefit of that adoption to the organization, that probably puts you up near the top in terms of percentile of being important to a data organization.” – Kyle Winterbottom (32:21)
  • “We’re forever talking in our industry about the importance of storytelling. Yeah, I’ve never seen a business once tell a good story about how good it is to work for them, specifically in regards to their data analytics team and telling a story about that.” – Kyle Winterbottom (39:37)
Links
12 Dec 2023 | 132 - Leveraging Behavioral Science to Increase Data Product Adoption with Klara Lindner | 00:42:56

In this conversation with Klara Lindner, Service Designer at diconium data, we explore how behavioral science and UX can be used to increase adoption of data products. Klara describes how she went from having a highly technical career as an electrical engineer and being the founder of a solar startup to her current role in service design for data products. Klara shares powerful insights into the value of user research and human-centered design, including one which stopped me in my tracks during this episode: how the people making data products and evangelizing data-driven decision making aren’t actually following their own advice when it comes to designing their data products. Klara and I also explore some easy user research techniques that data professionals can use, and discuss who should ultimately be responsible for user adoption of data products. Lastly, Klara gives us a peek at her upcoming December 19th, 2023 webinar with The Data Product Leadership Community (DPLC) where she will be going deeper on two frameworks from psychology and behavioral science that teams can use to increase adoption of data products. Klara is also a founding member of the DPLC and was one of the first—if not the very first—design/UX professionals to join.

 

Highlights/ Skip to:

  • I introduce Klara, and she explains the role of Service Design to our audience (00:49)
  • Klara explains how she realized she’s been doing design work longer than she thought by reflecting on the company she founded, Mobisol (02:09)
  • How Klara balances the desire to design great dashboards with the mission of helping end users (06:15)
  • Klara describes the psychology behind user research and her upcoming talk on December 19th at The Data Product Leadership Community (08:32)
  • What data product teams can do as a starting point to begin implementing user research principles (10:52) 
  • Klara gives a powerful example of the type of insight and value even basic user research can provide (12:49)
  • Klara and I discuss a key revelation when it comes to designing data products for users, which is the irony that even developers use intuition as well as quantitative data when building (16:43)
  • What adjustments Klara had to make in her thinking when moving from a highly technical background to doing human-centered design (21:08)
  • Klara describes the two frameworks for driving adoption that she’ll be sharing in her talk at the DPLC on December 19th (24:23)
  • An example of how understanding and addressing adoption blockers is important for product and design teams (30:44)
  • How Klara has seen her teams adopt a new way of thinking about product & service design (32:55)
  • Klara gives her take on the Jobs to be Done framework, which she will also be sharing in her talk at the DPLC on December 19th (35:26)
  • Klara’s advice to teams that are looking to build products around generative AI (39:28)
  • Where listeners can connect with Klara to learn more (41:37)

 

Links
20 Sep 2022 | 100 - Why Your Data, AI, Product & Business Strategies Must Work Together (and Digital Transformation is The Wrong Framing) with Vin Vashishta | 00:45:08

Today I’m chatting with Vin Vashishta, Founder of V Squared. Vin believes that with methodical strategic planning, companies can prepare for continuous transformation by removing the silos that exist between leadership, data, AI, and product teams. How can these barriers be overcome, and what is the impact of doing so? Vin answers those questions and more, explaining why process disruption is necessary for long-term success and giving real-world examples of companies that are adopting these strategies.

 

Highlights/ Skip to:

  • What the AI ‘Last Mile’ Problem is (03:09)
  • Why Vin sees so many businesses reevaluating their offerings and realigning with their core business model (09:01)
  • Why every company today is struggling to figure out how to bridge the gap between data, product, and business value (14:25)
  • How the skillsets needed for success are evolving for data, product, and business leaders (14:40)
  • Vin’s process when he’s helping a team with a data strategy, and what the end result looks like (21:53)
  • Why digital transformation is dead, and how to reframe what business transformation means in today’s day and age (25:03)
  • How Airbnb used data to inform their overall strategy to survive during a time of massive industry disruption, and how those strategies can be used by others as a preventative measure (29:03)
  • Unpacking how a data strategy leader can work backward from a high-level business strategy to determining actionable steps and use cases for ML and analytics (32:52)
  • Who (what roles) are ultimately responsible in an ideal strategy planning session? (34:41)
  • How the C-Suite can bridge business & data strategy and the impact the world’s largest companies are seeing as a result (36:01)
Quotes from Today’s Episode
  • “And when you have that [core business & technology strategy] disconnect, technology goes in one direction, what the business needs and what customers need sort of lives outside of the silo.” – Vin Vashishta (06:06)
  • “Why are we doing data and not just traditional software development? Why are we doing data science and not analytics? There has to be a justification because each one of these is more expensive than the last, each one is, you know, less certain.” – Vin Vashishta (10:36)
  • “[The right people to train] are smart about the technology, but have also lived with the users, have some domain expertise, and the interest in making a bigger impact. Let’s put them in strategy roles.” – Vin Vashishta (18:58)
  • “You know, this is never going to end. Transformation is continuous. I don’t call it digital transformation anymore because that’s making you think that this thing is somehow a once-in-a-generation change. It’s not. It’s once every five years now.” – Vin Vashishta (25:03)
  • “When do you want to have those [business] opportunities done by? When do you want to have those objectives completed by? Well, then that tells you how fast you have to transform if you want to use each one of these different technologies.” – Vin Vashishta (25:37)
  • “You’ve got to disrupt the process. Strategy planning is not the same anymore. Look at how Amazon does it. ... They are destroying their competitors because their strategy planning process is both expert and data model-driven.” – Vin Vashishta (33:44)
  • “And one of the critical things for CDOs to do is tell stories with data to the board. When they sit in and talk to the board, they need to tell those stories about how one data point hit this one use case and the company made $4 million.” – Vin Vashishta (39:33)
Links
29 Aug 2024 | 150 - How Specialized LLMs Can Help Enterprises Deliver Better GenAI User Experiences with Mark Ramsey | 00:52:22

“Last week was a great year in GenAI,” jokes Mark Ramsey—and it’s a great philosophy to have, especially as LLM tools continue to evolve at such a rapid rate. This week, you’ll get to hear my fun and insightful chat with Mark from Ramsey International about the world of large language models (LLMs) and how we make useful UXs out of them in the enterprise.

 

Mark shared some fascinating insights about using a company’s website information (data) as a place to pilot an LLM project, avoiding privacy landmines, and how re-ranking of models leads to better LLM response accuracy. We also talked about the importance of real human testing to ensure LLM chatbots and AI tools truly delight users. From amusing anecdotes about the spinning beach ball on macOS to envisioning a future where AI-driven chat interfaces outshine traditional BI tools, this episode is packed with forward-looking ideas and a touch of humor.

    Highlights/ Skip to:
  • (0:50) Why is the world of GenAI evolving so fast?
  • (4:20) How Mark thinks about UX in an LLM application
  • (8:11) How Mark defines “Specialized GenAI”
  • (12:42) Mark’s consulting work with GenAI / LLMs these days
  • (17:29) How GenAI can help the healthcare industry
  • (30:23) Uncovering users’ true feelings about LLM applications
  • (35:02) Are UIs moving backwards as models progress forward?
  • (40:53) How will GenAI impact data and analytics teams?
  • (44:51) Will LLMs be able to consistently leverage RAG and produce proper SQL?
  • (51:04) Where you can find more from Mark and Ramsey International

 

Quotes from Today’s Episode
  • “With [GenAI], we have a solution that we’ve built to try to help organizations, and build workflows. We have a workflow that we can run and ask the same question [to a variety of GenAI models] and see how similar the answers are. Depending on the complexity of the question, you can see a lot of variability between the models… [and] we can also run the same question against the different versions of the model and see how it’s improved. Folks want a human-like experience interacting with these models… [and] if the model can start responding in just a few seconds, that gives you much more of a conversational type of experience.” - Mark Ramsey (2:38)
  • “[People] don’t understand when you interact [with GenAI tools] and it brings tokens back in that streaming fashion, you’re actually seeing inside the brain of the model. Every token it produces is then displayed on the screen, and it gives you that typewriter experience back in the day. If someone has to wait, and all you’re seeing is a logo spinning, from a UX experience standpoint… people feel like the model is much faster if it just starts to produce those results in that streaming fashion. I think in a design, it’s extremely important to take advantage of that [...] as opposed to waiting to the end and delivering the results. Some models support that, and other models don’t.” - Mark Ramsey (4:35)
  • "All of the data that’s on the website is public information. We’ve done work with several organizations on quickly taking the data that’s on their website, packaging it up into a vector database, and making that be the source for questions that their customers can ask. [Organizations] publish a lot of information on their websites, but people really struggle to get to it. We’ve seen a lot of interest in vectorizing website data, making it available, and having a chat interface for the customer. The customer can ask questions, and it will take them directly to the answer, and then they can use the website as the source information.” - Mark Ramsey (14:04)
  • “I’m not skeptical at all. I’ve changed much of my [AI chatbot searches] to Perplexity, and I think it’s doing a pretty fantastic job overall in terms of quality. It’s returning an answer with citations, so you have a sense of where it’s sourcing the information from. I think it’s important from a user experience perspective. This is a replacement for broken search, as I really don’t want to read all the web pages and PDFs you have that *might* be about my chiropractic care query to answer my actual [healthcare] question.” - Brian O’Neill (19:22)
  • “We’ve all had great experience with customer service, and we’ve all had situations where the customer service was quite poor, and we’re going to have that same thing as we begin to [release more] chatbots. We need to make sure we try to alleviate having those bad experiences, and have an exit. If someone is running into a situation where they’d rather talk to a live person, have that ability to route them to someone else. That’s why the robustness of the model is extremely important in the implementation… and right now, organizations like OpenAI and Anthropic are significantly better at that [human-like] experience.” - Mark Ramsey (23:46)
  • "There’s two aspects of these models: the training aspect and then using the model to answer questions. I recommend to organizations to always augment their content and don’t just use the training data. You’ll still get that human-like experience that’s built into the model, but you’ll eliminate the hallucinations. If you have a model that has been set up correctly, you shouldn’t have to ask questions in a funky way to get answers.” - Mark Ramsey (39:11)
  • “People need to understand GenAI is not a predictive algorithm. It is not able to run predictions, it struggles with some math, so that is not the focus for these models. What’s interesting is that you can use the model as a step to get you [the answers]. A lot of the models now support functions… when you ask a question about something that is in a database, it actually uses its knowledge about the schema of the database. It can build the query, run the query to get the data back, and then once it has the data, it can reformat the data into something that is a good response back." - Mark Ramsey (42:02)
  Links
21 Sep 2021 | 074 - Why a Former Microsoft ML/AI Researcher Turned to Design to Create Intelligent Products from Messy Data with Abhay Agarwal, Founder of Polytopal | 00:44:32
Episode Description

The challenges at the intersection of design and AI are exciting ones to face. Success in that space depends on many things, but one of the most important is instituting the right design language.

For Abhay Agarwal, Founder of Polytopal, when he began to think about design during his time at Microsoft working on systems to help the visually impaired, he realized the necessity of a design language for AI. Stepping away from that experience, he leaned into how to create a new methodology of design centered around human needs. His efforts have helped shift the lens of design towards how people solve problems.

In this episode, Abhay and I go into details on a snippet from his course page for the Stanford d.school, where he claimed that “the foreseeable future would not be well designed, given the difficulty of collaboration between disciplines.” Abhay breaks down how he thinks his design language for AI should work and how to build it out so that everyone in an organization can come to a more robust understanding of AI. We also discuss the future of designers and AI and the ebb and flow of changing, learning, and moving forward with the AI narrative.

In our chat, we covered:

  • Abhay’s background in AI research and what happened to make him move towards design as a method to produce intelligence from messy data. (1:01)
  • Why Abhay has come up with a new design language called Lingua Franca for machine learning products [and his course on this at Stanford’s d.school]. (3:21)
  • How to become more human-centered when building AI products, what ethnographers can uncover, and some of Abhay’s real-world examples. (8:06)
  • Biases in design and the challenges in developing a shared language for both designers and AI engineers. (15:59)
  • Discussing interpretability within black box models using music recommendation systems, like Spotify, as an example. (19:53)
  • How “unlearning” solves one of the biggest challenges teams face when collaborating and engaging with each other. (27:19) 
  • How Abhay is shaping the field of design and ML/AI -- and what’s in store for Lingua Franca. (35:45)
Quotes from Today's Episode

“I certainly don’t think that one needs to hit the books on design thinking or listen to a design thinker describe their process in order to get the fundamentals of a human-centered design process. I personally think it’s something that one can describe to you within the span of a single conversation, and someone who is listening to that can then interpret that and say, ‘Okay well, what am I doing that could be more human-centered?’ In the AI space, I think this is the perennial question.” - Abhay Agarwal (@Denizen_Kane) (6:30)

 

“Show me a company where designers feel at an equivalent level to AI engineers when brainstorming technology? It just doesn’t happen. There’s a future state that I want us to get to that I think is along those lines. And so, I personally see this as, kind of, a community-wide discussion, engagement, and multi-strategy approach.” - Abhay Agarwal (@Denizen_Kane) (18:25)

 

“[Discussing ML data labeling for music recommenders] I was just watching a video about drum and bass production, and they were talking about, “Or you can write your bass lines like this”—and they call it reggaeton. And it’s not really reggaeton at all, which was really born in Puerto Rico. And Brazil does the same thing with their versions of reggae. It’s not the one-drop reggae we think of Bob Marley and Jamaica. So already, we’ve got labeling issues—and they’re not even wrong; it’s just that that’s the way one person might interpret what these musical terms mean” - Brian O’Neill (@rhythmspice) (25:45)

 

“There is a new kind of hybrid role that is emerging that we play into...which is an AI designer, someone who is very proficient with understanding the dynamics of AI systems. The same way that we have digital UX designers, app designers—there had to be apps before they could be app designers—there is now AI, and then there can thus be AI designers.” - Abhay Agarwal (@Denizen_Kane) (33:47)

  Links Referenced
15 Oct 2024 | 154 - 10 Things Founders of B2B SAAS Analytics and AI Startups Get Wrong About DIY Product and UI/UX Design | 00:44:47

Sometimes DIY UI/UX design only gets you so far—and you know it’s time for outside help. One thing prospects from SAAS analytics and data-related product companies often ask me is what things are like in the other guy/gal’s backyard. They want to compare their situation to others like them. So, today, I want to share some of the common “themes” I see that usually are the root causes of what leads to a phone call with me.

 

 

By the time I am on the phone with most prospects who already have a product in market, they’re usually having significant problems with one or more of the following: sales friction (product value is opaque); low adoption/renewal worries (user apathy); customer complaints about UI/UX being hard to use; velocity (the team is doing tons of work, but the leader isn’t seeing progress); and the like.

 

 

I’m hoping today’s episode will explain some of the root causes that may lead to these issues — so you can avoid them in your data product building work!  

 

 

Highlights/ Skip to:
  • (10:47) Design != "front-end development" or analyst work
  • (12:34)  Liking doing UI/UX/viz design work vs. knowing 
  • (15:04)  When a leader sees lots of work being done, but the UX/design isn’t progressing
  • (17:31) Your product’s UX needs to convey some magic IP/special sauce…but it isn’t
  • (20:25) Understanding the tradeoffs of using libraries, templates, and other solutions’ designs as a foundation for your own
  • (25:28) The sunk cost bias associated with POCs and “we’ll iterate on it”
  • (28:31) Relying on UI/UX "customization" to please all customers
  • (31:26) The hidden costs of abstraction of system objects, UI components, etc.  to make life easier for engineering and technical teams
  • (32:32) Believing you’ll know the design is good “when you see it” (and what you don’t know you don’t know)
  • (36:43) Believing that because the data science/AI/ML modeling under your solution was accurate, difficult, and/or expensive, it’s automatically worth paying for

 

 

Quotes from Today’s Episode
  • The challenge is often not knowing what you don’t know about a project. We often end up focusing on building the tech [and rushing it out] so we can get some feedback on it… but product is not about getting it out there so we can get feedback. The goal of doing product well is to produce value, benefits, or outcomes. Learning is important, but that’s not what the objective is. The objective is benefits creation. (5:47)
  • When we start doing design on a project that’s not design actionable, we build debt and sometimes can hurt the process of design. If you start designing your product with an entire green space, no direction, and no constraints, the chance of you shipping a good v1 is small. Your product strategy needs to be design-actionable for the team to properly execute against it. (19:19)
  • While you don’t need to always start at zero with your UI/UX design, what are the parts of your product or application that do make sense to borrow, “steal,” and cheat from? And when does it not? It takes skill to know when you should be breaking the rules or conventions. Shortcuts often don’t produce outsized results—unless you know what a good shortcut looks like. (22:28)
  • A proof of concept is not a minimum valuable product. There’s a difference between proving the tech can work and making it into a product that’s so valuable, someone would exchange money for it because it’s so useful to them. Whatever that value is, these are two different things. (26:40)
  • Trying to do a little bit for everybody [through excessive customization] can often result in nobody understanding the value or utility of your solution. Customization can hide the fact the team has decided not to make difficult choices. If you’re coming into a crowded space… it’s likely not going to be a compelling reason to [convince customers to switch to your solution]. Customization can be a tax, not a benefit. (29:26)
  • Watch for the sunk cost bias [in product development]. [Buyers] don’t care how the sausage was made. Many don’t understand how the AI stuff works, and they probably don’t need to understand how it works. They want the benefits downstream from technology wrapped up in something so invaluable they can’t live without it. Watch out for technically right, effectively wrong. (39:27)
07 Feb 2023 | 110 - CDO Spotlight: The Value and Journey of Implementing a Data Product Mindset with Sebastian Klapdor of Vista | 00:32:52

Today I’m chatting with Dr. Sebastian Klapdor, Chief Data Officer for Vista. Sebastian has developed and grown a successful Data Product Management team at Vista, and it all began with selling his vision to the rest of the executive leadership. In this episode, Sebastian explains what that process was like and what he learned. Sebastian shares valuable insights on how he implemented a data product orientation at Vista, what makes a good data product manager, and why technology usage isn’t the only metric that matters when measuring success. He also shares what he would do differently if he had to do it all over again.

 

Highlights/ Skip to:

  • How Sebastian defines a data product (01:48)
  • Brian asks Sebastian about the change management process in leadership when implementing a data product approach (07:40)
  • The three dimensions that Sebastian and his team measure to determine adoption success (10:22)
  • Sebastian shares the financial results of Vista adopting a data product approach (12:56)
  • The size and scale of the data team at Vista, and how their different roles ensure success (14:30)
  • Sebastian explains how Vista created and grew a team of 35 data product managers (16:47)
  • The skills Sebastian feels data product managers need to be successful at Vista (22:02)
  • Sebastian describes what he would do differently if he had to implement a data product approach at a company again (29:46)
Quotes from Today’s Episode
  • “You need to establish a culture, and that’s often the hardest part that takes the longest: to treat data as an asset, and not to treat it as a byproduct, but to treat it as a product and treat it as a valuable thing.” – Sebastian Klapdor (07:56)
  • “One source of data product managers is taking data professionals. So, you take data engineers, data scientists, or former analysts, and develop them into the role by coaching them [through] the product management skills from the software industry.” – Sebastian Klapdor (17:39)

 

  • “We went out there and we were hiring people in the market who were experienced [Product Managers]. But we also see internal people, actually grooming and growing into all of these roles, both from these 80 folks who have been around before, but also from other areas of Vista.” – Sebastian Klapdor (20:28)

 

  • “[Being a good Product Manager] comes back to the good old classics of collaborating, of being empathetic to where other people are at, their priorities, and understanding where [our] priorities fit into their bigger piece, and jointly aligning on what is valuable for Vista.” – Sebastian Klapdor (22:27)

 

  • “I think there’s nothing more detrimental than saying, ‘Yeah, sure, we can deliver things, and with data, it can do everything.’ And then you disappoint people and you don’t stick to your promises. … If you don’t stick to your promise, it will hurt you.” – Sebastian Klapdor (23:04)
  • “You don’t do the typical waterfall approach of solving business problems with data. You don’t do the approach where a data scientist tries to get some data, builds a model, and hands it over to a data engineer who should productionize it. And then the data engineer comes back and says certain features can’t be productionized because it’s very complex to get the data on a daily basis, or in real time. By doing [this work] in a data product team, you can actually work in an Agile way and you’re super fast at building what we call a minimum lovable product.” – Sebastian Klapdor (26:15)

  • “That was the biggest learning … whom do we staff as data product managers? And what do we expect of a good data product manager? How does a career path look like? That took us a really long time to figure out.” – Sebastian Klapdor (30:18)
  • “We have a big, big, big commitment that we want to start staffing UX designers onto our [data] product teams.” - Sebastian Klapdor (21:12)
Links
19 Apr 2022 | 089 - Reader Questions Answered about Dashboard UX Design | 00:48:16

Dashboards are at the forefront of today’s episode, and so I will be responding to some questions from readers who wrote in to one of my weekly mailing list missives about this topic. I’ve not talked much about dashboards despite their frequent appearance in data product UIs, and in this episode, I’ll explain why. Here are some of the key points and the original questions asked in this episode:

  • My introduction to dashboards (00:00)
  • Some overall thoughts on dashboards (02:50)
  • What the risk is to the user if the insights are wrong or misinterpreted (4:56)
  • Your data outputs create an experience, whether intentional or not (07:13)
  • John asks: How do we figure out exactly what the jobs are that the dashboard user is trying to do? Are they building next year's budget or looking for broken widgets?  What does this user value today? Is a low resource utilization percentage something to be celebrated or avoided for this dashboard user today?  (13:05)
  • Value is not intrinsically in the dashboard (18:47)
  • Mareike asks: How do we provide information in a way that people are able to act upon the presented information? How do we translate the presented information into action? What can we learn about user expectation management when designing dashboard/analytics solutions? (22:00)
  • The change towards predictive and prescriptive analytics (24:30)
  • The upfront work that needs to get done before the technology is in front of the user (30:20)
  • James asks: How can we get people to focus less on the assumption-laden and often restrictive term “dashboard,” and instead worry about designing solutions focused on outcomes for particular personas and workflows that happen to have some or all of the typical ingredients associated with the catch-all term “dashboards”? (33:30)
  • Stop measuring the creation of outputs and focus on the user workflows and the jobs to be done (37:00)
  • The data product manager shouldn’t just be focused on deliverables (42:28)

 

Quotes from Today’s Episode
  • “The term dashboard is almost meaningless today; it seems to mean almost any default home screen in a data product. It also can just mean a report. For others, it means an entire monitoring tool, and for some, it means the summary of a bunch of data that lives in some other reports. The terms are all over the place.” - Brian (@rhythmspice) (01:36)
  • “The big idea that I really want leaders to be thinking about here is you need to get your teams focused on workflows—sometimes called jobs to be done—and the downstream decisions that users want to make with machine-learning or analytical insights.” - Brian (@rhythmspice) (06:12)
  • “This idea of human-centered design and user experience is really about trying to fit the technology into their world, from their perspective as opposed to building something in isolation where we then try to get them to adopt our thing.  This may be out of phase with the way people like to do their work and may lead to a much higher barrier to adoption.” - Brian (@rhythmspice) (14:30)
  • “Leaders who want their data science and analytics efforts to show value really need to understand that value is not intrinsically in the dashboard or the model or the engineering or the analysis.” - Brian (@rhythmspice) (18:45)
  • “There's a whole bunch of plumbing that needs to be done, and it’s really difficult. The tool that we end up generating in those situations tends to be a tool that’s modeled around the data and not modeled around [the customers] mental model of this space, the customer purchase space, the marketing spend space, the sales conversion, or propensity-to-buy space.” - Brian (@rhythmspice) (27:48)
  • “Data product managers should be these problem owners, if there has to be a single entity for this. When we’re talking about different initiatives in the enterprise or for a commercial software company, this really sits with the product management function.” - Brian (@rhythmspice) (34:42)
  • “It’s really important that [data product managers] are not just focused on deliverables; they need to really be the ones that summarize the problem space for the entire team, and help define a strategy with the entire team that clarifies the direction the team is going in. They are not project managers; they are the people responsible for delivering value.” - Brian (@rhythmspice) (42:23)

Links Referenced:

04 Apr 2023 | 114 - Designing Anti-Biasing and Explainability Tools for Data Scientists Creating ML Models with Josh Noble | 00:42:05

Today I’m chatting with Josh Noble, Principal User Researcher at TruEra. TruEra works to improve AI quality by developing products that help data scientists and machine learning engineers combat bias in, and improve the explainability of, their AI/ML models. Throughout our conversation, Josh—who also used to work as a Design Lead at IDEO.org—explains the unique challenges and importance of doing design and user research, even for technical users such as data scientists. He also shares tangible insights on what informs his product design strategy, how to measure product success accurately, and why understanding the current state of a solution matters when trying to improve it.

Highlights/ Skip to:

  • Josh introduces himself and explains why it’s important to do design and user research work for technical tools used by data scientists (00:43)
  • The work that TruEra does to mitigate bias in AI as well as their broader focus on AI quality management (05:10)
  • Josh describes how user roles informed the design of TruEra’s upcoming monitoring product, and the emphasis he places on iterating with users (10:24)
  • How Josh approaches striking a balance between displaying extraneous information in the tools he designs vs. removing explainability (14:28)
  • Josh explains how TruEra measures product success now and how they envision that changing in the future (17:59)
  • The difference Josh sees between explainability and interpretability (26:56)
  • How Josh decided to go from being a designer to getting a data science degree (31:08)
  • Josh gives his take on what skills are most valuable as a designer and how to develop them (36:12)
Quotes from Today’s Episode
  • “We want to make machine learning better by testing it, helping people analyze it, helping people monitor models. Bias and fairness is an important part of that, as is accuracy, as is explainability, and as is more broadly AI quality.” — Josh Noble (05:13)
  • “These two groups, the data scientists and the machine-learning engineer, they think quite differently about the problems that they need to solve. And they have very different toolsets. … Looking at how we can think about making a product and building tools that make sense to both of those different groups is a really important part of user experience.” – Josh Noble (09:04)
  • “I’m a big advocate for iterating with users. To the degree possible, get things in front of people so they can tell you whether it works for them or not, whether it fits their expectations or not.” – Josh Noble (12:15)
  • “Our goal is to get people to think about AI quality differently, not to necessarily change. We don’t want to change their performance metrics. We don’t want to make them change how they calculate something or change a workflow that works for them. We just want to get them to a place where they can bring together our four pillars and build better models and build better AI.” – Josh Noble (17:38)
  • “I’ve always wanted to know what was going on underneath the design. I think it’s an important part of designing anything to understand how the thing that you are making is actually built.” – Josh Noble (31:56)
  • “There’s an empathy-building exercise that comes from using these tools and understanding where they come from. I do understand the argument that some designers make. If you want to find a better way to do something, spending a ton of time in the trenches of the current way that it’s done is not always the solution, right?” – Josh Noble (36:12)
  • “There’s a real empathy that you build and understanding that you build from seeing how your designs are actually implemented that makes you a better teammate. It makes you a better collaborator and ultimately, I think, makes you a better designer because of that.” – Josh Noble (36:46)
  • “I would say to the non-designers who work with designers, measuring designs is not invalidating the designer. It doesn’t invalidate the craft of design. It shouldn’t be something that designers are hesitant to do. I think it’s really important to understand in a qualitative way what your design is doing and understand in a quantitative way what your design is doing.” – Josh Noble (38:18)
Links
07 Jan 2025 | 160 - Leading Product Through a Merger/Acquisition: Lessons from The Predictive Index’s CPO Adam Berke | 00:42:10

Today, I’m chatting with Adam Berke, the Chief Product Officer at The Predictive Index. For 70 years, The Predictive Index has helped customers hire the right employees, and after the merger with Charma, their products now nurture the employee/manager relationship. This is something right up Adam’s alley, as he previously helped co-found the employee and workflow performance management software company Charma before both aforementioned organizations merged back in 2023.

 

You’ll hear Adam talk about the first-time challenges (and successes) that come with integrating two products and two product teams, and why squashing ambiguity by overindexing on preparation (e.g., arriving with new org charts ASAP) is essential during the process.

 

Integrating behavioral science into the world of data is what has allowed The Predictive Index to thrive since the 1950s. While this is the company’s main selling point, Adam explains how the science-forward approach can still create some disagreements–and learning opportunities–with The Predictive Index’s legacy customers.

Highlights/ Skip to:

  • What is The Predictive Index and how does the product team conduct their work (1:24)
  •  Why Charma merged with The Predictive Index (5:11)
  •  The challenges Adam has faced as a CPO since the Charma/Predictive Index merger (9:21)
  • How Predictive Index has utilized behavioral science to remove the guesswork of hiring (14:22)
  • The makeup of the product team that designs and delivers The Predictive Index's products (20:24)
  •  Navigating the clashes between changing science and Predictive Index's legacy customers (22:37)
  •  How The Predictive Index analyzes the quality of their products with multiple user data metrics (27:21)
  • What Adam would do differently if had to redo the merger (37:52)
  •  Where you can find more from Adam and The Predictive Index (41:22)
  Quotes from Today’s Episode
  • “ Acquisitions are complicated. Outside of a few select companies, there are very few that have mergers and acquisitions as a repeatable discipline. More often than not, neither [company in the merger] has an established playbook for how to do this. You’re [acquiring a company] because of its product, team, or maybe even one feature. You have different theories on how the integration might look, but experiencing it firsthand is a whole different thing.  My initial role didn’t exist in [The Predictive Index] before. The rest of the whole PI organization knows how to get their work done before this, and now there’s this new executive. There’s just tons of [questions and confusion] if you don’t go in assuming good faith and be willing to work through the bumps. It’s going to get messy.” - Adam Berke (9:41)
  • “We integrated the teams and relaunched the product. Charma became [a part of the product called] PI Perform, and right away there was re-skinning, redesign, and some back-end architecture that needed to happen to make it its own module. From a product perspective, we’re trying to deliver [Charma’s] unique value prop. That’s when we can start [figuring out how to] infuse PI’s behavioral science into these workflows. We have this foundation. We got the thing organized. We got the teams organized. We were 12 people when we were acquired… and here we are a year later. 150+ new customers have been added to PI Perform because it’s accelerating now that we’re figuring out the product.” - Adam Berke (12:18)
  • “Our product team has the roles that you would expect: a PM, researcher, UX designer, and then one atypical role–a PhD behavioral scientist. [Our product already had] suggested topics and templates [for manager/IC one-on-one meetings], but now we want to make those templates and suggested topics more dynamic. There might be different questions to draw out a better discussion, and our behavioral scientists help us determine [those questions]... [Our behavioral scientists] look at the science, other research, and calibrate [the one-on-one questions] before we implement them into the product.” - Adam Berke (21:04)
  • “We’ve adapted the technology and science over time as they move forward. We want to update the product with the most recent science, but there are customers who have used this product in a certain way for decades in some cases. Our desire is to follow the science… but you can’t necessarily stop people from using the stuff in a way that they used it 20 years ago. We sometimes end up with disagreements [with customers over product changes based on scientific findings], and those are tricky conversations.  But even in that debate… it comes down to all the best practices you would follow in product development in general–listening to your customers, asking that additional ‘why’ question, and trying to get to root causes.” - Adam Berke (23:36)
  • “ We’re doing an upgrade to our platform right now trying to figure out how to manage user permissions in the new version of the product. The way that we did it in the old version had a lot of problems associated… and we put out a survey. “Hey, do you use this to do X?’ We got hundreds of responses and found that half of them were not using it for the reason that we thought they were. At first, we thought thousands of people were going to have deep, deep sensitivities to tweaks in how this works, and now we realize that it might be half that, at best. A simple one-question survey asked about the right problem in the right way can help to avoid a lot of unnecessary thrashing on a product problem that might not have even existed in the first place.” - Adam Berke (35:22)

 

Links Referenced
02 Apr 2024 | 140 - Why Data Visualization Alone Doesn’t Fix UI/UX Design Problems in Analytical Data Products with T from Data Rocks NZ | 00:42:44

This week on Experiencing Data, I chat with a new kindred spirit! Recently, I connected with Thabata Romanowski—better known as "T from Data Rocks NZ"—to discuss her experience applying UX design principles to modern analytical data products and dashboards. T walks us through her experience working as a data analyst in the mining sector, sharing the journey of how these experiences laid the foundation for her transition to data visualization. Now, she specializes in transforming complex, industry-specific data sets into intuitive, user-friendly visual representations, and addresses the challenges faced by the analytics teams she supports through her design business. T and I tackle common misconceptions about design in the analytics field, discuss how we communicate and educate non-designers on applying UX design principles to their dashboard and application design work, and address the problem with "pretty charts." We also explore some of the core ideas in T's Design Manifesto, including principles like being purposeful, context-sensitive, collaborative, and humanistic—all aimed at increasing user adoption and business value by improving UX.

 

Highlights/ Skip to:

  • I welcome T from Data Rocks NZ onto the show (00:00)
  • T's transition from mining to leading an information design and data visualization consultancy. (01:43)
  • T discusses the critical role of clear communication in data design solutions. (03:39)
  • We address the misconceptions around the role of design in data analytics. (06:54) 
  • T explains the importance of journey mapping in understanding users' needs. (15:25)
  • We discuss the challenges of accurately capturing end-user needs. (19:00) 
  • T and I discuss the importance of talking directly to end-users when developing data products. (25:56) 
  • T shares her 'I like, I wish, I wonder' method for eliciting genuine user feedback. (33:03)
  • T discusses her Data Design Manifesto for creating purposeful, context-aware, collaborative, and human-centered design principles in data. (36:37)
  • We wrap up the conversation and share ways to connect with T. (40:49)
Quotes from Today’s Episode
  • "It's not so much that people…don't know what design is, it's more that they understand it differently from what it can actually do..." - T from Data Rocks NZ (06:59)
  • "I think [misconception about design in technology] is rooted mainly in the fact that data has been very tied to IT teams, to technology teams, and they’re not always up to what design actually does.” - T from Data Rocks NZ (07:42) 
  • “If you strip design of function, it becomes art. So, it’s not art… it’s about being functional and being useful in helping people.” - T from Data Rocks NZ (09:06)
  • "It’s not that people don’t know, really, that the word design exists, or that design applies to analytics and whatnot; it’s more that they have this misunderstanding that it’s about making things look a certain way, when in fact... It’s about function. It’s about helping people do stuff better." - T from Data Rocks NZ (09:19)
  • “Journey Mapping means that you have to talk to people... Data is an inherently human thing. It is something that we create ourselves. So, it’s biased from the start. You can’t fully remove the human from the data.” - T from Data Rocks NZ (15:36)
  • “The biggest part of your data product success…happens outside of your technology and outside of your actual analysis. It’s defining who your audience is, what the context of this audience is, and to which purpose do they need that product.” - T from Data Rocks NZ (19:08)
  • “[In UX research], a tight, empowered product team needs regular exposure to end customers; there’s nothing that can replace that.” - Brian O'Neill (25:58)
  • “You have two sides [end-users and data team]  that are frustrated with the same thing. The side who asked wasn’t really sure what to ask. And then the data team gets frustrated because the users don’t know what they want…Nobody really understood what the problem is. There’s a lot of assumptions happening there. And this is one of the hardest things to let go.” - T from Data Rocks NZ (29:38)
  • “No piece of data product exists in isolation, so understanding what people do with it… is really important.” - T from Data Rocks NZ (38:51)
Links
04 Oct 2022 | 101 - Insights on Framing IOT Solutions as Data Products and Lessons Learned from Katy Pusch | 00:39:11

Today I’m chatting with Katy Pusch, Senior Director of Product and Integration for Cox2M. Katy describes the lessons she’s learned around making sure that the “juice is always worth the squeeze” for new users to adopt data solutions into their workflow. She also explains the methodologies she’d recommend to data & analytics professionals to ensure their IOT and data products are widely adopted. Listen in to find out why this former analyst turned data product leader feels it’s crucial to focus on more than just delivering data or AI solutions, and how spending more time upfront performing qualitative research on users can wind up being more efficient in the long run than jumping straight into development.

 

Highlights/ Skip to:

  • What Katy does at Cox2M, and why the data product manager role is so hard to define (01:07)
  • Defining the value of the data in workflows and how that’s approached at Cox2M (03:13)
  • Who buys from Cox2M and the customer problems that Katy’s product solves (05:57)
  • How Katy approaches the zero-to-one process of taking IOT sensor data and turning it into a customer experience that provides a valuable solution (08:00)
  • What Katy feels best motivates the adoption of a new solution for users (13:21)
  • Katy describes how she spends more time upfront before development to ensure she’s solving the right problems for users (16:13)
  • Katy’s views on the importance of data science & analytics pros being able to communicate in the language of their audience (20:47)
  • The differences Katy sees between designing data products for sophisticated data users vs a broader audience (24:13)
  • The methods Katy uses to effectively perform qualitative research and her triangulation method to surface the real needs of end users (27:29)
  • Katy’s views on the most valuable skills for future data product managers (35:24)

 

Quotes from Today’s Episode
  • “I’ve had the opportunity to get a little bit closer to our customers than I was in the beginning parts of my tenure here at Cox2M. And it’s just like a SaaS product in the sense that the quality of your data is still dependent on your customers’ workflows and their ability to engage in workflows that supply accurate data. And it’s been a little bit enlightening to realize that the same is true for IoT.” – Katy Pusch (02:11)

 

  • “Providing insights to executives that are [simply] interesting is not really very impactful. You want to provide things that are actionable and that drive the business forward.” – Katy Pusch (4:43)

 

  • “So, there’s one side of it, which is [the] happy path: figure out a way to embed your product in the customer’s existing workflow. That’s where the most success happens. But in the situation we find ourselves in right now with [this IoT solution], we do have to ask them to change their workflow.”-- Katy Pusch (12:46)

 

  • “And the way to communicate [the insight to other stakeholders] is not with being more precise with your numbers [or adding] statistics. It’s just to communicate the output of your analysis more clearly to the person who needs to be able to make a decision.” -- Katy Pusch (23:15)

 

  • “You have to define ‘What decision is my user making on a repeated basis that is worth building something that it does automatically?’ And so, you say, ‘What are the questions that my user needs answers to on a repeated basis?’ … At its essence, you’re answering three or four questions for that user [that] have to be the most important [...] questions for your user to add value. And that can be a difficult thing to derive with confidence.” – Katy Pusch (25:55)

 

  • “The piece of workflow [on the IOT side] that’s really impactful there is we’re asking for an even higher degree of change management in that case because we’re asking them to attach this device to their vehicle, and then detach it at a different point in time and there’s a procedure in the solution to allow for that, but someone at the dealership has to engage in that process. So, there’s a change management in the workflow that the juice has to be worth the squeeze to encourage a customer to embark in that journey with you.” – Katy Pusch (12:08)

 

  • “Finding people in your organization who have the appetite to be cross-functionally educated, particularly in a data arena, is very important [to] help close some of those communication gaps.” – Katy Pusch (37:03)
