Monday, October 2, 2023

Creating Domestic Robots That Really Help

Episode 2: How Labrador and iRobot Create Domestic Robots That Really Help

Evan Ackerman: I’m Evan Ackerman, and welcome to ChatBot, a new podcast from IEEE Spectrum where robotics experts interview each other about things that they find fascinating. On this episode of ChatBot, we’ll be talking with Mike Dooley and Chris Jones about useful robots in the home. Mike Dooley is the CEO and co-founder of Labrador Systems, the startup that’s developing an assistive robot in the form of a sort of semi-autonomous mobile table that can help people move things around their homes. Before founding Labrador, Mike led the development of Evolution Robotics’ innovative floor-cleaning robots, and when Evolution was acquired by iRobot in 2012, Mike became iRobot’s VP of product and business development. Labrador Systems is getting ready to launch its first robot, the Labrador Retriever, in 2023. Chris Jones is the chief technology officer at iRobot, which is arguably one of the most successful commercial robotics companies of all time. Chris has been at iRobot since 2005, and he spent several years as a senior investigator at iRobot Research working on some of iRobot’s more unusual and experimental projects. iRobot Ventures is one of the investors in Labrador Systems. Chris, you were doing some interesting stuff at iRobot back in the day too. I think a lot of people may not know how diverse iRobot’s robotics projects were.

Chris Jones: I think iRobot as a company, of course, being around since 1990, has done all sorts of things: toys, commercial robots, consumer, military, industrial, all sorts of different things. But yeah, myself in particular, I spent the first seven or eight years of my time at iRobot doing a lot of super fun, far-out-there research projects, a lot of them funded by places like DARPA, working with some great academic collaborators and, of course, a whole crew of colleagues at iRobot. Some of those ranged from completely squishy robots to robot arms, robots that could climb mountainsides, robots under the water: all sorts of different fun, useful, and really challenging (which is what makes it fun) robot concepts.

Ackerman: And those are all getting incorporated into the next-generation Roomba, right?

Jones: I don’t know that I can comment on—

Ackerman: That’s not a no. Yeah. Okay. So Mike, I want to make sure that people who aren’t familiar with Labrador get a good understanding of what you’re working on. So can you describe kind of Labrador’s robot, what it does and why it’s important?

Mike Dooley: Yeah. So at Labrador, we’re developing a robot called the Retriever, and it’s really designed as an extra pair of hands for individuals who have some issue, whether pain, a health issue, or an injury, that impacts their daily activities, particularly in the home. This is a robot designed to help people live more independently, to augment their abilities, and to give them some degree of autonomy back where they’re fighting to keep it because of the issue they’re facing. After we previewed it at CES, it was called a self-driving shelf. It’s designed to be a mobile platform about the size of a side table, but it has the ability to carry things as large as a laundry basket, or you can set dinner plates on it, and it automatically navigates from place to place. It raises up to countertop height when you’re by the kitchen sink and lowers down when you’re by your armchair. And it has the ability to retrieve, too. So it’s a cross between the robots used in warehousing and furniture, mixed together to make something that’s comfortable and safe for the home environment, but it’s really meant to help folks who have some difficulty moving themselves. It’s meant to give them some degree of that independence back, as well as extend the impact of caregivers.

Ackerman: Yeah, I thought that was a fantastic idea when I first saw it at CES, and I’m so glad that you’ve been able to continue working on it. And especially with some support from folks like iRobot, right? Chris, iRobot is an investor in Labrador?

Jones: Correct. Through iRobot Ventures, we’re an early investor in Labrador, and we continue to be super excited about what they’re doing. For us, anyone who has great ideas for how robots can help people, in particular assist people in their home with independent living, is working on what we strongly believe is going to be a great application for robots. And when making investments at that earliest stage, I’ll just add, a lot of it is about the team, right? Mike and the rest of his team are super compelling. That, paired with a vision for something we believe is a great application for robots, makes it an easy decision, right, to say there’s someone we’d like to support. So we love seeing their progress.

Ackerman: Yeah, me too.

Dooley: And we appreciate your support very much. So yeah.

Ackerman: All right, so what do you guys want to talk about? Mike, you want to kick things off?

Dooley: I can lead off. Yeah, so in full disclosure, at some point in my life, I was-- Chris, what’s the official name for an iRobot employee? I forgot what they came up with. It’s not iRoboteer, is it?

Jones: iRoboteer. Yeah.

Dooley: Okay, okay. All right, so I was an iRoboteer in my past life and crossed over with Chris for a number of years. And I know they’ve renovated the building a couple times now, but these products you mentioned, or the robots you mentioned at the beginning, a lot of them are on display in a museum. So my first question for Chris is: can you think of one of those, either one you worked on or one you didn’t, where you go, “Man, this should have taken off,” or you wish it would have? Because there are a lot in there.

Jones: Yes, there are a lot. You’re right. We have a museum, and it has been renovated in the last couple years, Mike, so you should come back and visit and check out the updated museum. How would I answer that? There are so many things in there. I would say one that I have some sentimentality toward, and that I think holds some really compelling promise even though, at least to date, it hasn’t gone anywhere outside of the museum, Evan, is related to the squishy robots I was talking about. In my mind, one of the key challenges in unlocking future value in robots, and in particular in autonomous robots in the home, is manipulation: physical manipulation of the environment in the home. Mike and Labrador are doing a little bit of this, right, by being able to maneuver, pick up, carry, and drop off some things around the home. But a robot that’s able to physically grasp objects, pick them up off the floor or off a counter, open and close doors, all of those things, is kind of the Holy Grail, right, if you can do it cost-effectively and robustly. In the home, there are all sorts of great applications for that. And one of those research projects that’s in the museum was something called the Jamming Gripper. Mike, I don’t know if you remember seeing that at all, but this takes me back. And Evan, I’m sure there are some IEEE stories and such back in the day from this. This was an idea for a very compliant, soft manipulator. It’s not a hand. It’s actually very close to a very soft membrane that’s filled with coffee grounds. So imagine a bag of coffee, right? Very soft and compliant.

But vacuum-packed coffee, where you pull a vacuum on that bag, turns rigid in whatever shape it was in. It’s like a brick, which is a great concept for thinking about robot manipulation. That’s one idea. We spent some research time with some folks in academia, built a huge number of prototypes, and I still feel like there’s something there, a really interesting concept that can help with that more general-purpose manipulation of objects in the home. So Mike, if you want to talk to us about licensing, maybe we can do that for Labrador with all your applications.

Dooley: Yeah. Actually, that’s something you should add. It would probably increase your budget dramatically, but you should add live demonstrations to the museum. See if you can have projects to bring some of those back to life. Because I’m sure I saw it, but I never knew it could do that.

Jones: I mean, maybe we can continue this. There might be a thread from that question into the first one that came to my mind, Mike, when I was thinking about what to ask. And it’s something I have a lot of admiration and respect for in you and how you do your job, which is that you’re super good at engaging and listening to users in their context to understand what their problems are, such that you can best articulate or define or ideate things that could help them address problems they encounter in their everyday life. That then allows you, as a leader, to use that to motivate quick prototype development to get to the next level of testing or validation of “what if this?” And those things may or may not involve duct tape, right? They can be very crude things that are trying to elicit that response or feedback from a user: is this something that would be valuable to you in overcoming some challenges that I’ve observed you having, let’s say, in your home environment? So I’m curious, Mike, how do you think about that process, and how does that translate into shaping a product design or the identification of an opportunity? I’m curious what you’ve learned through Labrador. I know you spent a lot of time in people’s homes to do exactly that. So how do you conduct that work? What are you looking for? How does it guide your development process?

Dooley: The word for what you’re talking about is customer empathy: are you feeling their pain? Are you understanding their need, and how are you connecting with it? My undergrad is in psychology, so I was always interested in what makes people think the way they do. I remember an iRobot study, going into a home. We were on the last day of testing with somebody, a busy mom, and we were testing Braava Jet. It’s a little robot that iRobot sells that’s really good for tight spaces, for spraying and scrubbing floors, like kitchens and bathrooms. And the mom said something, almost with exhaustion. I said, “What is it?” She says, “Does this do as good of a job as you could do?” And I think most people from iRobot would admit, “No. Can it match the elbow grease and all the effort I can put into this?” And she says, “But at least I can set this up, hit a button, and I can go to sleep. At least it’s getting the job done. It’s doing something, and it gives me my time back.” When you hear that, people go, “Well, Roomba is just something that cleans for people.” No. Roomba gives people their time back. And once you’re on that channel, then you start thinking about, “Okay, what can we do more with the product that hits that core need?” So yeah, I think it takes the humbleness to not build the product you want but to build it to the need, and also the humbleness to know where you can meet that need and where you can’t. Because robotics is hard, and we can’t make Rosey yet.

Ackerman: Mike, I’m curious, did you have to make compromises like that? Is there an example you could give with Labrador?

Dooley: Oh, jeez, all the— yeah. I mean, no, Labrador is perfect. No, we go through that all the time. On Labrador, no, we can’t do everything people want. I think there are different ways of saying it: minimum viable product, or good enough. There was somebody at Amazon who used a term—I’m going to blank on it. It was like “wonderful enough” or something, or they have a nicer—

Jones: Lovable?

Dooley: Lovable. Yeah, “lovable enough” or something. And I think that’s what you have to remember: on one hand, you have to have this open heart, that you want to help people. On the other, you have to have a really tight wallet, because you just can’t spend enough to meet everything that people want. A classic example: Labrador goes up and down a certain amount of height. Someone in a wheelchair would love it if we would go up to the upper cabinets above the kitchen sink or other locations. And when you look at that, mechanically we can, but that creates product realities about stability and tilt testing, and we have to fit those. Chris knows that well with Ava, for instance: how heavy the base has to be for every inch you raise the mass above a certain amount. And so we have to set a limit. You have to say, “Hey, here’s the envelope. We’re going to go from this height to this, or we’re going to carry this much, because that’s as much as we could deliver with this sort of function.” And then, is that lovable enough? Is that rewarding enough for people? And I think that’s the hard [inaudible], is that you have to do these deliveries within constraints. Sometimes when I’m talking to folks, they’re either outside robotics or they’re very much on the engineering side and not thinking about the product, and they tend to think that you have to do everything. That’s not how product development works. You have to do just the critical first step, because that makes this a category, and then you can do the next one and the next one. Roomba has gone through an incredible evolution in its functions, how it works, and its performance, from the very first version to what Chris and team offer now. But if they had tried to build today’s version back then, they wouldn’t have been able to achieve it. And others fail because they probably went at it from the wrong angle.

Jones: Evan, you asked about compromises, about operating under constraints. Product development in general, I presume, but certainly robotics, is all about constraints. How do you operate within those? How do you understand where those boundaries are? You have to make those calls as to how you’re going to constrain your solution, right, to make sure it’s something that’s feasible for you to do, that meets a compelling need, and that you can robustly deliver. Getting that entire equation to work means you do have to reckon with those constraints across the board to find the right solve. Mike, I’m curious. You do your user research, you have that customer empathy, you’ve worked through some of the surprising challenges you’ve encountered along the way with Labrador. You ultimately get to a point where you’re able to do pilots in homes, right? Maybe the duct tape is gone, or it’s at least hidden. It’s something that looks and feels more like a product, and you’re actually getting into some type of extended pilot of the product, or the idea of the product, in users’ homes. What are the types of things you’re looking to accomplish with those pilots? What have you learned when you go from “I’ve been watching this user in their home with those challenges” to “Now I’m actually leaving something in their home without me being there and expecting them to be able to use it”? What’s the benefit, or what are the learnings, in conducting that type of work?

Dooley: Yeah, it’s a weird type of experiment, and there are different schools of thought on how you do this stuff. Some people want to go in and research everything to death and be a fly on the wall. And we went through this—I won’t say the source of it—a program we had to go through because of some of the funding that we’re getting from another project. At the beginning, they put up a slide with a quote that I think is from Steve Jobs. I’m sure I’m going to butcher it: people don’t know what they want until I show them, or something. I forget the exact words. And they were saying, “Yeah, that’s true for Steve Jobs, but for you, you can really talk to the customer and they’re going to tell you what they need.” I don’t believe that.

Jones: They need a faster horse, right? They don’t need a car.

Dooley: Yeah, exactly.

Jones: They’re going to tell you they need a faster horse.

Dooley: Yeah, so I’m in the Steve Jobs camp on that. And it’s not because people aren’t intelligent. It’s just that they’re not in that world of knowing what possibilities you’re talking about. So I think there’s a soft skill in between: okay, listen to their pain point. What is the difficulty of it? You’ve got a hypothesis to say, “Okay, out of everything you said, I think there’s an overlap here, and now I want to find out—” and we did that. We did that in the beginning. We did different ways of explaining the concept. The first level was just explaining it over the phone, seeing what people thought of it, and testing it almost neutrally. Say, “Hey, here’s an idea.” And then, “Oh, here’s an idea like Roomba, and here’s an idea like Alexa. What do you like or dislike?” Then we actually built a prototype that was remote-controlled and brought it into their homes, and now we finally do the leave-behind. And the whole thing is, how to say it, you’re sort of releasing it to the world and we get out of the way. It’s like letting a kid go play soccer on their own: you’re not yelling anything, you don’t even watch. You just let it happen. What you’re trying to do is look organically at how people interact with this new reality you’ve created. And what we can see is that the robots—they won’t do this in the future, but right now they talk on Slack. So when a user sends the robot to the kitchen, I can look up and see, “Hey, user one just sent it to the kitchen, and now they’re sending it to their armchair; they’re probably having an afternoon snack. Oh, they sent it to the laundry room. Now they sent it over to the closet. They’re doing the laundry.” And the thing for us was watching how fast people were adopting certain things, and then what they were using it for. And the striking thing was—

Jones: That’s interesting.

Dooley: Yeah, go ahead.

Jones: I was just going to say, that’s interesting, because I’m sure it’s very natural to put the product in someone’s home and have a rigid expectation of, “No, no, this is how you use it. No, no, you’re doing it wrong. Let me show you how to use this.” But what you’re saying is, you’re trying your best to solve their need, but at some point you leave it there, and now you’re back in that empathy mode. It’s, “Now, with this tool, how do you use it?” and you see what happens.

Dooley: I think you said it in a really good way: you’ve changed this variable in the experiment. You’ve introduced this, and now you go back to just observing, just watching what they’re doing with it, being as unintrusive as possible, which is, “We’re not there anymore.” Yeah, the robot’s logging it and we can see that, but it’s just on them. We’re trying to stay out of the process and see how they engage with it. And that’s the thing. We’ve shared it before, but we were seeing that people were using it 90 to 100 times a month, especially after the first month. We were looking at the steady state: would this become a habit or routine, and then what were they using it for?

Jones: So you’re saying when you see that, you have a data point of one, or a small number, but you have such a tangible understanding of the impact this seems to be having that, as an entrepreneur, it gives you a lot of confidence. That may not be visible to people outside the walls just trying to look at what you’re doing in the business; they see one data point, which is harder to grapple with. But you, being that close and understanding the connection between what the product is doing and the needs, that gives you or the team a substantial confidence boost, right? “This is working. We need to scale it. We have to show that this ports to other people in their homes,” et cetera. But it gives you that confidence.

Dooley: Yeah, and then when we take the robots away, because we only have so many and we rotate them, we get the guilt-trip emojis two months later from people: “I miss my robot. When are you going to build a new one?” And all that. So—

Jones: Do people name the robots?

Dooley: Yeah. They immediately do that and come up with creative names for it. One was called Rosey, naturally, but another—I’m forgetting the name she called it—was inspired by a science-fiction AI companion. And there were quite a few different angles on it, because she saw this as her assistant. But yeah, I think, again, for a robot, what you can see in the design—the classic thing at CES is to make a robot with a face and arms that doesn’t really do anything with them but pretends to be humanoid or human-like. We went the entire other route with this. And the fact that people still relate to it that way means that—we’re not trying to be cold or dispassionate. We’re just really interested in: can they get that value? Are they reacting to what the robot is doing, not to the sort of halo that you dressed it up with?

Jones: Yeah, I mean, as you know, with Roomba or Braava and things like that, it’s the same thing. People anthropomorphize them, or project a personality onto them, but that’s not really there, right, in a strong way. So yeah.

Dooley: Yeah, no, and it’s weird. It’s something people do with robots that they don’t do otherwise—people don’t usually name their dishwasher or something. But no, I would have—

Jones: You don’t?

Dooley: Yeah, [inaudible]. I did for a while. The stove got jealous, and then we had this whole thing when the refrigerator got into it.

Ackerman: I’ve heard anecdotally that maybe this was true with PackBots. I don’t know if it’s true with Roombas. People want their robot back. They don’t want you to replace their old robot with a new robot. They want you to fix the old robot so they have that same physical robot. It’s that lovely connection.

Jones: Yeah, certainly with PackBot, on the military-robot side for bomb disposal and things like that, you would get those technicians with a damaged robot who didn’t want a new robot. They wanted this one fixed, right? Because, again, they anthropomorphize, or there is some type of a bond there. And I think that’s been true with all of the robots. It’s something about the mobility that imbues them with some type of a—people project a personality onto them. So they don’t have to be fancy and have arms and faces for people to project that onto them. That seems to be a common trait for any autonomously mobile platform.

Ackerman: Yeah. Mike, it was interesting to hear you say that. You’re being very thoughtful about that, so I’m wondering, Chris, if you can address that a little bit too. I don’t know if they do this anymore, but for a while, the robots would speak to you, I think in a female voice, if they had an issue or needed to be cleaned. I always found that an interesting choice, because the company is now giving the robot a human characteristic that’s very explicit. I’m wondering how much thought went into that, and has it changed over the years, how much you’re willing to encourage people to anthropomorphize?

Jones: It’s a good question. That’s evolved over the years, I would say, from not so much to more of a vocalization coming from the robot for certain scenarios. It is an important part; for some users, that is a primary way of interacting. But I would say more of that type of feedback these days comes through the mobile experience, through the app, which gives the feedback, additional information, and actionable next steps. If you need to empty the dustbin or whatever it is, that’s just a richer place to put that and a more accepted or common way for it to happen. That’s the direction things have trended, but that’s not because we’re trying not to humanize the robot itself. It’s just a more practical place, where people these days will expect it. It’s almost like Mike was saying about the dishwasher and the stove: if everything is trying to talk to you like that, or project its own embodiment into your space, it could be overwhelming. So I think it’s easier to connect people with the right information at the right place and the right time if it’s through the mobile experience.

But it is. That human-robot interaction, or that experience design, is nuanced and tricky. I’m certainly not an expert there myself, but it’s hard to find the right balance, the right mix of what you ask or expect of the user versus what you assume or don’t give them an option on. You also don’t want to overload them with too much information or too many options or too many questions, right, as they try to operate the product. So sometimes you do have to make assumptions, set defaults, right, that maybe can be changed if there’s really a need, though that might require more digging. And Mike, I was curious. That was a question I had for you: you have a physically, meaningfully sized product that’s operating autonomously in someone’s home, right?

Dooley: Yes.

Jones: Roomba can drive around and will navigate, and it’s a little more expected that it might bump into some things as it’s trying to clean up against walls or furniture and all of that. It’s small enough that that isn’t an issue. How do you design for a product of the size that you’re working on? What went into the human-robot interaction side of it to allow people who need to use this in their home, who are not technologists, to take advantage of the great value that you’re trying to deliver for them? It’s got to be super simple. How did you think about that HRI design?

Dooley: There’s a lot wrapped into that. I think the bus-stop metaphor is the first part of it. What’s the simplest way that they can command it? Everybody can relate to “armchair” or “front door,” that sort of thing. So the idea that the robot just goes to these destinations is super simplifying. People get that; it’s almost a nanosecond, how fast they get that metaphor. So that was one part of it. And then you explain the rules of the road, how the robot goes from place to place. It’s got these bus routes, but they’re elastic, and it can go around you if needed. But there are all these types of interactions. Okay, what happens when you’re coming down the hall and the robot’s coming the other way, or somebody else is, and you just walk toward each other? I know in hospitals, robots are programmed to go to the side of the corridor. There’s no side in a home. That’s the stuff. Those are things we still have to iron out, but there are timeouts and there are things—that’s where we’ll be—we’re not doing it yet, but it’d be great to recognize that’s a person, not a closed door, and respond to it. So right now, we have to tell the users, “Okay, it’ll spin for a time to make sure you’re there, but then it’ll give up. And if you really wanted to, you could tell it to go back from your app. You could get out of the way if you want, or you could stop it by doing this.”

And that’ll get refined as we get to market, but those interactions, yeah, you’re right. You have this big robot coming down the hall. And one of the surprising things was, it’s not just people. One of the women in the pilot had a Border Collie, and Border Collies are, by instinct, bred to herd sheep. The robot’s very quiet, but when she would command it, the dog would hear it coming down the hall and put its paw out to stop it, and that became its game. It started herding the robot. So it’s really this weird thing, this metaphor you’re getting at.

Jones: Robots are pretty stubborn. The robot probably just sat there for like five minutes, like, “Come on. Who’s going to blink?”

Dooley: Yeah. Yeah. And the AI we’d love to add—we have to catch up with where you guys are at, or license some of your vision-recognition algorithms—because, first, we’re trying to navigate and avoid obstacles. That’s where all the tech is going in terms of the design and the tiers of safety that we’re doing. But what the user wanted in that case is, if it’s the dog, can you play my voice saying, “Get out,” or “Move,” or “Go away”? Because she sent me a video of this. It was happening to her too: she would send the robot out, the dogs would get all excited, and she’s behind it in her wheelchair. And now the dogs are waiting for her on the other side of the robot, the robot’s wondering what to do, and they’re all in the hall. So yeah, there’s this complication that comes in when you have multiple agents involved.

Ackerman: Maybe one more question from each of you guys. Mike, you want to go first?

Dooley: I’m trying to think. I have one more. When you have new engineers start—let’s say they haven’t worked on robots before. They might be experienced; they’re coming out of school or from other industries. What is some key thing that they learn, or what sort of transformation goes on in their mind, when they finally get in the zone of what it means to develop robots? It’s a really broad question, but there’s sort of a rookie thing.

Jones: Yeah. What’s an aha moment that’s common for people new to robotics? I think this is woven throughout this entire conversation, which is, at a macro level, robots are actually hard. It’s difficult to put the entire electromechanical-software system together. It’s hard to perceive the world. If a robot’s driving around the home on its own, it needs to have a pretty good understanding of what’s around it. Is something there, is something not there? The richer that understanding can be, the more adaptable or personalized the robot can be. But generating that understanding is also hard. Robots have to be built to deal with all of those unanticipated scenarios that they’re going to encounter when they’re let out into the wild. I think it’s surprising to a lot of people how long that long tail of corner cases ends up being. If you ignore one of them, it can end the product, right? If any one of them rears its head often enough, users will stop using the product: “Well, this thing doesn’t work, and this has happened twice to me now in the year I’ve had it. I’m kind of done with it,” right?

So you really have to grapple with the very long tail of corner cases when the technology hits the real world. I think that’s a super surprising one for people who are new to robotics. It’s more than a hardware consumer product company, a consumer electronics company. You do need to deal with those challenges of perception, mobility in the home, the chaos of—specifically, we’re talking about the home environment here, not the more structured environment on the industrial side. And I think everyone has to go through that learning curve of understanding the impact that can have.

Dooley: Yeah. Of the dogs and cats.

Jones: Yeah, I mean, who would have thought cats are going to jump on the thing or Border Collies are going to try to herd it, right? And you don’t learn those things until you get products out there. That’s, Mike, what I was asking you about with pilots and what you hope to learn from the experience there. You have to take that step if you’re going to start figuring out what those elements are going to look like. It’s very hard to do just intellectually, or on paper, or in the lab. You have to let them out there. So that’s a learning lesson there. Mike, maybe a similar question for you, but--

Ackerman: This is the last one, so make it a good one.

Jones: Yep. The last one, it better be a good one, huh? It’s a similar question for you, but maybe cut more toward an entrepreneur in the robotics space. For a robot company to succeed, there are a lot of, I’ll call them, ecosystem partners that have to be there, right? Manufacturing, channel or go-to-market partners, funding to support a capital-intensive development process, and many more. I’m curious, what have you learned, and what do people need to know going into robotics development or looking to be a robotics entrepreneur? What do people miss? What have you seen? Which partners are the most important? And I’m not asking for, “Oh, iRobot’s an investor. Speak nicely on the financial investor side.” That’s not what I’m after. But what have you learned, that you better not ignore this set of partners, because if one of them falls through or doesn’t work or is ineffective, it’s going to be hard for all the other pieces to come together?

Dooley: Yeah, it’s complex. I think, just like you said, robots are hard. When we got acquired by iRobot and we were having some of the first meetings over— it was Mike from software. Halloran.

Ackerman: This was Evolution Robotics?

Dooley: Evolution, yeah. Mike Halloran from iRobot came to Evolution’s office, and he just said, “Robots are hard. They’re really hard.” And at that point we knew there was harmony. We were sort of under this thing. So everything Chris is saying is that all of that is high stakes. You have to be good enough on all those fronts, with all those partners. Some of it is critical-path technology. Depth cameras—that function is really critical to us, and it’s critical that it works well, and then cost and scale. So we have to be flexible about how we deal with that, looking at that chain and how we start at one level and scale it through. You look at, okay, what are the key enabling technologies that have to work? That’s one bucket. Then there are the partnerships on the business side—we’re in a complex ecosystem. I think the other rude awakening when people look at this is like, “Well, as people get older, they have disabilities. That’s what your insurance funds.” It’s like, “No, it doesn’t”—not unless you have specific types of insurance. We’re partnering with Nationwide. They have long-term care insurance—and that’s why they’re working with us—that pays for these sorts of issues and things. Or Medicaid will get into these issues depending on somebody’s need.

And so I think what we’re trying to understand—this goes back to that original question about customer empathy—is how we adjust what we’re doing. We have this vision. I want to help people like my mom, where she is now and where she was 10 years ago, when she was first experiencing difficulties with mobility. And we have to stage that. We have to get through that progression. So who are the people we work with now, where we solve a pain point they have control over, in a way that is economically viable for them? Sometimes that means adjusting a bit of what we’re doing, because it’s just the first step on the long path.

Ackerman: Awesome. Well, thank you both again. This was a great conversation.

Jones: Yeah, thanks for having us and for hosting, Evan and Mike. Great to talk to you.

Dooley: Nice seeing you again, Chris and Evan. Same. Really enjoyed it.

Ackerman: We’ve been talking with Chris Jones from iRobot and Mike Dooley from Labrador Systems about developing robots for the home. And thanks again to our guests for joining us, for ChatBot and IEEE Spectrum. I’m Evan Ackerman.
