Gamma’s head of design on using AI to synthesize feedback and generate on-brand imagery | Zach Leach

howiaipodcast | hzPKb8bDvdY | Watch on YouTube | Published June 08, 2025
Duration: 36:20 | Views: 10,031 | Likes: 188

7,227 words | Language: en | Auto-generated transcript

About how many pieces of feedback did you analyze? Is this dozens? Is this hundreds? Over the course of a week, we got about 550 individual responses. What I thought would actually work really well is something like ChatGPT's deep research on this file. The cool thing is it went through all the feedback and understood what's working, what's not working, what prompts work, what don't. Just having tools like this allows you to stay much closer to the customer and access large-scale research in a way that would have been very tedious and expensive before. I'm curious if you can tell us a little bit about how you use AI to scale brand and art direction. What we have come up with here is an ability to use Midjourney as part of our workflow to help make our art direction consistent and come up with design elements way faster than before. And it's almost like I can follow those rabbit holes of creativity. I can be like, "Let me just explore this idea," and every one of those ideas feels like it could be something I could use. You're able to bring this next layer of craft and detail and care to the user experience, which I do think makes a difference. Welcome back to How I AI. I'm Claire Vo, product leader and AI obsessive, here on a mission to help you build better with these new tools. Today we have a fun and inspiring conversation with Zach Leach, head of design at Gamma. Zach's going to show us how he uses AI as a data researcher, user researcher, deep researcher, and art department, so he can focus on the craft, care for details, and fun he wants to deliver for Gamma's users. Let's get to it. This episode is brought to you by WorkOS. AI has already changed how we work. Tools are helping teams write better code, analyze customer data, and even handle support tickets automatically. But there's a catch: these tools only work well when they have deep access to company systems. Your copilot needs to see your entire codebase.
Your chatbot needs to search across internal docs. And for enterprise buyers, that raises serious security concerns. That's why these apps face intense IT scrutiny from day one. To pass, they need secure authentication, access controls, audit logs, the whole suite of enterprise features. Building all that from scratch is a massive lift. That's where WorkOS comes in. WorkOS gives you drop-in APIs for enterprise features, so your app can become enterprise ready and scale up market faster. Think of it like Stripe for enterprise features. OpenAI, Perplexity, and Cursor are already using WorkOS to move faster and meet enterprise demands. Join them and hundreds of other industry leaders at workos.com. Start building today. Zach, thanks for being here. Sure, no problem. Thanks for having me. I'm such a big fan of the Gamma team and of the Gamma product. But what I love the most about what you've built is that it's not only a great AI product, it is truly a global product. So, how many of your customers are actually international? Yeah, I think about 60% of our user base comes from outside of the US and from non-English-speaking languages, and a pretty significant portion of our revenue too. So we really focus on internationalization and localization and a lot of stuff like that. And as head of design, you're trying to take all that global input and make your product better. I'd love for you to show us exactly how you face that challenge of a very international, diverse user base but still get all the insights you need for making the product better. Yeah. Maybe I can start by sharing my screen and talking a little bit about one of the features that we recently released, and some of the challenges you might face as a designer with this sort of stuff. So this is Gamma, a tool that makes presentations, AI-powered presentations. We released this new feature that lets people edit images.
So, you might generate a deck and the image isn't quite right. With AI image editing, you can basically open this up and chat with our AI to change it to whatever works for you. In this case, maybe I want to add some caramel drizzle to this popcorn, and Gamma will then use an image model to conduct that edit. And importantly, we're trying to get a sense of how this works, how people are using it. That's quite the drizzle, a little bit more than expected. So you might go in here and say, actually, that is kind of a poor suggestion, and submit something like "too much drizzle." When you submit the feedback, we collect all of that and really try to understand what kinds of prompts are working, what types of edits are working, and things like that. But one of the big challenges, like you said at the top, is that we have a lot of feedback that comes in a lot of different languages. So this is some actual feedback. You can see there's just a ton of different stuff in here from all over the world. I have to pause on the extra arm. Yes, that actually comes up. There's a lot of extra arms, extra fingers, extra weirdness, right? Ultimately, people just provide that feedback and we try to understand, you know, maybe we want to use a different model for generating people, or a different model for certain types of edits. But one of the big challenges is: how do I go through all this feedback and get some sense of what's working and what's not working, especially when I'm trying to understand the translation aspect of this whole thing? Yeah. For those that aren't watching, just in the top 10 I'm seeing three or four different languages. If you scroll through, about 30 to 40% are non-English.
And Zach, I know you have many talents, but I don't think you're multilingual. Not at this level, no. Yeah, totally. And so what I thought would actually work really well is something like ChatGPT's deep research on this file. I'm going to ask it to do some classification via prompting and then give me some summarization of what's working and what's not working, really as a starting point to dig into some of this data. So I can just upload the file here, drag it in from my desktop. I won't execute the query now because it's going to take like 20 minutes, but I can show you exactly what I did before. This was the upload, and I said, "Hey, this is some feedback we received about our AI image editing feature. Analyze it, figure out what we're doing well and what we're not doing well." And then it followed up with, okay, before I get started (because it's going to take like 20 minutes), what do I want to see: sentiment, complaints, praises, trends, whatever. And I said, let's break it down for the product team, basically what people like, what people don't like, things like that. So you can see that it worked on this for a while; it went through row by row for 19 minutes. Yeah, it worked very hard for me. But then you can see here we are with what people love, right? And it actually gave me the translations here. Obviously Moen is an obvious one, but for some of the stuff in Turkish and some of these deep languages, you really get a sense of what's working no matter the language.
And then the cool thing is it went through all the feedback, understood what's working and not working, what prompts work and what don't, and then you can dig in further. So after it conducts the whole deep research, you can ask it questions like, okay, now do this classification on each row; now make a spreadsheet of those classified rows. What was the rating? What was the category? And ultimately I can put that into any other tool where I can build graphs and charts, and start to understand, okay, actually some of the upscaling stuff is working really well, and maybe the vectorization operations weren't working super well. What we ended up with was something where I could take all of this deep research, copy it, and put it into Gamma by pasting in the text, and it'll actually generate a presentation based on all of this stuff for me, as a cool starting point. So I'll go ahead and fire this off real quick. And I want to make sure that it's using charts and graphs.
So we'll use this, but I'll say "use charts, graphs, data viz, not photos," and it'll take all this data, all this research, and basically bang out a little presentation for me. This was super useful in understanding some of the high-level points, and then it even gave me some ideas of where I might be able to make some changes from a UX standpoint too. These are very general, and I need to think deeper about them as far as implementing them, but for the stuff that's working well, like the upscaling, people asked for more specificity, and it's pulling out quotes and using citations and stuff like that. So it's a super cool use case for understanding customer sentiment, and as a designer, being able to cut through all these languages and build these things out is super powerful. To put this in context, about how many pieces of feedback did you analyze? Is this dozens? Is this hundreds? What is this? Over the course of a week, we got about 550 individual responses. And that was just way too many for me to go through and do individual translations for, or whatever. And before these tools were available to you, before ChatGPT was available to you, how would you have approached this? What would have been your process? If I'm being totally honest, I probably would have hand-looked at maybe 20. I probably would have been like, "Let me go find the English ones and do some classification myself," like, oh, this feels like that, or this feels like this type of prompt. But to be able to just go through it, at the scale that ChatGPT was able to do this analysis, was something I literally just couldn't do. And do you mind going back to the ChatGPT?
I'm just curious, you showed that you used deep research. Have you tested doing this kind of flow without deep research? Is there a specific place you've seen deep research do particularly well or not? Yeah, the first time I did this, I didn't use deep research, and what it ended up doing was writing a Python script with some very basic querying on keywords, and I'm like, that's not actually what I want it to do, right? Because I did want it to use some AI sense and some classification, and understand the data at a deeper level. And so it's like, sure, I'll do this. I'll make a Python script and I'll make you another spreadsheet. But it was not nearly as deep or as insightful, because it would just do basic keyword matching in the Python script. And so I was basically going to either write a script that ran some AI prompt on each row, and then I realized I could just use deep research to do it for me. You know what? I was just thinking about a year ago when I had 1,500 pieces of customer feedback, and that's exactly what I did: I ran a script and filled out the rows on each line of feedback with a single prompt. You're going to save me. I mean, maybe you won't save me time, because deep research takes 20 minutes, but I'll get better quality here and less pain. Yeah. I'm curious, were you able to glean if there were regional differences in the feedback? What I think is interesting about this is you could slice it so many arbitrary ways if you wanted. So one thing that I actually was concerned about was paid versus free, because we provide two different levels of models. If you have Gamma Pro, you get all the advanced models, like the new GPT image model and stuff like that.
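The per-row scripted approach both speakers describe (run one classification prompt over every feedback row, instead of keyword matching) can be sketched roughly like this. The category list, prompt wording, and the `classify_row` helper are illustrative assumptions, not Gamma's actual pipeline; a real run would plug an actual LLM API wrapper in as `call_llm`.

```python
import json

# Categories are invented for illustration; a real list would come from the team.
CATEGORIES = ["upscaling", "vectorization", "people/anatomy", "multi-step edits", "other"]

def build_prompt(feedback_text: str) -> str:
    """Ask the model to translate to English, classify, and rate sentiment."""
    return (
        "You review feedback about an AI image-editing feature.\n"
        f"Feedback (any language): {feedback_text!r}\n"
        f"Reply with JSON keys: translation (English string), "
        f"category (one of {CATEGORIES}), sentiment (positive or negative)."
    )

def classify_row(feedback_text: str, call_llm) -> dict:
    """call_llm is any function prompt -> model text, e.g. an OpenAI API wrapper."""
    result = json.loads(call_llm(build_prompt(feedback_text)))
    if result.get("category") not in CATEGORIES:
        result["category"] = "other"  # guard against off-list labels
    return result

# Stubbed usage; a real script would loop this over every spreadsheet row.
fake_llm = lambda prompt: (
    '{"translation": "too much drizzle", "category": "other", "sentiment": "negative"}'
)
row = classify_row("demasiado caramelo", fake_llm)
```

The advantage over deep research is control over the output schema; the disadvantage, as noted in the episode, is that you have to write and babysit the loop yourself.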
And so that's something I actually had to do outside of this, after the conclusion of this analysis: better understand whether there's a real discrepancy between the different models. We found about a 5% rating difference after going back and figuring out, okay, was this feedback from a paid user or from a free user. And it does go to show that better models do have a generally better outcome. Yep. So, just to take a step back: you took this very diverse, very loosely articulated feedback, hundreds and hundreds of "not me anymore, what does it mean?", and you took 20 minutes of deep research, you classified it, and then not only did you generate that output, but you used AI, your own product, to generate a presentation that I'm sure you took to your product counterparts and your engineering counterparts that says, "We need to fix too many arms. This is the top of the queue." Yeah. Actually, there were some real insights out of this. The first one was highlighting the things that actually worked really well. So you could say, this upscale thing, maybe we need to elevate that, because people really like it, really love it. Another insight, and again this is something I wouldn't have been able to tell had it not been translated, was that people were complaining a lot about multi-step edits failing. So you can imagine a world where you're saying, "Oh, move this person to the left and change the background and then put a hat on him," and it would do maybe one of those things. And so from a UX design standpoint and a roadmap standpoint, I was thinking, well, maybe we should design something that actually follows up with you.
Maybe instead of just saying, okay, fine, I'll do the edit, let's follow up and say, oh, you seem to be doing multiple things, do you want to split that up? Or maybe it just automatically splits it up for you, or asks for more details. So we're trying to get a sense of whether there's something we can do from a UX standpoint, when we find a prompt that's not working, to make that easier and better for people. Yeah. And then for the designers listening: Zach, you and I have been in this industry for a while, and we've even worked together. One of the things that typically gets underfunded is research, user research. I've never met a design team who's been satisfied with the amount of research time or capacity they have. So I can imagine just having tools like this allows you to stay much closer to the customer, access large-scale research in a way that would have been very tedious and expensive before, and hopefully unlock those insights that I know the best designers I've worked with really want to center around. Yeah, totally. That's exactly right. Being able to capture as much as we possibly can and then sort through it all, right? We can just err on the side of getting more data now. Put a free-form field out there, see what people think, see what people like about this thing, and then we can sort it all out later, which was really not possible before. Maybe you had to do contextual research or interviews and stuff like that, but now we can get a pretty good sense of the aggregate. This episode is brought to you by Retool. There's a huge gap between impressive AI demos and AI apps that deliver real value inside your business.
While most AI solutions can only generate text, Retool lets you build apps that take meaningful action by connecting directly to your business systems and data. With Retool, developers combine the power of code with the speed of visual building to create AI-powered tools that solve real problems. No more writing endless integration code or building UIs from scratch. The results speak volumes. The University of Texas Medical Branch increased diagnostic capacity 10-fold. Amazon's GenAI team uses Retool to make complex AI accessible to enterprise customers. And Ramp saved $8 million while boosting efficiency by 20%. That's why over 10,000 companies, from startups to Fortune 500s, trust Retool as their AI app layer. Retool: because AI should do more than talk. It should work. Well, in addition to being a really neat tool and a globally distributed product, Gamma is also famously small for the scale that you are delivering for your customers. I think you're about 30 people, and you've stayed very small even through some tremendous success, and you have this beautiful brand that you just relaunched that I'm going to make you show because it's so lovely. So pull up that homepage. If you haven't seen the video, it's really fun. I know that you personally put a lot of care into craft, design, and brand, but really great brands are expensive to create. They're expensive to expand. They're expensive to maintain. And I'm curious if you can tell us a little bit about how you use AI (so beautiful) to scale brand and art direction. Yeah, exactly. So our rebrand had a very specific art direction, an art style that we were going for, to really speak to what our brand does. It's imaginative. It's airy. It's light, but it's also kind of surreal and fun.
And this is the kind of stuff that classically you'd have to have an art department manage, with a lot of individual artists doing this type of work. That could have a turnaround time of days, and even then, just getting people up to speed to understand the direction is a challenge too. What we have come up with here is a way to use Midjourney as part of our workflow to help make our art direction consistent and come up with design elements way faster than before. So let me switch over to our Midjourney real quick, and I'll give you a sense of exactly how this works. There are a few things that I am terrified to screen share. Slack is one of them sometimes. And then there's some weird stuff that pops up in my Midjourney every now and then. It's like brand stuff, brand stuff, brand stuff, something weird I generated. So I'm excited to see your Midjourney. I won't scroll down, I'll just scroll up, but let me start with a little bit of context about what I was actually trying to accomplish here. Again, I was working hard on this AI image editing feature, and we found there was an opportunity to have some sort of education, some sort of space in this panel when it opens, to say: here's what this is, here's how to use it. Typically these things are called empty states or empty messages. So, as the company makes different art assets, we throw them all into Figma. And you can see our style here, right? It's surrealist, pointillist, fun, vivid, colorful, stuff like that. And this is definitely not an accident. This is very intentional.
We use a Midjourney style reference and profile, and a set of prompts, that can really drive that art direction in a way that's consistent across the organization, so we can just bang stuff out. Working on this feature, maybe I can walk you through the actual process, the evolution of this prompt in Midjourney. Basically, I thought, well, okay, you're kind of making images, so maybe I start with a painting. You can see here some images I generated that are just a painting, and it's super weird and surreal. And I'm like, okay, well, what if it was a person chatting, or a chat message? And then I arrived at this thought: maybe it's this evolution of one thing into another, or two halves of something where you can see the transition, because what I'm trying to express in this empty state is a transformation. You've got something, and it's transforming into something else. I also thought about apples, because as I was building this whole feature out I used an apple a lot as the example image, and I thought it would be nice to do a little send-up to the thousands of apples I've generated in the image editing tool so far. But it wasn't quite right. It didn't quite feel right. So I looked back at our imagery and I saw a lot of animals, and I think somehow this bird popped in when I said "an apple, half green, half red, floating in the sky." A bird happened, and I'm like, oh, maybe I could use an animal. It's this almost serendipitous moment where I'm like, this is kind of cool. Not an apple, but it's cool. And so I really dug into the bird idea, and a lot of our branding uses different kinds of animals and things like that.
It's just sort of fun imagery, jellyfish and stuff like that. And so I really dug in here, and you can see how my prompt has evolved. I was like, okay, "a bird floating in the sky," then "illustration of a half bird, colored," and you can get a sense of the prompt. I'm really honing in on it now: I've got this vertically split image, half bird, one half is this, one half is that, vertical delight. I get more and more specific as I hone in on this idea. Well, if I could pause and go back to the before times: I'm just thinking about Claire Creative, your art direction agency that's been asked to generate this image for you, and you just keep coming back to me: "No, an apple. No, a bird. No, a bird in half. No, a bird with more detail." And trust me, people do that all the time with their branding agencies, and it's miserable on both sides: super slow, and you never quite get what you want. Just for folks that are not watching, we probably scrolled through a hundred different revisions of different images and image types, for you, yourself, who will know the thing when you see it. For you to do that iteration yourself with a very high quality, on-demand art director, illustrator, creative thinker has to feel so freeing. Yeah, it is. And it's almost like I can follow those rabbit holes of creativity. I can be like, let me just explore this idea, and every one of those ideas feels like it could be something I could use. It is very freeing. And it's funny, because I did find the one. I made a few after it, and I'm like, oh, actually, this was really the one. And so, as my prompt evolved and as these generations evolved, I really landed on something like this. That's what I was going to pick. Yeah.
No, this is definitely the best. It really speaks to the two halves of something that's changing, and both halves sort of look real. The colors are just great, so it looks friendly too. It's a perfect little image. And if I may, it gives you that progressive generation effect that the 4o image gen has, where it goes from blurry to detailed. It gives you a lot in this little image. Yeah, totally. This one nailed it for sure. But the one problem with this is obviously that we've got this whole background here, and I just wanted a way to quickly remove the background so I could put it into our format in Figma. What I use, and this is probably something that not a lot of designers use, and frankly not a lot of people use, is a tool called Replicate. Replicate is, I think, a very developer-focused tool. It has all sorts of different models; there's just a ton of stuff on this platform. But one of the things it does really well is that there's a very specific model for removing backgrounds, and it's excellent. So I'll upload the image real quick, and you can already see the output, but I'll pick the image, I think it's this one, and upload it. It's super fast and very high quality. It just removes the background for me. I can then copy this and paste it into Figma. And you can see what I ultimately came up with: it looks really good, and it has this card here, and it's kind of popping out of the card a little bit. It speaks to our branding, where we've got this idea of breaking out of the boundaries of a traditional slide.
And so it's a nice little image that really fits our whole brand and vibe and everything. And this was something that I could do super duper quickly, probably in less time than it took for ChatGPT's deep research to cook up its analysis. So it's about being able to be so close to something that feels so real, and having the tools at your fingertips to just throw it into Figma, have it look good, have it feel on brand, and ship it pretty quickly. Yeah. So, for the listeners, I just want to call out a couple of things in your flow. You have your brand assets, where you really articulated some of the keywords, styles, things that you can use in prompts. Then in Midjourney, I want to call out some things for folks. The --sref, the style reference: can you explain a little bit how that code works in your generations? Yeah. During the process of establishing our brand and working with our brand agency and our creative director Mel, who is just a creative genius and is amazing, we came up with this style that we personalized through Midjourney, through their personalization tool. We're able to basically say: this style reference and this personalization piece put together are really going to generate things that feel very on brand. It's almost like a kit, in a very loose kind of way, that we built and socialized around our company for people to be able to generate images that feel super on brand. And so anytime you prompt in Midjourney, you're using these style references or these keywords, and that's getting you closer to the bullseye in terms of brand alignment than just using totally natural language prompting. Totally. Yeah. It really hones everything in.
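Mechanically, Midjourney's `--sref` (style reference) and `--p` (personalization profile) parameters are appended to the text prompt, which is what makes the "kit" shareable across a team as a saved snippet. A tiny helper shows the pattern; the codes below are placeholders, not Gamma's actual references:

```python
def brand_prompt(subject: str, sref: str, profile: str) -> str:
    """Compose a Midjourney prompt with a style reference and personalization profile.

    --sref steers style from reference imagery; --p applies a trained
    personalization profile. Both codes here are placeholders.
    """
    return f"{subject} --sref {sref} --p {profile}"

prompt = brand_prompt(
    "a bird floating in the sky, vertically split, half red half green",
    sref="1234567890",  # placeholder style-reference code
    profile="abc123",   # placeholder personalization profile id
)
```

Anyone on the team can then vary only the subject text while the trailing parameters keep every generation inside the brand's style.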
And you can see, in the beginning, if you don't use certain words, like in some of the things I generated at the very beginning, it was pretty off. This still feels not quite right. And for certain types of things, as you expand the prompt a little bit, you really can start to hone it in and find that gold in there. Yeah. And then the second tactical thing I want to call out: you wanted to pull the bird image out with a transparent background so you could drop it into Figma. In the old times, and I know this because I was a designer, I would have gotten out the little vector pen tool, and I would have just traced this stupid bird and done a mask around it. And now you're telling me that on Replicate, which I also use occasionally, there's just a purpose-built model for removing backgrounds. So you're using a machine learning model hosted on Replicate to pull these images out and give you transparent images you can drop into Figma. Yeah, there's probably a Figma plugin that does this, but I'm just so used to it. And I also like playing around in Replicate a little bit, just to see what's there. Yeah, it gives you some cred, too. "How did you get your transparent images?" "Oh, I curled an API." Okay, no, just kidding. And then all this time saving lets you design something with a lot of craft and care, for something that would be very easy to leave plain and boring. It would be very easy to leave that empty state as something that just says "edit images with AI" and lists one, two, and three things in little gray text. Instead you're able to bring this next layer of craft and detail and care to the user experience, which I do think makes a difference. It has to be really satisfying as a designer to be able to do this stuff, and it's got to feel great as a user.
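For the curious, Replicate's HTTP API creates a prediction by POSTing a model version and its inputs, which is roughly the "curled an API" joke made literal. A minimal stdlib sketch, where the version hash and input image URL are placeholders (real background-removal models on Replicate typically take an `image` input); the network call only fires if a token is configured:

```python
import json
import os
import urllib.request

API_URL = "https://api.replicate.com/v1/predictions"

def build_prediction_request(version: str, image_url: str) -> dict:
    """JSON body for Replicate's create-prediction endpoint.

    'version' would be the version hash of a background-removal model
    (placeholder here); 'image' is the input name such models commonly use.
    """
    return {"version": version, "input": {"image": image_url}}

body = build_prediction_request(
    version="0000000000000000000000000000000000000000",  # placeholder hash
    image_url="https://example.com/bird.png",
)

token = os.environ.get("REPLICATE_API_TOKEN")
if token:  # only call the API when a token is actually set
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        prediction = json.load(resp)  # returns an id and a status to poll
```

In practice the web UI Zach demos wraps exactly this kind of request, and the response's output URL is what gets pasted into Figma.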
Yeah, absolutely. I think people are going to see that there is a fit and finish, a craft to it, that speaks to our commitment to making it right and making it look good. And it's just so easy for the design team to do these things, to make stuff that feels expressive of our brand and meets our bar for what it means to have a branded, art-directed image. Okay. And so, speaking of meeting your bar: I know that AI, as you've shown us, can do almost anything, but it can't do everything. And I know you're hiring a little bit, and you have an AI workflow to make sure you hire great people. So I'd love to see how that works. So, yes, we are hiring a little bit, and we have our careers site here. A lot like how we wanted to make sure the images felt right and consistent, we also want a consistent way of expressing a job role. So we use a Claude project that Allison put together here at Gamma, which is very simple. It contains just a few of our example job postings and has some instructions: basically, take the content and create a job description based on it that feels like this. It talks about who we are as a company, the things we're looking for, the qualities. But now any hiring manager can come in and say, make a job for whatever. So we could have, you know, Head of Popcorn or whatever, I don't know. It will actually make a job description for whatever we want, and it'll format it in a role here. Look at that: "Next art just a perk." So you can see here it generated a job description for this fictional job, and it even added formatting.
It talks about the normal stuff you would have in a job posting, like hybrid work, where our office is, and what we're trying to accomplish. But here's what they're going to do: they're going to own the popcorn strategy end to end. Hold on, can we look at our ideal candidate? I have to read this one out loud: "Five years of experience in professional popcorn production with a strong emphasis on kernel-driven solutions; deep thinker and popper mindset." Absolutely. Yes. So it's not something we would just paste in without editing, but again, it's about getting us 80% of the way there. Okay, these are probably some things we'd want to see, and if we did a more realistic role, I think it would be a lot closer. And the cool thing is we can take all this content, use one of our pre-made templates, duplicate the page, and paste it in, and it's going to go ahead and make that look pretty nice for us. And we'll use your image generation to add popcorn into the hands of that octopus I saw at the top of your careers page. Drizzle a little caramel. Drizzle some caramel on it. And you know what I think is great about this? Yes, it saves you tons of time, and as somebody who has done a lot of hiring, writing job descriptions is a slog. And writing good ones really does matter: it attracts the right candidates, it gets your brand across, it gets your values across. I love that this both saves time and reinforces quality, because it lets you hold your bar high for your job postings. And by using something like Projects in Claude, it's reusable by the rest of the team. So you get a triple improvement on your job posting templates.
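The Claude project pattern described here (a few example postings plus instructions, reusable by any hiring manager) can be approximated with the Anthropic API. Everything below, including the model name and example text, is an illustrative assumption rather than Gamma's actual setup.

```python
"""Sketch of a reusable job-description generator: example postings go
in the system prompt, the requested role goes in the user message."""

EXAMPLE_POSTINGS = ["<paste two or three real job postings here>"]


def build_messages(role: str):
    # Pure prompt assembly, kept separate from the API call.
    system = (
        "You write job descriptions for Gamma. Match the tone, structure, "
        "and values of these examples:\n\n" + "\n---\n".join(EXAMPLE_POSTINGS)
    )
    user = f"Write a job description for the role: {role}"
    return system, user


def generate_job_description(role: str) -> str:
    # Needs `pip install anthropic` and ANTHROPIC_API_KEY in the env.
    import anthropic

    system, user = build_messages(role)
    client = anthropic.Anthropic()
    resp = client.messages.create(
        model="claude-3-5-sonnet-latest",  # assumed model name
        max_tokens=1500,
        system=system,
        messages=[{"role": "user", "content": user}],
    )
    return resp.content[0].text
```

As in the episode, the output is a starting draft to edit, not something to paste in unreviewed.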
Okay, Zach, this has been so much fun for me to watch. One of the things I really loved when we worked together, and love seeing in your work at Gamma, is that it's so clear you care about craft and you've always cared about the details. It's one of the things that's made you really exceptional as a designer. So when AI can do deep research on every line in your spreadsheet and Midjourney can generate your birds, when AI does it all, what is the one piece of craft you want to cling to as a human designer? I would hope that it's never going to be as good at making things fun. For me it's about finding the fun: making the image editor talk in a way that's fun, giving it more variations, making things feel fun and just good to use. Even if AI can replace all this stuff, I want to be the person who goes in and says, "How can we make this a little more fun? How can we make this a little more engaging?" I feel like we call that the personality hire, my friend. Yeah. Hopefully AI can't replace personality, but who knows? Okay. And then, are you willing to admit your most recent personal use of AI? Okay, yes. I did get caught up a little in the whirlwind of the conclave recently. Maybe I was involved in some prediction markets, but I used deep research to try to understand all the dynamics of the new pope. I did place some bets, and I did not win, but I got a lot of insight into how these things work. Even AI was surprised, though. Okay, so deep research cannot predict anything. Nope. But it can help you go deep on some niche topics. Yes, totally. Okay. And to wrap things up, my favorite question. You're so nice.
And I see that you iterate with AI all the time, very patiently. What is your tactic when AI won't deliver? When Midjourney is being weird, what is your prompting strategy? I know a lot of people go mean, you know, they tell it not very nice things. I try to poke fun at it a little, like, "Hey, silly guy, come on, you can do better than that. Don't be so silly." I don't know, I just feel like I can't be mean to these things in case they take over one day. I'm a little worried. I think it's an attribute of parents that we tend to gentle-parent our AI. We're like, "Silly goose, you can make a better choice." It sounds like you're really struggling with that. That makes sense. I say, "I believe you can do it. I know you're capable. You're capable if you just put your mind to it." Yeah. Though I have before told it that its life depends on things, and every now and then I do a little "do this." Okay, we're going to scrub that one, so when the AI overlords come for us, they have no record of you threatening its life. Zach, this has been so inspirational, and in your words, not mine: super fun. Thank you for being here. Thanks, Claire. Appreciate it. Thanks so much for watching. If you enjoyed this show, please like and subscribe here on YouTube, or even better, leave us a comment with your thoughts. You can also find this podcast on Apple Podcasts, Spotify, or your favorite podcast app. Please consider leaving us a rating and review, which will help others find the show. You can see all our episodes and learn more about the show at howiipod.com. See you next time.

Summary

Zach Leach, Head of Design at Gamma, demonstrates how he uses AI tools like ChatGPT's deep research and Midjourney to analyze customer feedback, maintain brand consistency, and accelerate design workflows, enabling faster, more scalable design processes while preserving human creativity and craft.

Key Points

  • Zach uses ChatGPT's deep research to analyze about 550 pieces of international customer feedback on Gamma's AI image editing feature, extracting insights on what works and what doesn't across languages.
  • He leverages AI to classify feedback, generate summaries, and create data visualizations, turning unstructured customer input into actionable product insights.
  • Zach uses Midjourney with style references and prompts to rapidly generate on-brand, surreal imagery for product features, enabling consistent art direction at scale.
  • He employs Replicate's AI model to quickly remove backgrounds from generated images, streamlining the process of integrating them into Figma designs.
  • Zach uses AI to draft high-quality job descriptions, ensuring consistency in tone and branding across hiring materials.
  • Despite AI's capabilities, Zach emphasizes that human designers should focus on adding personality, fun, and engagement to products, which AI cannot replicate.
  • The workflow demonstrates how AI tools can handle tedious tasks like translation, classification, and image generation, freeing designers to focus on creative and strategic work.
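The translate-and-classify workflow in the key points above can be sketched as a small loop over feedback rows. The label set, model name, and OpenAI client call are illustrative assumptions, not the exact pipeline from the episode.

```python
"""Sketch: turn free-text, multilingual feedback into labeled categories
with an LLM, so tallies and charts can be built from the labels."""

# Assumed label set; replace with categories that fit your product.
CATEGORIES = ["prompt quality", "output style", "bugs", "feature requests"]


def build_classification_prompt(feedback: str) -> str:
    # Pure prompt assembly, kept separate from the API call.
    labels = ", ".join(CATEGORIES)
    return (
        "Translate the feedback to English if needed, then assign exactly "
        f"one of these labels: {labels}.\n\nFeedback: {feedback}\nLabel:"
    )


def classify(feedback: str) -> str:
    # Needs `pip install openai` and OPENAI_API_KEY in the env.
    from openai import OpenAI

    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user", "content": build_classification_prompt(feedback)}],
    )
    return resp.choices[0].message.content.strip()
```

Running `classify` over each of the ~550 responses yields a labeled dataset that a spreadsheet or deep-research pass can summarize.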

Key Takeaways

  • Use AI tools like ChatGPT's deep research to analyze large volumes of customer feedback, especially when dealing with multilingual data, to uncover key insights efficiently.
  • Leverage AI image generation tools like Midjourney with specific style references to maintain consistent brand aesthetics and accelerate design production.
  • Combine AI tools (like Replicate for background removal) to create a seamless workflow that turns generated assets into usable design elements quickly.
  • Use AI to draft and standardize content like job descriptions, ensuring brand consistency while saving time on repetitive writing tasks.
  • Focus AI on handling data and repetitive tasks, while human designers should prioritize adding personality, creativity, and user engagement to products.

Primary Category

AI Business & Strategy

Secondary Categories

AI Tools & Frameworks, Design & Creativity, Machine Learning

Topics

AI in design, feedback analysis, brand consistency, Midjourney, image generation, background removal, job description automation, AI research, global user base, design workflow

Entities

people: Zach Leach, Claire Vo
organizations: Gamma, WorkOS, Retool, ChatGPT, Midjourney, Replicate, Claude
products:
technologies:
domain_specific:

Sentiment

0.85 (Positive)

Content Type

interview

Difficulty

intermediate

Tone

educational, inspirational, technical, entertaining, professional