Gemini Designer is INSANE... Build Beautiful Websites in Minutes
Google has given us AI products across many fields, and one of the big ones is Google Stitch, their AI designer. It has a ton of potential beyond just designs: you can actually convert them into fully working prototypes as well. Why should you care? Stitch is trained on a lot of Google's data, and because of that, it can create pretty good designs. In this video, I'll show you how I use it properly, and then how you can convert the results into apps with your favorite AI coding tools.

The way Stitch works is simple: you prompt what you want and it designs the UI, then you iterate on top of that. It works for both web and mobile UI. There are two modes, standard and experimental. In standard mode, it's just text to UI: you write the prompt, click "generate designs," and you get your results. With this, you can create up to 350 designs per month. In experimental mode, you can use an image as a reference and generate up to 50 designs per month. Not only that, you can also choose the visuals.

The main thing to know about Stitch is that if you tell it to make everything at once, it's not going to design well. By iterating over your idea instead, it stays consistent and builds what you want. For example, I tried to make the UI of an AI-powered chess app called Knight AI, and it built it with an amazing dark theme. In the very first prompt, I told it to make a chessboard. After it created the chessboard, I went on to create the other screens. The basic idea was to have three types of games: human versus human, human versus AI, and a really novel one where two AI models play against each other. Based on that, it created different screens for each mode. After the battle and home screens were created, the next step was the login and signup pages. When I asked it to do that, you can see in the result that it consistently applied the chess pattern to the login pages and incorporated the design we wanted in our app.
It also automatically created an extra forgot-password screen. Obviously, you're going to hit some problems along the way, but fixing them is pretty easy: you just have to prompt for the fix. For example, there was a problem with the UI here, so I prompted it to change the design. It made the pieces more visible and better aligned, which made the design even more attractive. There also wasn't any logout functionality in the UI, so I asked it to create a settings screen that included a logout option. I then generated two more screens: a result screen that showed the outcome of the match, since that was missing, and a game-analysis screen that displayed a summary of your moves and the complete history of what happened during the game.

Now that the design is complete, I want to convert it into a functioning app, and I have a workflow I'll guide you through. But before we do that, I also want to demo some other designs I made with Stitch. As I already told you, you can also create mobile UI with Stitch. I tried to recreate the Apple Notes app; for that, I got the whole structure from ChatGPT, which gave me the file structure and descriptions for each screen. (As an aside, GPT-5 temporarily has an increased message limit, though standard access is typically 80 messages per 3 hours; after you reach the limit, your responses automatically switch to a smaller model, which is unlimited.) I also tried to design the Arc browser, and even after clearly specifying that it was supposed to be an iOS app, it still generated the UI as an Android app with Material Design as the base, which is something you should know: if you want iOS applications, it's probably not going to be that good. You can see the prompt I gave and what it generated: the home screen, tabs across the top, history, and profile settings.

For the actual workflow, we're going to be using Cline. They have an open-source extension in VS Code that lets you use whichever models you want.
They have a pretty amazing agent, as you're going to see throughout the video. So, let's get into the workflow. The first phase in creating this application is extracting the design system. The stitch folder contains all the designs downloaded from Stitch in HTML format. Using this workflow, we extract the design system and build a fully functioning Next.js front end from it. The workflow first analyzes the stitch folder, then extracts things like the color palette, the fonts, and the layout patterns present in the screens. It also analyzes specific patterns that will be used later for responsiveness. I've written the whole workflow into this Cline rules file, which tells Cline how the workflow works. The first four phases focus on producing documentation, and the fifth phase implements it. That has all been completed: based on the first phase, it implemented this design system, and in the fifth phase, which I'm going to show you, it changed a lot of files based on this design-system file.

If we go to the login screen, we can see that Cline went through the designs and noted things like the purpose of the page, the sections included, and the interactive elements. It did this for all the HTML files in the stitch folder, including the main chessboard. In phase three, it extracts the components we need to put inside the app based on the HTML files. First, it identifies the reusable components, then outputs them into a component-library.md file. There are a bunch of other instructions as well to make sure it extracts the right components.
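To make the design-system extraction concrete, here is a minimal sketch of the idea in TypeScript. This is not Cline's actual implementation, just an illustration under the assumption that the Stitch exports are plain HTML strings with inline styles; `extractDesignTokens` is a hypothetical name.

```typescript
// Sketch: pull design tokens (hex colors, font families) out of a Stitch HTML
// export. Illustrative only; the real workflow is driven by the rules file.
function extractDesignTokens(html: string): { colors: string[]; fonts: string[] } {
  // Collect unique hex color codes (#abc or #aabbcc), normalized to lowercase.
  const colors = Array.from(
    new Set((html.match(/#(?:[0-9a-fA-F]{3}){1,2}\b/g) ?? []).map((c) => c.toLowerCase())),
  );

  // Collect font-family declarations from style attributes or <style> blocks.
  const fonts = Array.from(
    new Set(
      Array.from(html.matchAll(/font-family:\s*([^;"}]+)/g)).map((m) => m[1].trim()),
    ),
  );

  return { colors, fonts };
}

// Example: a fragment resembling a dark-themed Stitch export.
const sample = `
  <body style="background:#111827; color:#F9FAFB; font-family: Inter, sans-serif">
    <button style="background:#6D28D9">Play</button>
  </body>`;

console.log(extractDesignTokens(sample));
// colors: ["#111827", "#f9fafb", "#6d28d9"], fonts: ["Inter, sans-serif"]
```

Running this over every file in the stitch folder and merging the results gives you the raw material for a design-system document like the one the workflow produces.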
In phase five, it implements all the components inside the app based on this file. In phase four, after the components on each page have been decided, it works out the navigation flow between those pages and outputs it to a navigation-flow.md file. It creates user flows showing how users will navigate the product: the instructions use an online shopping store as an example, then map out whatever app it's given using that same framework. You can see the navigation flow it generated after going through our HTML files, and it was pretty accurate. I read through it, and if I go back to the changes that were implemented, this file tells Cline how the pages that are already built should be connected.

Phase five is the implementation phase. As we all know, these LLMs can forget stuff, so we don't directly ask it to implement everything. Instead, it creates an implementation-blueprint.md file that draws on all the documentation from the previous four phases, the components and the design system. After it creates the implementation file, it automatically implements it. You can see here that it looked at the components we defined and implemented them, along with some additional rules. This all lives in one workflow.md file, which we used to start the workflow; I'll have it linked in the description below. The workflow file lets you turn the complete designs you get from Google Stitch into fully functioning apps. To show you the output, you can see it correctly implemented all of the authentication UI, and if we go to the chessboard, that's fully functional too. Right now, this is only a demo and doesn't contain a backend with AI models, so the models you see are selectable right here but not fully implemented, and the board can't move by itself. Still, the UI that was implemented is pretty spot-on.
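The navigation-flow step above can also be sketched in code. The snippet below is a simplified illustration, not the workflow's actual logic: it assumes each exported screen is an HTML string and derives the flow from anchor `href` targets; the screen names and `buildNavigationFlow` are hypothetical.

```typescript
// Sketch: for each exported screen, list which other screens its links point
// to. This approximates what a navigation-flow.md file captures in prose.
function buildNavigationFlow(
  screens: Record<string, string>, // screen name -> HTML source
): Record<string, string[]> {
  const flow: Record<string, string[]> = {};
  for (const [name, html] of Object.entries(screens)) {
    // Pull href targets like "signup.html" out of anchor tags, keeping only
    // links that resolve to another known screen.
    const targets = Array.from(html.matchAll(/href="([^"#]+)"/g))
      .map((m) => m[1].replace(/\.html$/, ""))
      .filter((t) => t in screens);
    flow[name] = Array.from(new Set(targets));
  }
  return flow;
}

// Hypothetical screens echoing the chess app's auth pages.
const screens = {
  login: `<a href="signup.html">Sign up</a><a href="forgot.html">Forgot?</a>`,
  signup: `<a href="login.html">Back to login</a>`,
  forgot: `<a href="login.html">Back</a>`,
};

console.log(buildNavigationFlow(screens));
// { login: ["signup", "forgot"], signup: ["login"], forgot: ["login"] }
```

A map like this is exactly what tells the implementation phase which Next.js routes need to link to which.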
There might be a few errors, like misalignments, because it doesn't have screenshots to work from. But implementing from the HTML files is far better than working from screenshots, because it clones the UI one-to-one and turns it into a functioning app. In phase six, it builds a validation checklist that scans the already-built app and compares it with the design files and the implementation, making sure everything was implemented correctly. It focuses on design and UI aspects, validating that all of those things have been completed. It scans the app and writes the validation criteria into a validation-checklist.md file. You can see that when it reached this step in the workflow, it verified everything: the colors, the fonts, and even the smallest details like border radius. It also validates components to ensure the design implementation is correct. These AI models do hallucinate, and testing is really important, so it goes through all of that and gives us a scorecard outlining any fixes that still need to be made, so we can go back and apply them.

I also wish Google Stitch had an MCP server, because then we could convert the designs directly into working code, just like with Figma. And speaking of Figma, Cline actually has it in its MCP store if you want to use that. We also have a full video on how to build apps with Figma's amazing new AI tools.
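One piece of the phase-six validation, checking that every design color survived into the implementation, can be sketched as follows. This is an assumption-laden illustration, not the checklist's real mechanics; `validateColors` is a made-up helper.

```typescript
// Sketch: report any hex color from the design files that never appears in
// the implemented app's CSS. A real checklist covers fonts, spacing, border
// radius, and components too.
function validateColors(designCss: string, implCss: string): string[] {
  const hex = (s: string) =>
    new Set((s.match(/#(?:[0-9a-f]{3}){1,2}\b/gi) ?? []).map((c) => c.toLowerCase()));
  const wanted = hex(designCss);
  const got = hex(implCss);
  // Any design color missing from the implementation is a fix to apply.
  return [...wanted].filter((c) => !got.has(c));
}

// Hypothetical tokens: the accent color was dropped during implementation.
const design = "background:#111827; accent:#6d28d9; text:#f9fafb";
const impl = "background:#111827; text:#f9fafb";
console.log(validateColors(design, impl)); // → ["#6d28d9"]
```

The returned list plays the role of the scorecard: each entry is a discrepancy you prompt the agent to go back and fix.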
Summary
The video demonstrates how to use Google Stitch's AI design tool to create beautiful web and mobile UIs, then convert them into fully functional apps using Cline's AI-powered workflow in VS Code.
Key Points
- Google Stitch is an AI tool that generates UI designs from text prompts, supporting both web and mobile interfaces.
- Stitch offers two modes: standard (text-to-design) and experimental (image-to-design), with monthly usage limits.
- Effective design requires iteration—prompting step-by-step leads to more consistent and accurate results.
- The author created a chess AI app design, including login, game modes, and analysis screens, all with consistent styling.
- Stitch can generate additional screens like forgot password and settings automatically based on context.
- To convert designs into functional apps, the author uses Cline's VS Code extension and a structured workflow.
- The workflow extracts design systems (colors, fonts, layout), components, and navigation flows from HTML exports.
- Cline generates an implementation blueprint and builds a Next.js app based on the extracted design system.
- The process includes a validation checklist to ensure UI fidelity between the design and the final app.
- While Stitch's AI can hallucinate, the workflow enables error detection and fixes through systematic validation.
Key Takeaways
- Use Google Stitch iteratively with specific prompts to build consistent and high-quality UIs.
- Leverage AI tools like Cline to convert AI-generated designs into fully functional apps without coding from scratch.
- Follow a structured workflow: extract design system → components → navigation → implementation → validation.
- Use a component library and navigation flow file to guide AI in building accurate app implementations.
- Always validate the final app against the original design to catch and fix discrepancies.