ChatGPT Apps vs n8n vs AgentKit: The Future of AI Automation
OpenAI recently released ChatGPT apps: applications that run directly inside ChatGPT and give it far more capability. This new ecosystem introduces Agent Builder and widget interfaces instead of plain text, which looks like the future of how we'll actually use AI in our daily lives, and it's going to kill so many upcoming AI startups. Hidden in this announcement is the promise of a new app store built specifically for ChatGPT. ChatGPT has 800 million users, making it one of the biggest platforms on the planet, and anyone will be able to publish to this app store, including paid and subscription-based apps. The launch is still three to four months away, which means right now is the golden window for builders to get ahead. In this video, I'm showing you exactly how to build these apps using prompts I'll share, how to get them running, and how to connect them to ChatGPT.

But first, a word from our sponsor, Make.com. Make.com isn't just another automation tool. It's real-time visual orchestration with intelligent, adaptive behavior built in. With agentic automation, your workflows don't just execute, they evolve: they solve problems autonomously, leverage global knowledge, enhance traditional automation, and improve efficiency across every process. With Make.com AI agents, you can describe goals in natural language, and the agents choose the best path forward, connecting tools, handling edge cases, and adapting as your systems grow. Automate at speed with over 3,000 pre-built apps and an AI-assisted no-code builder. Make the complex simple by orchestrating GenAI and LLM-powered workflows, and scale with control using Make Grid, MCP, and advanced analytics that give you full visibility and precision. Whether you're building solo or as part of a team, Make.com helps you turn ideas into action. Click the link in the description and start building today.

In their announcement, OpenAI discussed what's coming next with ChatGPT apps. Developers will be able to publish their apps directly in ChatGPT, where a dedicated directory will let users browse and search for them. They're also releasing monetization, including support for the Agentic Commerce Protocol, which enables instant checkout within ChatGPT. This works especially well for personalized apps such as DoorDash, where you could simply ask ChatGPT to order your food and pay right there within the app. Right now the available apps are quite limited. Custom apps are coming, though, and an app for any specific use case could suddenly take off on the ChatGPT app store, and ChatGPT definitely has the user base for it, with over 700 million people using it each week.

I came across Alex Finn on X, and he outlined a really neat workflow and provided some great prompts. I highly recommend checking out his post. You might be wondering how this actually works. When you prompt ChatGPT for a specific app, it sends that request to an MCP server on the back end. These apps are essentially just MCP servers running behind the scenes, and they simply return a widget. If you haven't configured a widget, they return text instead. The exciting part is that we can build our own MCP server and host it ourselves. ChatGPT only needs the link; it can then send requests, receive widgets, and render them directly in the display.

To actually build out those MCP servers, I have a modified version of the prompt from that post. This prompt guides you through building a ChatGPT app using the MCP SDK and Express to expose a tool that connects to the Google Places API.
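To give you a sense of what the coding agent ends up producing, here's a minimal sketch of that kind of server: an Express app exposing an MCP endpoint with a single tool that calls the Google Places Text Search API. The package names are real, but the tool name, the field mask, and the GOOGLE_PLACES_API_KEY environment variable are assumptions I've made for illustration; your agent's output will look different.

```ts
// Minimal sketch: Express + MCP TypeScript SDK exposing one Google Places tool.
import express from "express";
import { z } from "zod";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";

function buildServer(): McpServer {
  const server = new McpServer({ name: "places-app", version: "0.1.0" });

  server.registerTool(
    "search_places",
    {
      title: "Search places",
      description: "Find places matching a free-text query",
      inputSchema: { query: z.string().describe("e.g. 'restaurants in New York'") },
    },
    async ({ query }) => {
      // Google Places Text Search (New); the field mask here is illustrative.
      const resp = await fetch("https://places.googleapis.com/v1/places:searchText", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "X-Goog-Api-Key": process.env.GOOGLE_PLACES_API_KEY ?? "",
          "X-Goog-FieldMask": "places.displayName,places.formattedAddress,places.rating",
        },
        body: JSON.stringify({ textQuery: query }),
      });
      const data = (await resp.json()) as { places?: unknown[] };
      return { content: [{ type: "text", text: JSON.stringify(data.places ?? [], null, 2) }] };
    }
  );

  return server;
}

const app = express();
app.use(express.json());

// Stateless handling: a fresh server/transport pair per request keeps the sketch simple.
app.post("/mcp", async (req, res) => {
  const server = buildServer();
  const transport = new StreamableHTTPServerTransport({ sessionIdGenerator: undefined });
  res.on("close", () => {
    transport.close();
    server.close();
  });
  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});

app.listen(3000, () => console.log("MCP server on http://localhost:3000/mcp"));
```

Run it with node or tsx and it listens on port 3000, which is the port you'll expose with ngrok in a moment.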
I'm creating a tool that you can query to show places from the map directly inside ChatGPT. ChatGPT calls the tool, which queries the Places API for the information to display, and then renders responsive cards with detailed information inside them (there's a sketch of that widget wiring at the end of this section). The MCP server is going to run locally and then be exposed through ngrok; I'll explain that later on. Since all of this is new technology, I'm providing documentation for the Apps SDK itself along with whatever API I'm using for the specific use case. If you were building a weather app, you'd need to include documentation for the weather API you plan to use. You can simply ask ChatGPT or your coding tool to fetch the API documentation for you, along with any specific information related to that API. I've turned this into a repeatable prompt that you can use. Just plug it into your agent, tell it what you need to do, and it'll fill out the required fields. You'll have a ready-to-use prompt to build ChatGPT apps, just like I did.

I actually found two ways to build ChatGPT apps. The first is building my own MCP server from scratch: I implemented an app that finds the best places near you and displays them as cards right inside ChatGPT. The second is using an existing MCP server. Instead of building one from scratch, you can take an already available MCP server and use it for your own app. For example, I found an MCP server for The Movie Database, a collection of movies and their statistics. I could have used its API and implemented everything myself, but since the MCP server already exists, I can simply use it directly. I converted it into a tool I could use myself in the form of a ChatGPT app.

Here's how this is going to work. You're either going to take the prompts that I give you, or you can create your own using the template prompt. If you want to build using the second method, there's a separate prompt for that: give it to your AI agent and it will figure everything out for you. These prompts are also available in the resources section of our Discord. When it comes to providing the prompt to an AI coding agent, whether it's Cursor, Codex, or Claude Code, all of them will work just fine. Once you've given it the prompt, it's going to start coding. When that's done, the next step for your use case is to get the API key or token. This will be different for every site depending on what you're building, but your AI agent will give you a pretty good overview of how to get that token and where to put it.

After updating your token, you'll need to run the MCP server. It starts running locally on your computer at localhost, but to make it accessible to ChatGPT, you'll need to use ngrok. ngrok allows the server running on your computer to be reachable from the entire internet. Your server stays deployed locally and remains active as long as your computer is on, letting anyone on the internet access and use it. To do this, you'll need to have ngrok installed. It's super simple: just sign up, and it'll give you a few commands for your operating system, including an authentication command. After you're authenticated, you just need to run ngrok http 3000. This is the most common setup, but if you hit errors or it doesn't work, ask your AI agent which port your local server is running on; if it says 3001 or 3002, just change the number accordingly. In most cases, though, it's going to be 3000. Once you run that, ngrok will start up and give you a forwarding address.
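Before we wire it into ChatGPT, here's the piece that makes the results render as cards rather than plain text. Following the Apps SDK conventions at the time of writing, the tool's _meta["openai/outputTemplate"] points at a registered ui:// resource, and ChatGPT renders that resource's HTML with the tool's structuredContent exposed to the page as window.openai.toolOutput. This is a hedged sketch based on the Apps SDK examples; the names, sample data, and inline markup are illustrative, so double-check the current docs since all of this is still in beta.

```ts
// Sketch of the widget wiring for the movie app (names and data are assumptions).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const WIDGET_URI = "ui://widget/movie-cards.html";

// Minimal widget shell; a real app would ship a bundled React/shadcn build
// and reference that bundle here instead of inlining a script.
const WIDGET_HTML = `
  <div id="root"></div>
  <script type="module">
    // ChatGPT exposes the tool's structuredContent as window.openai.toolOutput.
    const movies = (window.openai && window.openai.toolOutput && window.openai.toolOutput.movies) || [];
    document.getElementById("root").innerHTML = movies
      .map((m) => '<div style="border:1px solid #ddd;border-radius:8px;padding:12px;margin:6px 0">'
        + '<strong>' + m.title + '</strong> (' + m.year + ') · rating ' + m.rating + '</div>')
      .join("");
  </script>
`;

export function registerMovieTool(server: McpServer) {
  // The ui:// resource that the tool metadata points at.
  server.registerResource("movie-cards", WIDGET_URI, {}, async () => ({
    contents: [{ uri: WIDGET_URI, mimeType: "text/html+skybridge", text: WIDGET_HTML }],
  }));

  server.registerTool(
    "top_movies",
    {
      title: "Top movies",
      description: "Return popular movies rendered as cards",
      inputSchema: { genre: z.string().optional() },
      outputSchema: {
        movies: z.array(z.object({ title: z.string(), year: z.number(), rating: z.number() })),
      },
      // Tells ChatGPT to render the registered widget instead of plain text.
      _meta: { "openai/outputTemplate": WIDGET_URI },
    },
    async () => {
      // Hypothetical data; swap in a call to the movie-database API or MCP server here.
      const movies = [{ title: "Inception", year: 2010, rating: 8.8 }];
      return {
        content: [{ type: "text", text: `Found ${movies.length} movies` }],
        structuredContent: { movies },
      };
    }
  );
}
```

In a real build, the widget UI would be a separate bundle (this is where the shadcn suggestion later comes in) and the handler would call the actual API, but the shape of the wiring stays the same.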
Copy that link and you'll be able to connect it to ChatGPT. Head into ChatGPT; for this you need a Plus or higher account. Once you're in, go to your settings, then navigate to Apps and Connectors. Before doing anything else, scroll down to advanced settings and make sure developer mode is turned on. Once that's enabled, you'll see a Create option appear. Click on it and it'll open a small panel. In this panel, first name your tool; in our case, we'll call it "movie app". Then write a short description of what the app does. I've noticed that this description doesn't currently help ChatGPT much in recognizing what the app actually does; hopefully they'll improve that over time, since this is still in beta. After adding the description, paste the URL you copied from ngrok, add /mcp at the end of it, and you'll be good to go. If you've set up authentication, which can be a bit complicated and isn't covered in this prompt, you can handle that here. For our purposes, we'll just select no authentication, mark the app as trusted, and create it. It's going to look something like this.

Another tip: I highly recommend asking your AI to use shadcn components while building out the widgets. Otherwise the app will still look great, but this extra touch makes it feel more polished. I didn't include it in the base prompt since it's a bit subjective, but it's definitely worth trying. If you keep making changes to your app after you've connected it, just use the refresh option inside the app. Whenever you're done making edits (as you can see, my movie app is on version six after a few iterations), simply hit refresh to load the latest version.

Now, if I give it a prompt asking about the most famous movies and confirm the permission request, it renders small cards with detailed information right inside ChatGPT. This is truly amazing, because everything is starting to move toward these visual interfaces. ChatGPT, once enabled by other websites, will start pulling in data like this seamlessly. That's why the addition of payment options and the ChatGPT app store really expands the entire ecosystem. It's genuinely exciting to see. The places app works the same way. If I give it a prompt asking it to search for some restaurants in New York, it shows a popup again, and once I confirm, it generates these really clean-looking cards. The images aren't the best quality, but that's just the data it's pulling in; still, all the details are displayed right there on the cards themselves.

Again, this is really how I see people using ChatGPT in the future with other apps, even though these are MCP servers running in the back end. People used to think working with MCP was really difficult, but if you think about it, this is essentially an MCP store. OpenAI has just made it accessible to literally anyone, even those who've never heard of MCP before. It's going to be truly powerful. When the app store officially launches, you probably won't be running apps locally through ngrok anymore. Instead, we'll see much larger MCP systems with proper authentication. I'll be dropping a video on that, showing how to build full systems with this and how to create really solid apps using prompts, so if you're interested, subscribe for that. That brings us to the end of this video. If you'd like to support the channel and help us keep making videos like this, you can do so by using the Super Thanks button below.
As always, thank you for watching and I'll see you in the next one.
Summary
This video explores how to build custom ChatGPT apps using MCP (Model Context Protocol) servers and tools like ngrok to expose locally hosted servers, enabling visual interfaces within ChatGPT. It highlights the upcoming ChatGPT app store and how developers can create and publish apps using prompts, APIs, and no-code automation platforms like Make.com.
Key Points
- OpenAI is launching a ChatGPT app store with monetization and the Agentic Commerce Protocol, enabling developers to build and publish apps directly within ChatGPT.
- ChatGPT apps are powered by MCP servers that return widgets instead of plain text, creating a more interactive and visual AI experience.
- Developers can build custom apps by creating MCP servers using tools like Express and connecting them to APIs such as Google Places or The Movie Database.
- Apps can be hosted locally and made accessible to ChatGPT via ngrok, which exposes local servers to the internet.
- The process involves using AI agents to generate code, configuring API keys, and deploying the app through ChatGPT’s app settings with a URL endpoint.
- Two methods are shown: building MCPs from scratch or reusing existing ones, both accessible via reusable prompts.
- Users need a Plus or higher ChatGPT account and must enable Developer Mode to create and test apps.
- The app interface renders cards with rich data, such as restaurant listings or movie details, directly inside ChatGPT.
- Make.com is promoted as a tool for building agentic automation workflows that can integrate with ChatGPT apps.
- The future of AI automation lies in visual, app-like interfaces within ChatGPT, making AI more accessible to non-developers.
Key Takeaways
- Use reusable prompts to generate ChatGPT app code with AI agents, then deploy using ngrok to expose your local server.
- Build custom apps by creating MCP servers that return widgets, connecting to APIs like Google Places or The Movie Database for rich data display.
- Enable Developer Mode in ChatGPT and use the app creation panel to connect your app via its public URL endpoint.
- Leverage Make.com to build intelligent automation workflows that can power your ChatGPT apps with real-time logic.
- Prepare for the ChatGPT App Store launch by building and testing apps now to gain early access and potential monetization.