Stop Fixing AI Code and Make It Respect Your Guidelines
In this talk, I focus on using Model Context Protocol (MCP) to make generative AI practical and reliable in large iOS codebases. The goal is to show real production workflows, not theory or demos that only work for small projects.
I explain what MCP is and why purely prompt-based AI often fails in large organisations. I show how MCP lets AI tools draw on structured, trusted context, such as internal guidelines, architecture rules, and reusable iOS components.
A key part of the talk is a real-world example. I demonstrate how to generate a SwiftUI view using only a Figma design link. By connecting a Figma MCP server with a custom internal MCP server, the AI understands both the design intent (tokens, layout, components) and the internal UI libraries that must be used. The generated code follows the design system and existing APIs instead of generic SwiftUI code.
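To make the outcome concrete, here is a minimal sketch of the kind of SwiftUI view such a pipeline could produce. All design-system names (`DSText`, `DSButton`, `DesignTokens`) are hypothetical placeholders for a company's internal UI library; the point is that the model emits these APIs rather than raw SwiftUI primitives.

```swift
import SwiftUI

// Illustrative output: the view is composed from internal design-system
// components and token values (all names below are hypothetical),
// instead of hard-coded colors, fonts, and spacing.
struct ProfileCard: View {
    let name: String
    let onFollow: () -> Void

    var body: some View {
        VStack(alignment: .leading, spacing: DesignTokens.Spacing.medium) {
            DSText(name, style: .headline)          // token-driven typography
            DSButton("Follow", style: .primary,      // internal button API
                     action: onFollow)
        }
        .padding(DesignTokens.Spacing.large)
        .background(DesignTokens.Color.surface)
    }
}
```

Without the MCP context, the same Figma link typically yields generic `Text`/`Button` code with literal hex colors and point sizes, which is exactly the drift the talk addresses.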
I also share best practices for using MCP at scale: how to design MCP servers, define agent skills and custom instructions, control context size, and introduce AI-assisted coding in large teams without reducing code quality or maintainability.
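As a rough illustration of the wiring involved, an MCP client configuration might register both the Figma server and a custom internal server side by side. The exact file location, package names, and flags vary by client and are assumptions here, not a prescribed setup.

```json
{
  "mcpServers": {
    "figma": {
      "command": "npx",
      "args": ["-y", "figma-mcp-server", "--stdio"],
      "env": { "FIGMA_API_KEY": "<token>" }
    },
    "internal-ui": {
      "command": "swift",
      "args": ["run", "InternalUIMCPServer"]
    }
  }
}
```

With both servers registered, a single prompt containing a Figma link can resolve design tokens from one server and component APIs from the other.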
This talk is aimed at mid-to-senior iOS engineers, tech leads, and architects who are interested in AI-assisted development, design systems, and improving developer productivity in complex environments.



