From "Vibe Coding" to Architecture Primitives: A deep dive into building a serverless volunteer registry for AWS Community Day Bolivia 2025. Discover how I navigated the "personality" of AI agents, overcame session context hurdles, and why I chose a Multi-Cloud-Ready approach to ensure portability.
Since I was a child, computers have held a magnetic fascination for me. I vividly remember the first time I saw one in my mother’s office at the age of eleven; in that exact moment, I knew precisely what I wanted to do with my life.
Although destiny led me down the path of infrastructure, I never truly abandoned the dream of developing applications. While I have a solid foundation in programming and can hold my own in the terminal, a lack of daily practice and the time required for a deep dive had—until now—prevented me from producing professional-grade software.
Today, however, the landscape has shifted. With the rise of Artificial Intelligence, LLMs, and Generative AI, a new horizon has opened for profiles like mine: the ability to bring complex applications to life without needing to be an expert in deep syntax, by leveraging AI-powered editors that amplify our creative potential.
This adventure began with the organization of AWS Community Day Bolivia 2025. This year, the honor of leading the event fell to the AWS User Group Cochabamba, a team where I serve as one of the leaders. Organizing an event of this magnitude requires impeccable coordination with volunteers; while tools like Google Forms or Sheets are useful, I was looking for something more: a comprehensive, custom-built solution.
My vision was to build a single platform for registering, coordinating, and communicating with our volunteers throughout the event.
Although the development wasn’t finalized in time for the Community Day, the experience was a revelation. This process deeply enriched my understanding of Generative AI, application architecture, and the modern software development lifecycle. Above all, it allowed me to discover the “tips & tricks” of working with AI agents and the delicate human-machine synergy required to achieve the desired result.
As of this writing, Kiro has advanced significantly. Many of the initial limitations I encountered have been addressed—particularly regarding session management and memory—solidifying its position as a cutting-edge tool that leaves previous versions behind.
I must confess, I was quite naive at first. I trusted Kiro’s output almost blindly because, initially at least, it generated the site’s starting structure correctly and surprisingly fast. The project kicked off with a specifications proposal (a Spec) designed by Kiro itself; in my prompt, I asked it to generate a three-tier serverless project: AstroJS for the frontend, FastAPI for the backend, and DynamoDB for data persistence.
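The real backend pairs FastAPI with DynamoDB, but the shape of the core entity can be sketched with the standard library alone. Everything below — the `Volunteer` class, its field names, and the validation rules — is a hypothetical illustration of the pattern, not the project's actual schema:

```python
from dataclasses import dataclass, field, asdict
from uuid import uuid4
import re

# Deliberately simple email check; a real API would use stricter validation.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

@dataclass
class Volunteer:
    """Hypothetical registration record; the real project defines its own schema."""
    full_name: str
    email: str
    team: str = "general"
    volunteer_id: str = field(default_factory=lambda: uuid4().hex)

    def __post_init__(self):
        # Reject obviously malformed input before it ever reaches persistence.
        if not self.full_name.strip():
            raise ValueError("full_name is required")
        if not EMAIL_RE.match(self.email):
            raise ValueError(f"invalid email: {self.email!r}")

    def to_item(self) -> dict:
        # A flat dict is the shape a DynamoDB put_item call would ultimately receive.
        return asdict(self)

v = Volunteer(full_name="Ada Lovelace", email="ada@example.org", team="registration")
print(v.to_item()["team"])  # → registration
```

In a FastAPI handler, a Pydantic model would typically play this role; the dataclass stands in here only to keep the sketch dependency-free.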
During the early stages, I worked with three editors open and deployed directly from my local machine to the User Group’s AWS account. However, the time came to do things right and set up deployment via a CI/CD workflow (because, as the saying goes: “the shoemaker’s son always goes barefoot”). For this, I decided to use AWS CDK. In my experience, managing state files is an additional headache I wanted to avoid, so I ruled out Terraform and Pulumi from the start.
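With CDK, the infrastructure is ordinary Python and CloudFormation keeps the state, which is exactly the headache Terraform and Pulumi's state files add. A minimal sketch of what such a stack could look like — the construct names, asset path, and runtime are my assumptions for illustration, not the project's actual infrastructure code:

```python
from aws_cdk import Stack, aws_dynamodb as dynamodb, aws_lambda as _lambda
from constructs import Construct

class RegistryStack(Stack):
    """Hypothetical stack sketch; requires aws-cdk-lib v2."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Serverless persistence: pay-per-request keeps idle cost near zero.
        table = dynamodb.Table(
            self, "Volunteers",
            partition_key=dynamodb.Attribute(
                name="volunteer_id", type=dynamodb.AttributeType.STRING),
            billing_mode=dynamodb.BillingMode.PAY_PER_REQUEST,
        )

        # FastAPI backend packaged as a Lambda function.
        api_fn = _lambda.Function(
            self, "RegistryApi",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="main.handler",
            code=_lambda.Code.from_asset("registry-api"),
        )
        table.grant_read_write_data(api_fn)
```

Because the synthesized template lives in CloudFormation, `cdk deploy` from a CI/CD pipeline needs no shared state file at all.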
At the start of every task, I would enter a prompt to polish details that didn’t quite look right or weren’t functioning correctly. This is where I hit my first major roadblock: I had four different Kiro sessions that shared no context with one another.
If Kiro detected an API bottleneck while developing the frontend, the tool was essentially “blind” to the other components. This forced me into a constant cycle of copying and pasting to transfer data from one environment to another. To solve this, I decided to centralize everything into a single session. I grouped all the repositories into a root directory so that Kiro could navigate between them with full awareness of how the components interacted.
Below is the local structure I defined to achieve that synchrony:
```
❯ tree -L 1 -a
.
├── .amazonq
├── .devbox
├── .envrc
├── .gitignore
├── .kiro
├── .python-version
├── .venv
├── README.md
├── devbox.json
├── devbox.lock
├── generated-diagrams
├── pyproject.toml
├── registry-api
├── registry-documentation
├── registry-frontend
├── registry-infrastructure
└── uv.lock

9 directories, 8 files
```
The reader will notice that I have Devbox configured in the root directory. I made this choice because Kiro frequently needed to run Python scripts to sanitize repositories, perform searches, or handle troubleshooting tasks. Therefore, I saw the need to provide it with an isolated dependency chain, thus avoiding the installation of unnecessary software on my primary operating system.
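For readers unfamiliar with Devbox: the whole toolchain is declared in `devbox.json`, so the agent's helper scripts run inside a reproducible shell. My actual file may differ; given the `uv.lock` and `pyproject.toml` visible in the tree above, a plausible fragment looks like this:

```json
{
  "packages": ["python@3.12", "uv", "nodejs@20"],
  "shell": {
    "init_hook": ["uv sync"]
  }
}
```

Running `devbox shell` then drops you (or the agent) into an environment with those tools available, without installing anything on the host OS.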
This was the most complex stage and the one that demanded the most time. It was a period of alignment between the AI and myself; the moment where we discovered our character, our limits, and just how much we could tolerate one another. Readers might be skeptical, but after working through dozens of sessions, I can confirm that each one develops a distinct “personality.” There is a subtle but real difference: the speed at which they grasp previous context, the tone of the conversation, and the level of initiative varies from session to session.
For those who haven’t yet experimented with Kiro (or Amazon Q), there are fundamental aspects of working with these agents that must be taken very seriously.
Ignoring these rules comes at a high price: Kiro will generate code that doesn’t work or doesn’t align with your goals. The most dangerous part is that the tool will always confidently assure you that everything is fine, leaving you with a false sense of success.
The breaking point came during an iteration where I requested an architectural adjustment. Kiro misinterpreted the instruction and altered the entire project: it transformed a Serverless architecture into one based on ECS and Aurora. It was a frustrating experience, but a necessary one. At that moment, I decided to establish rules of engagement: I created an “Architecture Primitives” document. In it, I defined strict guidelines on project structure, repository management, and expected behavior when publishing changes.
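To give a feel for what such a "contract" looks like, here is an illustrative excerpt in the spirit of my Architecture Primitives document — the exact wording below is reconstructed for this article, not quoted from the real file:

```markdown
# Architecture Primitives (illustrative excerpt)

1. The architecture is serverless-first (Lambda + DynamoDB). Never introduce
   containers (ECS/EKS) or relational databases (Aurora/RDS) without explicit
   human approval.
2. One repository per component (frontend, api, infrastructure, documentation).
   Never move code across repository boundaries.
3. Never push, merge, or deploy on your own. Propose changes; the human
   reviews and publishes them.
```

Placed where the agent reads it at the start of every session, a document like this turns vague expectations into enforceable rules of engagement.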
From that “contract” onward, the project stabilized. I moved forward with a fluidity that previously seemed impossible, and thanks to that, the beta version materialized much sooner than expected and is already available online.
The project’s architecture is illustrated in the following diagram:
To achieve the goals of cost-efficiency and scalability, I designed a modular structure divided into logical layers. Here is how the components interact:
- **User-facing layer:** This is the first line of contact with the user, prioritizing security and speed.
- **Control plane:** We separated the control planes to ensure the security of sensitive operations.
- **Application layer:** This is where the system’s intelligence resides, utilizing an event-driven microservices approach.
- **Integration core:** The heart of the system uses the Service Registry pattern to avoid tight coupling.
- **Data layer:** A polyglot combination to handle different data types efficiently.
- **Cross-cutting services:** Components that span the entire architecture to ensure health and protection.
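The event-driven approach in the application layer can be sketched in a few lines of standard-library Python. In production this role is played by managed messaging (EventBridge/SQS-style services); the event names and handlers below are hypothetical:

```python
from collections import defaultdict
from typing import Callable

# Event type -> list of subscribed handlers.
_handlers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def subscribe(event_type: str):
    """Decorator that registers a handler for an event type."""
    def decorator(fn):
        _handlers[event_type].append(fn)
        return fn
    return decorator

def publish(event_type: str, payload: dict) -> int:
    """Deliver the payload to every subscriber; return how many handlers ran."""
    for fn in _handlers[event_type]:
        fn(payload)
    return len(_handlers[event_type])

log = []

@subscribe("volunteer.registered")
def send_welcome(payload):
    log.append(f"welcome:{payload['email']}")

@subscribe("volunteer.registered")
def notify_team(payload):
    log.append(f"team:{payload['team']}")

publish("volunteer.registered", {"email": "ana@example.org", "team": "logistics"})
```

The publisher never knows who is listening, which is precisely what keeps the microservices from coupling to one another.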
To prevent our collaboration with AI from descending into the architectural chaos I mentioned earlier, I had to formalize our knowledge into two fundamental pillars. These documents didn’t just guide Kiro; they established the foundation of what I consider modern assisted development.
It isn’t just about writing code; it’s about following principles that guarantee the system’s evolution. Based on the documentation from the Registry Project, we implemented three golden rules covering project structure, repository management, and how changes are published.
Learning to talk to an agent like Kiro requires more than simple instructions; it requires a framework. These are the key takeaways from our Assistance Guidelines.
Although the project lives and breathes in the Amazon Web Services ecosystem, I made a strategic decision from day one: the architecture had to be Multi-Cloud-Ready.
Many might ask: Why complicate the design if we already have the AWS toolset? The answer lies in technical sovereignty and cost control. By implementing patterns like the Service Registry and using Devbox to isolate environments, we avoid the dreaded vendor lock-in.
Designing this way forces us to separate business logic from infrastructure. This means that if the community decided tomorrow to migrate part of the workload to another platform or integrate external services, the core of our application would not suffer a technical trauma. It is an architecture designed for freedom—where AWS is our choice of excellence, but not our only possibility.
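The separation of business logic from infrastructure can be sketched as a ports-and-adapters arrangement: the core depends only on an abstract storage contract, and each cloud (or a plain in-memory backend) supplies an adapter. The class and method names below are my assumptions for illustration, not the project's real interfaces:

```python
from typing import Optional, Protocol

class VolunteerStore(Protocol):
    """Port: the only storage contract the business logic knows about."""
    def save(self, volunteer_id: str, record: dict) -> None: ...
    def get(self, volunteer_id: str) -> Optional[dict]: ...

class InMemoryStore:
    """Adapter for tests or a non-AWS deployment."""
    def __init__(self):
        self._items: dict[str, dict] = {}
    def save(self, volunteer_id, record):
        self._items[volunteer_id] = record
    def get(self, volunteer_id):
        return self._items.get(volunteer_id)

class DynamoDBStore:
    """Adapter for the AWS deployment (boto3 calls elided in this sketch)."""
    def save(self, volunteer_id, record):
        raise NotImplementedError("would call dynamodb.put_item here")
    def get(self, volunteer_id):
        raise NotImplementedError("would call dynamodb.get_item here")

def register_volunteer(store: VolunteerStore, volunteer_id: str, name: str) -> dict:
    # The core logic depends only on the port, never on a concrete cloud SDK.
    record = {"volunteer_id": volunteer_id, "name": name}
    store.save(volunteer_id, record)
    return record

store = InMemoryStore()
register_volunteer(store, "v1", "Ada")
print(store.get("v1")["name"])  # → Ada
```

Migrating a workload then means writing one new adapter, while `register_volunteer` and everything above it stays untouched.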
This project allowed me to experience a new way of working and a new way of thinking about software development. It isn’t about replacing developers; it’s about empowering them—freeing their minds so they can focus on what truly matters: creativity, innovation, and solving complex problems.
Generative AI is a powerful tool, but it requires a human to guide it, correct it, and refine it. That is what I love most about this experience: the fact that there is a human behind every line of generated code, and that this human has the capacity to learn, evolve, and improve.
It is not about letting the AI do everything; it is about learning to work alongside it, leveraging its power to do more, be more, and achieve things that previously seemed impossible.
I hope this journey serves as an inspiration for you to explore new ways of working, thinking, and creating. May it also remind you that the future isn’t something that just happens to us—it is something we build together, with tools, with technology, with AI, and with humanity.
Finally, to all those who have always wanted to develop applications but haven’t been able to: now is the time. From this point forward, nothing is impossible. However, do not go into this process blindly; you must do it with a solid foundation. So, it’s time to hit the books!
Thank you for reading this far, and as always, see you next time!
Site: https://registry.cloud.org.bo
Repositories: