
Deliverables

Trajectory

When I started this thesis, I thought I was going to build a tool — something concrete and functional that would help people design and fabricate using AI. I imagined that through platforms like MegaTool, I could give communities an accessible way to generate models or support their projects. But the deeper I went, the more I realized that I wasn’t building a product. I was mapping a space. And what I found was complex, fast-moving, and full of contradictions.

This project ended up sitting between three big domains: AI in design, AI in fabrication, and AI in makerspaces. Each of these could be a full thesis on its own. Trying to work across all of them in just six months was ambitious, but it also showed me that the most valuable thing I could do was trace the edges: to map the current possibilities, limitations, and values emerging around AI in creative spaces.

One of the hardest — but most important — moments was realizing that there is no final prototype. AI moves too quickly. Tools that didn’t exist when I started are now solving problems I was just beginning to name. MegaTool couldn’t survive the shift to private APIs or the reality that most people didn’t need another input-output model generator. And I’m not a developer — I’m one person trying to navigate a space that shifts every two months.

What stuck, though, were the workshops. What became most meaningful were not the tools but the conversations: the reflections, the doubts, the shared excitement, and the constant return to purpose. In every session, I found myself explaining not how to "use AI," but how to think with it, how to see that it's just a pattern-replication system. It doesn't understand meaning. It doesn't create intention. That's still our job.

I learned that people don’t feel overwhelmed by AI. They’re mostly just curious. They want to play, to see what it can do. But very few see it as something that will meaningfully change their work — at least not yet. And that’s okay. My job wasn’t to convince anyone of AI’s magic. It was to show them how to stay grounded, how to test, how to reflect, and how to keep the human part of the process intact.

Through this, I also questioned one of my strongest beliefs: that people shouldn’t use AI unless they understand it. I still think that’s important, but I also see now that it’s complicated. Most of us use tools every day that we don’t fully understand. So maybe it’s not about full understanding — maybe it’s about building just enough awareness to make intentional decisions.

So what’s left? A methodology, a set of ethical reflections, a workshop model, and a community that now knows more about AI than before. There’s no final product. But there are paths to follow: I could co-design an assistant tool for a makerspace, build a simple chatbot that helps troubleshoot fabrication, or just keep holding space for workshops that combine literacy with experimentation.

At the heart of all this is a simple thought: AI should amplify thinking, not replace it. This thesis isn’t an ending. It’s a set of coordinates — an invitation to others to keep questioning, keep reflecting, and keep making — with care.

Project Sustainability Plan

This project was never about creating a single “solution.” From the beginning, I knew that the tools would evolve faster than I could keep up with. So instead of building a final product, I focused on creating a methodology that is flexible, community-centered, and able to grow with the people who use it.

1. Ongoing Workshops and AI Literacy

The workshops are the core of the project's sustainability. They can be repeated and adapted across makerspaces, schools, and labs. They:

Teach AI through hands-on testing of current tools (text-to-image, text-to-3D, CAD, etc.),

Encourage participants to reflect on fabrication, ethics, and purpose,

And most importantly, keep the human element in tech education alive.

Workshops offer a format that can evolve with tools while grounding users in a mindset of critical questioning and play.

2. Online Platform for Open Access

Since not everyone can attend in person, the next step would be to create a website or platform that shares:

AI literacy materials,

Prompts and tool guides,

Workshop formats and templates,

And reflections from the project.

This becomes a living archive and resource hub — and also lowers the barrier to entry for people who are curious but not technical.

3. Tool-Use Checklists and Reflection Guides

One of the most useful outcomes of this process is the idea of a tool-questioning checklist: not for designing, but for understanding what the AI tool actually does and what role it plays in the process. Questions like:

Is this tool for creative exploration or technical tasks?

Does it support ideation, modeling, or fabrication?

Is it open-source or private?

Who is it made for — experts or non-experts?

Does it help you learn, or does it do the work for you?

This helps people quickly position the tool within their process and decide how to use it — or whether to use it at all. It’s a way to reclaim agency over tools that often feel mysterious or too advanced.
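As one possible way of carrying this checklist onto the online platform described above, here is a minimal sketch in Python. The questions are taken directly from the list; everything else (the function name, the interactive format) is illustrative, not an existing tool.

```python
# A minimal sketch of the tool-questioning checklist as an interactive script.
# The questions mirror the list above; the structure and names are illustrative.

CHECKLIST = [
    "Is this tool for creative exploration or technical tasks?",
    "Does it support ideation, modeling, or fabrication?",
    "Is it open-source or private?",
    "Who is it made for: experts or non-experts?",
    "Does it help you learn, or does it do the work for you?",
]


def question_tool(tool_name: str) -> dict[str, str]:
    """Walk through the checklist for one tool and collect free-text answers."""
    print(f"Positioning '{tool_name}' in your process:\n")
    answers = {}
    for question in CHECKLIST:
        answers[question] = input(f"{question}\n> ").strip()
    return answers


if __name__ == "__main__":
    # Hypothetical example: questioning a text-to-3D generator before adopting it.
    notes = question_tool("a text-to-3D generator")
    for question, answer in notes.items():
        print(f"- {question}\n  {answer}")
```

The value is not in the script itself but in the habit it encodes: answers could be collected on the platform as a shared record of how a community decided to use, or not use, a given tool.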

4. Future Development Options

Co-developing with a single makerspace: Working long-term with one space to create tailored support (e.g., AI-based troubleshooting chatbots or fabrication prompts).

Building AI learning assistants: Not tools that replace thinking, but bots that ask you questions, check your fabrication file, or suggest sustainable alternatives. A rough sketch of this idea follows these options.

Partnering with academic and civic institutions: To integrate the methodology into workshops, learning programs, or teacher training.
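To make the learning-assistant idea above more concrete, here is a rough sketch of a pre-flight check that hands questions back to the maker instead of fixing a file silently. It assumes the open-source trimesh library and a mesh exported in millimetres; the function name, file name, and thresholds are illustrative, not an existing tool.

```python
# A rough sketch, not an existing tool: a "learning companion" that inspects a
# fabrication file and returns questions for the maker instead of repairing it.
# Assumes the open-source trimesh library and a mesh exported in millimetres.
import trimesh


def preflight(path: str) -> list[str]:
    """Load a mesh and return reflection prompts rather than automatic fixes."""
    mesh = trimesh.load(path, force="mesh")
    prompts = []
    if not mesh.is_watertight:
        prompts.append("The mesh is not watertight. Do you know where the gaps are?")
    if mesh.extents.min() < 1.0:  # illustrative threshold, assuming millimetres
        prompts.append("One dimension is under 1 mm. Can your printer resolve that?")
    prompts.append(
        f"Bounding box (mm): {mesh.extents.round(1)}. Is this the scale you intended?"
    )
    return prompts


if __name__ == "__main__":
    for prompt in preflight("bracket.stl"):  # hypothetical file name
        print("-", prompt)
```

The point of the sketch is the design choice, not the library: every check ends in a question addressed to the person, which keeps the reflective stance of the workshops inside the tool itself.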

Closing Note

This thesis doesn’t close with a product. It closes with a framework, a methodology, and a conversation. AI is not here to replace makers — it’s here to challenge us to rethink how we make, who gets to make, and what kind of design futures we want to build.

If AI is going to be part of a makerspace, it should not be a magic button. It should be a learning companion — something that supports, guides, and reflects with us, not for us.