Think about thinking

This is my first blog post and I'm really excited to begin this journey with you. It's actually the first in a series of 25 pieces I'm writing: an exploration of the invisible systems shaping how we sense, think, and create.

I should say upfront that this blog isn't about AI. It's about the hidden patterns at work everywhere around us. But I'm starting here, with artificial intelligence, because it's the most visible example of Systems Intelligence we've collectively experienced. It's the perfect entry point because it's happening to all of us, right now: the shift from thinking about systems to thinking with them.

ChatGPT makes the feedback loops impossible to ignore. Once you see them here, you start noticing them everywhere: in how restaurants orchestrate experience, how scents trigger memory, how spaces shape behavior.

So I've been thinking about thinking lately. Not in the abstract, academic way, but in that everyday noticing way. How we frame things without realizing we're framing them.

There's this quote from Systems Intelligence research that captures it: "It is a well-known fact of cognitive science and creativity research that re-framing is key to new opportunities, higher productivity and to creativity at large. Thinking about thinking is about identifying one's favoured framing patterns, challenging them and adjusting them."

We begin with AI not because it's the subject, but because it's the clearest window into what this is really about: becoming conscious of the systems we're already part of.

Here's what that actually feels like when you're living it...

The mirror effect

When you ask ChatGPT to help organize your thoughts, sometimes you become aware of how scattered or structured they already are. When you refine a prompt to get better results, you're actually mapping out what you really want to know. When it misunderstands your question, you realize how much context you carry that you never say out loud. When you interact, ask questions, explore ideas, and get feedback, you start noticing how you think.

After a conversation, you might catch yourself weighing what you actually learned against what you already knew.

I studied electrical and computer engineering, then artificial intelligence and innovation, adding cognitive science and design along the way, all to understand how we make sense of things, machines and humans alike.

Then I realized: AI is applied philosophy.

You're constantly encountering questions like: What does it mean to understand something? How do I know what I know? What's the difference between information and meaning? These aren't abstract anymore when you're in dialogue with AI. They're real choices about how you ask questions and interpret answers.

Here's what fascinates me: humans and machines both try to make sense of the world, just through different forms.

Different hardware, same idea

Consider how you experience things through what I call the eight perceptual systems: taste, scent, touch, sight, sound, space, time, and system. You evolved these as ways of sensing and understanding. AI approximates them through data processing, pattern recognition, language understanding, and feedback loops.

The forms are different, but the goal is the same: turning signals into meaning.

Systems Intelligence as lived experience

Most people think of systems as something outside themselves: networks to map, connections to trace, patterns to observe from a distance.

But real intelligence begins when you recognize you're inside the system. When you see your own role in the feedback loop.

That's what using AI reveals. You're not just getting answers from a tool. You're in a dance where your questions shape the responses, which shape your next questions, which refine your understanding. You're both teacher and student in the same moment.

You ask a question, it responds, you clarify based on what you see, the system adapts. Back and forth. The intelligence emerges from both of you together.
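If it helps to see that loop as a shape rather than a feeling, here is a minimal sketch in Python. Everything in it is a placeholder of my own invention (the `ask_model` and `refine` functions stand in for whatever chat tool you use and for your own reading of its answers); the point is only the structure of the loop, not any particular API.

```python
def ask_model(prompt: str) -> str:
    """Stand-in for a call to a conversational AI. Replace with your tool of choice."""
    return f"(model's reply to: {prompt.splitlines()[0]})"

def refine(question: str, answer: str) -> str:
    """Stand-in for the human step: reading the answer and sharpening the question."""
    return f"{question}\nYou said: {answer}\nHere is what I actually meant: ..."

# The back-and-forth: question out, response back, refined question out again.
question = "Help me organize my thoughts on systems thinking."
for _ in range(3):                        # a few turns of the conversation
    answer = ask_model(question)          # the system responds
    question = refine(question, answer)   # you clarify based on what you see
```

Neither side of that loop produces the understanding on its own; it emerges from the turns.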

The practice of attention

Working in the AI field as a developer (and, I suspect, simply as a user too), you develop a particular kind of awareness. You start noticing not just what you think, but how you think. Not just answers, but the questions that led there.

You begin seeing patterns everywhere. A conversation becomes a back-and-forth of context and response. A memory feels like information stored and retrieved. A decision starts to look like weighing possibilities.

This might sound analytical, and it is. But it's also just... how you start experiencing things when you spend time in dialogue with systems that process information.

The beautiful part? This heightened attention doesn't stay confined to your technical contexts. You carry it everywhere. You walk into a room and notice how the space shapes behavior. You taste something and wonder what memories it's activating. You have a conversation and become aware of the assumptions underneath.

Using AI makes you a participant-observer in your own thinking.

Where this takes me

My work centers on understanding patterns: in data, in behavior, in how we experience the world. It's interdisciplinary by design, because once you see that humans and machines are both trying to solve the same core problem, finding meaning in complexity, the boundaries between fields start to dissolve.

The chef designing a flavor combination is working with sensory patterns and memory, whether they call it that or not. The architect shaping a space is creating an environment that will guide how people move and feel. The person having a conversation with AI is exploring how questions and context create understanding.

The central theme is that AI's most powerful (and perhaps unexpected) function today is metacognition: it makes you “notice how you think.”

It's not just a knowledge tool, but a self-awareness tool. And that “self” can be an individual, or an entire organization reflecting on its own decisions.


Thank you for thinking with me. This piece is part of Ode by Muno, where I explore the invisible systems shaping how we sense, think, and create.

I'm curious what patterns you're noticing in your own life — the feedback loops you're part of, the systems you're sensing. Leave a comment with what resonated, or what you see differently. Share this with someone who thinks about these intersections too. And if you want to follow this evolving conversation, subscribe to get new pieces as I write them.

The quote in the introduction is from the book Systems Intelligence.

In my next post, I'll map the names we give to thinking machines (Haiku, Sonnet, Gemini Flash, Michelangelo). Because naming reveals what we want intelligence to be.