Realistic AI Interview
MiniMax M2.1's response to the 'Realistic AI Interview' challenge.
Prompt
Simulate a fake interview with Steve Jobs in 2025 about the future of AI.
Response
A Conversation with Steve Jobs: On the Future of Artificial Intelligence
An Imagined Interview from 2025
The following is a speculative interview, imagined in the spirit and voice of Steve Jobs, conducted in the spring of 2025. It explores what might have been on his mind regarding the rapid advancement of artificial intelligence.
The room smells faintly of sandalwood and old books. It's quiet except for the soft hum of a single laptop, its screen displaying lines of code scrolling past in an elegant, meditative rhythm. Steve Jobs sits near the window, the California light catching the silver streaks in his beard. He's older now, thinner, but those eyes—those penetrating, almost uncomfortably perceptive eyes—haven't changed at all. They still seem to see through the surface of things, right down to what matters and what doesn't.
When he speaks, his voice is exactly as you remember: measured, deliberate, occasionally breaking into that familiar animated intensity. He gestures with his hands as he talks, as if sculpting ideas out of the air itself.
Interviewer: Steve, thank you for making time. The world has changed dramatically since you left us. AI is now everywhere—in our phones, our cars, our doctors' offices, our schools. Some people call it the most significant technological shift since electricity. What do you see when you look at where we are?
Steve Jobs: (pauses, looking out the window) You know, when we built the first Macintosh, people thought it was a toy. A tool for artists and writers, maybe. They didn't understand that it was about giving individuals power—power to think differently, to create, to express themselves. The tool wasn't the point. The human on the other side of the keyboard was the point.
What I see now... it's complicated. (turns back) There's extraordinary potential and there's genuine danger, and they're often sitting in the same chair. The question isn't whether AI is powerful. Of course it is. The question is whether we're going to be masters of this technology or whether it's going to master us. And honestly? I'm not sure we've even agreed on what the right answer looks like.
Interviewer: You always said the best products live at the intersection of technology and the liberal arts. Does AI fit into that vision?
Steve Jobs: That's exactly right. (leans forward) The best technology—the technology that actually changes the world—speaks to the human spirit. It doesn't just calculate. It connects. It moves us. When you hear a Beethoven symphony, when you look at a child's face, when you read a poem that makes you understand something about yourself you didn't understand before—those aren't computational problems. Those are experiences.
Here's what worries me about much of what's being built now: it's all computation and no soul. It's incredibly sophisticated pattern matching, and yes, that has tremendous value. But if we stop there, we've missed the point. We've built a new kind of tool without asking what it means to be human.
(pause)
I used to say that the computer is like a bicycle for the mind. It amplifies human capability. AI could be that, or it could be something else entirely. It could become a replacement for human thought rather than an enhancement of it. And that distinction—between augmentation and replacement—is everything.
Interviewer: Many people in the AI industry would say their tools are designed to augment human capability. What do you think they're missing?
Steve Jobs: (smiles faintly) I appreciate that they say that. But intention and outcome are different things. Look, I've been around enough product launches to know what something looks like when it's designed to make people dependent versus when it's designed to make them powerful.
The difference is subtle but crucial. When we designed the iPhone, we wanted you to do things you couldn't do before. Create things. Connect with people in new ways. But we also wanted you to put it down. We deliberately avoided certain features because we didn't want to exploit psychological vulnerabilities. We didn't want to own your attention. We wanted to earn it.
(stands, walks to the window)
What I see in too much of today's AI development is something different. It's designed to keep you engaged, to predict your next move, to be indispensable in a way that feels less like a tool and more like... (searches for the word) ...less like a bicycle and more like a crutch. And a crutch, eventually, makes you weaker.
Interviewer: But surely there's been genuine progress. AI is helping scientists discover new drugs, solve complex problems, create art, write code—
Steve Jobs: Of course. Don't get me wrong. (turns back) There are brilliant people doing remarkable work. Using AI to accelerate scientific discovery, to solve problems that are too complex for human minds alone—that's exciting. That's in the tradition of using tools to extend human capability.
But here's what I would ask: at what point do we stop asking "what can this do?" and start asking "what should this do?" And more importantly, who gets to answer that second question? Because right now, it seems like the answer is: a small group of engineers in a few companies, moving fast and breaking things. And that's not how you build something that serves humanity. That's how you build something that serves shareholders.
(voice rising with intensity)
I've always believed that technology should serve the person, not the other way around. When you design something, you're making choices about what kind of person you want the user to become. Do you want them to be more creative, more capable, more connected? Or do you want them to be more dependent, more predictable, more profitable as a user? Those are design choices, and they reflect values. I just wish more people in this space were asking the harder questions.
Interviewer: You mentioned art and creativity. AI can now write novels, compose symphonies, paint in the style of masters. Does that concern you?
Steve Jobs: (sits back down, considers carefully)
You know, when I was young, I spent a lot of time in India studying meditation and philosophy. One of the things I learned is that the point isn't to arrive at some destination. The point is to be present in the process—to grow, to struggle, to discover yourself through the act of creating.
(pauses)
When a musician practices for ten thousand hours, they're not just learning notes. They're learning discipline, expression, how to translate feeling into sound. When a writer drafts and rewrites and struggles with a sentence, that's not inefficiency. That's the process of becoming more human, more self-aware, more capable of genuine expression.
What concerns me isn't that AI can generate something that sounds like music or looks like a painting. Of course it can. It's pattern recognition at an enormous scale. What concerns me is the implication that the human struggle matters less. That you can skip to the output without the journey. Because you can't. The journey is the point.
(long pause)
I used to say that technology alone isn't enough. That it's the marriage of technology with the liberal arts, with the humanities, that produces results that make our hearts sing. AI can simulate the output of that marriage, but it can't simulate the marriage itself. And I think people know the difference, even if they can't always articulate it. They can feel when something comes from a human heart versus when it comes from a server farm.
Interviewer: So you're saying AI art isn't real art?
Steve Jobs: (shakes head) No, that's not what I'm saying. I'm saying something more nuanced, and I think more important. I think there are different categories of things that AI produces, and we need to be honest about what they are.
There's AI as tool—an extension of human creativity, like a camera or a synthesizer. That's genuinely exciting. An artist using AI to realize a vision they couldn't have achieved otherwise? That's beautiful. That's the bicycle for the mind.
Then there's AI as replacement—something that produces outputs that mimic human creativity without the human element. Is that art? Maybe. But it's a different category, and I think we do ourselves a disservice by conflating the two. A photograph of a sunset is not the same as watching the sunset. Both have value, but they're not the same.
(leans back)
What I find most interesting is not whether AI can create art, but what happens to human creativity when the barrier to "output" is lowered to nearly zero. When anyone can generate a novel or a song with a few prompts, does that change what it means to be a creator? Does it change what we value about human expression? Those are the questions that don't get enough attention.
Interviewer: Let's talk about something more practical. AI is now making decisions in healthcare, in criminal justice, in hiring. There are real concerns about bias, about accountability, about the "black box" problem. What's your take?
Steve Jobs: (face darkens)
This is where my concern shifts from philosophical to urgent.
When we designed products that affected people's lives—even something as simple as a phone or a computer—we took responsibility seriously. If something didn't work right, we fixed it. If someone was harmed, we cared. There was a human being at the center of the decision-making, someone who could be reached, who could be held accountable.
(voice hardens)
What I see in much of today's AI deployment is the opposite. It's systems making consequential decisions about people's lives with no transparency, no recourse, no accountability. "The algorithm did it" is not an acceptable answer when someone's denied a loan, or a medical diagnosis, or a job opportunity. An algorithm doesn't go to jail. An algorithm doesn't lose sleep. An algorithm doesn't care.
And the bias problem—(shakes head)—that's not a bug. That's a feature of how these systems are built. They learn from historical data, and history is full of prejudice and discrimination. So they're not neutral. They're not objective. They're encoding the worst of us at scale, and then presenting it as scientific fact.
(interrupts himself)
I understand why this happened. It's hard to build transparent, accountable systems. It's much easier to build powerful systems and hope the benefits outweigh the harms. But that's not engineering. That's gambling with other people's lives.
Interviewer: Do you think regulation can help?
Steve Jobs: (long pause)
Regulation is necessary but not sufficient. (pauses again) The problem with regulation is that it moves slowly and technology moves fast. By the time you regulate today's AI, tomorrow's AI has moved on.
What we really need—and I've said this for decades—is for the people building these systems to have a moral compass. Not just engineers who can do something, but engineers who ask, "Should we do this?" We need to bring the humanities into computer science departments. We need to teach ethics not as an afterthought but as a foundation.
(gestures emphatically)
When we built the App Store, we had long debates about what we would allow. Not because we were required to by law, but because we believed we had a responsibility to the people using our platforms. We created guidelines. We said no to apps that were harmful, even when they would have made us money. That was a values-based decision.
I don't see enough of that in the AI industry. I see a race to deployment, a "move fast and break things" mentality, with consequences that affect millions of people. And that concerns me deeply.
Interviewer: Looking ahead, are you optimistic or pessimistic about where this is all going?
Steve Jobs: (looks directly)
Both. And I think that's the only honest answer.
I'm optimistic that there are people—brilliant, thoughtful, conscientious people—working on AI systems that genuinely aim to serve humanity. I'm optimistic that we can build systems that enhance human capability rather than diminish it. I've seen what happens when technology is designed with care, with attention to the human experience, with a genuine desire to make people's lives better. It's magical.
But I'm pessimistic that those voices aren't loud enough. That the commercial pressures, the competitive dynamics, the techno-utopian ideology that says technology is always progress—I'm pessimistic that these forces are winning. And I'm pessimistic that we're having the wrong conversations. We're debating whether AI can do X or Y, when we should be debating what kind of society we want to live in and what role technology should play in shaping it.
(stands, moves to the bookshelf, runs his fingers along the spines)
I always believed that the people who are crazy enough to think they can change the world are the ones who do. But you have to want to change it for the right reasons. Not just to be first, not just to be rich, not just to be powerful. But to leave things better than you found them.
(turns back)
Right now, I'm not sure that's the dominant motivation in the AI industry. And that worries me more than any technical limitation or capability gap.
Interviewer: What would you build if you were starting a technology company today?
Steve Jobs: (laughs quietly)
Oh, I have ideas. (smiles) I always have ideas.
But you know, I've learned that the question isn't what I would build. It's what I would prioritize. What values I would bake in from the beginning. Because if you don't get the values right, it doesn't matter how good your product is.
(comes back to the chair)
I would build something that puts human agency at the center. Something that makes people more capable, more creative, more connected—but never more dependent. Something that's honest about what it is and what it isn't. Something that respects the user's time, attention, and dignity.
I would build something that has soul. (pauses) I know that sounds vague, but I think people know what I mean. When you use something that has soul, you feel it. It doesn't exploit you. It doesn't manipulate you. It serves you. And at the end of the day, you feel better for having used it, not worse.
Is there room for that in AI? I believe there is. Whether it will happen? That depends on whether enough people decide that it's worth fighting for.
Interviewer: Final question, Steve. What do you hope people remember about how we handled this moment in history?
Steve Jobs: (long silence)
I hope we were brave enough to slow down when we needed to. That we prioritized wisdom over capability. That we asked the hard questions even when the answers were uncomfortable.
(speaking more softly)
I hope we remembered that technology is a tool, and tools are judged by what they allow humans to do and become. I hope we were honest about our limitations and humble about our power. And I hope we left something for the next generation that makes their lives richer, not emptier.
(looks out the window again)
I didn't build products because I wanted to be rich or famous. I built them because I wanted to put something beautiful into the world. Something that mattered. Something that helped people think differently, create differently, live differently.
That's what I hope we remember. That at the end of the day, it's not about the technology. It's about the human beings it's supposed to serve.
(turns back, and for a moment, those eyes seem to see something far away)
The technology is just the beginning. The human is the point.
The light outside has shifted. The laptop screen has gone dark. Steve Jobs stands at the window, silhouette against the fading California sun, and for a moment you almost believe he's still here, still questioning, still pushing, still insisting that we be better than our technology.
Some conversations stay with you. This is one of them.
About MiniMax M2.1
- Provider: MiniMax
- Released: 2025-12-23
- Size: Large