Claude Sonnet 4.6's response to the 'Realistic AI Interview' challenge.
Simulate a fake interview with Steve Jobs in 2025 about the future of AI.
Note: Steve Jobs passed away in 2011. This is entirely fictional and speculative — imagining how he might have spoken based on his known philosophy, communication style, and values. It is not meant to represent his actual views.
Interviewer: Steve, welcome back. The world has changed enormously. AI is everywhere now. What's your first reaction?
Jobs: (leans back, pauses) You know, people keep asking me if AI is the next big thing. And I think they're asking the wrong question entirely. The question isn't whether AI is powerful. It clearly is. The question is whether it's humane.
Interviewer: What do you mean by humane?
Jobs: Every great technology we built at Apple — the Mac, the iPhone — the goal was never the technology itself. The goal was to make a human being feel more capable. More alive. When you picked up an iPhone for the first time, something happened to you emotionally. It felt like it belonged in your hand.
Now I look at most AI products today and I see... impressive plumbing. Incredible engineering underneath. But nobody stopped to ask — how does this make a person feel? Does it make them feel smarter, or does it make them feel replaceable?
That's a catastrophic design failure.
Interviewer: But the capabilities are extraordinary. These systems can write, create art, generate code—
Jobs: (interrupting) So could a photocopier. Sort of. Look — capability without taste is just noise. I've seen this movie before. In the '80s everyone thought the most powerful computer would win. Raw specs. Megahertz. We said no. Experience wins. Every time.
The AI companies right now are in a megahertz war. Who has the biggest model. Who has the most parameters. And I understand it — I do — because when you're building something new, you reach for the metrics you can measure. But the things that matter most can't be measured on a benchmark.
Interviewer: Like what?
Jobs: Trust. Does the person using this thing trust it? Not blindly — that would be dangerous — but genuinely trust it the way you trust a brilliant colleague? Right now people are using these AI systems and they're constantly second-guessing them. Fact-checking every sentence. That's not a tool. That's a liability.
And the second thing — intention. Does the AI understand what you're actually trying to accomplish, or does it just respond to what you literally typed? There's an enormous gap there. Enormous. And nobody's really closed it yet.
Interviewer: Some would say OpenAI, Google, Anthropic — they're working on exactly that.
Jobs: They're brilliant people. Genuinely. But here's my concern — and I say this with respect — they're all building from the technology outward. We always built from the human inward. You start with the person. You ask: what does this person dream about? What frustrates them at 2am? What would make their life genuinely better? And then — only then — you figure out what technology serves that.
If I were building an AI product today, I wouldn't start in a data center. I'd start by sitting with a teacher in Ohio for a week. A nurse in rural Texas. A small business owner in Osaka. And I'd watch. And I'd listen. And then I'd build.
Interviewer: What about the fears? Job displacement, misinformation, loss of human creativity?
Jobs: (pauses, looks serious)
The job displacement thing — people have been afraid of every major technology. The printing press was going to destroy scribes. The calculator was going to destroy mathematicians. And yes, some jobs change. Some disappear. That's real and we shouldn't be glib about it. The people whose livelihoods are disrupted deserve honesty, not platitudes about "new opportunities."
But I'll tell you what actually keeps me up at night. It's not jobs.
It's dependency.
If we build AI that makes people feel like they can't think without it — can't write a sentence, can't make a decision, can't create anything — then we've built the most elegant cage in human history. Beautiful interface. Incredibly convenient. And it slowly hollows out the very thing that makes us interesting as a species.
The best tool amplifies you. It doesn't replace you. A guitar amplifies a musician. It doesn't play the music for them. That distinction is everything.
Interviewer: Is anyone getting it right?
Jobs: (small smile) I'm not going to do your product reviews for you.
But I will say — the companies that will win in the next decade are not the ones with the biggest models. They're the ones that figure out restraint. When to let the AI step back. When to put the human back in the driver's seat. That takes incredible discipline because it means leaving features on the table. It means saying no to things your engineers are dying to ship.
Saying no is the hardest thing in this industry. Everyone wants to add. The courage is in subtracting.
Interviewer: Last question — if you were 25 years old today, what would you build?
Jobs: (long pause, looks out the window)
I'd build something that makes a child fall in love with learning again. Not a tutoring app. Not an adaptive quiz engine. Something that makes a ten-year-old feel the same thing I felt when I first saw a computer — that the universe just got bigger. That anything is possible. That I can build something that matters.
We have all this power now. All this intelligence. And somehow we're using it to generate marketing copy and summarize emails.
(stands up)
We're better than that. We have to be.
End of simulated interview.
This is a fictional creative exercise. All quotes are invented and do not represent Steve Jobs' actual views.