🤖 Try OpenAI’s new device today
Is it a pet?

Hi, and happy Wednesday.
Jony Ive - who is credited with creating the iPhone with Steve Jobs - is now at OpenAI.
Together with Sam Altman (OpenAI’s CEO), Ive is building a new “AI-first” device that reimagines computing unencumbered by legacy constraints - such as keyboards.
Expected to launch for next year’s Cyber Monday, it’s much closer to the Tamagotchi some of us grew up with than to a phone or sunglasses.
Today, we cover:
1. Why are Jony Ive and OpenAI creating a device?
2. What will this device do for people?
3. So… is it a pet?
4. What does this mean for you?
5. How to try the future today (no new hardware required)
1. Why are Jony Ive and OpenAI creating a device?
Jony Ive believes the iPhone was perfect for the pre-AI era - but bolting LLMs onto a 2007 form factor is a dead end. Legacy hardware can’t express the new affordances AI makes possible.
This is the thesis:
New capability ⇒ new interface ⇒ new device.
If AI is as big as the transistor (Sam Altman’s analogy), it’s strange to assume it will forever live inside rectangles designed for email and social feeds.
Secondly, Ive is blunt about the emotional damage of today’s devices:
“I don’t think we have an easy relationship with our technology at the moment.”
He sees AI not as “more of the same, but faster,” but as a chance to repair that:
“Rather than see AI as an extension of those challenges, I see it very differently. I see it as a chance… to address a lot of the overwhelm and despair that people feel right now.”
This is a huge framing shift:
AI as a way to reduce anxiety and cognitive overload, not just increase productivity.
2. What will this device do for people?
Jony Ive and Sam Altman avoid specs and instead talk about capabilities and feelings. But you can infer some clear functional pillars.
Firstly, it will be a long-term, context-aware AI partner.
Altman describes something that knows you deeply and acts over time:
“What does it mean that this thing is going to be able to know everything you’ve ever thought about, read, said?”
And:
“A really smart AI that you trust to do things for you over long periods of time… filter things out… be contextually aware of when it should present information to you or ask for your input or not.”
That implies:
Persistent memory across your work and life
Autonomous task handling over hours, days, weeks
Intelligent timing: interrupting you only when it matters
Think: chief of staff + executive assistant + nervous system for your digital life.
Secondly, this device will feel less like Times Square and more like a cabin by a lake.
Sam’s best analogy:
“When I use current devices… I feel like I am walking through Times Square… flashing lights… notifications… dopamine chasing…”
Versus what they want:
“Not like walking through Times Square… but like sitting in the most beautiful cabin by a lake… enjoying the peace and calm.”
Functionally, that means:
Aggressive filtering of noise and low-value notifications
Calm, non-intrusive surface for what actually matters
Deep work by default, with context switches only when they genuinely matter
Key idea: an AI device that filters the chaos so you feel less of it.
Lastly, they literally made joy a requirement:
“We are going to make people smile. We’re going to make people feel joy. Whatever the product does, it has to do that.”
3. So… is it a pet?
They never say the word. But they consistently describe something that behaves more like a companion creature than a “device.”
(1) It learns you and earns your trust
Knows your history (“everything you’ve ever thought about, read, said”)
Acts on your behalf over long periods
Becomes more trusted over time
That’s less like “an app” and more like a very capable animal that lives alongside you and anticipates your needs.
(2) You should feel affection for it
Altman repeats one of Ive’s design tests:
“We’ll know we have the design right… you want to like, lick it or take a bite out of it.”
And:
“There was an earlier prototype… I did not have any feeling of like, I want to pick up that thing and take a bite out of it. And then finally we got there…”
That’s not how people talk about productivity tools. That’s how they talk about characters, toys, pets.
(3) Quiet presence, not constant nagging
Their dream is not an AI that screams for attention. It’s one that’s always there, rarely in the way, like a good dog happily curled up in the room.
Knows when not to interrupt
Surfaces things only at the right moment
Has personality and whimsy, but doesn’t demand constant focus
So, perhaps:
It’s not literally a pet, but the closest mental model is “a deeply competent, loyal, almost living companion that lives in a small object you actually like having around.”
4. What does this mean for you?
OpenAI’s device continues to lean into personal assistants as the future interface for computing.
This means, over time, we will:
Shift from “open tool, issue command” to having “persistent companions” that are always present. Think of it like AI notetakers in meetings, but always on.
Move from app-based workflows to agent-based workflows, triggered by persistent “companions” or “assistants”
Redesign processes around continuous AI support, not occasional queries
All of this means we need to keep rethinking work so that
humans are designing the work, not doing the work.
We have to rethink processes so AI does the grunt work - as sketched in code after this list - and the role of humans shifts to:
breaking down tasks for the AI
developing instructions for the AI to complete each task
overseeing the AI’s outputs
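To make that division of labour concrete, here’s a minimal sketch of the pattern in Python using OpenAI’s official openai SDK. It’s illustrative only - the model name, the memory file, and the system prompt are my assumptions, not anything OpenAI has announced about the device - and it expects an OPENAI_API_KEY in your environment.

```python
import json
from pathlib import Path

from openai import OpenAI  # pip install openai

# Hypothetical file where the companion keeps its memory between runs.
MEMORY_FILE = Path("companion_memory.json")

# The human designs the work: a standing instruction the AI follows every session.
SYSTEM_PROMPT = (
    "You are a persistent assistant. Use the conversation history to stay "
    "contextually aware across sessions. Do the grunt work, and flag anything "
    "that needs a human decision."
)


def load_memory() -> list:
    """Reload prior turns so the assistant 'remembers' earlier sessions."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return [{"role": "system", "content": SYSTEM_PROMPT}]


def ask(task: str) -> str:
    """One loop: the human breaks down the task, the AI executes, the human reviews."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    messages = load_memory()
    messages.append({"role": "user", "content": task})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=messages,
    ).choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    MEMORY_FILE.write_text(json.dumps(messages, indent=2))  # persist across runs
    return reply


if __name__ == "__main__":
    # The human's job: clear instructions up front, oversight afterwards.
    print(ask("Summarise yesterday's notes in two lines, then list open questions."))
```

Run it a few times and the assistant carries context from one session to the next - a tiny version of the always-on, context-aware companion the device is chasing.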
Check out my 3-minute video on designing the work.
5. How to try the future today (no new hardware required)
You can’t buy the Jony Ive device yet, but you can test the behavioural pattern they’re aiming for today.
ChatGPT already has some persistent memory.
The trick is to launch ChatGPT voice mode from your phone with as little friction as possible. See the instructions below.
On iPhone, you can map the Action button or a double-tap on the back to instantly launch ChatGPT voice mode.
On Android you can change the default Digital Assistant App to ChatGPT.
I’ve been using this for over a year, and this instant access has opened up a lot of new use cases for me, such as:
Dictating the outline of an email into voice mode, then having ChatGPT clean it up in the chat interface
Acting as a thinking companion for challenges that feel too hard to write out
Quickly getting answers to my 7-year-old’s random questions
If you’ve not tried it yet, I hope you find it just as helpful.
Best,

Dino
Try instant voice mode on your phone
Install the ChatGPT app and sign in, if you haven’t already (App Store on iPhone, Play Store on Android).
On iPhone
Create a Shortcut that launches ChatGPT voice mode, then assign it to a hardware gesture (the Action Button on newer iPhones, or Back Tap).
Step 1: Create the shortcut
1. Open the Shortcuts app on your iPhone.
2. Tap the + button in the top-right to create a new shortcut.
3. Search for “ChatGPT” and choose the “Voice Mode” action.
Now you have a Shortcut that instantly launches ChatGPT voice mode.
Step 2: Pair it with a physical button
Option 1: Use the Action Button (iPhone 15 Pro and newer)
1. Go to Settings > Action Button.
2. Swipe through the options until you see “Shortcut.”
3. Tap the chevron next to the current shortcut, search for your “Voice Mode” shortcut, and select it.
Option 2: Use Back Tap (double-tap the back), for iPhone 8 or later
1. Go to Settings > Accessibility > Touch.
2. Scroll down and tap “Back Tap.”
3. Choose “Double Tap” or “Triple Tap.”
4. Scroll to the “Shortcuts” section and select your “Voice Mode” shortcut.
On Android
1. Open Settings on your Android phone.
2. Go to Apps and tap on Choose Default Apps.
3. On the next screen, tap on the Digital Assistant App.
4. By default, it should be set to Google. Change it to ChatGPT.
5. Tap OK to confirm and replace Gemini with OpenAI’s assistant.
Sources
In conversation: OpenAI's Sam Altman and LoveFrom's Jony Ive with Laurene Powell Jobs | Emerson Collective, Nov 24, 2025
A Conversation with Sam and Jony | OpenAI, Oct 8, 2025