Actual article: https://www.wsj.com/tech/apple-no-longer-in-talks-to-join-op...
Actual discussion earlier: https://news.ycombinator.com/item?id=41677333
Perhaps the elimination of most of the original leadership team, along with the board of directors, was seen as a red flag.
Alternatively, Apple could just invest in Anthropic instead which seems to have had less drama and a lot of the same talent.
Or they could simply skip this hype and wait for the next.
What's next on the list of last century sci-fi that hasn't had 100 billion dollars of investment yet?
Fusion power maybe?
Helion FTW!
Just imagine if fusion had the same interest as 'AI' and 'Crypto', although arguably it's orders of magnitude more complex.
It could have, in order to power AI and Crypto...
'AI' makes NFTs look like a good investment
At least AI does something somewhat productive with the energy it wastes.
Investment of Theseus.
Anthropic already has very large investment from Amazon. I doubt Apple wants to tie their AI future to Amazon so closely.
Google has also invested in Anthropic. I forget if the investments by AMZN and GOOG are cash or credits though.
The DD revealed what we're all sensing: innovation has stalled.
“Open” in OpenAI stands for the open door that all leadership and talent is walking out through, I guess.
Discussion (68 points, 3 days ago, 82 comments) https://news.ycombinator.com/item?id=41677333
Weird. Somehow missed this one on the day. Was it visible on the front page?
Posts that generate more comments than upvotes tend to fall off the front page quickly.
Convenient that such posts are (sometimes) in direct conflict with the interests of Y Combinator (formerly run by Sam Altman).
It's an anti-flamewar thing. There's nothing nefarious about it.
Apple is well aware of Altman's intent to build consumer AI hardware (additionally, Altman is allegedly collaborating with Jony Ive on the project).
What's that OpenAI hardware even likely to be? The attempts at consumer AI hardware so far have had no compelling answer to "why isn't this an app on the phone I already have".
Same strategy as a Kindle, Fire TV or a Portal – a proprietary gateway to services they sell, probably sold at or below cost. It's in line with their work on ChatGPT voice mode and such.
Funnily enough Amazon did try making a subsidised smartphone with deep integration of their services and it was a disastrous flop. Kindle and Fire TV succeeded by not directly competing with smartphones, but rather complementing them by being good at things that smartphones can do but not well.
If OpenAI plans to fork Android into their own thing then I can't see it going any better than Amazon's attempt did, and if it's a wearable like Humane/Rabbit then it needs an answer to "why isn't this an app".
> fork Android into their own thing
They could make a custom version of Android and make a device that delivers what the Rabbit R1 said it would but didn’t.
A device that can run Android apps, and that can interact with those apps on its own based on your voice commands.
The idea of the Rabbit R1 was good, kind of. OpenAI could pull it off.
To be fair, the phone failures were at least in part due to Google's anti-competitive behaviours, which they are being (or have been?) fined for.
Ah yes, following the winning strategy of Fire devices
Kindle is good stuff though. On the other hand, I can't imagine what a dedicated device would bring to the table that a smartphone / tablet would not in this case...
It’s probably a wearable device and the most instant uncanny valley nerd alert device imaginable, that makes bluetools and google glass wearers seem almost normal.
Something like the Humane flop:
https://www.cnet.com/tech/mobile/humane-ai-hands-on-my-life-...
Why do you need an app? Couldn't you just call an AI and talk to it?
The only AI hardware that matters is the GPUs.
If Altman gets his billions, he can maybe buy or build some.
There are several ways that AI hardware could compete with GPUs:
1. more VRAM -- better capacity for running large models and training new models/LoRAs/etc.;
2. more matrix and tensor accelerators/cores -- be able to run the neural networks faster;
3. dedicated silicon for ReLU and other common neural net building blocks to better accelerate workflows;
4. efficient use of the above to ideally run a model at the same speed but lower power;
5. affordability at the higher end so you are not spending £30,000+ on an AI workstation/server just to get 80GB VRAM.
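To put rough numbers on points 1 and 5, here's a back-of-envelope sketch (weights only; the KV cache, activations, and framework overhead add more on top):

```python
# Rough sketch: VRAM needed just to hold a model's weights
# (excludes KV cache, activations, and framework overhead).
def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

print(round(weight_vram_gb(70, 2), 1))    # 70B parameters at fp16
print(round(weight_vram_gb(70, 0.5), 1))  # same model, 4-bit quantised
```

A 70B model at fp16 doesn't even fit in a single 80GB card, which is why quantisation (and more VRAM per device) matters so much at the consumer end.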
Spend a few minutes with the multi-modal models and you can figure it out. It'll probably be closer to the humane AI pin, but with a better design (no stupid projector) and a better version of the model backing it.
The alternative to consider is Meta RayBan. Personally I think that form factor is a more likely starting point as vision is going to be a big deal.
But agreed that the pin (or ring / earring / other wearable) form factor is an obvious place to build too.
It should be clear that OpenAI wants to be as important as the other tech giants (MAGMA? I don't know the current acronym), not just a large tech company.
Didn't go for the obvious 'GAMMA'?
Let’s say that’s true. Wouldn’t it be a useful edge?
Looking at the numbers that have been published, OpenAI is losing money hand over fist and has suspect leadership.
Not only that, OpenAI has lots of competitors that could be hoovered up at a 90% discount once the first big AI company goes pop.
I get the impression that OpenAI’s growth has made it harder to move at the pace of the market, both with competitors and in open source, to say nothing of the internal drama with Altman’s leadership.
They’re not exactly short of funding, and it’d hardly be wise for big tech to consolidate all its funding on a single player. Hell, Claude has shaped up really well and it doesn’t have that distinct GPT flavour to its output.
I assume OpenAI wanted too much investment, and it makes sense to just pay them as a supplier instead of as an investment.
Despite being known for making big bets on products, Apple has historically not really been big on huge M&A bets or financial investments into other companies.
Also from my experience it's the counterparty that should feel honored to be working with Apple rather than the other way around, which I suspect played a role here too.
Jimmy Iovine and Dr. Dre have entered the chat...
And that was Apple’s biggest purchase ever at only $3B.
Yeah, and that $3bn was ten years ago – a rounding error in M&A terms for a company of this size.
Why would I invest in a company if I had the intention to buy it? I could give you $400, and hope you use it in the way I want, or I could buy you, and apply that $400 in exactly the way I want.
[dupe] https://news.ycombinator.com/item?id=41677333
Seeing that there might be downsides to supporting OpenAI, is there any harm in Apple sitting out "this round"? Are there not future rounds where Apple can reconsider talking with OpenAI?
Just taking a guess: the recent high-level exits from OpenAI may have contributed to Apple's backing away from talks with OpenAI.
Just taking a guess: OpenAI wanted a trillion $ valuation.
Honestly, I don't care who's involved.
But, please make Siri less of a steaming pile.
It was embarrassing a decade ago; now it's bordering on absurd that a product with so much visibility and importance, from a company that is typically extremely perfection-oriented, is so far behind the competition.
I don't think Siri users get a lot of exposure to the competition. I've never used whatever Google's version is. I remember when Cortana came out on PC and I was like "wtf is this"; idk what it's called now – probably just Copilot, which I imagine is useful because ChatGPT.
I’m of the opposite opinion: make Siri dumber please, make sure it does a handful of things reliably. I want voice control to my phone, I don’t want a chat bot. Siri is… okay at voice control, but if they just threw in the towel and admitted that it is really just voice control, they could stick to a finite list of things it supports, and that’s it.
Imagine, you could actually get a documented list of the things Siri supports. And phrases that activate them, and always activate them. No fuzzy AI baloney, just “these phrases do these things.”
Save the “unlimited knowledge chatbot” feature for another app please.
What would GPT do in consumer products anyway?
Edit images? Answer incoming calls? Make reservations? Phones have that and tons more already.
ideas:
• sentiment analysis on text messages and notifications to determine whether they are urgent or high priority (Apple is actually working on this for iOS 18.1)
• summarize many emails or notifications at a glance
• improve Siri: instead of transcribing the user's request and asking Siri to act on it right away, feed the transcription through an LLM first to fix grammar issues, or correct words that don't make sense in context. This could reduce errors when asking Siri to do something.
• have Siri handle multiple requests in one shot. An LLM can trivially pick apart a run-on sentence and structure it into an actionable array of Siri requests, like "hey Siri, turn off all the lights and close the garage door. Oh, also please let my wife know that I'm on my way. Actually, let my kids know too. Also I need navigation to the party today; I think it's in my calendar."
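A sketch of the multi-request idea: prompt the LLM to return structured JSON, then dispatch each step in order. The action names and schema here are hypothetical, and `llm_response` stands in for a real model call:

```python
import json

# Hypothetical: `llm_response` stands in for what an LLM, prompted to
# decompose a run-on voice command, would return as structured JSON.
llm_response = json.dumps([
    {"action": "set_lights", "params": {"state": "off", "scope": "all"}},
    {"action": "close_garage_door", "params": {}},
    {"action": "send_message", "params": {"to": "wife", "text": "On my way"}},
    {"action": "send_message", "params": {"to": "kids", "text": "On my way"}},
    {"action": "navigate", "params": {"destination": "calendar:party"}},
])

# The assistant would then dispatch each action in order.
for step in json.loads(llm_response):
    print(step["action"])
```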
There are many subtle ways to improve the user experience here. It seems like you’re expecting this to change your life lol. Don’t believe the hype – there is no AGI anytime soon. But let’s not pretend there’s no value here. You just have to be a bit creative…
Does this mean it's no longer integrated into the iPhone, or can be turned on/off?
"Apple Intelligence" was never ChatGPT. If the Apple Intelligence system was unable to process a request, or needed broader expertise, you would be prompted (if the app is installed, IIRC) for permission to hand it off to ChatGPT.
I understand Apple Intelligence is separate from ChatGPT in my device.
It's not clear whether Apple Intelligence on the server side (Siri, etc.) uses OpenAI in the background, even a private instance.
As for prompting for permission to confer with it on device, it would be great if I could point it at any model/endpoint I wanted as long as it was OpenAI API compatible.
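For illustration, "OpenAI API compatible" in practice just means the same request shape against a different base URL. A minimal stdlib sketch, where localhost:8080 is a hypothetical local server (llama.cpp and vLLM both expose this same endpoint shape):

```python
import json
import urllib.request

# Any "OpenAI API compatible" server accepts this request shape, so the
# base URL is the only thing that changes per provider. localhost:8080
# is a hypothetical local server here.
base_url = "http://localhost:8080/v1"
req = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=json.dumps({
        "model": "local-model",
        "messages": [{"role": "user", "content": "Turn off the lights"}],
    }).encode(),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer not-needed"},
)
print(req.full_url)  # the only per-provider difference is this URL
```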
Server side isn't OpenAI either, unless it's something it can't handle, in which case it prompts you client-side first.
That's good to know.
For me though, trusting Apple to keep it within their own cloud is the entire basis of any trust.
If even a bit of it leaks outside their network to OpenAI, even behind the scenes, it won't be appealing to many.
I’d recommend reading Apple’s white papers on this. Even when a user opts (per request) to send data to OpenAI, it sends it obfuscated and requires OpenAI to not store anything. The only time something should be traceable to the user is if the user is signed in to OpenAI to access premium features.
It will still be integrated, and it will still require user confirmation for every request sent to ChatGPT.
It would be great if I could point it at any model/endpoint I wanted as long as it was OpenAI API compatible.
That would also support Apple's privacy stance and brand promises.
Good! Someone in Apple's leadership is clairvoyant
As clairvoyant as leaders get! If only we could all be so lucky.
Yeah given NotebookLM and Gemini are basically on par in every way, they would be wise to just expand their search deal with the Goog and not let a new entrant come and eat their food
Even with the 50 bps rate cut, everyone is still very tight with their money. It makes sense that Apple would back out in this high rate environment.
Apple has ~$66 billion in cash sitting around. A 50bps rate cut costs them hundreds of millions a year.
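The back-of-envelope arithmetic, assuming the full ~$66B is rate-sensitive:

```python
# Back-of-envelope: annual yield lost on cash holdings from a 50 bps cut.
cash = 66e9          # ~$66B in cash and equivalents
rate_cut = 0.0050    # 50 basis points
print(f"${cash * rate_cut / 1e6:.0f}M per year")  # → $330M per year
```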
Apple also has $100b of debt, so maybe it's a wash?
The debt was issued when rates were very low, with fixed coupons as low as 0.375%; only the more recently issued bonds are in the 3–4% range.
Nevertheless, they are paying fixed rates on their debt and (likely) receiving floating rates on their cash.