Sora Update #1

(blog.samaltman.com)

126 points | by davidbarker a day ago ago

163 comments

  • simianparrot a day ago ago

    > In particular, we'd like to acknowledge the remarkable creative output of Japan--we are struck by how deep the connection between users and Japanese content is!

    Translation from snake speech bs: We've been threatened by Japanese artists via their lawyers that unless we remove the "Ghibli" feature that earned us so much money, and others like it, we're going to get absolutely destroyed in court.

    • qoez 19 hours ago ago

      My hunch is that OpenAI used Ghibli as the example in their earlier DALL-E blog posts strategically, because the PM had earlier said that anime wasn't protected by copyright when used for training. OpenAI is always sneakier than most people give them credit for.

      • ethbr1 15 hours ago ago

        > OpenAI is always sneakier than most people give them credit for.

        There's usually more useful information in what Sam Altman specifically doesn't say than what he does.

    • 47thpresident 15 hours ago ago

      I'm pretty sure this is in response to the Sora anime parodies that have flooded TikTok in the past 48 hours. Seems like OpenAI is acknowledging some strongly worded letters from anime rights holders rather than from individual artists, or the response wouldn't be this swift.

    • andrew_mason1 20 hours ago ago

      hey don't forget about nintendo too

    • vivzkestrel 4 hours ago ago

      can we get an AI model that translates all CEO speeches from snake oil salesman BS to direct talk please?

    • qlm 21 hours ago ago

      > Japanese “content”

      Sickening

      • simianparrot 21 hours ago ago

        No human writes like this. If he actually did, it's worrying.

        • rossant 20 hours ago ago

          Would you mind explaining? As a non native English speaker I may have missed some nuance.

          • layer8 19 hours ago ago

            The word “content” is often perceived as devaluing creative work: https://www.nytimes.com/2023/09/27/movies/emma-thompson-writ...

            Paradoxically, it signals indifference or disregard for the actual contents of a work.

            • majewsky 8 hours ago ago

              Eevee put it best:

              > I absolutely cannot fucking stand creative work being referred to as "content". "Content" is how you refer to the stuff on a website when you're designing the layout and don't know what actually goes on the page yet. "Content" is how you refer to the collection of odds and ends in your car's trunk. "Content" is what marketers call the stuff that goes around the ads.

              From https://eev.ee/blog/2025/07/03/the-rise-of-whatever/

          • danhau 20 hours ago ago

            The word content. Art would have been the appropriate term.

            • Zacharias030 13 hours ago ago

              some of it consists of cultural products too.

            • sph 20 hours ago ago

              Wait until they coopt the word "art" to include AI-generated slop. I dread the future discussion tarpits about whether AI creations can be considered art.

              • _DeadFred_ 13 hours ago ago

                A piece of wood or a rock can be pretty/interesting to look at. It is not art. AI slop might be pretty/interesting, but it is not art.

              • krapp 20 hours ago ago

                My person in deity that future has been here for a while now.

                Not only do they consider it art, they call what you and I consider art "humanslop" and consider it inferior to AI.

                • idiotsecant 19 hours ago ago

                  This sounds a lot like boomers complaining about kitty litter instead of bathrooms in elementary school

                  It's easy to get too chronically online and focus on some tiny weird thing you saw when in fact it's just a tiny weird thing.

            • mlrtime 20 hours ago ago

              Disagree, it is content. The Japanese anime (referenced) is specifically made to be marketed and sold.

              • estearum 19 hours ago ago

                Almost every piece of art you've ever seen (by virtue of you seeing it) was made to be marketed and sold.

                Art is overwhelmingly not a charity project from artists to the commons.

                • Kiro 17 hours ago ago

                  I presume "by virtue of you seeing it" includes other conditions or I don't understand how you can claim such a thing.

                  • estearum 17 hours ago ago

                    Where exactly have you seen art that wasn’t made to be sold? Be specific.

                    • Kiro 9 hours ago ago

                      Friends, family, coworkers, my own, random posts online, everywhere.

                      • estearum 8 hours ago ago

                        Ah yes, the very normal activity of showing your coworkers your hobbyist art! Is this happening a couple dozen times per day?

                        • fragmede 2 hours ago ago

                          Via Instagram, while they're showing off pictures of their kids and their hobbies... yes? Do you only show your coworkers, what, system diagrams of work things, making the time between work still also about work?

                          Different places have different cultures. Apparently your coworkers aren't to know anything about you beyond what's necessary for them to work with you, but not everywhere in the world is like that, and it seems unnecessary to state that you don't live in such a place in that way.

                • ricardobeat 19 hours ago ago

                  Most independent artists will disagree with this statement. They do it for passion, to communicate, to tell stories, to fulfill their own urges. Some works incidentally hit a sweet spot and become commercial successes, but that's not their purpose. On the other hand, the 'art' you see being marketed around you is made specifically to be marketed and sold, with little personal connection to the artist, and often against their own preferences. That's "content".

                  • estearum 17 hours ago ago

                    Is that what they tell you when you’re standing in the gallery with a checkbook? Or in the boardroom with a signature?

                    No, you almost never see art that wasn't meant to be sold. Public art pieces are commissioned (sold), and art in galleries was created by professional artists (even if commercially unsuccessful) 99.99999% of the time.

                    Surely if this wasn’t true, you could point to a few specific examples of art — or even broad categories of art — that weren’t made to be sold and that you have personally seen?

                    • ricardobeat 16 hours ago ago

                      I think you're just interpreting the meaning of "made to be sold" very literally. Of course artists want to make a living and have their art be appreciated, so they expect pieces to be sold; but that is not the main motivation behind making the art, where commercial "art" - advertising, mainstream cinema, pop music, most art galleries, anime, 80% of what you see in arts and crafts fairs, pieces in IKEA - is created with profit as the main motive.

                      Going back to the origin of this, stating that Ghibli-style videos generated with Sora (which the OP initially called "content") are equivalent to Studio Ghibli movies because they are both "art made to be sold" would be wild. A film like Spirited Away took over 1 million hours of work; if making money were the main goal, it would never have happened.

                      • estearum 16 hours ago ago

                        > Of course artists want to make a living and have their art be appreciated, so they expect pieces to be sold

                        "they want their to be appreciated, so they expect pieces to be sold" is a clever trick but one is not related to the other. One could want their art to be appreciated and never sell it, but virtually no one would see this art for a variety of reasons including the fact that marketability increases visibility and that there is very, very little amateur art that is worth looking at, much less promoting to a larger audience.

                        It seems you agree that in fact art (that anyone sees) is overwhelmingly made to be sold.

                        I didn't say anything about their "main motivation" and neither you nor I (nor even the artist, frankly) could say much about what someone's main motivation is.

                        What we can say is that nearly all of the art anyone sees was in fact made to be sold, which is the specific claim that I made.

                        • Terretta 14 hours ago ago

                          > nearly all of the art anyone sees

                          See comment above.

                          • estearum 14 hours ago ago

                            Yes you're just restating my thesis but with the air of disputing it.

                            • WhyOhWhyQ 14 hours ago ago

                              Buddy your thesis is that art does not exist because of capitalism. That is a ridiculous 'thesis'.

                              • estearum 13 hours ago ago

                                ... what? Not sure how you got that, but no, that's not what I believe.

                                Here, I'll restate it:

                                > Almost every piece of art you've ever seen (by virtue of you seeing it) was made to be marketed and sold.

                                > Art is overwhelmingly not a charity project from artists to the commons.

                    • Terretta 15 hours ago ago

                      > almost never see art that wasn’t meant to be sold

                      Because most art isn't in a gallery or store. You quite literally aren't seeing it.

                      • estearum 14 hours ago ago

                        In other words:

                        > Almost every piece of art you've ever seen (by virtue of you seeing it) was made to be marketed and sold.

                  • richardfulop 18 hours ago ago

                    Art is not an objective definition; it is the subjective experience of the observer. Content is a format.

              • qlm 20 hours ago ago

                The involvement of money does not preclude a work from being considered art. Your claim is cynical and ahistorical.

                • nickthegreek 19 hours ago ago

                  it also doesn’t preclude it from being content.

                  • estearum 19 hours ago ago

                    I don't think anyone supposes it does. They're arguing that the word choice implies something about the speaker's value system and the place that art or human culture has in it.

                  • qlm 19 hours ago ago

                    Well, yes, but I didn’t really think that needed to be said.

        • rhetocj23 20 hours ago ago

          None of us should be surprised. This joker has zero respect for the artistry of humans.

  • solid_fuel a day ago ago

    I don't understand some parts of this; the writing doesn't seem to flow logically from one thought to another.

        >  Second, we are going to have to somehow make money for video generation. People are generating much more than we expected per user, and a lot of videos are being generated for very small audiences. 
        > We are going to try sharing some of this revenue with rightsholders who want their characters generated by users. 
        > The exact model will take some trial and error to figure out, but we plan to start very soon. Our hope is that the new kind of engagement is even more valuable than the revenue share, but of course we want both to be valuable.
    
    
    The first part of this paragraph implies that the video generation service is more expensive than they expected, because users are generating more videos than they expected and sharing them less. The next sentence then references sharing revenue with "rightsholders"? What revenue? The first part makes it sound like there's very little left over after paying for inference.

    Secondly, to make a prediction about the future business model - it sounds like large companies (disney, nintendo, etc) will be able to enter revenue sharing agreements with OpenAI where users pay extra to use specific brand characters in their generated videos, and some of that licensing cost will be returned to the "rightsholders". But I bet everyone else - you, me, small youtube celebrities - will be left out in the cold with no controls over their likeness. After all, it's not like they could possibly identify every single living person and tie them to their likeness.

    • cg505 a day ago ago

      1. They need to charge users for generation.

      2. They might get into trouble charging users to generate some other entity's IP, so they may revenue-share with the IP owner.

      They're probably still losing money even if they charge for video generation, but recouping some of that cost, even if they revshare, is better than nothing.

      • earthnail a day ago ago

        You got the last paragraph wrong. They need to negotiate with rights holders on the revenue split. They’re hoping that the virality aspect will be more important to rights holders than money alone, but they will of course also give money to rights holders.

        Or, in other words: here’s Sam Altman saying to Disney “you should actually be grateful if people generate tons of videos with Disney characters because it puts them front and center again.”, but then he acknowledges that OpenAI also benefits from it and therefore should pay Disney something. But this will be his argument when negotiating for a lower revenue share, and if his theory holds, then brands that don’t enter into a revenue share with OpenAI because they don’t like the deal terms may lose out on even more money and attention that they would get via Sora.

    • melvinmelih a day ago ago

      > After all, it's not like they could possibly identify every single living person and tie them to their likeness.

      Wasn't he literally scanning eyeballs a couple of years ago?

      • nickthegreek 19 hours ago ago

        the scanning continues.

      • rglover 18 hours ago ago

        "Just look into the orb, bro."

    • raphman 20 hours ago ago

      "Sora Update #4: Through a partnership with Google, Meta and Snap Inc., you will be able to generate tasteful photos of the cute girl you saw on the bus. She will receive a compensation of $0.007 once she signs our universal content creators' agreement."

    • sebzim4500 a day ago ago

      I don't get the confusion. He's saying that

      (i) they will need to start charging money per generation, and (ii) they will share some of this money with rightsholders.

      • solid_fuel 11 hours ago ago

        It's confusing to me because charging money is implied - "we are going to have to somehow make money" - but not actually stated, and then it jumps past the revenue structure into sharing money with "rightsholders".

        It has left me wondering if, instead of just charging users, they would start charging "rightsholders" for IP protection. I could see a system where e.g. Disney pays OpenAI $1 million up front to train in recognition of Mickey Mouse, and then receives a revenue share back from users who generate videos containing Mickey Mouse.

      • samastur 21 hours ago ago

        they will TRY to share this money ;)

        • cedilla 21 hours ago ago

          Yes – "with rightsholders who want their characters generated by users. "

          So it's not about reimbursing "rightsholders" they rip off. It's about giving a pittance to those who allow them to continue to do so.

          Sorry, trying to give a pittance to them.

      • basisword 19 hours ago ago

        They will share the money with the rights holders large enough to sue them. Fuck the rest. Just as they’ve done with training material for ChatGPT.

    • 48terry a day ago ago

      > the writing doesn't seem to flow logically from one thought to another.

      Neither has most of the stuff Sam has said since basically the moment he started talking.

      It is possible, perhaps, that he is actually a very stupid person!

      • braebo 13 hours ago ago

        My read says intelligent sociopathic narcissist.

    • camillomiller a day ago ago

      “Dear rights holders, we abused your content to train our closed model, but rest assured we’ll figure out a way to get you pennies back if you don’t get too mad at us”

  • g42gregory a day ago ago

    It is already illegal to use images of somebody's likeness for commercial purposes, for purposes that harm their reputation, in ways that could be confusing, etc. Basically, the only times you can use these images are for some parodies, for public figures, and under fair use.

    Now OpenAI will be lecturing their own users, while expecting those same users to make them rich. I suspect the users will find it insulting.

    Generation for personal use is not illegal, as far as I know.

    • nickthegreek 19 hours ago ago

      you can legally use the images to harm someone's reputation as long as you don't represent them as real.

    • camillomiller a day ago ago

      Wait, are you telling me Sam Altman has no regard for the law and thinks his own messianic endeavors are more important than that? Shocker!

  • surrTurr a day ago ago

    > launch new sora update

    > enable generating ghibli content since users are ADDICTED to that style

    > willingly ignore the fact that the people who own this content don't want this

    > wait a few days

    > "ooooh we're so sorry for letting these users generate copyrighted content"

    > disables it via some dumb ahh prompt detection algorithm

    > dumb down the model and features even more

    > add expensive pricing

    > wait a few months

    > launch new model without all of these restrictions again so that the difference to the new model feels insane

    • slacktivism123 18 hours ago ago

      >dumb ahh prompt detection algorithm

      Don't worry, you can write "dumb ass" here without needing to use algospeak. This isn't Instagram or TikTok and you won't be unpersoned by a "trust and safety" team for doing so.

      P.S. No need for a space after your meme arrows :-)

    • workfromspace 14 hours ago ago

      I'm new to Sora. Which step are we in at the moment?

    • spongebobstoes 18 hours ago ago

      copyright is such a poorly designed tax on our society and culture. innovations like Sora should be possible, but face huge headwinds because... Disney wants even more money?

      the blind greed of copyright companies disgusts me

      • _DeadFred_ 13 hours ago ago

        Society has benefited hugely from copyright law. In fact, the first copyright laws were created in response to desires to have education material/a better educated society.

        Saying 'Disney/laws bad because I want a billionaire corporation to have access to something they know they don't own but built their business model around using anyway' isn't saying anything but 'I want what I want'.

        If anything society should take this slow and do it right, not throw out hundreds of years of thinking/decisions/progress because 'disney' and 'cool new tech'.

        We should not bend or throw away laws because a billion-dollar industry chose to build a new business model around ignoring them. Down that path lies dystopia.

  • minimaxir a day ago ago

    Yeah, Nintendo called, and faster than expected.

    > People are generating much more than we expected per user, and a lot of videos are being generated for very small audiences.

    What did OpenAI expect, really? They imposed no meaningful generation limits, and "very small audiences" is literally the point of an invite-only program.

    • minimaxir a day ago ago

      Update after more testing: looks like every popular video game prompt (even those not owned by Nintendo) triggers a Content Warning, and prompting "Italian video game plumber" didn't work either. Even indie games like Slay the Spire and Undertale got rejected. The only one that didn't trigger a "similarity to third party content" Content Violation was Cyberpunk 2077.

      Even content like Spongebob and Rick and Morty is now being rejected after having flooded the feeds.

      • mallowdram 20 hours ago ago

        I see a movie: The MoTrix, copyright blasting Soraddicts invent a new prompting language (or discover the one Altman seeded) as a way of evading Agents of the Entity, a © deity/program. Once unleashed, the world descends into HeroClix and ReadyPlayerOne slop simulation where original becomes indistinguishable from stolen.

    • techblueberry a day ago ago

      I don't understand. What do they mean by very small audiences? Am I not supposed to make videos for myself?

      • minimaxir a day ago ago

        OpenAI likely intended users to post every video they make to the public feed instead of just using the app as a free video generator (i.e. like Midjourney).

        Of course, another reason that people don’t publish their generated videos is because they are bad. I may or may not be speaking from experience.

        • dgs_sgd a day ago ago

          Can confirm. I got access to the app yesterday and I have used it exclusively for making drafts and sending them to my friends without posting.

          • mrcwinn a day ago ago

            100%. I'm not comfortable sharing my likeness publicly. I send goofy stuff to friends. That was day 1, at least.

            Day 2+ I haven’t used the app again.

      • notatoad a day ago ago

        my read: they made the app look like tiktok, and were expecting people to make tiktok style viral videos. instead, what people are making is cameo-style personalized messages for their friends, starring mario.

    • Jordan-117 a day ago ago

      Current limit seems to be 100 per rolling 24 hour period, so not unlimited but definitely huge given the compute costs.

      • minimaxir a day ago ago

        Setting the limit that high for a soft launch is bizarre. I got access to Sora and got the gist of it with like 10 generations.

        • angulardragon03 9 hours ago ago

          Gotta juice the utilisation numbers somehow. Limiting everyone to 10 per day would kneecap them, and they’d have nothing with which to attract new investors to keep the gravy train going

    • ojosilva a day ago ago

      And I don't think you can revenue-share these generations with rights owners just like that. What rights owner will let their "product" be depicted in any imaginable situation, by any prompt, by anyone on the planet? Words are powerful, an image is worth a thousand words, and video is worth a million more... I've seen a quick Sora video, from OpenAI themselves I believe, of a real-life Mario Bros. princess, a rather voluptuous one, playing herself on a console, and the image stuck. And it's not just misuse, distortion or appropriation but also association: imagine a series of very viral videos of Pikachu drinking Coke, or a fan series of Goku with friends at KFC... it could condition, or steal, future marketing deals for the rights holders.

      This is a non-starter, unless you own a "license to AI" from the rights owner directly, such as an ad agency that uses Sora to generate an ad it was hired to do.

    • CPLX a day ago ago

      Indeed. If you read between the lines that’s clearly it.

      And on that note can I add how much I truly despise sentences like this:

      > We are hearing from a lot of rightsholders who are very excited for this new kind of "interactive fan fiction" and think this new kind of engagement will accrue a lot of value to them, but want the ability to specify how their characters can be used (including not at all).

      To me this sentence sums up a certain kind of passive-aggressive California, Silicon Valley, sociopathic way of communicating with people that just makes my skin crawl. It's sort of a conceptual cousin to things like banning someone from a service without even telling them, or using words like "sunset" instead of "cancel", and so on.

      What that sentence actually fucking means is that a lot of powerful people with valuable creative works contacted them with lawyers telling them to knock this the fuck off. Which they thought was appropriate to put in parentheses at the end as if it wasn’t the main point.

      • lelandfe a day ago ago

        Wow, I am sure excited for your new kind of interactive fan fiction of my properties. It will accrue us a lot of value! Anyway, please do not use our properties.

        • vntok a day ago ago

          Nice but there's no need for the "please": it's not a request, it's a demand from an official lawyer-penned, strongly-worded, lawsuit actionable letter.

      • signatoremo a day ago ago

        You may not like their message, but the style can be found in practically any public communication from any corporation. Read a layoff announcement from Novo Nordisk as an example [1]. No difference.

        This is what I don’t like about HN, manufactured outrage when one dislikes the messenger. No substance whatsoever.

        When users are given such a powerful tool like Sora, there will naturally be conflicts. If one makes a video putting a naked girl in a sacred Buddhist temple in Bangkok, how do you think Thai people will react?

        This is OpenAI attempting a balancing act between conflicting interests, while trying to make money.

        [1]-https://www.novonordisk.com/content/nncorp/global/en/news-an...

        • toshinoriyagi 9 hours ago ago

          Yes, but one of the conflicting interests is illegal. We all know these companies pirate a huge amount of copyrighted data to train their LLMs and VLMs. Clear copyright infringement; Anthropic just lost a few billion dollars over this.

          In addition, the training process attempts to reproduce the copyrighted training data as perfectly as possible, with the intent to rent the resulting model out for commercial gain afterwards. Many argue that this is not fair use, but another instance of copyright infringement.

          And if the previous infractions weren't enough, OpenAI's customers are now generating mass videos of copyrighted characters.

          So, while it may be common corporate speak, it is still snake-tongued weasel-blather that downplays the illegality of their actions.

        • id00 a day ago ago

          I actually really like that comment. It's an example of classic doublespeak, and it's a shame that "Open"AI uses it and we as a society tolerate that (as well as from other companies, of course).

        • redserk 21 hours ago ago

          If we’re going on HN rants, this bizarre tendency of reframing the blatantly obvious into something it isn’t doesn’t help any argument.

          The messenger isn’t some random, disconnected third party here.

      • martin-t a day ago ago

        It feels like big exploitative multimedia companies are the main force fighting big exploitative ML companies over copyright of art.

        I wish big exploitative tech companies would fight them over copyright of code but almost all big exploitative tech companies are also big exploitative ML companies.

        Oracle to the rescue? What a sick, sad world.

      • saxonww a day ago ago

        I'm not really disagreeing with you, but I think it's more about salesmanship than anything else. "We released v1 and copyright holders immediately threatened to sue us, lol" sounds like you didn't think ahead, and also paints copyright holders in a negative light; copyright holders who you need to not be enemies but who, if you're not making it up, are already unhappy enough to want to sue you.

        Sam's sentence tries to paint what happened in a positive light, and imagines positive progress as both sides work towards 'yes'.

        So I agree that it would be nice if he were more direct, but if he's even capable of that it would be 30 years from now when someone's asking him to reminisce, not mid-hustle. And I'd add that I think this is true of all business executives, it's not necessarily a Silicon Valley thing. They seem to frequently be mealy-mouthed. I think it goes with the position.

      • rhetocj23 20 hours ago ago

        Exactly. I really hope Altman gets what's coming to him.

      • adriand a day ago ago

        > To me this sentence sums up a certain kind of passive aggressive California, Silicon Valley, sociopathic way of communicating with people that just makes my skin crawl.

        To me that's Sam Altman in a nutshell. I remember listening to an extended interview with him and I felt creeped out by the end of it. The film Mountainhead does a great job capturing this.

  • geraldalewis a day ago ago

    > rightsholders

    • roxolotl a day ago ago

      It's telling how society values copyright in different media that, four years into people yelling about these being copyright violation machines, the first emergency copyright update has come over video.

      • ronsor a day ago ago

        The only thing we need is an emergency copyright deprecation.

        • martin-t a day ago ago

          So people who spend time working on code or art should have exactly zero protection against somebody else just taking their work and using it to make money?

          • reorder9695 a day ago ago

            No, but the current system is totally idiotic. Why not have a fixed timeframe, i.e. 30-50 years, to make money? Life of the author + x years is stupid not only because it's way too long; it keeps going until way after the creator is no longer benefiting, and it can cause issues with works where you don't know who the author is, so you can't cleanly say it's old enough to be out of copyright.

            I'm not sure this would actually change very much for most creators (specifically the smaller ones, who need the most protection). Media typically makes money in its first few years of life, not 70 years on.

            • mallowdram 19 hours ago ago

              The shareholder class would demand rapid-fire exploitation of © the moment it expired, and the resulting media would be a soup of mediocrity. The idea is to recognize that the highly creative have unique imaginations that invent paradigms that propel culture; excluding others from those works for 70+ years is what generates that. Had Lucas gained the rights to Flash Gordon (De Laurentiis beat him to it) he'd never have been forced to create the SW universe. Think about constraints as the path to progress.

              • iterance 18 hours ago ago

                This does not demonstrate a sound understanding of how the public domain works, why copyright lengths have been extended so ferociously over the last century (it's shareholders who want this), nor the impact it has both on creative process and public conversation.

                This is a highly complex question about how legal systems, companies, and individual creatives come in conflict, and cannot be summarized as a positive creative constraint / means to celebrate their works.

                • mallowdram 18 hours ago ago

                  I develop copyright material from the letter and the images that I've both sold to studios and own myself. Copyright lengths are there to prevent the shareholder class from rapid exploitation. Once copyright declines to years not decades, shareholders will demand that be exploited rather than new ideas. The public conversation is rather irrelevant as the layperson doesn't have a window into the massive risk, long-term development required to invent new things, that's how copyright is not a referendum, it's a specialized discourse. Yes the idea of long-term copyright developed under work-for-hire or individual ownership can be easily summarized. License, sample, or steal. Those are the windows.

            • martin-t 13 hours ago ago

              Then the solution is fixing the problem, not removing any protections at all.

              In fact, copyright should belong to the people who actually create stuff, not those who pay them.

          • 1gn15 a day ago ago

            Yes.

            (The "takers" also do not have copyright protection.)

            • martin-t 13 hours ago ago

              So basically the only winners should be:

              - owners of large platforms who don't care what "content"[0] is successful or if creators get rewarded, as long as there is content to show between ads

              - large corporations who can afford to protect their content with DRM

              Is that correct?

              Do you expect it to play out differently? Game it out in your head.

              [0]: https://eev.ee/blog/2025/07/03/the-rise-of-whatever/#:~:text...

            • noduerme a day ago ago

              Great, you've just removed any incentive for people to make anything.

              • JimDabell a day ago ago

                The vast amount of permissively licensed work out there directly contradicts you.

                Even if you take away copyright, there are plenty of incentives to create. Copyright is not the sole reason people create.

                • noduerme a day ago ago

                  Vague. Are you talking about reasons to create like the joy of creating? Your bio describes you as a 'tech entrepreneur', not 'DIY tinkerer'. So I'll assume that when you spend a great deal of time entrepreneuring something, you do so with the hope of remuneration. Maybe not by licensing the copyright, but in some form.

                  Permissive licenses are great in software, where SAAS is an alternative route to getting paid. How does that work if you're a musician, artist, writer, or filmmaker who makes a living selling the rights to your creative output? It doesn't.

                  • JimDabell a day ago ago

                    > Vague. Are you talking about reasons to create like the joy of creating?

                    That’s one of them, but I really don’t have to be specific about the reasons. I just have to point out the existence of permissively licensed works. You said:

                    > Great, you've just removed any incentive for people to make anything.

                    This is very obviously untrue. Perhaps you meant to say “…you’ve just removed some incentives for people to make some things”?

              • ares623 a day ago ago

                It's ok I don't have any talent so that won't affect me

      • musicale a day ago ago

        "Hi, as the company that bragged about how we had ripped off Studio Ghibli, and encouraged you to make as many still frames as possible, we would now like to say that you are making too many fake Disney films and we want you to stop."

        • timschmidt a day ago ago

          Cue an open weights model from Qwen or DeepSeek with none of these limitations.

          • ineedasername a day ago ago

            These attempted limitations tend to be very brittle when the material isn’t excised from the training data, even more so when it’s visual rather than just text. It becomes very much like that board game Taboo where the goal is to get people to guess a word without saying a few other highly related words or synonyms.

            For example, I had no problem getting the desired results when I prompted Sora for "A street level view of that magical castle in a Florida amusement area, crowds of people walking and a monorail going by on tracks overhead."

            Hint: it wasn't Universal Studios, and unless you knew the place by sight you'd think it was the mouse's own place.

            On pure image generation, I forget which model, one derived from stable diffusion though, there was clearly a trained unweighting of Mickey Mouse such that you couldn’t get him to appear by name, but go at it a little sideways? Even just “Minnie Mouse and her partner”? Poof- guardrails down. If you have a solid intuition of the term “dog whistling” and how it’s done, it all becomes trivial.

            • moduspol 6 hours ago ago

              I can get it to do rides at Disney World (including explicitly by name) but it’s incredibly good at blocking superheroes. And that’s gotta be a pretty common prompt, yet I haven’t seen that kind of content in the feed, either.

              And not just by name. Try to get it to generate the Hulk, even with roundabout descriptions. You can get past the initial (prompt-level) blocking, but it’ll generate the video and then say the guardrails caught it.

            • timschmidt a day ago ago

              Absolutely. Though as these things get smarter, and as more layers of additional LLMs get stacked on top playing copyright police, I do expect it to get more challenging.

              My comment was intended more to point out that copyright cartels are a competitive liability for AI corps based in "the west". Groups who can train models on all available culture without limitation will produce more capable models with less friction for generating content that people want.

              People have strong opinions about whether or not this is morally defensible. I'm not commenting on that either way. Just pointing out the reality of it.

              • TeMPOraL a day ago ago

                It's a matter of time. I imagine they'll have more success suppressing activations of specific concepts within the LLM, possibly in real time. I.e. instead of filtering the prompt for "Mickey Mouse" analogies, or unlearning the concept, or even checking the output before passing it to the user, they could monitor the network for specific activation patterns and clamp them during inference.
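
                To make the idea concrete, here's a toy sketch of that kind of activation clamping (my illustration, not anything OpenAI has described): a PyTorch forward hook that projects a hypothetical "concept direction" out of one layer's activations at inference time.

                    # Toy activation-clamping sketch; `concept_direction` is a made-up stand-in
                    # for wherever a concept like "Mickey Mouse" might actually live in a real network.
                    import torch
                    import torch.nn as nn

                    torch.manual_seed(0)
                    hidden = 64
                    layer = nn.Linear(hidden, hidden)        # stand-in for one transformer block
                    concept_direction = torch.randn(hidden)
                    concept_direction /= concept_direction.norm()

                    def clamp_concept(module, inputs, output):
                        # Subtract the component of the activation along the concept direction,
                        # so downstream layers see that concept attenuated.
                        coeff = output @ concept_direction            # per-example projection strength
                        return output - coeff.unsqueeze(-1) * concept_direction

                    layer.register_forward_hook(clamp_concept)

                    x = torch.randn(2, hidden)
                    y = layer(x)
                    print((y @ concept_direction).abs().max())    # ~0: the direction is clamped out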

  • jameslk a day ago ago

    Viacom-suing-YouTube-after-it-used-all-its-IP-as-a-growth-hack vibes

    • nextworddev a day ago ago

      Lol blast from the past. Real Gs remember this.

  • aubanel 21 hours ago ago

    > "We are hearing from a lot of rightsholders who are very excited for this new kind of "interactive fan fiction" and think this new kind of engagement will accrue a lot of value to them, but want the ability to specify how their characters can be used (including not at all)"

    Marvelous ability to convolute the simple message "rightsholders told us to fuck off".

  • brandon272 a day ago ago

    Obviously, OpenAI could have had copyright restrictions in place from the get-go with this, but instead made an intentional decision to allow people to generate everything ranging from Spongebob videos to Michael Jackson videos to South Park videos.

    Today, Sora users on reddit are pretty much beside themselves because of newly enabled content restrictions. They are (apparently) no longer able to generate these types of videos and see no use for the service without that ability!

    To me it raises two questions:

    1) Was the initial "free for all" a marketing ploy?

    2) Is it the case that people find these video generators a lot less interesting when they have to come up with original ideas for the videos and cannot use copyright characters, etc?

    • ronsor a day ago ago

      These video generators are mostly useful for memes at this point, and adding copyright shackles makes them a lot less useful for memeing.

  • piskov a day ago ago

    Broke: cure cancer, new physics, agi, take your jobs, what have you. Please give us a trillion.

    Woke: AI slop tictoc to waste millions of human-hours.

    • noduerme a day ago ago

      You make a good point. They may as well admit at this point that curing cancer, new physics, and AGI aren't going to happen very soon.

      What surprises me a bit is that they'd take this TikTok route, rather than selling Sora as a very expensive storyboarding tool to film/TV studios, producers, etc. Why package it as an app for your niece to make viral videos that's bound to lose money with every click? Just sell it for $50k/hr of video to someone with deep pockets. Is it just a publicity stunt?

      • rsynnott a day ago ago

        > What surprises me a bit is that they'd take this TikTok route, rather than selling Sora as a very expensive storyboarding tool to film/tv studios, producers, etc.

        Because it’s not good enough, I would assume. Hard to see it actually being useful in this role.

      • measurablefunc a day ago ago

        The query data they are collecting can be used for ad targeting. Remember, if you're not paying for it (and in many cases even when you are paying for it) then the data collected from your use of the application is going to be used by someone to make money one way or another. Google made billions from search queries & OpenAI has an even better query/profiling perspective on its users b/c they are providing all sorts of different modalities for interaction, that data is extremely valuable, analogous to how Google search queries (along w/ data from their other products) are extremely valuable to corporate marketing departments that are willing to pay a premium for access to Google's targeting algorithms.

    • amarcheschi a day ago ago

      Almost as if the AGI talk were what a CEO would do to pump the hype of their company as much as possible.

    • eclipticplane a day ago ago

      > AI slop tictoc to waste millions of human-hours.

      Don't forget the power it consumes from an already overloaded grid [while actively turning off new renewable power sources], the fresh water data centers consume for cooling, and the noise pollution forced on low-income residents.

      • amarcheschi a day ago ago

        As a European, I don't know if it's more funny or sad that American citizens close to data centers are effectively subsidizing AI for the rest of the world by paying more for their electricity, since the data centers are mostly there.

    • rsynnott a day ago ago

      Well, yeah, but that stuff was all bullshit, whereas the fake tiktok kind of exists and might keep the all-important money taps on for another six months or so.

  • mallowdram a day ago ago

    It began as a floor wax; now it's a dessert topping.

  • tmaly 15 hours ago ago

    I just heard people were making full-length South Park episodes with Sora 2. But it seems this has now been banned by OpenAI.

  • zarzavat 20 hours ago ago

    Revenue sharing for AI generated videos of characters sounds completely insane.

    I can't tell if this is face saving or delusion.

    • CaptainOfCoit 20 hours ago ago

      As someone who is concerned about how artists are supposed to earn a living in an ecosystem where anyone can trivially copy any style, it does sound better than the status quo?

      The fact that LLMs are trained on humans' data yet the same humans receive no benefit from it (they cannot even use the weights for free, even if they unwillingly contributed to them existing) kind of sucks.

      What alternative is there? Let companies freely slurp up people's work and give absolutely nothing back?

    • sumedh 20 hours ago ago

      It sounds insane to you but sounds completely normal to me.

      Why should AI-generated videos not have revenue sharing?

      In the end, what matters is whether people enjoy the video; it does not matter if it's AI-created or human-created.

  • HypomaniaMan a day ago ago

    Just because something can be done doesn't mean it should be

    • measurablefunc a day ago ago

      The logic is that if they don't do it then Meta or some other company will & they have decided it's better that they do it b/c they are the better, more righteous, & moral people. But the main issue is I don't understand how they went from solving general intelligence to becoming an ad sponsored synthetic media company without anyone noticing.

      • camillomiller a day ago ago

        Oh, we all noticed, but this is a new level of entrepreneurial narcissism and corporate gaslighting. Maybe one day Sam Altman will generally be perceived as who he actually is.

        • measurablefunc a day ago ago

          He is the boy wonder genius who will usher in an era of infinite abundance, but before he does that he has to take a detour to generate a lot of synthetic media & siphon a lot of user queries at every hour of every day so that advertisers can better target consumers w/ their plastic gadgets & snake oils. I'm sure they just need a few more trillions in data center buildouts & then they can get around to building the general purpose intelligence that will deliver us to the fully automated luxurious communist utopia.

  • _fs a day ago ago

    Is it still invite-only? I tried downloading the app to give it a whirl, but apparently you need a code to even open the app.

  • rr808 a day ago ago

    That is my reminder to generate more AI slop to burn through all that VC cash.

    • rhetocj23 a day ago ago

      Someone I know uses ChatGPT a lot. Not because they find it incredibly valuable, but because they want to stick it to the VCs funding OAI and increase their costs with no revenue.

      So this is why you have to be careful about usage numbers. The only truly meaningful number is how many users are contributing towards revenue. Without that, OAI is just a giant money sink.

      • MountDoom a day ago ago

        I suspect this has the opposite effect. More daily users == higher valuation, so more profit if the VCs decide to sell. There's no pressure on OpenAI to become profitable yet.

  • stared 20 hours ago ago

    It is sad (and predictable, PR- and legal-wise) that there was no mention of Studio Ghibli.

    I would actually be moved if there were something genuine along the lines of "We are sorry - we wanted to make a PR stunt, but we went too hard." and they offered real $ for it. (Not that I believe it is going to happen, as GenAI companies do not like this kind of precedent.)

  • rpgbr 17 hours ago ago

    >Second, we are going to have to somehow make money for video generation. People are generating much more than we expected per user, and a lot of videos are being generated for very small audiences.

    Once again, Scam Altman looking for excuses to raise more money. What a joke…

  • tkamado a day ago ago

    The OpenAI dream: replace your job with AI, replace your free time with AI slop?

    • stogot a day ago ago

      And replace rightsholders with “maybe we will try to revenue share… maybe”

      • pants2 a day ago ago

        They also said at one point they'll share their profits with the world as UBI

  • cess11 a day ago ago

    Is this a roundabout way to say that they've realised that people are using their service to make porn of celebrities and fictional characters in the entertainment industry, and aim to figure out a way to keep making money from it without involving "rightsholders" in scandals?

  • kg a day ago ago

    The detail that rightsholders seem to be demanding a revenue share is interesting. That sounds administratively and technologically very complex, and probably also just plain expensive, to implement.

    • minimaxir a day ago ago

      Sam says Sora 2 has to make money but there is no revenue model that can feasibly offset a $4-5 compute cost per video.

      • dwohnitmok a day ago ago

        With some back of the napkin math, I am pretty sure you're off by at least two orders of magnitude, conceivably 4. I think 2 cents per video is an upper limit.

        https://news.ycombinator.com/item?id=45434298

        Generally speaking, API costs that the consumer sees are way higher than compute costs that the provider pays.

        EDIT: Upper limit on pure on-going compute cost. If you factor in chip capital costs as another commentator on the other thread pointed out, you might add another order of magnitude.

        • rafram a day ago ago

          You also aren’t including amortized training costs, which are immense (and likely ongoing as they continue to tweak the model).

          • dwohnitmok 18 hours ago ago

            I suspect amortized training costs are only a relatively small fraction of the amortized hardware costs (i.e. counting amortized hardware costs already accounts for the large fraction of the cost of training and pulling out training as a completely separate category double counts a lot of the cost).

      • nojs a day ago ago

        Where did you get that figure from?

        • minimaxir a day ago ago

          It's more a ballpark since exact numbers vary and OpenAI could be employing shenanigans to cut costs, but for comparison, Veo 3, which produces similar-quality 720p video, costs $0.40/second for the user, and Sora's videos are 10 seconds each. Although Veo 3 could cost Google more or less than what is charged to the user.

          I suspect OpenAI’s costs to be higher if anything since Google’s infra is more cost-efficient.
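
          Spelled out, the back-of-the-envelope arithmetic behind that ballpark (taking the Veo 3 pricing above at face value; these are not OpenAI's actual numbers):

              # Assumed figures from the comparison above, not OpenAI's real costs.
              veo3_price_per_second = 0.40   # USD per second of Veo 3 output, billed to the user
              sora_clip_seconds = 10         # typical Sora clip length

              cost_estimate = veo3_price_per_second * sora_clip_seconds
              print(f"~${cost_estimate:.2f} per clip")  # ~$4.00, the low end of the $4-5 figure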

        • nvr219 a day ago ago

          It was revealed to them in a dream.

    • pxoe 18 hours ago ago

      This "but it's too hard to implement" excuse never made sense to me. So it's doable to make a system like this, to have smart people working on it, hire and poach other smart people, to have payments systems, tracking systems, personal data collection, request filtering and content awareness, all that jazz, but somehow all of that grinds to a halt the moment a question like this arises? and it's been a problem for years, yet some of the smartest people are just unable to approach it, let alone solve it? Does it not seem idiotic to see them serve 'most advanced' products over and over, and then pretend like this question is "too complex" for them to solve? Shouldn't they be smart enough to rise up to that level of "complexity" anyway?

      Seems more like selective, intentional ignoring of the problem to me. It's just because if they start to pay up, everyone will want to get paid, and paying other people is something that companies like this systematically try to avoid as much as possible.

    • martin-t a day ago ago

      This is how all work should be rewarded.

      Workers getting paid a flat rate while owners are raking in the entire income generated by the work is how the rich get richer faster than any working person can.

  • crimsoneer a day ago ago

    So that sounds like they "released" this fully aware it would generate loads of hype, but never ever be legally feasible to release at scale, so we can expect some heavily cut down version to eventually become publicly released?

    • nmfisher a day ago ago

      Feels very much like a knee-jerk response to Facebook releasing their "Vibes" app the week before. It's basically the same thing, OpenAI are probably willing to light a pile of money on fire to take the wind out of their sails.

      I also don't think the "Sam Altman" videos were authentic/organic at all, smells much more like a coordinated astroturfing campaign.

      • ares623 a day ago ago

        Or to distract from the new routing and intent/context detection thing they have going on.

  • CompoundEyes a day ago ago

    I don't have access, but it seems you can insert a friend into a video? Are we not rightsholders to our own likeness? It seems like a person should be able to block a video that someone shares without their consent, or earn revenue if their likeness is used.

    • minimaxir a day ago ago

      You have to explicitly opt into sharing your likeness with permission controls.

      > person should be able to block a video someone shares without their consent

      That is already implemented.

      • solid_fuel a day ago ago

        > You have to explicitly opt into sharing your likeness with permission controls.

        Ok... how is that supposed to work? I don't have an OpenAI account, there are no permission controls for me. Someone else could easily upload a picture of me, no?

        • pants2 a day ago ago

          No, you have to register yourself with a video where you're required to say a unique code.

          So unless you've posted a video of yourself online saying every number from 1 to 99, they won't be able to copy your likeness.

          • kouteiheika 20 hours ago ago

            This seems... pretty easy to get around? There are already open weight models which can take any photo and audio and make a video out of it with the character speaking/singing/whatever, and it runs on normal consumer hardware.

            • pants2 16 hours ago ago

              So you wouldn't know what the three numbers are ahead of time; you'd have to be using a real-time face replacement model (or I guess live-switching between pre-rendered clips) and somehow convince the app that you're the iPhone selfie cam.

              But at that point you might as well just use WAN 2.2 Animate and forget about Sora.

          • solid_fuel a day ago ago

            That's more than I expected from them, genuinely. But it still doesn't seem like a very solid solution. I wonder how much variation in look and voice it accepts?

            My partner likes to cosplay, and some of the costumes are quite extensive. If they want to generate a video in a specific outfit will they need to record a new source video? The problem exists in the other direction, too. If someone looks a lot like Harrison Ford, will they be able to create videos with their own likeness?

            I wonder how this extends to videos with multiple people, as well. E.g. if both my friend and I want to be in a video together.

          • WmWsjA6B29B4nfk a day ago ago

            It's not like making a video of someone saying a number, given a single photo and any voice sample, is a very difficult problem today. We can just fast-forward a few weeks into a world where this "registration" is already broken.

          • noduerme a day ago ago

            So only the heads of companies who lead shareholder meetings are vulnerable to this exploit? Cool.