9 comments

  • acheong08 38 minutes ago

    What does production ready even mean? The problem with AI is that there isn't an obvious way to prove how much human attention/care was actually put in & thus no signal on quality. Nobody is gonna review 1M lines. Also, the 1M line number shouldn't really be a boast. More lines != higher quality or more features

  • verall 13 hours ago

    I dug into this for a minute since I am sort of qualified. I only read through the code; I have not tried to run it.

    The sheer mass of it is incredible, but it's a bit hard to tell what is real and what is AI hallucination.

    Digging in at random, it appears real, but it doesn't smell right for code of this mass. Most of the code seems to exist as described, which is an insane quantity. Everything seems to be what you would get if you asked AI to write it. For example, I looked through the tonemapping, and it implements a small set of textbook tonemapping algos. In code that is really used, I expect to see something more purpose-driven, not sure how better to say it. But an AI asked to "handle tonemapping" would just pick a few literature methods and implement them.
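    To make concrete what "a small set of textbook tonemapping algos" looks like: operators straight from the literature are only a few lines each, which is exactly why they are easy for an LLM to emit. A minimal Rust sketch of two such operators (function names here are illustrative, not taken from the repo):

    ```rust
    /// Reinhard global tone mapping: compresses unbounded HDR
    /// luminance into [0, 1). The canonical "textbook" operator.
    fn reinhard(luminance: f32) -> f32 {
        luminance / (1.0 + luminance)
    }

    /// Simple exponential exposure curve, another common
    /// literature method often implemented alongside Reinhard.
    fn exposure(luminance: f32, ev: f32) -> f32 {
        1.0 - (-luminance * ev).exp()
    }
    ```

    Purpose-driven production code, by contrast, tends to bake in things like display calibration, LUT caching, or vendor-specific color pipelines rather than a grab bag of generic operators.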

    One random pick was "oximedia-gaming", which it said was stable with NVENC support. I wanted to check this out because I've called NVENC from C++ before and it was hard. The NVENC support is a no-op. So that's not quite right.
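    For illustration, a "no-op" hardware-encoder binding of the kind described might look like the following (a hypothetical Rust sketch, not the repo's actual code): the API surface compiles and is callable, but nothing ever reaches NVENC.

    ```rust
    /// Hypothetical stub: looks like an encoder, encodes nothing.
    pub struct NvencEncoder;

    impl NvencEncoder {
        pub fn new() -> Self {
            // A real binding would load the NVENC runtime and open
            // an encode session here; the stub just constructs itself.
            NvencEncoder
        }

        /// Accepts a frame and returns an empty bitstream:
        /// structurally valid, functionally a no-op.
        pub fn encode(&mut self, _frame: &[u8]) -> Vec<u8> {
            Vec::new()
        }
    }
    ```

    A genuine binding has to load the NVENC runtime, negotiate capabilities, create a session, and manage input/output bitstream buffers, which is why a stub like this is easy to spot once you know what the real call sequence looks like.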

    If (and it's a huge if) this code really works as it says, and it's written mostly by LLMs (it appears to be), then this is a huge testament to the value a highly structured environment like Rust provides to the LLM.

    I shudder to imagine the token costs...

  • ghrl 15 hours ago

    I see a single commit adding nearly 2M lines of code with the readme claiming to be a full production-ready project. Am I missing something here?

    Also, since it mentions full WASM support, a web demo would be nice.

    • kitasan 15 hours ago

      Fair question. Development was done in a private repo — we squashed the history into a single commit when open-sourcing. That's a deliberate policy choice (clean public history), though I understand it makes it harder to evaluate the project's evolution. WASM package is already published: https://www.npmjs.com/package/@cooljapan/oximedia — format probing, container demuxing, zero-copy buffers all work in-browser. A live web demo is next on the list.

  • pdyc 15 hours ago

    nice project, but it would be good if you could make other codecs like h.264 etc. optional. It would increase adoption and help in battle-testing the entire framework.

    • kitasan 9 hours ago

      Thanks for the suggestion. H.264 Baseline/High profile patents are largely expiring between 2027 and 2028, with a few stragglers potentially lasting until 2030. We're tracking this closely, and once the patent landscape is clear, we plan to offer H.264/H.265 as an optional feature. In the meantime, AV1 matches or exceeds H.265 in compression efficiency, so for greenfield projects there's little reason to reach for the legacy codecs.

      • pdyc 9 hours ago

        problem is users will still have files in the mp4+h264 combo; we cannot dictate that. but i think i can work around it by using system default codecs via another library.

  • lifis 12 hours ago

    Seemingly written partially or fully with AI, but no disclosure of it at all, and no details like how AI was used, what model was used, the agentic harness and tests, whether and how it was checked by humans, whether you take responsibility for it, etc.?

    WTF?

  • hebetude 15 hours ago

    Pass. One commit, no way to evaluate the evolution of the codebase.