Chips for the Rest of Us

(engineering.nyu.edu)

56 points | by hasheddan a day ago

21 comments

  • dmbche a day ago

    " chip design cycle is also probably one of the most complicated engineering processes that exists in the world,” said Institute Professor Siddharth Garg (ECE). “There’s a saying that rocket science is hard, but chip design is harder.” "

    Why not Rockets for the rest of us first, if that's easier?

    • gsf_emergency_6 a day ago

      Rocket science is hard because you have to burn your own cash if you are not connected. Chip design is less risky for the individual, but it's been harder (so far) to signal your mastery to the funders.

      The difficulty is not (entirely) technical.

    • wlesieutre a day ago

      There are already lots of rockets for the rest of us; they're just not as big.

  • christoff12 9 hours ago

    Are they Better Made?

    (this is a joke [1])

    ---

    - [1] https://bettermade.com/product-category/potato-chips/

  • __rito__ a day ago

    Link to the BASICS course mentioned: https://engineering.nyu.edu/academics/programs/digital-learn...

    Link to the Zero to ASIC course that they are collaborating with: https://www.zerotoasiccourse.com/digital/

    I wish for free alternatives to these.

  • gsf_emergency_6 a day ago

    The uni PR wasn't bad faith, just bad placement. The source is here:

    https://github.com/shailja-thakur/VGen

    Earlier work from NYU (2023):

    https://zenodo.org/records/7953725

    Related (?) blog post (2023):

    https://01001000.xyz/2023-12-21-ChatGPT-AI-Silicon/

  • fleshmonad a day ago

    We have textual slop, visual slop, audio slop, so we asked: "What else do we want to sloppify?" And then it dawned on me. ICs. ICs haven't been slopped yet. Sure, we could ask the machine to generate some VHDL, but that isn't the same. So we present: Silicon Slop.

    I am actually astonished. Is this what happens when the NYU board of directors tells every department it has to use and create AI or lose its funding? What is going on?

    • carlCarlCarlCar a day ago

      Ah, thanks; we definitely needed more artisanal, real human social media slop like this.

      Improving the lived experience, keeping it real! Feels so much more authentic.

      More people would love AI if it communicated like an emo *Nix elitist. Train it on Daria, Eeyore, and grunge lyrics! People will love it!

  • stogot a day ago

    > To address this challenge, Garg and colleagues scoured Verilog code on GitHub and excerpted content from 70 Verilog textbooks to amass the largest AI training dataset of Verilog ever assembled. The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.

    I expect this will become the norm in a number of fields. Perhaps COBOL is next?
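
    For anyone curious what "trained solely to generate Verilog" looks like mechanically, here's a rough sketch with HuggingFace. The base model, corpus layout, and hyperparameters are my guesses, not VeriGen's actual recipe:

        from datasets import load_dataset
        from transformers import (AutoModelForCausalLM, AutoTokenizer,
                                  DataCollatorForLanguageModeling, Trainer,
                                  TrainingArguments)

        base = "Salesforce/codegen-350M-multi"  # stand-in base model
        tok = AutoTokenizer.from_pretrained(base)
        tok.pad_token = tok.eos_token  # GPT-style tokenizers ship without one
        model = AutoModelForCausalLM.from_pretrained(base)

        # Assume the scraped GitHub/textbook corpus was flattened to text files.
        ds = load_dataset("text", data_files={"train": "verilog_corpus/*.txt"})
        ds = ds.map(lambda b: tok(b["text"], truncation=True, max_length=1024),
                    batched=True, remove_columns=["text"])

        trainer = Trainer(
            model=model,
            args=TrainingArguments(output_dir="verigen-ish", num_train_epochs=1,
                                   per_device_train_batch_size=2),
            train_dataset=ds["train"],
            data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
        )
        trainer.train()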

    • fancy_pantser 21 hours ago

      > The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.

      Perhaps it's the first open one. I was an eng manager at a hyperscaler, helping one of our clients, a large semiconductor design company, build models for internal use. They were trained on the client's extensive Verilog repos, tooling, and strict style guides. I see this being repeated across industries: since at least 2023, quite a few deep-pocketed S&P 500 orgs have been creating models from scratch or extensively fine-tuning them for the unique advantages they require. The projects are rarely announced as such, but you can often infer from initial investment or partnership announcements that they're underway.
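
      The unglamorous part is usually the curation, not the training. A sketch of the kind of filter pass I mean, with illustrative paths (real pipelines also dedupe and enforce the style guide):

          import subprocess
          from pathlib import Path

          keep = Path("clean_corpus")
          keep.mkdir(exist_ok=True)

          # Keep only Verilog files that at least parse and pass lint;
          # `verilator --lint-only` checks sources without building a model.
          for i, src in enumerate(Path("mirrored_repos").rglob("*.v")):
              result = subprocess.run(
                  ["verilator", "--lint-only", "-Wall", str(src)],
                  capture_output=True,
              )
              if result.returncode == 0:
                  (keep / f"{i:06d}_{src.name}").write_text(
                      src.read_text(errors="ignore"))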

  • alecco a day ago

    > Consequently, the NYU researchers’ goal is to make chip design more accessible, so nonengineers, whatever their background, can create their own custom-made chips.

    What?

    • Aloisius a day ago

      "ChatGPT: Please design a chip for me."

      Basically.

      • kingstnap 18 hours ago

        Unironically, that's what industry is trying to do. I saw a slide with basically exactly that written on it some time ago at a conference (MLCAD).

    • bigyabai a day ago

      I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

      A lot of my questions went away when I got to this line though:

      > He’s also fully engaged in the third leg of the “democratizing chip design” stool: education.

      This is a valiant effort. Chip design is a hard world to break into, and many applications that could benefit from ASICs never get iterated on or tested because the process sucks. It's a lot of work to bring that barrier down, but as a programmer I can see how an LLVM-style intermediate representation layer could help designers get up and running faster.
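
      For a concrete flavor of that idea: Python-embedded HDLs like Amaranth already sit one level above Verilog and emit it as a backend target. The toy counter below is my own sketch, not anything from the article:

          from amaranth import Elaboratable, Module, Signal
          from amaranth.back import verilog

          class Counter(Elaboratable):
              def __init__(self, width=8):
                  self.count = Signal(width)

              def elaborate(self, platform):
                  m = Module()
                  # one synchronous rule; clocking/reset come from the toolchain
                  m.d.sync += self.count.eq(self.count + 1)
                  return m

          top = Counter()
          print(verilog.convert(top, ports=[top.count]))  # emits plain Verilog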

      • charlie-83 a day ago

        Isn't HDL basically the intermediate representation you want? Plus, you can learn it with a simulator or an FPGA dev board, which makes it reasonably accessible.
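
        For instance, with Amaranth the "dev board" can be pure software while you learn. A rough sketch against the ~0.4 simulator API (which does shift between versions):

            from amaranth import Elaboratable, Module, Signal
            from amaranth.sim import Simulator

            class Counter(Elaboratable):
                def __init__(self):
                    self.count = Signal(4)

                def elaborate(self, platform):
                    m = Module()
                    m.d.sync += self.count.eq(self.count + 1)
                    return m

            dut = Counter()

            def bench():
                for cycle in range(5):
                    yield  # advance one clock
                    print(cycle, (yield dut.count))

            sim = Simulator(dut)
            sim.add_clock(1e-6)  # 1 MHz virtual clock
            sim.add_sync_process(bench)
            sim.run()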

        • tormeh a day ago

          All I remember from my experience with VHDL/Verilog is that they really truly suck.

      • bsder a day ago

        > I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

        That's because we don't need more digital. Digital transistors are effectively free (to a first approximation).

        The axes where we need more involve analog and RF. Lower power consumption, better RF speed/range, higher-speed PCIe, etc. all require messy analog and RF design. Those are the expensive tools, and they're also the complex tools that require genuine knowledge.

        Now, if your AI could deliver analog and RF, you'd make a gazillion dollars. The fact that everybody knows this and still hasn't pulled it off should tell you something.

        • tormeh a day ago

          Would you really earn more money doing this than monopolizing online search advertising? Because I find that hard to believe. Hardware seems like a miserable business.

          • pesfandiar a day ago

            That might change if geopolitical tensions fragment the global supply chains.

          • bsder a day ago

            Being a fab is a garbage business.

            Being a software supplier to fabless semiconductor companies is a very profitable business.

            In the Gold Rush, the people who came out rich were the ones selling shovels and denim.

  • bgwalter a day ago

    A Bootstrap framework for chips, with Verilog stolen from textbooks and GitHub.