CUPS Remote Code Execution Vulnerability Fix Available

(ubuntu.com)

45 points | by heisenbit 5 days ago

23 comments

  • candiddevmike a day ago

    CUPS and all of the other "root but with chroot" daemons like Postfix use a legacy security model that will hopefully be modernized to use things like namespaces and cgroups. Hopefully this is a wake up call to start pursuing these migrations faster. Right now it's really painful to get Postfix and friends to not run as root, and the maintainers are very hostile towards enabling this behavior.

    • noinsight a day ago

      You can already sandbox services very effectively when you run them under systemd.

      It’s really nice to be able to harden things solely through systemd.
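
      For example, a drop-in created with "systemctl edit cups-browsed.service" can do most of the work. This is only a rough sketch: the user/group names, writable paths, and exact directive set here are illustrative and would need tuning to what the daemon actually requires:

        [Service]
        # Run as an unprivileged user instead of root (names are illustrative)
        User=cups-browsed
        Group=lp
        # Filesystem isolation: read-only system, no home directories, private /tmp
        ProtectSystem=strict
        ProtectHome=yes
        PrivateTmp=yes
        ReadWritePaths=/var/cache/cups /var/log/cups
        # Privilege and kernel hardening
        NoNewPrivileges=yes
        CapabilityBoundingSet=
        ProtectKernelTunables=yes
        ProtectKernelModules=yes
        RestrictAddressFamilies=AF_INET AF_INET6 AF_UNIX
        SystemCallFilter=@system-service

      cups-browsed still needs to bind its UDP socket and talk to cupsd, so some of these may have to be relaxed, but none of it requires patching the daemon itself.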

    • orf a day ago

      > and the maintainers are very hostile towards enabling this behavior.

      Got any links to read more?

  • mjw1007 a day ago

    I remember Ubuntu's decision to abandon its original "no open ports in the default install" policy for the sake of zeroconf/mdns was controversial at the time.

    https://wiki.ubuntu.com/ZeroConfPolicySpec
    https://lists.ubuntu.com/archives/ubuntu-devel/2006-July/019...

  • axoltl a day ago

    Canonical's little jab under the "importance of coordinated disclosure" section rubs me the wrong way. They seem to be under the impression that the recipient of a vulnerability report gets to set the rules, much like when a project receives a bug report. They don't. That power rests solely with the researcher, and they can do as they see fit.

    • hn_throwaway_99 a day ago

      > That power rests solely with the researcher, and they can do as they see fit.

      That's true by definition, but there is still a "right way" and a "wrong way" to disclose if you don't want people to consider you an asshole. To be clear, I'm not familiar enough with this situation to say what happened, but I think it's totally fine for Canonical to call this out.

      • axoltl a day ago

        As a security researcher myself, I can say we've been having the discussion about the "right way" and "wrong way" for a long time. The consensus so far is that there is no consensus, and what the "right way" is changes. When Google Project Zero started doing 90-day disclosure deadlines, many viewed it as the wrong way and irresponsible. Now it's viewed as the only way to do disclosures.

        • suprjami a day ago

          > The consensus so far is that there is no consensus, and what the "right way" is changes.

          Ask your coworkers if the "right way" includes leaking a vuln's existence early to hype it up for internet clout (which lets anyone smart guess the component easily), then breaking your own embargo by publishing a PoC a week earlier than agreed and a week earlier than fixes were supposed to come out.

          If anyone thinks that's the "right way" then suggest they find a job somewhere else. Preferably not in tech at all.

        • hn_throwaway_99 a day ago

          After looking more into what happened, this does not appear to be a case of someone using the wrong fork for the salad course.

          The fact that good people can debate about the right/optimal disclosure procedures doesn't mean that we shouldn't call out easily identifiable cases of crappy behavior.

          • axoltl a day ago

            First off, the salad fork expression is going into my repertoire; it is excellent.

            Second, I will say that evilsocket has a bit of an abrasive and impulsive communication style and that can make for fairly adversarial conversations. Combine that with developers being somewhat defensive around their code/project, and it's a bit of a powder keg waiting to explode.

            I'm not sure there's anything to take away from this or lessons to be learned other than that creating some mental space/distance is a good thing.

      • bigfatkitten a day ago

        For there to be a "right way" and a "wrong way", vendors would have to always do the right thing when notified of a vulnerability, which they only sometimes do.

      • SAI_Peregrinus a day ago

        "Right way" == the implicit duty to the public's safety is followed. For Professional Engineers this duty is explicit in their license. Coordinated disclosure & full disclosure are the usual ways this can happen.

        "Wrong way" == the duty to the public's safety is violated. Selling exploits to ransomware groups or other criminal organizations is one way this can happen.

    • Daviey a day ago

      That really comes with the territory of being an ethical security researcher, particularly when the recipient is welcoming of the disclosure. I'd also argue more so when it is open source.

      Otherwise, why not just drop them as zero days?

      The disclosure methodology is based on the researcher holding the cards and putting pressure on the recipient to resolve the issue in a timely manner.

      What do you feel is the alternative?

    • appendix-rock a day ago

      You’re conflating power and proper practice. “Yeah, well, they’re allowed to” is a comment that’s never, ever worth making, because EVERYONE already knows it. The need to kick up a fuss about “doing whatever I want in my land” always feels like a weirdly American reflex that’s a little ignorant of the context of the actual discussion taking place.

      • axoltl a day ago

        You're making it sound like there is a well-agreed-upon way of disclosing vulnerabilities ("_proper_ practice"). I'm a security researcher and this particular discussion has been going on for well over a decade at this point.

        My point wasn't "they're allowed to do whatever they want". My point was there are many different viewpoints on the matter and Canonical seems to be insisting theirs is the correct one.

        (Also, I'm not American)

        • a day ago
          [deleted]
        • close04 a day ago

          Canonical's statement hits all the right notes. It's a short summary of what coordinated disclosure is and a gentle reminder of why it's important to follow that practice, without really pointing any fingers. I don't see anything in Canonical's actions that suggests they actively harmed the process, nor do I see how the way the early disclosure was handled was in the users' interest.

          You say it "rubs you the wrong way" without pointing out what exactly you think is wrong with it. One could easily understand that as a security researcher you just want to do whatever you want, when you want it, without anyone pointing fingers at you.

          Your attitude that "That power rests solely with the researcher, and they can do as they see fit", while technically correct, exudes bad faith. The further explanation that "there's no consensus" doesn't make it any better.

          • axoltl a day ago

            I want to clarify that I think that everyone involved wants to do the "Right Thing". What I'm arguing about is that there are different schools of thought in the vulnerability research community as to what the "Right Thing" is at any given point in time.

            I am not claiming that a security researcher can just drop 0day on Twitter and not expect some amount of backlash.

            > You say it "rubs you the wrong way" without pointing out what exactly you think is wrong with it.

            Let's step through it:

            >> "Vulnerabilities are normally discussed..."

            This implies there's a "correct" way of handling vulnerabilities, and anything that doesn't conform to this correct way is the "wrong" way.

            >> "Sometimes, information can leak and this has the potential to put users at risk."

            Somewhat of a nothing burger. Information can also potentially help secure users. In my case I immediately shut down the CUPS daemons on all of my boxes. This shortened my exposure window immensely.

            >> "We encourage everyone to consider the greater good."

            This reeks of a 'holier than thou' attitude, implying the disclosure wasn't done with the greater good in mind. I believe evilsocket encountered some headstrong maintainers who had a hard time keeping information about the vulnerabilities secret, and he assessed the risk of a malicious actor discovering the vulnerability to be too great. The greater good - in the opinion of the security researcher - was better served by ensuring the relevant information was publicly disseminated.

            >> "If disagreements come up during disclosure, third-party coordinators, such as CERT/CC’s VINCE, can step in to mediate discussion."

            Disagreements did come up, and Canonical again is subtly claiming they were handled in the "wrong" way as if there is some objective "right" and "wrong" way.

  • cypherpunks01 a day ago

    Attacking UNIX Systems via CUPS, Part I, 2024-09-26 (linked at end of page)

    https://www.evilsocket.net/2024/09/26/Attacking-UNIX-systems...

  • jmclnx a day ago

    This morning I installed the fixed Slackware packages for cups. They became available on Oct 1 at 18:00 UTC:

    >(* Security fix *)

    >patches/packages/cups-filters-1.28.17-x86_64-2_slack15.0.txz: Rebuilt. Mitigate security issue that could lead to a denial of service or the execution of arbitrary code. Rebuilt with --with-browseremoteprotocols=none to disable incoming connections, since this daemon has been shown to be insecure. If you actually use cups-browsed, be sure to install the new /etc/cups/cups-browsed.conf.new containing this line:

    >BrowseRemoteProtocols none

    >For more information, see:

    >https://www.cve.org/CVERecord?id=CVE-2024-47176

  • 4oo4 a day ago

    This is missing important info. Nowhere does it say what the fixed package versions are, which is crucial for auditing whether you are fully patched.

  • a day ago
    [deleted]