Enable CORS for Your Blog

(blogsareback.com)

95 points | by cdrnsf 4 days ago

45 comments

  • andai a day ago

    In uni I built a simple web scraper in JavaScript. It just ran in the browser. It would fetch a page, extract the links, then fetch those pages.

    You could watch it run in realtime and it would build out a tree of nested links as it crawled. It was a lot of fun to watch! (More fun than CLI based crawlers for sure.)

    The only issue I had was not being able to fetch pages from the front end due to CORS, so I just added a "proxy" to my server in 1 line of PHP. There, now it's secure? ;)
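
    A minimal sketch of what such a proxy looks like, here in Node.js rather than PHP (the endpoint shape is assumed, not the commenter's actual code): the server fetches the target URL itself, where the same-origin policy doesn't apply, and relays the body with a permissive CORS header so front-end code can read it.

    ```javascript
    // Hypothetical one-file CORS proxy (Node 18+, no dependencies).
    const http = require('http');

    // Headers the proxy attaches so any origin may read the relayed body.
    function corsHeaders() {
      return { 'Access-Control-Allow-Origin': '*' };
    }

    const server = http.createServer(async (req, res) => {
      // Expect requests of the form /?url=https%3A%2F%2Fexample.com%2F
      const target = new URL(req.url, 'http://localhost').searchParams.get('url');
      if (!target) {
        res.writeHead(400, corsHeaders());
        return res.end('missing ?url= parameter');
      }
      try {
        const upstream = await fetch(target); // server-side fetch: no SOP here
        res.writeHead(upstream.status, corsHeaders());
        res.end(await upstream.text());
      } catch (err) {
        res.writeHead(502, corsHeaders());
        res.end(String(err));
      }
    });

    // server.listen(8080);
    ```

    Like the PHP one-liner, this cheerfully forwards requests to anything, so in practice you'd want to restrict which target hosts it accepts.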

    • andai 6 hours ago

      Edit: Guys! I'm talking about a cool scraper running in the browser (has that ever been done before?), and everyone's hyperfixating on bloody CORS.

      Well, I guess I failed the Show Don't Tell, eh :) I lost the code when I left uni, so I might have to make another one...

    • hansvm a day ago

      CORS protects the browser's knowledge of a user on your website from other websites. It's a big failing of the protocol and/or communications about it that people think it offers any more security guarantees than that.

    • ehutch79 a day ago

      If you know what a user agent is, let alone how to change it, CORS is not meant for you.

      It's guard rails to help the tech-illiterate not get hacked. It raises the bar on what gets through. It's not going to stop a determined attacker, but it will catch enough to make a dent. Defense in depth and all that.

      • lxgr a day ago

        CORS (or rather the same origin policy, of which CORS is an explicit server-side opt-out) is not a generic security improvement, it solves a very specific problem: (Code on) website A being able to make requests to website B with the cookies of B (often implying user login state/authentication at B) and read the response.

        In a (possibly better) parallel universe, cross-site requests just don't send cookies or other ambient authentication state like that by default, and we wouldn't need CORS.
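
        The read-side check can be sketched as a toy predicate (a simplification, not the browser's actual algorithm): cross-origin JS only gets to read the response if the server explicitly named the requesting origin or opted out entirely.

        ```javascript
        // Toy model of the CORS read check layered on the same-origin policy.
        function canReadCrossOrigin(requestOrigin, allowOriginHeader) {
          // No header: the same-origin policy wins, the response stays opaque.
          if (!allowOriginHeader) return false;
          // '*' opts out for everyone (only honored for non-credentialed requests).
          if (allowOriginHeader === '*') return true;
          // Otherwise the header must name the requesting origin exactly.
          return allowOriginHeader === requestOrigin;
        }
        ```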

      • rnghrhfhf a day ago

        It sounds like you need to go back to school because you’re entirely

    • patmorgan23 21 hours ago

      Because the Same-Origin Policy is a client-side security measure. It's there to protect the user from random malicious JS out on the web. If you're writing server-side code, you're outside the scope of who the security model is trying to protect.

    • binaryturtle a day ago

      That's rather like "gnu.org", which blocks you when you're using a slightly older browser, but when you change your user agent to "curl" it magically starts working. Or the German news site "spiegel.de", which also blocks old browsers from accessing the site entirely unless you change the user agent to "bingbot" (or some other random bot from their whitelist). *insert a facepalm emoji here*

      • mrweasel a day ago

        They probably do that to keep out the dumbest scrapers and bots. They often present themselves as out of date Chrome browsers.

        • kstrauser a day ago

          Nailed it. I get a zillion bot hits claiming to be Safari for iOS 8 and the like.

          Sorry if you’re trying to visit my site on such a device.

    • nisegami a day ago

      >There, now it's secure? ;)

      CORS achieved its objective here, because now those requests were coming from your PHP server rather than the user's browser.

  • hansvm a day ago

    I really appreciate how out of all the security models they could've chosen, we ended up with the one which prevents you from writing better client-side frontends for incumbents or otherwise participating in a free and open ecosystem, while simultaneously being too confusing to use securely without a fair amount of explicit coaching and extremely careful code review for literally 100% of the junior devs I've met.

    TFA is just a manifestation of the underlying problem. You thought you were publishing your thoughts to a world wide web of information, but that behavior is opt-in rather than opt-out.

    • throawayonthe a day ago

      i haven't heard much about alternative (proposed?) security models for the web, do you have any resources?

  • travisvn a day ago

    Hey folks, I'm the developer working on Blogs Are Back. WakaTime has me clocked in at over 900 hours on this project so far...

    If CORS weren't an issue, it could've been done in 1/10th of that time. But if that were the case, there would've already been tons of web-based RSS readers available.

    Anyway, the goal of this project is to help foster interest in indie blogs and help a bit with discovery. Feel free to submit your blog if you'd like!

    If anyone has any questions, I'd be happy to answer them.

    • chrismorgan a day ago

      > style="opacity:0;transform:translateY(20px)"

      In my opinion, that’s a bigger problem than CORS. Proxyless web feed reader is a lost cause, you’re wasting your time because only a small minority are ever going to support it. But that opacity and transition nonsense gratuitously slows down page loading for everyone, and hides content completely for those that aren’t running JS.

      (What I would also like to know is: how come this is the third time I’ve seen exactly this—each block of content having this exact style attribute—in the past month, when I don’t remember encountering exactly it before?)

      • travisvn a day ago

        The entire web app is JS based. It's a requirement I'm ok with.

        And to answer your question, you're seeing that kind of styling so frequently because it's likely part of Framer Motion, an extremely popular animation library:

        https://www.npmjs.com/package/framer-motion https://www.npmjs.com/package/motion

        • Orygin a day ago

          Would also be great if the animations respected the `prefers-reduced-motion` setting, instead of forcing down animations that reduces accessibility.
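
          For what it's worth, honoring the setting is nearly a one-liner in the consuming code (a sketch; `entranceStyle` is a hypothetical helper, not Framer Motion's API): skip the fade/translate entrance entirely when the user asks for reduced motion.

          ```javascript
          // Return the entrance style for a content block, respecting the
          // user's prefers-reduced-motion setting. In the browser, pass
          // window.matchMedia('(prefers-reduced-motion: reduce)').matches.
          function entranceStyle(prefersReducedMotion) {
            return prefersReducedMotion
              ? {} // render immediately, no animation
              : { opacity: 0, transform: 'translateY(20px)' }; // animate in via JS
          }
          ```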

        • mzajc a day ago

          Is the website machine generated? Besides the hard-dependency on JavaScript, this also causes the exact same problem I've seen on another[1] machine generated site: https://postimg.cc/TyMBfVZ6, https://postimg.cc/n9j1X5Dk. This happens randomly on refresh on Firefox 148.0-1.

          Is the fade effect really worth having parts of your site disappear at random?

          [1] https://news.ycombinator.com/item?id=46675669

        • trick-or-treat a day ago

          I think cooler heads will agree that a middle ground where the content is available on the initial request is best. But what do I know /s

      • pianom4n 19 hours ago

        Seriously. This page is terrible, with multiple annoying rendering delays, and I'm supposed to care about helping their RSS feeds load faster?

      • elliotbnvl a day ago

        This is something Opus 4.6 likes to generate a LOT for some reason.

    • rglullis a day ago

      Hey, this is very interesting! As someone working on an extension that works as an ActivityPub client, I don't have to deal with CORS issues so much (most servers configure CORS properly, and the extension can bypass CORS issues anyway) but I just spent a good chunk of my weekend working on a proxy that could deal with Mastodon's "authorized fetch".

      So, basically, any URI that I need to resolve first tries a direct fetch, and falls back to making the request through the proxy if I get any kind of authentication error.
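
      That fallback strategy might look roughly like this (a sketch; the proxy URL and function names are made up, not the extension's actual code):

      ```javascript
      // Decide whether a direct fetch result warrants retrying via the proxy.
      // `result` is {status} for a completed response, or {error: true} when
      // the request was blocked outright (CORS/network failures throw).
      function shouldFallBack(result) {
        if (result.error) return true;
        // 401/403 covers rejections like Mastodon's "authorized fetch".
        return result.status === 401 || result.status === 403;
      }

      // Hypothetical resolver: direct fetch first, proxy on auth errors.
      async function resolveUri(uri, proxyBase = 'https://proxy.example/?url=') {
        try {
          const direct = await fetch(uri);
          if (!shouldFallBack({ status: direct.status })) return direct;
        } catch (_) {
          // blocked before any response arrived; fall through to the proxy
        }
        return fetch(proxyBase + encodeURIComponent(uri));
      }
      ```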

    • freetonik a day ago

      Hey! Blogs Are Back is cool! Nice to see more modern RSS readers, and also thematic blog collections. If you seek more curated blogs to share with your users, check out my project https://minifeed.net/

    • moebrowne a day ago
    • Klonoar a day ago

      You need to put a screenshot of the app on your page.

    • m_sahaf a day ago

      How can someone add platforms to the guide? I want to add Caddy

  • mike-cardwell a day ago

    I have done this. I also relaxed my Cross-Origin-Embedder-Policy header - https://developer.mozilla.org/en-US/docs/Web/HTTP/Reference/...

  • arjie a day ago

    Huh, that's a pretty interesting request. And it makes sense to me. I've enabled it on my RSS feed. I wanted to see if I could add my blog feed to it to test but when I went to do so I had to install a Chrome extension on your app to do it. All right, if someone wants my blog for whatever reason that badly, they can now do it.

  • RandomGerm4n a day ago

    It's really cool that you can simply get the full text from sites that refuse to offer the entire text in their RSS feed, without having to go to their site. However, there are a few things that don't work so well. When you add feeds from YouTube, the video is not embedded. Even if the feature is out of scope, it would be good if the title and a link to the video were displayed instead. Bluesky posts also lack their embedded content.

    Furthermore, a maximum of 100 feeds is clearly not enough. If you add things like YouTube, Reddit, Lemmy, Bluesky, etc., you reach the limit very quickly. Even if this is not content that you actually read in the reader, it would be annoying to have two different RSS apps just for that reason.

  • alessivs a day ago

    Let's get this behavior to be the default in Wordpress and Laravel for public sections—that would cover a lot of ground. I regularly encounter and suffer from unmodified instances of Laravel's default session cookie timeout of 120 minutes. If a more relaxed CORS policy were the default, it wouldn't be an inconvenience and would likely be just as widespread.

  • impure a day ago

    I have noticed some sites block cross origin requests to their feeds. It’s annoying but I just use a server now so I don’t care. I very much recommend RSS readers to use a server as it means you get background fetch and never miss a story on sites with a lot of stories like HN.

    • notpushkin a day ago

      From the linked post, I think the point of fetching it in-browser is so that your subscriptions stay private. Idk why this is desirable, but if people want it, it’s nice to give them the option.

  • dmje a day ago

    Isn't the core audience of BaB people who already have a feed reader? I know I am.

    • trvz a day ago

      People who still use RSS aren’t ever perfectly happy with their RSS feed reader.

  • hvb2 a day ago

    This feels like such a weird ask?

    Why would anyone do this, so their content can be easily read elsewhere potentially with a load of ads surrounding it?

    This seems to really reason through only the happy path, ignoring bad actors, and there'll always be bad actors.

    • sheept a day ago

      If a malicious website wanted to copy a blog's website to put ads on it, they already can just copy it outside of the browser on their end, which has the "benefit" of preventing the original blog from taking the post down.

      CORS also doesn't prevent a popular website with a personal vendetta[0] against a blogger from DDOSing the blog with their visitors, since CORS doesn't block requests from being sent.

      For a purely static website, there shouldn't be any risk from enabling CORS.

      [0]: https://news.ycombinator.com/item?id=46624740

    • ef2k a day ago

      To be fair, they do explain their motivation. It's an in-browser RSS reader, so it's fetching the RSS feed directly without a proxy server. There's not much risk since the content is public and non-credentialed. The bigger risk is misconfiguring CORS and inadvertently exposing other paths with the wildcard.
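
      One way to avoid that misconfiguration is to attach the opt-out header only on the feed paths (a sketch; the path list is illustrative):

      ```javascript
      // CORS headers scoped to public, non-credentialed feed paths only;
      // every other path keeps the same-origin default.
      function corsHeadersFor(pathname) {
        const publicFeeds = ['/feed.xml', '/rss.xml', '/atom.xml'];
        return publicFeeds.includes(pathname)
          ? { 'Access-Control-Allow-Origin': '*' }
          : {};
      }
      ```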

    • onion2k a day ago

      > This seems to really reason through only the happy path, ignoring bad actors, and there'll always be bad actors.

      True, but the bad actors can defeat any security mechanism you put in place with a proxy, or a copy'n'paste, so the downside risk is pointless worrying about. The upside of allowing traffic is that your content that you presumably want people to read can be read by more people. For all but the most popular blogs that's probably a net benefit.

    • bigstrat2003 a day ago

      Also, why would an RSS reader be a website? An application installed on your PC is superior in every way.

      • staticassertion a day ago

        I couldn't feel more strongly in the other direction. The fewer programs running on my computer, the better. By far my preference is that "random dev code" gets placed into the strongest possible sandbox, and that's the browser.

      • mr_mitm a day ago

        With a website you get shared state (these days many people are using multiple devices), platform independence and sandboxing for free. Plus custom CSS and tamper scripts for customization, browser addons, bookmarks, an API for other applications to consume the content, and probably more.

      • socalgal2 a day ago

        Um, no? The most popular RSS reader back when RSS readers were a thing was Google's. It was a website. And why not? Like other websites, you can log in from any device that has a browser and immediately pick up where you left off, including work machines where you aren't allowed to install native apps.

      • adhamsalama a day ago

        So, about that...That's how I read RSS feeds on my Kindle.

        https://github.com/adhamsalama/simple-rss-reader

    • trick-or-treat a day ago

      [dead]

  • a day ago
    [deleted]