23 comments

  • ponywombat 2 days ago

    AWS have also made their own cost analysis MCP server https://github.com/awslabs/mcp/tree/main/src/cost-analysis-m...

    • neuroelectron 2 days ago

      What a great solution. They can leverage this to avoid liability for their own price quotes and solutions while simultaneously adding another layer of vendor lock-in. Synergy.

      • cowsandmilk a day ago

        How does this avoid liability for anything?

        • mcintyre1994 a day ago

          I don't think anyone's really stated this outright, but large companies must believe they're not liable for anything their models/AI products are producing. That must be the case for their business model to work.

        • neuroelectron a day ago

          A better question is how can they be held liable at all.

  • _pdp_ 2 days ago

    On a related note, I'm not sure when it became "ok" to leave production credentials scattered across your system in configuration files. So many MCP server examples encourage this pattern, and inevitably, it's going to cause trouble at some point.

    • Game_Ender a day ago

      What is your preferred way to manage them?

      • nel-vantage a day ago

        Vantage engineer who worked on this feature here. The security posture of MCP servers is still in its early stages (see “The ‘S’ in MCP Stands for Security” from three weeks ago [https://elenacross7.medium.com/%EF%B8%8F-the-s-in-mcp-stands...]). The recommendations above to use something like the 1Password CLI wrapper when invoking an MCP server seem sound.

        That being said, an easier-to-distribute user experience would be to leverage short-lived OAuth tokens that LLM clients such as Claude or Goose ultimately manage for the user. We’re exploring these avenues as we develop the server.
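
        For example, a sketch of what the wrapper approach looks like today with the 1Password CLI (the vault item path "op://Private/Vantage/credential" is purely illustrative; "op run" resolves op:// secret references in the environment before launching the child process):

            "vantage-mcp-server": {
              "command": "op",
              "args": ["run", "--", "/opt/homebrew/bin/vantage-mcp-server"],
              "env": {"VANTAGE_BEARER_TOKEN": "op://Private/Vantage/credential"}
            }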

      • devenjarvis a day ago

        The 1pass CLI is great! However, if you aren’t using 1Password as your secrets vault, I’m building an open source, vault-agnostic alternative called RunSecret (https://github.com/runsecret/rsec)

        • mdaniel a day ago

          You may want to do your own Show HN about it, so folks don't have to be "MCP curious" to find out that it exists.

          That said, given https://github.com/runsecret/rsec#aws-secrets-manager, presumably in order to keep AWS credentials off disk one would then need something like this?

              "vantage-mcp-server": {
                "command": "/opt/homebrew/bin/aws-vault",
                "args": [
                "exec", "--region=us-east-1", "my-awesome-profile",
                "--", "/opt/homebrew/bin/rsec", "run",
                "--", "/opt/homebrew/bin/vantage-mcp-server"
                ],
                "env": {"VANTAGE_BEARER_TOKEN":  "rsec://012345678912/sm.aws/VantageBearerToken?region=us-east-1"}
              }
          
          in contrast to the op binary, which is just one level of indirection, since it already handshakes with the desktop app for $(op login) purposes.

          • devenjarvis a day ago

            This is great feedback, thank you!

            I agree RunSecret adds a level of indirection at this stage that op doesn’t (if you are using 1pass). This is something I plan to polish up once more vaults are supported. You’ve given me some ideas on how to do that here.

            And thanks for the advice on doing a Show HN, planning to do so once a few more rough edges are smoothed out.

      • klooney a day ago

        If your service is running in AWS, use the native IAM identity loading from ECS, EKS, EC2, etc. If it's your laptop, set AWS_PROFILE and let the SDK load the temporary creds from ~/.aws.

        If you really really really need to use static creds on your laptop, use aws-vault to export them, or ephemeral creds generated from them, into your environment.
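
        Concretely (the profile name here is illustrative):

            # Laptop: the SDK resolves temporary creds for the named profile
            # from ~/.aws automatically
            export AWS_PROFILE=my-profile
            aws sts get-caller-identity

            # Static keys: let aws-vault mint short-lived session creds and
            # inject them into the child process's environment only
            aws-vault exec my-profile -- vantage-mcp-server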

      • ivanvanderbyl a day ago

        1Password’s CLI op does a reasonably good job of this

  • mdaniel 2 days ago

    How does this work? https://github.com/vantage-sh/vantage-mcp-server?tab=License...

    That is extra weird when thinking about the audience who might be Vantage.sh users (and thus have the ability to create the read-only token mentioned elsewhere) but would almost certainly be using it from their workstation, in a commercial context. Sounds like you're trying to keep someone from selling your MCP toy and decided to be cute with the licensing text

    • bluck 2 days ago

      I'm just trying to understand licenses, but doesn't the choice of MIT contradict the initial "non-commercial purposes" clause? MIT says 'including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software'. Doesn't that make the non-commercial restriction void, so I can use the software to the limits MIT defines? And because it is already MIT, can they relicense only future releases, not this one?

      • skwashd 2 days ago

        IANAL but that is my understanding. I created an issue to get some clarification - https://github.com/vantage-sh/vantage-mcp-server/issues/31

      • ddingus a day ago

        You have a couple of problems that I can see:

        One is that the MIT license does not prohibit selling, and wrapping it in a "for non-commercial uses" clause creates a contradiction that is difficult, if not impossible, to enforce.

    • globalise83 a day ago

      So if I want to use the software I just have to create a fork on my home machine for non-commercial purposes, update the license to MIT only, and then the fork is mine to do with as I want commercially? What's even the point of this license?

  • andrenotgiant 6 days ago

    What's the difference between connecting an LLM to the data through Vantage vs directly to the AWS cost and usage APIs?

    • StratusBen 6 days ago

      A few things.

      The biggest is giving the LLM context. On Vantage we have a primitive called a "Cost Report" that you can think of as being a set of filters. So you can create a cost report for a particular environment (production vs staging) or by service (front-end service vs back-end service). When you ask questions to the LLM, it will take the context into account versus just looking at all of the raw usage in your account.

      Most of our customers will create these filters, define reports, and organize them into folders and the LLM takes that context into account which can be helpful for asking questions.

      Lastly, we support more providers beyond AWS, so you can merge in other associated costs like Datadog, Temporal, ClickHouse, etc.

  • cat-whisperer 6 days ago

    Is this going to be different, as resources end up getting intertwined? Or is there a way to standardize it?

  • salynchnew a day ago

    This is dangerous, as I've long held that inscrutable cloud billing was one of the greatest protections we had against a runaway superintelligent AI.

    Now we only have poor IAM UX to fall back on.

    /s