It's not "Open".
I wish this had another name.
I love the idea of fire and forget to a central AI aggregator. This one for LLMs, Fal for image and video. You don't need to juggle dozens of API keys or API integrations.
But these aren't open, despite it being in the name.
I want the ability to pull it all on-prem if I so choose.
I probably won't. But if I want to, I should be able to.
Don't use the name "Open" like this.
OpenAI, OpenRouter, OpenArt... Just stop already.
This isn't even fair source, let alone open. It's totally closed and opaque.
I'm half minded to make an "ActuallyOpenRouter".
(I know there are local routers, but I want hosted and managed with the ability to self-manage.)
Should make a "ClosedRouter", "ClosedAI", "ClosedArt" instead, but be actually open.
While it doesn't solve everything (such as a single bill), I implemented borgllm[0][1] to get some of that convenience and development speed with no lock-in or even extra infra.
Perhaps you’ll find it interesting.
0: https://github.com/omarkamali/borgllm
1: https://borgllm.com
Use Hugging Face inference providers for this: https://huggingface.co/docs/inference-providers/en/index
At this point, I'm starting to read the "Open" prefix on any product/project as almost a signal of the opposite. Projects that actually care about some sort of openness (the way I see it, at least) don't usually prefix their name with it; they just operate openly and make sure it stays that way.
See also: "Smart"
The founder named it OpenRouter because his previous startup was OpenSea.
Some founders choose a prefix, then tend to use it everywhere :)
Kometimes Kreally annoyinK.
Had to look up what OpenSea is. NFTs. So from one hype (and arguably scam) involving GPUs to the next one.
Coming Soon section says 'WhyLabs', weird, isn't WhyLabs shutting down?
Congrats OpenRouter on the launch! I'm a big fan of this pattern. We do the same in Svix for our customers, but also we now make it easy for our customers to do the same.
For other companies looking to build something similar, you can use Svix Stream[1] that offers a lot of these integrations out of the box, with more coming.
1: https://www.svix.com/stream/
This is indeed such an awesome feature. I really hope we see lots of other products & services offer to send you your own traces!!
Also, such an integration really ought to respect & add to any trace propagation contexts passed in. Fingers crossed that works well here! It's not explicitly mentioned, so I'm nervous, but the product really needs trace propagation.
I'm sure that if they don't already support it, they will add it. TBH, we didn't have it at first either; we then added both that and custom attributes once we realized how varied people's observability stacks really are!