17 comments

  • spaldingcactus 2 days ago

    Alternative signin methods?

    • shahabebrahimi 2 days ago

      Unfortunately no. Google Auth was the easiest method for me to implement. Your data remains private.

      • esperent a day ago

        It's understandable, but I do have to say: all that initial beautiful prose on a black screen, several pages of it... and then a big white "Sign in with Google" button completely undercuts the message. I notice I had an almost visceral reaction to that. Maybe you can present it better somehow?

        • pixel_popping a day ago

          I felt exactly the same! I was absolutely "marketed" right up to the last frame, then decided to drop off because of this. Please, OP, add a regular signup method that doesn't involve a third party.

        • shahabebrahimi a day ago

          Fair point. I'll fix it.

  • sliamh11 a day ago

    This is fascinating. Are the 'moments' pre-defined or generated? What's the LLM behind it and what's the macro-level architecture?

  • dnnddidiej a day ago

    Jason is quiet for now, reflecting on your words.

    Should be ready to talk in 23h 58m

    Cute 429!

    • shahabebrahimi a day ago

      Do you think the limit is too strict? The background LLM calls are actually quite expensive.

      • dnnddidiej 12 hours ago

        Don't know. For a game that you don't want to be addictive, I think it's a good idea.
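The cooldown behavior quoted above (the character goes quiet after a conversation, the app answers with a 429, and a "ready to talk in 23h 58m" countdown) could be implemented as a simple per-user timer. This is a hypothetical sketch, not the app's actual code; all names here (`TalkCooldown`, `check`) are assumptions.

```python
import time

COOLDOWN_SECONDS = 24 * 60 * 60  # roughly the "23h 58m" window shown in the app

class TalkCooldown:
    """Per-user daily cooldown; a blocked check maps to an HTTP 429 response."""

    def __init__(self, cooldown=COOLDOWN_SECONDS):
        self.cooldown = cooldown
        self.last_talk = {}  # user_id -> timestamp of last conversation

    def check(self, user_id, now=None):
        """Return (allowed, retry_after_seconds).

        If the user talked within the cooldown window, deny the request and
        report how long until they may talk again (a Retry-After value).
        Otherwise record this conversation and allow it.
        """
        now = time.time() if now is None else now
        last = self.last_talk.get(user_id)
        if last is not None and now - last < self.cooldown:
            return False, int(self.cooldown - (now - last))
        self.last_talk[user_id] = now
        return True, 0

limiter = TalkCooldown()
ok, retry = limiter.check("user-1", now=0)         # first conversation: allowed
blocked, wait = limiter.check("user-1", now=120)   # 2 minutes later: denied, wait ~24h
```

A server would turn the denied case into a `429 Too Many Requests` with a `Retry-After` header, which matches the countdown message the commenter saw.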

  • fcpguru 2 days ago

    this is really great. I've been thinking about building something like this for a while now. well done.

    • shahabebrahimi 2 days ago

      Happy to hear that. Please try it for a few days. You can give feedback in the app.

  • _wire_ 2 days ago

    Isn't it the case that everything pours from the user's container into the remotes to make this work?

    Is it also the case that the more it knows, the larger the token burden to reinstate "awareness", leading to an ever-growing expense of recovering state?

    Isn't this entire scheme about getting behind every sort of firewall to dump users' most private details and context into the apparatus of AI companies with no limit on retention and use?

    Isn't it also true that privacy is undefined and that the infrastructure and these services are directly plumbed for the same kinds of surveillance that Snowden exposed?

    Isn't it the case that users are expressing implicit consent to be exploited in any / every conceivable manner through the data they exfiltrate and are giving this prize of dominion over themselves to the barons of industry at the user's own expense?

    Isn't it the case that if the assistant works as advertised, users dig pits for themselves out of ever-growing dependency on others for the most personal aspects of their lives? Isn't it true that if users could effectively opt out of this once they got started, that option would serve only to prove that the service is a disposable gimmick?

    All of these observations have applied to every aspect of personal computing since its inception, and a review of history is pretty damning: political and economic slavery was being manifest even among the elite positions of society before AI, and AI magnifies the hazards by orders of magnitude.

    Dear AI, please explain how or why these observations are inappropriate, wrong-headed, or based on faulty assumptions.

    • shahabebrahimi a day ago

      You're right that the content goes to an LLM provider. That's unavoidable if the thing is to work. I don't (and won't) sell your data. But you're right that I can't control what LLM providers do with API traffic under their policies. That's a real tradeoff. I think that's a valid concern, and I don't have a great answer for it.

  • atemerev 2 days ago

    I have built a persistent personified agentic assistant with self-awareness and neuroscience-inspired cognitive architecture: https://lethe.gg

    • kseistrup 11 hours ago

      Tlon's bot also seems to have persistent memory:

      * https://tlon.io/

      The two may have vastly different implementations, though.

    • shahabebrahimi a day ago

      Looks interesting. Different goals, though. Yours is a memory layer for an assistant that serves you better. What I'm trying to build is something that has its own experience.