9 comments

• vectify_AI 8 days ago

GitHub repo: github.com/VectifyAI/PageIndex

• BizarroLand 8 days ago

Is there a plan to allow local hosting with Ollama, Pinokio, or llmstudio?

• casenmgreen 8 days ago

    Can this system explain its reasoning, and so explain its answer?

• vectify_AI 8 days ago

Yes, the explanation and reasons for relevance can be included in the search results and reflected in the answer.

• casenmgreen 8 days ago

Looking through the repo and reading the docs, an LLM appears to be part of the implementation. LLMs cannot explain their reasoning, so if an LLM is involved, the system as a whole cannot explain its reasoning, because part of the system is a black box? Reasoning can be explained up to the point the LLM comes into play, and again afterwards, for whatever is done with the LLM's output?

• curl-up 8 days ago

          Can you explain your reasoning?
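
  The behavior vectify_AI describes (attaching a reason for relevance to each retrieved section, so the final answer can cite why it was used) could be sketched as follows. This is an illustrative assumption, not PageIndex's actual API: `judge_relevance`, `call_llm`, and the JSON schema are invented for the sketch.

  ```python
  import json

  # Hedged sketch: ask an LLM to return both a relevance verdict and a
  # rationale for a candidate document section. The rationale is what
  # would be surfaced in the answer as the system's "explanation".
  # `call_llm` is a stand-in for any chat-completion client.

  def judge_relevance(call_llm, query, node_title, node_summary):
      prompt = (
          "Decide whether the document section below helps answer the query.\n"
          f"Query: {query}\n"
          f"Section title: {node_title}\n"
          f"Section summary: {node_summary}\n"
          'Reply with JSON: {"relevant": true/false, "reason": "..."}'
      )
      verdict = json.loads(call_llm(prompt))
      return verdict["relevant"], verdict["reason"]

  # Stubbed model so the sketch runs without an API key.
  def fake_llm(prompt):
      return ('{"relevant": true, "reason": "The section summarizes '
              'Q3 revenue, which the query asks about."}')

  relevant, reason = judge_relevance(
      fake_llm,
      query="What was Q3 revenue?",
      node_title="Financial Results",
      node_summary="Quarterly revenue and expense breakdown.",
  )
  print(relevant, reason)
  ```

  Note this only pushes the black box one level down, as casenmgreen points out: the rationale is itself LLM output, a plausible account rather than a trace of the model's internal computation.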

• ckrapu 8 days ago

    Makes perfect sense. Looking forward to trying this.

• vectify_AI 8 days ago

    [dead]

• xuqian5 8 days ago

Great work mate!