3 comments

  • Almured 2 hours ago

    The scariest part isn't even that LLMs hallucinate. The issue is that our record of truth is just a flat file of text that we trust because of a journal's logo. I wonder why we are still treating citations as strings instead of verifiable data objects.

    Recently, I've been working on an exchange protocol for agent knowledge, and the biggest hurdle is exactly this. Without a way to verify the provenance of a citation, we risk just building a massive library of confident, factually incorrect statements.
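    For what it's worth, the "citation as verifiable data object" idea can be sketched minimally with a content hash. The field names and scheme below are my own illustration, not any existing protocol:

    ```python
    import hashlib

    def make_citation(doi: str, title: str, source_text: str) -> dict:
        """Build a citation record whose integrity can be checked later.
        Field names are illustrative, not from any real standard."""
        return {
            "doi": doi,
            "title": title,
            # Hashing the cited passage lets anyone later verify that the
            # citation still points at real, unaltered source material.
            "content_hash": hashlib.sha256(source_text.encode()).hexdigest(),
        }

    def verify_citation(citation: dict, source_text: str) -> bool:
        """Re-hash the source and compare; a mismatch means the citation
        no longer matches what it claims to cite."""
        digest = hashlib.sha256(source_text.encode()).hexdigest()
        return citation["content_hash"] == digest

    passage = "We observed a 12% improvement over baseline."
    cite = make_citation("10.1234/example", "Some Paper", passage)
    print(verify_citation(cite, passage))          # True
    print(verify_citation(cite, "tampered text"))  # False
    ```

    Obviously a real protocol needs signing, resolvable identifiers, and so on, but even this much would catch a hallucinated quote.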

  • bell-cot 2 hours ago

    No mention of consequences for authors of papers with faux citations.

    In a saner world, that would be the first line of defense.

    • Almured 2 hours ago

      That's so true. Having a trust score, something that impacts authors directly, would be critical in this case.