4 points | by slake 17 hours ago ago
4 comments
AI talk is turning into Silicon Valley pseudo-math slang: priors, exponentials, latent space.
You get lines like "no priors" or "embracing exponentials" that sound smart but mostly signal status.
Same move as N Taleb and "convexity" — a real idea turned into a generic intellectual flex.
Probably? Reinforcement learning creates bots with specific styles. For example, ChatGPT is very fond of "typically", "unpack this", and "if you want".
Once again a post with literally 3 points and 2 hours old is the top of /ask
Why is the HN algorithm such ass, can we talk about that?
Well it did have Claude both in the title and the description...