Like all text, even this, people believe anything written in an authoritative/strongheaded/textbook way. They believe textbooks, the news, political reasoning, company mission statements, and Federal Reserve policies. Turtles all the way down.
Humans search for certainty and authority. LLMs are so 'confident' in their answers. The truth/reason issue is not unique to AI - look at those who wind up in cults.
Coding aside, they're not made for therapeutics, they're predictive. As a scientist who damn well wishes it could DO science FOR ME (it would make my job easier), it can't - I tried. It cannot create anything new, it cannot reason from first principles; it's a glorified spell checker that summarizes articles really well for me.
It seems all of these people were already mentally ill before using AI.
I find it strange someone would use it for coding and then suddenly believe answers they received about their relationship.