Language is commonly understood as the instrument of thought. People "talk it out" and "speak their mind," follow "trains of thought" or "streams of consciousness." Some of the pinnacles of human creation (music, geometry, computer programming) are framed as metaphorical languages. The underlying assumption is that the brain processes the world and our experience of it through a progression of words. And this supposed link between language and thinking is a large part of what makes ChatGPT and similar programs so uncanny: The ability of AI to answer any prompt with human-sounding language can suggest that the machine has some sort of intent, even sentience.

But then the program says something completely absurd (that there are 12 letters in nineteen, or that sailfish are mammals) and the veil drops. Although ChatGPT can generate fluent and sometimes elegant prose, easily passing the Turing-test benchmark that has haunted the field of AI for more than 70 years, it can also seem incredibly dumb, even dangerous. It gets math wrong, fails to give the most basic cooking instructions, and displays shocking biases.

In a new paper, cognitive scientists and linguists address this dissonance by separating communication via language from the act of thinking: Capacity for one does not imply the other. At a moment when pundits are fixated on the potential for generative AI to disrupt every aspect of how we live and work, their argument should force a reevaluation of the limits and complexities of artificial and human intelligence alike. The researchers explain that words may not work very well as a synecdoche for thought. People, after all, identify themselves on a continuum of visual to verbal thinking; the experience of not being able to put an idea into words is perhaps as human as language itself.