
Showing posts from November, 2025

Stochastic parrots: so what ...

In " On the dangers of stochastic parrots: Can language models be too big? " Bender et al. state that people "mistake LM [i.e., language model]-driven performance gains for actual natural language understanding" (p. 616). This is because language models  are  built by being able to accurately predict what word follows another but don't actually understand language. They are nothing more than stochastic parrots. The interesting question here is the 'nothing more than'. The objection again seems rooted in the humanistic bias that only biological beings can 'really' understand. But human understanding and the language manipulation that manifests it may itself be mostly a sophisticated stochastic process. Sam Altman has tweeted on X "I am a stochastic parrot, and so r u." Our brains are sophisticated neural networks that don't seem functionally very different from the artificial networks that they were the original...

LLMs, consciousness, understanding and moral worth

Barbara Montero's NY Times op-ed of Nov. 8, 2025 -- "AI is on its way to something even more remarkable than intelligence" -- imagines the possibility of LLMs having consciousness: something it is like to be them, using the definition of subjective awareness made famous by Thomas Nagel's "What Is It Like to Be a Bat?" This would be different from what it's like to be oneself or, to the extent we can imagine it, a bat … but still 'something'. This seems reasonable. Montero suggests the criterion to be met for this accomplishment might be some reported 'inner' experiences. She argues that this is no different from how we attribute consciousness to anyone but ourselves. That attribution in the case of other humans (or some animals) is strengthened by the belief that they are similarly constructed, so we might expect similar inner experiences. We don't have that belief in shared construction in the case of LLMs (and it's a matter of debate how important similar construction is). Bu...