mind the gap.

Fodor on why Clark and Chalmers’ Extended Mind Thesis is wrongheaded.

Which leads me to want to underscore that when I talk about external or outboard brains, I do not mean that these things are actually part of brains (a term I am sloppily using to mean “minds” and not jiggly bundles of nerves).

I mean that the creation of external representations is epistemic action that lightens the cognitive load required to achieve our goals (1). The external representation is the product of mind + action; it is not mind.

Notebooks and iPhones and such are mind prostheses (2). A prosthesis, no matter how customized, is still “other.”

I put things in my .org files so that I do not have to remember them. I offload the task of memory because either I can’t remember or I don’t want to expend the effort to remember.

I create concept maps. I offload the cognitive work of holding a complex representation in working memory. This allows me to focus on thinking about the representation and what it represents, rather than trying to keep the representation itself in clear mental sight.

But my .org files and concept maps are no more a part of my actual mind than my dishwasher, running while I’m upstairs writing a blog post, is somehow me standing at the sink doing the dishes.

These things are just tools.

Just sayin…

p.s. Did you know there is a genus of moth named Prosthesis?
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
1. Kirsh, David & Maglio, Paul (1994). “On distinguishing epistemic from pragmatic action.” Cognitive Science 18(4): 513–549.

2. Prosthesis isn’t quite the term I want, because it means a replacement for something lost or missing, and most of us are augmenting our cognitive processes, not replacing them. The more accurate term would be mind extension, but a) that is too easily confused with the Extended Mind Thesis, and b) it sounds too much like the spam I already receive.

One thought on “mind the gap.”

  1. seems to me like Fodor is in a knot because Clark et al. have (according to him) overlooked an internal model of mind; he cites John Searle to support him.

    I think that this leaves the question unanswered, and that (say) Dennett, or possibly Turing, would argue that the “internalism” is itself an open question even for those of us who claim we have an internal model.

    “there’s nothing inside [the Roomba] but cat hair and dirt; I know because I’ve looked” — okay, Dr. Fodor, but if you’re going to be clever like that then prepare for a visit from some intrepid neurophilosopher with a bonesaw and curiosity about the contents of your cranium: “Clearly, Dr. Fodor has no internal model: there’s nothing inside there but gray jiggly goo — I know because I’ve looked.”
