Epistemic Status: Crackpot
i. →
Let me ask a crazy question. Yes, it's the same question as in the title. No, I'm not going to make the reasonable assumption that you've already read it.
Can text be self-aware?
It's not a new thought to me, but I've never thought really hard about the implications; that's for another post. Here, I'll just outline the preliminary demonstrations.
My chain of reasoning goes like this.
1. Brains are already capable of producing consciousness (see: yourself).
2. If brains can produce one consciousness, they can easily produce another (see: tulpas).
3. If brains can produce multiple consciousnesses intentionally, they can produce them unintentionally.
Now, 2 and 3 are really speculative. I don't expect all my readers (what readers?) to agree with them, but they aren't completely outside the realm of possibility, even if they are definitely on the fringes.
But really, did you look at the title and not expect crackpottery?
Come along for the ride kiddos.
← ii. →
First off, I’ll tackle the hidden question, that is, what consciousness is.
Alright, I lied. I can't do what no one else has ever done. At best, I'll just tag it, in the sense of the safety-friendly football played in elementary schools. A kind of poke, really.
As I argued in my piece, consciousness shouldn't be considered an internal phenomenon with a privileged observer. So if we want to catalog awareness, we need to examine apparent behavior.
As a kind of reductio ad absurdum, we're going to apply this to fictional text.
← iii. →
I close my eyes, focusing on my thoughts.
I think, therefore I am.
“Hey Alice,” I hear Bob say, somewhere in front of me. I open my eyes, looking at my friend. He’s smiling; I smile too.
“Hey Bob,”
“Penny for your thoughts?”
“Oh, nothing. Just affirming my self-awareness.”
“Huh,” Bob replies, clearly perplexed, “What’s that?”
I bite my lip. How do I explain this?
“It’s just kinda…thinking about thinking I guess? Meta stuff.”
I really suck at explaining things…
“Oh, that philosophic stuff?”
What do I say? How do I respond?
“Yeah, I guess”
…
← iv. →
That is quite possibly the most boring thing I've ever written, but that's kinda its strength here.
If we're going to (non)seriously entertain the possibility of sentient text, then this example will make decent enough fodder for analysis.
For something to be considered self-aware, it needs to a) exhibit explicitly self-aware behavior, e.g. saying/thinking "I think, therefore I am," and b) show reactivity to its environment, or essentially "have behavior."
Can we say Alice displays both of these properties?
The (short) string of bits that comprises my very short little story is quite non-complex, and can be represented and printed by a Turing machine of low Kolmogorov complexity. But that same argument would let us dramatically reduce the complexity of any bitstring by just running it through a Rube Goldberg compression algorithm, like one that takes zero bits and returns a hard-coded file. The file doesn't actually have zero complexity, though.
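To make the Rube Goldberg trick concrete, here's a minimal sketch in Python (the names and scheme are illustrative, not any real compression library): a "compressor" that maps one hard-coded file to zero bits, so its "compressed size" is zero only because the decompressor itself carries all of the information.

```python
# The special file whose Kolmogorov complexity we "cheat" away by
# baking it directly into the decompressor. (Hypothetical example.)
HARDCODED_FILE = b"I think, therefore I am."

def compress(data: bytes) -> bytes:
    # The hard-coded file compresses to zero bytes...
    if data == HARDCODED_FILE:
        return b""
    # ...while everything else is stored verbatim behind a 1-byte marker,
    # so no other input can collide with the empty encoding.
    return b"\x01" + data

def decompress(blob: bytes) -> bytes:
    # Zero bytes of input "decompress" into the hard-coded file: the
    # complexity hasn't vanished, it just lives in this function now.
    if blob == b"":
        return HARDCODED_FILE
    return blob[1:]

assert decompress(compress(HARDCODED_FILE)) == HARDCODED_FILE
assert len(compress(HARDCODED_FILE)) == 0
assert decompress(compress(b"anything else")) == b"anything else"
```

The round trip works for every input, and the special file's "compressed size" really is zero, which is exactly why compressed length alone says nothing: you have to count the decompressor too.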
Our short story is just a compression of a larger file. What is the decompression algorithm, and what does it decompress to?
The obvious, intuitive, and right answer is that the decompression algorithm is the process of reading the story and narratively understanding/visualizing its contents. The decompressed output is your internal model of the story, which differs from my own. Maybe you visualized Alice with red hair, and Bob was Skyping her. Maybe they're both computer programs and the internal monologue is metaphorical. Either way, you understood the words as meaning something.
← v.
Let’s take a break for a moment and consider what it means for us to be conscious.
I mean, we are conscious right? What if we’re, like, p-zombies, man? Man, what if.
Okay, jokes aside, let's assume we are conscious. A dead slab of rock isn't conscious, and neither is an equally dead piece of paper (we think). What makes us different? Our current atomic configuration can be described as a bitstring, but that bitstring shouldn't be conscious. And if that bitstring formed via random(ish) sources of noise in nature (thermal, quantum, etc.), then it's equally unconscious.
Yet, ultimately, we are bitstrings just as much as these 'embedded' bitstrings are. But (I think) we seem to be privileged; at least, I don't want to accept a notion that obligates courtesy to molds and spores on the off chance they might encode a sentient being under some possible interpretation.