Sunday 29 May 2016

Brains and car engines

This paper makes a solid point. The argument about whether the brain is like a microprocessor (obviously not) is a distraction from the main point, which is that without understanding the flow of information we are simply sorting through patterns in the data coming out of neuroscience without arriving at any real understanding.

For instance, it's quite surprising how much effort is spent on measuring and tracking different oscillations in the brain. These correlate with the actual information flows, but they are not the same thing. We instead need to tease apart the mechanisms by which the constituent neurons communicate with each other, in various behavioural and functional contexts. From there, we may then find how the oscillations arise, e.g., as a byproduct of particular forms of processing. Instead, we put the cart before the horse and focus excessively on the oscillations (or any large-scale activity) first, before examining cell-specific electrophysiology and neurochemical interactions. Indeed, many neuroscientists are loath to go that "low level"!

This results in a state of affairs in neuroscience (at least at the systems level) where we are content with scratching the surface, e.g., measuring changes in frequencies under different conditions. It's like recording the spectral signature of the sound a car's engine makes and thinking that, by comparing the peak power under idle vs. driving conditions, we are any closer to understanding how the engine works. In fact, we are in a very primitive state of "understanding": we know nothing about pistons and drivetrains, let alone the principles of internal combustion. Yet it's only by understanding these that we would truly know how a car engine works. The same applies to the brain.

Sunday 22 May 2016

Murakami'd

I've been Murakami'd. Which is to say, I finished reading The Wind-Up Bird Chronicle written by Haruki Murakami. At least, anyone looking at me from the outside would conclude that that's what it means since, according to most external signifiers, I was sitting quietly for long periods of time reading this one particular book. This process repeated itself over the course of many days, with nary a sign that universes were colliding and separating, except perhaps for the occasional burst of laughter, furrowing of the brow, or quick exhalation of the breath.

There really is not much that I can say about the book itself, due to its very nature. Some of that is because it is fundamentally a surrealist book and, from very brief ventures into literary criticism, I see that there are many different ways a book can be said to be "surrealist", so I won't get into this at all, not least because I am not a literary critic. However, this is one of those rare books that shook me to my very core, and produced an experience that is difficult to otherwise come by. Some books have a tendency to do that to me, and each one in a different way. Murakami's work is different from anything I've ever read before, and elicited such a strange constellation of emotions and realizations, that I had to label it as its own thing, using its own verb - to be Murakami'd. And so, I felt compelled to sit down and mark down my experience in words immediately after the last page was turned - at least, as best as I could - as a record for myself, to be returned to in the future and accessed once more, talisman-like, sort of like how the primary protagonist of the novel would (literally) descend into his well and clutch his baseball bat tightly when venturing on his dream-reality journeys.

One of the things I've realized is that reality is not real, nor is unreality un-real. Anyone who studies the history or philosophy of science knows this (at least the former), but to know it like this, as a ton of bricks hitting you, is something else. These are not meant to be cute transpositions of each other but are both equally and fully valid truths, juxtaposed merely as a convenience of expression. The way Cinnamon retold and reshaped his mother's tales in the novel, holding to "the assumption that fact may not be truth, and truth may not be factual", may seem to be saying the same thing, but the emphasis is slightly different. What I'm referring to has more to do with the real/surreal distinction, or more accurately the real/constructed distinction. This is touched upon directly in an excellent interview with Murakami, where he says, "I don’t want to persuade the reader that it’s a real thing; I want to show it as it is. In a sense, I’m telling those readers that it’s just a story—it’s fake. But when you experience the fake as real, it can be real. It’s not easy to explain." This does not mean that you merely become "engrossed" in the story. We are talking about realities. What is real? How do you define 'real'? (I hear these sentences in Laurence Fishburne's voice.) It's what arises when our minds meet the (putatively) "objective" world through the mediums of our very idiosyncratically tuned (by evolution) faculties of sensation. (I cheated there, by parroting back from Buddhist metaphysics, but I think it's a very good summary of the human reality-constructing process.)

Therefore, what any one of us experiences as "real" is neither objective nor necessarily shared by any other person, and my mind utterly blanks when I wonder what the Universe is like as witnessed by an entirely different species, with differently tuned faculties of sensation and mind. Now, it's easy to ponder this when we talk about the "subjective stuff" - feelings, thoughts, emotions. But when we talk about the "external, objective stuff" - cars, wind, the Internet - we think there is only one possibility, and that we all have the same access to it (except perhaps, the narrative goes, at the quantum level, but then that doesn't really affect us at the macroscopic scale; it all "cancels out", right? and anyway, scientists will eventually figure out what's really going on there too). Part of what it means to be Murakami'd, for me, is to have this certainty deeply shaken, to see that even the "external stuff" is highly contingent, elusive, ephemeral. When you get stabbed by a knife in a dream reality, and have a subsequent wound in the non-dream reality, is that strange? Why should it be? Is it just because it hasn't yet happened to us? What's to say that it couldn't? And which reality would then be the real one, and which the dream reality? Or are they both dreams? Hume absolutely hit the nail on the head with his problem of induction - we really can never be sure that the sun will rise tomorrow, at all. The Popperian response (or the Bayesian one, for that matter) is merely instrumental or "practical" and does not address the deep metaphysical conundrum. Sure, we can say that such-and-such "laws" of physics preclude it*, and that the sun will only fail to rise when, in 4 or 5 billion years, it swells to such a size that it engulfs the Earth - sure, then you can say that the sun "does not rise" - but this is very problematic.

* Remember that there are no such laws "hanging" out there in space. These "laws" are merely tentative descriptions that humans have imposed on the worlds-in-their-heads (i.e., their conceptual models). The motions of the planets are absolutely not "governed" by the laws of gravity. They are governed by something else entirely, which we do not yet understand and probably have no hope of truly understanding. We merely label it "gravity" and have a neat set of equations to describe it, but we made a terrible mistake by mixing up the semantics and making it seem that a description is an actual statement of causal fact. Frankly, it is a miracle that the New Horizons probe made it to Pluto at all. What a fluke, that the planetary mechanics concocted in a puny ape-mind actually worked at such a grand scale! (The universe may yet have its last laugh, however, especially if the MOND theories are correct and gravity ends up working quite differently at the truly mega-scale.)


Who is to say that there is not some other process, as yet unobserved, that could also lead to the cessation of the sun rising? Perhaps some "cascading quantum effect" where the sun dissipates or dismantles itself without causing damage to our planet? Suppose you could formulate a scientifically palatable theory that could account for such a phenomenon (using proper language, of course: "cascading quantum effect" is a step in the right direction, I think). Can we calculate the chances of that happening? How could we? But then, when and if it does happen, all we can do is say, "Oh, well, so I guess that happened." Sou desu ka. We really have no say in the matter, one way or another, despite what our physics textbooks and Nobel laureates say. In other words, we are quite full of hubris when we claim privileged knowledge of how reality works - by our very definitions of the scientific process, even. The very fact that theories can change is testament to how provisional our understanding is. Before we go further: no, evolution is "not just a theory" - the vast body of evidence for it can't just disappear - but the theory can be expanded, changed, and, ultimately, take on an entirely unrecognizable form. (Lamarck says hi.) The "theory of evolution" may yet be relegated to the graveyard of the pessimistic induction. And so, we really have a very shaky grasp on what "reality" actually is, how it is constructed, and how it self-perpetuates.

(As you can imagine, to be Murakami'd is an excellent antidote to the pernicious religious sentiments of scientific materialism, something to which I am prone. But that is not to say that I will cease to be a practising scientist, nor that I will suddenly side with the anti-vaxxers. Those people are still bat-shit crazy, Murakami or no.)

So then, when the scientific materialist narrative is set aside and recognized as merely the flashy new smartphone model in a long series of world-models, what tools can we use to understand reality? I really don't know, but Murakami does have a way of showing you that things are not as they seem, that what appears unreal can actually be very real, in a concrete sense. In essence, he brings up the utter indeterminacy of epistemology and ontology - in other words, the inseparability of what/how we know from what actually "is". And then - crucially - he demonstrates how you can set a blender to it, and by mixing up the epistemology, remarkably also mix up the ontology! It's amazing to see, though, how deeply ingrained and overly confident our world-models are, when it takes something like The Wind-Up Bird Chronicle to smack you in the head before you realize that, actually, maybe that's not the truth after all.

I'm afraid that in the paragraphs above I've strayed too far into the familiar sphere of the philosophy of science, owing to my having studied it, but please do not take being Murakami'd to mean that I started thinking about these subjects in exactly this way. It was far more visceral and non-verbal. I merely used the conceptual tools I am familiar with to try to describe the experience, but I see now that I was only elaborating on the intellectual implications of the experience, not the experience itself.

[This post was written in the summer of 2015 but only published in May 2016.]

The brain may not be a digital computer, but it sure ain't "empty"

The following article ("The empty brain" by Robert Epstein) came up on Facebook in my circle of Cognitive Science friends from university. Normally I am sympathetic to viewpoints that try to show that the brain is not a digital computer, or doesn't have a von Neumann architecture. These ideas are quite outdated and we probably don't need to keep beating that dead horse at this point. Although the article started off that way, it quickly took a turn into some very strange territory: a quite reductionistic take on what the brain is "doing". The author is apparently hell-bent on saying that nothing ever happens in the brain at all, and that all intelligent behaviour arises vis-à-vis interactions with the world. His question "How and where, after all, is the memory stored in the cell?" is shocking, and a dead giveaway. For starters, there are such things as synaptic plasticity, dendritic remodelling, and ion channel and receptor expression/recycling, all of which change a neuron's input/output function with experience. How can these things be so blatantly ignored when discussing the question of memory formation in the brain?

Regarding the whole "there are no algorithms, encoders, decoders, ..." bit (paraphrasing), this is also eyebrow-raising. When light enters the eye and hits the retina, the pattern and intensity of the light gets converted into a series of action potentials. Why would you not call that an analog-to-digital converter? Also, most neurons exhibit idiosyncrasies in how they respond to different types of synaptic input. Some neurons suppress low-frequency inputs, only responding to high-frequency ones, or vice versa. These are, quite literally, high-pass and low-pass filters, respectively - all components of information-processing systems. And we're not even getting into what we know so far regarding neural circuits, all of which strongly indicates that information processing is taking place, yes, even representation, storage, retrieval, etc.
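The filtering point can be made concrete with a toy model. The sketch below is loosely in the spirit of Tsodyks-Markram-style short-term synaptic dynamics: a depressing synapse transmits less per spike as the input rate rises (low-pass-like), while a purely facilitating one transmits more (high-pass-like). The parameter values and the simplifications (a regular spike train, depletion ignored in the facilitating case) are mine for illustration, not fitted to any real neuron.

```python
import math

def depressing_efficacy(rate_hz, U=0.2, tau_rec=0.5, n_spikes=500):
    """Steady-state per-spike release for a depressing synapse
    driven by a regular spike train at rate_hz."""
    dt = 1.0 / rate_hz                 # inter-spike interval (s)
    x = 1.0                            # fraction of resources available
    release = U
    for _ in range(n_spikes):
        release = U * x                # each spike releases a fixed fraction...
        x -= release                   # ...which depletes the resource pool
        x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # partial recovery
    return release

def facilitating_efficacy(rate_hz, U=0.2, tau_fac=0.5, n_spikes=500):
    """Steady-state per-spike release for a purely facilitating synapse
    (resource depletion ignored, to isolate the high-pass effect)."""
    dt = 1.0 / rate_hz
    u = U                              # release probability
    release = U
    for _ in range(n_spikes):
        u += U * (1.0 - u)             # each spike boosts release probability
        release = u
        u = U + (u - U) * math.exp(-dt / tau_fac)  # boost decays between spikes
    return release

# Depression attenuates sustained high-frequency input (low-pass-like):
print(depressing_efficacy(2.0) > depressing_efficacy(50.0))      # True
# Facilitation favours high-frequency input (high-pass-like):
print(facilitating_efficacy(50.0) > facilitating_efficacy(2.0))  # True
```

The same phenomenological trick (a slow recovery variable meeting fast input) gives you frequency selectivity without anything resembling a von Neumann architecture, which is rather the point: filtering is information processing regardless of the substrate.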

Insofar as some patterns of electrical activity are manifested in response to particular configurations of sensory input, we are allowed, or even obligated, to say that information is being processed and transformed, so that certain patterns of (sensory) input can then lead to patterns of (motor) output that facilitate the survival of the organism in an uncertain environment. That's kind of the point of having a central nervous system in the first place. Or is the author assuming that the brain does absolutely, literally nothing? Maybe he takes a page out of Aristotle's book and believes it's a giant radiator. This is the only way he can get away with his audacious and, frankly, ignorant statements. By "ignorant" I simply mean that his arguments could not have been formulated had he read even the most rudimentary "Neuro 101 for dummies" type textbook from the past 25 years.

And this isn't even getting into deeper philosophical questions of different kinds of information processing and the nature of representation, or anything like that.