A Basal Concept

A concept is built of smaller concepts which you already understand. And if the main concept, or the most recently understood concept, is built of concepts, then it would make sense that the concepts from which that main concept is built are themselves built of concepts, and those concepts are built of concepts, and so on. I wonder if at some point the concepts can be reduced to a point where they are no longer built of concepts, and if they are not built of concepts, then what exactly are they made of? Memories, emotions, gut feelings, images, places, or something else entirely?

What are those flashes that run through your mind when you think of an element like water? Water is a fundamental building block of life for which we have developed a representation made up of five unrelated symbols: W-A-T-E-R. Yet the word also stands for more than the sum of its letters. For me this concept triggers childhood memories, images of a certain place, and a variety of sounds. Are these instantaneous reactions to a basal concept concepts in themselves, or something else? Something a bit more “Real,” perhaps? In my mind the reactions to this basal concept are not more concepts, but in order to communicate what I am experiencing inside my head with you, the reader, I must form them into concepts, and therefore lose some of the “realness” of those reactions as I subconsciously categorize them. I want to know whether these basal concepts are fundamentally important to our ability to eventually build larger and more complex ones, or whether it is possible to bypass the “realness” and still grasp the full meaning of a concept.

My discussion is dangerously flirting with recursion, but I suppose that is acceptable on this particular day of the month. Is a connection to the “real” world necessary in order to understand a basal concept? And how many basal concepts do you need to grasp before you start to combine them to build larger ones? The absence of water and a feeling of discomfort may eventually combine to create what many of us call thirst. And from there, perhaps, we can start to understand what it means to long for or desire something that we cannot immediately have. What I am trying to get at is a few questions about potential fundamental differences between machines and humans. I want to know where a computer that is attempting to augment our intellect fits into the structure of concepts that we use every day. Its goal is to help us pursue more complex concepts by assisting in the categorization of smaller ones. If the computer can never internalize the flashes that make up basal memories, what are those basal memories created from? Concepts? And if a computer’s basal concepts are created from more complex concepts, then we start a loop indicating that the computer or device will be forever unable to internalize the true meaning of “Water” and will instead be limited by the language with which the concept has been described. Having just realized the full recursiveness of this latest paragraph, I am going to stop and encourage you, the reader, to share what comes to your mind when you think of “WATER”.

4 thoughts on “A Basal Concept”

  1. I’m a strong believer in the idea that our brain is a rather complex network of a lot of relatively simple pieces that form a seemingly complex electro-chemical machine. This doesn’t, in my mind (what is “my mind,” anyway?), make the concept of thinking about concepts any less amazing or exciting or strange, but it does force me to believe that there are no fundamental barriers to building a computer that can exhibit the same ability to form ideas, connect them, and experience inspiration. It’s quite possible (though probably unlikely) that our current machines are technologically capable of doing such things, but we have yet to gain a good enough grasp of how we do it ourselves to replicate the process in something else.

    That being said, I think part of the reason it is so hard for us to really explain the *concept* of something like water, and yet so easy for us to hear or read the word “water” and instantly know what it is referring to, is that at least until this point there was no strong evolutionary advantage to communicating this deeper concept of a concept of water. Really, what mattered, and continues to matter, is that when we are thirsty it is helpful to have a means of communicating that with someone else and a means of understanding their response, so that we can then find this thing to which we have assigned the label “water.” That’s all we need, really, and once we were able to do that, there wasn’t a strong evolutionary push to wire the brain any differently.

    The fact that now we ARE asking about the much deeper meaning of the concepts for which we have assigned symbols is pretty amazing and exciting, but we may just have to come to terms with the fact that there are fundamental physical limitations on the types of thoughts and ideas we can actually think up, due simply to the physical structure of our brain. But who knows, maybe the ability to augment that structure with an entirely different structure will result in more emergent properties of consciousness than are currently possible with only our brains?

    • I can relate to the view of the brain as a complex network of simple processes, and I would even extend the network beyond the brain to include your entire nervous system. Our current perspective, as you point out, is that our network is a complex electro-chemical machine that allows us to, among other things, hold this conversation through this media platform.
      The evolutionary push to develop a given concept was initially driven by survival. I wonder what drives our development of concepts today, when most of those concerned with uncovering or discovering new concepts are no longer worried about where their next glass of water is coming from (although that could change). When pushed as a country, we are quick to respond with the development of a concept that can serve as a solution. If we are ever pushed as a species, I am confident that the collective will be able to effectively do the same.
      If we could develop a device with a complex chemical-electrical network capable of forming ideas and connecting them, how would we teach it things? Would it start at the level of an infant and slowly, over time, build up experiences that would define what it becomes, or would it come out of the factory preprogrammed with the “Basic Facts” already built in?
      Would that device be able to experience “Water” in the same intimate way that we as humans do? And if not, would its understanding of “Water” be as complete?
      And if that device is capable of forming ideas of its own volition, then we will have reached what Vernor Vinge defines as the Singularity in his essay “The Coming Technological Singularity.” Wikipedia points out that this is the point where computers will begin to “re-write their own source code to become more intelligent.” He also points out that shortly afterwards “the Human era will be ended.” I suppose where I am headed with this is questioning whether we as humans have even reached the singularity, given that our attempts to re-write our own source code in kindergarten are based on an incomplete understanding of how our brains are wired and function.
