Writer: Satan’s Sodomite
Subject: Rational Mysticism
Link: LS666 Email / 30.10.2024
Author’s Notes: This is “sort of” in response to “Doubt”; it outlines my thinking about the value of doubt and the dangers of belief. I wrote this several years ago for a mostly technical audience, but I think it has merit for others as well. At first glance, it may not seem to be particularly Satanic, but the approach is extremely useful to those on an LHP. If you find it worthwhile, please publish it. Hail Satan! — Satan’s Sodomite
Rational Mysticism
Yet ah! why should they know their fate?
Since sorrow never comes too late,
And happiness too swiftly flies.
Thought would destroy their paradise.
No more; where ignorance is bliss,
'Tis folly to be wise.
— Thomas Gray, "Ode on a Distant Prospect of Eton College"
Illusions
We each live in an illusion. More precisely, we each live in different illusions; that is, the illusions of any two people differ in some respect. This is natural and normal; Reality, after all, is far too large and complex to perceive and understand all at once, and even if we could, using all that knowledge in real time would not be feasible.
Instead, we each have a mental model (or models) of reality. We use these models as if they were reality; the better the model (that is, the closer it maps to reality in the areas where we apply it), the more successful we will generally be when using it.
If it were not for these models, we would be paralyzed, unable to make the tiniest of decisions. However, most of us are generally unaware that what we perceive is a model rather than Reality itself. This creates several problems.
We will discard information that conflicts with our models, usually without even being aware that we have done so. We will even manufacture "replacement" information consistent with our models without being conscious of it.
A good example of this is that we each have, in each eye, a “blind spot” which provides no visual information whatsoever. However, our minds do not acknowledge that; instead, they create information to fill this spot so that our conscious minds perceive only a continuous image.
The problem is, however, that lacking the discarded information (or relying on the replacement information) may at times lead to bad decisions.
We are susceptible to manipulation by others who exploit the differences between our models and reality to steer us into unwise decisions. Note that the manipulators might not be fully or even partially aware that they are doing so.
Obvious examples of this can be seen in advertising, where products are associated with generally desirable things (sex, youth, wealth, power, ethical superiority, etc.) without any claim that purchasing them will deliver those things. The result is that, in some cases, individuals adopt the false model being offered and purchase these products not on their merits but in the expectation or hope that the desirable things will also appear.
We are fooled into engaging in conflict with others by misinterpreting behavior that merely reflects a somewhat different model of reality as irrational or aggressive. For many examples, look at social media.
Sets of very similar models — so similar that they can almost appear to be shared — are sometimes called World Views. Other terms used are Belief Systems (with the incredibly appropriate acronym BS) and Reality Tunnels (which emphasizes the restrictions and filters imposed).
Beliefs
Different models, different world views, different reality tunnels, and of course different belief systems have different sets of rules or axioms that require different things to be believed. Many religions are based largely on openly demanding certain beliefs; other groupings of individuals do the same, although they are not so explicit about it. Governments certainly do (hence the teaching/indoctrination of "civics" and the push for schools, at least in the early years, to have their curriculum under government control), as do social groups (defined by various things such as living in the same area, having similar pastimes, sharing a partial set of goals, etc.).
These sets of beliefs differ in size and scope and often allow for some, although often quite limited, variation. In the US, for example, one may reasonably be liberal (in which case one must generally back the Democratic Party), conservative (in which case one must generally back the Republican Party), or independent/moderate (in which case one must back members of either the Democratic or Republican Parties); one may not, however, reasonably be a libertarian, socialist, communist, fascist, or any variety of anarchist.
In much of the US, one's religious beliefs may reasonably align with those of Baptists, Catholics, Methodists, Lutherans, Mormons, or nearly any other Christian group; in many places, even Jewish groups are grandfathered in as acceptable. However, it is generally unreasonable to be an Atheist, Muslim, Satanist, Hindu, Shintoist, Agnostic, Taoist, or Pagan.
In Columbus, OH, one may be a Buckeye, but not a Wolverine. In Ann Arbor, MI, the opposite is true. Those outside the "approved" groups become "No Good Shits" as Leary's fourth-circuit programming triggers.
Beliefs are generally inculcated almost from birth, starting with those of the parents or guardians and augmented by those of the local groups influencing them and the child: government, religion, and other dominant local institutions. They are bound to the individual's identity before (and often instead of) any tools or skills for evaluating and choosing among existing and new alternatives.
Americans, for the most part, are raised to believe things that Americans believe (within acceptable variation), and Iranians, for the most part, are raised to believe things that Iranians believe (within acceptable variation). Those who stray noticeably in either of these (or many other) locations will encounter sanctions of various kinds, ranging from mild disapproval to death. Note that this is true even if the beliefs acceptable in one location are unacceptable in another.
In short, we have been and are, for the most part, taught all our lives to believe, rather than to think, compare, and/or doubt. In countless ways we are encouraged to simply accept what we have been taught and remain in a daze, fitting in and not causing trouble, swaddled in the dreams and illusions wrapped around us by our parents and other members of "society", just as they were before us.
Although it does not address this topic directly, I would strongly recommend the book The Seven Sins of Memory: How the Mind Forgets and Remembers, by Daniel L. Schacter (copyright 2002, ISBN-13 978-0618219193).
Dr. Schacter is a Professor in (and former Chair of) the Psychology Department at Harvard. The book addresses a large number of ways the human mind falsifies information.
Waking Up
We are each trapped and limited by our models of reality unless we do something about it. The catch-22 is that we need to be aware that we are trapped in order to act, and the models do not allow us to see the nature of the entrapment.
The end of this Gordian knot may be found at certain points in life where the model and reality differ so sharply that the model's wrongness becomes obvious. My experience with others indicates that most (but not necessarily all) people encounter such points during their lives, but they seem sufficiently varied that they cannot be predicted. Some fairly common points, however, are:
- Discovering that a childhood "myth" (e.g., Santa Claus) is not real.
- Being deliberately or inadvertently betrayed by someone you trust.
- The perception shifts/rebellion during adolescence.
- The death or significant illness/injury of a close loved one.
- A personal accident, or "near-death experience" (an "I could have been killed!" moment).
- An unexpected "success" or "victory".
Unfortunately, in most cases, the person simply goes back to sleep after a time, or moves from one Belief System (BS) to another that is likely to be just as flawed.
As an example, here is a list of some of these points in the author’s life:
- Discovering what I thought was “magic” was a game my parents played (age 6).
- Being falsely accused by a cop of lying about throwing a pebble (age 9).
- Being told that a method I worked out for computing logarithms of negative numbers couldn't work because "you can't take the log of a negative number" (age 11).
- Finding, while in high school, a flaw in an algorithm that a professor told me he had missed for years while teaching graduate students (age 16).
- Being exposed to the proof of Euler's Identity, which lets one compute logarithms of negative numbers; a short derivation follows this list (age 20).
- Discovering that a point of theology I had accepted (a supposed proof that "abortion" was a "sin") was based on clearly flawed biblical hermeneutics (age 24).
- Death of my mother (age 24).
- Discovery of the writings of Robert Anton Wilson (age 27).
- Observation of fellow students in grad school getting angry at the conclusion of the proof that the Halting Problem is unsolvable (age 28).
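For the two logarithm items above, the usual route is through Euler's identity; the following short derivation (not part of the original list, and restricted to the principal branch) shows why the log of a negative number exists, just not as a real number:

```latex
e^{i\pi} = -1
\quad\Longrightarrow\quad
\ln(-x) = \ln\!\left(x \, e^{i\pi}\right) = \ln x + i\pi \qquad (x > 0).
```

More generally, \ln(-x) = \ln x + (2k+1)\pi i for any integer k, which is why one must pick a branch; the age-11 objection "you can't take the log of a negative number" is true only within the reals.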
This list certainly has a few points that few could easily identify with, but it is only slightly longer than the lists of others that the author knows of.
Unfortunately, those who are most likely to actually “wake up” are those with a larger number of such points than others, and such points are generally more tragic than mine. Such points can also, in effect, destroy one. Note that a corollary to Nietzsche’s aphorism “what does not kill us makes us stronger” is “what could make us stronger might also kill us”.
It was only with the last few points that I truly began to wake up (although, in retrospect, they all contributed). In particular, Wilson's works provided the tools needed to begin consciously shaping my reality tunnels.
This notion, BTW, is not unknown in popular culture, although it is seldom discussed in factual terms. In particular, the movie Pleasantville demonstrates the idea and the varied ways it can occur (as well as the sorts of opposition one may encounter) extremely well. It maps quite well to the author’s own experience and that of others who have also “awakened”. Other popular (but less “useful”) examples include The Truman Show and The Matrix.
Staying Awake
As I said above, most people who wake up fall back into the stupor of illusion soon thereafter. Sometimes they fall directly back into the same illusion they "awoke" from, sometimes they shift to a different illusion, but either way they fall back into some specific belief system or reality tunnel. The result is that they end up only slightly better off, if at all. To truly take advantage of this period of wakefulness, one must learn both to extend it and to re-trigger it when one discovers that one is again slipping back into a complacent stupor. Both of these are achievable, but both take work.
The first technique to learn to remain awake is to conquer the natural tendency to believe. Note that by “believe” I am not referring specifically to “religious” belief — although that certainly qualifies — but to any belief. This may be done by inculcating doubt in everything.
The first phase of this is to cease to trust anyone, including anything they say, write, or otherwise communicate. Remember that they all are, for the most part, trapped in their illusions. This does not mean to disregard what they say and/or do, but to keep in mind that they can, at best, speak and act only within the perceptions of their illusions.
They may communicate false things; however, they fully believe them since they have filtered out conflicting information and replaced it with information consistent with their model of reality. To facilitate this, attempt to gather information from people with sharply different models of reality (that is, different reality tunnels or belief systems).
Note how they react to any information that does not match their model; they may seem to ignore it (perhaps focusing on some ancillary piece of information in the communication), or they may attempt to dismiss or denigrate the source of the information (using “ad hominem” arguments or classifying the source as a “no good shit”); they may even get angry. They will not, however, attempt to reconcile the information with their model, except in rare cases.
As you begin to get some measure of skill at doubting others, you must move on to the second phase with due diligence, or risk becoming extremely paranoid or falling into a state of egoism, narcissism, or even solipsism. That phase is to cease to trust yourself.
You must first learn not to trust others so that you can avoid being easily manipulated as you learn not to trust yourself; in addition, you can learn from your experiences how others react and how you, although generally unaware of it, also react. At this point, although awake, you are still largely influenced by your past illusions.
This should not be surprising — you have spent most of your life relying upon them — and by force of habit (if nothing more) you continue to do so. This is the stage where you must rigorously examine each assumption you make about what you seem to learn, how you perceive the world and the perceptions and judgments you have made about both the world around you and yourself.
This is where the really hard work begins. Note that you will find yourself resisting this on an ongoing basis for far longer than you expect; you must also watch that you do not convince yourself that you have conquered this prematurely (in my case, I suspect that “prematurely” is “while still alive”), although you will be sorely tempted to do so. You are fighting against both decades of programming and reinforcement and probably hard-wired instincts as well. I’ll have more to say about this in the third phase.
There are some techniques I have found quite useful. The first is to delve into your belief system and begin to find your actual root beliefs. You can recognize these because you will either (a) find yourself using circular reasoning (that is, you believe A ultimately because of B, and you believe B ultimately because of A), or (b) find yourself getting angry, wanting to shout something like "just because that's the way it is!", or feeling guilty about questioning something (generally this is some point at which you have been programmed by others to believe that not believing A makes one a "no good shit").
When you find one of these core beliefs, invert it, choose to temporarily believe it (just as a mathematician may assume something to be true, perhaps to find an inconsistency), and begin attacking any other (usually derived) beliefs you find are inconsistent with your new, temporary belief. Seek out the thoughts of others (the Internet is a great place to do so, as is the library) who agree with your temporary belief, and immerse yourself in them.
Push yourself to see things from that perspective, and hold that perspective and those beliefs as long as you reasonably can without them conflicting with your normal life. You do not need to discuss this with others, although one additional tool I have used is to create a different identity or persona in some relevant Internet forum (chat room, discussion list) and discuss your new (temporary) belief and perspective there. If you can, put yourself in a position to defend that perspective against those who disagree.
You may be concerned that this exercise might cause the temporary belief to become permanent. While I have never encountered this, an additional safeguard would be to choose perspectives that are so far outside those you are used to, so abhorrent, that one cannot conceive of holding them more than briefly. Here are two examples I’ve used in the past, as well as a new one most appropriate to this discussion:
- Kim Jong Un is a great hero and freedom fighter and should be held in high esteem by everyone.
- Torturing dogs to death is a wonderful hobby and a great way to relieve stress and bond with your family.

I daresay that no one reading this would find themselves adopting these as long-term beliefs.
In general, my experience suggests that anywhere from an hour or two in total, up to periodic sessions over a couple of weeks, with any such belief is sufficient.
The second technique I have found useful for mastering this phase is to eliminate (absolute) beliefs and replace them with (provisional) beliefs. I refer to these provisional beliefs as "Current Working Theories". In general, I attempt to have at least two mutually inconsistent Current Working Theories active on any point I am consciously considering.
This is fairly difficult, although those with a theoretical mathematical background will find that they have mastered most of the necessary tools, even if they have never applied them this broadly. Those without such a background may find they need to work with the first technique for a longer period before being able to succeed at this one.
One should attempt to apply every bit of incoming information, whether direct sensory information or information provided by another person, under all current and applicable theories. At first, you will probably find that you can apply it under only one theory in real time and must go back and contemplate the information a second (or third, etc.) time under the other theories; over time, you should find that you can quickly switch among them in real time for extended periods. Note that, if successful, you are filtering out less information than before you began this exercise.
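For readers who think in code, one loose analogy (mine, not the author's method) is the statistical habit of scoring incoming evidence under several candidate models at once instead of committing to one; the theories, data, and numbers below are invented purely for illustration.

```python
import math

# Toy analogy for "Current Working Theories": keep several mutually
# inconsistent hypotheses alive and make every observation answer to
# all of them, rather than filtering it through a single belief.

# Hypothetical competing theories about a coin (probability of heads).
theories = {
    "fair coin": 0.5,
    "heads-biased": 0.8,
    "tails-biased": 0.2,
}

# No theory is privileged at the start.
log_weight = {name: 0.0 for name in theories}

observations = ["H", "H", "T", "H", "H", "H", "T", "H"]  # invented data

for obs in observations:
    for name, p_heads in theories.items():
        p = p_heads if obs == "H" else 1.0 - p_heads
        # Every theory must account for every observation; none may
        # silently discard the evidence that does not fit it.
        log_weight[name] += math.log(p)

# Compare the theories' relative plausibility after all the evidence.
total = math.log(sum(math.exp(w) for w in log_weight.values()))
for name, w in sorted(log_weight.items(), key=lambda kv: -kv[1]):
    print(f"{name:>13}: relative weight {math.exp(w - total):.3f}")
```

The point of the analogy is only that no observation is thrown away for failing to fit the favored theory; each theory is merely re-weighted, and none is ever promoted to an absolute belief.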
The goal here is to master one's own beliefs rather than remain enslaved to them. Success can be seen when one can change one's belief system as easily as one changes clothes. You should also find that you can now make better decisions, because you have access to a larger base of unfiltered information and the ability to organize that information in more ways, some of which you will find beneficial.
I must reiterate, however, that this needs to be done periodically on an ongoing basis. One will eventually find that the specific exercises are not necessary, but the rigorous self-questioning and challenging of one’s own beliefs and perspectives are necessary to keep them from becoming calcified.
This brings me to the third and final phase that I have discovered: learning to doubt that you are not believing (in the absolute sense) anything; this goes so far as to doubt even that you are doubting. I do not believe that I have mastered this (although I fear that I do at times believe that I have, as well as falsely believing that I do not believe this).
Unfortunately, I cannot articulate how to do this; I cannot even truly claim to understand it. As a mathematical analogy, I suggest that it is similar to moving asymptotically toward zero without ever reaching it in finite time. It “feels” to me as if it is super-rational, in the realm of those truths unreachable by reason (as proven to exist by Goedel’s Incompleteness Theorem and Turing’s proof that the Halting Problem is not solvable). As such, it falls into the realm of what I consider to be “mysticism” (note that this definition does not require the existence of the “supernatural”, only the mathematically provable “super-rational”).
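For the Turing half of that claim, here is a compressed Python sketch of the diagonal argument (my illustration, not part of the original essay); the decider below is a deliberately naive stand-in, since the whole point is that no correct one can exist.

```python
# Sketch of Turing's diagonal argument. Suppose someone claims a decider
# halts(f) that correctly reports whether calling f() eventually halts.

def make_contrarian(halts):
    """Build a program that does the opposite of whatever halts() predicts."""
    def contrarian():
        if halts(contrarian):   # predicted to halt -> loop forever instead
            while True:
                pass
        return None             # predicted to loop -> halt immediately
    return contrarian

# Hand it any concrete candidate decider and that decider is forced into error.
def always_says_halts(f):
    return True                 # a (wrong) decider that always answers "halts"

c = make_contrarian(always_says_halts)
print(always_says_halts(c))     # prints True: the decider claims c will halt...
# ...but running c() would loop forever, so the prediction is wrong. The same
# trap closes on every possible decider, so no correct halts() can exist.
```

The same construction, formalized, is Turing's proof; Gödel's theorem plays the analogous role for formal proof rather than computation.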
Having said that, I include two “poems” by Aleister Crowley (from The Book of Lies) followed by my own “poem/mantra” on the matter.
Chinese Music

"Explain this happening!"
"It must have a 'natural' cause."
"It must have a 'supernatural' cause."
Let these two asses be set to grind corn.
May, might, must, should, probably, may be, we may safely assume, ought, it is hardly questionable, almost certainly: poor hacks! Let them be turned out to grass!
Proof is only possible in mathematics, and mathematics is only a matter of arbitrary conventions.
And yet doubt is a good servant but a bad master; a perfect mistress, but a nagging wife.
"White is white" is the lash of the overseer: "white is black" is the watchword of the slave. The Master takes no heed.
The Chinese cannot help thinking that the octave has five notes.
The more necessary anything appears to my mind, the more certain it is that I only assert a limitation.
I slept with Faith, and found a corpse in my arms on awaking; I drank and danced all night with Doubt, and found her a virgin in the morning.

Terrier-Work

Doubt.
Doubt thyself.
Doubt even if thou doubtest thyself.
Doubt all.
Doubt even if thou doubtest all.
It seems sometimes as if beneath all conscious doubt there lay some deepest certainty. O kill it! Slay the snake!
The horn of the Doubt-Goat be exalted!
Dive deeper, ever deeper, into the Abyss of Mind, until thou unearth that fox THAT. On, hounds! Yoicks! Tally-ho! Bring him to bay!
Then, wind the Mort!
The Mantra of Doubt

I believe that I Doubt,
Thus I Doubt that I believe;
Therefore I must Doubt that I Doubt …
I believe.
Is it worth it?
This is perhaps the key question. I have said repeatedly that this takes hard work; I have also (perhaps too gently) suggested that it may be dangerous. (I have, however, avoided discussing any techniques, and there are such techniques, that could under some circumstances lead to mental illness, insanity, or perhaps even death.)
My answer is yes. I can now work under a large number of different and contradictory belief systems, and I can respect many more of them than others seem to (note, however, that I can only respect them, since I can no longer believe in any fixed belief system). I can perceive ideas and information across different belief systems, some portion of which adherents of any one of them will filter out. I also perceive Reality much more accurately than most, although I most strongly suspect I am deceiving myself in saying so.
As for more practical skills: as a software engineer, I find that I can find flaws in and troubleshoot software systems more quickly, efficiently, and deeply than anyone else I have watched do so. I often misdiagnose problems initially, but I perceive myself as recovering and finding flaws more quickly than others. I have received verbal feedback from others that suggests this is not entirely an expression of my bloated ego.
Is it worth it to you (whoever “you” are reading this)? That I cannot say. My best guess is that the answer for at least some of those reading this is “yes”, and quite possibly for others, “no”. Fortunately, it is not my decision.