By Dennis Klocek
Computers use formulas known as algorithms that dictate mathematically how changes are made in a set of inputs. A researcher converts data from an experiment into mathematical equations, or algorithms, that are repeated, or iterated, many times by a computer. The logic used in constructing such a formula is "if this, then; if not, else". With only two choices for each step in a calculation, this or something else, there is a filtering effect on data that works to reject what is called "noise". Noise is data considered inessential to a narrowed search. This strict logic reduces the range of possible solutions. Results from a cycle of many this/else iterations can be analyzed, and that information is then used to build another algorithm for a new, more focused search. Reducing variables when modeling a complex interaction is aided by the fundamental mathematical language of the computer, known as base 2 or binary math. Yes and no are the only two choices in that language. Phrases in that language are variations of yes/no, such as if-then, if-so-then, or in the negative, if-else. A tiny switch or gate in the computer registers that it is either off (off equals zero) or on (on equals one). When running strings of many iterations, the computer keeps track of how many ones (yes/on) or zeros (no/off) fit the design of the algorithmic equation. The ability to rapidly process many ones and zeros through many iterations allows solutions to be reached quickly.
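The yes/no filtering described above can be sketched in a few lines of Python. This is a hypothetical illustration, not any particular research program: the bracket limits and readings are invented for the example.

```python
# A minimal sketch of iterative yes/no filtering: each reading is kept
# (1) or rejected (0) by a strict condition, discarding "noise".
readings = [4.9, 5.1, 12.7, 5.0, -3.2, 5.2]

def within_bracket(value, low=4.5, high=5.5):
    # The binary decision: "if this, then yes; if not, else no".
    return 1 if low <= value <= high else 0

bits = [within_bracket(r) for r in readings]   # [1, 1, 0, 1, 0, 1]
signal = [r for r, b in zip(readings, bits) if b == 1]
print(bits)    # the ones and zeros the computer tallies
print(signal)  # the data that survives the filter: [4.9, 5.1, 5.0, 5.2]
```

Every reading either satisfies the condition or it does not; anything outside the bracket is treated as noise and rejected, exactly the reductive move the paragraph describes.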
Today, in computers worldwide, a universe of on/off gates producing yes/no answers is at work in processors everywhere, all the time. This reductionist way of modeling filters out fluctuations that may occur naturally within data that has been bracketed. Reducing data to brackets of binary code is a way of searching for noise that doesn't fit the design of an experiment. A strictly linked chain of ones and zeros is not noisy; however, a broad field of numbers that could possibly be tangent to the search is filled with noise. Base 2 allows a researcher to quickly search for noisy errors in the algorithmic chain and reject them. This strategy of the search-and-destroy algorithm is the territory of the hackers and code warriors who wage info wars. Binary iterations filtering out "noisy" data allow researchers to quickly get an answer to a bracketed question. This is great for tracing errors in research on electronic circuits, security codes, engineering, chemistry, or physics.
However, binary iteration can be problematic when yes/no algorithms are used to model natural phenomena that unfold in rhythmic time waves, because nature's design is inherently noisy. Binary iteration is especially problematic when applied to studying sociology, psychology, or human response to medical issues. A string of yes/no answers produced by iteration creates a particularly un-human way of solving human problems, problems inherently filled with the mathematical noise of feelings.
Most algorithms in use today are variations of iterative error-seeking formulas designed to ferret out zero/one errors in data processing. However, there is one particular form of algorithm, known as recursive, that is self-referencing. A recursive algorithm assesses probabilities and can come closer to modeling the messy noisiness of human life. Recursive means reflecting, returning, or self-referencing. In computer language, a recursive, self-referencing algorithm contains one step in the formula that must include how the whole field of data bits reacted to the last iteration. When the solution to a problem depends on combining previous solutions to smaller instances of the same problem, that is recursion. Each new iteration must include the whole field of previous possibilities, taking into account a small calculation of the whole mathematical potential of the phenomenon being studied. Natural phenomena are highly recursive and self-referencing. Errors in nature tend to be self-correcting given enough time, as can be seen in an organism's ability to deal with unusual environmental change. Forms of nature like Mandelbrot sets and Fibonacci spirals are examples of self-referencing iterations.
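The Fibonacci numbers mentioned above are the textbook case of a solution built from previous solutions to smaller instances of the same problem. A minimal Python sketch of such a self-referencing formula:

```python
def fib(n):
    # Recursion: the answer for n is defined in terms of the answers
    # to smaller instances of the same problem (n - 1 and n - 2).
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# The sequence whose ratios trace the Fibonacci spiral:
print([fib(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

Each call reaches back through the whole chain of earlier results before it can answer, which is the "reflecting, returning" quality the paragraph names.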
The difference between iteration and recursion can be imagined through two types of graphic design applications: vector graphics and bitmap. Vector graphics mathematically restrict the points along a drawn line to a specific angle. Each new point completes a condition that is a specific vector from the originating point. Engineers use vector graphics to do design work when a specific line angle from one point must meet another specific line angle from another point. This is iterative in its yes/no selection and restriction of the movement of points: each point either lies on the vector from the originating point or it does not.
Bitmap applications are used by designers to create cloud-like fields that replicate the look of an airbrush. An array or field of points is computed with each iteration. The designer selects the density or value of the points, and each touch of the mouse sprays the next generation of field points to be computed. The result is that many small variations of the same choices create the final image. This is recursive in that the final outcome is not programmed in from the beginning. Bitmap has the recursive, self-referencing quality of probable positions within a field.
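The contrast can be loosely sketched in Python. The function names are hypothetical, and real vector and bitmap programs are far more involved; the sketch only shows the strict yes/no test on one side against a probabilistic spray of points on the other.

```python
import random

# Vector-style: a point either satisfies the line condition or is rejected.
def on_vector(x, y, slope=2):
    return y == slope * x  # strict yes/no test

# Bitmap-style: each "spray" scatters points probabilistically around a
# target; the final cloud emerges from many small variations, not a rule.
def spray(cx, cy, n=100, spread=1.0, seed=42):
    rng = random.Random(seed)
    return [(cx + rng.gauss(0, spread), cy + rng.gauss(0, spread))
            for _ in range(n)]

print(on_vector(3, 6))  # True: the point lies on the vector
print(on_vector(3, 7))  # False: rejected outright
cloud = spray(0, 0)     # 100 probable positions, none dictated exactly
```

The first function admits exactly two outcomes; the second yields a field of probable positions whose overall shape only appears once many sprays accumulate.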
Since humans are also inherently self-referencing in all they do, recursive, self-referencing computer approaches can humanize social solutions by producing probability structures rather than yes/no answers to bracketed data. With iteration, each step in an algorithm leads to the next in a particular lock-step order designated by the researcher. The result of this lock-step search is a specific yes/no answer. With recursive algorithms, each iteration includes a step recognizing the reaction of a whole field of variables. The result is a probability structure, a group of possible solutions, that requires all of the steps to be combined in working toward an eventual solution.
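The two outcomes the paragraph contrasts can be sketched side by side. This is an illustrative toy with invented names and data, and the "whole-field" side is represented by a simple tally over all the data rather than a literal self-calling function, to keep the contrast visible.

```python
from collections import Counter

data = [3, 7, 7, 2, 7, 5, 7, 3]

# Lock-step search: march through the data asking one yes/no question
# and stop at a single specific answer.
def iterative_answer(values, target=7):
    for v in values:
        if v == target:      # yes/no test at every step
            return "yes"
    return "no"

# Whole-field view: fold the entire field into a probability structure,
# a group of weighted possibilities rather than one verdict.
def probability_structure(values):
    counts = Counter(values)
    total = len(values)
    return {v: n / total for v, n in counts.items()}

print(iterative_answer(data))       # "yes" - one fixed answer
print(probability_structure(data))  # {3: 0.25, 7: 0.5, 2: 0.125, 5: 0.125}
```

The first function discards everything except the single answer it was bracketed to find; the second keeps every possibility in view, weighted by how often it occurs.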
Statistics used to evaluate information about social trends can be either iterative (yes/no) or recursive (whole). Iterative statistics about pandemic death rates are powerful in that they represent the potential for providing specific answers. But with no known cure for most pandemics, the data analyzed by iterative programming gives the illusion of an answer (yes/no) where no answer is possible. True answers require the wholeness of all of the forces in a pandemic. The overwhelming number of unknown parameters in the data field of a pandemic makes iteration less effective at giving answers about health issues. In widespread illness there can be no specific answers. Populations succumbing to a pandemic often have pre-existing and co-morbid complications that cannot be built into an algorithm that, when iterated, presents a specific answer. The logic structure that can provide an answer from reductionist iteration is too fixed in the scope of its search. At best it can calculate a total number of registered deaths or infections and give a statistical (probable) view of that number by comparing it to past pandemics. But those statistics were also flawed by pre-existing and co-morbid data that was not part of the iterative algorithm that generated them.
To be fair, this in itself is not problematic; it is just contemporary research. But authorities dangerously present binary (yes/no) statistics on the degree and mortality rates attributed to the infection as a yes/no type of proof. These are on-the-fly surveys of a very wide field of input data reduced to yes/no reasoning. Statistics are actually probability structures rather than specific answers. It is alarming that these on-the-fly statistics generate specific public policy for funding the remediation, tracking, and policing of measures taken from on-the-fly numbers. This is mistaking apples for oranges in the creation of public policy.
Based on this, can we imagine an iterative, lock-step sequence of how these pandemic response policies unfold?
- 1. How did this start?
- 2. Find suspected source.
- 3. Blame suspected source.
- 4. Call for more policing to prevent future outbreaks.
- 5. Ask for civil liberties to come under regulation for the public good.
- 6. Too dire to contemplate.
Being part of the wide dimensions of human life, pandemics cannot reasonably be linked to clear yes/no answers. Past viral pandemics such as SARS (caused by a virus in the same family as this coronavirus) and HIV did not result in the development of an effective vaccine, yet those pandemics mysteriously faded. Where did the threat go?
Perhaps a thought experiment can be useful. Using an "if so, then" search protocol, could it be said:
- Is binary cognition a form of consciousness?
- If so, then is recursive iteration a form of consciousness?
- If so, then since computers are made by humans to mostly do iterative cognition, do the humans who engage in ubiquitous binary consciousness begin to think only in iterative sequences?
- If so, then since the iterative process in humans is inherently bracket driven can this form of consciousness be found in social phenomena?
- If so, then has bracketed and polarized binary based thinking replaced the possibility of recursive thinking in society?
- If so, then does the present polarization of many realms of life have a connection to binary consciousness in human/machine interaction? Many minds, by training, are now thinking primarily in iterative processes.
- If so, then most decision making will be based on searching for one answer that will solve or complete a condition no matter how reductive the search.
- If so, then can it be realized that whole cascades of iterative yes/no choices do not constitute a truthful process for field-like social understanding and policy making?
Whole truths are recursive, and evolving such insights takes time to process the whole field at each step. Unfortunately, if that recursive capacity is not in the consciousness of most humans today, then public policy will be based on yes/no iterative, polarized thinking. Regarding pandemic assessment, this has already taken place. Binary iterative consciousness has generated a predictably reactive cloud of confusion, anxiety, and blame based on the "necessity" of having to come up with an iterative yes/no answer to a widespread health issue. Regarding the generation of pandemics, it could be said that feelings of uncertainty about truth, sustained across large populations for a long time, will eventually amplify into a resonant field of fear of other humans. Having fearful feelings about others is the true root of pandemics. Fear of others is the common infective emotion in all pandemics, past and present. If so, then what about the future? Truth and history have proved the futility and danger of final solutions applied to social challenges. Only the recursive wisdom of many small understandings accumulated over time can move social accord forward.
Dennis Klocek, MFA, is co-founder of the Coros Institute and a faculty member at Rudolf Steiner College. He is the author of nine books, including the newly released Colors of the Soul: Esoteric Physiology, and also Sacred Agriculture: The Alchemy of Biodynamics. Dennis is also an international lecturer.