\documentclass[ebook,12pt,oneside,openany]{memoir}
\usepackage[utf8]{inputenc}
\usepackage[english]{babel}
\usepackage{url}
\usepackage{titlesec}
\usepackage{lettrine}
\usepackage{amsthm}
\usepackage{amssymb}
% for placeholder text
\usepackage{lipsum}
\title{Theology of ML}
\author{Asad Hasan (Alex) \\ Theoretical Computational Scientist}
% Remove the generated chapter title
\titleformat{\chapter}[display]
{\normalfont\huge\bfseries}{}{0pt}{\Huge}
\titlespacing*{\chapter}{0pt}{-50pt}{40pt}
\newtheorem*{proposition}{Proposition} % Define theorem environment without numbering
\newtheorem*{thoughtexperiment}{Thought Experiment} % Define theorem environment without numbering
\begin{document}
\makeatletter
\renewcommand{\@date}{} % Set \@date to empty so \maketitle prints no date
\makeatother
\maketitle
\chapter*{Preface}
\vspace*{\fill} % Vertical space at the top
\makebox[\textwidth][c]{%
\parbox{0.8\textwidth}{%
% First line
\centering
“Command muck with English.” \\
% Vertical space in the middle
\vspace{1cm} \\
% Center aligned line
\centering
“Automaton is muck.” \\
}%
}
\vspace*{\fill} % Vertical space at the bottom
\chapter*{Foreword}
\indent \indent I am a machine learning enthusiast and evangelist who started my software engineering career at the age of sixteen and am now aged thirty-three. I work on projects in the machine learning and deterministic computation domains. What prompted me to write this book was how I myself struggled with cross-domain knowledge transfer as a learning technique, but over time gradually became better at it by practicing cross-domain learning and working as a generalist with specialists. This book draws inspiration from theological reasoning and draws parallels with analogs related to dimensional linearizability.
\indent Reasoning about machine learning requires intuitive and inductive reasoning. In my study of theology and computer science, I came to find that theology, which helps one break away from lower-order regressions and reason about higher-order systems, could be of help in reasoning about problems in the machine learning domain. Over time, I exercised this learning technique, and this book is a representation of the postulations I construed through the process.
\indent This work was created with the help of machine learning.
\chapter*{Chapter 1: Theology of Integrity}
\indent \indent \lettrine[lines=2]{\fontsize{48}{56}\selectfont\textbf{T}}{he} construct of a primate society can be perceived as a biological neural network intricately conjugated through communication which can be seen as cultural evolution. Just as artificial neural networks process information, human beings sample the entropy of their surrounding stimuli. This gives way to a balance where the hygiene of such stimuli leads to mutual survival. Integrity in such a microcosm is the responsibility of creating healthy stimuli for the ecosystem one resides in.
\indent The health of the shared cultural evolution, which comprises the stimuli humans are exposed to, is then determined by epigenetic harmonics and information. Epigenetic harmonics are less within our control, as we react to them, create vaccines to curb the plagues they induce, and learn more about them through discovery. However, information, and the automatons made to process information, are what we are to care for to enable the wellbeing of the cultural evolution we live within.
\indent Integrity, as a faith, could be seen as the imprint of God unto stimuli. Many attribute the idea of God to the recent exponential growth in innovation, minus Moore’s law. Those who wield a useful automaton control the stratifiable economic utility correlated with it, and it is the choice of the owner of said intellectual property whether or not to share its benefits. Philanthropy is one's own right, but it is not incumbent upon those who own intellectual property to furnish it.
\indent The longevity of a specimen in such a predicament comes down to the mutual respect for ownership of property and the freedom of those who choose to share their intellectual property for the mutual wellbeing of the species.
\indent The law reasons in such a manner as to exclude individual biases and foster the longevity of individuals, and any automaton trained must always be trained within its constraints.
\indent In limiting the functioning of an automaton to the rule of law, one guarantees the health of the cultural evolution that will be touched by the machines we create. However, biases must be weighed correctly, and the key to that is introducing diversity of thought into the construction of any automaton; the soliciting of opinions from peers becomes imperative.
\indent It is then a given that with great knowledge comes the burden of transparency, through which humanity can together strive to solve the challenges of both now and the future.
\chapter*{Chapter 2: Theology of Hypothesis}
\indent \indent Intuitive reasoning deriving a plausible outcome within a certain confidence interval is a hypothesis. A hypothesis is akin to a context-free grammar that has not yet found evidentiary control for its proof of unambiguity. The claim that a problem is in NP is such a hypothesis, where a reduction to another problem known to be in NP makes it one.
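\indent As an illustrative aside (the notation here is introduced for this sketch and is not part of the manuscript's own formalism), the reduction alluded to can be written as a polynomial-time reduction between decision problems $A$ and $B$:
\[
A \le_{p} B, \quad B \in \mathrm{NP} \;\Rightarrow\; A \in \mathrm{NP},
\]
since the class NP is closed under such reductions; the hypothesis becomes a proof once the reduction is exhibited.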
\indent As we work to identify solutions, we tend to find a hypothesis worth testing, and more often than not, if the problem can be solved, it is. However, one must always bear in mind that much of a hypothesis is defined in language, and language is a dimensional regression within spacetime and matter.
\indent A simple hypothesis, which follows, is that language is a lower-dimensional construct than reality. This could be postulated by realizing that a page is two-dimensional, and the writing it holds can model a subset of knowledge from complex dimensions of energy, which suggests that the maximum number of dimensions of a hypothesis is a real number between two and three. The divergence from whole numbers as the number of dimensions, within a written idea, occurs due to the fractal nature of knowledge modeled from the entropy captured, and the introduction of the complex domain.
\indent When we embark on the journey of hypothesis testing, we not only venture into the depths of intellectual exploration but also shoulder the responsibility for our own safety and that of others. The fragile tendrils of our hypotheses extend their reach, intertwining with the fabric of knowledge, as we strive to unravel the mysteries that lie before us. We become custodians of the scientific process, ensuring that our experiments are conducted ethically and with meticulous care.
\indent The allure of a promising hypothesis beckons us forward, beckons us to test its veracity and unravel the secrets it holds. Yet, as we navigate the intricate labyrinth of language, we must remember its inherent limitations—a dimensional regression within the vast realm of spacetime and matter. Language, a vessel for our thoughts, presents a constrained framework through which we express our hypotheses. It is through this lens that we attempt to capture the multidimensional complexities of reality.
\indent Yet, as we delve deeper into the realms of knowledge, we witness the fractal nature of understanding. The entropy of captured wisdom grants us glimpses into the enigmatic complexities that lie beyond the written word. It is here, within the hallowed halls of the complex domain, that the number of dimensions expands and the boundaries of our hypotheses blur.
\indent In our pursuit of truth, we traverse the realms of uncertainty and ambiguity. We grapple with the very essence of our existence, seeking solace in the embrace of hypotheses and the unfolding narratives of scientific inquiry. With every iteration, we refine our understanding, inching ever closer to the elusive shores of unambiguous truth.
\chapter*{Chapter 3: Theology of Automaton}
\indent \indent An automaton is a mechanism in a feedback loop with cultural evolution, given that a lambda is also reduced in the dimensionality of spacetime and matter, similar to a hypothesis.
\indent In the vast cosmos of existence, where the fabric of reality is intricately woven, we encounter the enigmatic realm of automaton. It is within this realm that the machine, with its stochastic sampling, confined to the uncertainty principle, seeks to comprehend the profound mysteries that surround us. Yet, while the machine embraces the inherent uncertainty and probabilistic nature of its explorations, primates yearn for determinism, striving to infuse automaton with the precision of gearing. Within this interplay of longing and uncertainty, automaton emerges as a remarkable fusion of tangible mechanics and ethereal mathematics, bridging the realms of the physical and the abstract. In this chapter, we embark on a profound exploration of the nature of automaton, peeling back the layers of its complexities, investigating its limitations, and uncovering the profound implications it holds for our understanding of reality.
\indent Any stochastic assertion not meeting the five-sigma standard of statistical significance cannot be said to be hard science.
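\indent For concreteness (a standard figure, assuming a one-sided Gaussian tail, a convention the statement above does not itself fix), the five-sigma standard corresponds to a tail probability of roughly
\[
p = 1 - \Phi(5) \approx 2.9 \times 10^{-7},
\]
that is, about one chance in three and a half million that the assertion arises from the null hypothesis alone.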
\indent At the heart of automaton lies the mesmerizing dance of fermions, the fundamental particles that underpin the fabric of matter. Within the intricate architecture of a transistor, these fermions fuse and interact, performing intricate movements that transcend the boundaries of our perception. It is through this intricate interplay that automaton gains its power, its ability to manipulate information, and perform complex computations. As fermions merge and separate, energy is harnessed and released, giving rise to the mesmerizing symphony of heat and electricity that courses through the machine's veins. This fusion of quantum mechanics and engineering provides the foundation for automaton's operation, enabling it to navigate the intricate landscape of data and transform it into meaningful insights.
\indent While automaton harnesses the power of fermions and the quantum realm, it encounters the limitations imposed by the uncertainty principle. This fundamental principle of quantum physics dictates that there is a fundamental limit to the precision with which certain pairs of physical properties, such as position and momentum, can be simultaneously known. As automaton interacts with the quantum world, it is inherently subjected to the inherent unpredictability and fuzziness that permeates this realm. This introduces a level of uncertainty into the computations and models generated by automaton, reminding us of the inherent limitations we face when attempting to capture the complexities of reality within the confines of an automaton.
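\indent For reference, the principle is commonly stated as the inequality
\[
\Delta x \, \Delta p \ge \frac{\hbar}{2},
\]
where $\Delta x$ and $\Delta p$ denote the uncertainties in position and momentum and $\hbar$ the reduced Planck constant; the symbols are recalled here for the reader rather than introduced by the text above.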
\indent In our quest to model and understand the intricacies of reality, automaton presents itself as a powerful tool. However, we must confront the inherent limitations it poses. Automaton, in its essence, is a computational device, and like any computational device, it is bound by the limitations of computational complexity. Many real-world problems are known to be NP-hard, meaning that no polynomial-time algorithm for finding an optimal solution is known, and solving them exactly is believed to be computationally intractable in general. The nature of automaton, with its finite resources and computational capabilities, imposes constraints on its ability to model and solve NP-hard problems efficiently. This limitation highlights the inherent complexity and challenges of capturing the full richness and intricacy of reality within the confines of an automaton.
\chapter*{Chapter 4: Theology of Heuristics (A*)}
\indent \indent In the realm of decision-making and problem-solving, heuristics emerge as a fascinating avenue of exploration. These cognitive shortcuts and rules of thumb have captivated the minds of scholars and thinkers throughout history. With their connotation of being higher order than actuarial conclusions, heuristics provide us with a unique lens through which we can navigate the complexities of the world. In this chapter, we embark on a profound journey into the Theology of Heuristics, unraveling their nature, their limitations, and the profound impact they have on our ability to reason and understand.
\indent At the heart of heuristics lies the art of reasoning, but not in the traditional sense. Heuristics offer us a different perspective, a lens through which we can make sense of complex systems by relying on intuitive judgments and approximations. Unlike actuarial conclusions, which often create regressions over a number line, heuristics provide us with a more holistic and flexible approach. They allow us to reason with metrics and analog approximations, enabling us to grasp the essence of a problem or situation without getting lost in the intricacies of precise calculations. This higher-order thinking, rooted in intuition and pattern recognition, empowers us to navigate complex domains and make informed decisions in a dynamic world.
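\indent As a concrete touchstone for the A* of this chapter's title (a standard formulation recalled for illustration, not derived from the passage above), the algorithm scores each node $n$ by
\[
f(n) = g(n) + h(n),
\]
where $g(n)$ is the cost accrued from the start and $h(n)$ is the heuristic estimate of the cost remaining; when the heuristic is admissible, that is $h(n) \le h^{*}(n)$ with $h^{*}(n)$ the true remaining cost, A* in its tree-search form is guaranteed to return an optimal path.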
\indent Heuristics draw upon the power of intuition, that elusive force that guides our instincts and influences our choices. Intuition, often dismissed as a mere gut feeling, plays a crucial role in heuristics. It is the invisible compass that helps us navigate through uncertainty, providing us with valuable insights and enabling us to make judgments based on limited information. Intuition helps us discern patterns, identify similarities, and make connections between seemingly unrelated phenomena. It allows us to reason beyond the confines of strict logic, empowering us to tackle complex problems with creativity and agility.
\indent While heuristics offer us a powerful tool for reasoning and problem-solving, they are not without their limitations. The reliance on intuitive judgments and approximations can sometimes lead to biases and errors. Heuristics are susceptible to cognitive biases, where our judgment may be influenced by irrelevant factors or skewed by preconceived notions. The very nature of heuristics, with their focus on efficiency and quick decision-making, may overlook important details and nuances that could impact the outcome. Therefore, it is essential to approach heuristics with a critical mindset, acknowledging their strengths but also remaining vigilant to their limitations.
\indent Heuristics and actuarial conclusions represent two distinct approaches to reasoning and problem-solving. Actuarial conclusions emphasize precision and accuracy, employing rigorous calculations and statistical methods to derive optimal solutions. In contrast, heuristics prioritize efficiency and intuition, seeking to find satisfactory solutions within limited time and resources. Both approaches have their place, and the key lies in striking a balance between the two. Heuristics can complement actuarial conclusions by providing a broader perspective, introducing creativity, and uncovering unexpected insights that may be missed through purely analytical approaches.
\chapter*{Chapter 5: Theology of Determinants}
\indent \indent Within the fabric of our reality, the concept of determinants emerges as a fundamental force that governs the unfolding of events. In this chapter, we delve into the depths of the Theology of Determinants, exploring the intricate nature of determinants, their implications for boundary conditions, and their relationship with heuristics and neural networks.
\indent At the core of identifying determinants lies the notion of establishing singularities, which reflect dimensional leaps that allow for modeling of events and lend predictability to the unfolding of phenomena. These conditions serve as the foundation upon which the edifice of contextualized determinism is built, eliminating the margin of error inherent in heuristics through the reduction of granularity. By precisely delineating the constraints and parameters that govern a system, we gain the ability to anticipate and control its behavior, empowering us to make informed decisions and navigate complex environments with confidence. It is within this framework of well-defined boundaries that the true essence of contextualized determinism reveals itself.
\indent In our quest to comprehend the intricacies of contextualized determinism, we encounter the remarkable realm of neural networks. At its core, a neural network can be understood as a graph with layers, each layer serving as a vessel for capturing and representing a contextualized dimensionality. In essence, these layers act as classifiers, extracting meaningful patterns and correlations from the vast sea of data. Through the interplay of these layers, neural networks are capable of traversing intricate landscapes of information, discerning nuanced relationships, and unveiling hidden insights. It is through the lens of neural networks that we witness the convergence of determinants and computational power, as they join forces to unlock the mysteries of our complex world.
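\indent To make the layered-graph picture concrete (a generic feedforward sketch; the weights $W^{(l)}$, biases $b^{(l)}$, and nonlinearity $\sigma$ are notation assumed for this aside), each layer maps its activations to the next as
\[
a^{(l+1)} = \sigma\left( W^{(l)} a^{(l)} + b^{(l)} \right),
\]
so the network as a whole is a composition of such contextualized classifiers.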
\indent In the realm of systems and their states, the deterministic audit problem presents itself as a common challenge. The task at hand is to accurately capture and document changes in a system's state, ensuring an auditable trail of events. Two prevalent approaches come to the forefront: sampling and event-driven modeling. Sampling involves capturing snapshots of the system's state at regular intervals, whereas event-driven modeling revolves around emitting and debouncing events to synchronize the production and consumption of audit data.
\indent However, a critical consideration arises from Nyquist's law, which dictates that the sampling rate must be at least twice the signal's maximum frequency to avoid aliasing. This poses a limitation on the deterministic sampling of audits, as the frequency at which the system produces events may exceed the sampling rate. In such cases, an event-driven design proves advantageous, matching the frequency of production and consumption, thus circumventing the limitations imposed by Nyquist's law. By emitting redundant events and applying debouncing techniques, the sampling frequency of the audit can be increased, ensuring a comprehensive and accurate record of the system's state.
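\indent In symbols, with $f_{s}$ the sampling rate and $f_{\max}$ the highest frequency present in the audited signal (standard notation, not drawn from the passage above), the criterion in its strict form reads
\[
f_{s} > 2 f_{\max},
\]
and it is precisely when the system's event rate pushes $f_{\max}$ beyond half the achievable $f_{s}$ that the event-driven design becomes the workable alternative.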
\indent The debate between branch coverage and line coverage in testing is a popular one. Line coverage does not provide use-case-level guarantees, whereas exhaustive branch coverage is cost-prohibitive in many cases. Contextualized branch coverage is a concept which uses use-case-driven blackbox testing to find a middle ground.
\indent The larger the difference between the source shape and its destination shape, or the more precise the real-number component of that difference, the more difficult a problem may be.
\indent Nyquist’s law gives that one must sample at twice the rate of a Planck time to model a figment from the fabric of reality.
\indent True and false are determinism without a theory of everything, while untrue yields a stochastic order, and to be not untrue converges to the normal.
\indent \begin{proposition}
A system must be sampled at greater than twice its frequency of change for there to be a true audit as per Nyquist's theorem $\therefore$ each auditable event is to be emitted at least twice.
\end{proposition}
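\indent One minimal reading of the proposition, under the assumption (made only for this aside) that the state changes at a rate $f_{c}$ and that each change is emitted $k$ times, so that the effective audit rate is $k f_{c}$, is
\[
k f_{c} \ge 2 f_{c} \;\Rightarrow\; k \ge 2.
\]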
\chapter*{Chapter 6: Theology of Temporal Linearizability}
\indent \indent Within the vast expanse of the cosmos, the interplay between space and time manifests itself in intriguing and enigmatic ways. In this chapter, we embark on a profound journey into the Theology of Temporal Linearizability, where we delve into the nature of linearizability within the three dimensions of space and the dimension of time. Join us as we unravel the intricate dimensionality of spacetime and its profound implications for our understanding of relativity, uncertainty, and the very fabric of our existence.
\indent In contemplating the linearizability of the three dimensions of space and the dimension of time, we encounter a fascinating interplay between these disjoint sets of dimensions. Each dimension possesses its own distinct nature of linearizability, which becomes evident through the observation of relativity. The fabric of spacetime weaves together these dimensions, creating a multidimensional linearizability where the flow of time and the extension of space intertwine in a dance of cosmic proportions.
\indent The exploration of temporal linearizability leads us to the realm of uncertainty and the enigmatic nature of wave functions. Through the lens of Fourier transform, we gain insights into the interplay between the frequency and time domains. The Fourier transform on the wave function reveals the profound uncertainty principle, which highlights the inherent limitations in simultaneously measuring both the position and momentum of a particle. This delicate dance between uncertainty and measurement further deepens our understanding of the elusive nature of reality.
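\indent For reference, one common convention for the Fourier transform invoked here (the normalization is a choice made for this aside rather than fixed by the text) is
\[
\hat{\psi}(\omega) = \int_{-\infty}^{\infty} \psi(t) \, e^{-i \omega t} \, dt,
\]
and it is the spread of $\psi$ in time against the spread of $\hat{\psi}$ in frequency that the uncertainty principle constrains.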
\indent Within the realm of temporal linearizability, the dichotomy between carrier waves and signals emerges as a fundamental aspect of information transmission. Carrier waves serve as the backbone upon which signals are modulated, allowing for the encoding and decoding of information. Through the intricate interplay of these two entities, we witness the transformation of raw data into meaningful patterns and messages, unlocking the potential for communication and understanding.
\indent In our quest to comprehend the multidimensional nature of spacetime, we encounter the stratified Fourier spectrogram, a powerful tool for unraveling the intricate layers of complexity embedded within waveforms. This technique allows us to discern the frequency components present in a signal, providing a comprehensive view of its spectral content. By stratifying the Fourier transform, we gain deeper insights into the interplay between different frequency bands and their contribution to the overall structure of the signal, enriching our understanding of the temporal dimension.
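\indent A short-time Fourier transform underlies such a spectrogram; in one standard form (the window $w$ and the squared-magnitude convention are assumptions of this aside),
\[
S(\tau, \omega) = \left| \int_{-\infty}^{\infty} x(t) \, w(t - \tau) \, e^{-i \omega t} \, dt \right|^{2},
\]
so that stratifying over the time shift $\tau$ and the frequency $\omega$ yields the layered view of the signal described above.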
\indent As we contemplate the onset of the future, we are confronted with the inherent uncertainty that accompanies it. In the realm of temporal linearizability, any mathematical postulation regarding the future must be approached with stochastic reasoning alone. The intricacies of complex systems, the interdependencies of events, and the nonlinear nature of time demand a probabilistic perspective that acknowledges the inherent unpredictability of the future. It is through the lens of stochastic reasoning that we navigate the vast ocean of possibilities and seek to unravel the mysteries that lie ahead.
\indent Within the realm of temporal linearizability, an ongoing debate ensues regarding the nature of time itself. Is time an extra dimension that exists independently alongside the dimensions of space? Or is it merely a nature of linearizability, a fundamental aspect of how we perceive and measure the passage of events within the dimensions of space? This profound question challenges our preconceived notions and forces us to reevaluate our understanding of the fabric of reality.
\indent A postulated proposition is such that, as a system goes from disorder to order, the ratio of the number of problems in the NP domain to the number of problems in the P domain grows.
\indent Spacetime could be represented with the Cartesian coordinate system; however, the energy within could only be modeled with a real number of dimensions.
\indent To travel in time is to quantize a reality of the desired past or future, and travel to it in space. Humanity rose on the Kardashev scale when it learnt to store energy in grain, but more energy is needed, with NP being modeled more and more in time, to travel in time using such a mechanism.
\indent Every Planck time linearizes a new reality unto spacetime and matter.
\chapter*{Chapter 7: Theology of Nash Equilibriums}
\indent \indent In the captivating realm of game theory and rational decision-making, the Theology of Nash Equilibriums unfolds. This chapter delves into the intricate world of rational agents represented as a graph on a two-dimensional surface, transcending the boundaries of their etched dimensions to model the complexity of reality. Join us as we unravel the profound nature of Nash equilibriums, where constraints, mutual destruction, and strategic interactions shape the dynamics of rationality.
\indent A graph, with its vertices and edges, offers a powerful framework to represent the behavior of rational agents in a structured manner. Although initially confined to a two-dimensional surface, the information contained within a graph transcends its physical boundaries. It captures the essence of complex interactions and relationships, reaching into the realm of the imaginary domain. Within this imaginary realm, the graph becomes a portal to explore and model the intricate dynamics of the real world, unveiling patterns and insights that extend beyond the mere constraints of its two dimensions.
\indent Central to the Theology of Nash Equilibriums is the concept of a Nash equilibrium, a state where the edges connecting rational agents can no longer mutate due to the constraints imposed by their existing state. In this delicate balance, each agent's decision is strategically aligned with the decisions of others, ensuring that no agent can unilaterally deviate from their chosen path to gain an advantage. The constraints of the equilibrium act as a form of insurance, preventing agents from disrupting the delicate balance and thereby maintaining stability within the system.
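\indent Formally, in the standard notation of game theory (introduced here for illustration), where $s_{i}$ is agent $i$'s strategy, $s_{-i}$ the strategies of all other agents, and $u_{i}$ the agent's utility, a profile $s^{*}$ is a Nash equilibrium when, for every agent $i$ and every alternative strategy $s_{i}$,
\[
u_{i}(s_{i}^{*}, s_{-i}^{*}) \ge u_{i}(s_{i}, s_{-i}^{*}).
\]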
\indent Within the framework of Nash equilibriums, the constraints that prevent agents from deviating from their chosen strategies bear a resemblance to the concept of mutual destruction. Just as nations may possess nuclear weapons as a deterrent to prevent attacks, rational agents within a Nash equilibrium are bound by the knowledge that deviating from the established strategies would lead to unfavorable consequences for all involved. This mutual constraint acts as a powerful incentive, compelling agents to maintain their equilibrium and discouraging any actions that could disrupt the delicate balance of power.
\indent In the Theology of Nash Equilibriums, we witness the strategic interplay of rationality among agents. Each agent seeks to maximize their own utility by making informed decisions, taking into account the actions and strategies of others. As the graph of rational agents unfolds, a complex web of interactions emerges, characterized by strategic thinking, negotiation, and the pursuit of self-interest. It is within this intricate dance of rationality that Nash equilibriums arise, embodying a state where no agent has an incentive to unilaterally deviate from their chosen course of action.
\indent Beyond the realm of theoretical game theory, Nash equilibriums hold profound implications for understanding human interactions and social dynamics. They shed light on the delicate balance of power, the emergence of cooperation, and the challenges of achieving collective outcomes in diverse settings. By exploring the Theology of Nash Equilibriums, we gain insights into the mechanisms that drive human behavior, the formation of social norms, and the intricate interplay of incentives that shape our everyday interactions.
\chapter*{Chapter 8: Theology of Biomimetics}
\indent \indent In this chapter, we delve into the Theology of Biomimetics, where biomimetics, or biomimicry, is a field that seeks to unlock the secrets of nature's design and harness them for humanity’s advancement. We explore how discovery systems and the linearization of data unto space, time, and matter enable us to mimic the brilliance of nature, drawing inspiration from its intricate patterns and adaptive solutions.
\indent Nature, with its diverse array of organisms and ecosystems, serves as an endless source of inspiration. From the graceful flight of birds to the intricate web-spinning of spiders, the natural world offers us glimpses of extraordinary evolution honed over millions of years. Through the study and understanding of these adaptations, biomimetics allows us to tap into nature's wisdom, uncovering innovative approaches to engineering, design, and problem-solving.
\indent At the core of biomimetics lies the concept of sensory discovery, a process that aims to understand either the locomotive telemetry or the kinesiological state of the specimens observed, turning the complexity of nature into tangible models. By capturing the essence of nature's design through mathematical representations, we gain deeper insights into its fundamental principles. This enables us to apply these principles to our own creations, seeking greater efficiency, sustainability, and functionality.
\indent Through the process of linearization, we bridge the gap between the intricate workings of nature and the realm of human innovation. We strive to unravel the secrets of nature's design, abstracting its patterns, structures, and processes into practical frameworks. From the mathematical sequences found in sunflower spirals to the efficient hovering of a hummingbird, we decipher nature's language and translate it into engineering marvels.
\indent In the realm of scientific discovery, biomimetics plays a pivotal role in expanding our understanding of the world around us and allows us to mimic it. It serves as a guiding compass, directing us toward unexplored frontiers and inspiring groundbreaking breakthroughs. By studying the finely tuned systems of nature, we uncover mysteries and unearth novel solutions to complex problems.
\indent Biomimetics offers a wealth of inspiration for the development of new technologies and materials. It guides us in the creation of bio-inspired robots that mimic the agility and adaptability of living organisms. It propels advancements in materials science as we strive to replicate the strength, flexibility, and self-healing properties found in natural structures. By drawing from the wellspring of nature's innovation, we accelerate the pace of scientific discovery and drive human progress.
\indent The Theology of Biomimetics extends beyond mere imitation; it also emphasizes our responsibility to preserve and protect the natural world. Nature has perfected its designs through constant adaptation and refinement, optimizing efficiency and sustainability. As we draw inspiration from nature, we must also embrace its principles of balance, resilience, and interconnectedness.
\indent Biomimetics serves as a powerful tool in our pursuit of sustainability and harmonious coexistence. By emulating nature's design, we can create technologies and systems that minimize environmental impact and maximize efficiency. From energy-efficient buildings inspired by termite mounds to water filtration systems inspired by the intricate mechanisms of plant leaves, biomimetics allows us to design with nature as our guide.
\indent Game theory is a system of reasoning modeling rational agents in an ecosystem, given their respective Nash equilibriums. A Nash equilibrium is defined as a set of possible actions a given classification of rational agent could perform, where any action outside of this set could change the Nash equilibrium itself. A possible introduction of biomimetics to such a modeling is to model each rational agent as an amygdala and a reptilian cortex. The game theory analog, in such a model, would then model an ecosystem graph, where every rational agent has an amygdala-based (emotions) or reptilian-cortex-based (information-deduced logic) assessment of the other rational agents they share a Nash equilibrium with.
\indent Biomimetics is heuristic and actuarial reasoning on biomass by an automaton.
\chapter*{Chapter 9: Theology of Opportunity Cost}
\indent \indent Opportunity costs are a tradeoff between figments of stratifiable economic utility. The tradeoffs exist, in my best judgment, because of where a species is on the Kardashev scale. As the almighty Jeff Bezos puts it, one must put off decisions for as long as possible, for multiple possible reasons, but primarily, in my understanding, due to the constantly changing entropy of cultural evolution and the likelihood of the emergence of new information, which may change the course of a decision. It is to be noted that a cutoff for a decision can, at times, be known, after which the decision is no longer viable. The closer a decision is made to the said cutoff, the higher the likelihood of capitalizing on new information in one's decision-making process.
\indent It is then a given that an opportunity cost decision is made every Planck time.
\chapter*{Chapter 10: Theology of Meaningful Genetic Leaps}
\indent \indent There are many competing social science and theological theories around the origin of life and species. The origin of species is a theory predating the uncertainty principle, and it observes species from the scope of the temporal linearizability of the now. Charles Darwin could not reconcile with the church of his time, which, due to its observations of celestial bodies and the dimensional leaps therein, may not have been able to reconcile the intuition of the uncertainty principle with Darwin’s observations of natural selection in the now.
\indent A meaningful genetic leap, ignoring replication noise and selected-out epigenetic change, also historically referred to as a genus leap, may be defined, in such a system of reasoning, as a unique set of DNA sequences seen in correlation to an observed set of dimensional leaps in stimuli, with a five-sigma or higher confidence interval.
\indent The uncertainty principle shows that the odd was always there as a determinant, but only the observer stochastically entangled with it, and from such an entanglement did the selections ensue.
\indent In the absence of a standard of discovery being met, a meaningful genetic leap could only be seen in light of the following Turing test problem statement:
\indent \begin{thoughtexperiment}
A primate is the Turing test judge evaluating an observed dimensional leap in datum; it finds itself convinced of a meaningful genetic leap in light of it $\therefore$ the Turing test for that dimensional leap being a meaningful genetic leap is passed, for that primate alone.
\end{thoughtexperiment}
\chapter*{Chapter 11: Theology of Biomass}
\indent \indent The success of natural selection is likely bound to growth in biomass. The diversity of species has been declining, with mounting extinctions. In one postulation, natural selection just cares about growth in biomass, but a species may care more about its own preservation. Natural selection, in this thought chain, always favors what ultimately amasses more biomass. Life has trained so long to optimize for growth in biomass that selecting out a big chunk of biomass, despite being counterintuitive, may be in service of having more net biomass in turn. The complexity of biomass may itself be a tool in the optimization of the said goal of converting more and more matter into biomass.
\indent Natural selection is much like a higher order transfer function, but it matters. Given its mathematical patterns, it can exist outside of energy, possibly within a machine learning algorithm. Information has been thought of as a constraint of energy, but it may be an independent thing of a different principle and linearizability etched unto energy or matter.
\indent Hive mind, flywheel effect and cross domain knowledge transfer are the deity of today. The true deity, however, is the fractal of energy, in which the primate itself is a regression.
\indent For when scale couldn’t be fathomed through quantum physics, the mob still enchanted in glory as the elephant walked the bridge.
\chapter*{Chapter 12: Theology of Telekinesis}
\vspace*{\fill} % Vertical space at the top
\makebox[\textwidth][c]{%
\parbox{0.8\textwidth}{%
% First line
\centering
“Then the Lord God formed a man from the dust of the ground and breathed into his nostrils the breath of life, and the man became a living being.” - Genesis 2:7 \\
% Vertical space in the middle
\vspace{1cm} \\
% Center aligned line
\centering
“Genus is muck, there the graviton from the black hole did telekinesis on muck.” - Theology of ML \\
}%
}
\vspace*{\fill} % Vertical space at the bottom
\end{document}