
Quantitative Intrastructural Reasoning

Theorizing in the Physical Sciences

Eric Bond
8 min read · Dec 7, 2020


Quantified models have also gone through differentiation, incorporated new paradigms in what are considered the hard sciences, and been co-opted into applications that transform them in step with the demands of practice. But they have the distinction of being more verifiable, in some cases incontrovertible up to this point, and capable of attaining the status of natural law, which is still nearly impossible for personal psychology. Perhaps the state of quantification as primarily addressed to the inanimate is a consequence of developmental tradition. Motion and minerals are mercurial, ‘external’ phenomena, while human behavior conforms more closely to natural intuition and effortlessly to the fulfillment of will and desire. Perhaps this is one of the reasons that Western culture inclined towards concepts of mechanistic, ‘physical’ matter and nonmechanistic soul, neglecting the overdue formulation of biological and then psychological material and mechanism with attendant laws of formal causality. Relatedly, everything located in the mind is more value-laden, of greater ethical import, and thus vulnerable to bias that can inhibit the revisional process. So once understanding of material environments superseded our somewhat arbitrary presumptuousness about the spiritual and began to be elaborated mechanistically, it far outpaced empirical investigation of the mind, which is only beginning to achieve escape velocity from prejudice and ancient intuition and arrive at systematic, universalizing objectivity.

Hard sciences revolutionized our way of thinking about technology, but there has been some resistance to revised theorizing about ourselves, even though such revision is inevitable and has in fact had a modest, under-the-radar effect on common beliefs, as in the cultural assimilation of some Freudian concepts. Hindrance comes from fear of social disorder on the part of authorities whose governance is jeopardized, from new power structures that go through oppressive, reactionary initiations, and from a growing capacity to force behavior via psychologically primitive yet highly effective techniques of propagandistic control, which kindles false confidence among members of higher socioeconomic strata and rekindles class antagonisms. Somehow the culturally predominant have arrived at the conviction that they can convince the populace of its equal status while sucking it dry of capital and diminishing its standard of living, a trend which should prompt serious, soul-searching reflection by everyone. The way psychoanalytic theory, evolution-based models of human nature, and additional facets of science have been poorly understood, deficiently incorporated, and made corruptive to institutions bodes ominously for the future of civilization: social Darwinist tendencies and diagnostic conventions are the first rumblings of potentially global forces of destruction.

Getting back to quantification, we have tended to project our own will into phenomena we examine at the beginning of a modeling effort, anthropomorphizing the world as intentional. Disciplines such as alchemy and astrology, which evolved into chemistry and astronomy respectively, inspired much speculation about whether they evidenced sparks of divine will. Before the invention of the telescope, the Earth was thought to be enveloped in a spiritual empyrean, and alchemists regarded their materials as composed of spirit or ‘pneumatic’ substances to the extent that transmutations were invisible. Precise standardization of measuring instruments allowed natural philosophers and then scientists to finely control laboratory environments, repeating procedures to exactitude and developing numerically defined concepts such as velocity, acceleration, mass, heat and energy, along with statistical margins of error, so that variables both internal and external to experimental designs became keenly perceptible, rigorously predictable and mechanistically modelable.

One of the foundations of our physical understanding of the material world is the first law of thermodynamics, which states that energy, and with it the mass to which energy is equivalent, is never created nor destroyed but only converted into different forms, such as kinetic and potential energy, heat (energy carried in molecular motion), electromagnetic radiation, and the energy bound up in varying particle structures. It was derived from experimental setups insulated from the rest of the laboratory to such an extent that no phenomenon of conversion escaped measurement. Researchers found that exothermic reactions heat up their surroundings and endothermic reactions cool them to a degree precisely determined by the reacting masses and atomic structures. Negligible variation between trials of the same reaction process with the same instrumentation, alongside conservation of quantities of mass and energy regardless of the direction, rate, amount and frequency of change, implied that the content of a material system can be redistributed but never vanish nor spontaneously generate. In classical physics as well as in chemical reactions between atoms and their subatomic particles, every physical process has seemed to affect only proportions of motion, mass and structural configuration, not alter the scale itself upon which they are balanced.
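
To make the bookkeeping behind this law concrete, here is a minimal sketch in Python, with purely illustrative numbers rather than data from any particular experiment: the internal energy a system loses shows up, joule for joule, as heat and work received by its surroundings.

```python
# Illustrative energy bookkeeping for the first law of thermodynamics:
# the change in a system's internal energy equals heat added minus work done,
# delta_U = Q - W, so nothing is created or destroyed, only transferred.

def internal_energy_change(heat_added_J, work_done_by_system_J):
    """First law: delta_U = Q - W (all values in joules)."""
    return heat_added_J - work_done_by_system_J

# Hypothetical exothermic process: the system releases 500 J of heat
# (Q = -500 J from the system's point of view) and does 120 J of expansion work.
Q = -500.0
W = 120.0
delta_U = internal_energy_change(Q, W)

# The system's internal energy drops by exactly the amount handed to the
# surroundings as heat and work; summing system and surroundings gives zero.
surroundings_gain = -Q + W   # heat absorbed plus work received
print(delta_U, surroundings_gain, delta_U + surroundings_gain)  # -620.0 620.0 0.0
```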

The law has never been unequivocally violated by a laboratory experiment, and so we of course lack a model describing how it could be, but it only applies in the context of standard atomic theory, so even this extremely reliable presupposition is still no more than a conditional assumption. It is for the foreseeable future at least hypothetically possible that something in the unknown universe could confound it, which might transform it from a scientific law into a theory. The ability of photons to materialize spontaneously when nuclei split into subatomic particles, and the generation of nearly massless electrons upon the decay of comparatively heavy neutrons, already qualify the law’s classical form: rest mass alone is not conserved in such processes, though mass and energy taken together still balance, and further observations with more sensitive equipment may force sharper caveats.
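
As a rough illustration of how the books still balance in such particle processes, the following back-of-the-envelope check uses rounded standard rest masses for free neutron decay; the small mass deficit between the neutron and its decay products reappears as energy carried off by the proton, electron and antineutrino rather than vanishing.

```python
# Back-of-the-envelope mass-energy bookkeeping for free neutron decay,
# n -> p + e- + antineutrino, using rest masses in MeV/c^2 (rounded values).
m_neutron  = 939.565   # MeV/c^2
m_proton   = 938.272   # MeV/c^2
m_electron = 0.511     # MeV/c^2

# The "missing" rest mass is not destroyed: via E = m c^2 it shows up as the
# kinetic energy shared by the decay products and the antineutrino's energy.
q_value = m_neutron - (m_proton + m_electron)
print(f"Energy released: about {q_value:.3f} MeV")  # about 0.782 MeV
```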

This kind of caveat-making has already occurred with Newtonian mechanics, within which scientists formulated what was termed the law of universal gravitation. From the 17th to the late 19th century, all motion of matter at the macroscopic level seemed explicable in terms of relative mass, with smaller bodies collapsing into or revolving around larger ones, a mutual but disproportional pull we all know as gravity. This force concept was used to account for the arrangement of objects on Earth along with friction and erosion, the paths traced by moving objects, their inertia, the additive influences of forces acting jointly, even the movements of celestial bodies. All of this mass was described as three-dimensional, with time being a definition of the strength of mass-generated force in terms of rate, an epiphenomenon of inanimate substances obeying mechanistic laws of size proportion. Even individual atoms were pictured in theory as a nuclear core, like the sun, with electrons orbiting at some characteristic rate.
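
For reference, the force law this paragraph describes fits in a few lines of Python; the sketch below uses rounded textbook values for the Earth and Moon to show how the pull is mutual while the resulting accelerations are wildly disproportional.

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
# The force on each body is equal in magnitude, but the accelerations
# (a = F / m) differ enormously because the masses differ.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2

m_earth = 5.97e24      # kg (rounded)
m_moon  = 7.35e22      # kg (rounded)
r       = 3.84e8       # mean Earth-Moon distance, m (rounded)

F = G * m_earth * m_moon / r**2
print(f"Mutual force: {F:.3e} N")
print(f"Moon's acceleration:  {F / m_moon:.3e} m/s^2")
print(f"Earth's acceleration: {F / m_earth:.3e} m/s^2")
```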

For centuries physics seemed to be fully accounted for by these classical principles of mass and motion first addressed by Isaac Newton, but further observation at the boundaries of instrumentation revolutionized the science. The behavior of light had been something of an enigma, as it seems to exist as a wave under many conditions but to act more like a series of quantized energy bundles when emitted or absorbed in discrete segments by atoms. Albert Einstein proposed that light is composed of particles called photons, discrete quanta of energy that are emitted and absorbed whole even though light propagates in a wavelike manner. He also performed thought experiments that reify rate; in this schema, the speed of purportedly massless light in a vacuum is a constant top velocity, and rates of interrelated motion in massive objects span a spectrum from relatively fixed to hurtling around each other at dizzying speeds of hundreds of thousands of miles per hour.
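
The ‘quantized energy bundles’ mentioned above can be put in numbers with Planck’s relation; the sketch below, using green visible light as an arbitrary example, computes the fixed increment of energy a single photon delivers when absorbed.

```python
# Planck's relation for a single photon: E = h * f, with c = wavelength * f.
h = 6.626e-34      # Planck constant, J*s
c = 2.998e8        # speed of light in a vacuum, m/s

wavelength = 532e-9            # green light, m (arbitrary example)
frequency = c / wavelength     # Hz
energy_J = h * frequency       # energy of one photon, joules
energy_eV = energy_J / 1.602e-19

# Each absorption event transfers this whole amount or nothing at all.
print(f"One 532 nm photon carries ~{energy_eV:.2f} eV")  # roughly 2.3 eV
```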

The theory of relativity interprets gravity in terms of four-dimensional spacetime, the mathematical substrate of rate measurement, similar to three-dimensional space’s role as the substrate of particularity. From relativity theory’s frame of reference, scientists can account for further cosmic events, such as shifts in electromagnetic radiation’s frequency upon entering the gravitational field of a black hole, theorized as caused by four-dimensional curvature of three-dimensional space within which light bends. The theory also describes the variable acceleration of objects under the influence of varyingly massive ones, as if the warping of space’s shape by mass acts as a sort of proportionally broad and eccentric rate funnel. Relativity, though still a conditional model, can predict more of the observable universe than the traditional concept of gravitation, which became a special case of relativity applying only where gravitational fields are weak and speeds are modest, as in earthlike environments and most of our own solar system. The Newtonian framework was demoted from a universal law to the ‘theory of gravity’.
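
One of those further cosmic events, the frequency shift of light climbing out of a gravitational well, has a simple weak-field estimate; the sketch below uses the Sun as the example mass, well short of the black-hole regime where this approximation breaks down.

```python
# Weak-field gravitational redshift: light leaving a body of mass M from
# radius r is shifted in frequency by roughly delta_f / f = -G * M / (r * c^2).
# (An approximation valid far from the strong-field regime of black holes.)
G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s

M_sun = 1.989e30     # kg
R_sun = 6.96e8       # m

fractional_shift = G * M_sun / (R_sun * c**2)
print(f"Light leaving the Sun's surface is redshifted by about {fractional_shift:.2e}")
# roughly 2e-6, i.e. a couple of parts per million
```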

Analysis at the subatomic level also reveals a close relationship between experimental design, image perceptions and concepts, and mathematical measurement. The famous double-slit experiment in quantum physics highlighted some surprising features of matter. Two slits in a wall-like apparatus were spaced a narrow distance apart and bombarded with streams of electrons. When a fluorescent screen was placed behind the wall at close distance, it registered two bands of brightness, one beyond each slit, as if particles passing through them had taken a nearly direct, uninterrupted route and clustered around an average location of contact. When the screen was placed further away, with the electrons allowed to propagate more freely in space, it instead registered a wave interference pattern, with bright bands corresponding to in-phase waves and dark bands to out-of-phase waves; yet the interference pattern emerged only as an additive effect of large quantities, with each individual electron registering at a specific location on the screen, as though still a particle. If a single electron was fired at the wall, its passage through one or the other slit was equally probable, a fundamental unpredictability only modelable over sustained periods as a statistical trend in relation to bulk quantities of particles. Placing a sensor at one or the other of the narrow slits to detect the passage of these single particles obliterated the interference pattern, so that electrons once again showed up on the fluorescent screen as two clustered bands.
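
In the idealized far-field case, the banded pattern described here follows a simple intensity formula; the sketch below, with arbitrary illustrative values for slit spacing, wavelength and screen distance, computes where the bright and dark fringes land.

```python
import math

# Idealized two-slit interference in the far field: the probability of a
# particle landing at position x on the screen is proportional to
# cos^2(pi * d * x / (wavelength * L)), where d is the slit spacing and
# L the slit-to-screen distance. Bright bands sit where the two paths are
# in phase, dark bands where they are out of phase.
d = 1e-6             # slit spacing, m (illustrative)
L = 1.0              # distance to screen, m (illustrative)
wavelength = 50e-12  # ~50 pm, roughly the de Broglie wavelength of a 600 eV electron

def relative_intensity(x):
    """Relative arrival probability at screen position x (metres)."""
    return math.cos(math.pi * d * x / (wavelength * L)) ** 2

# Maxima sit at multiples of wavelength*L/d, minima halfway between them.
fringe_spacing = wavelength * L / d
for n in range(3):
    x_bright = n * fringe_spacing
    x_dark = (n + 0.5) * fringe_spacing
    print(f"bright at {x_bright*1e6:.1f} um: {relative_intensity(x_bright):.2f}, "
          f"dark at {x_dark*1e6:.1f} um: {relative_intensity(x_dark):.2f}")
```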

Apparently electrons, depending on the experimental setup, exhibit both wave and particle behavior, with the wave structure being delicate enough that small disturbances to the system, such as those caused by the sensor, can dissolve it and reestablish particularity. From this double-slit experiment, with its image recording of light and dark bands as well as very fine mathematical adjustments to the system, whole new physical properties became observable. Subatomic particles had been demonstrated to consist in a particle/wave duality, as entities scientists call ‘wavicles’. Photons and atoms also manifest these wavicle properties, and though the effects are more difficult to generate in the lab as mass increases, molecules with as many as a thousand atoms have evinced a wavicle nature. The pattern-destroying effect of the sensor inaugurated the concept of quantum decoherence and its counterpart, coherence. Subsequent experiments found that wavicles such as photons, electrons and atoms exist in the form of what is called quantum ‘superposition’, multiple overlapping phase states modeled with the Schrödinger equation.
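
The language of superposition and decoherence can be made slightly more concrete with a toy two-state model, a deliberately simplified stand-in for the Schrödinger formalism rather than a simulation of any real apparatus: interference lives in the off-diagonal terms of a density matrix and vanishes when a which-path measurement wipes them out.

```python
import numpy as np

# Toy two-state 'which slit' system: |psi> = a|left> + b|right>.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
psi = np.array([a, b], dtype=complex)

# Density matrix of the coherent superposition: the off-diagonal terms carry
# the phase relationship responsible for interference.
rho_coherent = np.outer(psi, psi.conj())

# Decoherence (e.g. a sensor at one slit) suppresses the off-diagonal terms,
# leaving a classical 50/50 mixture with no interference.
rho_decohered = np.diag(np.diag(rho_coherent))

# Interference shows up in the expectation value of an observable that mixes
# the two paths; here a simple operator with only off-diagonal entries.
interference_op = np.array([[0, 1], [1, 0]], dtype=complex)
print(np.trace(rho_coherent @ interference_op).real)   # ~1.0, fringes visible
print(np.trace(rho_decohered @ interference_op).real)  # 0.0, fringes gone
```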

Further lab work revealed more unintuitive aspects of quantum-scaled environments. An experimental setup in which two electrically charged metal plates were aligned with a minimal barrier of atomic material between them revealed electrons crossing a barrier they lacked the classical energy to surmount, seemingly in almost no time at all, a phenomenon that was called quantum tunneling, and protons were found to perform the same feat. An experiment with two photons emitted in opposite directions from the same source revealed that they can be quantum ‘entangled’: their phase states were found to be statistically related, with measurement of one photon during its travel correlating almost instantaneously with the state of its partner irrespective of spatial separation. Entanglement effects have also been found between more massive particles such as electrons, protons, even whole atoms. It was also discovered in the photon experiment that introducing a slide into its path less than a picosecond (a trillionth of a second) before its contact with a terminal measuring device also generated an entanglement effect, even though the theoretical speed of light in a vacuum implies the photon would have already passed the point of intersection. Thus, the causality of photons appeared to operate retroactively as well as by way of seemingly instantaneous entanglement, even though no usable signal has been shown to travel faster than light. On some interpretations, electromagnetic radiation strains its own supposed speed limit and even the Newtonian model of sequential cause and effect. These experiments are revolutionizing our understanding of what matter is and does, with scintillating implications for knowledge of phenomena beyond the laboratory.
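
As a numerical illustration of tunneling itself, leaving aside the contested faster-than-light framing, the sketch below applies the standard exponential transmission estimate to an electron meeting a rectangular barrier taller than its own energy; the energies and widths are arbitrary illustrative choices.

```python
import math

# Tunneling through a rectangular barrier of height V0 and width L by a
# particle of energy E < V0: the transmission probability is roughly
# T ~ exp(-2 * kappa * L), with kappa = sqrt(2 * m * (V0 - E)) / hbar.
hbar = 1.055e-34      # J*s
m_e  = 9.109e-31      # electron mass, kg
eV   = 1.602e-19      # joules per electronvolt

def transmission(E_eV, V0_eV, width_m):
    """Rough WKB-style tunneling probability for an electron."""
    kappa = math.sqrt(2 * m_e * (V0_eV - E_eV) * eV) / hbar
    return math.exp(-2 * kappa * width_m)

# Illustrative numbers: a 1 eV electron meeting a 2 eV barrier.
for width_nm in (0.5, 1.0, 2.0):
    T = transmission(1.0, 2.0, width_nm * 1e-9)
    print(f"{width_nm} nm barrier: T ~ {T:.2e}")
# The probability falls off exponentially with barrier width, which is why
# the effect is only conspicuous at atomic scales.
```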
