The Transition to a Philosophy of Science and Technology
20th century philosophy
By the close of the 19th century, the European institutionalization of historical analysis was well underway. Anthropology was exploring the world: sending expeditions to study foreign lands, mastering foreign languages, producing increasingly competent translations into European tongues, gaining a better sense of how behavior and belief vary by culture, excavating important sites in ever more remote areas, and combining this archaeology with knowledge of indigenous life to construct theories of human prehistory. Much of the effort was initially subordinate to political and economic considerations, as European governments sought access to resources in distant parts of the world as well as imperial hegemony, but science steadily gained in prestige, accompanied by more lucrative funding and greater independence, so that interest in illuminating the facts of human history took on a life of its own. Biologists were assembling a chronology of the evolutionary past based on geological science and the fossil record; by the mid-19th century scientists were convinced that life had existed for hundreds of millions of years. The study of historical literature and written material of all kinds was also a growing field, fashioning deeply perceptive theories of how myriad cultural transitions had led to contemporary civilization.
The science of intentionality, however, was still in its infancy. Most European natural philosophers, and then early scientists, held that animals are governed by instinct, an appetite-like driver of behavior acting to satisfy physical needs, while humans were spiritual beings with a ‘rational’ function superior to the rest of nature. It should have been obvious that human beings sometimes act in animalistic ways for elaborate reasons, and only in the 21st century are biological theories finally starting to give animals the credit they deserve as reasoning beings that manage complex emotions. But in the 19th century and earlier, psychological theory amounted to little more than “humans have understanding; animals don’t”, and individuals whose minds were judged abnormal were treated like animals, locked away in asylums for life or worse unless they came from respected families.
The first systematic forays into a more modern conceptualizing of the psyche were probably carried out by the founders of linguistics and phenomenology. The Swiss philosopher Ferdinand de Saussure put forth a theory of language built around the idea that linguistic structure conforms to an intricate framework of rules and conventions of correct construction, one that exists semi-independently of the content it expresses rather than being wholly determined, as passive representation, by the conceptualized facts and underlying perceptions to which we intuitively attribute the reality of what we mean. Language operates according to its own principles of action and development: an organ of the mind on a par with concepts and perceptions, a substantive cause, a distinct set of processes, with a somewhat separate evolutionary history. This was a seminal step toward scientific psychology, the insight that language and thought are analogous in their causality, conception being the rational function and language its own symbolic function. Concepts had of course always been regarded as an influence on the form of language as we construct our communications, but Saussure made it comprehensible that language can be an influence on the forms of concepts, neither merely nominal nor simply identical to the real. This divided the mind into two causally distinct modules, made it possible to theorize psychology as a machinelike system of parallel mechanisms, and helped break rational modeling even more explicitly loose from the naive assumption that its symbolic forms of theoretical representation are identical to the structure of reality, opening a niche for a mass movement toward extremely unintuitive, fallacy- and error-conscious science. Phenomenologists such as the German Edmund Husserl arrived at similar insights regarding language, and further refined theories of the mind as a growing set of modular forms and functions, making Kantian reason look more and more like the geocentric model of the cosmos.
The German philosopher Gottlob Frege introduced Saussure-like notions to mathematics, explaining quantification as an independent domain capable of undergoing its own process of change, an expanding set of structures with their general principles (advanced forms, axioms, proofs, and their elaborations) derived via symbolic logic. He introduced the idea of ‘sense’, the concept in expression as an organization of ‘signs’, and opposed it to ‘reference’, the addressing of an expression to facts insofar as they are given by perception: a dual psychology mediated by the interposing of mathematical structure. Frege spoke of a “recarving of content” by which mathematics would be continually reconstituted as the nature of concepts and perceptions changes with both the advancement of knowledge and modifications to technical practice. This was a strong influence on modernity’s pure math, the resumption of Platonic regard for quantitative entities as a universe of immaterial constructs operating according to rules of potentially infinite complexity and variation, a limitlessness that seems to transcend perceptual reality.
At the turn of the century, the medical establishment began assembling empirical methods for analysis of the first-person mind, trying to make the subjective and intersubjective more objective and scientific. Sigmund Freud of Austria was probably the first clinical psychologist, observing, recording and analyzing patterns in the idiosyncrasies his subjects evinced during therapeutic conversations to fashion a theory of psychical structure and development. He refined his techniques of patient/practitioner interaction into a method called psychoanalysis, which provided the foundation for future counseling practice. The American psychologist William James inspired the pragmatic paradigm in psychology, which theorizes components of the psyche in terms of their functionality for natural and social environments. The Swiss Carl Jung theorized the psyche, as well as treatment technique, in its evolutionary dimension, investigating the origins of psychical features in the ancient past and attempting to integrate these discoveries into psychoanalysis.
The early 20th century Austrian philosopher Ludwig Wittgenstein, in his book Tractatus Logico-Philosophicus, embarked upon a synthesis of inferential reasoning with intrastructural reasoning, offering innovative insights into the logic of fact-based truth-value and conjoining this with visualized description, a supplementary representation of logic as spatial structure. He generalized the perceptual underpinnings of explicit truth with the concept of the ‘atomic fact’, then from this starting point constructed a system of basic premises and derivative conclusions expressed in symbolic terminology, infusing it all with what he called “picture thinking” and giving his logical deductions a sort of schematic architecture. His approach helped found the analytic philosophy that theorized logical meaning as a holistic universe of interrelated particulars, inspiration for positivistic consolidation in the domain of abstract reasoning, which soon seeded technologies of more intricate logical mechanism such as computers.
In Philosophical Investigations, Wittgenstein reflected in a more informal way on the relationship between expression and truth, promoting a notion that came to be called the ‘language game’: an advancement of the idea that the ‘sense’ of language is determined not only by its relationship to fact and truth-value but also by conventions of usage providing ground rules for communicative relationships between the individuals employing it, with a history and cultural significance equally formative to meaning. This was indicative of the move toward understanding truth more naturalistically, with reference to its many functions that vary by context. Linguistics, with Saussure as its founding father, was going in a similar direction, becoming the science of how individuals and cultures acquire, convey and evolve complex forms of knowledge that only take hold in the presence of language use.
Pure math was also making strides as thinkers worked on drawing all the strains of conceptual development into a unified metamathematical theory. Frege had attempted to put math on the firm foundation of a presumably finite array of logical principles, the explication of which would generalize the whole field to completion. The Austrian/American Kurt Gödel’s incompleteness theorems foiled this ambition, demonstrating that the underlying logic of a system of mathematical generalizations can never be closed as a self-supporting, verifiably self-consistent totality of requisite proof. This led philosophers to analyze the connections between quantitative form and conceptualizing, with humanity’s mathematical thinking theorized as emergent from organic intuitions, generated as open-ended culture in some kind of evolutionary way by the mind in its relationship to perceptual, social and natural environments. But the essential causes of math’s intuitiveness remained largely unspecified: the degree to which the forms and expressions of math are some kind of mental construction or a facet of reality’s deep structure was uncertain. The salience of this theoretical conundrum was eased by the maturation of set theory (a set is, for instance, all integers, or any three integers, or any three consecutive integers, or all functions, or all functions of slope-intercept form). It is a flexible modeling paradigm for math itself, enabling thinkers to easily analogize quantitative concepts at any level of abstraction, intricacy or generality by interpreting them as set types of infinite possible parameterization. This definitional framework for unboundedly streamlining the metaorganization of mathematical content eliminates any compounding of unintelligibility as pure math expands and its applications to physical theory and technology diversify. We have a conceptual superstructure, set theory, making the math itself as simple, direct and coherent as we will ever need it to be. The only limit to human knowledge is our ability to conceptualize information as mechanism, seeing patterns in the quantified facts of technicalized observations and then refining these into theories of causality.
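To make the set-theoretic framing concrete, here is a minimal sketch in Python (an illustration added for this discussion, with invented function names, not part of any formal presentation of set theory): a defining rule plus parameters picks out a collection at whatever level of generality is needed, from a handful of consecutive integers to a whole family of slope-intercept functions.

```python
# A toy rendering of "sets of infinite possible parameterization":
# each helper builds a collection from a rule and its parameters.

def consecutive_integers(start, length=3):
    """The set of `length` consecutive integers beginning at `start`."""
    return set(range(start, start + length))

def slope_intercept(m, b):
    """One member of the set of all functions of slope-intercept form y = m*x + b."""
    return lambda x: m * x + b

some_integers = set(range(-5, 6))   # a finite stand-in for "all integers"
triple = consecutive_integers(7)    # {7, 8, 9}
line = slope_intercept(2, 1)        # a particular linear function

print(sorted(triple), line(3))      # [7, 8, 9] 7
```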
It had finally become clear that we should not expect possibilities for truth in mathematics to be constrained or determined by existing logic, but many thinkers nonetheless recognized practical potential in combining the concepts of logic and math. The 17th century German philosopher Gottfried Leibniz had invented a very early technique for expressing logic in a sort of mathematical language, as binary code (1’s and 0’s) for the equivalent of true/false dichotomies. The late-19th century American philosopher Charles Sanders Peirce and other logicians expanded upon this paradigm to encompass, in mathematical and otherwise symbolic terms, what were becoming increasingly complex logical concepts. After early 20th century developments in mathematical logic, it became conceivable that rule-governed symbolic systems could be applied to large sets of mathematical quantities and then used as instructions for the procedures of machine mechanisms with their logical structure, and the idea of the modern computer was born. In the 1930s, the English mathematician Alan Turing formalized this idea as his abstract Turing machine, a conceptual blueprint for general-purpose computation; during World War 2 he helped design the electromechanical codebreaking machines at Bletchley Park that cracked the German military’s Enigma encryptions, a contribution of enormous consequence to the Allied victory, but it would require advances in chemistry and electrical engineering to bring computing as high technology to the masses.
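As a rough illustration of the abstract machine Turing described, the sketch below simulates a toy transition table in Python. The representation (a dictionary mapping a state and symbol to a new symbol, a head movement, and a next state) is a conventional modern simplification rather than Turing’s own notation, and the bit-flipping machine is invented purely for demonstration.

```python
# A minimal Turing machine simulator: rules applied to symbols on a tape.
# transitions maps (state, symbol) -> (new symbol, head move, next state).

def run_turing_machine(tape, transitions, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "halt":
        if head >= len(tape):
            tape.append(blank)          # extend the tape on demand
        symbol = tape[head]
        new_symbol, move, state = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Toy machine: flip every bit, then halt when a blank cell is reached.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", flip_bits))  # prints 0100
```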
A philosophy by which to understand the nature of human knowledge was still being pursued, and the American philosopher Willard Quine made contributions in this area. He formulated a positivist conception influenced by Wittgenstein’s vision of logic, theorizing the episteme as a gigantic, mutating web of interrelated beliefs and theories. Quine addressed the nature of truth with his concept of ‘confirmational holism’: verifying a hypothesis or theory in one area of our body of knowledge changes, if only slightly, the general structure of the whole, especially the beliefs in closest proximity to it. He regarded math as “indispensable” to epistemic progress, one of the most important types of conceptualizing in our attempts to process information and make sense of reality.
Quine’s ideas were not the cause of modern developments in science, but they were certainly apropos, as empiricism became ever more reliant on complex quantitative calculations to model collections of facts expressed in numerical form. The quantifying of fact made massive numbers of experimental trials and other sources of hypothesis testing much easier to compare, with patterns in the perceptual content provided by science’s many observational instruments and methods thrown into sharp relief as approximations, averages for instance. Deeper understanding of all kinds of phenomena was facilitated by applying complex concepts of dimension, distribution, and rate of change.
As mathematics synthesized with methods of observation, the immensity of factual information and categorization became more manageable, and where scientists had been limited by relatively simplistic intuitions when constructing theories — proportion, direct and inverse correlation, linear and exponential change, force vectors, cycles — more complex concepts became possible. Advanced notions such as unique energy quantization for each atomic element, retroactive causality in elementary particles, fluctuating rates of evolution or ‘punctuated equilibrium’, amplification of localized reactions by biochemical pathways in sensation and elsewhere, quantum entanglement, and transmission of electricity were modeled, transitioning humanity to more modern technologies.
Contemporary science’s reply to Humean skepticism, the uncertainty involved in extrapolating from past conditions or integrating dispersed information, is statistics. This field supplies techniques for quantifying the probability of error in the relationship of any particular datum to any potential geometric model of a data set, making decisions about how to fit models to data much more exact as deviations are pooled mathematically and processed collectively according to well-defined standards. The general implications of spatially and temporally complex or large-scale math for hypotheses and theories have become easier to assess, corroborated or controverted with greater precision.
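A minimal sketch of this pooling of deviations, using invented data and only the Python standard library, fits a straight line by least squares and condenses the residuals into a single standard error; the numbers and variable names are illustrative assumptions, not drawn from any real study.

```python
# Fit y = slope*x + intercept by least squares, then pool the deviations
# (residuals) into a residual standard error that quantifies the fit.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]        # invented observations
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
rse = (sum(r ** 2 for r in residuals) / (n - 2)) ** 0.5   # n - 2 degrees of freedom

print(f"y = {slope:.2f}x + {intercept:.2f}, residual standard error = {rse:.2f}")
```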
Descartes’ skeptical challenge to rationality has been met by the scientific tradition of peer review. Once a research project has been completed, academics present their work at conventions; attendees ask questions, getting as much clarification and further analysis as desired. Results are also published in journals so the whole scientific community has an opportunity to reflect upon and discuss new facts and theories as well as incorporate ideas into their own professional endeavors. This collectivity is a triangulation to the most valid perspective on reality; “I think, therefore I am” is blown up to massive proportions, as “we can agree, therefore we know”. It is not a perfect system, for groupthink and irrational bandwagons sometimes take hold, in addition to personal rivalries and the occasional crucified reputation, but science often involves real honesty, transparency and accountability, a combination fostering as much intellectual integrity as any subculture has yet achieved.
The observational side of positivism has matured in the context of science as an implementing of technologically advanced instruments, enlarging the scope of human perceptivity so that we have more information at our disposal, increasing theoretical modeling’s efficacy and enriching our image of the world. The conceptual side of positivism has crystallized into the scientific method, an analytical procedure that is essentially recursive, producing ever more potent hypotheses as investigators design an experiment, collect data, process results, draw conclusions, and use those conclusions to make further predictions, modifying theory as they go. The method has been clarified and generalized to such an extent that it is simple enough for schoolchildren to grasp, and students are typically instilled with this mode of thinking at a very young age.
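The recursion can be caricatured in a few lines of Python. The ‘theory’ below is just a numerical estimate, the ‘experiment’ a batch of simulated noisy measurements, and the revision rule an arbitrary averaging choice; all of these are placeholder assumptions meant only to show the loop of predicting, measuring, concluding, and revising.

```python
import random

TRUE_VALUE = 9.81   # stands in for a fact of nature the theory tries to capture

def run_experiment(n_trials=20):
    """Collect data: noisy observations of the quantity of interest."""
    return [TRUE_VALUE + random.gauss(0, 0.3) for _ in range(n_trials)]

def revise(theory, data):
    """Draw a conclusion from the data and fold it back into the theory."""
    estimate = sum(data) / len(data)
    return (theory + estimate) / 2   # weight prior theory and new evidence equally

theory = 8.0                         # an initial, inaccurate hypothesis
for cycle in range(5):
    data = run_experiment()          # design and run the experiment
    theory = revise(theory, data)    # process results, modify the theory
    print(f"cycle {cycle + 1}: current estimate = {theory:.3f}")
```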
Hegel’s vision of history is coming to fruition in the context of Quine’s confirmational holism: research trajectories of the many scientific fields are starting to interact with each other, sharing their theoretical paradigms and information so that the episteme is becoming more integrated. The first examples of this movement were contributions of biochemistry to medicine, then neuroscience to psychology; these interdisciplinary efforts have been spectacular successes, giving inspiration to astrophysics (astronomy and physics), biogeochemistry (biology, geology, chemistry), environmental science (ecology, politics, economics, sociology), ethnic studies (anthropology, sociology, psychology), the previously discussed quantum biology (physics, biology, chemistry), with many more instances as well as all their subdivisions.
Ever since the days of the earliest computing machines, scientists have dreamed of manufacturing more powerful computers. The first electronic computers were networks of vacuum tubes, programmable with math-based languages: hierarchies of instructional protocols, resting at the most fundamental level on binary code, that modified the input of electricity through logic-based linkages between tubes, a mechanistic system of ‘on’ and ‘off’ states for representing and processing information. This activity was then output to primitive visual interfaces, strips of paper tape with characters printed on them, for instance. These machines were extremely expensive, operated at a snail’s pace by today’s standards, and the most powerful could be as large as a whole room while doing no more than elementary tasks like managing a company’s payroll database: basically glorified calculators and file sorters.
This all changed with the invention of the microchip, made of silicon, a semiconducting material, onto which are etched tracks and linkages forming logic gates: intricate patterns that channel and modulate the flow of electricity, carrying out the same type of information processing as vacuum tubes but on a microscopic scale. The logic-based networks of a microchip are called ‘integrated circuits’, combinations of individual circuits that can each be fabricated with millions of logic gates for organizing electrical conductance to represent data. The whole apparatus of circuits has to be synchronized by signals from a clock mechanism, an additional set of wires and circuits whose design has become sophisticated enough to be included in the structure of the microchip itself, making the whole system much more efficient. Microchip technology has become so advanced that all of a computing device’s information processing is directed by a single chip, sometimes as small as a thumbnail; in personal computers (PCs) it is called the ‘central processing unit’ (CPU). Round, flat disks on which microscopic magnetic particles can be arranged and rearranged an effectively unlimited number of times store information as memory. Together with interfacing devices such as monitors, keyboards, mice and control pads, microchips have become central to almost all technology, impelling a gigantic leap in humanity’s information processing capabilities. Computer technology is revolutionizing society as the internet disseminates astronomically huge and growing amounts of data, the ability of science to model extremely large assemblages of information soars, and computers become complex enough to analyze the world at levels that may, if current trends continue, soon exceed human intelligence.
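To illustrate how such gate-level linkages ‘organize electrical conductance to represent data’, here is a minimal Python sketch, a software stand-in rather than a description of any actual chip: two elementary gates composed into a half adder, the simplest circuit that performs arithmetic on binary ‘on’ and ‘off’ states.

```python
# Logic gates as functions on 0/1 states; composing them yields arithmetic.

def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Add two one-bit inputs, returning (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {carry}")
```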
Modern empiricism has synthesized conception and observation, our reasoning augmented by math and our perception enhanced by technology, instating a culture of technical method with seemingly unlimited potential to harness natural environments. Empiricism has settled into three main scientific fields: physical matter, individual psychology in humans and other organisms, and the study of societies; nature as inanimate, intentional, and collective. In the context of human decision-making and public policy, these domains (material, mental, and cultural) are starting to blend into a consciousness of the future as universal in its human meaning, a species-wide sphere of action characterized by the intrinsic mutuality of progress. We are close to setting out on a joint path that would quell Schopenhauer’s human suffering and dispel the uncertainties of Descartes and Hume through optimized living. Contemporary culture has so much promise, our ideals have become so practical, that even an average education can come tantalizingly close to envisioning the steps by which to attain them, conceiving universal incentives that would beget a utopia.
We can imagine what perfect life would be in concrete terms, the very real possibility of meeting every human’s needs, but then we look at actual conditions of the world: a superpower country squandering wealth and tarnishing its reputation with temporary expedients and isolationist policies, violent conflict that can flare up almost anywhere at any time, a global financial system in which periods of ubiquitous anemia are unavoidable and always threaten to put governments at odds with each other as well as overturn social systems, an upper class trying to consolidate dominance in an effort that has always proved futile and ultimately destructive to social order, a failure to uplift the poor into an enlightened lifestyle of regard for the long-term prospects of our species, and worst of all, environmental crises that may become severe enough to destroy civilization in some parts of the world and sabotage progress permanently. One wonders how we, with the incredible power of science and technology, can so consistently be on the verge of widespread disintegration and collapse. What about human existence makes it so vulnerable to setbacks?