
A Genealogy of Perfect Thinking, Told in Four Parts

In a world of total automation and artificial intelligence, what is the role of the human? Mining the history of perfect thinking to better understand its intellectual foundations in the present.

Published on Mar 21, 2019

Brian Eno has been an innovator in music production and technology since the 1970s. He recently commented on the ability of computers to capture the perfect sine wave, that is, to create music with absolute precision: “The least interesting sound in the universe probably is the perfect sine wave. It’s the sound of nothing happening. It’s the sound of perfection, and it’s boring . . . distortion is character, basically. In fact, everything we call character is the deviation from perfection. So, perfection to me is characterlessness.”1 Sound becomes interesting, in other words, when a human being’s inherent imperfection is part of the process. Meaningful art is produced in the give-and-take between technological capability and human creativity.
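To make Eno’s point concrete, here is a minimal sketch in Python with NumPy; the frequency, distortion, and noise values are arbitrary illustrative choices, not anything Eno specifies. It generates a mathematically perfect sine wave and then gives it “character” by deviating from it.

```python
import numpy as np

# One second of a "perfect" 440 Hz sine wave at CD sample rate.
sample_rate = 44100
t = np.linspace(0, 1, sample_rate, endpoint=False)
pure = np.sin(2 * np.pi * 440 * t)

# "Character" as deviation from perfection: soft-clipping distortion
# (tanh waveshaping) plus a little noise. Drive and noise levels are
# arbitrary, purely illustrative choices.
drive = 4.0
noise_level = 0.01
rng = np.random.default_rng(0)
distorted = np.tanh(drive * pure) + noise_level * rng.normal(size=t.size)

# The "imperfection" is simply the difference between the two signals.
deviation = distorted - pure
print(f"RMS deviation from the perfect sine: {np.sqrt(np.mean(deviation ** 2)):.3f}")
```

In this toy model, everything a listener would call character lives in that deviation; the untouched sine is, as Eno puts it, the sound of nothing happening.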

The ideal sine wave can function as a metaphor for the Singularity, which describes a world where the precision of technological automation and artificial intelligence replaces the messiness of human subjectivity. Deeply embedded in the concept of the Singularity is a drive for perfection. While the Singularity presents new challenges, the “perfect thinking” it represents has existed since the beginning of Western philosophy, and it has emerged and re-emerged in particularly potent forms in epochal moments in history. The Singularity is thus a new name for an old idea. From ancient Greece to the Enlightenment period to the Cold War rationality of the twentieth century, humans have theorized and sought technologies to understand and control the natural world. Each of these eras was marked by powerful, hegemonic ideas that positioned humanity as a problem to be overcome in the quest for perfection.

Some key challenges of the Singularity are captured by the following questions: In a world of total automation and artificial intelligence, what is the role of the human? Further, how do we imagine a role other than subordination of humanity under a regime of extreme techno-rationality? One way to resist such reduction is by understanding the intellectual ground on which such reduction stands. Genealogical critique is useful, as it seeks to mine the past for historical analogs that can shed light on the intellectual foundations of the present. As political scientist Wendy Brown argues, “Genealogy permits an examination of our condition that calls into question the very terms of its construction.”2 Political researchers Robert Ivie and Oscar Giner put it succinctly: “Seeing the strangeness of the past helps us see the strangeness of the present.”3 In uncovering a history of perfect thinking, we can better understand the contours of the present and be better equipped to deal with it in humane and productive ways that promote what Joichi Ito describes as a “culture of flourishing,” where people and technology coexist to the benefit of humanity.4

This essay tells the story of perfect thinking in four parts, describing each period and the perfection sought in that period. First, I discuss ancient Greece, specifically differences between Plato and Aristotle and the implications of their understanding of language and its relationship to democracy. Next, I examine the Enlightenment, where the hegemony of scientific rationality and objectivity was challenged by scholars representing a new form of humanism. Third, I examine the mid-twentieth century, specifically the techno-rationality of the early Cold War era. Finally, I examine the Singularity, which I position as the latest node on an intellectual timeline of perfect thinking, one that resembles these other eras but presents new challenges.

My goal with this genealogy is twofold. First, I seek to explore historical analogs for the Singularity in order to better understand how tensions between humans and technology were resolved in the past. Generally speaking, the Singularity represents the hegemony of objective reason, where the operations of society are free from humanity’s messy imprint. Second, I examine ways in which humanity has flourished through a synergetic relationship with technology. In each era of perfect thinking, humanism has emerged as a corrective of sorts, and it has been the combination of humanity and technology that makes progress possible. I assume that most of us would not want to live in a perfect sine wave; as such, my argument is geared toward developing the role of humanity alongside the Singularity. I conclude with a discussion of how we might fortify humankind in the new era by embracing the inherent imperfection, flaws, and endless creativity of our species.

1. Ancient Greece: In Which Humanity Becomes the Problem, or, the Role of Language in Human Understanding

The dream of a society unsullied by human frailty is at least as old as Western thought. Plato, perhaps history’s most famous philosopher, sought a rational means of obtaining objective Truth, and he positioned the subjective nature of the human faculties as a key obstacle. As he wrote, “The philosopher is in love with truth, that is, not with the changing world of sensation, which is the object of opinion, but with the unchanging reality which is the object of knowledge.”5 Language, a primary tool for humans, was by its very nature highly problematic. Plato especially attacked rhetoric—the art of discourse—as deceptive and concealing of the truth. Humans, then, were the obstacle, and steps needed to be taken for philosophers to outwit the changing, sense-driven nature of the human mind through dialectic, philosophy, and plain, simple language. Plato sought a form of Singularity for the period, advocating the tools of philosophy as a way to obtain perfect knowledge beyond the immediate and unfiltered comprehension of humans.

Aristotle challenged his mentor Plato, particularly his understanding of rhetoric. Rather than dismiss rhetoric as a problem, or at best a tool fit only for philosopher kings, Aristotle found in rhetoric a practical and productive art concerning the role of persuasion in teaching, advocating, and defending one’s self. As he argued, “Rhetoric is useful because things that are true and things that are just have a natural tendency to prevail over their opposites, so that if the decisions of judges are not what they ought to be, the defeat must be due to the speakers themselves, and they must be blamed accordingly.”6 In other words, persuasion was a key element in the give and take of democratic society, animating much of daily life. A standard of objectivity would make little sense here, given the temporal and shifting nature of democracy, which operates based on contingent information, probability, and persuasion rather than eternal truths.

Aristotle lectured and wrote key works on rhetoric and poetry, noting the many uses of artistic language for ends relating to democracy and ethics. These are the elements of lived reality, and as such, an exploration into how they work is more productive than dismissing them as merely problematic. The differences between the two philosophers play out forcefully in their competing conceptions of democracy. Whereas Plato famously mistrusted democracy and favored a society ruled by the learned elite, Aristotle had more nuanced views favoring constitutional government, with more room for mixed approaches to citizen engagement and self-rule.7 Aristotle’s Rhetoric, then, is a handbook of sorts for the practice of democracy.

So, who won the argument? In a word, both. Plato and Aristotle together helped initiate a debate and form a tension that persists in both academia and popular culture—the tension between perfect objectivity and the subjective experience of humans. At the crux of Plato and Aristotle’s disagreement, though, is the role of language and its place in philosophy, knowledge, and democracy. For Plato, language was dangerous in its ability to obscure truth and its power to persuade and deceive the masses. For Aristotle, language was the coin of the realm; that is, language is fundamentally human, and to act and progress within society requires a dexterous command of both discourse and persuasion. Crucially, this is a productive debate, and these tensions have ignited countless new ways of studying, thinking, and acting in the world.

2. Enlightenment: In Which Objective Knowledge Replaces God

After the Dark Ages stifled free inquiry in favor of enforced dogmatism, the Renaissance and then the Enlightenment renewed interest in understanding the natural world. A new technique—the scientific method—emerged as a carefully constructed means of capturing the operations of nature free from humanity’s imprint, and, as such, revolutionized thought by creating objective data and information. Truth was thus obtainable, and objectivity became a prized mode of inquiry. The practice of science as a dispassionate investigation of things as they are gained renewed vigor.

John Locke was a key Enlightenment figure, and like Plato before him, he attacked creative and artistic language as a problem to be overcome in the pursuit of human understanding. As he wrote, “Since wit and fancy find easier entertainment in the world than dry truth and real knowledge, figurative speeches and allusion in language will hardly be admitted as an imperfection or abuse of it.”8 Like Plato, Locke saw a dichotomy between Objective Truth and Subjective Art, with the latter distracting or detracting from the former in areas of human knowledge. In seeking a way to master the world by removing the imperfection of humans through empiricism and inductive reasoning, Locke’s thinking, and much of the Enlightenment more generally, represented a Singularity for its time.

But writing against John Locke was Giambattista Vico, who, despite being less well-known in the period, has become a key figure in the study of culture and society.9 Vico rejected the idea that humans could understand the world free from subjective interpretation, and he questioned the new religion of scientific method—not devaluing it as a tool for greater human understanding, but expressing concern with the prevailing wisdom that this was the best and only way to understand the world. He contrasted Locke’s “methods of our time” (especially the scientific method and formal logic) with different ways of knowing, including rhetoric, poetry, religion, mythology, art, and other humanistic approaches, which he collectively described as the ways of the ancients.10 Vico advanced a mythological understanding of human knowledge, arguing that humanity understands itself primarily through stories and language more broadly.

Vico further developed the concept of the sensus communis, or common knowledge, as a guide for action and a source of information. Building on Aristotle’s exploration of practical wisdom, Vico argued that human knowledge was in part a product of humans, and that new knowledge emerged from insight into the constructed nature of reality, gained through cultural critique, which he described in terms of rhetorical invention. The common sense of a society helps to form its culture, which provides a basis of understanding. As John D. Schaeffer observes, “Sensus communis enables the mind to form arguments. Vico attributes [the power of invention] to ingenium…a power of the imagination.”11 For Vico, the human imagination produced new insights and knowledge. This perspective was in contrast to the hegemonic belief that truth is outside of culture and obtainable only through science.

Vico pursued a theory that viewed humanity’s role in knowledge construction as a positive, or, at least, a fundamental element of what it means to be human. Now, of course, we can see that academic scholarship has embraced a multiplicity of ways of knowing, including the rational and objective sciences, the social sciences, which typically use variations of the scientific method to study human behavior, and the critical tradition, which reflects Vico’s sense of cultural critique to understand the construction and dissemination of knowledge. These approaches work together in our drive to comprehend a complex world, and thus speak to a necessary combination of the perfect and imperfect in the production and apprehension of knowledge.

3. The Mid-Twentieth Century: In Which Human Behavior Must Be Predicted and Controlled

Scientific thinking—broadly understood to include a variety of methods and theories—shaped the political and cultural landscape in especially potent ways during the Cold War. In the wake of the atomic bomb and the reorganization of the international order according to dominant nuclear superpowers, techno-rationality was considered the means by which the planet could avert another world war, and an uneasy tension would keep a relative peace. As science historians Lorraine Daston, Fernando Vidal, and Michael Gordin observe, “A loose conglomerate of game theory, nuclear strategy, operations research, Bayesian decision theory, systems analysis, rational choice theory, and experimental social psychology, Cold War rationality in its heyday seemed the last, best hope for the unification of the human sciences and the recipe for their successful application to everything from diplomatic negotiations to agricultural price-setting to social stability.”12 The emergence of new forms of social and political sciences could apply to all domains of life. Science would be used to predict and control people and societies, guided by a dream of perfection absent the inherent messiness and fallibility of humans. Techno-rationality was a Singularity for the age.

John F. Kennedy was perhaps the key public figure of this period, and he and his Harvard brain trust were some of the architects of the new governmentality. But Kennedy was also deeply committed to the arts and to the cultivation of artistic thinking. A speech he gave in 1963 honoring the late poet Robert Frost makes the point that our drive for perfection must be tempered by our human instincts and potential:

I look forward to a great future for America, a future in which our country will match its military strength with our moral restraint, its wealth with our wisdom, its power with our purpose . . . I look forward to an America which will reward achievement in the arts as we reward achievement in business or statecraft. I look forward to an America which will steadily raise the standards of artistic accomplishment and which will steadily enlarge cultural opportunities for all of our citizens. And I look forward to an America which commands respect throughout the world not only for its strength but for its civilization as well. And I look forward to a world which will be safe not only for democracy and diversity but also for personal distinction.13

Kennedy’s speech sought to situate the arts as fundamental rather than ornamental. Democratic society cannot exist without human creativity, because creativity drives innovation, questions norms, functions as dissent against status quo thinking, and allows humankind to achieve fuller potential.

This is a perspective echoed recently by U.S. Poet Laureate Tracy K. Smith, who is “convinced that one of the only defenses against the degradations of our market-driven culture is to cleave to language that fosters humility, awareness of complexity, commitment to the lives of others and a resistance to the overly easy and the patently false. Poetry is one vehicle for this humanizing, reanimating version of language.”14 Where Kennedy argued for poetry’s importance in the context of war, Smith does so in the context of an all-consuming neoliberal society, where market rationality and innovation threaten to subjugate the role of humans under a new techno-rationality. Both recognize and forcefully note the importance of human creativity as a corrective to potentially degrading forces of mechanization, market fundamentalism, and perfect thinking.

Kennedy straddled the line between perfection and humanism, and he both managed a government committed to the former and inspired movement toward the latter. As he stated, “I see little of more importance to the future of our country and our civilization than full recognition of the place of the artist.”15 This phrase, and this speech generally, led directly to the creation of the National Endowment for the Arts (NEA), whose mission is to give “Americans the opportunity to participate in the arts, exercise their imaginations, and develop their creative capacities.”16 The NEA represents an idea embedded in U.S. democracy, which is that creativity has a fundamental role in nurturing the very enterprise of self-governance.

Shortly after Kennedy’s death, a reaction to status quo thinking of all kinds emerged. Rock music broke free from constrictive norms of early pop music, leading to the Summer of Love and the epoch-defining (and LSD-drenched) Woodstock festival. The antiwar movement emerged as a loose coalition of students, artists, and activists that shaped public perceptions of the Vietnam War more powerfully than the government did. New versions of social movements geared toward rights for women, African Americans, and gay Americans became cultural forces of backlash against repressive laws and social norms. Fashion changed and a new youth culture emerged, and stalwarts from earlier periods like James Brown and the Beatles adapted and drove powerful trends in art and culture geared toward celebrating individuality and attaining peace.

The twentieth century also saw an increased ability to copy and reproduce art on a massive scale, further aligning popular culture with mass capitalism. The spread of popular culture allowed for artistic materials to circulate widely, promising new connections between people and creative works. As science and technology advanced rapidly, so did the mass production and dissemination of what Theodor Adorno derisively called “the culture industry,” or the marriage of capitalism and entertainment on a scale previously unimaginable.17 The field of Cultural Studies emerged to better understand the ways in which culture and everyday life help us make sense of the world and how to exist within it.18

Crucially, mass culture comprised a vast network of information based largely on arts and entertainment. As new media technologies emerged, film, radio, and television became commonplace interfaces with the world, and the effects were far-reaching. In the twentieth century, it became far easier to encounter a single subject from multiple perspectives. Take, for instance, the psychological condition of anxiety, a defining feature of World War II and the postwar era, which Louis Menand described as “the key to the times,” animating much of the culture and social scientific scholarship of the period.19 On the one hand, there is the rational, scientific definition of anxiety, taken from the American Psychological Association: “an emotion characterized by feelings of tension, worried thoughts and physical changes like increased blood pressure.”20 This definition aligns with the objectivist, “perfect thinking” pole. It is clear, descriptive, and tested by science, and it reflects and shapes medical understanding of the condition.

On the other hand, we know that people are not only influenced by rational, medical definitions of disease, and in the twentieth century there were countless artistic texts that explored the subject of anxiety. For instance, W. H. Auden published the Pulitzer Prize-winning book-length poem The Age of Anxiety in 1947, chronicling life in the period.21 It was adapted shortly thereafter as a symphony by Leonard Bernstein and a ballet by Jerome Robbins, reaching large numbers of people. About two decades later, John Lennon released the song “Crippled Inside,” in which he sang, “You can wear a mask and paint your face,” but “one thing you can’t hide is when you’re crippled inside.”22 The song appeared on the 1971 Imagine album (with the iconic title track itself a reaction to the chaos of world events), which has sold over two million copies and continues to be celebrated as a key recording of the era. Edvard Munch’s painting The Scream is another case study in how the arts have examined anxiety. Munch painted it in 1893; when its copyright expired in the late twentieth century, it quickly became one of the most used, parodied, and circulated pieces of art, even inspiring a series of teen horror films. The painting, which one reviewer suggested “defined how we see our own age—wracked with anxiety and uncertainty,” has become a deeply ingrained element of the culture.23

We can see in the example of anxiety how the poles of objectivity and subjectivity together provide fuller understanding than either alone. These are all unique and valuable interpretations of the condition. The arts provide insight drawn from subjective human experience, while the American Psychological Association provides a rigorous, objective account. Together, they create a full picture. Our collective history of elevating perfect thinking, though, gives greater weight to the medical definition because it passes the tests and rigors of objectivity. This makes sense, as we want medical professionals to use the best research available, but perhaps other explorations of the condition should not be dismissed as mere art. They have endured in the public mind since their creation, and they help people understand, identify, and cope with the world. This case study illustrates the ways in which humanity collectively benefits from both the objective and subjective interpretations of the condition. Meaning is made in what game developer Nicky Case describes as the “symbiosis” of competing ideas “not despite, but because of, their differences.”24 It is a matter of recognizing that this is the case as we move toward a new era of perfection.

4. Singularity: In Which Technology Supplants Humanity (And Humanity Fights Back)

“Humans,” jazz great Wynton Marsalis observed in his commencement address to Juilliard’s 2018 graduating class, “are the greatest technology you will ever encounter.”25 Perfect thinking and the Singularity are ideas—“big ideas,” as Nicholas Negroponte might term them26—with antecedents throughout history. Big ideas can only be countered, revised, or sharpened by other ideas. In the past, the drive toward perfection has been tempered by countervailing forces reminding us that humanity needs a role in the creation, production, and dissemination of knowledge alongside technology. What is needed now, then, is a focus on fostering and developing a culture of education and big ideas that is both responsive and proactive regarding the coming Singularity, so that humans can thrive within and alongside it.

Public education, in the United States anyway, has shifted toward an emphasis on STEM, with a concurrent deemphasis on the arts and humanities as viable or important fields. Scholars and commentators are aware of this shift and have sought to address it in a variety of ways, noting especially that STEM alone will not adequately prepare future professionals and citizens, and that arts and humanities education is needed to ensure well-rounded students. One interesting approach comes from the Alan Alda Center for Communicating Science at Stony Brook University, created in part by the actor himself to employ pedagogical techniques associated with communication studies, dramatic arts, and improvisation as a means to improve the practices of medical professionals and others in STEM fields. With “training methodologies . . . inspired by empathy, clarity, and vivid storytelling,” Alda’s center fills a gap in such training within professional schools.27

Alda’s program can be considered part of a larger movement in academia towards STEAM, which adds Arts to the STEM acronym. This is a reasonable and potentially fruitful approach, but the danger is that it still subordinates the arts to the god of STEM. That is, the suggestion is to learn the arts because they will help people with “real” degrees, which have more tangible rewards in the new economy. STEAM is a good start, but it may not have the strength to fortify humanism in the face of the Singularity.

A better approach would be to reimagine public education entirely. The Montessori model might be a good basis for this endeavor, as it operates from the assumption that creative thinking precedes a desire to learn. The American Montessori Society describes it as a model that begins with observing a child’s curiosity: “The teacher, child, and environment create a learning triangle. The classroom is prepared by the teacher to encourage independence, freedom within limits, and a sense of order. The child, through individual choice, makes use of what the environment offers to develop himself, interacting with the teacher when support and/or guidance is needed.”28 This is a method of education that allows students to develop into unique individuals. Rather than shaping humans into the contours of techno-rational society, it empowers them to become leaders and thinkers who can flourish across multiple contexts.

Mark Rothko, the renowned artist who also taught elementary school students for over twenty years, had similar views on education. He believed that academic training could suppress rather than promote creativity. As one account of his teaching puts it, “As Rothko saw it, a child’s expressiveness is fragile. When art teachers assign projects with strict parameters or emphasize technical perfection, this natural creativity can quickly turn to conformity.” He stated, “The fact that one usually begins with drawing is already academic. We start with color.”29 Rothko would organize his classroom before children arrived, then simply let them drift and play with whatever they found interesting. Next, he would guide and train them according to those interests. This is similar to the Montessori model in its valorization of individual creativity, but geared toward art training specifically. It offers an interesting and necessary contrast to the current model of public education and seeks to develop individuals as individuals rather than components in an already defined system. It is a difference of goals—to control students or to unleash their potential? The former feels safer, but the latter feels necessary.

Of course, large-scale changes involve enormous challenges: A culture of education does not shift overnight. The Singularity, though, is also a big idea in the mode of Plato’s drive for Truth, the Enlightenment’s push for objective science, and the techno-rationality of the twentieth century. It is an epoch-defining shift in how humans understand themselves, calling for a concurrent shift in understanding our role in that world so that we can flourish and benefit from the Singularity rather than be effectively replaced by it.

One final point concerns funding: how to pay for education programs, large and small, that seek to unleash humanity’s creative potential in the age of the Singularity? The good news, in the United States at least, is that the money is there. Though the economy is uneven, it continues to grow. This unevenness, though, is becoming a target not only for the traditional progressive left, but also now for the techno-left, with concerns coming from leaders in Silicon Valley about automation and loss of jobs. Andrew Yang, one such tech leader, is running for president in 2020 on a Universal Basic Income platform. As AI and automation increase, greater profits in business will coincide with less need for humans to perform basic tasks. The theory goes: Why not take some of this profit and use it to offset the loss of income most people will feel?

Perhaps, then, Universal Basic Income could be amended (or increased) so that a percentage of the money goes to arts education. Yang proposes $1,000 a month for every citizen under the age of 65, which would function as a sort of twenty-first-century social safety net. Imagine if that number were $1,300, with $300 per month allocated for creative pursuits for children. What would that society look like? For one, it would create some of the same benefits as a widespread change to the education system, with less disruption of the status quo. Further, it would reinvigorate the arts-as-industry with massive influxes of cash. Supply stores, exhibition spaces, instrument makers, music teachers, printing presses, libraries and other public spaces—these are just some areas that would benefit immensely from a new economy that guarantees support for the arts. The ripple effects would be massive and would help to nurture art and creativity alongside the Singularity, so that a combination of technological perfection and human creativity together point to a brighter future.
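As a back-of-envelope check on this thought experiment, the sketch below uses only the figures named above (the $1,000 baseline and the hypothetical $300 arts supplement); population totals and overall program cost are deliberately left out, since the essay does not specify them.

```python
# Back-of-envelope arithmetic using only the figures in the thought experiment above.
baseline_monthly = 1000  # proposed Universal Basic Income per person, per month
arts_monthly = 300       # hypothetical supplement earmarked for children's creative pursuits

combined_monthly = baseline_monthly + arts_monthly
print(f"Combined monthly payment: ${combined_monthly:,}")         # $1,300
print(f"Annual baseline per person: ${baseline_monthly * 12:,}")  # $12,000
print(f"Annual arts funding per child: ${arts_monthly * 12:,}")   # $3,600
```

Even at that modest per-child figure, the sums flowing toward creative pursuits each year would become substantial once multiplied across a population, which is the source of the ripple effects described above.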

Whereas the technology of perfection is potentially sterile, unnatural, conformist, and machinelike in its precision and efficiency, humanity offers different strengths and weaknesses. We are colorful, messy, exuberant, vital, and emotional. We are at once predictable and unpredictable, conformist and nonconformist; we “contain multitudes,” as poet Walt Whitman famously wrote. As history makes clear, it is the various combinations of perfection and imperfection that move society forward and give us a fuller understanding of ourselves. Though we have at different moments in history been lured perhaps too easily into the tranquility of perfection, in each instance correctives have emerged to balance the human and the inhuman. We are currently in an emergent phase—Singularity—which is another version of perfect thinking and another mark on an intellectual timeline as old as Western thought. A world of Singularity absent the role of human creativity would be a nightmare, even if it originates from a dream of total perfection. We must be vigilant, then, using the past as a guide, in imagining and ensuring productive roles for humanity in the future.


