Warning: Flammable contents


One of the possible benefits of growing older is the ability to see what might not work. A lifetime of experience can also hone our sense of possible danger. Today I take the risk of offering a warning about the (pressurized, flammable or explosive) expansion of society’s fascination with artificial intelligence. Forgive me for temporarily muting the significance of theological truth in this matter, and for ignoring the fact that AI has benefited society in some ways. And if I’m wrong about any of this, excuse my ignorance….


The crafting of artificial intelligence seems to rest on some key assumptions: that when combing through unimaginable mounds of data, the pattern-finding skills of algorithm-consuming supercomputers will yield unassailable facts with trustworthy predictive value, and that GIGO (“garbage in, garbage out”) can be muted because better algorithms are better sorters.

At its core, AI is an artifice* pretending to be an artifact. Some artifice-related problems may derive from these hypotheses: As the size of a database increases, the number and complexity of its possible patterns also increase, making pattern-discovery even more questionable. (The human mind—the ultimate driver of artificial intelligence—has a strong capacity to invent patterns where they do not exist.) Thoroughly infused with the self-delusion of cognitive biases, our brains—and our machines—might unduly trust what may be unreliable.

Another set of philosophical (theological?) questions is rimmed by practicality: How will AI handle the likelihood that randomness will remain a pervasive component of reality? What about the powerful offspring of unpredictability: mystery, hope, wonder, surprise, awe or transcendence? Will AI’s supposed omniscience decrease the presence of these helpful approaches to God’s presence?

We’ve been here before. The Internet, social media and self-teaching algorithms were envisioned as improvements for our lives. Instead, in too many of their darker corners—e.g., teenage depression, consumerism, device addiction, misinformation—these massive technologies have continued to overwhelm our attention spans and moral compasses.

Neural networks—essentially computers theoretically able to think for themselves—have compounded the problem. Here, too, practical difficulties arise. Because decision-making relies on the brain’s memory and affective structures, its primary steps are highly emotive. What this means: Whatever their information-capturing capacities or algorithmic skills, AI entrepreneurs may be caught on the horns of a dilemma. If they presume rationality to be the primary factor in intelligent decision-making, they bypass its underlying emotive constructs. And if they try to emulate the sometimes-chaotic elements of human emotion, they could fail repeatedly at that hyper-complex task.

One of my current worries: The mad rush by the titans of technology to develop self-generative AI capabilities without thoroughly vetting their social ecology. It seems clear that these enterprising entrepreneurs want to control vast swaths of the economy by capturing the entire landscape of our attention. Paid advertisements offer wealth to these companies, but they also continue to distract us from what’s important, tempting us into increased materialism—one of the factors contributing to the degradation of the Earth. When monetized, AI hardly seems ennobling for any of us, individually or collectively.

Perhaps the most impractical and illogical element of AI rests on human reactions. When our technologies double as our toys, we continue to feed these platforms with personal information and attention, even as that response demonstrably diminishes our well-being.

What might be the solution(s) to the quickening rush of AI into our lives? Part of me wants to jump to the side and let these machines charge on by. In that mindset, I won’t accept AI’s cookies, download its apps, or submit to its invitations. This reaction might create another problem, though: The learning algorithms that make AI “intelligent” will then depend on others for their content, their approach to reality, their knowledge-and-decision bases. On others’ values and priorities. (It’s possible that some demagogues have already figured this out!)

Right now, I’m not all that sure what to think or do. The technology giants are warring with each other, dropping onto society their versions of intelligence, and moving quickly to do so. It feels impossible to keep up. This ocean—these bots—seems bigger than my small boat.

One thing I believe fiercely, though: The trajectory of AI will eventually run its course, and the futility of humankind’s self-idolatry will become apparent to more than older adults who want to be practical.

There’s still hope in this boat!


*An artifact is tangible evidence of a past or present reality. On the other hand, when we invent and construct artifices, we can only hope they’ll eventually become real. (Pinocchio comes to mind.) Historically, the towering idols of artifice are thus built on feet of clay. (See Daniel 2.)


About the author

Bob Sitze

BOB SITZE has filled the many years of his lifework in diverse settings around the United States. His calling has included careers as a teacher/principal, church musician, writer/author, denominational executive staff member and meat worker. Bob lives in Wheaton, IL.

