It’s Not What You Think: Exposing the Limits of Technology


I remember when Logos’s “philosophy of technology” fit on a T-shirt. It was this quote attributed to Johannes Gutenberg:

Religious truth is imprisoned in a small number of manuscript books, which confine instead of spread the public treasure. Let us break the seal which seals up holy things and give wings to Truth in order that she may win every soul that comes into the world by her word no longer written at great expense by hands easily palsied, but multiplied like the wind by an untiring machine.

Such optimism! Yet Gutenberg’s invention gives “wings” not only to Truth but to every printed word, from poetry to pornography, from calls to prayer to instructions for assembling landmines.

There’s the problem. Technology enables, but by its nature and design, doesn’t discriminate. That’s up to us. Technology is both powerful and dangerous. It extends, augments, and even replaces human effort—for better or worse. In fact, technology tends to create new problems even while it solves old ones.

It’s pervasive, too. Any time we use knowledge about the world to solve a practical problem, we generate technology, by which I mean any human-crafted system, process, or artifact optimized to produce a set of identifiable outcomes.

  • A sledgehammer is an artifact optimized to convert muscle energy into forceful impacts.
  • Wool-dyeing is a process optimized to produce colored yarn and cloth.
  • The internet is a system of interconnected artifacts and processes optimized to “look at pictures of cats and get in arguments with strangers.”

Technology so permeates our world that humans can no longer thrive on this planet without it. There may be some place left where people can still live in a near-Edenic state of nature (pre-fig leaves), but even the North Sentinel Islanders, cut off from contact with the rest of the world, have clothing and weapons.

So much of our daily life is mediated by technology that it can become almost invisible. Technology is the water, and we are all fish. Because tech is so commonplace, it’s easy to take it for granted, to become blind to its consequences.

Yet as a tech worker for nearly three decades, I wrestle with this question daily: How do our technology choices affect the world, for good or ill? And how do we know?

To answer that, we have to define the problem. When dreaming up new software, as the designer I start with many possibilities and add constraints until a workable solution emerges. Likewise, by exploring the limits of technology—what it’s not, what it can’t do—I believe we can make it less invisible.


Not nature

There’s an old joke: A group of scientists challenges God to a duel, claiming they too can create life in a lab. God accepts. A scientist reaches down to gather a handful of clay, whereupon God interjects, saying, “Hold on, make your own dirt!”

Whether you think evolution or intelligent design shaped the cellular motor, photosynthesis, or protein replication, you can’t deny their resemblance to working systems, processes, and artifacts. Rather than being the opposite of technology, the natural world is itself a kind of technology, which the Creator optimizes for his own pleasure.

The saying that “the Bible starts with a garden and ends with a city” is true. But importantly, it is God who builds both Eden and the New Jerusalem, not us. Nature (or better, Creation) is the unmediated technē (Greek, “work, art, skill”) of God’s hands, and what we call “technology” is the technē mediated by ours. All human technē consumes or converts some of God’s—we start with his dirt.

We also know that God is jealous for his creation and watchful of how his stewards discharge their duties. We should therefore avoid reading unlimited license into the so-called cultural mandate (Gen 1:28; 9:1–2; cf. Ps 8; Jer 29:6). Scripture hardly calls us to rapaciousness, but to something more managerial.

Not magic

Technology and magic have always been twins. C. S. Lewis wrote that “[f]or magic and applied science alike the problem is how to subdue reality to the wishes of men”; both may require us “to do things hitherto regarded as disgusting and impious—such as digging up and mutilating the dead.” Similar ends, different means.

The key difference lies in transparency and in how we evaluate results. Magic consists of opaque forces harnessed by the (usually elite) practitioner, whereas technology is meant to be understood, replicated, used, and built upon by anyone willing to learn how it works. Thus, when magic fails, we fault the people involved: an imperfect incantation or insufficient faith. When technology fails, we fault the design and call for retooling.

Yet it’s easy to treat technology—especially tech we don’t understand—like magic (or in Christian terms, miracle or mystery). As science fiction author Arthur C. Clarke famously wrote, “Any sufficiently advanced technology is indistinguishable from magic.”

A danger, then, lies in confusing technology for magic, as though technology is a source of power beyond the observable world rather than within it. If I pretend my car runs on an inexhaustible supply of pixie dust (and it may as well, for as much as I understand it), that doesn’t change the fact that internal combustion engines run on tangible natural resources, the consumption of which creates economic, environmental, and political effects and side effects.

Where technological requirements conflict with ethical principles, magical thinking can tempt us into what Lewis calls the “magician’s bargain: give up our soul, get power in return.” Jesus, of course, would have us ask: “[W]hat does it profit a man to gain the whole world and forfeit his soul?” (Mark 8:36 ESV).

We must, therefore, keep a sober mind about what is part of the mystery and what is not. This becomes increasingly critical as we move into the age of artificial intelligence. I may anthropomorphize a chatbot, but that misperception cannot magically implant a moral conscience for it to draw upon.

Not morally neutral

It’s tempting to evaluate any given technology in the abstract. For example, a patent application may seem like a pocket universe confining tech to a page of diagrams. But once someone decides to design, develop, or deploy that technology, containment is breached.

Technologies usually maximize outcomes that ignore moral considerations altogether. Indeed, technology by its nature encourages us to think this way, because we measure a technology’s outcomes according to the metric we intend it to maximize. Bullets, for example, maximize for “penetration,” “stopping power,” and other euphemisms that measure how well they cause damage. Sports betting, by definition, is tech that leverages applied statistics and psychology to maximize cash flows to the House. If you lose, a little or a lot, that isn’t a sign of a broken process but of the tech working exactly as designed. The moral considerations such technologies raise are wholly beside the point, because developers optimize the technology according to other criteria.


It’s not that technology is necessarily immoral, but morality is rarely a metric that tech builders optimize for. Its success or failure does not rest on ethical considerations, but on technical ones.

This abstraction affords builders some moral distance and shifts culpability to those who employ such technologies. But those decisions don’t exist in a vacuum. Technology is not morally neutral, but morally complex, and navigating that complexity is our moral duty.


Not just

The Bible, especially the Old Testament, often views technology with skepticism. This is understandable, since the patriarchs were a pastoral, semi-nomadic people surrounded by vast empires with superior technology.

The Egyptians found Israelite pastoralism “detestable” (Gen 46:31–34 LEB), an early example of the techno-elite looking down on their supposed inferiors. Later, the Egyptians forced the Israelites to build “storage cities” (Exod 1:11 NASB)—an ironic twist, given that Joseph’s prudent management of that same technology had earlier averted famine (Gen 41:29–36, 47–49).

In Judges, we’re told the Israelites couldn’t drive out some of the people of Canaan because “they had chariots of iron” (Judg 1:19 LEB). In 1 Samuel 13:19–23, the Philistines hoard the technology for repairing iron tools so the Israelites could not “make swords or spears for themselves” to resist them. Even after adopting the monarchical “technology” of the surrounding nations (1 Sam 8), Israel remained outclassed on every front.

The Assyrians, Egyptians, Babylonians, Greeks, and Romans all leveraged superior technology to conquer their neighbors, and the great colonial powers of Europe and Asia followed suit. On a smaller scale, governments use tech to torture prisoners or surveil citizens, and (some) companies leverage tech to exploit workers.

It’s not that technology is always unjust, but that justice is beside the point—and thus, our responsibility. Christians should always be on guard lest our adoption and use of technology treat our neighbors as means to unjust ends.

Not God

The Old Testament often frames reliance on technology as non-reliance on God. If an idol is a bit of cultic technology for invoking the favor of a deity, then tech can be a way to replace relationship with God with the “work of human hands” (see Ps 115:1–8 LEB; see also Ps 135:15; Isa 2:8; 31:7; 44:17).

The story of the golden calf explicitly pits the technology of a graven image against the technology of the graven tablets. Both are means of worshipping Yahweh, but the technē of God’s finger is the legitimate instrument (Exod 31:18), not one the people produced by taking matters into their own hands (Exod 32:1–35).

The Tower of Babel was an attempt by people to bypass God with mortar and bricks (Gen 11:1–9). Here I see parallels in the race to build god-like artificial super-intelligences meant to solve our social and technological problems. OpenAI founder Sam Altman, for example, claims that AI will “fix the climate” and “discover all physics,” using its “nearly limitless intelligence.” This seems far-fetched, but the fact that anyone is trying is unnerving!

Generally, if over-reliance on technology isn’t leaning “on your own understanding” (Prov 3:5 ESV), I don’t know what is.

Not human replacement

Building a computer as omniscient as God is impossible, but it’s not hard to imagine one smarter than I am. Any basic calculator app is better at addition and subtraction, never mind square roots. Wikipedia has already “forgotten” more in editorial deletions than I’ll ever know.

Each of those outperforms me in a single area. Now what if a machine could outpace me at every mental task? That’s no longer a tool, but a replacement.

Replacing manual labor involves economic trade-offs, yet almost nobody is worried about humans losing the ability to shovel dirt or haul rocks. But what if we get worse at reasoning? Or better at it, but only with mechanical crutches (thinking of my smartphone)?

Human aptitude for technology is obviously a strength. Dependence, however, is a weakness. Even as technology giveth a new set of capabilities—starting a fire, building a wall, driving on the interstate—it taketh away the corresponding personal abilities. I’m certain I wouldn’t thrive in the 1890s without my modern conveniences. I’m not even sure I’d enjoy re-living the 1990s all that much. I’d miss that smartphone.

Ever since Socrates complained that the invention of writing eroded students’ ability to memorize and recite, every generation has accused the youth of relying too much on technology. And yet losing the ability to think effectively isn’t at all the same thing as, say, no longer teaching cursive penmanship in schools—one you can easily do without; the other, not so much.


Practically speaking, we at Logos are being judicious in our adoption of artificial “intelligence” technology, carefully considering how these (ultimately mechanical) processes can mismatch, overfit, or introduce error and bias into their responses. We are also careful to clearly label AI-generated content so it can be differentiated from old-fashioned brains-and-fingers content.

We are an information technology company, and we’ve experienced some exciting developments in our sector, to be sure. But our mission is to empower our users, not replace anyone’s own learning or creative process.

Not inevitable

Outside of a few survival needs, no specific technology has to exist. It only seems so because engineering is cumulative. Every invention that exists enables more to exist—if not outright requiring further development, at least dangling the possibility.

Automobiles require the wheel, gears, electricity, wires, and so on, but also manufacturing technology, supply chains, and fuel. The global petroleum industry has spawned several sub-industries, including plastics, to optimize the byproducts. We need mechanics to repair and maintain our cars, and construction crews to build and repair the roadways—and laws to govern them, too. Electric vehicles require yet another energy infrastructure and improved battery tech. Now we’re working on making cars drive themselves. What’s next?

As the gyre turns, it widens.

This isn’t a bad thing. This momentum drives prosperity as we know it. But our mass-consumption society is intrinsically biased toward more. Anyone who argues otherwise must swim upstream, arguing by proxy against the wonders the new tech promises to bring. You’re not against [insert benefit to humanity here], are you? This forces the debate into utilitarian terms: now one must demonstrate that potential downsides outweigh upsides, while also proving that no competitor will build the technology first and gain an advantage.

For years, I’ve admired the Shaker design philosophy, usually quoted as, “Don’t make something unless it is both necessary and useful; but if it is, don’t hesitate to make it beautiful.” I apply this principle using a series of questions:

  1. Does it have a useful purpose?
  2. Does it fulfill that purpose?
  3. Can it be beautiful?

I ask each in turn, and if the answer is “No,” I stop what I’m doing and try again. The method is not foolproof, but pausing to consider whether we are being fruitful or just making noise certainly helps.

So what?

For years, the Logos company motto was “advanced technology for eternal truth,” which neatly captures the tension inherent in our business; namely, that we are constantly adapting the latest technologies for materials optimized for prior ages—sometimes very prior. Our mission is both explicitly religious and technological.

What happens when those things come into conflict? Do we compromise our mission and pursue technology for its own sake, or do we forego technology that conflicts with our mission, even if it would enhance the business or make our lives easier?

I sometimes say that if you ask a software engineer to solve a problem, don’t be surprised if the solution is more software. Tech companies like Logos build tech. But I think we build it best when we temper our enthusiasm. Our innovation is not driven by pursuing tech at all costs or for its own sake. Rather, we routinely make technological choices that make our lives as technologists more difficult because we think it will better serve our mission, our vision, or our customers.

Reverse Interlinears, Factbook, Smart Search—none of these things are a part of nature; neither are they magical; nor a sufficient substitute for God or ourselves; nor morally neutral or inherently just. And they certainly aren’t inevitable, but the product of deliberate, thoughtful choices as we struggle to build the best “advanced technology for eternal truth” that we possibly can.

Advance your thinking on advanced tech with these recommendations from Eli Evans

Ellul, Jacques. The Technological Society, tr. John Wilkinson. New York, NY: Vintage Books, 1964.

Ellul, Jacques. “Technique and the Opening Chapters of Genesis.” In Theology and Technology: Essays in Christian Analysis and Exegesis, edited by Carl Mitcham and Jim Grote. Lanham, MD: University Press of America, 1984.

Galladertz, Richard R. “Chapter 3: Field Responsibility.” In A Christian Field Guide to Technology for Engineers and Designers. Downers Grove, IL: IVP Academic, 2022.

Gay, Craig M. Modern Technology and the Human Future: A Christian Appraisal. Downers Grove, IL: IVP Academic, 2018.

Kaplan, David M. Readings in the Philosophy of Technology. Rowman & Littlefield Publishers, 2009.

Kudina, Olya. Moral Hermeneutics and Technology: Making Moral Sense through Human-Technology-World Relations. Lanham, MD: Lexington Books, 2023.

Prior, Matthew. Confronting Technology: The Theology of Jacques Ellul. Eugene, OR: Pickwick Publications, 2020.

Reijers, Wessel, Alberto Romele, and Mark Coeckelbergh. Interpreting Technology: Ricoeur on Questions Concerning Ethics and Philosophy of Technology. Rowman & Littlefield Publishers, 2021.

Reinke, Tony. God, Technology, and the Christian Life. Wheaton, IL: Crossway, 2022.

Shrader-Frechette, Kristin, Laura Westra, Danny M. Cohen, Richard DeGeorge, et al. Technology and Values. Rowman & Littlefield Publishers, 1997.
