Matthew Crawford spoke at the CCA on Tuesday about the negative effects of traffic laws and artificial intelligence technology. Joseph Harvey | Courtesy

An increasing presence of technology in the everyday lives of Americans could contribute to the loss of “prudence, judgement, and sociality,” writer Matthew Crawford said Tuesday during a speech for Hillsdale College’s Center for Constructive Alternatives.

The first CCA of this semester addressed artificial intelligence.

Crawford, a writer and research fellow at the Institute for Advanced Studies in Culture at the University of Virginia and a contributing editor to The New Atlantis, spoke about the potential consequences of new technology for how we function in society. His upcoming book, “Why We Drive: Toward a Philosophy of the Open Road,” will address the impact traffic laws have on individuals.

Artificial intelligence and other technological developments, Crawford said, may hamper individual choice as society delegates more responsibility to robots. He said an emphasis on human agency, as opposed to more reliance on technology in policy, would help citizens live according to prudence, judgement, and sociality.

“When I say ‘open road,’ I mean to invoke the space of human agency that is opened when people are left to their own devices,” he said. “What if we used our blessed eyeballs to determine whether to turn left at an intersection?”

Using everyday city traffic as an example, Crawford argued that as technology eliminates the small tasks and decisions of everyday life, individuals will lose their initiative and personal liberty.

An increasingly technological society, according to him, would be detrimental to the practice of prudence, which comes only from experience “and is cultivated only when we are free to err.”

Crawford showed a brief video of a busy intersection in Addis Ababa, Ethiopia, which functioned well even without stoplights or designated lanes. The implementation of artificial intelligence in intersections like these, he said, was unnecessary, given how well individuals cooperate on their own.

The cooperation of drivers in underdeveloped countries, Crawford said, is one example showing that “self-government as a principle will help us confront the world of artificial intelligence.”

“Disorder is bad,” Crawford said. “The project for rational control rests on a very thin conception of what reason consists of and too narrow a view of where it is located in society. The Addis Ababa intersection is the picture of rational control.”

Crawford argued that delegating more small decisions, such as when to stop and go at a large intersection, to tech companies and municipal bureaucracies would cost citizens their political liberty as well, making them more like subjects than citizens.

“From the perspective of a central power, what is wanted is an idealized subject of a different sort, an asocial one, an atomized account of a human being,” Crawford said.

The simple forms of cooperation and communication one observes in the seemingly chaotic streets of Addis Ababa represent a system that balances rational government with individual freedom, according to Crawford.

Assistant Professor of Theology Jordan Wales said there may be better examples than the Addis Ababa intersection to support Crawford’s general argument, and that Crawford’s observations may be more relevant to issues such as parenting than to municipal traffic.

Senior Dylan Strehle said Crawford made some good points, but that his example of the Ethiopian intersection was not well received.

“Crawford was incredibly articulate and presented a compelling case,” Strehle said, adding that Crawford could have done better addressing students’ questions.

Wales, who will be on the faculty roundtable Thursday night, said he will share his thoughts on how Crawford’s concerns about the loss of individual initiative could relate to parenting.

“A system that enforces obedience without reflection is a system that raises children without initiative, thoughtfulness, or creativity,” he said. “In driving, we see this in the use of a GPS.”