One of the defining features of humankind since the mid-19th century is that we learned to do things at a much bigger scale than before. The obvious case is technology: the steamboats of the 19th century were much bigger and more powerful than anything before. We began to build cross-continental railroads, dams, the works. But there are other cases that are just as striking and a bit less obvious, such as bureaucracy (of course, Weber beat me to it by more than 100 years). The modern nation state is able to mobilize human resources in ways that no previous political entity could.
The problem is that, although we as a species learned how to do things at a much grander scale, thanks to impersonal, anonymous mechanisms, we did not simultaneously evolve an equivalent sense of moral responsibility. So, when we discovered total war, we were not prepared for it. We had the technology to kill millions of people, even in the form of an assembly line, but had not sorted out the issues of agency and accountability needed to stop it. That's why, to go back to the tired example, one of the great mass killers of the 20th century could be a punctilious bureaucrat like Eichmann, who would not have been out of place working for a shipping company.
Now it turns out that we also have the technology to cook the planet, drown or displace possibly millions of Bengalis, and starve millions more Sub-Saharan Africans, and we don't have the tools of collective moral responsibility to handle this. No one, of their own accord, would allow this to happen. But as a collective with control over huge resources, we do.
Which raises the question: is it possible that we suffer from some kind of "evolutionary mismatch," where our abilities have vastly outpaced our moral senses? If so, can the human species evolve, in a reasonable timeframe, a moral sense adequate to this era of mass mobilization of resources?