Students and staff can start to see how digital technologies work – those that underpin the Web and elsewhere. They can think about how these technologies shape the formation of their understanding of the world – how knowledge is formed and shared; how identity is formed and expressed. They can engage with that original purpose of the Web – sharing information and collaborating on knowledge-building endeavors – by doing meaningful work online, in public, with other scholars. And they can have a space of their own online, along with the support and the tools to think about what that can look like.
It doesn’t have to be a blog. It doesn’t have to be a series of essays presented in reverse chronological order. You don’t have to have comments. You don’t have to have analytics. You can delete things after a while. You can always make edits to what you’ve written. You can use a subdomain. (I do create a new subdomain for each project I’m working on. And while it’s discoverable – ostensibly – this work is not always linked or showcased from the “home page” of my website.) You can license things how you like. You can make some things password-protected. You can still post things elsewhere on the Internet – long rants on Facebook, photos on Instagram, mixes on Soundcloud, and so on. But you can publish stuff on your own site first, and then syndicate it to these other for-profit, ad-based venues. […]
That’s your domain. You cultivate ideas there – quite carefully, no doubt, because others might pop by for a think. But also because it’s your space for a think.
I’m enjoying working through a Coursera MOOC from the University of Virginia, The Modern World, Part One: Global History from 1760 to 1910. Recommended if you’re interested in learning more about how the modern world came to be. As you might expect, lots of interesting parallels between fascist dictatorships old and new.
Tara Isabella Burton writing for The Atlantic:
Even in the United Kingdom, where secular bachelor’s programs in theology are more common, prominent New Atheists like Richard Dawkins have questioned their validity in the university sphere. In a 2007 letter to the editor of The Independent, Dawkins argues for the abolishment of theology in academia, insisting that “a positive case now needs to be made that [theology] has any real content at all, or that it has any place whatsoever in today’s university culture.”
Such a shift, of course, is relatively recent in the history of secondary education. Several of the great Medieval universities, among them Oxford, Bologna, and Paris, developed in large part as training grounds for men of the Church. Theology, far from being anathema to the academic life, was indeed its central purpose: It was the “Queen of the Sciences,” the field of inquiry which gave meaning to all others. So, too, several of the great American universities. Harvard, Yale, and Princeton alike were founded with the express purpose of teaching theology—one early anonymous account of Harvard’s founding speaks of John Harvard’s “dreading to leave an illiterate Ministry to the Churches” and his dream of creating an institution to train future clergymen to “read the original of the Old and New Testament into the Latin tongue, and resolve them logically.”
Universities like Harvard, Yale, and Princeton no longer exist, in part or in whole, to train future clergymen. Their purpose now is far broader. But the dwindling role of theology among the liberal arts is a paradigmatic example of dispensing with the baby along with the bathwater.
Richard Dawkins would do well to look at the skills imparted by the Theology department of his own alma mater, Oxford (also my own). The BA I did at Oxford was a completely secular program, attracting students from all over the religious spectrum. My classmates included a would-be priest who ended up an atheist, as well as a militant atheist now considering the priesthood. During my time there, I investigated Ancient Near Eastern building patterns to theorize about the age of a settlement; compared passages of the gospels (in the original Greek) to analogous passages in the Jewish wisdom literature of the 1st century BC; examined the structure of a 14th-century Byzantine liturgy; and read The Brothers Karamazov as part of a unit on Christian existentialism. As Oxford’s Dr. William Wood, a University Lecturer in Philosophical Theology and my former tutor, puts it: “theology is the closest thing we have at the moment to the kind of general study of all aspects of human culture that was once very common, but is now quite rare.” A good theologian, he says, “has to be a historian, a philosopher, a linguist, a skillful interpreter of texts both ancient and modern, and probably many other things besides.” In many ways, a course in theology is an ideal synthesis of all other liberal arts: no longer, perhaps, “Queen of the Sciences,” but at least, as Wood terms it, “Queen of the Humanities.”
Yet, for me, the value of theology lies not merely in the breadth of skills it taught, but in the opportunity it presented to explore a given historical mindset in greater depth. I learned to read the Bible in both Greek and Hebrew, to analyze the minutiae of language that allows us to distinguish “person” from “nature,” “substance” from “essence.” I read “orthodox” and “heretical” accounts alike of the nature of the Godhead, and learned about the convoluted and often arbitrary historical processes that delineated the two.
Emphasis mine, as I’m currently thinking and reading about generalism and breadth vs. depth of knowledge. Working in a university means these thoughts occur rather often.