Technology and Higher Ed

There’s an amazing amount of babble in the news about technology and higher ed. Most of it is rather utopian, most of the utopian rhetoric comes from people who have a stake in selling higher ed technology, and almost all of the hype comes from people who haven’t taught a day in their lives. Google has even recently dropped (wasted) a modest amount of money into researching the effectiveness of MOOCs.

I would like to suggest some sound thinking about technology and higher ed. Before I do, though, I’d like to tell you where I’m coming from.

First, I’m not a luddite. I love technology. I purchased an iPad 1 about three months after the initial release of the iPad and used it until purchasing an iPad Air late last year. I’ve been online since the early 90s (AOL 1.5? Maybe 2.0?) and have been creating webpages since the mid to late 1990s. I learned basic HTML back then, but I mostly worked in Dreamweaver. As a graduate student (I started Fall 1999), I worked as Coordinator of Enrollment Services, coordinating web-based projects among the offices of Financial Aid, College Admissions, and the university webmaster. I was responsible for putting the college catalog online and for helping to create online (and secure) policies and procedures manuals for every interested department on campus. One of my last projects was to create an online FERPA tutorial that all faculty and staff were required to take.

Since then, I’ve served as editor for two online journals, one of which was written up by the New York Times, and most recently I’ve started learning TEI and have become an editor for the Digital Mitford Project. I am committed to the advancement of digital humanities, which in my opinion is just an umbrella term for work that we’ve been doing since at least the 1980s. Digital humanities are not a fad: they’re a good thing, and they’re here to stay. I’ve also been teaching online since 2008 — not full time, but maybe one course a semester, maybe two or three over the summer — so I’m familiar with online education as well. I designed my current institution’s fully online Master of Humanities program and contributed several courses to it. Its enrollment has grown since then from about 70 students to probably over 200. And, as you see, I run this blog.

So please don’t try to tell me I’m afraid of technology.

Next, I’ve been a student at a variety of institutions. I have attended community college, a tier-1 research university, and small liberal arts colleges. I am mainly the product of small liberal arts environments with classes that were no bigger than about seventeen or eighteen students.

Finally, I am an educator. I’ve been teaching since 2001, and teaching full time since 2004 (Lecturer in English at Rollins College, 2004–2008, then Assistant/Associate Professor of English at Tiffin University, 2008–present). I’ve taught a wide variety of college student populations, ranging from prep-school students from fairly wealthy families, to adult evening and online learners, to community college students, to largely rural first-generation students.

Now here’s what I’ve learned about technology and higher ed: most thinking about it is misguided. One basic principle is involved here:

1. Students do not need to be educated by technology. Students need to be educated to use technology.

Do you see the difference? Most of our thinking about higher ed and technology involves finding some kind of magic bullet that will “fix” education cheaply. The dream here is that technology will educate students rather than teachers, or that technology will somehow supercharge regular teaching methods and make them more effective. This magical thinking is behind most of the hype about MOOCs.

That just doesn’t work for most students. Sorry. Especially not for the hordes of below-average students being cranked out by our underfunded public education system. Studies seem to demonstrate that online education, especially its most highly self-directed form, the MOOC, works best for students who are already highly educated or, at least, exceptionally high-performing, mature, and self-motivated. I think that MOOCs are great when used as intended: as a free, non-credit means to self-education. They’re like high-powered books. Using MOOCs to educate the average incoming freshman, however, has been and will remain, at least for the foreseeable future, a failure.

I know this because I know how online education works and what students need in order to be educated, and because every experiment so far that has used MOOCs, or even just fully online classes, to educate this population has been a dismal failure. What I’m saying here isn’t some kind of “defeatism.” It’s a refusal to succumb to ignorance of what’s obvious to any teacher who has spent any amount of time teaching these students. Maybe someday the technology will catch up, but I suspect that day is further off than most people think, closer to science fiction than science fact, and students need to be educated today, right now.

Why won’t these solutions work? For students to learn effectively in an online environment, especially an automated one, they need two qualities:

1. They need to be able to read, understand, and follow written instructions.

2. They need to be able to work on their own with little to no external compulsion. They need to be self-motivated.

Most students coming into college just aren’t there yet. If you’ve ever done any kind of text-based teaching, you may be familiar with this scenario: a student reads a line or two on a page and just doesn’t get it. You read the exact same words out loud and they completely understand. You might be tempted to think that video instruction can serve this purpose, and it may be more effective for these students than written instruction, but that’s not quite the same either. It’s too easy to zone out in front of a television or computer screen.

Seriously, many students leaving college still have a hard time with these two skills.

This will probably never happen: instant, Matrix-style learning, with knowledge downloaded straight into the brain. Do you think that’s air you’re breathing? Heck yes it is. My mind has not, in fact, been jacked into a computer system.

And no, online education is not in its “infancy.” AOL started it all: the first online course began running in 1988. That’s 26 years ago. In tech years, that’s a very long time. How much 26-year-old technology are you still using? Still using a 26-year-old cell phone? A 26-year-old car or television? A 26-year-old microwave? Here’s some history:

Between 1990 and 1994, AOL launched services with the National Education Association, the American Federation of Teachers, National Geographic, the Smithsonian Institution, the Library of Congress, Pearson, Scholastic, ASCD, NSBA, NCTE, Discovery Networks, Turner Education Services (CNN Newsroom), National Public Radio, The Princeton Review, Stanley Kaplan, Barron’s, Highlights for Kids, the US Department of Education, and many other education providers. AOL offered the first real-time homework help service (the Teacher Pager, 1990; prior to this, AOL provided homework help bulletin boards), the first service by children, for children (Kids Only Online, 1991), the first online service for parents (the Parents Information Network, 1991), the first online courses (1988), the first omnibus service for teachers (the Teachers’ Information Network, 1990), the first online exhibit (Library of Congress, 1991), the first parental controls, and many other online education firsts.

But before I move on, let’s make some distinctions:

We need to distinguish between “online education” in general and MOOCs. I’m not going to discuss hybrid or blended models that combine online components with face to face instruction: just fully online courses. Hybrid courses may be a very good solution for institutions that need more sections than they have available classrooms, but they have to be designed well. The courses that I teach online are capped at 25 students. These students had to apply to get into the program, so they meet at least minimal academic standards, and my university requires me to engage with the class as a group at least five days per week, to respond to discussion forums within three days, to grade papers within seven days, and to respond to emails within 24 hours. These courses are not self-directed by any means. I am teaching them. I also happen to be teaching courses that I created, so I have a sense of ownership over the material, but that’s often not the case with online classes. Right now, this very summer, I am teaching online sections of Literary Theory, Creative Writing: Poetry, the Comprehensive Exam class, and an independent study on Shakespeare. I designed and am running all of these courses. I’m watching my students’ work, and I am confident that they are learning.

MOOCs, on the other hand, may enroll tens of thousands of students, have no vetting process for those students, have no meaningful, direct interaction between instructor and students (how could they, with that many students?), and are largely self-running, self-paced courses. I’d like to repeat: MOOCs are great for people who want to learn informally about a topic. But because MOOCs can’t build in meaningful assessment, they shouldn’t be offered for college credit.

So not all online education is quite the same. In my experience, face to face instruction is the best, followed by online education with small classes and direct instructor interaction. If the online courses run on their own, great, but not for college credit.

I say this because education is not just about the dissemination of knowledge. It’s not some kind of programming for the human mind, no matter how cool that scene in The Matrix seems to be. The human mind cannot be programmed like a computer. It is never simply a passive recipient of information that gets imprinted upon it and then remains that way, unchanged, once imprinted. Human beings are more than data storage units: we have volition and emotion in addition to intellect, both of which are involved in knowledge acquisition, and most importantly, all knowledge is social. Knowledge exists in a social context, so that even scientific knowledge is value-driven and fueled by emotion (in science, the primary emotion should be curiosity). Because knowledge is inherently social, scholars in all fields publish their findings, write books, and have their work evaluated by others.

So we need not only to be given knowledge but to be socialized into it. We also need to realize that the majority of business interactions are still face to face: just think about the number of direct human interactions the typical office worker has on a daily basis. If our students never learn to manage face to face interactions, they will not have been educated for the workforce. Employers believe this too, at least implicitly: among the skills business owners and leaders say they want most, social skills rank at or near the top. Education trains the mind, the emotions, and the will, and does so in a clearly defined social setting with a work ethic and an interpersonal ethic built in. Online education is not quite the same in this respect, and certainly not, for the most part, in terms of real-time face to face interaction.

Beyond the socialization of knowledge, education also entails skills development: writing, analytical thinking, synthetic thinking, creative thinking, critical thinking, reading, and so on. Skills development has to do not with what students take in but with the quality of the work they put out, and it requires some level of personal mentoring. Imagine trying to become an Olympic figure skater while entirely self-taught. No matter what your athletic ability, that may well be an impossible task. It would be hard enough just to become an average figure skater, and even if you could, you’d learn much more quickly with a good instructor and would develop further, faster.

Now don’t get me wrong: I’m not saying technology is useless as a means of instruction. But it is at best supplemental to the work of instructors. And to be honest with you, if you were to set up two literature classes with the same reading lists and give one set of students nothing but physical books, notepads, and pencils, and the other set any technology they wanted, I seriously doubt the tech-laden students would do any better. In terms of the development of their reading and writing skills, the teched-out students might even do worse, as they’d have more distractions. The same may be true of math instruction, though math may work better in computer environments because computers can do math. They just can’t read. Computers can’t understand what they are reading the way persons do. Computers don’t understand math either, but at least they can do it perfectly.

Why can’t computers read? Because, first of all, most words in most languages mean more than one thing, and words are reliant upon context beyond themselves for their meaning. So in order for a computer to really be able to read, it would have to have all conceivable knowledge of all conceivable social contexts throughout the history of the literature that it is reading, and then be able to choose effectively what contextual knowledge is most important to that literary text. While human beings don’t have this extent of knowledge, they do still possess contextual knowledge, and they are capable of making choices while reading.
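The point above about context can be sketched in a toy program. This is a purely hypothetical illustration (the sense inventory and the `guess_sense` function are invented for this example, not any real NLP system), and it shows exactly why the problem is hard: the program only “reads” as well as the contextual knowledge we hand-feed it.

```python
# A toy word-sense disambiguator. The word "bank" resolves to
# different senses depending on the surrounding context words.
# A real system would need vastly more contextual knowledge
# than this hand-built cue list.

SENSES = {
    "bank": {
        "finance": {"money", "loan", "deposit", "account"},
        "river": {"water", "shore", "fishing", "mud"},
    }
}

def guess_sense(word, context_words):
    """Pick the sense whose cue words overlap the context most."""
    scores = {
        sense: len(cues & set(context_words))
        for sense, cues in SENSES[word].items()
    }
    return max(scores, key=scores.get)

print(guess_sense("bank", ["she", "opened", "an", "account", "at", "the"]))  # finance
print(guess_sense("bank", ["water", "flowed", "past", "the", "muddy"]))      # river
```

Notice what the sketch leaves out: every sense and every cue word had to be supplied by a human in advance, and any context outside the cue lists (“muddy” rather than “mud,” say) is invisible to it. Scaling this up to all of literature is the impossible part.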

In math, on the other hand, a number means one and only one thing, and that meaning doesn’t change from one era, nation, social group, or language to the next. Human beings, moreover, are not only intellectual creatures but volitional and emotional beings as well, and all of our capacities (our intellect, our wills, and our emotions) are engaged in the acts of reading and writing in ways that they are not engaged in basic math (highly advanced math may be an exception, but only a fraction of a percent of the population is capable of working at that level). We can and often do write with feeling, even if we have poor writing skills. It’s very difficult to add and subtract like we really mean it, though.

So what do students need? They need to learn to do things with technology. I would love for every student coming out of college to have one programming language and some basic instruction in HTML, PHP, and maybe Python or Flash, along with some instruction in networking. Students need basic instruction in word processing, spreadsheet, and presentation programs their first year and advanced instruction in the latest versions of these programs their final year, and I mean advanced. I want students to know how tech works from the inside. Tech isn’t going away. Learning how it works and being trained in it from the start means being able to adapt to its constant changes, and perhaps even to be an agent of change.

Now you might think that students already know how to use technology, so they don’t need to be educated in this area. In reality, though, most students (and I mean the statistical average) know social media and their phones, but they don’t know how to use Word, Excel, or PowerPoint very well, and they don’t know much about the technology itself or about networking. Social media savvy is useful in business environments, but knowing social media doesn’t mean knowing how to use it appropriately for business purposes, and you need to know more than social media to be effective in the workplace.

You also might think that students will be more interested in the material if it’s presented via the tech of their choice, but if they’re not reading print books, they’re probably not reading eBooks either. Many students just don’t read books much at all, and reading Twitter and Facebook feeds doesn’t cut it: reading unstructured, superficial short blurbs doesn’t require the sustained, organized attention that developmentally effective reading does. Tech won’t make old subjects suddenly sexy, and the fact is, students are interested in more than tech. Sometimes they don’t know it yet; it’s our job as educators to show them. And seriously: students who play action-based video games are regularly engaging characters, settings, and plot lines that have existed in literature, sometimes for literally thousands of years. Superhero characters are more often than not based on Greek gods, for example. Students are already interested in this literature, and when they read it, they enjoy it. I’ve had positive experiences teaching even somewhat challenged students Homer’s Iliad and Milton’s Paradise Lost. A compelling story is hard for anyone to resist.

My emphasis on technology in higher ed, on students as users and creators, fosters creativity and problem-solving skills and gives students hands-on experience with the tasks they will be required to perform in the workplace. An emphasis on technology as a means of providing an education, however, slows education down (teachers and students are continually on the treadmill of learning new technologies) and keeps students behind the curve, as the technology used to educate students seldom resembles the technology they will use in the workplace.

So to repeat: Students do not need to be educated by technology. Students need to be educated to use technology. No one here is rejecting technology. If we’re going to use it, though, we need to understand its limitations from the inside.

Michelle Moravec’s “Tales of an Indiscriminate Tool Adopter” provides very useful and practical advice for adopting technology for higher ed purposes.

You might also want to read about how one province in Canada seems determined to innovate in education even if that means using methods that have been proven to be ineffective in all previous trials.


Published by James Rovira

Dr. James Rovira is a higher education professional with twenty years’ experience in teaching, administration, and advising roles. He is also an interdisciplinary scholar and writer whose works include fiction, poetry, and scholarship exploring the intersections of literature and philosophy, literature and psychology, literary theory, and music and literature. His books include Women in Rock, Women in Romanticism (Routledge, 2023); David Bowie and Romanticism (Palgrave Macmillan, 2022); Writing for College and Beyond, a first-year composition textbook (Lulu, 2019); Reading as Democracy in Crisis: Interpretation, Theory, History (Lexington Books, 2019); Rock and Romanticism: Blake, Wordsworth, and Rock from Dylan to U2 (Lexington Books, 2018); Rock and Romanticism: Post-Punk, Goth, and Metal as Dark Romanticisms (Palgrave Macmillan, 2018); and Blake and Kierkegaard: Creation and Anxiety (Continuum/Bloomsbury, 2010). See his website at jamesrovira.com for details.

6 thoughts on “Technology and Higher Ed”

  1. Yes, some studies are out there. Many of them have been funded by the technology producers, so they’re essentially useless.

    I would understand the preference for tablets if eBooks were ultimately cheaper than print books, but they’re not.


  2. Very valuable discussion indeed, James.

    Does anybody here have experience with the use of tablets to completely replace textbooks as well as study guides in higher education? I would love to hear from you. (“Blended learning”)

    Secondly James, I would like to throw a thought around.
    “Students need to be educated to use technology”
    I wonder whether technology perhaps teaches students only to use technology, without teaching them subject matter effectively. I ask because I see students who have not really worked with technology before becoming really smart with it very quickly, but applying it in ways they like and are interested in. (This very often does not include a lot of academics, though. :-)) Generally, our students are really smart with technology.

    Would love to hear everyone’s input!
    Regards from an icy cold but still sunny South Africa.


    1. I really don’t see any meaningful pedagogical difference between using eBooks and using print books. Print books are easier to navigate (flip through), easier to mark up, and easier to bookmark. You can do all of this with eBooks, I know, but you generally can’t view the whole book as easily; eBooks are easier for single-word searches. I have taught several classes in which students purchased electronic versions of the assigned textbooks and others in which they purchased print versions. They seemed to do about the same in both. eBooks are easier to carry around than a huge stack of books, but they’re harder to navigate: you can only view one page at a time.

      Anyone who thinks they’re being cutting edge by using tablet technology doesn’t really know much about technology or about pedagogy. We have a general environment in which the use of tech in and of itself is treated as “cutting edge,” but I really don’t see much pedagogical value in most of it, and the majority of studies conducted in this area seem to have been financed by the people selling or developing the tech, so they can’t be trusted. Any student using eBooks (or any specific kind of tech) for the first time in a semester suddenly has two things to learn: the subject material AND how to use the technology. That’s counterproductive to learning the subject matter. It roughly doubles the work.

      Keep in mind that most students don’t read books much at all, much less read eBooks.

      I don’t think it’s true that students are generally adept at using technology. They probably know how to use their phones, Facebook, and Twitter. They can use Snapchat, Instagram, Imgur, etc., and browse the internet, but that doesn’t mean they’re familiar with advanced Word and Excel functions or PowerPoint (or even basic functions in these programs, unless they were taught in HS or by their parents). They probably don’t know image editing or coding either, again, unless they happen to love that stuff or took courses in high school. The average student doesn’t actually know anything about their computer, how networks work, or even what a network really is, and even then it’s usually at a basic level. So we need to be careful about what we mean when we say students are “adept” at using technology. That usually means “adept at using social media, their phones, and some video games,” some of which is useful, but they need to know a lot more than that, especially to be useful, or to have an edge, in the workforce.


      1. James, I cannot agree more. Added to this, if a tablet is stolen, a whole academic bookshelf and all the work on it are stolen too. Secondly, many eBooks are not optimised for searching. Thirdly, consider a scenario in which students are required to do an assignment and the handbook, study guide, Word document, and assignment specification are all on the tablet. Any advice on how to make this workable for the student?

        With ‘adept’, I was careful to specify exactly what you described above: ‘in ways they like’.

        Any other academics whose students need to use tablets? I wonder if any studies are being conducted on the long-term influence of technology in so-called blended learning.

