Testing Musings

I know it’s been quiet around here; the summer is usually a mad rush to get as much research done as possible without the time constraints of teaching, and it’s no different for me. This is especially so in the “final push”, as it were, but thankfully I have a bit of breathing room as projects move from my desk to those of the people who will make decisions. In the meantime, I’ve also been exploring the possibilities of part-time work to supplement my teaching income (we get about $300 less a month compared to doing research, which is pretty substantial on a grad student budget), and have had particular luck so far pursuing a position as an ACT/SAT tutor with Kaplan. It is that experience that has most recently provided me with fodder for my musings about higher education, and I’d like to share some of those.

As part of seeking this position, I am required to provide my SAT/ACT scores as evidence that I am qualified. Now, although I know those scores are around somewhere, I also know it would take a lot of digging and hoop-jumping to find them; given that this is really a side project, that’s not a time investment I was willing to make. Moreover, those test scores aren’t exactly, well, recent…if I remember correctly, I last took them as a junior in high school, which puts them upwards of ten years ago. I can’t honestly say 12-year-old scores are necessarily representative of my ability today to recall material that was tested back then. So, I decided to just bite the bullet, take a couple of hours, and sit down to go through both tests again (mind you, these are provided by Kaplan, not officially sanctioned tests). I’ll also admit: I was more than a little curious to see how well I retained what had been taught in high school (in hindsight, a qualification exam was perhaps not the best time to indulge that curiosity). My experience and my results revealed some surprising things to me, particularly observations about the nature and quality of the tests that I would never have noticed before becoming an educator.

When I am designing an assessment, I not only want to make sure it covers the appropriate material from class, but also that it makes students use that material in a way that expands beyond the course. Ideally, my assessments test students on things they will actually need to know and do perhaps 5 years after my course; as a corollary, if my tests are well designed, students should be able to do as well (if not better!) 5 years after the fact. This is the same ideal I used to assess the quality of the SAT/ACT, and I have to say I was actually pleasantly surprised! I’ve always been rather dubious of the quality of standardized tests in general, so it was much to my shock that I either scored about the same or even improved on my scores of 12 years ago. Of course, I did go to an excellent high school, which helps in the long run, but if my “years after the fact” assessment ideal is valid, then I can say these two tests are actually surprisingly good. BUT…I have some issues with them.

First, based on my experience, the ACT – even though it is less popular – is from an educational standpoint a better test. Not only is it generally better written (I took these back to back, so I can say that for certain), but it tests its concepts in a real-world and meaningful way. This is especially true of the science section, and even I – a professional scientist – had to stop and think about some of the critical thinking sections. I use all of the skills in those sections (reading, writing, math, and science) on a daily basis, and lo and behold, I made a perfect score of straight 36s. To me, that says that not only am I an irredeemable nerd, but also that in the long run, the ACT is actually testing appropriate material in an appropriate way. The SAT, on the other hand…

I will be honest, I have never performed as well on the SAT as I have on the ACT, though after a couple of classes, I always solidly landed in the 99th percentile on both tests. This time, however, I didn’t even quite make the 90th percentile on math (reading and writing were a different story)! Now mind you, it’s been a while since high school math, but my scientific career is at least partially built around complex differential equations. Nevertheless, when presented with a slew of geometry problems, I was stumped; worse, the reason that I was stumped is that I had forgotten many of the formulae that would be used. In other words, I was not being tested on my ability to identify and solve problems mathematically, I was being tested on my ability to remember information. Moreover, it’s information I never used again, one of the biggest complaints I hear levied against high school math. I always used to think that such complaining was nothing more than a case of teachers failing to articulate the importance of their content, but after taking these tests, I feel that there may just be a kernel of truth to the “I’ll never use this again” crowd.

Of course, I freely admit that all of this is just musing based on my own limited experience; I can’t provide the same sort of reasoned argument I usually try to give, though I bet enough digging would reveal that there is indeed evidence to inform these musings. For now, I have to keep chugging through on this application process (and this “dissertation thing”), but it just goes to show you: you never know when you’ll get some profound insight about what you do.

Some Modest Proposals

It’s been refreshing for me to see the amount of attention that higher ed is receiving as the 2016 election cycle starts winding up. The idea of student debt and making college affordable has been a particularly active one, with almost all of the presumptive candidates weighing in on the issue. Unfortunately, the ideas being offered so far have all been grand notions that are rather scant on details, and I’ll admit I’m a little worried that higher ed will simply be reduced to a political football without having a real conversation about it. Vague talk of college affordability and “reform” sounds nice, but the devil is in the details, and we’ve seen in North Carolina and Wisconsin that “reform” can easily slide into an assault on whole institutions. That said, there is no doubt that we have problems in higher ed, or at least ways we can do better, so I’d like to offer some of my own ideas on how we can improve education in this country. I will say right away that many of these are pipe dreams, and that most are reforms that must come from within the system rather than from without (lest the debacles of North Carolina play out again). Nevertheless, I think they’re a good starting point for a meaningful conversation.

Train Professors to Teach

A PhD is not a degree about teaching, plain and simple. A PhD is earned by mastering a field and making substantial contributions to its body of knowledge, and tenure-track positions are often awarded in large part on the merits of a body of scholarly work. Nowhere in the process is teaching emphasized (though I am sure there are likely exceptions to that rule somewhere), yet pretty much every academic is going to find themselves in the classroom at some point. Furthermore, universities are under increasing pressure to emphasize quality education for their undergrads, and while teaching-centered positions are starting to pop up across the board, there is little to no training on how to be an effective teacher. Many universities do have limited/optional programs designed to help promising teachers prepare for faculty positions – I was fortunate enough to be in UGA’s Future Faculty Program, which does just that – and many degrees do require at least a semester of teaching, but the fact of the matter is that we are simply not teaching newly minted PhDs the skills that make an effective classroom work. At least one reason for this is the misguided belief that expertise makes one an effective teacher, or that the way we were taught (lecture, bookwork, etc.) is the right way to teach. This sort of inertia is not easy to overcome, but we’ve started to see gathering winds of change in efforts to emphasize e.g. experiential learning and the flipped classroom. While research will always be the sine qua non of an academic’s life, if we really want to improve higher education, we need to acknowledge that teaching is itself a particular skill, and should be taught during graduate school. K-12 teachers receive special training; why not university professors?

Integration Along the Education Spectrum

I have talked to maybe one high school teacher seriously about what they do in the classroom, and those conversations happen simply because one of the faculty here teaches high school occasionally. Aside from that, I essentially have no idea what students are graduating high school able to do, and that combined with the sheer variety of secondary ed quality forces me to basically start at square one with new students. Common Core is of course one attempt to ensure uniform capacity across high school students, but the simple fact of the matter is that high school and university teachers just don’t talk to each other. How many opportunities are missed to create large-scale, years-long and significant curricula? I’m not saying high school should simply be preparation for college, but I firmly believe that curriculum reform (in both high schools and colleges) should involve extensive collaboration between the actual educators at both levels. Not only would this better prepare students to succeed after high school, but it would also help make the most of the college years by minimizing overlap and “re-instruction.” At the very least, such collaboration would help college instructors develop a realistic model of what their students are able to do the minute they step through the classroom door; knowing your students’ conceptual frameworks is the crucial first step to creating a significant learning experience.

Move Adjuncts to Full Time Positions

Much has been written about the “adjunct crisis” and its ramifications, and I know I’ve shared stories on this blog of adjuncts who have to work multiple jobs and spend hours commuting simply to make ends meet. No matter how skilled an instructor is, there is no way the quality of instruction can be fully realized under those conditions. Moreover, developing a feel for a school’s deeper mission can take substantial time and experience on a campus, and having a large suite of migrant workers prevents that sort of deep connection to a particular institution. There are of course bad adjuncts, just as there are bad tenured professors, but on the whole these contingent workers are exceptionally skilled teachers (it takes no small amount of skill to tackle that many classes and still provide quality) who do great things for a school. The only reason they aren’t tenured faculty is because it’s cheaper to hire adjuncts, plain and simple. Of course, there will always be a need for a flexible teaching force to deal with fluctuations in demand, but there is absolutely no good reason for 75% of the teaching workforce to be contingent faculty. If anything, that ratio should be inverted. Unfortunately, the only real way to make this happen is with more money, and as of right now it doesn’t exactly look like funding is going to start meaningfully moving in the right direction.

Work With Businesses (but not FOR Businesses)

I put forward this proposal with some hesitation, because there is a distinction between working with business interests and working for them, and that distinction tends to get blurred when put into practice. This is especially the case if political powers start meddling and bringing an ideology to higher education; North Carolina is a perfect example of how this can go very wrong. There is no reason to assume the mission of the business sector naturally aligns with the core mission of a university, and there is a very real risk of businesses attempting to turn the university into nothing more than expensive job training (despite the fact that this is not strictly the mission of the school). Nevertheless, working closely with businesses to provide internships and other experiential learning opportunities could dramatically improve the learning outcomes of students; one of the best ways to learn something, after all, is to do it. As an added bonus, students who participate in such programs have the chance to pick up some of the specific skills and experiences that make up the notorious “skills gap.” There is of course more to a college experience than preparing for a career, and we need to do a better job of articulating that, but if done well these sorts of business-school collaborations could prove extremely fruitful.

One Size does NOT Fit All

In my mind, one of the biggest and most insidious problems plaguing higher ed is the lack of understanding about the different types of schools out there. Everyone deserves a chance to get a quality education, but everyone also has different needs and goals. Someone who loves cars and wants to pursue a career repairing them may not benefit so much – in the career sense – from getting a 4-year degree in engineering (though I would argue they benefit generally from the experience and new world views), and not everyone fits into the traditional four-year experience. Of course, with tuition becoming one of the most important funding sources for schools, there is pressure for a particular school to try to be everything for everyone, so this problem cannot be entirely corrected without also addressing pervasive funding issues. The recent attention from the President on community colleges is certainly a good start, and I think we have a long way to go in recognizing the importance of those schools, but as always it will be the follow-through that makes the difference.

As I said earlier, many of these ideas are really reforms that must come from within higher ed, and some are certainly more idealistic than others. Regardless, we cannot deny that higher ed has problems, but these are problems that we can fix. What’s more, if we don’t take it upon ourselves to improve our schools, we risk others doing it for us. And that usually doesn’t end very well.

Away in the Field

I know it’s been somewhat silent on the blog for a few days, because I’ve been traveling and busy making arrangements for more travel. This upcoming week I will be down at the Georgia coast doing field work, so unfortunately it will probably be quiet a little bit longer (but the good news is the only thing standing between me and a PhD will be a final chapter!). Your regularly scheduled musings on higher education will return around the middle of next week if all goes to plan.

In the meantime, let me direct your attention to a few interesting articles that I’ve read recently and will certainly provide food for thought.

This excellent article in the Chronicle deals in more detail with the subject of job training (see my own take on the issue here).

A very nice take on an important type of diversity that is, in my opinion, entirely under-discussed: socioeconomic diversity. It’s an important topic, to say the least.

A great and accessible take on fixed versus growth mindsets, something every instructor should be aware of.

The Taxonomy of a School, and Why it Matters

I’ve said it time and again, but it bears repeating: I think one of the biggest problems in the national conversation about higher education is that “higher ed” is treated as one single monolithic entity. There is simply “college.” Of course, anyone in academia immediately recognizes that is a fallacy, and I honestly would not have been surprised to see the hashtag #notallschools popping up on social media. Yet the idea is ingrained in the public consciousness, and the statements and stories that enter the media show very clearly that policy makers and pundits do not have a clear understanding of the diversity of higher ed institutions. Furthermore, there seems to be little appreciation of the different roles schools across the spectrum carry out, something I’ve written about before to a small degree. In my opinion, this lack of understanding is a big problem, in no small part because students cannot make smart decisions for their future if they do not understand the types of paths available to them. I have no doubt that the student debt issue is at least partially a consequence of a failure on the part of higher ed to dispel the notion of a “one size fits all” school (though there are possible reasons for that). Poor understanding of institutional missions also makes responsible policy making nearly impossible; if most policy makers’ chief experience comes from their time as students, then their decisions and pressures will reflect only that one role of the university system. Clearly, we need to do a better job of explaining what particular types of schools do.

Schools’ Fundamental Roles

There are many possible ways that higher ed institutions can be divided: public or private, 2-year or 4-year, types and proportions of degrees conferred, etc. Most if not all of these aspects factor into the commonly used (though perhaps less well known outside higher ed) Carnegie Classification. Although the Carnegie scale is extremely useful, I think for simplicity, we can boil it down to one aspect: the relative emphases on teaching versus research. A school like UGA, and many of the other big names in state schools, is primarily a research institution, whereas a school such as my undergraduate college Hendrix is undoubtedly a teaching-centered institution. This is not, of course, to imply that teaching is unimportant at UGA or that research is inconsequential at Hendrix, but there is no gainsaying the very different roles the schools play. Even in terms of the research, both schools have very different focuses; here at UGA, research is the end unto itself and is intended for publication, while at Hendrix the research is clearly about the learning experience (and many projects focus on producing conference presentations that students themselves give). Those differences are almost always reflected in hiring practices as well: at an R1 like UGA, nothing can rescue a bad research talk, and at Hendrix, no one will be hired who gives a bad teaching demonstration (though in all honesty, in today’s market, a successful candidate needs to nail both aspects at either school).

So why do I think these distinctions are important? For one thing, in my experience, many students and families do not really understand the type of school they are applying to. While I firmly believe a good education can be had anywhere with enough effort, I also think students should know from day one how the faculty might differ from place to place. There are many great teachers here and at other large universities, but the vast majority of faculty members here are not hired for their teaching; unfortunately, I often see the negative effects of that when observing (directly and indirectly) classes. On the other hand, there is certainly something to be said for the rich research experiences that potentially exist for undergraduates. Although I am biased towards small liberal arts colleges for a variety of reasons, many students have different wants and needs, and they need to understand a university environment and its mission to make the best decision. Even more importantly – something I feel very strongly about, as I’ve said before – many students’ careers and aptitudes are simply not good fits for the 4-year school, and they could save a lot of money and frustration by being aware of, and availing themselves of, a quality 2-year school.

In terms of policy, if schools do not clearly articulate their own roles, they face the very real threat of being held to standards or pressures that are not in line with their own missions. The case of North Carolina comes to mind especially, and their ill-informed attempt to force a 4-4 teaching load on professors (that’s 4 courses per semester, if you’re not familiar with the jargon). Policy makers clearly saw the universities’ main role as institutions of instruction, and attempted to work in that mold despite the fact that such a bill would have crippled the actual main mission of producing research. In a similar vein, policy makers seem to focus particularly on the large R1 institutions when it comes to job training, and in so doing tend to ignore the schools that actually perform those functions and are often in dire need of support: community and technical schools. Now, I’m not saying that the big R1s should be left alone and not held to higher educational standards, but I am saying that we need to have a clear understanding of schools’ roles if we are to have this conversation.

Shifting Roles

Now that I’ve spent a good bit of time articulating the roles of these different school types, I need to mention that they are not in any sense solidly fixed. For a multitude of reasons, many schools are facing pressures to adopt more fluid or integrated missions. Many of these are internal and aim to improve student success: Hendrix (as just one example) is one of many liberal arts schools actively increasing student involvement in research to improve their chances of post-graduate success (e.g. acceptance into professional school). UGA, for its part, recently announced an initiative to emphasize experiential learning; although this is not a new idea, the move on the part of a large R1 is an important signal of some of the changes sweeping through schools. In many ways, what underlies these changes is the realization that teaching and research are not necessarily separate endeavors, and certainly not mutually exclusive ones, and so I think this increasing fluidity on research versus teaching is an overall good thing. As for associate’s colleges, although their roles have not really changed as much, they have been getting more and more attention as a legitimate career path, and I will admit I was surprised to learn that there is much more to them than job training.

Of course, having said all that, I must point out that change for change’s sake is dangerous, and especially troublesome is what happens when a school tries to be something it is not. The case of Georgia Southern comes particularly to mind; it is an excellent regional teaching-centered university, but through a series of shifting leaders has been trying to pivot towards more and more research. This fight to be something that the school is not has led to substantial friction between leadership and faculty, including a well-publicized, now-public email (though for the life of me I can’t find my link). Unfortunately, this may be symptomatic of the larger drive for schools to be more and more a one-size-fits-all model in order to secure more students and the tuition funding associated with them, but no one wins when a school loses its identity. Although we are undoubtedly in a time of substantial shifts in higher education, we can never lose sight of the roles our schools play. No school is completely like the others, and they cannot be treated as such.

Why a Liberal Arts Education Matters

My time at Hendrix College changed my life. I can say that without a breath of hesitation. Granted, the college/university years are themselves years of great change, a massive transition from youth to adulthood, but I have no doubt that the person I am today is a direct product of my experience at Hendrix. I can further say that undoubtedly who I am would have been very different had I attended another school, such as a big R1 institution. Although Hendrix did and does many great things (typical of a quality liberal arts college), if I were to boil it down to one single factor, I would have to give most of the credit to my liberal arts education. Had I not gone there, I highly doubt I would have uncovered a love for history, or had the chance to learn and master multiple instruments to symphony-level proficiency. Gone would be the days when I spent easily twice as much time learning about things outside of my field than I did on biology, when the foundation for my modern academic identity was formed. I consider myself a better person for my days there, a better academic, and a more responsible member of our society, and I know I’m not alone. Many people who have received liberal arts educations have had similar life-changing results, yet very rarely have we as academics really sat down to articulate why the liberal arts approach is important. Unfortunately, if we do not make that case, it is very easy for the liberal arts to be attacked by policy-makers and students alike; North Carolina is just one example of programs being destroyed because of a poor understanding of their value.

A Matter of Terminology

Sadly, the conversation about liberal arts and the utility of different degree programs is marred to some degree by matters of language. No small amount of opposition to liberal arts programs from the right comes simply from opposition to the word “liberal,” despite the fact that there is no direct relationship between liberal arts and liberalism*. “Liberal arts” is also sometimes used as a synonym for the humanities, rather than as a term for broad, wide-ranging education. This treatment is not entirely unjustified, because the historical liberal arts line up well with the humanities of today, but I prefer that those terms be separate: one refers to a type of education, and the other refers to specific fields of study. Let me also be clear that I am not using “liberal arts” as a synonym for “small liberal arts college;” although the SLAC is a leader in liberal arts education, that type of study can be applied at far more diverse institution types than just the small college. When I refer to the liberal arts education, I am referring specifically to a broad-ranging course of study, usually embodied in a core curriculum that emphasizes coursework in multiple disparate disciplines. That is the type of education whose value I wish to defend.

So why then, do I think a liberal arts education is so important?

The Value of Flexibility

Because a liberal arts education is inherently too broad to endow expertise in any one subject (even one’s own major is just an intellectual appetizer), one of its primary focuses is not on subject mastery, but on learning how to think and learn in the different disciplines. Our ideal “finished product” student is one who has both the drive and the capacity to continually learn a wide variety of things throughout their lives. Furthermore, we hope to help students develop the ability to be adaptive, to be able to quickly grasp the core of a problem, draw on broad knowledge bases to figure out where to start, and to know what tools to use. In my mind, that is far more valuable than giving students technical mastery, because we have no way of knowing what the main tools will be in ten, even five years from now. According to both business owners and educators, those who succeed will be those who can quickly adapt to and learn new tools, and those are exactly the types of people the liberal arts education seeks to create. Unfortunately, even though businesses stress the importance of mental flexibility and adaptability, they continue to criticize liberal arts educations for failing to give students the specific skills they need. This is of course an inconsistency, but it should also be on us as educators to articulate why, in fact, businesses should want our students. There is more to an education than career preparation, of course, but that doesn’t mean we should shy away from the “sell.” Yet intellectual flexibility is just one of the liberal arts’ selling points.

The World Is Interdisciplinary

I have written many times about the skills mismatch and job training, and I think one thing that is often lost in this whole conversation is how really, really complicated the world is (I doubt that comes as a surprise). Any field that claims to have all the answers – including science! – is deluded, particularly when discussing the very real and complex problems we face as a society and species. For instance, one of the main things I study is the impact of environmental degradation, but although I can tell you whatever you need to know about that damage, it is only part of the story. To truly understand, grasp, and address the problem, we also need to know the cultural and societal forces that lead to it, and the socioeconomic obstacles to change. An environmental scientist can’t really tell you those things, which is why interdisciplinary work is so crucial. A liberal arts education is one of the best ways to be primed for such work, and many of the top schools are even working towards explicitly using interdisciplinary education. Although this approach will not lend expertise in every field, it does teach students to appreciate the value of other fields and how they complement each other. At the very least, training students to realize that they alone don’t have all the answers to our problems is in itself something crucial in this nuanced world.

Two [or more] Heads are Better than One

One of the things that I found most fascinating about my time at Hendrix was the exposure to the sheer variety of different ways of thinking. Almost every discipline has a different way of viewing and thinking about the world, and my time at Hendrix exposed me to the full range of them (almost…I never took any economics). Although I am first and foremost a scientist and educator, my life is richer and my approach to problems more sophisticated because I’ve been exposed to the thoughts of great minds throughout history. Even on a personal level, the exposure to the great people and works of history has had a formative impact on my life: Beethoven’s music and life have been essential to my struggle to make peace with the steady decline of my hearing. Some might scoff at the personal value of this aspect of the liberal arts, but I would immediately point out that higher education is broadly about developing new ways of thinking, including how we think about ourselves. I would also argue that there is a more immediate usefulness to this broad exposure: as we move into a more globalized and interconnected world, sensitivity to different cultures and ways of thinking is going to become a more and more crucial element of success. What better way to develop that than by actual, direct exposure?

In conclusion, although I do not think higher education should be justified purely in terms of its economic usefulness, I do think there is a very strong case to be made for the liberal arts in precisely those terms. We live in a world marked by complexity, where many problems do not have a clear answer, and in a globalized economy where our way is not always the way of others. The consensus among both business owners and education experts is that success in the world of tomorrow will be in the hands of those who can think broadly and flexibly. While we struggle to improve and sometimes wholly redefine education, the good news is that we already have a way of training people like that.

Side note: *Although there is no direct relationship between liberal arts and liberalism, there is at least an indirect connection: studies have shown that one key difference between conservatives and liberals is that liberals tend to have more flexible, nuanced, and multi-perspective worldviews. These types of worldviews are precisely the type of thinking that a liberal arts education aims to engender.

How Funding Shifts Can Hurt Students

The narrative of declining state funding in higher education is a pervasive and troubling one. Although it is dangerous to talk in broad generalizations when discussing academia, and there are bright spots of change, as a general rule, state spending on public colleges has declined dramatically. This is especially true in states hard hit by the recession, in which declining tax revenue forced cuts to university budgets (and at a time when it seemed re-education was a major priority). Fortunately, some of the gaps in state funding have been filled at least partially by federal support, especially through the Pell Grant program, but in a day when budget cuts in the hundreds of millions are making headlines, it is not hyperbole to say that higher education is facing major money difficulties. Of course, increases in costs – often unsustainable ones – have also contributed to the dire financial straits of some schools, for a number of reasons that are in some cases beyond my knowledge (an economist could tell you more about “cost disease”). While this whole story about money is complex and part of a much larger conversation, I want to describe why the increasing costs of college and the shift away from state funding could end up harming the ones we should most protect: students.

As a consequence of declining state support, many universities must make up the gap with increasing tuition and institutional fees. In the case of the University System of Georgia (which includes UGA, Tech, KSU, and all the other 4- and 2-year public colleges in the state), tuition and fees contribute pretty much the same amount to the budget as state appropriations, just shy of 2 billion dollars in each case. Although I am focusing only on USG (in part because it’s my school system), I would be willing to bet that many state systems are in similar situations. What this means, then, is that enrollment is one of the key factors driving the budget of the system; when fewer students enroll, the entire system feels the pinch. This gives the schools a bizarre incentive to enroll as many students as possible, which is a terrible thing for a couple of reasons. First, this increase in enrollment typically leads to larger classes, less student-faculty engagement, and usually an increase in contingent teaching labor (none of which are good for students). Second, the importance of tuition income has the potential to force schools to try recruiting students who honestly do not belong at a particular school, or in college in general. A college education is not for everyone, and certainly a 4-year degree is not what is best for some – if not many – careers. Yet the schools have a vested interest in convincing as many people as possible that a bachelor’s is good for them.

This problem, what I call “degree creep” (that actually may be a legitimate term, I’ve heard it bandied about before), has a number of important ripple effects. First and foremost, recruiting students who honestly do not stand to benefit from a 4-year degree or may not be intellectually prepared for college has the very real potential to needlessly increase the overall student debt load; we have all heard of the stories of students tens if not hundreds of thousands of dollars in debt for a degree they didn’t need. Another important ripple effect is that relying on tuition pushes schools to try to be everything for all students. I’ve written a little bit on this topic before, but to summarize, the apparent drive to focus on career preparation detracts from the overall mission of a university (research, service, and teaching). The old adage applies: if you try to please everyone, you will please no one; a school simply cannot be everything to everyone, nor should it. One final potential ripple effect that I want to put forward is that the drive to recruit as many students as possible might in fact be contributing to rising college costs.

Students require infrastructure, and infrastructure is not cheap. Even now, USG is spending hundreds of millions of dollars on building renovations and construction, in part simply to better serve larger numbers of students. State appropriations potentially pay for some of that, but it is easy to see how that could be a self-perpetuating issue: schools spend millions to support more students, recruit more to get more money, requiring more infrastructure investment! Furthermore, in order to recruit competitively, schools may find themselves in the position of needing (or – more likely – thinking they need) to build fancy, luxury facilities to cater to students. How can a school facing severe budget problems justify building such lavish facilities? “To attract students and ultimately bring in revenue.” Treating students like revenue streams also has the pernicious effect of reducing education down to a series of “transactions” rather than a genuine learning experience, but that is a topic for another time. Suffice it to say, no one wins when students are just one more way to financially support the system.

Let me close with the obligatory reminder that every school, and every school system, is different, and much of what I say refers primarily to the large four-year schools such as UGA. I feel very strongly that community colleges and their role have been ignored far too much in the national conversation about higher ed. Unfortunately, as I’ve been forced to say in many previous posts, there are no easy answers to some of the problems higher ed is facing, funding and otherwise. I am absolutely certain, however, that academia on its own cannot provide the solutions. Those can only be found when we as a society grapple with some of the hardest questions about higher education and the type of culture we want to foster. I am hopeful that that conversation will happen soon – education has already become an important campaign issue for 2016 – but when it does, we academics cannot shy away from taking part.

Learning Music to be a Better Teacher

Want to become a better teacher? While there are many excellent ways to improve your instruction, I’d like to offer one that may seem a little unusual: learn a new instrument.

I will grant that this seems at first a completely useless bit of advice; after all, what use does a biology instructor have for being able to play piano? On the surface of it, not much, though I’d also like to add that – in my mind at least – playing music is a reward on its own. There is of course a salient argument to be made that learning music makes it easier to make connections across disciplinary lines, and that alone is a huge benefit to being fluent in an instrument. As an example, I can say without a doubt that my love for music makes me a better instructor for my non-majors simply because it gives me another chance to connect with students outside of the course material. I’ve written enough about the importance of those student-teacher connections that I don’t think it needs much repetition here. Instead, I want to argue (and hopefully show) in this post that learning an instrument – more specifically, the process of learning one – provides distinct benefits derived from the insights it gives us into the process of learning.

Learning an Instrument is a Multi-faceted Process

When I first learned violin, it became very clear very quickly (and very noisily) that this is an extremely complicated instrument to master. Violin was the last of five or six instruments I play, so I already had the general knowledge of music down fairly solidly, but the actual mechanics of the instrument were completely mind-boggling. I had to learn how to properly hold the bow, how to hold the violin, proper posture, proper finger placement, and on through a very long list of skills. And of course, not only was each separate skill a challenge, but I also had to learn how they all fit together! And then there was the more musical side: emotions, timbres, dynamics and so on. The same can be said of many instruments, yet to a skilled musician, playing their instrument is simply one fluid act. We just do it, without thinking consciously about it. We can say the same thing about the skills that make up our field; truly doing ecology, for instance, requires knowing the basic foundational knowledge, understanding the extent and limitations of our knowledge, being able to produce and digest scientific research, and placing it in the context of the greater pool of human knowledge. Each of these is a distinct skill, yet to a distinguished scientist, it all makes up one thing. For many of us, our field is likely second nature (we are unconsciously competent), and so learning an instrument can potentially improve our teaching by making us consciously aware of the multi-faceted nature of learning. In a sense, it reminds us of what it feels like to be incompetent, which can help us design better courses that reflect more aspects of our field than simply the foundational knowledge.

Being a Novice Learner Again

Let’s face it: learning comes naturally to us instructors. We are the embodiment of lifelong learners, and by the very nature of what we did to get there (particularly the dissertation or thesis process of advanced degrees) learning has become something we can do almost effortlessly. Many of us may have forgotten how extremely frustrating learning something new can be, and I have no doubt this colors not only our expectations for students, but also how we perceive and respond to that frustration when we see it in students. I have seen many a professor respond with “how do you not understand this?!” style shock when instead what we should be doing is offering encouragement and reassuring students they cannot be immediate experts. Learning an instrument is, frankly, a pain in the butt. Look at the long list of skills that went into the violin; the only way to master them is long hours of practice and self-criticism. It’s not a comfortable process, and after 7 years of playing I still have a lot of stuff to work on. Just as learning an instrument can make us more conscious of the multi-faceted nature of learning, it can also make us [painfully] aware of the frustration that comes with breaking down knowledge barriers. It might also highlight areas where our expectations for students are misguided; just as no one is an instant advanced musician, we cannot expect our students to be instant experts in our fields.

The Song of Learning

In many ways, I would argue that learning the totality of a field (i.e. the overarching goals of, say, a whole course) is almost perfectly analogized by the learning of a new song. Although we may spend days or weeks on particular parts of that song/field, it truly only comes into existence when we play the whole thing in context. This also highlights a very important learning strategy called interleaving, where students focus on individual aspects of a skill, interspersed with focusing on the whole context or message. Musicians learning a song do the same thing; our orchestra rehearsal seasons begin with a night where we just play through the entire concert to get a rough feel for the grand scheme of the works; subsequent rehearsals are spent focusing in much more detail on particularly troublesome spots. I’ve been slowly learning the opening to Bach’s Violin Partita No. 2, and there are times when – although I prefer to play the whole thing – I spend half an hour on one or two measures. It is not much of a stretch to see how this same approach is used in learning, and so going through the process as a novice musician can potentially give us important insight that we might use to structure and plan a course.

To close, I must point out that the three main benefits above are absolutely not limited to learning an instrument; indeed, I’m fairly confident that most skills sufficiently far from our areas of expertise could yield similar insights. Of course, listening to and playing music has documented cognitive benefits as well, though that’s far outside my ken (perhaps it’s a new skill I should learn!), but I think it can also directly impact our teaching by reminding us what it feels like to be novices. And even if that weren’t true, is not music a worthy end unto itself?

Online Learning: a powerful [and misused] tool

When I think of the changes that have swept through academia in the decade I’ve been in it, it’s difficult for me to think of a bigger one than the advent of online education. I have “come of age” (both literally and figuratively) at the same time as IT forces have allowed large-scale online instruction, and it is now difficult for me to imagine an academic world without online instruction. Of course, distance education is not new, even if it has changed faces, but at no time before has online learning so deeply permeated undergraduate education. Although this can be a force for good, the advent (some would say “tyranny”) of online education has not always been a net positive; I have personally experienced dreadful online courses and found myself cursing their existence. The fact of the matter is that online learning is nothing more or less than a tool – not an end unto itself – and like all tools, the benefits or harms depend on how well it is used.

Despite all the negative press online learning gets, and despite the pushback from those who [usually justifiably] insist on face-to-face learning, online education can serve a very powerful purpose. Many students – particularly those nontraditional ones who are returning to school to find a new career – simply cannot be in a classroom all the time. Time demands from a part-time job or other concerns might make scheduling impossible, and transportation might be unfeasible or unaffordable. Other students might simply not work well in a traditional classroom for behavioral or other personal reasons. For all of these, online education can make the difference between getting an education or not getting one, and in those cases it becomes a hugely beneficial tool. Unfortunately, in practice, online education is not always used primarily as a way to increase accessibility. For many universities, online education can be a very attractive revenue stream at a time when traditional state support is dwindling; this is especially the case if the online courses are “farmed out” to junior faculty or even private companies that dramatically lower the front-end cost. Sadly, many junior faculty members – already a group that may not have specialized training in instruction – might have very little support in developing online courses, and the pitfalls of bad course design are dramatically inflated in online courses compared to their face-to-face counterparts. That said, online coursework does not necessarily have to be inferior in quality, but to keep it from being so, we have to know a few major things about designing and executing online courses.

Teaching a good online course is not easy

One reason that many universities might see online learning as an attractive revenue stream is the apparent notion that it can be done at much less cost and effort. Rather than having an expensive tenured faculty member, an online course can be executed by a part-time contingent instructor whose primary job is to develop, curate, and grade materials. Why pay for a full-time, fully supported professor to teach 20 students when you can get the same tuition and spend half the money? That’s a gross oversimplification, I know, but the point is that treating online courses as some sort of junior teaching endeavor ignores the very real and difficult challenges online learning presents. In a traditional class, an instructor has immediate and tangible feedback about how the class is going, and can make immediate changes as needed. Online courses, on the other hand, may be entirely written beforehand, and in some cases there may be little to no meaningful interaction with some students. In this case, it takes a very skilled and adept instructor to both design a course that won’t run into problems, and to be able to pick up on scant signals that something might be awry. Although there are many excellent adjunct instructors out there, many may simply not yet have the experience or support necessary to tackle the difficult job of creating a great online course.

Online students are not the same

At the risk of making sweeping and unjustified generalizations, the students who take online courses simply are not the same – on average – as those in traditional courses. They may be non-traditional students just returning to school, they may be students who are simply not comfortable in traditional classrooms, or they may be students who view online courses as an easy way to get the same credits without the time demands. All of these cases present unique and often difficult challenges to an instructor. For instance, returning students might be overwhelmed by being in the classroom again, and may be in need of an intervention to provide the academic and moral support to help them gain confidence. Recognizing and providing that support is not easy, and once again, if online courses are “farmed out”, they may be taught by faculty who for any variety of reasons are not prepared to deal with student issues like that. Even my own immediate teaching supervisor, who is herself an excellent teacher, was surprised by how different the students were in her experimental online lab course, so this is not a trivial challenge. The fact of the matter is that students who sign up for online courses are self-selecting to a degree, and knowing the type of student you will have is a major factor in executing a good online course, one not often considered in the course design process.

Providing rich learning experiences is difficult online

One of the most powerful aspects of a good course – traditional and not – is providing rich learning experiences to students, which lies at the core of active learning strategies. One of the reasons I love teaching labs is because labs are inherently active learning, but lecture time slots can also be used for working real-world problems, designing projects, or any other variety of learning experiences beyond content delivery. When dealing with a course delivered entirely online, this becomes much more difficult. This is not, of course, to say that it is impossible, but in my experience many online courses focus primarily (if not exclusively) on content delivery. This may partially just be a reflection of the belief that content delivery is the sole purpose of any course, but it can become even more harmful in the case of online learning. In a traditional lecture, students at least have the chance to potentially engage with the material through dialogue, questions, etc., but in an online course of the same material, that possibility may easily be lost. Therefore, it becomes even more important to actively consider the learning experiences that will be provided online to students, because if they are not explicitly considered, the course risks falling flat.

There is no substitute for face time

The most salient argument against online education is that it does not allow the same sort of personal interaction and student-teacher relationship traditional classrooms allow. We know that the student-faculty relationship is possibly the most important aspect of a good education, so this is a damning argument. Unfortunately, there is not really a good response except making sure that every student has a chance to interact personally and meaningfully with the instructor. And I don’t mean asking questions and getting answers, but actually engaging in a deeper dialogue that extends beyond the class and allows building that relationship. This simply requires a lot of time, often as much [if not more] time as a traditional course would require, and there’s no way around that if we want an online course to work. Furthermore, as mentioned above, the students in an online course might not even want that student-teacher interaction, or they may (in the case of returning students) be intimidated by the technological component. None of these are truly insurmountable obstacles, but they require a skilled hand to deal with, and overcoming them is the sine qua non of a successful online course.

For all their disadvantages, online courses and online learning are here to stay. This is not necessarily a bad thing, but it is absolutely essential that we as instructors and faculty members are aware of the unique challenges associated with online instruction. I think it is clear that it is not appropriate to simply hand off online coursework to a novice part-time instructor, yet the possibility of saving money by doing so is one of the motivating factors driving university decisions. Online coursework can be very powerful in terms of increasing education accessibility, but that power cannot be realized if the proper investment in skilled instructors is not made. If online course work is treated simply as an additional revenue stream, it will lead to a situation where no one wins: students will lose out, and faculty members will see their role continue to erode into that of mere content-delivery vehicles. On the other hand, if done well, online learning is a very powerful tool for inclusion, and so should be pursued as far as possible to that end.

I’d like to close with my own vision of the future of coursework, online and not. The internet is a very powerful engine for finding and accessing content, and so I think the ideal course in the future will be a hybrid of online and traditional. Students will find and absorb content on their own through online delivery, and the limited “lecture” time will be spent on active-learning experiences. I don’t think we’ll ever move to an entirely digital classroom (maybe…holodecks, anyone?), but that doesn’t mean online tools can’t dramatically improve the courses we deliver. IT is a tool, and like all tools, it is only as good as how you use it.

Science Friday: Uncertainty and Climate Change

I’ve decided every Friday (in deference to Ira Flatow’s show of the same name) to dedicate my Friday blog post to issues of the environment, ocean, ecology, conservation, and just generally, science. In other words, what I actually study when I’m not reading/writing about education and policy. This is motivated in part by a nice post that appeared on Inside Higher Ed recently, but is mainly just a chance for me to exercise some of the informal science communication I hope to explore as a career. For this first Science Friday, I’d like to discuss climate change, namely the uncertainty in climate science, where it comes from, and what it means.

One of the most common refrains used by those who disagree with climate scientists is that the “science is just not settled.” These same people tend also to point to the failure of many climate models to accurately predict global sea surface temperature and sea level rise, and the apparent lack of significant warming over the past decade (even though that apparent “slowdown” may not actually be real). Why should we make major policy decisions if we’re not 100% sure about the science behind it? That’s a perfectly reasonable point, but I think it highlights a grave misunderstanding of uncertainty, statistics, and climate patterns that permeates the public discourse about climate change. Although I am not specifically a climate scientist, I am deeply familiar with the issue as a marine scientist, given the intimate connections between the ocean and climate. I am also a numerical modeler, so although I do not work directly with climate models, I have extensive experience with the pitfalls of modeling, and so hope I can elucidate more clearly what we mean when we talk about “uncertainty” in the models.

Let me be very clear about one thing: climate change is happening. There is no scientific doubt about that based on what we currently know. There is also pretty much no scientific doubt that human civilization is a main driver due to the production of greenhouse gases such as carbon dioxide. We have a clear trend, and we have a clear mechanism; that’s actually more than we can say about many natural phenomena (such as gravity, whose underlying mechanism we still can’t fully explain). Conceptually, there is no uncertainty whatsoever about the reality of climate change and the role of humanity. Scientists tend naturally to be equivocal due to the hairy nature of “proof” in empiricism, which has unfortunately helped feed the “uncertainty” narrative, but this is one of those very rare times when there is a consensus. What is less clear is the magnitude of change, its anticipated effects, the speed, and how it fits into the larger context of natural variation. So, the claim that the science is not settled is – from the most technical standpoint – true, but not in the way that it’s usually conveyed.

One of the most powerful tools for predicting large-scale trends (in terms of both space and time) is computer modeling. I use this technique to predict natural processes in an estuary that are nearly impossible to measure, and meteorologists use models continually to make forecasts so you can decide whether to wear a T-shirt or pack an umbrella. Models have changed substantially in recent memory thanks to a dramatic increase in computing power, but at their core they remain essentially the same: we start with a basic conceptual understanding of a system, break it down into mathematical representations, and then use data to parameterize or “fit” a model so we can trust its predictions. So what does this have to do with the climate change conversation, other than knowing how we actually make predictions? I would argue that understanding the uncertainties and pitfalls of modeling is essential to understanding uncertainty in climate science.
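
To make that pipeline concrete, here is a minimal toy sketch in Python. The model form, the parameter names, and the “data” are all invented purely for illustration – this is neither a climate model nor my estuary model – but it walks through the same three steps of conceptualizing a system, writing it as equations, and fitting the parameters to observations:

```python
import numpy as np
from scipy.optimize import curve_fit

# Step 1: conceptual model -- say, "temperature anomaly = long-term trend + seasonal cycle"
# Step 2: a mathematical representation of that concept (all names and values hypothetical)
def toy_model(t, baseline, trend, amplitude):
    """t in years; returns a made-up temperature anomaly."""
    return baseline + trend * t + amplitude * np.sin(2 * np.pi * t)

# Stand-in "observations": synthetic data with measurement noise added
rng = np.random.default_rng(42)
t_obs = np.linspace(0, 30, 360)                               # 30 years of monthly points
y_obs = toy_model(t_obs, 0.1, 0.02, 0.5) + rng.normal(0, 0.3, t_obs.size)

# Step 3: parameterize ("fit") the model against the data
params, cov = curve_fit(toy_model, t_obs, y_obs)
print("fitted parameters:", params)
print("approximate parameter uncertainties:", np.sqrt(np.diag(cov)))
```

Real models are vastly more complicated, but the logic is the same, and so is the catch: the fitted parameters are only as trustworthy as the data used to constrain them.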

If we run a model and the results don’t make sense or don’t fit actual measurements – as is the case with many climate models – there are a few possible reasons. First, it could be that the conceptual model, how we think the system works, is simply wrong; although that is sometimes the explanation, it’s actually the least common source of model uncertainty. Another major issue with modeling is that many of the equations involved don’t actually have exact, analytical solutions; they must be approximated using numerical methods such as Euler’s method (the most basic of many related approaches). Although modern software is typically very good at providing accurate approximations, it can only do so much with highly complex systems such as climate that involve hundreds (if not thousands) of separate equations. Generally, though, that is a problem well known to modelers, and one that can usually be managed with well-planned methodology. So, assuming the conceptual model is sound and the numerical estimation is accurate, there remains only one major point of uncertainty: the data.
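
To make the numerical-approximation point concrete, here is a bare-bones sketch of Euler’s method applied to a toy differential equation; the equation, step sizes, and function names are purely illustrative and not taken from any climate model.

```python
# Euler's method for dy/dt = f(t, y), applied to a toy equation.
import numpy as np

def f(t, y):
    # Toy system: exponential relaxation toward zero, dy/dt = -0.5 * y
    return -0.5 * y

def euler(f, y0, t0, t_end, dt):
    """March the solution forward one small step at a time."""
    ts = np.arange(t0, t_end + dt / 2, dt)
    ys = np.empty_like(ts)
    ys[0] = y0
    for i in range(1, ts.size):
        # y_{n+1} = y_n + dt * f(t_n, y_n)
        ys[i] = ys[i - 1] + dt * f(ts[i - 1], ys[i - 1])
    return ts, ys

# Halving the step size roughly halves the error -- the price of such a simple scheme.
for dt in (0.5, 0.25, 0.125):
    ts, ys = euler(f, y0=1.0, t0=0.0, t_end=4.0, dt=dt)
    exact = np.exp(-0.5 * ts[-1])
    print(f"dt={dt:5.3f}  numerical={ys[-1]:.4f}  exact={exact:.4f}  error={abs(ys[-1] - exact):.4f}")
```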

Models are only as good as the data that go into them, and more complex models require astonishingly large amounts of data simply to be able to execute. To use an example from my own research, a model to predict nitrogen reduction (a single value) required a dozen equations and a few dozen variables; many of those variables are themselves largely unknown! Even assuming the data are in fact known, there is inherent uncertainty in every single measurement (which is what leads to “significant figures” when writing data points); no measuring device is completely precise, and there is always the specter of human error. With very large and complex models, this natural uncertainty can very quickly propagate through the model, so that the final result, even if sound, is still inherently uncertain. Quite simply, there is no way to eliminate that; it is just the nature of empirical measurements. Of course, as data collection methods improve, that uncertainty diminishes, but it will never disappear entirely.
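
Here is a small Monte Carlo sketch of what that propagation looks like: sample each input from its assumed measurement error and watch the spread appear in the output. The “model,” the inputs, and the error sizes are all invented for illustration.

```python
# Monte Carlo propagation of measurement uncertainty through a made-up model.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of random samples

# Pretend each input was measured with some instrument error (mean, standard deviation).
temperature   = rng.normal(20.0, 0.5, n)   # degrees C, +/- 0.5
flow_rate     = rng.normal(3.0, 0.2, n)    # m^3 per second, +/- 0.2
concentration = rng.normal(1.5, 0.15, n)   # mg per liter, +/- 0.15

# An invented model combining the inputs into a single predicted value.
prediction = concentration * flow_rate * (1 + 0.02 * (temperature - 20.0))

print(f"mean prediction: {prediction.mean():.3f}")
print(f"spread due to input uncertainty alone: +/- {prediction.std():.3f}")
```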

How, then, does this relate to uncertainty in climate science? Well, in addition to the natural uncertainty inherent in models, climate models are faced with the daunting prospect of relying on massive amounts of extremely noisy data. Even though temperature graphs show a clear upward trend, there can be as much variation within a year as there is between years, which makes it very difficult to define statistically significant trends; it’s not that the pattern isn’t there, it’s that it ends up buried in noise. There are ways to deal with this, such as the Fourier transform, which can be used to filter noise out of a signal, but unfortunately, when such statistical “tricks” reach the broader public, they feed an untrue notion that scientists are faking data. Furthermore, climate scientists are dealing with data that are continually being revisited and improved; just yesterday a new paper came out showing that the “hiatus” in warming is quite possibly an artifact of bad data in the past. Related to this is the fact that we are still learning about how the climate works in detail; keep in mind that global temperature records from satellites are only a few decades old! There are a number of natural oscillations that exist on those same time scales (such as the North Atlantic decadal oscillation currently swinging into place), so it’s entirely possible that there are still some climate drivers we don’t yet know about. So yes, there is a great deal of uncertainty in climate science, but that’s not to say there is uncertainty about the reality of climate change.
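
As a rough illustration of the kind of “trick” I mean, here is a minimal sketch that uses a Fourier transform as a crude low-pass filter to pull a slow signal out of noise; the data are synthetic, and the cutoff frequency is arbitrary rather than anything used in actual climate analyses.

```python
# Fourier transform as a crude low-pass filter on a synthetic noisy record.
import numpy as np

rng = np.random.default_rng(1)
n = 512
t = np.arange(n)

# Synthetic record: a slow oscillation plus a weak trend, buried in much larger noise.
signal = np.sin(2 * np.pi * t / 128) + 0.002 * t
noisy = signal + rng.normal(0, 1.0, n)

# Transform to the frequency domain, zero out everything above a low-frequency
# cutoff, and transform back.
spectrum = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(n, d=1.0)
spectrum[freqs > 0.02] = 0.0
smoothed = np.fft.irfft(spectrum, n)

# The filtered series tracks the underlying signal far better than the raw one does.
print("RMS error, raw vs. truth:     ", np.sqrt(np.mean((noisy - signal) ** 2)))
print("RMS error, filtered vs. truth:", np.sqrt(np.mean((smoothed - signal) ** 2)))
```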

To close, in order to be intellectually honest I must entertain the possibility that we are wrong about climate change. It’s an extremely remote possibility, but the nature of science keeps us from ever truly eliminating it. I must point out, however, that the degree of certainty about climate change far exceeds the certainty with which we make, say, medical decisions or individual dietary choices. Furthermore, carbon dioxide does more than just warm the planet; a substantial portion of the oceanic ecosystem relies on organisms that produce calcium carbonate shells or skeletons. We know that carbon dioxide dissolves in the ocean and lowers pH, and we’ve already been able to measure ocean acidification. The economic consequences of acidification could be extreme as entire fisheries collapse, and that alone should be sufficient reason to worry about carbon dioxide, regardless of the warming effect. In the end, even if our power to correct climate change is limited, there’s not really a valid excuse for failing to do everything we can when we know the consequences.

Academia: not always about teaching?

What does a professor do? Are they teachers?

If you ask a typical member of the public to describe a stereotypical university professor, I would be willing to bet that the description invariably includes at least some mention of being in a classroom and working with students. If you look at the typical discourse, e.g. in Wisconsin or – especially so – in North Carolina, it seems that the public perceives the professor as only a teacher. After all, why else would the North Carolina legislature have deemed it appropriate to try to legislate a mandatory 4-4 teaching load (that’s 4 courses per semester, if you’re not familiar with the lingo)? Certainly, teaching is a major component of a professor’s life, without question, but the public might be surprised to know that, in fact, research is the professor’s primary role. The sine qua non of an academic life is publications, and with very few exceptions, tenure cannot be obtained without evidence of an active scholarly record. Indeed, I would argue one of the biggest misconceptions permeating the national conversation is that universities (particularly R1 schools like UGA) exist primarily to teach. That’s simply not the case; UGA’s mission statement – like that of many schools – explicitly includes research, education, and service. Many people might not even be aware that, as an R1 institution, research takes precedence here! That, however, is not what I want to write about today; it is no secret that the national conversation egregiously lumps together and oversimplifies the roles of different schools. What I want to talk about is the role of teachers, people like myself who have always wanted first and foremost to be in the classroom, and why we have such difficulty finding a place in academia.

One of the many reasons I am exploring careers outside of the Ivory Tower is that my research, to be blunt, is not sexy and exciting; it’s not the type of ground-breaking research that lands tenure-track jobs. I would argue that it’s very, very important for our future, but the fact is that coastal ecology is a relatively small field and it doesn’t garner the attention (i.e. grant money) that biotech or genetics does. As a professional, however, I personally consider my research second string to my teaching expertise. I have substantially more teaching experience than someone at my stage would normally have (ten semesters compared to the single semester required for my degree), and I have actively pursued formal training in the theory of teaching and learning. This makes me extremely unusual – some PhDs graduate without having ever taught a course (depending on the program requirements) – but that is not necessarily a good thing in the eyes of traditional academia. The fact of the matter is that a PhD is about training scholars, not educators, which is one reason that a number of damaging and misinformed beliefs still permeate the college classroom. Typically, the belief has been that subject expertise was all that was required to make a good teacher, a belief which has been slow – entirely too slow, in my mind – to change towards the one I enshrine in my teaching philosophy:

Teaching is in and of itself a rigorous intellectual act requiring extensive preparation and scholarship separate from the mastery of the subject material.

In my opinion, this is one of the biggest failings of the way we train and recruit faculty members. The many excellent teachers out there exist in spite of, not necessarily because of, the graduate school system. This is, thankfully, changing: UGA requires its PhD students to take at least a one-semester course on teaching, usually concurrent with at least one semester of actual teaching as well. Attitudes, however, are much more stubborn, and even though winds of change are blowing, academia is notorious for being slow to shift. Even as we grapple with the increasing importance of education, research still remains at the center of a traditional, tenure-track career. At the same time, the ever-increasing teaching load is “farmed out” to a continually growing pool of oft-exploited adjunct instructors. What this means is that people like me, who consider themselves educators first and take the time to dive into and master the scholarship of teaching and learning, find it very difficult to find a place in traditional academia, particularly one where we can make a reasonable living. But before I let it all seem too gloomy, there are bright spots: more and more schools are realizing that teaching and subject expertise are distinct masteries, and are creating positions specifically for those with teaching expertise. Overall, this is a great thing, and hopefully it will continue, but sadly, these types of positions are still few and far between. What we really need is a shift that puts education on the same pedestal of respect as research.

Changing attitudes is, to put it mildly, not easy, and changing the policies that express those attitudes is even harder. One of the best ways to make a place in academia for teaching specialists would be to invite them onto the tenure track. Some might argue that teachers have no need for tenure, because tenure’s main role is to preserve academic freedom for professors who might engage in controversial but crucial research. Teachers, excepting perhaps those teaching the most controversial courses (which would likely be the domain of tenured professors anyway), don’t need those same sorts of protections, the argument goes. I think this is woefully wrong. Denying those protections to teachers potentially puts them at the mercy of feedback from students who may vent frustration on course evals, and I’ve written before about why that might be problematic. Let’s face it: some of the best teaching demands a lot of work and intellectual discomfort from students, and having tenure-like protections would encourage educators to engage students in those meaningful conversations without fear of losing a job over being “too tough.” Of course, giving tenure on the basis of a teaching record also has the added benefit of increasing a teacher’s investment in his/her school and students, to the benefit of everyone involved in the process. I will grant that actually measuring teaching effectiveness could prove substantially more challenging than measuring scholarship, but I would argue that the potential benefits far outweigh the challenges.

To close, let me point out that we already have a whole class of schools that tackled the teaching/scholarship divide long ago: the small liberal arts school. Tenure at these schools still involves research, and I think we should push further for a career path focused entirely on the scholarship of teaching and learning. But these schools realized long ago that the divide between scholarship and education was at best flimsy, and I think that’s just one more thing larger state schools can learn from their smaller brethren as they face mounting pressure to refocus on education.