Tuesday, November 24, 2015
Thanks to Brookdale for such a warm welcome. I didn’t realize how much I missed New Jersey until I came back to it.
Thanks to InsideHigherEd for so much support over all these years. It’s wonderful to have a platform where voices from beyond the Ivies can have their say.
Thanks to the Twitterverse for helping me keep up with sources and stories I otherwise couldn’t. And the people! Tressie McMillan Cottom, Paul LeBlanc, everyone at the CCRC...
Thanks to my occasional co-author, Susanna Williams, for making the trek to New Jersey on the rainiest day in recent memory. Thanks to High School Friend on Wrong Ocean for doing the same, though without the rain.
Thanks to The Girl, for bringing class, wisdom, and humor to every single day. Watching you grow up is one of my great joys in life.
Thanks to The Boy, for showing me how adolescence was supposed to be done. Even with the move, you handle it with a grace and poise far beyond anything I could do at your age. I’m in awe.
Thanks to The Wife, for holding it all together. For slowly turning a house that isn’t “us” at all into one that is. For being an amazing parent. And for still getting people to ask me incredulously “how did _you_ get _her_?” I don’t even mind the implied insult...
Thanks to my wise and worldly readers. You are the point of the blog, and I learn so much from you.
Happy Thanksgiving, everyone.
Monday, November 23, 2015
Once More, with Feeling
How many times should a student be allowed to re-take a class she hasn’t passed? In my context, that usually means either failing or withdrawing; we don’t do many incompletes.
Some colleges have policies on retakes and some don’t, but I don’t know that I’ve ever seen a fleshed-out argument for any given position. I’ll offer a few gestures towards one, and then ask my wise and worldly readers to help me fill it out and reach a considered answer.
First, I think it’s obvious that any given number (assuming there is one) should have some sort of “exceptions” clause. Weird stuff happens. When the stuff is both documented and sufficiently extraordinary, I don’t see much point in being overly strict. (Example: one semester, the professor falls ill six weeks into the class, and the college can’t find a replacement in time. I don’t see why a student should be punished for that.) That said, a policy of “infinite retakes” strikes me as hard to defend, given limited resources. So assuming that infinity is off the table, and there’s some sort of safety valve for extraordinary cases, what would be a reasonable limit?
“One and done” strikes me as unnecessarily cruel. People make mistakes. If a withdrawal from a class means you can never take the class again, and that class is a prerequisite for other things and/or a degree requirement, then the student is basically done. Say that a student has a medical issue or family emergency during Freshman Comp. Freshman Comp is required in every degree program at the college. Bar her from retaking the class, and you’ve basically expelled her. Given that anybody can get sick at any time, that just seems unreasonable.
Two or three attempts both seem reasonable at first glance. Either allows for a stray awful semester without banning the student from progressing. But they both put a cap on throwing good money or effort after bad. If you haven’t passed a class in three attempts, I’m thinking maybe that isn’t the class for you. Getting blocked from self-registration for a fourth attempt may prompt a visit to an advisor, who might suggest other pathways. Everybody is good at something, but nobody is good at everything; disappointment can be part of the process of narrowing down the fields that work for you. If that weren’t true, I would have spent much of the ’90s and ’00s playing major league baseball. The only thing that stopped me was a catastrophic lack of talent.
I’ve asked the Institutional Research offices at my last couple of colleges to run numbers on success rates for successive attempts at courses. In both cases, success rates dropped from the first attempt to the second, the second to the third, and so on. The drop was the largest from the first to the second. I haven’t seen national figures on that, but I’d guess they’re consistent. By the time you get to fourth or fifth attempts, it starts to look less like compassion and more like false hope, or tuition theft.
The change in Pell limits in 2012 added urgency to the issue. Prior to 2012, students had a lifetime limit of 18 semesters of Pell. (That had to cover both undergrad and grad degrees.) In 2012, the limit was shortened to 12 semesters. If a student spends four or five semesters on a single class, the odds of her finishing a degree before running out of money drop dramatically. And even if she finishes the associate’s, the odds of having enough money to complete a bachelor’s are vanishingly low. If a student has a semester or two of ESL and/or developmental classes, and stops out once or twice, that 12-semester limit can come up fast. I hate to base academic policy on financial aid, but I also find it irresponsible to enable behavior that would defeat the possibility of completing a degree.
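For readers who like to see the arithmetic worked out, here is a minimal back-of-the-envelope sketch of how quickly that 12-semester clock runs down. The numbers are illustrative assumptions only; the real rule is tracked as a percentage of full-time awards, and this sketch simplifies by treating every semester, developmental or repeated, as consuming one full semester of eligibility.

```python
# Back-of-the-envelope sketch of the post-2012 Pell clock.
# Illustrative assumptions only: the actual limit is tracked as a
# percentage of full-time awards; here every semester counts as one
# full unit against the lifetime cap.

PELL_LIMIT = 12  # lifetime limit in semesters after the 2012 change

def semesters_left(developmental=0, repeated=0):
    """Pell-eligible semesters remaining for new degree credit,
    after spending some on developmental courses and course repeats."""
    used = developmental + repeated
    return max(PELL_LIMIT - used, 0)

# Two semesters of developmental work plus two extra attempts at a
# single course leaves 8 semesters: enough for an associate's, but
# little margin for finishing a bachelor's afterward.
print(semesters_left(developmental=2, repeated=2))  # prints 8
```

Even under these generous assumptions, a student who burns four semesters before earning much degree credit has almost no room left for the four-year leg of the journey.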
So I turn to my wise and worldly readers. Is there another angle on this question that would shed useful light? Is there a better answer?
Sunday, November 22, 2015
Seasonal Retail and Final Exams
The next several weeks are some of the most stressful ones of the year for lower-income students.
That’s because two demands on their time are increasing simultaneously: seasonally increased hours at many retail and customer service jobs, and the late-semester ramp-up to final projects, papers, and exams.
Although I didn’t fully process it at the time, I had the remarkable privilege as a student to be able to restrict my paid hours during the semester to six to eight hours a week of work-study. (Summers were another story.) That level of work was enough to keep me honest, but not enough really to interfere with classes. And on the rare occasions when it did, I could ask for dispensation from a shift and nearly always get it. At worst, I might have to arrange a trade. It really wasn’t an issue.
For many community college students, though, this is a cruel time of year. They need the hours at work to support themselves, and they need study time. A few jobs allow students to learn while they earn -- I remember the guy at a Christmas tree lot poring over his biology textbook between customers -- but most don’t. Worse, many of them require variable hours, making it difficult for students to develop the economies of effort that come from routine. When your transportation or childcare arrangements are precarious at best, routine is your friend; the constant disruptions to routine that low-wage jobs entail can do real damage.
In some ways, the choices that students face are between short-term and long-term goods. Money right now satisfies a short-term need, but too much time on the job can jeopardize the degree, which is the long-term good. Every so often I’ll hear people who should know better pontificate about “delayed gratification” and “self-discipline.” It grinds my gears when I hear it, because for many students, the short-term needs are painfully real. If you don’t get your car repaired, you can’t get to campus anyway. If you can’t make rent, you have much bigger issues than a biology exam. Some students skip meals to save money; I doubt that anyone is at her academic best when hungry. Reducing the issues to personal failings of self-discipline -- which have always existed, and are not unique to people with low income -- doesn’t account for just how close to the edge some students are.
Which is to say, I wish students weren’t captive to short-term retail, but sometimes they are. This is when faculty and staff can make a real difference with nothing more than a stray thoughtful comment, or a show of real human concern.
In my teaching days, I would sometimes structure classes to allow students to work harder at some points so they could ease up a bit at others. The old “give four assignments and count three” method gave them a freebie to spend as they saw fit. I would explain it at the beginning of the semester, and note that it gave them options: if they blazed through the first three, they could skip the last one. Alternately, if life happened at some earlier point in the semester, they had a chance to recover without having to explain or justify themselves to me. They could maintain their dignity as they got back on track. It worked pretty well.
At this point, classes are what they are. But that crucial human connection still matters. I was repeatedly struck by how many students were struggling economically, but how few of them knew that it was normal. There’s a lot of shame and secrecy for many. They’ve imbibed the sort of secular Calvinism that says that material poverty is a sign of moral failing, so they’ll engage in counterproductive struggles to keep up appearances until they just can’t.
Just hearing from a trusted person that they aren’t hopeless or tainted because they’re struggling can help.
Personal respect isn’t a substitute for a fairer political economy, of course. But in the meantime, we could do worse than to acknowledge the strain. Some students are working hours that put them in danger. Letting them know that someone cares doesn’t fix the problem, but sometimes a little respect at the right moment can make it bearable. And learning that the issues aren’t just personal may help people connect the dots, and start to work on the politics of it. That’s my annual holiday wish...
Thursday, November 19, 2015
The Shot Not Taken
This one will be short and vague, by necessity.
This week I witnessed a devastatingly effective “shot not taken,” a sort of negative-space insult that worked by implied contrast. I gasped at its elegance. It’s a dying art, but it’s not entirely gone yet, and I’ll miss it when it breathes its last.
I’m sure there’s a word for it -- wise and worldly readers, help!! -- but we’ve all seen it. It’s Mozart saying to Salieri, “only you.” Or Alan King listing his favorite actors: “Sir Laurence Olivier. Sir Kenneth Branagh. And Drew Carey.”
It’s more impressive in relatively spontaneous contexts. It can be the silent beat, followed by a conspicuous change of subject, after a stupid comment. Or the compliment that implies its own shadow, often in the space of a pause: “he’s very...decisive....” “She certainly has a...presence.”
It’s not exactly sarcasm; it’s more surgical than that. Done well, it leaves the attacker looking clean and clever, and leaves the attacked almost nowhere to go. It’s the verbal equivalent of the perfect knuckleball: off-speed and easy to get wrong, but when done right, almost beautiful in its evasive effectiveness.
The undisputed master, of course, was P. G. Wodehouse. Politically, he was somewhere between naive and offensive, but in verbal precision, nobody came close. “Even at normal times Aunt Dahlia’s map tended a little towards the crushed strawberry. But never had I seen it take on such a pronounced richness as now. She looked like a tomato struggling for self-expression.” “He resembled a minor prophet who had been hit behind the ear with a stuffed eel-skin.” “Unseen, in the background, fate was quietly slipping the lead into the boxing-glove.”
Wodehousian humor demands close attention, which may be why it’s fading. It requires both precision and subtlety, and it asks of its reader or listener the ability to appreciate each. It’s easy to miss.
All of which is to say, I tip my cap. It was so elegant that all I can do is applaud.
Wise and worldly readers, is there a simple term for that? And do you have a favorite application?
Wednesday, November 18, 2015
The Prereq Temptation
(“The Prereq Temptation” was a rejected John le Carré title. A lot of people don’t know that…)
In my imagined, more perfect world, there would be exactly one reason for a prerequisite to attach to a class: the students would need to know material from the prior class to be successful in the second one. For example, a student wandering into a calculus class who had never taken algebra or trig could be expected to be lost. (“Why are you doing math with letters?”) In sequences of courses that build on each other, the folks who teach the later courses should have some reasonable assurance that they don’t have to go all the way back to square one.
But in this world, that’s not the only reason prerequisites get put on classes. For example:
Felt prestige. At a previous college, I once had a department admit in a program review that the addition of a prereq to its intro class made no difference in student success; to its credit, it even included the numbers to prove it. But it argued for keeping the prereq anyway, as a “statement” about expectations.
Self-defense in an arms race. If every other class that fulfills a distribution requirement has an English 101 prereq, and yours doesn’t, then you will get more than your share of the less prepared students, simply by default. After a while, even some folks who generally object to prereq proliferation will yield just to level the playing field.
Transfer requirements. Certain large public institutions -- not naming any names here, but they know who they are -- won’t accept certain courses unless those courses carry specific prereqs. They take the presence of prereqs as a sign of rigor. Even if we could show locally that the prereqs achieved nothing except to delay students, we’d still have to keep them.
Gaming graduation numbers. If every credit-bearing course requires that a student has cleared the developmental level, and developmental courses don’t add up to 12 credits, then the college can de facto exclude all developmental students from its “first-time, full-time” graduation rate. It’s unethical, but it happens.
Leaving aside the more sinister and self-serving reasons, people often argue for prereqs out of a sincere, if unproven, belief that they’ll set students up for success. The argument could be tested empirically, but almost never is. It should be.
Individual prereqs can make sense, but when they proliferate -- as they tend to do -- they make timely completion of a degree much harder. A student who has to wait for a prereq class to fit her schedule may add a semester or a year to her time-to-completion, just because she’s following an unproven rule passed through a combination of ego and wishful thinking.
To my mind, the burden of proof should be on prereqs. In the relatively rare cases in which the relevance is obvious and well-demonstrated, keep them. But subject all of the existing ones -- not just new ones -- to actual empirical tests. If our four-year counterparts would do the same -- hint, hint -- we could drop the prereqs that are only there to appease them.
The prereq temptation is subtle and pervasive, but it does real harm. If we could get that long list of reasons down to a single one -- where it actually helps -- students would benefit tremendously.
Tuesday, November 17, 2015
Guided Pathways for Transfer
Last year I caught a presentation by some folks from the Maricopa County Community College District (Phoenix, AZ) on a transfer partnership they had developed with Arizona State University. With nearly every transfer student aiming at the same destination college, it was relatively easy to design curricula for the first two years. (That’s not to demean the effort involved; it’s just to say the target was clear.)
I was jealous. That kind of clarity wasn’t possible at Holyoke, and it isn’t possible at Brookdale. That’s because in both places, students had many more options for destination colleges, and the destination colleges didn’t agree with each other. When the destination schools disagree on what should go into the first two years, who are we supposed to imitate?
Loss of credits upon transfer is a massive barrier to degree completion, for obvious reasons. It forces students to spend time and money retaking classes they’ve already taken. It’s also incredibly demoralizing. Students feel ripped off, either by the destination college or the sending one, and feeling like a chump doesn’t often inspire greatness.
When the loss of credits in transfer occurs between public institutions -- say, a community college and a state university -- taxpayers wind up paying twice for the same courses. It’s a huge issue for students, but not only for students. Everyone winds up paying for it, except maybe for the destination school.
Yet in the public discussion of transfer -- to the extent that it exists -- most of the blame for “wasted” credits is aimed at community colleges.
To some degree, any region with a healthy population of private colleges and universities will have to deal with this no matter what. Private institutions can define curricula pretty much as they see fit, within the confines of state licensing regs and accreditation criteria. But it seems like the public institutions should be more subject to public authority.
This week I was in the umpteenth meeting in which discussions of possible changes to curricula led to the inevitable “but will it transfer?”, answered with the inevitable “well, yes and no.” “Yes and no” is not a satisfying answer.
Put differently, four-year publics need to be required to create “guided pathways” for community college transfer. That means keeping their own curricula in relative check, and vetoing departmental efforts to go rogue and construct idiosyncratic requirements to avoid “giving away” credits (or, in what amounts to the same thing, relegating them to “free elective” status). The easiest way, I think, would be to cap at sixty the number of credits that a destination school could require of someone with an associate’s degree. Let the four-year college departments fight out which sixty credits they can require; I have no dog in that fight. But allowing the people with the greatest conflict of interest nearly unfettered discretion to double-dip is not okay. It’s a disservice to the students and the taxpayers.
Have any states actually tried that? I’d be interested in hearing about unintended consequences and/or workarounds. Alternately, for folks at public four-years, can you foresee unintended consequences and/or workarounds?
Monday, November 16, 2015
The Girl is preparing for her debate tournament next Saturday. She gets the topics ahead of time, but she doesn't find out which side she's supposed to argue until 15 minutes before the match. That means she has to prepare both sides of the argument, and be ready to argue either way.
One of the topics this week is whether GMO foods do more harm than good. Last night, as she did her research, she got frustrated and asked me for help. As she put it, "all of the anti-GMO stuff is really just anti-science. I can't use that!"
That's my girl.
She's content to argue either side of something that's, well, arguable. But to her, the legitimacy of science is not arguable; it's a ground rule. If she can find a science-friendly argument against GMOs, she'll happily use it; depending on which side she gets, she may need it. But she's not betraying science.
I've been thinking a lot about arguments and ground rules recently. There's no shortage of topics.
The recent set of conflicts on campuses around racial climates strikes me as rooted in different understandings of ground rules. Some people believe that protests are inherently dangerous, scary, or objectionable, regardless of their content or motivation. (In my observation, that understanding is usually buttressed by a really selective reading of history.) The more serious conflicts occur between those who take unfettered speech as a ground rule, and those who take mutual respect as a ground rule. The former camp is theoretically straightforward, even if frequently inconsistent in application. (We have closed meetings on a public campus all the time, and nobody says boo about that...) The latter camp argues that procedural equality often serves to entrench other kinds of inequality, and it asks for a focus on substantive equality before getting hung up on procedure. That camp tends to be less theoretically straightforward, even as it makes sense on the ground. As a political theorist, I see lineage to both arguments. The first group is correct that arguments like Marcuse's "repressive tolerance" can quickly lead to terrible and scary places. And the second group is correct that a focus on process outside of context can lead to absurd outcomes, as in Anatole France's line that the law, in its majestic equality, forbids both the rich and the poor from sleeping under bridges.
More basically, the attacks on Paris by religious fundamentalists are rooted in an even deeper conflict over ground rules. Do you base your sense of rules and ethics on the Enlightenment, or on a desire for a specific kind of caliphate? As near as I can tell, the two are mutually exclusive.
In higher education, we've seen an increasing intensity to the conflict between two sets of ground rules. One side assumes that the point of education is to perpetuate the society that exists. The other side assumes that the point of education is to empower people to remake the world. In practice, they aren't always in conflict; the former group doesn't mind technical innovation, and the latter tends to value certain ethical precepts with hundreds of years of lineage behind them. But sometimes the two camps clash. Custodians of traditions that they believe are under attack by modernity are quick to take umbrage at what they consider irreverence. Partisans of critical thinking tend to assume that a certain level of irreverence is healthy, and that too much deference is a sign of intellectual laziness. Each holds the other in mild disdain, and each is quick to take offense when the other shows that disdain.
The tricky thing about ground rules is that most of the time, people don't have carefully argued reasons for holding them. They take ground rules as given, and are often genuinely surprised when others don't see their validity. That abrupt shock can lead to fear, and even to anger. To me -- and to The Girl -- tested, empirical evidence is self-evidently a valid standard. That's the point at which our skepticism stops. To people who hold other beliefs, that may look "dogmatic." In a sense, it is. But everybody is dogmatic; they just hold different dogma.
Dogmas are clashing a lot these days. My optimism -- which may be naive -- is based on a hunch that at some level, enough of us share enough bedrock dogma that we can find ways to live and work together. We just have to allow ourselves to be shocked, and then to get over that shock and engage intelligently and respectfully. We can choose to get some distance on our own assumptions, and to self-consciously bring ethical reflection to the process of improving them. We can make the choice to change some of our assumptions. It's a challenge for an eleven-year-old; it's a challenge for a forty-seven-year-old. But it's the right thing to do. That's where my skepticism stops.
Sunday, November 15, 2015
Advising, Decades Later
This weekend we hosted some friends from Massachusetts. It was a glorious time: the kids picked up right where they left off, the weather cooperated, and the parents got along great. Other than a nasty sore throat on Sunday, I couldn’t have asked for it to go better.
The friends are college graduates, but not professional academics. I mention that because at one point the grownups’ conversation turned to memories of college. (The kids’ conversations were largely about Minecraft.) All four of us had tales to tell of favorite classes and least-favorite classes, of professors we liked and those we didn’t.
I was struck, though, by how quickly the others’ tales turned to advising. In each case, the tale was the same: advisors offered too little, too late, to be helpful. The student in each case wanted clear direction, but was instead presented with a long menu of options and given little to no help in navigating it. My own story was a little different: my advisor was a physics professor, whose entire advice consisted of “you should take more physics.” I thanked him for his input and returned to navigating the catalog myself.
Although each of the four of us came to college from a different angle, and all four graduated, we shared a sense of disappointment in advising. In our different ways, we each had hoped for some useful information, and none of us got it. We got through without it, but decades later, we could each recall wanting more.
The common denominator to each story was remarkably consistent with what we’ve been hearing about best practices in advising. Everyone wanted information that was useful, rather than either comprehensive (the long list) or self-serving (take my course!). We wanted to know what we needed to take before making choices, and we probably would have benefited from return visits over time.
I’ve heard those things for years from folks who study student success, but it’s somehow different hearing the same things from friends who wouldn’t know the Lumina Foundation if it bit them. They haven’t been following the student success debates, and I didn’t bring up the subject. Their commentary was spontaneous, specific, and unanimous.
It’s easy to object to relatively directive advising as handholding, or as reductive. And it can be. But some level of clarity is actually empowering. It levels the playing field with students who have enough cultural capital that they can navigate a bureaucracy on their own. And it prevents students from taking courses they won’t need, or that don’t make sense for what they’re trying to do. (One friend mentioned that she unknowingly signed up for the pre-med version of Intro to Biology, even though she was a journalism major. It made life much more difficult than it needed to be.) That’s not at all the same thing as lowering standards. It’s applying standards to the right things.
Knowing the goal, and knowing how to achieve the goal, are two different things. With limited budgets, most community colleges can’t just hire their way out of the problem. Getting to a better system will take some redeployment of existing resources, some rethinking of workflows and protocols, and probably some technology. It’s not the sort of thing that can be done overnight.
But it’s heartening to hear, from disinterested parties, that we’re getting the question right. Advising matters, and it matters the most to the students who have the least. I hope that twenty years from now, today’s students will all be graduates, too.
Thursday, November 12, 2015
Ask the Administrator: How to Stay Current with a 15 Credit Load?
A new correspondent writes:
I have been pondering the role of research in higher education and what role it plays in opportunity for students.
In graduate school, we train people in research. When hiring to teach higher ed, we (generally) expect people to have PhDs or a similar terminal degree in a discipline because then they should ostensibly be somewhat knowledgeable about not only the history of the discipline, but the current trends, and how new knowledge in that field is constructed (I realize that there are fields that are exceptions to this). And when we take them out of the field of study, that is take them away from research, do we not take them away from this endeavor? That is to say, are we not doing the students that they train a disservice to not have faculty remain current in research? Part of my joy is sharing my research (or heck, other people's research as I come across interesting papers) with my students, even my non-majors. How does this translate to a community college professor teaching 5 classes, multiple preps, not doing research, and (I assume? Perhaps I am incorrect?) largely unable to keep up with current literature? What about students who want the chance to actually try the field on (i.e., see what research in a given discipline really is)? Does it matter that early?
I feel like knowledge keeps progressing, techniques change, and I wonder how CC profs are able to keep up, because I really don't know. Sure, Socrates and the first law of thermodynamics aren't going to change, but what we have learned in the past few years has. I suppose the question could equally apply to the increasing reliance on adjuncts at other places as well as lecturers who have no research program; I understand the cost tradeoffs here (and that some people even prefer this path), but my same question applies. I am curious what your wise and worldly readers have to say about this, as well.
Please understand, my goal here is not to make judgements but simply to ask how research informs teaching and whether it factors in at the community college level (and to educate myself). I also understand that at research-intensive schools, there are plenty of faculty who do research yet don't give a rat's behind about integrating it in the classroom...or really the classroom at all. For what it's worth, I am largely in the SLAC world, and in the sciences, which both color my views.
It’s a fair question. For a long time, many community colleges were wary of hiring Ph.D.’s to the faculty for fear of them either leaving quickly for another job -- it was a different time -- or growing embittered as they lost the ability to do the research by which they had intended to define their careers. The market has changed, at least in the parts of the country in which I’ve worked, so now it’s entirely normal to hire Ph.D.’s to faculty roles, but the question of research currency remains.
Yes, teaching fifteen credits per semester will put a serious dent in most people’s research productivity. That’s not to say that people don’t write books, but it’s not the norm. That’s one reason why the strategy of “I’ll use a community college gig as a stepping stone to the research university job I really want” rarely works. Very few people can write enough while teaching five classes to be competitive with people who barely teach at all.
In my observation, the folks who are able to be happiest here are the ones who stop mourning what the place isn’t, and instead appreciate it for what it is. Given the lack of a “publish or perish” tenure process, it’s possible to read widely and based entirely on interest. You can’t keep up with everything, but you can pick a few favorites and keep up with them. Your interests can shift, which some of us find liberatory. (Although my training is in political science, I’ve found that left to my own devices, I’m much more likely to read sociologists.)
On the upside, there’s no better college teaching laboratory than a community college. This is where the scholarship of teaching and learning finds a natural home. If you take “how do I best help students understand x?” as an applied research question, you can achieve things here that your counterparts elsewhere can’t. The best-received presentation I did at APSA was about teaching Intro to American Government to students who were never going to major in it. I don’t think it’s a coincidence that the Accelerated Learning Program -- a genuine breakthrough in the teaching of entry-level writing -- occurred at the Community College of Baltimore County, as opposed to, say, the University of Maryland.
One of my prouder achievements at HCC was the development of the “sandbox,” in which faculty were given one-on-one support in experimenting with various kinds of instructional technology. Brookdale has an entire Innovation Center for the same purpose. Colleges that recognize faculty as inquisitive and inventive people, and that grasp the potential of a teaching-intensive setting, can be hotspots of pedagogical innovation. That may not be the kind of research many Ph.D.’s had in mind when they started, but it’s real, it’s valuable, and it’s a hell of a lot of fun.
Admittedly, I’m not entirely objective on this one. Wise and worldly readers, especially those at teaching-intensive places, how do you keep current in your fields?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Wednesday, November 11, 2015
A Different Vision of the Bachelor’s Degree
Have you ever re-watched a movie, or re-read a book, years after the first time, only to realize that you see it entirely differently with some more life under your belt? Sometimes it wears well, revealing new layers to experienced eyes. Sometimes it just brings home how much difference experience makes, as what once seemed profound has come to seem ridiculous. Either way, though, it’s humbling; judgments that seemed self-evidently correct at an early age just seem embarrassing later.
Yet that’s not how we structure higher education. We build degrees that move from the broad and general at the beginning -- the theory, the survey -- to the specific at the end. That structure pretty much guarantees that initial encounters with large and sweeping theories will be shallow at best, since they lack both context and a sense of why they matter. By the time students get to specifics, they’ve left the big questions behind. If they return to the big questions later, it’s despite, rather than because of, the way we’ve organized degrees. They rarely get the benefit of coming back to the big questions with the benefit of context, and that’s our failure.
That’s why I’m so excited about the new report by Mary Alice McCarthy, Flipping the Paradigm. It builds on the insight that many of us quietly know, but rarely use. Starting with theories and working down to practice doesn’t fit how most students learn. And it creates issues with transfer of credit that don’t need to exist.
Briefly -- and McCarthy’s report is well worth reading in its entirety -- she calls for “flipping” the bachelor’s degree. Instead of starting with broad “gen ed” classes and working towards narrower applications, we should start with applications and work upwards towards theories. Build theories on actual context, at a point when students have some sense of why they matter. Theories may be deductive, but learning is inductive. Teach for learning.
On a practical level, getting away from the model of front-loading gen eds would also allow community colleges to drop the distinction between “applied” and “transfer” degrees. Right now, a student who gets an A.A.S. (associate’s of applied science) degree at a community college and later decides she wants a four-year degree often loses tremendous numbers of credits upon transfer. A student who gets an A.A. intending to transfer, and then doesn’t, walks away with a credential of less market value than she probably expected. If the gen eds were backloaded, or at least more evenly distributed among the four years, it would be easier for someone with a two-year degree in an applied, employable field to go on for a bachelor’s.
Some versions of that exist now. The transition from RN to BSN is largely theory, with the clinical and applied piece handled at the associate’s level. Some Hospitality degrees have the applied culinary part in the first two years, devoting most of the latter two to general business courses.
I had the uncommon experience of seeing “applied bachelor’s” programs develop organically when I was at DeVry. While I was there, the New Jersey campus gradually transitioned from offering only two-year degrees to offering both two- and four-year degrees. That meant building four-year degrees on top of the two-year degrees that already existed. But the two-year grads had to be employable in their own right (rights?), so the application courses were front-loaded and the gen eds back-loaded. Students would get the hands-on part of, say, telecom in the first two years, and would spend the last two working largely on soft skills through gen ed classes. The idea was to teach the entry-level skills upfront, and to apply the “finishing school” sheen later on.
DeVry had other issues, but the backloaded gen ed structure had its virtues. By the time the students got to the more theoretical classes, they had a bit more time in school under their belts, so they weren’t as easily overwhelmed. The end was in sight, so they tended to stick around. And the focus on ‘soft skills’ was, uh -- how to say this delicately -- worth the effort.
Right now, students have to decide upfront whether they want employability or transferability, and they have to jump through unwanted classes before they get to what they actually want. We know that people don’t learn that way, and the “exploration” that the first two years are supposed to allow is largely defeated by distribution requirements. The system works reasonably well for students who get it right the first time and never deviate from the plan, but that will never be everybody. Lives change, people discover callings, markets shift. Why not move to a structure that allows students to jump more quickly into what they actually want, to stop out and come back more easily if they have to, and to transfer without losing credits?
Flipping the bachelor’s won’t solve everything, but it’s one of the smarter ideas I’ve seen in a while, and McCarthy gets the details right. Well done.
Tuesday, November 10, 2015
Tips for Faculty Job Seekers at Community Colleges
It’s hardly news that the job market for prospective full-time faculty is brutal. Yet even in this climate, searches sometimes fail. In hopes of reducing frustration on both sides, and of helping colleges make more, and more successful, hires, I’ve prepared a non-comprehensive list of tips for candidates.
The faculty search season has started. Typically, institutional timelines roughly follow lines of prestige: the elites go first, then the less elite, followed by everyone else. Part of that has to do with custom, I expect, but much of it has to do with money. When you don't have to worry about getting the funding approved, or about what the state or county will do to your appropriation next year, it's easier to plan with certainty. Colleges with thinner margins and more exposure to the political winds often postpone searches until late in the cycle, because they don't have a usefully clear budget picture until then.
Rob Jenkins has done some good columns at the Chronicle for people seeking faculty positions at community colleges. I'll add a few things, based on what I've seen over the last decade-plus.
A community college gig may not have been what you had in mind when you went to graduate school. That's okay; life happens. But if you convey, either in person or in your materials, that you're "settling for" a cc, you'll be dead in the water. Yes, community colleges may be largely absent from most discussions of "the university" that occur in graduate programs, but they account for almost half of the undergraduates in the United States. That's a larger portion than every flagship university in the country, combined. Treating the sector as an afterthought is both offensive and badly distorted.
And those of us who work here work hard, and take our work seriously. The work may look different from the work at a research university, but anybody who has taught a fifteen-credit load to students of uneven levels of academic preparation can tell you that it's not for the faint of heart. Doing this work well is worthwhile, but really hard. If it's not for you, there's no shame in that, but don't apply. These jobs are not stepping stones -- the teaching load is too high for that -- nor are they jokes.
That said, what can you do to improve your chances?
First, if you haven't had exposure to the community college world as either an adjunct or a student, pick up a class. Get a firsthand sense of the reality of the place. T.A.'ing at an R1 is simply not the same experience, and if you try to suggest that it is, you'll convey that you have no idea what you're talking about. I know that any administrator recommending adjuncting is likely to get blasted with righteous internet rage, but candidates with community college experience consistently beat candidates without. And there are valid reasons for that.
Second, tailor your materials to a teaching-intensive place. That means moving the discussion of your dissertation and/or research to the end of the letter, to the extent that it comes up at all. Lead with teaching. This should be basic, but every year I see candidates violate this one. It's a huge red flag.
Third, get some online teaching experience. As with community college experience, online experience is a valuable box to be able to check. If you're still in graduate school, and your program offers some sort of certificate in online teaching, get it. If you're already adjuncting or "visiting," pick up an online class or two. Over the last five years, on-campus enrollments have dropped at community colleges across the country, but online enrollments have grown. It's the one consistent growth area. Given that many faculty were hired well before online teaching became an expectation, new hires are often expected to be willing and able to step in. To the extent that you can address knowledgeably what's involved in teaching well online, you will improve your chances.
Fourth, get some familiarity with "outcomes assessment." I know it's not a popular topic among many faculty, but it's here to stay. To the extent that you can speak fluently in that language, you will outshine the others in the final round. I've seen it happen.
Fifth, any experience or familiarity with "universal design for learning" and innovative ways of improving accessibility for students with disabilities can only help. At both my current campus and my previous one, more than ten percent of the student body is registered with the campus disability services office, and that counts only the students who self-report. Yes, it's fine that you comply with approved requests for extra time on tests, but what have you done proactively to create a more inclusive environment? Have you changed your handouts or presentations? Have you adopted or adapted class activities to be more inclusive? I once saw a faculty candidate brush off accommodation requests as "whining" -- and there went his frontrunner status.
Sixth, remember that your teaching demo isn't a research presentation. It's not about impressing the faculty with how erudite you are. It's about showing how effectively you can engage students. Lecture at your own peril.
Seventh, do NOT come in with an attitude of entitlement. You may honestly believe that it's obvious that the job should go to you, and that any fair-minded person would have to agree. And maybe it is. But if it is, the attitude can overshadow your merits.
Finally, assume that the folks on the committee can read. That means proofreading, obviously, but it also means addressing us as professionals. I've seen multiple -- multiple! -- cover letters start with "My name is..." Don't. Just don't. We'll figure it out from the signature line and the resume. And you may not believe this one, but I swear it's true: I once got a cover letter and resume on pink paper. (No, I'm not channeling "Legally Blonde." It actually happened.) Don't do it. It's not cute or endearing. It's insulting.
It's a brutal market out there, even for folks who do everything right. At least being prepared and avoiding some self-inflicted wounds can help.
Monday, November 09, 2015
What Seems Obvious at First Glance Is, In Fact, Still Obvious When You Look Closely
A new study found that more prestigious colleges and universities are no better at teaching than less prestigious ones.
To which this Williams College grad who works at a community college says, “Duh.”
Yes, the study had necessary limits. It was based on single-day observations of hundreds of classes. We’ve all had good days and bad, though I would expect those variations to average out across that many observations. It was based on a five-point rubric, on which the elites won on “cognitive complexity of the course work,” and the non-elites won on what amounts to engaging students with the material. Overall, the differences were a wash; as the study puts it in the title, “prestige is a mirage.”
Yes and no. In the classroom, mostly yes.
Prestigious places tend to be selective, which is to say, they tend only to let in students who have shown the ability to be very successful in traditional high school settings. These are the students who take lots of Honors classes, run clubs, volunteer, and get high grades. Students like that are good at school; if they weren’t, they wouldn’t get in. That frees up faculty to spend relatively little time worrying about pedagogy; they can just present the material and trust that most of the students will get it. At Rutgers, for example, many undergraduate classes were so large that there wasn’t much choice but to lecture. Lectures can be “cognitively complex” at a very high level.
When I taught at DeVry, though, I had to unlearn the teaching methods I had picked up at Rutgers, and pick up a whole new set. These students generally weren’t good at school, or if they were, they didn’t know it. They didn’t need to have certainties deconstructed; they needed to feel like there was a point in even trying. I learned quickly, and the hard way, the difference between “what I say” and “what they hear.” I had to shift focus. Instead of showing unexpected nuance and depth to a seemingly simple issue (in the ’90s, we called that “problematizing”), I had to bring clarity to what was otherwise a frustrating fog. That’s a different task, requiring different methods. Lecture had to be cut into small pieces, interspersed among as many applications as possible.
After a couple of years of teaching at DeVry, I was a far better teacher than I had ever been at Rutgers. At a basic level, it mattered more. The top students there -- and there were some -- got some pretty terrific classes, if I do say so myself, and I wasn’t the best teacher there.
At research universities, faculty are hired and promoted based on research. I had professors in grad school tell me openly and without shame to minimize the amount of time I spent on teaching, in order to spend more time writing. In a culture like that, I’m not shocked to discover that much of the teaching is done by lecture.
Where the “prestige” piece is more relevant is outside of class. That’s where the ‘signaling’ piece of a selective degree comes into play. But inside class, I’m not shocked to hear that the gap is small, when it exists at all. And given the academic job market of the last twenty years, teaching-intensive places have been able to hire from the same pool that the elites have; by now, you can get “cognitively complex” faculty at every level.
I’d love to see legislatures take teaching seriously when they allocate funding among the sectors; parity on a per-FTE basis would go a long way. If this study helps make that case, I’m all for it. In the meantime, though, some scientifically-backed respect is welcome.
Sunday, November 08, 2015
Meta-Majors, Sampler Platters, and Sneaky Ambition
High-toned liberal arts colleges often like to have interdisciplinary freshman seminars. Community colleges generally can’t, partly because our definition of “freshmen” is more heterogeneous -- does the freshman year start at enrollment, or when developmental classes are done? What about dual enrollment students? -- but mostly because anything interdisciplinary often won’t transfer.
That’s more a function of neglect and bureaucracy than conspiracy. Many four-year schools have checklists into which courses must fall to be accepted. If a course doesn’t fit a category neatly, it either doesn’t make the cut at all, or makes it only as a “free elective.” Free elective status is where credits go to die. So we can run Intro to American Government all we want, but, say, The Politics of Protest Movements is a non-starter.
The gap in first-year course ambitions has stuck in my craw for years. If quirky and interesting first-year courses are available to the elites, they should be available to everybody. Fair is fair.
So that’s the starting point with which I came to the idea of meta-majors. As I understand them, meta-majors and guided pathways are related and complementary attempts to improve student success rates by being much more prescriptive with entering students. In practice, the idea is to make the transfer checklist the ‘default’ setting for student course selection. If students are kept on the straight and narrow, the theory goes, they’re less likely to get lost. If they only take courses that count, they’ll make progress more quickly, and be likelier to finish.
Meta-majors and guided pathways strike me as very promising ways to improve student success, both at the community college and upon subsequent transfer. But they’re vulnerable to the critique that they “solve” the tyranny of the checklist by surrendering to it. They seem to sacrifice adventure for safety.
(People who know me know to get a little nervous when I start a sentence with either “unless…” or “what if…”)
Unless the Big Intro course -- the meta-major -- is interdisciplinary and ambitious in its own right.
The most effective meta-major class I’ve seen was the Intro to Health Careers class at Holyoke. It was a sampler platter of the various occupations within the allied health field, taught on the assumption that many students who had identified nursing as a career goal didn’t know that many other options even exist. They’d spend some time learning about other roles in the industry, with the goal of finding the one that fits them best. Some students peeled off into social work, some into nutrition, some into public health, some into medical coding. The ones who continued with nursing were fewer, but better chosen; after a couple of years, both the diversity of the nursing class and the NCLEX pass rate went up. When students who actually wanted to be nurses were the ones in the nursing track, they did better. That’s not surprising; they wanted it more.
Okay, you say, but what does that have to do with the rest of the curriculum? How would that work in, say, humanities?
And that’s where I sneak my ambitious little friend, interdisciplinarity, back into the plan.
Imagine a Humanities 101 course along the ‘sampler platter’ model, but with a theme. For example, with “Love” as a theme, the class could offer glimpses into “love in art,” “love in music,” “love in literature,” and the like. (And before anyone cracks the inevitable joke, no, “Love” will not be a lab class.) The social sciences could use money, sex, or power. If you can’t find something interesting among those three topics, well, I just don’t know what to tell you. Building the sampler platter class around a hook would give it some coherence, would allow faculty to branch out a bit from always doing the same old thing, and would likely give the students a reason to care.
If the meta-major class is part of a package, it’s likelier to transfer. And if it helps students identify their interests early on, and thereby to make more strategic course selections as they go, it’s likely to reinforce the ‘guided pathways’ structure. In other words, we may not have to choose between ambition and safety. We could have both.
Wise and worldly readers, what do you think? Could a meta-major structure offer the venue for community college students to get a meaningful version of what students at elite places get?