Discovering your teaching self

The greatest enemy in teaching is boredom. Sitting and listening, sitting and reading, sitting and pretty much anything – these are all generally boring activities. And when you’re bored, your mind wanders away from what it’s supposed to be learning. Yet this is the most common and certainly the easiest way to teach.

I’m no comedian, so I won’t be entertaining my students with lectures full of laughs. I’m not much of a storyteller either, nor did I appreciate it when my own teachers went off on long tangents. I managed to be something of a jokester when teaching elementary-age kids, but older students don’t appreciate that energy from an adult. So, rather than changing who I am for my students, I try to be relatable.

When a teacher is relatable, students feel more comfortable in the classroom. Anxiety is another detriment to learning, so easing any nervous energy helps facilitate it. Students become more likely to ask questions and participate in discussions. When the lecture becomes more like a conversation, students engage with the class content, turning the material over in their minds and looking for holes. Doing so helps them deepen their understanding and commit the material to memory, even if they never speak up.

In this spirit, I try to make my classes interactive. Not everyone feels comfortable speaking out, but it’s still possible to engage these students in a one-way conversation. I can ask questions, prompt ideas about the material without immediately answering, and pause before the ends of sentences so that students fill in the blank. I also like to use animations, pictures, and comics (I’ll let someone else write the jokes) in my slides to visually recapture any students who have lost track of the conversation.

Moving forward, I would like to build games into my classes. I feel like my current teaching style targets oral attention (through conversation) and visual attention (through animated slides), but hands-on learning seems the best way to combat boredom. Ideally, the games would keep the interest of the students who don’t like to be vocal and, of course, would be fun. More brainstorming is needed!

Continuing public speaking and writing education

While I have expressed my support for subject-matter specialization by the undergraduate years, I simultaneously support including certain subjects in undergraduate curricula regardless of chosen field. Two of these are public speaking and writing, as they entail skills used across all disciplines and careers, as well as in life generally. Proficiency in public speaking and writing can prove beneficial in any job, and beyond.

Looking outside of the humanities, where the applications are more obvious, let’s consider a software developer. In addition to writing in their programming language of choice, they must write progress reports and updates on additions to their code. They must also present these updates to team members and team leaders so that everyone understands what has been done and what needs to be done. In a leading role, these skills become even more important as developers pitch their ideas, updates, and applications to higher-ups, to clients, and to outside parties looking to invest.

Similarly, those with the greatest success in the sciences are able to clearly convey their work and ideas to a variety of people. To obtain funding, scientists must write grants for readers in their field, for scientists in related but distinct areas, and for non-scientists representing grant-giving organizations. They must also present their work at conferences and, if hoping to turn it into a marketable product, in pitches to potential investors.

Even when these skills aren’t directly applied in job tasks, they can be of use. Good speaking skills can help in negotiating a raise. Proper writing skills can help ensure that emails convey their intended meaning. Comfort with public speaking can help in befriending coworkers and clients. Aptitude in essay- and story-writing helps people convey ideas in a logical and interesting manner so that others understand and appreciate them better. These capabilities might be targeted in primary school, but we continue to grow and develop throughout college. Putting a stop to the growth and development of these skill sets is a great disservice to our potential.

Interactive Teaching

In addition to a nationwide shift toward a wider grading distribution (see my prior blog post), I’d like to see a nationwide shift toward more interactive lessons. By interactive, I don’t mean getting-your-hands-dirty, build-a-volcano-out-of-baking-soda interactive. If your classes lend themselves to hands-on activities, that’s fantastic, but for most classes it isn’t feasible and wouldn’t even make sense. What I mean by interactive is engaging students to think for themselves.

Especially with the ubiquity of PowerPoint in teaching, too many professors have fallen into the easy trap of throwing all the content onto the screen for students to passively absorb. But therein lies the problem: when the learning is passive, there’s not much absorption happening. Another tricky thing is that students often feel like they are learning. If they see all the content and nothing confuses them, they’re more likely to feel content with their level of understanding. However, without picking their way through the details and actively turning the concepts over in their minds, it is very difficult for students to know whether they really grasp the material.

To make students grapple with the content in a way that teases out their level of comprehension, professors need to have them fill in holes in the content rather than always giving them the full story. The good news is that incorporating such interaction is possible in all subjects, and PowerPoint can even help. As the lecture goes on and references are made to previously mentioned points or ideas, professors should refrain from expounding on those prior points and instead make the class recall them. Professors can also pause mid-sentence and ask the class to fill in the rest of what they were going to say. PowerPoint can support this through its animation feature: instead of putting up all the content at once, show only a couple of points at a time so that the class must predict the rest. Even students who are not speaking up can benefit from this change, as they have to predict, and thus think through, the unshown material rather than just reading it.

Musings on College Sports

As a naïve high schooler, I was steadfastly against the high level of influence that sports held over colleges. I believed that college should be all about scholarship, and that sports served as a resource drain and distraction for colleges and their students. So naturally, I chose to attend Duke.

My feelings about college sports then got complicated. Expecting never to attend a basketball game, I went to the very first one my freshman year (and most of the ones after). Prioritizing my classes over all else, I tented two of my four years (tenting means living in a tent for six weeks to gain admission to the Duke-UNC game. That’s right, ONE game). Valuing academics over the other facets of college life, my best college memories are from basketball games.

Since graduating, I’ve accepted that my views on college sports lie somewhere between “against” and “rah-rah, go team.” College sports instill a sense of kinship and college pride that brings a richness to the college experience treasured by students and alumni all across the US. This richness can be so strong that it lingers for decades after graduation, with those sharing an alma mater more easily able to trust, network with, and befriend one another. My college experience would not have been the same without college sports, and I would not want to change anything about it.

Earlier academic specialization in the U.S.

I often think about whether students in the U.S. should specialize earlier. With knowledge multiplying at an unprecedented rate since the advent of computers, we need to learn more in order to reach a level of competence. Fortunately, technology also accelerates our rate of learning, but it might not make up for how much more we need to learn. This is especially the case in scientific research and technology-related fields, where being innovative and making discoveries requires one to be at the apex of current knowledge. Since these fields are also very competitive, the U.S. might be losing a competitive advantage by having its students wait longer than those of other countries to home in on their career field. By reaching a competitive level of understanding at a later age, Americans might be at a disadvantage for a few reasons. First, peak brain processing power occurs at the early age of 18 [1]. Second, Americans consequently spend fewer of their working years at a competitive level. Finally, the age at which we gain expert-level knowledge is now butting up against the age at which most Americans start a family, which is especially destructive to women’s careers.

I believe that the U.S. has a tradition of waiting longer in part because of its emphasis on freedom – freedom of movement, freedom of thought, freedom to change. American universities also tend to emphasize the holistic person, which entails being well versed in both the sciences and the humanities. In fact, many if not most universities have graduation requirements that curtail students’ ability to limit their education to one field, for example by setting course requirements in literature, languages, science, math, culture, and more. While some diversity of thought and creativity may be gained from pursuing this diverse curriculum, I don’t believe it makes up for the disadvantage of waiting. Most students know whether they’re better at the sciences or the humanities, for example, early in their grade-school years. Allowing children to focus on their best subjects should not be viewed as limiting their academic pursuits, but rather as expanding their potential.

[1] https://www.sciencealert.com/here-are-the-ages-you-peak-at-everything-throughout-life

Changing admissions goals

Continuing my discussion of admissions in higher ed: I think the best change schools could make to improve their admissions process is to evaluate applicants according to their own definition of success. Based on how they admit people, it appears that colleges care most about their students’ GPAs. And schools are in fact quite good at selecting who will earn a high GPA at the end of their first year. But is a successful student one who gets good first-year grades? Based on their mottos and mission statements, the schools themselves would disagree.

School mission statements tend to revolve around a few key themes, especially innovation, scholarship, outreach, and diversity. See my first blog post for two example mission statements, both of which feature these tenets. Of these, only scholarship is related to grades. Innovation, outreach, and diversity can easily be achieved with a sub-par GPA – indeed, they are probably easier to accomplish at the expense of the GPA. By touting these themes in their mission statements, schools are proclaiming that they value them. If schools are going to tout these themes, then they should recruit a student body that exemplifies them.

If schools were to redefine student success based on their values, they would have to modify how they evaluate applicants accordingly. Empirical studies would be needed to determine which factors best predict this new set of success outcomes, but high school GPA, which currently receives perhaps the most attention, might no longer be the best predictor. Instead, application components like the personal essay, service record, and extracurricular activities might deserve more weight in admissions decisions. The application itself might need to change, too, to collect more appropriate information about applicants. Schools and students would face many challenges in redoing this process, but the results could justify the effort. Society might even benefit if students are consequently encouraged to engage with and aid their communities.

Changing the Grading System in U.S. Education (Future of the University Blog Post)

Grading in the U.S. is very inflated. In many schools, an A – the top 10 points of the 0-100 grading scale – is what is needed to be considered “good.” However, being in the top 10% should be considered stellar, not merely adequate. Even the more traditional “good” set at a B – the top 15% of the grading distribution – seems misguided. And yet a B average is what most American professors set when utilizing a bell curve. A “B average” is almost an oxymoron, as the average should sit at the midpoint of the distribution – that is, at 50.

By artificially restricting the range of our grading system, we fail to accurately depict the capabilities of our students. The true top 15% of achievers are lumped in with, often, the top half or more. As a result, error plays a larger role in defining the top of the class: with only a few percentage points separating the entire top half, a single silly mistake knocks a student’s ranking down much further. Additionally, the true top achievers fail to shine. Not being able to identify them compromises a core purpose of a grading system – enabling organizations and schools to recognize and recruit talent.

On the other side, the true bottom 15% (or even bottom 50%) suffer more because, unlike the top half, they do not benefit from being lumped in with higher performers; they stand out clearly from the rest. One could argue that the middle group also suffers under this system, because they have less incentive to try harder and therefore miss out on the learning opportunities that accompany maximal motivation and effort.
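To make the range-restriction argument concrete, here is a small simulation sketch (all numbers invented for illustration): 100 students with evenly spread “true” abilities are graded either on a compressed 80-100 band (the inflated curve) or across the full 0-100 scale, with the same few-point “silly mistake” error added in both cases. The same error scrambles the rank order far more when the grades are squeezed into a narrow band.

```python
import random

random.seed(0)

# Invented illustration: 100 students with "true" abilities spread over 0-100,
# sorted so that student i+1 is truly better than student i.
true_ability = sorted(random.uniform(0, 100) for _ in range(100))

def observed_grade(ability, low, high, noise_sd=3.0):
    """Map ability onto the grade band [low, high], then add the same
    few-point random error regardless of the band's width."""
    scaled = low + (high - low) * ability / 100
    return scaled + random.gauss(0, noise_sd)

# Inflated curve: every grade lands in the narrow 80-100 band.
compressed = [observed_grade(a, 80, 100) for a in true_ability]
# Wider distribution: grades spread across the full 0-100 scale.
full_range = [observed_grade(a, 0, 100) for a in true_ability]

def rank_agreement(grades):
    """Fraction of all student pairs whose grade order matches true ability."""
    n = len(grades)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    correct = sum(grades[i] < grades[j] for i, j in pairs)
    return correct / len(pairs)

print("compressed 80-100 scale:", round(rank_agreement(compressed), 3))
print("full 0-100 scale:       ", round(rank_agreement(full_range), 3))
```

The band widths, noise size, and uniform ability spread are all assumptions chosen for illustration; the qualitative point – that compressing the scale lets error dominate the ranking – does not depend on the particular numbers.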

Making this change will be wildly difficult, though, because it would need to be a sweeping, nationwide change, made across all age levels and all types of institutions. If it is not made by everyone, many problems will arise. First, only the students in schools that shift their average to the actual middle (i.e., 50) will suffer, as their grades will appear lower than those of students graded under the current system. Second, recruiters and others needing to evaluate students will have a less accurate gauge of student performance due to the mix of measurement schemes. Third, students will likely be confused if they switch from one grading system to another, and will similarly suffer from not having a clear gauge of what “good performance” is. A nationwide cultural shift would be needed to make this process run smoothly and effectively.

Technology and Innovation in Higher Education: Social Media in Admissions?

Selection research – research into how people are chosen for certain roles – has recently been looking at how social media impacts selection decisions for new employees. While social media is neither an accepted nor a respected means of choosing new employees, it is being used by organizations and may eventually become commonplace. Similarly, social media might eventually be used to make higher-ed admissions decisions – and admissions counselors might be using it already. Regarding the efficacy of doing so, I turn for guidance to the paper “Social Media in Employee-Selection-Related Decisions: A Research Agenda for Uncharted Territory” [1], which reviews research on social media in employee selection.

In short, the paper concludes that social media is a poor, ill-advised means of selection. It introduces massive potential for bias and variability into a process that should be objective and standardized. Social media gives an incomplete picture of a person, and people view missing information with suspicion. People also instinctively place much greater weight on negatively perceived information, swaying their opinions of whatever information they do find. Social media-based judgements are also open to reliance on “gut feelings”: when people see something relatable in a social media page, they will likely have a stronger positive gut feeling about that person, probably without recognizing its source. Consequently, basing selection decisions on social media judgements exposes the organizations using them to legal liability, as unfair selection practices are against the law.

Furthermore, using social media was found to be ineffective. In empirical studies, the correlations between validated personality measures and people’s social media-based judgements of those personalities were only about .3, and the correlation between job performance and social media-based judgements was likewise about .3 – meaning such judgements explain only around 9% of the variance in the outcome. That is quite a bit lower than the validities of other, accepted selection strategies.

Using social media to make admissions decisions therefore seems like a very bad idea. It would likely hurt diversity, which is already a weak area in higher ed. It might hurt colleges, too, if they unwittingly admit lower-caliber students. I hope that admissions committees are aware of social media’s detrimental potential – but seeing as 45% of a sample of 2,600 US hiring managers admitted to having used it despite its not being an accepted selection strategy, I fear that admissions offices are already dabbling in it too.

[1] Roth, P.L., Bobko, P., van Iddekinge, C.H., & Thatcher, J.B. 2016. Social Media in Employee-Selection-Related Decisions: A Research Agenda for Uncharted Territory. Journal of Management, 42(1), 269-298. doi: 10.1177/0149206313503018

Open Access Journals in Psychology

Open Access journals are particularly useful to psychology because of psychology’s subject matter – people. The research covered in much of psychology is relevant to everyone and could be used by the general public more than in most fields. This idea is supported by the prevalence of psychology-related books that have entered and succeeded in the popular press. Malcolm Gladwell’s best-selling books synthesize psychology research findings with stories about the world, and Eric Barker’s Barking Up the Wrong Tree similarly accumulates psychological findings – with a special focus on the science of happiness and success – to present tips for the everyday reader. Even psychologists themselves have become famous among the general public, like Dan Ariely, who published the best sellers Predictably Irrational and The Honest Truth About Dishonesty, and Daniel Kahneman, who wrote the very successful Thinking, Fast and Slow.

In addition to the general public, psychology research is useful to many practitioners. Counselors, therapists, and social workers apply psychologists’ research in their work to help people. Additionally, consultants and coaches can use psychological research to learn how people are motivated, how they work best together, and even how they can be manipulated. Companies like Amazon and Apple, as well as agencies like the FBI and CIA, have realized this potential too.

Therefore, by making research literature available to anyone with an internet connection, Open Access journals allow for greater dissemination of research – and hopefully for a broader reach of benefits.

For example, the Open Access journal Frontiers in Psychology is the largest journal in the field of psychology. It covers a wide range of subjects – environmental psychology, clinical psychology, social psychology, animal behavior, perception science, human factors, and much more – which hold relevance for a wide range of people and industries. It is an international journal, publishing research, receiving funding, and recruiting editors from around the world. Its goal is to make the best psychology research readily available to anyone who wants to access it. By allowing knowledge to have a longer reach, the journal hopes to accelerate innovation and improvement “for the benefit of humankind.”

Ethics Post

I felt dirty reading the list of ORI (Office of Research Integrity) updates, like looking at a list of mugshots. I was looking at what was likely the greatest shame of these scientists’ lives. It makes me wonder what chain of events led them to commit such nasty, life-altering mistakes. Did they have a concrete intention of committing fraud? Did they realize what they were doing midway down that path and want to turn back? Did they try to do the right thing at any point?

In considering the causes, I noticed that all five of the “featured” researchers were involved in biomedical research (Tataroglu, Neumeister, Yakkanti, Malhotra, and Potts Kant), which seems too unlikely to be pure coincidence. In fact, all five had received National Institutes of Health (NIH) funding, four of the five had US Public Health Service (PHS) funding, and two had specifically received National Heart, Lung, and Blood Institute (NHLBI) funding. What about this kind of research could lead to a higher incidence of ethical transgressions?

My thoughts are almost entirely speculation, but I did spend two years (one of them full time) working in a biomedical lab, which gave me some perspective on the research experience in this field. First, it is cut-throat. I’m sure everyone is aware that the road to becoming a medical doctor is a very competitive one. Having attended a heavily pre-med school, I saw the accuracy of future doctors’ reputations as intense and hell-bent on being the best. These personality tendencies are compounded by the fact that, at least in the US, spots at medical schools are extremely limited, so competition is innate to the field. One of the guilty researchers was a medical doctor. As for those in biomedical research who are not MDs, my anecdotal experience has been that many were originally pre-med and had dispositions similar to those who went to med school; they just turned their sights on the research side of the health field instead. So, I wonder whether a culture of intense competition and the need to prove oneself the best has driven researchers toward cheating.

Additionally, biomedical research Takes. A long. Time. I don’t even mean years – I mean decades. This extended time span is actually part of the reason I got out of the field. It sounded exhausting. It also sounded risky: what if, eight years down the road, you find that your pharmaceutical breakthrough will never be compatible with humans? Well, then that’s it – all those years wasted. So, fear might have driven these researchers to exaggerate their findings. Frustration and fatigue might also have driven them to give their findings a “boost” to accelerate the slow, plodding, laborious pace that often typifies such research.