Research Misconduct Case

I was shocked. I read about a professor from the University of Pittsburgh who plagiarized text and falsified data in two unpublished manuscripts and two National Institutes of Health grant applications. I guess I’m naive, but I never dreamed a researcher would submit someone else’s work as his own in order to create the impression he had actually conducted research, all with money from the U.S. Public Health Service. I’m curious to know how this person got as far as putting together two manuscripts and two grant applications without input from a co-author or the IRB. When given tax dollars to conduct research for the good of humanity, I believe researchers should be held to the highest standards. I’m also shocked he was asked not to participate in government-funded research for only three years. That seems like a short time to me.

Old School and New School

I did my undergraduate work from 1991-1995, before laptops were popular, or even affordable, before email really took hold, before PowerPoint, and before most of the information we receive on a daily basis was digitized. I, and the rest of the class, took notes on loose-leaf paper and organized them in a 3-ring binder or Trapper Keeper. I did my master’s degree from 1995-1997, so I had a very similar experience, except that email was becoming popular and websites like Yahoo had just been born, so only a handful of people even knew about them. I still went to the library to do research.

When I returned to school for my doctoral degree, the digital revolution had truly transformed the academic landscape into a place unrecognizable to me. There was no pressing need to physically go to the library to do research; all the lectures were typed out in PowerPoint, and almost everything was archived on a course management system. Wow, I thought, school in the 21st century is so much more convenient; it’s all at my fingertips. But what I didn’t realize right away was the effort it took to organize all this digital information being thrown at me (my 3-ring binder was now a thing of the past) and how easy it was to experience information overload!

A professor in 2013 can deliver an unbelievable amount of information in 60 minutes using PowerPoint. By contrast, I remember my undergraduate biology professor physically drawing a cell and all its parts on the chalkboard with different colored chalk. He was a talented artist, and watching him draw a cell while describing its parts certainly made it easier to understand than a quick glance at an entire cell on a PowerPoint slide shown for a few seconds.

So, for me, not only are multi-tasking and surfing detriments to academic success, but the rate at which information is delivered can also pose a problem. I find myself needing to take a digital Sabbath to keep the focus necessary to carry a task to completion. I also find myself weeding through a lot of the information presented in classes to find the diamond in the rough: the information necessary to help me understand and complete a task.

However, I feel we all could benefit from learning how to use our gadgets for good and not for distraction. Recently, I was at a national conference. During one of the talks, the woman beside me used her iPad in an amazing way. Whenever the speaker mentioned a study, she looked it up, downloaded it, and took notes on the talk, all in real time. It was amazing. So, at the end of 45 minutes, she had downloaded all the sources mentioned and taken notes on how the speaker used them. I thought, this is certainly an efficient way to use an iPad. Perhaps moving forward, instead of complaining about students checking Facebook or the news during class, instructors will focus on teaching them new and engaging ways to use their devices during class that will actually increase their scores instead of lowering them.


Ubuntu

I just discovered this word a week ago. My husband purchased a computer and installed Ubuntu, a Linux-based operating system. I really didn’t know what any of this meant. I’m learning quickly that there is a large subculture of computer users who participate in a free and open-source software distribution network and philosophy. Basically, there’s no Bill Gates or Steve Jobs in the Linux/Ubuntu world. The “community” contributes to the development of new software, and it is made available to everyone for free. It’s amazing, really.

Because I’m not really a computer person, I was more interested in the word Ubuntu itself. It didn’t sound like an English word, and I thought it probably meant something cool, unlike the word Xerox. Xerox is a made-up word, the name of a corporation, and now a word on the verge of becoming a generic verb meaning “to photocopy.” Ubuntu sounded much more meaningful. So, I googled it, and here’s what I found:

“A person with Ubuntu is open and available to others, affirming of others, does not feel threatened that others are able and good, based from a proper self-assurance that comes from knowing that he or she belongs in a greater whole and is diminished when others are humiliated or diminished, when others are tortured or oppressed.”

Wow. What a lovely, benevolent concept. Then I thought, if we (as a society/culture/universe) all embodied ubuntu, practiced ubuntu, would we still have diversity issues, oppression, disparities, and a whole host of other “-isms”? It seems to me that in western culture there’s a lot of emphasis on competition, on honing your skills to compete in the global workforce or on college entrance essays. The focus is on making ourselves better than someone else in order to get the job, whatever the job may be. But what if we all had a place in the world to be our best selves, and being our best selves didn’t mean that for someone to win, another had to lose?

Then I thought, is such a thing possible if the word isn’t part of the dominant culture? I know of ubuntu now, but it’s not an American concept, or a word that many people know. But then I caught a glimpse of hope when I read, “Ubuntu is the world’s favorite free operating system, with over 20 million people preferring it over commercial alternatives.” So, maybe those 20 million people will tell 20 million more people, and so on, until one day the scales tip and ubuntu is so well known it becomes a generic verb in a society overflowing with equality.

Can 21st Century Technology Close the Disparities Gap?

I’ve spent the last year of my doctoral work in federally designated health-disparity regions of Virginia. Sounds simple enough: we travel to small, rural towns and sit in free clinics, social services departments, or the local Wal-Mart and promote our free research program, which aims to give people health benefits and $150 for completing the entire project. However, it’s no simple job.

More importantly, through this work, I’ve had the opportunity to witness life in a different way: to witness what Michael Harrington called “The Other America.” Despite being from West Virginia and growing up in a poor area, I didn’t know poverty until I spent an entire summer sitting in the waiting room of a free health clinic. There I was able to see and talk to many people who were unemployed, unskilled, uninsured, uneducated, and unhealthy, and who were without transportation, food, a permanent address, and hope. It was a humbling, disturbing, and transformative summer.

What disturbed me the most about last summer were the children. As Mimi Ito, a cultural anthropologist at the University of California, Irvine, points out in the video “New Learners of the 21st Century” (at 24:20), “we know that the learning outside of school matters tremendously for the learning in school.” I knew this too. I knew it as I watched children come in with their grandparents, tired and sick grandparents who were now responsible for them. I knew it as I babysat a few kids during one of our community classes and they told me how they didn’t always have food in the house or how their mom’s ex-boyfriend was a mean person. I hope I’m wrong, but I suspect these kids will have a hard time in school and a hard life ahead of them.

To be honest, I left overwhelmed by this experience. How was our intervention on soda and exercise going to really make a difference in the lives of the people I was meeting? How could the educational system make a difference in the lives of the children I was meeting when their basic needs were not being met? How could they learn best at school when their learning was not supported at home?

As I watched the segment on the Digital Youth Network (at 15:30) from the same video, I thought, maybe there’s hope. Maybe giving kids the tools they need to “access their passion” is a way to avoid a “dream deferred.” Could disadvantaged children use computers to engage in learning outside of school, and could this, in turn, make a tremendous difference in the learning they’re getting inside of school?

What if all kids were given computers in elementary school, regardless of their district’s SES, and given assignments that would sustain and engage their attention well into the evening, enough for them to find a deep passion and be rewarded by education instead of subverted by it? Could this type of digital engagement be the antidote to the educational deficiencies kids endure because of poverty?

I’m old enough to know there’s never a silver bullet for anything, especially not for a complex social issue like poverty and education. However, I do feel hopeful that as new technologies are embraced and education is revolutionized, the gap will lessen. As Diana Rhoten, Program Director for Digital Media and Learning, says, “the 21st century is (about) learning the tools and skills of remaking the content and becoming the producer and creator.” For many kids in poverty, having control over their lives may be difficult, but perhaps becoming a producer and creator of their learning is a way to produce and create a different life as well.