Future of the University

If there is one thing I believe should change in higher education, it’s the perception of the university as a gateway to a career instead of a place of academic enlightenment. Now, I realize this perception is a result of how higher education coevolved to meet the needs of contemporary culture. I also recognize how unlikely it is that this perception will ever change. Regardless, I don’t see why I have to accept the present situation or abandon hope that it will change for the better.

The history of higher education is distinct from that of other forms of education, with some universities among the oldest learning institutions in the world. The development of universities, and of higher education more generally, over the course of the last millennium is closely tied to religion. Universities were established as organizations free from direct control by the church or other religious institutions, a privilege usually granted by the king or state. This privilege allowed for the academic freedom to question, research, and advance knowledge. Those who attended universities were usually individuals from the upper social class who had substantial wealth and weren’t required to devote all their time to laboring for money. They didn’t need to gain knowledge to obtain a job, and if they did, they gained it through apprenticeships and on-the-job experience. University attendees were motivated by a true passion to learn. Thus, people initially attended universities for the sole purpose of immersing themselves in knowledge.

Education is widely accepted to be a fundamental resource, both for individuals and societies. Indeed, in most countries basic education is nowadays perceived not only as a right, but also as a duty – governments are typically expected to ensure access to basic education, while citizens are often required by law to attain education up to a certain basic level. As society has evolved, the necessity for labor-based trade skills has shifted. Today’s society is technologically integrated and the ability to interact with and interpret information is now a crucial skill. Universities have become the “go-to” destination for learning how to operate in our technologically advanced world.

In the modern world, universities have two purposes: to equip students with advanced skills useful in the workplace and to further human knowledge and understanding of the world. A decrease in employment opportunities (due to outsourcing or technological advancements eliminating the need for human-staffed positions) combined with an increase in population growth has yielded a highly competitive job market. Degrees in higher education have become a factor used to differentiate between job applicants and are seen as an indicator of applicant “quality”. At first, this meant that a bachelor’s degree ensured one better chances of becoming employed. As more and more people saw the need for a university-granted degree, the employment market became increasingly saturated with applicants holding bachelor’s degrees. Now, a master’s degree has become the standard for obtaining a “high profile” career.

Universities are now perceived as a place to obtain a piece of paper that (supposedly) guarantees gainful employment. Don’t get me wrong, they are still a place of academic enlightenment. Faculty and students still engage in the free thought that drives the advancement of knowledge. However, most universities are more focused on pumping out degrees than on knowledge-advancing research. As we look to the future, universities will inevitably coevolve to meet the needs of the people. Education has been and will continue to be a valuable resource — one that is essential for the advancement of humankind. It is my earnest hope that universities will once again become a destination for the passionate pursuit of knowledge rather than a destination for a better chance at gainful employment.

Academia Gives Back

Have you ever looked at the mission statement of a university? More often than not, it will say something to the effect that the university provides service to the community. Take Virginia Tech’s mission statement, for example:

“Virginia Polytechnic Institute and State University (Virginia Tech) is a public land-grant university serving the Commonwealth of Virginia, the nation, and the world community. The discovery and dissemination of new knowledge are central to its mission. Through its focus on teaching and learning, research and discovery, and outreach and engagement, the university creates, conveys, and applies knowledge to expand personal growth and opportunity, advance social and community development, foster economic competitiveness, and improve the quality of life.”

Not only is service mentioned in Virginia Tech’s mission, but so is the university’s commitment to “…advance social and community development, foster economic competitiveness, and improve the quality of life”. For those of us living in the “bubble” that is campus culture, it can be difficult to see tangibly how the university delivers on its promise to serve. Thankfully, I stumbled across a phenomenal example of how Virginia Tech gives back to the surrounding community: Campus Kitchen.

What is the Campus Kitchen?

Virginia Tech’s Campus Kitchen is a program run through VT Engage that recovers surplus food from VT’s dining halls, repurposes it into meals, and delivers those meals to a local food bank. The Campus Kitchens Project is a national organization that encourages students to get involved in combating food waste and hunger. Students in collegiate chapters across the nation collect surplus food from on-campus dining halls and help transform it into healthy meals that are distributed to food-insecure individuals in the area. One in eight Virginians struggles with food insecurity, and there is a great need in our region for services that get food to those in need. In spring 2015, Virginia Tech was one of three schools that won a $5,000 grant to help start up a campus chapter. Volunteers have devoted over 2,500 hours to the Campus Kitchen at Virginia Tech (CKVT) since its launch in fall 2015. We are now recovering surplus food from three Virginia Tech dining halls six days a week. Over 10,000 pounds of recovered food and 400+ meals have been delivered to our community partner, Radford-Fairlawn Daily Bread.

Who is involved in the operations of VT’s Campus Kitchen, and what exactly do they do?

The great thing about Campus Kitchen is that anyone at Virginia Tech or in the surrounding community is welcome to volunteer their time to help with daily operations. There are a variety of ways one can get involved in VT’s Campus Kitchen. For instance, the Kitchen needs weekly volunteers to pack, cook, deliver, and serve the food they repurpose. Additionally, if one finds they enjoy volunteering and would like a larger, more long-term role, volunteers can commit to collecting the food from the dining halls or become a delivery/shift leader.

If volunteering in VT’s Campus Kitchen isn’t your “cup of tea”, there are plenty of other programs offered through VT Engage to get involved with. It seems that Virginia Tech really strives to deliver on its commitment to service and truly lives up to its motto ut prosim — That I May Serve.

Open Access: Foods—Open Access Food Science Journal

The “go-to” journal for all things food science is the Journal of Food Science (JFS). Unfortunately, this journal is not open access, and a paywall stands between valuable scientific knowledge and those who wish to access it. Thankfully, my membership in the Institute of Food Technologists (IFT; a professional organization for food scientists and industry professionals) provides me access to this journal. Now, my membership in IFT isn’t free (annual student dues are around $50), but that fee costs me significantly less than it would to access JFS directly. If it weren’t for my IFT membership (as well as Virginia Tech’s libraries), I probably wouldn’t be able to access any of the articles I need for my research. Thankfully, a Google search has yielded another source for peer-reviewed scientific research: an open-access food science journal.

The journal I stumbled across was Foods—Open Access Food Science Journal. Foods is an international, scientific, open access journal of food science published monthly online by MDPI (Multidisciplinary Digital Publishing Institute). This journal provides an advanced forum for studies related to all aspects of food research and publishes reviews, regular research papers, and short communications. Their goal is to “encourage scientists, researchers, and other food professionals to publish their experimental and theoretical results in as much detail as possible alongside sharing their knowledge with as many readers as possible”. There are no length restrictions on their papers, which allows scientists to “put it [their research] all out there” for others to learn from. Some unique features Foods offers its readers are:

  • manuscripts regarding research proposals and research ideas are particularly welcomed
  • electronic files or software containing the full details of the calculation and experimental procedure, if unable to be published in a normal way, can be deposited as supplementary material
  • manuscripts communicating to a broader audience with regard to research projects financed with public funds are also accepted

Foods also defines the “Scope” (i.e., applicable areas) of the research they publish, which includes:

  • food sciences and technology
  • food chemistry and physical properties
  • food engineering and production
  • food security and safety
  • food toxicology
  • sensory and food quality
  • food analysis
  • functional foods, food and health
  • food psychology
  • food and environment

Foods ensures its publications follow a code of ethics; specifically, it is a member of the Committee on Publication Ethics (COPE). Since MDPI publishes Foods, the journal abides by MDPI’s ethics statement. Additionally, MDPI states that they verify the originality of content submitted to their journals using iThenticate to check submissions against previous publications.

The only reference I noticed regarding their stance on open access was the statement that articles published in Foods will be open-access articles distributed under the terms and conditions of the Creative Commons Attribution License. MDPI then states that they will insert the following note at the end of the published text:

© 2013 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/).

In my opinion, Foods seems like a great open-access journal for food science publications. They want to get research out there and accessible to everyone, yet still credit the authors and protect their rights. I will definitely be using this journal for my research-related needs, and would even consider submitting future manuscripts to them.

Tech & Innovation in Higher Ed

If I had to choose two (nice) words to describe present-day society (even though there are a multitude of words I could use), those words would be: technologically integrated. Technology has taken over our lives, and I’m certain most of us could not function if it somehow disappeared from existence. The world of higher education is no exception. Those of us in academia have adopted technology and evolved our research and, yes, even our teaching methods to incorporate it. The integration of technology in higher education has spawned an evolution in teaching methodology: Massive Open Online Courses (MOOCs).

MOOCs (e.g., Coursera, edX, Udacity) are courses available online that allow unlimited access and participation, provided one has access to the Internet. Besides their (almost) universal accessibility, a benefit of taking MOOCs is that they provide traditional course materials in addition to interactive user forums and support from the global online community. Sounds pretty perfect, right? Well, as it turns out, many teachers, universities, and even students take issue with MOOCs. In a quest to better understand the controversy, I discovered a paper in the International Journal of Communication by Jose Van Dijck and Thomas Poell titled Higher Education in a Networked World: European Responses to U.S. MOOCs.

Van Dijck and Poell begin by stating the objectives of their paper: to “analyze the dynamics underlying the development of MOOCs in the context of an emerging global social Web where these courses are implemented in systems of higher education that are often institutionally grounded and largely nationally based” and “to understand how emerging commercial online infrastructures affect public systems such as higher education”. In the introduction, they describe how MOOCs are designed to function based on the same mechanistic underpinnings as the social media ecosystem and other connective platforms: datafication, (algorithmic) selection, and commodification. Datafication is defined as the tendency to quantify all aspects of social interaction and turn them into code. These codes drive the algorithmic selection (based on the popularity of the course, student success in assessment modules, etc.) of how the course will be enhanced and will evolve over time to meet the needs of students. The last mechanism of how MOOCs function is commodification, which involves the transformation of objects, activities, and ideas into tradable commodities. The majority of the paper involves the authors’ discussion of the pros and cons of how these three mechanisms impact MOOCs and their stakeholders (i.e., students, teachers, program developers & entrepreneurs, and Europe). I’d go into more detail and discussion on this, but it’d be difficult to do the 19-page paper justice in one blog post.

Next, the authors introduce the following questions: Is education a public good? How do MOOCs relate to the key public values sustaining systems of higher education? The authors mention that the main disputes concerning MOOCs focus on the impact they have on the core values of education. Because the paper’s discussion is focused on the U.S. and Europe, some interesting contrasts are presented. For instance, many U.S. universities have been, to some extent, privately or corporately funded, which is one reason why many of them embrace the for-profit (or at best nonprofit) high-tech apparatus (i.e., MOOCs) that renders them part of a global online market. In contrast, European taxpayers have traditionally supported a public system of higher education that has largely resisted commodification, deregulation, and globalization and is economically distinct from its U.S. counterparts. I found the discussion of these contrasting frameworks fascinating and rather thought-provoking. I wonder how systems of higher education in other countries (besides the U.S. and Europe) are approaching MOOCs.

The paper concludes by describing MOOCs as “an ideological battleground where a revamped definition of what public education means in a networked world is wagered and contested”. In our technologically integrated society, there exists an ever-present battle over privacy, accessibility, inherent and protected freedoms, and the meaning of publicness. Higher education is one battleground where this struggle takes place. Governments and university administrators will eventually need to address the regulatory problems at stake here, as well as the potentially profound ideological shifts in the emerging global market for online learning. Even if online mass instruction never comes to replace traditional college education, MOOCs will inevitably have a substantial impact on how education is defined as a public good.

Public Education: Opportunity or Oppression?

This week’s discussion on Critical Pedagogy comes at a very tumultuous time — the election of a new President and his appointed governmental leaders means significant changes are upon us. The American education system is one of the first sectors facing serious reform. Secretary of Education Betsy DeVos looks to overhaul the current public education system by shifting to a more privatized one. DeVos’s reform plan is being met with plenty of opposition from politicians, teachers, and parents alike, despite evidence that many public schools are failing and that federal attempts to improve them have yielded no meaningful success. Now, I’m not going to sit here and tell you DeVos’s plan will completely fix all the problems with our education system, but I want to look at the situation with regard to critical pedagogy and how the American education system came to be in its current state.

Critical pedagogy has been defined as a philosophy of education (and social movement) that develops and applies concepts from critical theory and related traditions to the field of education. Critical pedagogy holds that teaching is inherently political, rejects the neutrality of knowledge, and maintains that issues of social justice and democracy are intertwined with the teaching/learning process. The concept of critical pedagogy can be traced back to Paulo Freire’s best-known work, Pedagogy of the Oppressed (1968), in which Freire provides a detailed Marxist class analysis in his exploration of the relationship between the colonizer and the colonized as it pertains to education.

Let’s take a quick look at the American education system. Government-supported, free public schools were established after the American Revolution. However, education was optional and mostly offered at private local institutions or performed at home. This meant that education was not standardized, nor was quality education available to everyone. In 1852, Massachusetts became the first U.S. state to pass a contemporary universal public education law requiring every town to create and operate a grammar school. Fines were imposed on parents who did not send their children to school, and the government had the power to take children away from their parents and apprentice them to others if government officials decided that the parents were “unfit to have the children educated properly”. Laws requiring compulsory education spread, and now virtually all states have mandates for when children must begin school and how old they must be before dropping out. Compulsory education laws require children to attend a public or state-accredited private school for a certain period of time, with certain exceptions, most notably homeschooling.

Now that we’ve gotten some of the important background information out of the way, let’s discuss how America’s compulsory education laws have created the colonizer-colonized relationship Freire outlines in Pedagogy of the Oppressed. By mandating that all children be educated, the federal government became responsible for providing a national educational system that was accessible to everyone. This meant that it now had control over the quality and type of content being disseminated to students. Though the public education system was meant to provide everyone with the same educational opportunities, it has further exacerbated the educational gap between people of different racial and socio-economic backgrounds.

There is a growing body of evidence showing that the U.S. public education system does not provide the same quality of education to all students. Additionally, similar research shows that U.S. students’ academic achievement is falling behind that of other countries. Much of this is attributed to the poor quality of education provided by the public school system. Now, there are many public schools that provide high-quality education, but, unfortunately, they are normally found in areas of economic prosperity. Areas of economic disparity, arguably the areas where quality education is most desperately needed, tend to have poor public school systems whose students often fail to meet federally set academic standards. A factor that furthers this issue is that citizens are required to pay taxes supporting the local public schools. For low-income citizens, this means the portion of their income that could have been spent on sending their children to a private or charter school is forcibly invested in public schools. By forcing parents to send their children to school, as well as to pay taxes to local public schools, the federal government is essentially dictating how certain populations will be educated. Those living in low-income areas are forced to send their children to the affordable, yet poorer quality, public schools instead of private or charter schools that may offer a better educational experience.

As it stands, it appears the American education system promotes the “colonizer-colonized” relationship outlined by Freire, which oppresses students via banking education. If we are to free students from this oppression, with respect to critical pedagogy, maybe DeVos’s reform plan holds some promise? By expanding the options available to families seeking a better education for their kids, parents will have the opportunity to send their children to the schools offering the best education. Ideally, this will create competition among schools, encouraging them to improve the quality of education they offer. It doesn’t necessarily force students out of public schools, but stimulates the schools to improve while at the same time giving families options if they don’t. It really raises the question of whether continued support of public schools creates opportunity or fosters oppression.

Affect’s Effect on Inclusive Pedagogy

I bet you’re thinking, “Why did she use the words affect and effect side by side in the title? It makes no sense!” If this was your first thought, hang in there — I promise it’ll make sense by the time you’re done reading this article. Let me start by defining the word affect as it was used in the title. Affect (pronounced “af-ekt”; noun) is a term referring to feeling or emotion, and it plays a key role in how an organism (e.g., a human) reacts to a stimulus. Much of my doctoral research is focused on understanding the relationship between affect and food and how that relationship influences food choice. I also study affect’s role in biases as they pertain to interdisciplinary group settings. So when I read Shankar Vedantam’s How “The Hidden Brain” Does The Thinking For Us, I couldn’t help but think about how my research relates to inclusive pedagogy.

In his article, Vedantam mentions that our brain operates in two modes: “pilot” (conscious) and “autopilot” (subconscious). What’s fascinating is that the brain absorbs and processes information in both modes simultaneously. We don’t realize it, but our brain takes a multitude of explicit (i.e., consciously perceived) and implicit (i.e., subconsciously perceived) factors into account when cataloging information for future use. Even the positive or negative emotions we experience during an interaction with a stimulus can affect how we will respond to it (or to other stimuli we perceive as being related) in the future.

As Vedantam stated, “…the mind is hard-wired to ‘form associations between people and concepts’.” From the first moments of fetal existence, everything we encounter or experience shapes how we think for the rest of our lives. In the whole nature vs. nurture debate, it’s safe to say that nurture significantly impacts one’s cognitive processing. What does this mean in the context of pedagogy? Everything. It means that each and every student is unique in how they behave and interpret the world around them. This impacts their ability to learn and interact with information as well as with their fellow classmates. As an educator, it means that your teaching style, as well as the manner in which you conduct yourself and your class, is greatly influenced by your past experiences. It means that the individual experiences of students in your class will impact their future actions. This is why, as educators, it is vital that we are mindful of ourselves, our students, and the learning environment we establish. Care needs to be taken to ensure our courses are as inclusive as we can make them. With 10 or even 300 unique individuals in a class, maximizing inclusivity may seem like a daunting challenge. However, by focusing on learner-centered materials and teaching methods, I believe any educator can successfully implement inclusive pedagogy.

Ethics Case Study: Karen M. D’Souza

Thanks to evolution, we humans have developed an advanced cognitive ability that allows us to process both tangible and intangible information to help us make decisions. Tangible information (e.g., proven scientific facts or theories) provides us with a variety of solutions to the decisions we face every day. However, intangible information such as our emotions or ethical code ultimately guides us in selecting a final choice or action. Ethics (i.e., the moral principles that govern one’s behavior) are crucial in making decisions that yield philosophically sound outcomes — the decisions we make need to be beneficial and not inflict unnecessary harm on ourselves, our fellow man, the Earth, etc. When there is a failure or refusal to employ ethics in decision making, we bypass a vital step in preventing deleterious outcomes. In scientific fields specifically, neglecting to conduct research ethically can lead to a variety of ill-fated consequences.

In 2016, the Office of Research Integrity (ORI) of the Department of Health and Human Services found that Dr. Karen M. D’Souza, a former Research Professional Associate in the Department of Surgery at the University of Chicago, engaged in research misconduct in research supported by the National Heart, Lung, and Blood Institute (NHLBI) of the National Institutes of Health (NIH). The ORI discovered that falsified and/or fabricated data were included in one NIH grant (R01 HL107949-01), two publications (J Biol Chem. 285(18):13748-60, 2010 Apr 30; J Biol Chem. 286(17):15507-16, 2011 Apr 29), two posters (Gordon Conference 2006 poster: “Regulation of Myocardial β-Adrenergic Receptor Signaling By Protein Kinase C”; Huggins 2010 poster: “Gαq-mediated activation of GRK2 by mechanical stretch in cardiac myocytes; the role of protein kinase C”), and one presentation (Cardiac Research Day 2009 presentation: “Regulation of G protein-coupled receptor signaling by mechanical stretch in cardiac myocytes”). It was found that Dr. D’Souza reused and falsely relabeled and/or falsely spliced Western blot images, falsified the related densitometry measurements based on the falsified Western blots, and falsified and/or fabricated data for experiments that were not performed or that came from unrelated experiments.

When Dr. D’Souza made the decision to falsify and/or fabricate her data, she exhibited a blatant disregard for the ethical conduct of research. What motivates a researcher to make an unethical decision such as Dr. D’Souza did? Oftentimes it is the ever-present pressure for researchers to “publish or perish” — the notion that the success of a researcher’s career directly increases with the number of research manuscripts they have published in peer-reviewed journals. Time is another factor that can sway a researcher to abandon their ethical code. The grants that fund the majority of present-day research always include a timeline by which all related experiments must be completed and results obtained. If researchers fail to complete experimentation by the deadline set forth in the grant proposal, it is highly unlikely the funding organization will elect to fund any of their future research endeavors. Similarly, some funding organizations desire that the research they fund provide specific results or conclusions that support the organization’s own agenda. Researchers will falsify and/or fabricate data in the hope that, by pleasing the funding organization, they will be granted more funding for future research.

In Dr. D’Souza’s case, we may never know precisely what motivated her to report unethical research. However, we can speculate on the potential outcomes that may result from her misconduct. Since she was performing medical research on coronary function, we can assume that her published “findings” may be utilized to clinically treat patients with heart-related issues. Because she falsified and/or fabricated her findings, there is no way to know whether they are valid. If used to treat patients in clinical trials, her unethical results could prove ineffective, or worse, fatal. Additionally, Dr. D’Souza’s grossly unethical conduct will certainly yield a multitude of undesirable consequences. For instance, she has already received several administrative actions (i.e., punishments) directly related to her research misconduct, which she has voluntarily agreed to accept. The ultimate and, in my opinion, least favorable consequence of Dr. D’Souza’s misconduct is that she has forever tarnished her integrity as a researcher, surgeon, academic, and human being.

My final thought on the case of Dr. D’Souza is this: take the time to seriously evaluate the consequences of your actions. There is no personal or professional triumph worth sacrificing your integrity for.

Tenure or No Tenure: A Ph.D. Student’s Opinion

The first time I ever heard the term tenure was back in elementary school, when I overheard a couple of my teachers discussing how close they were to earning it. At the time, I didn’t know what the word meant, nor did I care. As I progressed through academia in pursuit of higher education, I continued to encounter tenure in a variety of contexts. Most notably, I recall it coming up during a discussion between my classmates and me about a rather awful professor who taught our freshman-level chemistry course. One of my classmates mentioned that, no matter how much we complained about our professor’s inability to teach, she could not be fired because she was “tenured”. It was at this point I asked myself, “What exactly is tenure, how did it come about, and why does it seem so particular to academia?” Now, as a doctoral student with close to 20 years of academic experience, who is preparing for a career as a collegiate professor, I’m going to present my understanding of and opinions on tenure.

What is tenure and how did it come to exist?
Tenure is an esteemed and privileged appointment one can obtain working in academia that lasts until retirement. Tenure exists because of the principle of academic freedom, which is based on the conviction that scholars should have the freedom to teach, research, or communicate facts or ideas without being targeted for repression, job loss, or imprisonment. Those who support academic freedom believe student and faculty inquiry is essential to the mission of educational entities such as schools or universities. Historically, there have been instances where scholars communicated ideas or facts that were inconvenient to external entities (i.e., political or societal authorities) and were thus persecuted and even sentenced to death (e.g., Lysenkoism in Soviet Russia). To protect scholars, many countries have recognized and adopted tenure, allowing scholars to freely express their opinions without fear of institutional censorship or discipline.

Why is there controversy and debate surrounding tenure?
Presently, tenure is a hot topic of discussion for academics and non-academics alike. Popular news-media outlets, including National Public Radio (NPR), US News & World Report, and The New York Times, continually publish articles discussing tenure and the effect it has on educators and the educational system. Whether faculty, student, parent, or even politician, it seems everyone has an opinion on the necessity of tenure. Below is a brief (and thus not exhaustive) summary of the pros and cons expressed for tenure.

Pros:

  • Protects scholars and educators from being fired for personal, political, or other non-academic related reasons.
  • Prevents academic entities from firing experienced, higher-paid educators to replace them with less experienced, lower-paid educators for financial gain.
  • Protects scholars who research and/or teach unpopular, controversial, or otherwise disputed topics.
  • Allows educators to advocate on behalf of students and express disagreement with school or university administration.
  • Promotes scholarly performance and entices people to engage in education-based careers by rewarding educators for years of successful teaching and contributions to academia.
  • Encourages the careful selection of qualified and effective educators.

Cons:

  • Creates academic complacency, leading to poor teaching performance and few, if any, contributions to academia.
  • Makes it difficult to remove under-performing educators due to a lengthy, complicated, and costly “tenure revocation” process.
  • Promotes seniority as the main factor in evaluating an educator’s performance instead of the quality of their work.
  • Tenure is obsolete and no longer needed to protect educators, since modern-day laws and court rulings can better serve that same purpose.
  • Tenure is not necessarily applicable at all levels of academia (i.e. kindergarten, elementary, middle, and high school) due to government-implemented standards of education.
  • Fails to promote education of the student and solely focuses on promoting the educator’s career.

Though more points could be added to either list, both sides of the argument are backed by some very solid reasoning. To me, the debate seems highly contextual, and the issues appear fairly localized, reflecting the challenges experienced at particular academic levels.

What’s my opinion as a student and future educator?
As I previously stated, I’m a doctoral student with roughly 20 years of varied academic experience from which to formulate an opinion on tenure. When it comes to tenure at the lower levels of academia, I don’t believe it is as vital for preserving academic freedoms. Many schools, especially public ones, operate under strict guidelines from local and state governments. There is very little room for educators to discuss anything outside the material they are required to teach in order for their students (and schools) to meet academic standards. In this type of context, tenure would also exacerbate the issue of ineffective educators, since there is already a lack of content-based competition. However, I do believe tenure is necessary to protect more established educators who, after years of successful teaching, have earned a higher pay rate. With all the debt (state and national) and budget cuts to education, schools are under financial pressure to operate with ever-smaller amounts of funding; it seems only natural that administrators would seek to fire higher-paid, experienced educators in favor of those who, though less experienced, come at a lower cost.

Concerning tenure at the university and collegiate level, I believe it is absolutely necessary. Though students are expected to complete a specific list of courses catered to their academic major, the quality and type of content featured is heavily dependent upon the course instructor. Students may be required to learn about taboo or controversial topics that make them uncomfortable, and the course instructor should be able to teach this sort of material without fear of losing their job. This is especially true at the graduate school level, where professors are performing research and educating students on more focused and complex topics. Now, I fully believe that earning tenure should not bestow an educator with complete academic immunity; there needs to be a system of timely, cost-effective “checks and balances” to ensure the educator is still effectively teaching students and making academic contributions. What those look like exactly, I’m not really sure. Hashing out those details should be left to each academic entity and further localized to be college- or department-specific. When it comes to tenure and its application, it is my opinion that there is no “one-size-fits-all” solution. The effectiveness and necessity of tenure should be assessed contextually to ensure the principles of academic freedom are preserved and abuse of the tenure appointment is minimized.

The Crusade for Sound Science

In today’s world, we are more connected with our fellow man than at any other point in human history. The evolution of technology has yielded television, computers, the internet, smartphones, tablets, and many other nifty gadgets that allow us to communicate with one another. Whether it’s a phone call, a text, or a Facebook message, we are now capable of connecting with people on the opposite side of the planet within seconds. Through these technologies we can also access information about the world around us, including breaking news, weather, entertainment, government policies, and even scientific discoveries. Though technological advancements have vastly improved life as we know it, they are also creating a variety of new problems for modern society. It is my opinion that the greatest of these problems is how they hinder the promotion of sound science.

Before all this technology came about, research and academic knowledge were communicated via printed text in books, encyclopedias, academic journals, and newspapers, and even on chalkboards. In order to contribute content to these sources, one had to have certain credentials and demonstrate an adept understanding of what one wished to publish. This meant that those sharing said content were usually experts in a related field (or had received formal education or training on the subject) and, generally speaking, would provide reliable, accurate information based on sound scientific findings. Now that everyone has information (and the ability to share it) in the palm of their hands, pretty much anyone can share their thoughts and opinions on a topic. Though this can stimulate thought-provoking discussion, it also provides ample opportunity for the spread of “alternative facts” and pseudoscience.

As an academic, I see my colleagues and myself work diligently to make scientific discoveries and contributions that will advance our understanding of the world we humans live in. So it makes me incredibly upset to see the media and the general public toss sound science aside in favor of pseudoscience or trendy pop-culture ideals. Seriously, a simple scroll through my Facebook newsfeed almost always ends with me becoming irrationally angry. When did society:

  1. Stop trusting scientists and start believing any self-proclaimed “expert” who happens to be trending, despite their lack of credentials?
  2. Become “sheeple” and stop thinking for themselves?

Sometime during our technological evolution, the public withdrew its trust from the academically credentialed and placed it in the present-day “mommy blogger”, news reporter, and Hollywood celebrity. How we found ourselves in this backwards reality, I will never know. What I do know is that those of us in academia are challenged to win back the public’s trust.

Now I’m sure you’re asking yourself, “Just how do we go about slaying that beast of an issue?” The battle begins by training scientists to communicate in a relatable and understandable manner. If we want the public to hear us out, we need to get on their level and partake in conversations they understand and can contribute to. This means that we academicians need to become bilingual: we need to speak the language of science as well as the language of the public. It is imperative that we not only develop this skill, but ingrain it in the scientists of tomorrow. As with learning a second language, the sooner young minds begin learning a skill, the more adept they become at using it. The next step in slaying the beast that is the public’s misplaced trust is to engage the issue head-on. We are never going to win this war if we don’t step out from behind the safety of university walls to fight. When someone attempts to spread “alternative facts” or pseudoscience, that’s our cue to charge in and combat it with sound factual knowledge. The best way to do this is by making the information we are relaying relatable and transparent. We need to show the public that academia has nothing to hide, and that what we do is done with the best of intentions and ultimately for their benefit. By properly arming ourselves with effective communication skills and sound science, I believe academia can win the war for the public’s trust.

GMO (game-manifested outcomes) Learning

Recently, for the GEDI course, I read a piece titled What Video Games Have to Teach Us by James Paul Gee. In the piece, Gee discusses video games and how they can be a useful tool for education. As a recreational gamer myself, I completely agree that games can teach us many things. However, many people would disagree. This disagreement stems from numerous studies claiming that video games cause a variety of negative effects, including aggressive behavior, poor academic performance, and the promotion of poor values. Now, I’m not saying that video games are free of deleterious effects, but I do believe they can prove useful in an educational context.

When comparing video game-based education to traditional classroom education, the first noticeable difference is the level of engagement each evokes. While teaching in a classroom, the educator is focused on engaging the class as a whole rather than engaging individual students. This makes it easy for students to disconnect and fall behind. Video games engage the player individually, allowing the content to be disseminated in a manner catered to the individual. Additionally, traditional classes can be overwhelming: the vast amounts of information students receive, generally over extended periods of time, are difficult to absorb and retain. Video games provide small amounts of information in relevant stages. The information provided in video games is pertinent to the tasks at hand, whereas information gained in the classroom is often not applicable in the students’ lives.

Another benefit of video games is that they scale the content so that it’s appropriately challenging for the individual. For example, players just starting out are given challenges that, though difficult, can still be accomplished at their skill level. As the player’s skills develop, the challenges become more complex. Notably, many games involve what’s called a “boss level”, where players face off against a scaled challenge they must pass in order to progress. The “boss level” is meant to gauge how well a player’s skills have developed. If they have developed appropriately, the player is deemed worthy of moving on to increasingly difficult levels where their skills will be further developed. In a classroom setting, the content is doled out on a timeline; this can lead to students falling behind and never fully developing the skills needed to succeed in the class.

One of the most beneficial aspects video game education offers that classroom education does not is the opportunity to fail. If you fail in a video game, it’s not the end of the world. Yes, it is frustrating, but the game model allows the player to go back and attempt to succeed again. Since games are set in increments (i.e. levels), players don’t have to return to the very beginning of the game if they fail. Instead, they return to a checkpoint, or at worst the beginning of a level, to develop the essential skills needed to attempt leveling up once again. If you fail a class, the repercussions are quite detrimental to making timely progress. Failing a class means the student has to take it again, starting over from the beginning regardless of whether they simply failed to develop a skill taught toward the end of the course. This is incredibly frustrating for the student and can often deter them from retaking the class. Players who need to replay a video game level, by contrast, are not deterred; they even exhibit increased motivation to continue playing and developing their skills.

So, in my opinion, video games have a lot to offer education that traditional classroom teaching cannot. The development of education-based video games can provide opportunities for students to engage in learning more successfully than they otherwise would in a classroom setting. I encourage educators to assess these useful tools and attempt to apply them in their teaching practices.