Future of the University

If there is one thing I believe should change in higher education, it's the perception of the university as a gateway to a career instead of a place of academic enlightenment. Now, I realize this perception is a result of how higher education coevolved to meet the needs of contemporary culture. I also recognize how unlikely it is that this perception will ever change. Regardless, I don't see why I have to accept the present situation or give up hope that it will change for the better.

The history of higher education is distinct from that of other forms of education, with some universities among the oldest learning institutions in the world. The development of universities, and higher education more generally, over the course of the last millennium is closely tied to religion. Universities were established as organizations free from the direct control of the church or other religious institutions, a privilege usually granted by the king or state. This privilege allowed for the academic freedom to question, research, and advance knowledge. Those who attended universities were usually individuals from the upper social class who had substantial wealth and weren't required to devote all their time to laboring for money. They didn't need to gain knowledge to obtain a job, and if they did, they did so by apprenticing and gaining on-the-job experience. University attendees were motivated by a true passion to learn. Thus, people initially attended universities for the sole purpose of immersing themselves in knowledge.

Education is widely accepted to be a fundamental resource, both for individuals and societies. Indeed, in most countries basic education is nowadays perceived not only as a right, but also as a duty – governments are typically expected to ensure access to basic education, while citizens are often required by law to attain education up to a certain basic level. As society has evolved, the necessity for labor-based trade skills has shifted. Today’s society is technologically integrated and the ability to interact with and interpret information is now a crucial skill. Universities have become the “go-to” destination for learning how to operate in our technologically advanced world.

In the modern world, universities have two purposes: to equip students with advanced skills useful in the workplace, and to further human knowledge and understanding of the world. A decrease in employment opportunities (due to outsourcing and technological advancements eliminating the need for human workers) combined with population growth has yielded a highly competitive job market. Degrees in higher education have become a factor used to differentiate between job applicants and are seen as an indicator of applicant "quality". At first, this meant that a bachelor's degree ensured one better chances of becoming employed. As more and more people saw the need for a university-granted degree, the employment market became increasingly saturated with applicants holding bachelor's degrees. Now, a master's degree has become the standard for obtaining a "high profile" career.

Universities are now perceived as a place to obtain a piece of paper that (supposedly) guarantees gainful employment. Don't get me wrong, they are still a place of academic enlightenment. Faculty and students still engage in the free thought that drives the advancement of knowledge. However, most universities are more focused on pumping out degrees than on knowledge-advancing research. As we look to the future, universities will inevitably coevolve to meet the needs of the people. Education has been and will continue to be a valuable resource — one that is essential for the advancement of humankind. It is my earnest hope that universities will once again become a destination for the passionate pursuit of knowledge rather than a destination for a better chance at gainful employment.

Academia Gives Back

Have you ever looked at the mission statement of a university? More often than not, it will say something to the effect of the university providing service to the community. Take Virginia Tech's mission statement, for example:

“Virginia Polytechnic Institute and State University (Virginia Tech) is a public land-grant university serving the Commonwealth of Virginia, the nation, and the world community. The discovery and dissemination of new knowledge are central to its mission. Through its focus on teaching and learning, research and discovery, and outreach and engagement, the university creates, conveys, and applies knowledge to expand personal growth and opportunity, advance social and community development, foster economic competitiveness, and improve the quality of life.”

Not only is service mentioned in Virginia Tech's mission, but so is the university's commitment to "…advance social and community development, foster economic competitiveness, and improve the quality of life". For those of us living in the "bubble" that is campus culture, it can be difficult to see tangibly how the university delivers on its promise to serve. Thankfully, I stumbled across a phenomenal example of how Virginia Tech gives back to the surrounding community: Campus Kitchen.

What is the Campus Kitchen?

Virginia Tech's Campus Kitchen is a program run through VT Engage that recovers surplus food from VT's dining halls, repurposes it into meals, and delivers those meals to a local food bank. The Campus Kitchens Project is a national organization that encourages students to get involved in combating food waste and hunger. Students in collegiate chapters across the nation collect surplus food from on-campus dining halls and help transform it into healthy meals that are distributed to food-insecure individuals in the area. One in eight Virginians struggles with food insecurity, and there is a great need in our region for services that get food to those in need. In spring 2015, Virginia Tech was one of three schools that won a $5,000 grant to help start up a campus chapter. Volunteers have devoted over 2,500 hours to the CKVT since its launch in fall 2015. The program now recovers surplus food from three Virginia Tech dining halls six days a week, and 10,000+ pounds of recovered food and 400+ meals have been delivered to our community partner, Radford-Fairlawn Daily Bread.

Who is involved in the operations of VT’s Campus Kitchen, and what exactly do they do?

The great thing about Campus Kitchen is that anyone at Virginia Tech or in the surrounding community is welcome to volunteer their time to help with daily operations. There are a variety of ways one can get involved in VT's Campus Kitchen. For instance, the Kitchen needs weekly volunteers to pack, cook, deliver, and serve the food it repurposes. Additionally, if volunteers find they enjoy the work and would like a larger, more long-term role, they can commit to collecting food from the dining halls or become a delivery or shift leader.

If volunteering in VT's Campus Kitchen isn't your "cup of tea", there are plenty of other programs offered through VT Engage to get involved with. It seems that Virginia Tech really strives to deliver on its commitment to service and truly lives up to its motto ut prosim — That I May Serve.

Open Access: Foods — Open Access Food Science Journal

The "go-to" journal for all things food science is the Journal of Food Science (JFS). Unfortunately, this journal is not open access, and a paywall stands between valuable scientific knowledge and those who wish to access it. Thankfully, my membership in the Institute of Food Technologists (IFT; a professional organization for food scientists and industry professionals) provides me access to this journal. Now, my membership in IFT isn't free (annual student dues are around $50), but that fee costs significantly less than paying for access to JFS directly. If it weren't for my IFT membership (as well as Virginia Tech's libraries), I probably wouldn't be able to access any of the articles I need for my research. Thankfully, a Google search has yielded another source for peer-reviewed scientific research: an open-access food science journal.

The journal I stumbled across was Foods — Open Access Food Science Journal. Foods is an international, scientific, open access journal of food science published monthly online by MDPI (Multidisciplinary Digital Publishing Institute). This journal provides an advanced forum for studies related to all aspects of food research and publishes reviews, regular research papers, and short communications. Their goal is to "encourage scientists, researchers, and other food professionals to publish their experimental and theoretical results in as much detail as possible alongside sharing their knowledge with as many readers as possible". There are no length restrictions on their papers, which allows scientists to "put it [their research] all out there" for others to learn from. Some unique features Foods offers are:

  • manuscripts regarding research proposals and research ideas are particularly welcomed
  • electronic files or software containing the full details of calculations and experimental procedures, if unable to be published in the normal way, can be deposited as supplementary material
  • manuscripts communicating research projects financed with public funds to a broader audience are also accepted

Foods also provides the "Scope" (i.e., applicable areas) of the research it publishes, which includes:

  • food sciences and technology
  • food chemistry and physical properties
  • food engineering and production
  • food security and safety
  • food toxicology
  • sensory and food quality
  • food analysis
  • functional foods, food and health
  • food psychology
  • food and environment

Foods ensures its publications follow a code of ethics; specifically, it is a member of the Committee on Publication Ethics (COPE). Since MDPI publishes Foods, the journal abides by MDPI's ethics statement. Additionally, MDPI states that it verifies the originality of content submitted to its journals using iThenticate to check submissions against previous publications.

The only statement I noticed regarding their stance on open access was that articles published in Foods are Open-Access articles distributed under the terms and conditions of the Creative Commons Attribution License. MDPI then states it will insert the following note at the end of the published text:

© 2013 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/).

In my opinion, Foods seems like a great open-access journal for food science publications. They want to get research out there and accessible to everyone, yet still credit the authors and protect their rights. I will definitely be using this journal for my research-related needs, and would even consider submitting future manuscripts to them.

Tech & Innovation in Higher Ed

If I had to choose two (nice) words to describe present-day society (even though there are a multitude of words I could use), those words would be: technologically integrated. Technology has taken over our lives, and I'm certain most of us could not function if it somehow disappeared from existence. The world of higher education is no exception. Those of us in academia have adopted technology and evolved our research and, yes, even our teaching methods to incorporate it. The integration of technology in higher education has spawned an evolution in teaching methodology: Massive Open Online Courses (MOOCs).

MOOCs (e.g., Coursera, edX, Udacity) are courses available online that allow unlimited access and participation, provided one has access to the Internet. Besides their (almost) universal accessibility, a benefit of taking MOOCs is that they provide traditional course materials in addition to interactive user forums and support from the global online community. Sounds pretty perfect, right? Well, as it turns out, many teachers, universities, and even students take issue with MOOCs. In a quest to better understand the controversy, I discovered a paper in the International Journal of Communication by José van Dijck and Thomas Poell titled "Higher Education in a Networked World: European Responses to U.S. MOOCs".

Van Dijck and Poell begin by stating the objectives of their paper: to "analyze the dynamics underlying the development of MOOCs in the context of an emerging global social Web where these courses are implemented in systems of higher education that are often institutionally grounded and largely nationally based" and "to understand how emerging commercial online infrastructures affect public systems such as higher education". In the introduction, they describe how MOOCs are designed to function based on the same mechanisms that underpin the broader ecosystem of connective (social media) platforms: datafication, (algorithmic) selection, and commodification. Datafication is defined as the tendency to quantify all aspects of social interaction and turn them into code. These coded signals drive the algorithmic selection (based on the popularity of the course, student success in assessment modules, etc.) of how the course will be enhanced and will evolve over time to meet the needs of students. The last mechanism is commodification, which involves the transformation of objects, activities, and ideas into tradable commodities. The majority of the paper involves the authors' discussion of the pros and cons of how these three mechanisms impact MOOCs and their stakeholders (i.e., students, teachers, program developers & entrepreneurs, and Europe). I'd go into more detail and discussion on this, but it'd be difficult to do the 19-page paper justice in one blog post.
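
The paper itself contains no code, but here is a minimal toy sketch of what "datafication" feeding "algorithmic selection" can look like in practice. This is entirely my own illustration, not the authors'; the course names, fields, and weights are all invented.

    # Purely illustrative sketch, not from the paper: a toy example of
    # "datafication" (quantifying interaction) feeding "algorithmic selection"
    # (ranking courses by those numbers). All names, fields, and weights are invented.
    from dataclasses import dataclass

    @dataclass
    class Course:
        name: str
        enrollments: int        # datafication: popularity turned into a number
        completion_rate: float  # datafication: student success turned into a rate

    def score(course: Course) -> float:
        """Algorithmic selection: rank courses by a weighted mix of quantified signals."""
        return 0.7 * course.completion_rate + 0.3 * (course.enrollments / 100_000)

    catalog = [
        Course("Intro to Statistics", enrollments=80_000, completion_rate=0.12),
        Course("Food Chemistry", enrollments=15_000, completion_rate=0.35),
    ]

    # Whichever course the algorithm scores highest gets promoted and iterated on;
    # note how sheer popularity can outweigh how well students actually do.
    for course in sorted(catalog, key=score, reverse=True):
        print(f"{course.name}: {score(course):.3f}")

With these invented numbers, the more popular course outranks the one students actually complete more often, which hints at why the authors treat these mechanisms as double-edged.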

Next, the authors introduce the following questions: Is education a public good? How do MOOCs relate to the key public values sustaining systems of higher education? The authors note that the main disputes concerning MOOCs focus on the impact they have on the core values of education. Because the paper's discussion is focused on the U.S. and Europe, some interesting contrasts were presented. For instance, many U.S. universities have been, to some extent, privately or corporately funded, which is one reason why many of them embrace the for-profit (or at best nonprofit) high-tech apparatus (i.e., MOOCs) that renders them part of a global online market. In contrast, European taxpayers have traditionally supported a public system of higher education that has largely resisted commodification, deregulation, and globalization, and that is economically distinct from its U.S. counterparts. I found the discussion of these contrasting frameworks fascinating and rather thought-provoking. I wonder how systems of higher education in other countries (besides the U.S. and Europe) are approaching MOOCs?

The paper concludes by describing MOOCs as "an ideological battleground where a revamped definition of what public education means in a networked world is wagered and contested". In our technologically integrated society, there exists an ever-present battle over privacy, accessibility, inherent and protected freedoms, and the meaning of publicness. Higher education is one battleground where this struggle takes place. Governments and university administrators will eventually need to address the regulatory problems at stake here, as well as the potentially profound ideological shifts in the emerging global market for online learning. Even if online mass instruction never comes to replace traditional college education, MOOCs will inevitably have a substantial impact on how education is defined as a public good.

Ethics Case Study: Karen M. D’Souza

Thanks to evolution, we humans have developed an advanced cognitive ability that allows us to process both tangible and intangible information to make decisions. Tangible information (e.g., proven scientific facts or theories) provides us with a variety of solutions to the decisions we face every day. However, intangible information such as our emotions or ethical code ultimately guides us in selecting a final choice or action. Ethics (i.e., the moral principles that govern one's behavior) are crucial in making decisions that yield philosophically sound outcomes — the decisions we make need to be beneficial and not inflict unnecessary harm on ourselves, our fellow man, the Earth, etc. When there is a failure or refusal to employ ethics in decision making, we bypass a vital step in preventing deleterious outcomes. In scientific fields specifically, neglecting to conduct research ethically can lead to a variety of ill-fated consequences.

In 2016, the Department of Health and Human Services' Office of Research Integrity (ORI) found that Dr. Karen M. D'Souza, a former Research Professional Associate in the Department of Surgery at the University of Chicago, engaged in research misconduct in research supported by the National Heart, Lung, and Blood Institute (NHLBI), National Institutes of Health (NIH). The ORI discovered that falsified and/or fabricated data were included in one NIH grant (R01 HL107949-01), two publications (J Biol Chem. 285(18):13748-60, 2010 Apr 30; J Biol Chem. 286(17):15507-16, 2011 Apr 29), two posters (Gordon Conference 2006 poster: "Regulation of Myocardial β-Adrenergic Receptor Signaling By Protein Kinase C"; Huggins 2010 poster: "Gαq-mediated activation of GRK2 by mechanical stretch in cardiac myocytes; the role of protein kinase C"), and one presentation (Cardiac Research Day 2009 presentation: "Regulation of G protein-coupled receptor signaling by mechanical stretch in cardiac myocytes"). It was found that Dr. D'Souza reused and falsely relabeled and/or falsely spliced Western blot images, falsified the related densitometry measurements based on the falsified Western blots, and falsified and/or fabricated data for experiments that were not performed or that came from unrelated experiments.

When Dr. D'Souza made the decision to falsify and/or fabricate her data, she exhibited a blatant disregard for conducting research ethically. What motivates a researcher to make an unethical decision like Dr. D'Souza's? Oftentimes it is the ever-present pressure on researchers to "publish or perish" — the notion that the success of a researcher's career directly increases with the number of manuscripts they have published in peer-reviewed journals. Time is another factor that can sway a researcher to abandon their ethical code. The grants that fund the majority of present-day research always include a timeline by which all related experiments must be completed and results obtained. If researchers fail to complete experimentation by the deadline set forth in the grant proposal, it is highly unlikely the funding organization will choose to fund any of their future research endeavors. Similarly, some funding organizations want the research they fund to provide specific results or conclusions that support the organization's own agenda. Researchers will falsify and/or fabricate data in the hope that, by pleasing the funding organization, they will be granted more funding for future research.

In Dr. D'Souza's case, we may never know precisely what motivated her to report unethical research. However, we can speculate about the potential outcomes of her misconduct. Since she was performing medical research on cardiac function, we can assume her published "findings" might be used to inform the clinical treatment of patients with heart-related issues. Because she falsified and/or fabricated her findings, there is no way to know whether they are valid. If used to treat patients in clinical trials, her unethical results could prove ineffective or, worse, fatal. Additionally, Dr. D'Souza's gross unethical conduct will certainly yield a multitude of undesirable consequences. For instance, she has already received (and voluntarily agreed to accept) several administrative actions (i.e., punishments) directly related to her research misconduct. The ultimate and, in my opinion, least favorable consequence of Dr. D'Souza's misconduct is that she has forever tarnished her integrity as a researcher, surgeon, academic, and human being.

My final thought on the case of Dr. D’Souza is this: take the time to seriously evaluate the consequences of your actions. There is no personal or professional triumph worth sacrificing your integrity for.

Tenure or No Tenure: A Ph.D. Student’s Opinion

The first time I ever heard the term tenure was back in elementary school, when I overheard a couple of my teachers discussing how close they were to earning it. At the time, I didn't know what the word meant, nor did I care. As I progressed through academia in pursuit of higher education, I continued to encounter tenure in a variety of contexts. Most notably, I recall it coming up during a discussion between my classmates and me about a rather awful professor who taught our freshman-level chemistry course. One of my classmates mentioned that, no matter how much we complained about our professor's inability to teach, she could not be fired because she was "tenured". It was at this point I asked myself: what exactly is tenure, how did it come about, and why does it seem so prevalent in academia? Now, as a doctoral student with close to 20 years of academic experience, who is preparing for a career as a collegiate professor, I'm going to present my understanding of and opinions on tenure.

What is tenure and how did it come to exist?
Tenure is an esteemed and privileged appointment one can obtain working in academia that lasts until retirement. Tenure exists because of the principle of academic freedom, which is based on the conviction that scholars should have the freedom to teach, research, or communicate facts and ideas without being targeted for repression, job loss, or imprisonment. Those who support academic freedom believe student and faculty inquiry is essential to the mission of educational entities such as schools and universities. Historically, there have been incidents where scholars communicated ideas or facts that were inconvenient to external entities (i.e., political or societal authorities) and were thus persecuted or even sentenced to death (e.g., under Lysenkoism in Soviet Russia). To protect scholars, many countries have recognized and adopted tenure, allowing scholars to express their opinions freely without fear of institutional censorship or discipline.

Why is there controversy and debate surrounding tenure?
Presently, tenure is a hot topic of discussion for academics and non-academics alike. Popular news-media outlets, including National Public Radio (NPR), U.S. News & World Report, and The New York Times, continually publish articles discussing tenure and the effect it has on educators and the educational system. Whether faculty, student, parent, or even politician, it seems everyone has an opinion on the necessity of tenure. Below is a brief (and thus not entirely inclusive) summary of the pros and cons expressed for tenure.

Pros:

  • Protects scholars and educators from being fired for personal, political, or other non-academic related reasons.
  • Prevents academic entities from firing experienced, higher-paid educators to replace them with less experienced, lower-paid educators for financial gain.
  • Protects scholars who research and/or teach unpopular, controversial, or otherwise disputed topics.
  • Allows educators to advocate on behalf of students and express disagreement with school or university administration.
  • Promotes scholarly performance and entices people to engage in education-based careers by rewarding educators for years of successful teaching and contributions to academia.
  • Encourages the careful selection of qualified and effective educators.

Cons:

  • Creates academic complacency that results in poor teaching performance and very few, if any, contributions to academia.
  • Makes it difficult to remove under-performing educators due to a lengthy, complicated, and costly "tenure revocation" process.
  • Promotes seniority as the main factor in evaluating an educator's performance instead of the quality of their work.
  • Is arguably obsolete, since modern-day laws and court rulings can better serve the same protective purpose.
  • Is not necessarily applicable at all levels of academia (i.e., kindergarten, elementary, middle, and high school) due to government-implemented standards of education.
  • Fails to promote the education of the student and focuses solely on promoting the educator's career.

Though more points could be added to either list, both sides of the argument are backed by some very solid reasoning. To me, the debate seems highly contextual, and the issues appear to be fairly localized, reflecting the challenges experienced at particular academic levels.

What’s my opinion as a student and future educator?
As I previously stated, I'm a doctoral student with roughly 20 years of various academic experiences under my belt from which to formulate an opinion on tenure. When it comes to tenure at the lower levels of academia, I don't believe it is as vital for preserving academic freedoms. Many schools, especially public ones, operate under strict guidelines from local and state governments. There is very little room for educators to discuss anything outside of the material they are required to teach in order for their students (and schools) to meet academic standards. In this type of context, it seems that tenure would also exacerbate the issue of ineffective educators, since there is already a lack of content-based competition. However, I do believe tenure is necessary to protect more established educators who, after years of successful teaching, have earned a higher pay rate. With all the debt (state and national) and budget cuts to education, schools are under financial pressure to operate with smaller and smaller amounts of funding — it seems only natural that administrators would seek to fire higher-paid, experienced educators in favor of those who, though less experienced, come at a lower cost.

Concerning tenure at the university and collegiate level, I believe it is absolutely necessary. Though students are expected to complete a specific list of courses catered to their academic major, the quality and type of content featured is heavily dependent upon the course instructor. Students may be required to learn about taboo or controversial topics that make them uncomfortable, and the course instructor should be able to teach this sort of material without fear of losing their job. This is especially true at the graduate level, where professors are performing research and educating students on more focused and complex topics. Now, I fully believe that earning tenure should not bestow an educator with complete academic immunity — there needs to be a system of timely, cost-effective "checks and balances" to ensure the educator is still effectively teaching students and making academic contributions. What those look like exactly, I'm not really sure. Hashing out those details should be left to each academic entity and further localized to be college- or department-specific. When it comes to tenure and its application, it is my opinion that there is no "one-size-fits-all" solution. The effectiveness and necessity of tenure should be assessed contextually to ensure the principles of academic freedom are upheld and abuse of the tenure appointment is minimized.

The Crusade for Sound Science

In today's world, we are more connected with our fellow man than at any other point in human history. The evolution of technology has yielded television, computers, the internet, smartphones, tablets, and many other nifty gadgets that allow us to communicate with one another. Whether it's a phone call, a text, or a Facebook message, we are now capable of connecting with people on the opposite side of the planet within seconds. Through these technologies we can also access information about the world around us, including breaking news, weather, entertainment, government policies, and even scientific discoveries. Though technological advancements have vastly improved life as we know it, they are also creating a variety of new problems for modern society. It is my opinion that the greatest of these problems is how they hinder the promotion of sound science.

Before all this technology came about, research and academic knowledge were communicated via printed text in books, encyclopedias, academic journals, newspapers, and even on chalkboards. In order to contribute content to these sources, one had to have certain credentials and demonstrate an adept understanding of what one wished to publish. This meant that those sharing said content were usually experts in a related field (or had received formal education or training on the subject) and, generally speaking, would provide reliable, accurate information based on sound scientific findings. Now that everyone has information (and the ability to share it) in the palm of their hands, pretty much anyone can share their thoughts and opinions on a topic. Though this can stimulate thought-provoking discussion, it also provides ample opportunity for the spread of "alternative facts" and pseudoscience.

As an academic, I see my colleagues and myself work diligently to make scientific discoveries and contributions that will advance our understanding of the world we live in. So, it makes me incredibly upset to see the media and the general public toss sound science aside in favor of pseudoscience or trendy pop-culture ideals. Seriously, a simple scroll through my Facebook newsfeed almost always ends with me becoming irrationally angry. When did society:

  1. Stop trusting scientists and start believing any self-proclaimed “experts” who happen to be trending, despite their lack of credentials?
  2. Become "sheeple" and stop thinking for themselves?

Sometime during our technological evolution, the public withdrew their trust from the academically-credentialed and placed it in the present-day “mommy blogger”, news reporter, and Hollywood celebrity. How we found ourselves in this backwards reality, I will never know. What I do know is that those of us in academia are challenged to win back the public’s trust.

Now I'm sure you're asking yourself, "Just how do we go about slaying that beast of an issue?" The battle begins by training scientists to communicate in a relatable and understandable manner. If we want the public to hear us out, we need to get on their level and partake in conversations they understand and can contribute to. This means that we academicians need to become bilingual — we need to speak the language of science as well as the language of the public. It is imperative that we not only develop this skill, but ingrain it in the scientists of tomorrow. As with learning a second language, the sooner young minds begin learning a skill, the more adept they become at using it. The next step in slaying the beast that is the public's misplaced trust is to engage the issue head-on. We are never going to win this war if we don't step out from behind the safety of university walls to fight. When someone attempts to spread "alternative facts" or pseudoscience, that's our cue to charge in and combat it with sound factual knowledge. The best way to do this is by making the information we are relaying relatable and transparent. We need to show the public that academia has nothing to hide, and that what we do is done with the best of intentions and ultimately for their benefit. By properly arming ourselves with effective communication skills and sound science, I believe academia can win the war for the public's trust.

GMO (game-manifested outcomes) Learning

Recently, for the GEDI course, I read a piece titled What Video Games Have to Teach Us by James Paul Gee. In the piece, Gee discusses video games and how they can be a useful tool for education. As a recreational gamer myself, I completely agree that games can teach us many things. However, many people would disagree. This disagreement stems from numerous studies claiming that video games cause a variety of negative effects, including aggressive behavior, poor academic performance, and the promotion of poor values. Now, I'm not saying that video games are free of deleterious effects, but I do believe they can prove useful in an educational context.

When comparing video game-based education to traditional classroom education, the first noticeable difference is the level of engagement each evokes. While teaching in a classroom, the educator is focused on engaging the class as a whole rather than engaging individual students. This makes it easy for students to disconnect and fall behind. Video games engage the player individually, allowing the content to be disseminated in a manner catered to the individual. Additionally, traditional classes can be overwhelming: the vast amounts of information students receive, generally over extended periods of time, are difficult to absorb and retain. Video games provide small amounts of information in relevant stages. The information provided in video games is pertinent to the tasks at hand, whereas information gained in the classroom is oftentimes not applicable in students' lives.

Another benefit of video games is that they scale the content so that it's appropriately challenging for the individual. For example, players just starting out are given challenges that, though difficult, can still be accomplished at their skill level. As the player's skills develop, the challenges become more complex. Notably, many games involve what's called a "boss level", where players face off against a scaled challenge they must pass in order to progress. The "boss level" is meant to gauge how well a player's skills have developed. If they have developed appropriately, the player is deemed worthy of moving on to increasingly difficult levels where their skills will be further developed. In a classroom setting, content is doled out based on a timeline, which leads to students falling behind and never fully developing the skills needed to succeed in the class.
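
For a concrete picture of this kind of scaling, here is a minimal toy sketch. It is my own illustration, not drawn from Gee's book or any actual game, and every name and number in it (the boss interval, the skill growth rate) is invented.

    # Purely illustrative sketch: a toy loop in which level difficulty scales
    # with progress, a periodic "boss level" gauges skill, and failure just
    # means retrying from a checkpoint while skill keeps developing.
    BOSS_INTERVAL = 5  # every 5th level is a "boss level" skill check

    skill = 1.0
    level = 1
    for attempt in range(1, 13):
        is_boss = level % BOSS_INTERVAL == 0
        difficulty = level * (1.5 if is_boss else 1.0)  # bosses are a harder gauge
        if skill >= difficulty:
            print(f"level {level}{' (boss)' if is_boss else ''}: passed")
            level += 1  # progress to a more complex challenge
        else:
            print(f"level {level}: failed, retrying from checkpoint")
        skill += 0.6  # every attempt, pass or fail, develops the player's skill

Running this, the player breezes through early levels, stalls at the boss for a few attempts, and then passes it, because each "failure" still builds skill rather than sending the player back to the start.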

One of the most beneficial aspects video game education offers that classroom education does not is the opportunity to fail. If you fail in a video game, it's not the end of the world. Yes, it is frustrating, but the game model allows the player to go back and attempt to succeed again. Since games are set up in increments (i.e., levels), players don't have to return to the very beginning of the game if they fail. Instead, they return to a checkpoint or, in the worst case, the beginning of a level, to develop the essential skills needed to attempt leveling up once again. If you fail in a class, the repercussions are quite detrimental to making timely progress. Failing a class means the student has to take it again, starting over from the beginning regardless of whether they simply failed to develop a skill taught toward the end of the class. This is incredibly frustrating for the student and can often deter them from retaking the class. Alternatively, players who need to replay a video game level are not deterred; they even exhibit increased motivation to continue playing and developing their skills.

So, in my opinion, video games have a lot to offer in the way of education that traditional classroom teaching cannot. The development of education-based video games can provide opportunities for students to engage in learning more successfully than they otherwise would in a classroom setting. I encourage educators to assess these useful tools and attempt to apply them in their teaching practices.

Academic Responsibility: Who's to Blame?

What exactly is academic responsibility and who does it pertain to? 

Every night when I get home from campus, I call my boyfriend (who lives in Miami, FL) to talk about how our respective days went. Now, on Monday and Wednesday nights, this call occurs around 9 pm, when I get out of either my "Contemporary Pedagogy" (GEDI) or "Future Professoriate" class. Naturally, on those nights my side of the conversation always drifts toward the discussions that took place in class. On one particular night, while working on a blog post for the GEDI course, I started asking him what he thought about academic responsibility.

Unlike me, he is an undergraduate student (studying Sports Medicine at Florida International University), so I was curious to hear how his opinion compared to my own as a graduate student. In his opinion, academic responsibility is "blown out of proportion" because "teachers are supposed to teach material and students are supposed to learn it. It's as simple as that". Is academic responsibility really so simple?

Is the responsibility of an educator solely to disseminate information? Is it their job to ensure students are learning? Most of my elementary and high school classes involved teachers guiding us through material in our textbooks, writing key information on the chalkboard, asking us lots of questions we would raise our hands to answer, and administering homework and projects to ingrain the knowledge being taught into our brains. My university-level courses were similar, but involved far fewer assignments, more lecture, and a ton of PowerPoint presentations. At all these levels of education, knowledge was being disseminated. However, the level of effort elementary-level educators put into ensuring learning is vastly different from that of university educators. To me, this makes perfect sense: young children need more guidance and assistance while developing academic proficiency. Yet day after day I hear undergraduates complain that their professor "doesn't do enough to help them learn". When does the responsibility for learning transfer from the educator to the student? Is it ever shared? In my opinion, the academic responsibility of learning should fall directly on the student, especially when that student is an adult in college. Yes, the educator is responsible, but only for disseminating the appropriate knowledge. It's up to the student to absorb it and commit it to long-term memory.

What about academic institutions? What is their responsibility in all this? If you ask today's generation of undergraduates about their grades, most will either

A) Attribute good grades to the personal blood, sweat, and tears they put into succeeding in the class.

or

B) Blame bad grades on their professor…and then on the academic institution for hiring such an unqualified person in the first place.

Are bad grades the fault of the educator or of the university that hired them? To be honest, grades are awful. Sometimes they motivate students, and sometimes they discourage them. Personally, I feel the responsibility for grades does not fall directly on the academic institution. If all students are doing terribly in a particular professor's course, then yes, it is the institution's responsibility to evaluate the efficacy of that professor's teaching practices. Ultimately, I think students (especially those in university studies) and educators share the responsibility for grades. The grading system, like most scales, is merely a tool to assess where an individual is "at" on a particular spectrum. Similar to the pain scales used in hospitals, grading can be useful in determining how well a student is learning. Unlike a hammer, these tools [scales] are meant to assess rather than fix. It is the student's responsibility to recognize that bad grades indicate they need to do a better job of learning and, if necessary, seek additional help. So… what's the educator's responsibility? Well, every tool is only as good as the quality of the materials it is made of. If a tool is poorly constructed, how can we expect it to do its job properly? If the grading scale is a tool, then homework, quizzes, and tests are the components it's made of. It is the responsibility of the educator to properly design their class so it consists of components that effectively represent the quality of learning that should be occurring.
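
To push the tool metaphor one step further: a grading scale is, at bottom, a weighted average of the components it is built from. Here is a minimal hypothetical sketch; the component names, weights, and scores are all invented.

    # Hypothetical illustration: a grading scale as a weighted average of the
    # components it is built from. All names, weights, and scores are invented.
    components = {
        "homework": (0.30, 85),  # (weight, score out of 100)
        "quizzes":  (0.20, 78),
        "tests":    (0.50, 91),
    }

    # A well-constructed "tool" has weights that sum to 1.0; if they don't,
    # the scale is poorly built and can't be expected to do its job properly.
    assert abs(sum(weight for weight, _ in components.values()) - 1.0) < 1e-9

    final_grade = sum(weight * score for weight, score in components.values())
    print(f"final grade: {final_grade:.1f}")  # 86.6 with the numbers above

In this picture, a poorly constructed tool is one whose weights don't reflect the learning that actually matters in the course, which is exactly the design responsibility that falls on the educator.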

Overall, I believe academic responsibility is shared between academic institutions, educators, and students. However, it is very complex and continually evolves along with the academic proficiency of the student. By the time that student is enrolling in university-level courses, they need to take responsibility for their own academic success.

Mission Statements: Just a Mantra?

As a doctoral student who has only ever attended public universities, I have always been curious what my experiences would have been like had I attended a private university. So, when deciding on mission statements to look up and reflect on, I decided to investigate statements from both a private and public university. I figured the two institutions would provide valuable insights into what public vs. private universities seek to provide their students and communities.

The private university I chose was Cornell University in Ithaca, New York. Their mission statement is as follows:

“Cornell is a private, Ivy League university and the land-grant university for New York State. Cornell’s mission is to discover, preserve, and disseminate knowledge; produce creative work; and promote a culture of broad inquiry throughout and beyond the Cornell community. Cornell also aims, through public service, to enhance the lives and livelihoods of our students, the people of New York, and others around the world.”

The first thing I found interesting about this statement was the fact that Cornell is both a private and a land-grant university. From what I thought I understood about land-grant institutions, I was under the impression they were exclusively public universities, since they received money from the state. However, a little internet research revealed that a land-grant institution is one that received benefits from the Morrill Acts of 1862 and 1890. The Morrill Acts funded educational institutions by granting federally controlled land to the states to sell in order to raise funds to establish and endow "land-grant" colleges. The mission of these land-grant colleges was to focus on the teaching of practical agriculture, science, military science, and engineering (without excluding "classical studies"). The remainder of Cornell's mission statement reflects that of any land-grant university by promising to provide education and disseminate knowledge. I liked the additional statement about Cornell providing public service aimed at enhancing the lives of the people of New York and around the world.

The public university I chose was my alma mater, Kansas State University (KSU), located in Manhattan, Kansas. Their mission statement is as follows:

The mission of Kansas State University is to foster excellent teaching, research, and service that develop a highly skilled and educated citizenry necessary to advancing the well-being of Kansas, the nation, and the international community. The university embraces diversity, encourages engagement and is committed to the discovery of knowledge, the education of undergraduate and graduate students, and improvement in the quality of life and standard of living of those we serve.

Though it isn't explicitly stated in their mission statement, KSU is also a land-grant university. Their promises to foster teaching, research, and service all make good on their role as a land-grant college. Like Cornell, they promise to advance the well-being of their state as well as the world. A significant difference between KSU's mission statement and Cornell's is that KSU mentions embracing diversity, whereas Cornell mentions that it is a "private, Ivy League" institution. Just from their mission statements, I would gather that KSU is a more inclusive institution for students and staff.

Overall, the mission statements of these institutions share several similarities. I believe this may be due to their "land-grant" nature, and I would be interested to see whether other, non-land-grant universities (both private and public) reflect the commonalities in KSU's and Cornell's missions. However, I find it incredibly interesting that one university includes diversity, and by extension inclusivity, in its mission while the other does not. I guess my final thought on the subject would be: I wonder how the promises within these missions manifest in the learning environments of their respective universities?