Analytic Futures

Futures Scenario: An Office of Academic Assessment Report from 2020

 

Our understanding of meaningful measures of learning transformed rapidly in the years following the Learning Technology Task Force work conducted in early 2011. That work led to advances in pedagogy and an expansion of university-wide support for teaching. The university’s recommitment to teaching naturally led us to invent new assessment practices. A new class of assessment tools and practices now allows us to view evidence of learning across students, courses, programs, departments, and colleges.

Student learning takes place in multiple contexts, including but not limited to the traditional classroom, and extends to myriad high-impact practices beyond it (e.g., undergraduate research, internships, and capstones). External stakeholders (i.e., employers, governments, and society at large) demand more sophisticated evidence of competency from our students.

Assessment of student learning, therefore, must be creative and nimble, transparent and rigorous—a one-shot multiple-choice test is no longer sufficient. Artifacts of student performance, grounded in the disciplines as well as across and among them, provide robust evidence of student learning; evidence that we can organize, analyze, quantify, and make meaning of with such tools as the Association of American Colleges and Universities’ (AAC&U) Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics (http://www.aacu.org/value/) and Lumina’s Degree Qualifications Profile (http://www.luminafoundation.org/publications/The_Degree_Qualifications_Profile.pdf). That data is integrated into learning analytics and data visualization systems, allowing us to critically examine the curricular and programmatic junctures where students develop the patterns of behavior that reflect essential learning outcomes in and beyond academics.

How did we get to this point?

We got to this point by harnessing the world-class talents of Virginia Tech researchers and faculty visionaries. An interdisciplinary team consisting of academics and administrators from across the university came together to explore the frontiers of student learning and to develop new methods of direct assessment. Interactive digital archives, for example, were tied to social networking capabilities that allow peers and faculty to collaboratively assess student learning.

The group also encouraged the university to capitalize on the emergence of learning analytics and a movement toward course-embedded assessments. In doing so, the university is now able to tap into course-specific assignments for assessment purposes by providing the means to gather and aggregate samples of student work that reflect a particular learning outcome. These samples come from different class assignments but reflect the same learning outcomes, so the same data serves multiple purposes: in this example, both course grades and program outcomes.
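A minimal sketch of how such dual-purpose samples might be pooled appears below. The assignment records, outcome tags, and rubric scale are hypothetical illustrations, not the university’s actual data model; the point is simply that artifacts graded once in a course can be re-aggregated by outcome for program assessment.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical samples of student work: each record already carries a course
# grade component and a tag linking it to a program-level learning outcome.
samples = [
    {"student": "A", "course": "ENGL 1106", "assignment": "research essay",
     "outcome": "written communication", "rubric_score": 3.2},
    {"student": "B", "course": "HIST 2204", "assignment": "primary-source analysis",
     "outcome": "written communication", "rubric_score": 2.7},
    {"student": "A", "course": "STAT 3005", "assignment": "lab report",
     "outcome": "quantitative literacy", "rubric_score": 3.8},
]

# Aggregate the same artifacts a second time, now by program outcome rather
# than by course, so one data set serves both grading and program assessment.
by_outcome = defaultdict(list)
for s in samples:
    by_outcome[s["outcome"]].append(s["rubric_score"])

for outcome, scores in by_outcome.items():
    print(f"{outcome}: n={len(scores)}, mean rubric score={mean(scores):.2f}")
```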

The group also created a forward-looking data mining system based on research in the sciences, engineering, social sciences and humanities. Virginia Tech leveraged its talents in such areas as engineering to design systems and adopt emerging technologies, education to understand the student experience, and psychology to comprehend the human mind. It also capitalized on forward-thinking practitioners in administrative roles who work daily to conceptualize, collect, and take action based upon assessment data.

Together, this team designed and constructed a new form of business and learning analytics that is the envy of higher education throughout the world. Membership in this group was fluid, allowing contributions as needed. Because of the nature of the group and the work, several prestigious research grants were awarded to individuals conducting ground-breaking research as well as in support of the totality of the project.

Perhaps more important than how we assess is what we assess, which has shifted from one-dimensional, discipline-specific “facts” to a more robust and dynamic set of interdisciplinary learning outcomes. Assessment in 2020 has converged around the idea that deep learning can be measured. Starting with the AAC&U’s Liberal Education and America’s Promise (LEAP) initiative, Virginia Tech has moved to the forefront of essential learning outcomes and high-impact educational practices. Guided by this initiative’s VALUE design, academic programs created authentic assessment practices that moved us away from standardized tests to rubrics designed by leading experts. These rubrics focused university teaching and learning efforts—particularly in our Curriculum for Liberal Education (CLE)—on the areas of Knowledge of Human Cultures and the Physical and Natural World, Intellectual and Practical Skills, Personal and Social Responsibility, and Integrative and Applied Learning.

Adopting this approach to teaching and learning has not been without challenges. As we moved forward, we realized that the use of letter grades, a remnant of the industrial model, had served a practical purpose. Simply put, the high-volume measurement of deep learning proved difficult. If we were going to assess deep learning based on the LEAP initiative, we found it necessary to answer important questions such as: How can we create authentic assessment at a large scale, so that large-enrollment disciplines can leverage the approach? Electronic portfolios, for example, were, and still are, a popular approach. However, we needed to develop technology that could assess 10,000 portfolios against competencies in a given major. While we were hopeful that computing power in 2020 would be entirely up to the challenge, we directed long-term efforts to develop an assessment engine to harness that power and largely automate assessment through the use of text-mining technology. To be clear, human interaction remains a fundamental component of assessment at Virginia Tech, and we have embedded a philosophy of “high tech, high touch” in our model of teaching and learning practices, but we have leveraged the computational power to allow deep learning, and the assessment of that learning, to occur at a large scale.
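The assessment engine is described only at a high level here; as a purely illustrative sketch of the text-mining idea, the snippet below scores portfolio text against competency descriptions by lexical (TF-IDF) similarity. The competency wording, portfolio excerpts, and review threshold are all invented, and a real engine would pair such signals with human judgment in keeping with “high tech, high touch.”

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical competency statements, drawn loosely from LEAP-style outcomes.
competencies = {
    "critical thinking": "analyze evidence, evaluate assumptions, draw reasoned conclusions",
    "ethical reasoning": "recognize ethical issues, weigh stakeholder perspectives, justify decisions",
}

# Stand-ins for portfolio text; in practice this would be thousands of documents.
portfolios = {
    "student_001": "In this project I evaluated competing evidence and questioned my own assumptions.",
    "student_002": "Our team weighed the perspectives of each stakeholder before justifying the decision.",
}

# Fit one vocabulary over competencies and portfolios, then compare them.
vectorizer = TfidfVectorizer(stop_words="english")
docs = list(competencies.values()) + list(portfolios.values())
matrix = vectorizer.fit_transform(docs)
comp_vecs = matrix[: len(competencies)]
port_vecs = matrix[len(competencies):]

similarity = cosine_similarity(port_vecs, comp_vecs)
for i, student in enumerate(portfolios):
    for j, comp in enumerate(competencies):
        # Invented threshold: low similarity routes the artifact to human review.
        flag = "route to human review" if similarity[i, j] < 0.15 else "evidence found"
        print(f"{student} vs {comp}: {similarity[i, j]:.2f} ({flag})")
```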

Learning Analytics

Learning Analytics (LA) was well documented in the 2011 Horizon Report (http://www.nmc.org/pdf/2011-Horizon-Report.pdf). In that report, LA was described as showing promise “to harness the power of advances in data mining, interpretation, and modeling to improve understandings of teaching and learning, and to tailor education to individual students more effectively.” Given the wealth of information and data collected and stored at the university—in the library, dining halls, Learning Management System (LMS), fitness center, computer labs, residence halls, student organizations, and more—a richer set of data is married to direct assessment of student learning and is available to multiple stakeholders (e.g., students, advisors, and accrediting bodies), not just key administrators and faculty. Now, dynamic, whole-person evaluations, aggregated across students, are the benchmarks of the university, not static standardized scores. These whole-person evaluations include performance on important competencies such as resolving ethical dilemmas and interacting with others. Importantly, learning analytics has minimized the difference between accountability data and improvement data because measures are now more transparent to more constituencies and the data is provided in real time. Just as the WEAVE system is not assessment itself but a data management tool, so too learning analytics is a tool in service of assessment, though one with far more robust analytic power. Because of the university’s commitment to “Invent the Future,” it quickly organized its resources to become a worldwide leader in learning and business analytics.
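A minimal sketch of the join behind such a whole-person view is shown below, assuming hypothetical extracts from the LMS, the library, and a rubric-score store keyed by a common student ID. The table and column names are illustrative only, not the university’s actual schemas.

```python
import pandas as pd

# Hypothetical extracts from separate campus systems, keyed by a student ID.
lms = pd.DataFrame({"student_id": [1, 2], "lms_logins_per_week": [14, 3]})
library = pd.DataFrame({"student_id": [1, 2], "library_visits": [6, 1]})
rubrics = pd.DataFrame({"student_id": [1, 2], "ethical_reasoning_score": [3.5, 2.1]})

# Marry indirect activity data with direct assessment of learning into one
# whole-person record that different stakeholders can view at their own level.
whole_person = (
    lms.merge(library, on="student_id")
       .merge(rubrics, on="student_id")
)
print(whole_person)
```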

Data Visualization

Our data visualization techniques have become comprehensive because we have broken down the barrier between the quantitative and qualitative paradigms and now exclusively use practical, streamlined applications of what were previously called mixed methods. In 2011, we were already capable of visualizing faculty productivity data through our participation with Academic Analytics. This system provided Virginia Tech with the capacity to better understand productivity in graduate programs and allowed us to make comparisons across departments and universities. The system was endorsed by leading universities and several prominent organizations, including the Council of Graduate Schools and the Association of American Universities. The following shows a visualization of data from a department at Virginia Tech. It provides a graphic representation of aggregated faculty work compared to a peer-referenced norm and aids in the assessment of areas of strength and weakness for the purpose of future action.

Image produced in Virginia Tech’s portal to Academic Analytics.
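As a rough illustration of how a peer-referenced comparison like the one pictured might be computed, the sketch below expresses invented departmental metrics as z-scores against a hypothetical peer mean and standard deviation. It is not the Academic Analytics methodology, only the general idea of placing strengths and weaknesses on a common scale.

```python
import pandas as pd

# Hypothetical aggregated productivity metrics for one department alongside
# the mean and standard deviation of a peer comparison group.
metrics = pd.DataFrame({
    "metric": ["journal articles", "citations", "grant dollars", "awards"],
    "department": [42, 310, 1.8e6, 3],
    "peer_mean": [38, 420, 2.2e6, 2],
    "peer_sd": [9, 150, 0.9e6, 1.5],
})

# Express each metric as a peer-referenced z-score so strengths and
# weaknesses stand out on a common scale, much like a radar-style graphic.
metrics["z_score"] = (metrics["department"] - metrics["peer_mean"]) / metrics["peer_sd"]
print(metrics[["metric", "z_score"]].round(2))
```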

In the early days, the university capitalized on emerging technologies and its own strengths in engineering, geography, and the computing sciences to create geographic information systems. The group, for example, created maps of academic department space utilization and worked closely with campus police to develop incident maps of reported crime. Both gave decision-makers the ability to overlay data on a campus map and see information in new ways.

Image of academic space utilization provided by Virginia Tech Geospatial Information Sciences.

These early experiments in visualization provided a gateway to more robust uses of data. Now, in 2020, an academic advisor sees something like this, where each circle represents a set of selected variables for each student:

Image from Chartball.com.

The advisor can easily spot outliers and focus attention on at-risk students, or use the visual as a reference point in a conversation with a student. The graphic can be animated, much as is done with data in Google’s Public Data Explorer (http://www.google.com/publicdata/home).
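A minimal sketch of how such an at-risk flag might be derived from a handful of selected variables follows. The variables, values, and cohort-relative cutoffs are hypothetical, and any production rule would be far more carefully validated before guiding advising conversations.

```python
import pandas as pd

# Hypothetical advisor view: one row per student, a few selected variables.
students = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "gpa": [3.4, 2.1, 3.8, 2.9],
    "credits_behind_plan": [0, 9, 0, 3],
    "lms_inactivity_days": [2, 21, 1, 6],
})

# Standardize each variable against the cohort, then flag students who sit
# far from their peers in the unfavorable direction on any variable.
numeric = students.drop(columns="student_id")
z = (numeric - numeric.mean()) / numeric.std()
students["at_risk"] = (
    (z["gpa"] < -1.0)                    # well below the cohort on GPA
    | (z["credits_behind_plan"] > 1.0)   # far behind the cohort on progress
    | (z["lms_inactivity_days"] > 1.0)   # far more inactive than the cohort
)
print(students[["student_id", "at_risk"]])
```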

Similarly, a student can pull up an entire year of his data on a mobile device and observe trends. He has the ability to benchmark against other students in his program, college, university, nation, and perhaps one day across international borders. He is even able to project forward by simply clicking on a series of courses or co-curricular activities. What would he look like if he chose to take one class over another? Would it delay graduation? Could he anticipate additional financial implications?

Image from Chartball.com.

In addition to university data, external data can be imported for more informed decision-making. For example, Bureau of Labor Statistics data show the student projected employment figures in his field. Will jobs be available after graduation? Data can be viewed on a mobile device as well as a computer, giving the student flexibility and the ability to share his data with others. The student can thus create an interactive, mobile, networked pathway.
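A toy sketch of the kind of “project forward” query described above appears below, combining an invented degree-progress calculation with a stand-in for an imported Bureau of Labor Statistics projection. The course numbers, credit counts, and growth figure are placeholders, not real catalogue or BLS data.

```python
# Hypothetical what-if projection: does swapping one course for another
# delay graduation, and what does external labor data suggest about the field?
credits_needed = 120
credits_earned = 96
per_term_load = 15

plan_a = {"course": "CS 4104", "credits": 3, "offered_next_term": True}
plan_b = {"course": "CS 4604", "credits": 3, "offered_next_term": False}

def terms_to_finish(plan):
    remaining = credits_needed - credits_earned
    terms = -(-remaining // per_term_load)   # ceiling division
    if not plan["offered_next_term"]:
        terms += 1                           # wait an extra term for the course
    return terms

# Invented stand-in for an imported Bureau of Labor Statistics projection.
bls_projection = {"field": "software developers", "growth_pct_2010_2020": 30}

for plan in (plan_a, plan_b):
    print(f"{plan['course']}: about {terms_to_finish(plan)} terms remaining")
print(f"BLS projects {bls_projection['growth_pct_2010_2020']}% growth for "
      f"{bls_projection['field']} this decade")
```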

A department chair pulls up a screen with real-time and snapshot data for her department—student learning outcomes, publications, student demographics, finances, space utilization, LMS statistics, and more. She easily zooms in on an area of interest for more detail. A detailed report is available, but most prefer to view data in its visual format. Much like with a dashboard in a car, the user can quickly interpret the data and make meaning of it. A department chair, and her faculty, might see something like the following graphic, where variables are malleable and can be recategorized with a few simple keystrokes or the swipe of a finger on a mobile device:

Image from http://dataviz.tumblr.com.

This is a powerful set of information at the fingertips of a department chair for use in accreditation and annual reports, but just as importantly, for keeping abreast of department operations and continuous improvement activities.
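As a rough sketch of the aggregation that might sit beneath such a dashboard, the snippet below pivots invented department records by program and term; the zoomable visual layer would be built on top of a summary like this. The programs, terms, and figures are illustrative only.

```python
import pandas as pd

# Hypothetical unit-level records feeding a department dashboard.
records = pd.DataFrame({
    "program": ["BS", "BS", "MS", "MS"],
    "term": ["Fall", "Spring", "Fall", "Spring"],
    "enrolled": [220, 210, 45, 48],
    "mean_outcome_score": [3.1, 3.3, 3.6, 3.5],
})

# Pivot so the chair can move from an overview to program-by-term detail
# in a single reshaping step; a charting layer would render this summary.
dashboard = records.pivot_table(
    index="program", columns="term",
    values=["enrolled", "mean_outcome_score"],
)
print(dashboard)
```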

What is clear here is that the university moved in a direction that leapfrogged the one-dimensional use of single data sources for assessment. We now capitalize on vast, rich data drawn from multiple points across the university, accessible to users with different needs, interests, and units of analysis. The user can easily manipulate that data, mashing up data from multiple sources to create visually pleasing graphic representations. Data is personalized and customizable. The user can then spend time interpreting the data and creating a narrative rather than getting lost in data collection and routine calculations.

The combination of meaningful assessment of deep learning across all learning environments and scales of enrollment tightly coupled with access to customizable analytics made available to all stakeholders has positioned Virginia Tech as a leader in the new environment of higher education.

– David Kniola and Ray Van Dyke, Office of Academic Assessment,

with assistance from:

Todd Ogle, Kate McConnell, Steve Culver, Anne Laughlin, and Megan Franklin

 
