The Dean’s Information Challenge: From Data to Dashboard

(First in a series of case studies from an EDUCAUSE article published in November 2016)

by Jeff Meteyer, University of Rochester

Summary:  Data drives dashboard construction. At the University of Rochester, our dashboard development teams sought assistance and created processes to ensure consistent results and widely agreed-on metric definitions. These experiences inspired data governance and communication partnerships across various systems.

This case study focuses on issues of interest to deans and the challenges of developing a dashboard system.

The Dean’s Perspective

New and existing deans face daunting pressures in managing their areas of responsibility; among their required tasks are to:

  • Develop student outcomes strategy
  • Understand the unit’s financial position, as well as its operational and academic standings
  • Interpret information from an avalanche of data from various (potentially non-integrated) sources
  • Ensure the effective and efficient use of resources related to finances, people, and space
  • Engage and support cost/value discussions around tuition and student job market success
  • Solicit research dollars in targeted areas of opportunity
  • Make strategic faculty hiring decisions and develop faculty retention strategies

Deans are interested in existing performance metrics and how they portray both current and historical performance. In attempting to answer such questions, however, deans might find themselves in information silos, sometimes without realizing it. Sometimes these “silos” result from institutional evolution, as described below.

When deans rise through the ranks of a particular institutional branch, they might be unaware of “tribal” knowledge from other parts of the institution. Further, they often rely on subject matter experts to help lead the way in terms of definition and performance. However, some measurements, such as student metrics, do not behave according to a predictable data-driven model across an entire university. In addition, some deans might hail from other institutions with particular dashboard cultures and expectation levels. These deans might believe it is easier to bring this “vision” to their new establishment, only to find that the new institution’s data might not easily suit or fully populate the former institution’s dashboard model. Deans might find themselves without an existing dashboard model, so the activity of designing one becomes an additional task to manage.

Deans have reporting priorities beyond the student area, including research ranks, investment trends, salary expenditures, and submission/acceptance ratios for research projects. Deans might focus on the workload ratio of various faculty members or principal investigators in relation to research time, student course load, and mentoring. Further, tenure tracking and overall demographic measures play a role in both recruiting new talent and retaining existing performers.

Financial reporting priorities may include operational metrics, which indicate where a dean’s unit stands in relation to its plan or budget and (ideally) in relation to previous timeframes. Forecasting future fiscal cycles is challenging in the absence of historical data and an understanding of how that data relates to prediction models.

When data definitions vary, exceptions or anomalies can arise in the metrics that use those definitions. Many people in an institution’s higher ranks have lived through organizational evolutions in which successive leaders invented new ways of thinking about and addressing shortfalls in order to put their stamp on the institution’s practices. However, deans and administrators do not want to risk misinterpretation or missteps due to nebulous data definitions; they want to be relatively certain, given the data’s integrity and interpretation, that the decisions they make align with their strategy.

The recent rise in data governance teams — which address issues such as data definition, data security, and determining usage at various institutional levels — is helping to establish process-driven decision-making, reducing the challenges of data sharing and interpretation previously experienced by deans and others.

Data Challenges

When designing dashboards as information portals, it is important to ask what the appropriate metric is and how it will be used for decision making. A visualization or reporting team sometimes illustrates what can be done, but their design intentions might actually muddy the dean’s decision-making process. The goal should be to define the metric in as granular a way as possible so that the resulting illustration of that metric helps users determine a course of action.

For example, a five-year trend in research spending (in aggregate) might give users a sense of what to expect for forecasting, but if the goals are at the sublevel (such as capital expense reduction), the data sets should be represented so that users can easily identify distinct signals or trends without having to redesign a report. Setting thresholds and alert logic is important: data sets arrive quickly, and well-defined alerts let users see the signal in the buildup and act accordingly. If data does not refresh often enough, the signal may be lost or delayed, resulting in missed opportunities.
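
To make the threshold-and-alert idea concrete, the following is a minimal Python sketch of the kind of check a reporting team might run each time a data set refreshes. The metric (capital spending versus plan), the quarterly figures, and the 10 percent threshold are hypothetical illustrations, not values from our dashboards.

    # Minimal sketch: flag quarters whose actual spending exceeds plan by more
    # than a set threshold. All figures and the threshold are hypothetical.
    PLAN = {"FY15-Q1": 250_000, "FY15-Q2": 250_000, "FY15-Q3": 250_000, "FY15-Q4": 250_000}
    ACTUAL = {"FY15-Q1": 243_100, "FY15-Q2": 298_750, "FY15-Q3": 251_400, "FY15-Q4": 310_200}
    VARIANCE_THRESHOLD = 0.10  # alert when a quarter runs more than 10% over plan

    def spending_alerts(plan, actual, threshold):
        """Return (quarter, percent over plan) for each quarter that breaches the threshold."""
        alerts = []
        for quarter, planned in plan.items():
            spent = actual.get(quarter)
            if spent is None:
                continue  # no refresh yet for this quarter, so the signal is delayed, not lost
            variance = (spent - planned) / planned
            if variance > threshold:
                alerts.append((quarter, round(variance * 100, 1)))
        return alerts

    for quarter, pct_over in spending_alerts(PLAN, ACTUAL, VARIANCE_THRESHOLD):
        print(f"ALERT: {quarter} capital spending is {pct_over}% over plan")

Run against these sample numbers, the check would flag FY15-Q2 and FY15-Q4; the same logic can drive a visual cue on a dashboard rather than a console message.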

Development Process

At the University of Rochester, our reporting and analytics team worked in tandem with then-Dean Robert Clark and the dean’s support staff to outline a vision for developing institutional dashboards. The dean offered a level of creative freedom, letting the team illustrate the dashboard design and, as a group, define the resulting metrics.

Figure 4. An example institutional dashboard

The team developed various proofs of concept showing the information that could be gleaned from the data warehouse and other systems across the decentralized institution. Multiple review sessions ensued, in which team members explained why they thought a particular visual accurately represented a metric’s performance over time. Our mantra throughout the design process was: keep the message simple, while also ensuring that a single visualization could answer multiple questions. The challenge was in defining which level of information provided sufficient actionable support, versus having a visualization that drilled down a “discovery wormhole.”

Defining data elements — and finding agreement on those definitions among contributing parties — proved to be an ongoing challenge. Although we reached agreement for the initial round of institutional dashboards, the process pointed to the need for data governance. We are investigating tools such as iData’s Cookbook, which can help an organization capture and publish data definitions through a structured approval process and serve as the authoritative source. A recognized source containing definitions for terms, metrics, and reports can help the team maintain consistency in the design and development of reporting and analytics.
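
As a rough illustration of what an authoritative definition source tracks, the sketch below models a single glossary entry with a steward and an approval state. It is not the Cookbook tool itself; the fields, the workflow states, and the “full-time student” example are assumptions made for illustration.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    @dataclass
    class DataDefinition:
        """One glossary entry: a term, its agreed-on definition, and its approval state."""
        term: str
        definition: str
        steward: str                          # office accountable for the definition
        status: str = "draft"                 # draft -> under_review -> approved
        approved_on: Optional[date] = None
        source_systems: List[str] = field(default_factory=list)

        def approve(self, approval_date: date) -> None:
            """Mark the entry authoritative once the governance group signs off."""
            self.status = "approved"
            self.approved_on = approval_date

    # Hypothetical entry of the kind a governance group might publish.
    full_time_student = DataDefinition(
        term="Full-time student",
        definition="A student enrolled in 12 or more credit hours in a term.",
        steward="Registrar's Office",
        source_systems=["student information system", "data warehouse"],
    )
    full_time_student.approve(date(2016, 5, 1))

Reports and dashboards can then cite the approved entry rather than redefining the term locally.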

Our data analysis team learned the importance of understanding how data definitions matured over time and the different ways data was collected and classified in the information systems. This understanding sometimes led to the development of bridges between data transformations to allow presentation of a continuous data story. Special events, such as ERP or other system replacements, drove the need to consider which data definitions and transformations were needed and how a data conversion strategy reached beyond the source system into the reporting systems. Weeks of data modeling and cleanup ensued; our goal was to assure users that we could tie the ends together and portray multiyear windows of trends.
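
The “bridge” idea can be pictured as a crosswalk between old and new coding schemes. The sketch below, with hypothetical account codes and amounts, shows one way legacy transactions might be mapped onto a post-ERP chart of accounts so that a multiyear trend stays continuous.

    from collections import defaultdict

    # Hypothetical crosswalk from legacy account codes to the new chart of accounts.
    LEGACY_TO_NEW = {
        "5200": "CAPEX-EQUIP",
        "5210": "CAPEX-EQUIP",
        "5300": "CAPEX-FACIL",
    }

    def normalize_account(record):
        """Map a transaction onto the new account structure for reporting."""
        return {**record, "account": LEGACY_TO_NEW.get(record["account"], record["account"])}

    rows = [
        {"fiscal_year": 2013, "account": "5210", "amount": 48_000},         # recorded pre-conversion
        {"fiscal_year": 2016, "account": "CAPEX-EQUIP", "amount": 52_500},  # recorded post-conversion
    ]

    # Trend both eras together under the new account names.
    totals = defaultdict(float)
    for row in map(normalize_account, rows):
        totals[(row["account"], row["fiscal_year"])] += row["amount"]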

Results

After nine months, we arrived at an agreed-to set of institutional dashboards that worked to illustrate student, faculty, and research relationships and performance over time. We designed the visualizations to allow multiple questions to be asked and answered, and included various parameters that users could manipulate to discover results. We also addressed the granular versus aggregated views to allow “drill-throughs” when security rules permitted access.

Dashboarding might seem like just another reporting project. However, we encountered many gray areas in requirements and in the acceptance of visualization “art” for the final deliverables (what worked as a pie chart one week worked better as a bar chart the following week), which led us to exceed many of our resource forecasts. The bridge between data graphing and the “story to be told” can be vast, and the environments and issues at the time can influence how a visualization represents data. At the same time, adhering to standard, defined metrics over time helps solidify and standardize how the visualization can illustrate performance variations. Further, adhering to strong data definitions reduces the translation risk for data extending across multiple instances.

In our case, resolving data cleanliness issues and deciding how to portray the data certainly pushed us past our original two-week estimate. However, our hope is that the processes we have now put in place will reduce future development to a more manageable timeframe, driven by agile scheduling, reusable components, and standards.

Future Plans

A dashboard effort can lead to analysis paralysis if the signaling features are not strong enough to point out anomalies. A visualization can become stale — like a roadside billboard you learn to ignore. It’s important to employ processes that provide data-driven refreshes showing new situations, as well as the effect of decisions made based on previously illustrated data. This correlation between data and action helps prove the worth of tracking information and displaying results to confirm how a strategy fared.
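
One simple way to connect data to action is to compare a metric’s behavior before and after a decision takes effect, so the next refresh shows whether the strategy moved the needle. The sketch below uses a hypothetical metric and decision date purely to illustrate the pattern.

    from datetime import date
    from statistics import mean

    # Hypothetical monthly values for a tracked metric (e.g., days to fill a faculty position).
    METRIC_BY_MONTH = {
        date(2016, 1, 1): 62, date(2016, 2, 1): 64, date(2016, 3, 1): 61,
        date(2016, 4, 1): 58, date(2016, 5, 1): 49, date(2016, 6, 1): 47,
    }

    DECISION_DATE = date(2016, 4, 1)  # when a (hypothetical) new hiring process took effect

    before = [v for d, v in METRIC_BY_MONTH.items() if d < DECISION_DATE]
    after = [v for d, v in METRIC_BY_MONTH.items() if d >= DECISION_DATE]

    print(f"Average before the decision: {mean(before):.1f} days")
    print(f"Average after the decision:  {mean(after):.1f} days")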

Given human tendencies toward instant gratification, dashboards must produce quick responses to actions at a level granular enough to show correlation. As deans continue to work under various pressures, their ability to measure performance rapidly and discover details that serve their goals is a critical dashboard deliverable. As strategic partners, IT teams must help deans address data cleanliness, definition, and refresh concerns.

IT Leadership Is Key

For colleges and universities, lack of data is not usually a problem. Getting information to academic decision makers, however, is not so easy. The problems involved are not merely technical issues such as data integration; to create dashboards and reports that help deans and administrators make decisions, IT leaders must help their institutions develop data governance, build agreement around data definitions, understand data stewardship, and communicate across silos to reach agreement on dashboard goals.

The lessons learned from these four case studies can be a roadmap for other institutions as they work to provide academic leaders the information they need to make effective and fruitful decisions.

 

© 2016 Mike Wolf, Martha Taimuty, Monal Patel, and Jeffrey Meteyer. The text of this EDUCAUSE Review online article is licensed under Creative Commons BY-NC-ND 4.0.