B & I’s Excellent Adventure: Create excellence in your own BI space-time continuum (Darin Mattke, University of Texas at Austin)
BICE (BI Center of Excellence), BICC (BI Competency Center), or whatever you want to call it (or not), creating excellence in BI is actually not as confusing as the myriad of names attributed to it. Because not all environments are built on theoretical or ideal infrastructure models and tools, even small, simple steps can be taken to pave the path to quality, excellence, competence, or whatever buzzword or catchphrase you prefer. Ideally, executive sponsorship, IT and business buy-in, data governance, and training/certification all contribute to the idea of BI excellence, but… What if our environment exists in the real world? What can we do to get started? Is it too late to do anything about it? What can I do?! This presentation will be a discussion of what strategies lead to great BI and how to implement them, as well as a glimpse of what has been done at UT to further the idea of quality, community, and collaboration. “It is the greatest of all mistakes to do nothing because you can only do little – do what you can.” –Sydney Smith
BI on the Go (Jeffrey Glatstein, University of Massachusetts, Office of the President)
This topic will deal with the concept of mobile computing. Is the time right for higher education to adopt a mobile strategy? The University of Massachusetts would like to pose this question and have an open dialogue with the audience about the challenges and business drivers of the mobile platform.
BI Training and Communication Strategies (Rachel Kemelman, New York University)
This panel discussion is intended for Business Intelligence managers/directors who have planned and executed an end-user training and communication strategy for the implementation of a DW/BI solution. We are looking to collaborate with a small group of individuals from large, complex, and decentralized universities, as these factors create unique challenges to overcome. The objective of this panel discussion will be to review various training and communication approaches, whether they were successful or not (and if not, why not), what the panelists would do differently, what lessons were learned, and what could be done to ensure a successful BI rollout. The panel will also be given an opportunity to present their proposed training and communication methodologies and get input from the participants, allowing all to share their ideas and help fellow HEDW members in their efforts. As the lead in the panel discussion, I will present the panel with various BI training and communication topics and help to facilitate the group discussions. I will also weigh in with various training and communication strategies that I have executed in the past, and discuss the successes and pitfalls of those approaches.
Bringing Order to Chaos in Data Warehousing (Jeff Christen, Cornell University)
Through the process of implementing several data marts, Cornell has learned there is more to productionizing a data mart than meeting the business requirements. If the data mart implementation is successful, the users will expect it to be highly available, reliable, and perform well. This presentation will cover the tools and methodologies used by Cornell to optimize and sustain its data warehousing environment.
The Business Intelligence Marketplace: How to Find, Keep and Serve Your Customers (Ann Wunderlin, University of Washington)
According to a June 2011 survey of LinkedIn BI professionals, the main reason Business Intelligence programs fail is lack of user adoption and awareness. You can build the best Data Warehouse solution served up through the most useful analytic tools, but for your program to be successful your users need to know where to find these resources and how to use them. Join our panel discussion to find out how successful programs are spreading the word and supporting their users, to attract users and keep them coming back.
Can Do Kanban: Integrating the DSS Team and the EDW (Mary Syre, University of Washington)
Congratulations. Your project proposal to build an Integrated Enterprise Data Warehouse has been approved! Now what? Join us as we map our journey to deliver business value quickly while developing new design approaches and artifacts, building an ETL framework, standardizing quality assurance processes and nearly doubling the size of our team! We decided early on to apply Agile/Lean/Kanban practices to manage the project and continuously adjusted the approach to reach a custom fit for our team. We Can Do Kanban, and so can you! Let us share our Kanban processes and the Software Development Lifecycle that guided us in creating processes, templates, deliverables, and working as an Integrated Team to deliver an Integrated Enterprise Data Warehouse.
Course Evaluation Analysis in an Open Source Business Intelligence Suite (Timothy Moore, Virginia Tech)
Virginia Tech completed a new Course Evaluation data mart in May 2011. At the same time, we implemented Jaspersoft, an open source business intelligence suite. This presentation will describe the processes for creating the Course Evaluation data mart and for implementing Jaspersoft. Topics covered will include: customer interaction; Course Evaluation data model creation; review of BI suites; implementation of Jaspersoft; end-user training; and handling greater use of the data mart and BI suite. We will be sharing lessons learned from these experiences.
Creating the Ultimate Masterpiece? Building a Master Data Repository to Inform Strategic Decision-making (Karen Weisbrodt, University of Texas, Austin)
Increased focus on using institutional resources effectively has placed a premium on converting data from across campus into strategic management information for use by university administration. Staff at the University of Texas at Austin plan to meet this need by building a comprehensive data repository to house and link student, faculty, financial, facilities, and research data for UT-Austin together with benchmarking data from peer institutions. Presenters will share background information and challenges to linking multiple disparate systems. They will also share initial action already taken for the project as well as outline the plan for the next 6-12 months of work, with expected deliverables.
Designing the Analytic Data Warehouse One Slice at a Time (Nancy Lashbrook, University of Washington)
Ready for a challenge? Design an Analytic Data Warehouse in an Agile fashion! Our team has developed a repeatable, low-ceremony, Agile-friendly approach to designing and specifying our analytic data warehouse. This approach fits easily within the agile methodology described in the “Can Do Kanban: Integrating the DSS Team and the EDW” presentation. We start with the Business Dimensional Model as our blueprint. From there, we zero in on a single slice of business value. Through direct SME interaction and data profiling we capture just enough Business Functional Requirements and Business Metadata. Collaborative development of Data Models and Mapping Specs wraps up the design with direct involvement of Test and ETL staff. Then, on to the next slice! Join us as we describe our design deliverables and show you how they bring value to the overall development process.
Digital Dashboard: An Account Manager’s View (Navin Chandar Rajaram, Indiana University)
Universities face fiscal and budget challenges that must be met. To give university decision makers timely, easy-to-digest information and to assist the operational decision-making process, a “Digital Dashboard” was developed. This dashboard provides fiscal officers and departmental account managers with revenue, expense, budget, expense forecast, and residual forecast information. An estimate of spending based on the current budget and balance is also available. See how the Research Administrative Systems & Decision Support (RASD) division of the Office of Research Administration developed the “Digital Dashboard” using SQL Server, dimensional modeling, and cubes.
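The residual-forecast figure such a dashboard surfaces is, at its core, simple budget arithmetic. The following is a minimal, hypothetical sketch of that calculation; the function and field names are assumptions for illustration, not IU's actual schema or logic.

```python
def residual_forecast(budget, actual_expenses, encumbrances, forecast_expenses):
    """Estimate funds remaining after actual, committed, and projected spending.

    All arguments are dollar amounts for a single account and period.
    """
    # Balance still available after what has been spent or committed
    available_balance = budget - actual_expenses - encumbrances
    # Residual forecast: what is expected to remain after projected spending
    return available_balance - forecast_expenses


# Example: $500k budget, $310k spent, $40k encumbered, $100k projected
print(residual_forecast(500_000, 310_000, 40_000, 100_000))  # 50000
```

A dashboard would typically compute this per account in the fact table (or in a cube measure) rather than row by row in application code, but the arithmetic is the same.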
Faculty Performance Measurement Using Data and Dashboards (Kimberly Ford, Walden University)
Faculty performance assessment can aid institutions in strategic planning, quality control, student satisfaction, and delivering quality education to students. Regardless of the type of institution, using research and data can help organizations create policies and improve faculty performance. Some data may already be collected but not currently in use, or efforts may be under way to collect data the organization has identified for measuring faculty performance. How to tie data together into a collective picture of faculty performance will depend on organizational needs and on expectations for tenure, policies, and continued employment. Many areas of data could be used to create a faculty performance dashboard: student surveys, faculty surveys, peer surveys, faculty issues, professional development, workload, GPA, and grading are some of the key datasets that can provide a valuable picture of faculty performance. Defining what the organization deems important for data collection is the first step in creating faculty performance dashboards.
Higher Education Data Mining (Natalie Kellner, New Mexico State University)
Guide your institution’s strategic decision-making not only with internal metrics and goals, but by comparison to external peers. Utilize your business intelligence environment to examine key educational benchmarks in the context of your competitors. In challenging economic times, see how you can help your institution use metrics to guide strategic decisions that ensure high educational effectiveness in a cost-effective and efficient manner. Comparison with peers gives your institution a leg up in understanding how competitors are responding to the resource, preparation, and delivery issues facing higher education today. Gain a competitive advantage by measuring performance against peers.
Implementing the Common Education Data Standards (John Blegen, State Higher Education Executive Officers)
The Common Education Data Standards (CEDS) Initiative is an exciting opportunity for stakeholders from across the education spectrum to work together toward clearer and more consistent data. The CEDS Initiative is a joint effort of the National Center for Education Statistics (NCES) and a consortium of non-governmental education stakeholders. The CEDS Consortium enthusiastically supports these efforts by participating in the standards’ development and supporting their adoption. The CEDS standards are unique because they are preschool-through-workforce in scope, entirely voluntary, and supported by a broad array of stakeholders including the Council of Chief State School Officers (CCSSO), the State Higher Education Executive Officers (SHEEO), and the Postsecondary Electronic Standards Council (PESC). Version 2.0 of the standards, released in January 2012, is the first version with significant postsecondary content. Version 3.0, which will expand that content, is currently in development. This presentation/discussion will bring you up to date on CEDS activities, present some exciting new data dictionary analysis tools that are available at no cost, and identify issues and challenges for discussion. Discussion will include how CEDS impacts those building new data warehouses and special considerations for those with existing warehouses.
Institutions <-> Standards <-> SLDS; Integrating the Disparate (Arron Ferichs, Oregon University System)
Panel and open discussion about cross-sector SLDS efforts in Oregon, with topics tailored to the panel and the audience. Panelists will address the Oregon Interagency Data Warehouse, SLDS, data standards, FERPA, data governance (interagency and interstate), master data (beyond metadata management), and more. Format: a panel of three people will offer five-minute presentations with up to two slides each to introduce the subject matter; the remaining forty-five minutes are open for questions and discussion. Special notes: the Oregon SEA (State Education Agency) representative, Director of Enterprise Systems Michael Rebar, has agreed to be a panelist and has contacted other collaborators. For example, we have contacted SHEEO to solicit their participation in an open discussion around data standards to streamline ETL (Extract-Transform-Load) processes and reporting. We have contacted our Washington colleagues to discuss warehouse implementation considerations, open source tools, interstate identity resolution, and regional data consumption. Finally, we have contacted a colleague at the U.S. Department of Education regarding FERPA and privacy concerns.
ITS-Provided Data Integration as a Service, or How to Make Your ETL Tool Investment Pay Off and Add Value across Campus (Shelly Turner, University of Michigan)
This session will provide an overview of how the University of Michigan is leveraging Informatica to provide data integration as a service to various units and other IT groups on campus. Under current budget conditions, everyone needs to add value and find new ways to provide services. ITS at the University of Michigan is embracing the concept of providing data integration and ETL outside the scope of the data warehouse and BI. We are piloting a number of projects, such as transferring various data across campus and including unit data in our data warehouse. Topics will include the service levels we have defined so far, along with considerations and challenges.
Managing the BI User Experience (Tamara King, South Orange County Community College District)
Navigating the development life cycle for our institutional Data Warehouse was relatively straightforward. We thought we did everything right: planning, user focus groups, technical design, and executive sponsorship. We utilized an Agile methodology to phase the development with controlled timelines, training, and user support for queries, reports, data access, and dashboards. However, governing the user community has proven to be a bit more challenging than anticipated. We have worked hard to encourage users to really engage and achieve a full and deep understanding of the data and reporting capabilities of the BI environment. This workshop will provide a high-level view of our development life cycle and focus on what has worked and what hasn’t, with specific case studies and examples of the SharePoint interface, class profiles, reports, dashboards, and metadata. We will feature both our successes and failures and share how we are tackling these challenges as part of the BI User Experience.
Metadata Management: Making Sense of It All (Rainbow Di Benedetto, University of Texas at Austin)
Metadata management is a broad term used to refer to ways of collecting and organizing information about information, such as definitions of key terms, business rules applied, how data is collected, and where it is used or published. At The University of Texas at Austin, we have found that as we make more data available through the data warehouse and other means, it becomes both more difficult and more important to do this in a systematic way. We need to know exactly what we mean when we use a term like “faculty salary” or “enrolled student”; be aware of alternate definitions, where they are used, and why numbers in similar reports may differ; and be able to assess the impact of proposed changes or new reporting requirements. Currently, we use a hodgepodge of different methods and would like a more unified solution that is easy to use and maintain. We are investigating possible options and would like to share what we have found so far and hear what has worked well (or not so well) for others.
MyRA (My Research Administration): One Stop Dashboard (Christine Ray, Indiana University)
Indiana University has developed a web-based reporting tool called MyRA (My Research Administration) that allows principal investigators as well as administrative staff to view the status of sponsored project and compliance data. Components include: MyProposals – lists proposals with a link to additional proposal details, attachment information, and status history; MyAwards – lists awards by sponsor with links to an expense summary comparing expenses and encumbrances against the account budget, the status of award receipt and negotiations, related documents, and account administrator contact information; MyCompliance – shows IRB approvals with protocol number along with approval and expiration dates (similar modules for Conflict of Interest and IACUC are being developed); MyEducation – shows scheduled and in-progress courses taken through CITI and IU’s Office of Research Administration. MyRA was developed with cutting-edge technologies including .NET 4.0, MVC, web services, AJAX, CSS, and a data warehouse. MyRA displays enterprise system data and homegrown system data in one cohesive user interface, including a Dashboard for graphs and charts. Since its inception, usage has grown campus-wide at a steady pace.
Needs Analysis – Business Process vs. Business Intelligence (Peter Weinstein, Illinois State University)
Illinois State University (ISU) has recently begun a combined Business Process and Business Intelligence needs analysis effort. From an IT perspective, Business Process needs analysis focuses on collecting data, while Business Intelligence needs analysis focuses on reporting results. However, from a business perspective, both of these activities are part of an end-to-end business practice and are dependent upon each other. ISU will present the methods that we are using to merge Business Process and Business Intelligence needs analysis so that we can better understand the business practice. During our presentation you will have an opportunity to ask questions, provide feedback, and share some of your own experiences.
Of Forests and Trees: Institutional Data Definitions and Decision Support (Christina Drum, University of Nevada, Las Vegas)
Our president tasked our Institutional Research office with implementing an enterprise data warehouse and business intelligence platform on the heels of a new student ERP system. One result of this undertaking has been the emergence of a formal Decision Support function, a key component of which involves leading a collaborative campus effort to develop institutional data definitions. In developing data marts in our newly implemented BI tool, a team of data stewards, data users, and IR development staff works together to define the elements needed to meet informational requirements. Thus, our data mart development process centers on the creation and implementation of data definitions, each of which includes: a title, description, interpretation and usage notes, and a description of how it is derived from transactional data sources. We house the resulting definitions and their technical implementations in a central metadata repository developed in house. Definitions are made available to data mart users through an online data dictionary. In this presentation, participants will learn about the tools and processes we developed to create, organize, and provide access to usable definitions. Our approach, and sharing the lessons we have learned, will be of value to those seeking comparable solutions in their organizations.
Predictive Analytics for Student Counseling and Academic Capacity Planning: A Case-based Decision Support System (Bodo Rieger, University of Osnabrueck)
We present an application for the rolling prediction of future course enrolments applying a refined case-based reasoning (CBR) approach. As a first step towards a sophisticated predictive analytics tool, this project aims at supporting executives, faculty, and students in their course- and resource-planning processes. Using the open-source jColibri framework, we designed a case base that consists of more than 900 heterogeneous student cases modeled in an object-oriented manner, including personal attributes (e.g. age, sex, A-levels grade) and course histories (including current GPA, failed and completed courses, etc.). The retrieve phase of the CBR cycle introduces a dynamic case interpretation with regard to the stored cases’ descriptions and solutions. A rule-based reasoning approach is integrated in the revise phase to verify that proposed solutions (i.e. a student’s predicted future study plan) meet the examination regulations. Next, solutions are presented to the respective students, who benefit from a proposal of selectable future courses and are able to validate or change the proposition against their preferences. In order to support executives and faculty, revised solutions are stored within a database and processed for multidimensional/analytical and standard reporting on different levels of aggregation (e.g. predicted number of enrolments for class Accounting I in Fall 2012).
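The retrieve and revise phases described above can be sketched in a few lines. The following is an illustrative toy example of that CBR pattern only, not the jColibri implementation; the case attributes, similarity weights, and revision rules are invented for the sketch.

```python
def similarity(query, case):
    """Weighted similarity between a student query and a stored case (0..1).

    Attributes and weights are hypothetical: GPA (0-4 scale), overlap of
    completed courses (Jaccard), and semester proximity.
    """
    score = 0.5 * (1 - abs(query["gpa"] - case["gpa"]) / 4.0)
    union = query["completed"] | case["completed"]
    score += 0.3 * (len(query["completed"] & case["completed"]) / max(len(union), 1))
    score += 0.2 * (1 - abs(query["semester"] - case["semester"]) / 10.0)
    return score


def retrieve(query, case_base, k=3):
    """Retrieve phase: the k stored cases most similar to the query."""
    return sorted(case_base, key=lambda c: similarity(query, c), reverse=True)[:k]


def revise(proposed_courses, completed, max_load=4):
    """Revise phase: rule-based check of the proposed study plan.

    Toy rules standing in for examination regulations: drop courses the
    student has already completed and cap the course load.
    """
    plan = [c for c in proposed_courses if c not in completed]
    return plan[:max_load]


case_base = [
    {"gpa": 3.1, "semester": 3, "completed": {"Math I", "Accounting I"},
     "next_courses": ["Accounting II", "Statistics"]},
    {"gpa": 2.4, "semester": 3, "completed": {"Math I"},
     "next_courses": ["Accounting I", "Math II"]},
]

query = {"gpa": 3.0, "semester": 3, "completed": {"Math I", "Accounting I"}}
best = retrieve(query, case_base, k=1)[0]      # most similar past student
plan = revise(best["next_courses"], query["completed"])
print(plan)
```

In the real system the retrieved solution would then go to the student for validation (the reuse/retain steps of the full CBR cycle) before being stored for aggregate reporting.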
A Primer on Culture Change! (Anja Canfield-Budde, University of Washington)
In Spring 2011, the University of Washington recognized that, due to legacy interpretations of retention requirements, the University community could not get the historical data it needed to support effective decision-making. Lack of sufficient historical information in the Enterprise Data Warehouse (EDW) led to decentralized information systems enabling departments to run their own longitudinal studies and historical analyses. This created inefficiencies that negatively impacted productivity and the quality of reports at all levels of the University. Sound familiar? This talk will show how UW is establishing the Enterprise Data Warehouse as the single source of information for strategic decision-making, longitudinal studies, institutional research, and reporting and analysis activities, by centralizing and retaining granular, consecutive, and unaltered historical data. Learn how three simple, accepted methods are leading to a new UW Data Management policy that not only provides the needed data but also minimizes risk. Surveys of peer institutions like yours, a Planning & Budgeting Brief, and a formal Risk Assessment were crucial tools in helping the University embrace a culture change of truly historical dimensions.
Retention Analytics at Drexel (Insiyah Jamal, Drexel University)
Attend this session to learn how Drexel departments leverage Student Analytics to improve performance and boost retention efforts. The session will also cover key report formats, common dimensions and measures, and dashboards.
Should You Blame the Data Warehouse? (Suzanne Coletti, Princeton University)
The Data Warehouse and the BI tools used to support the Warehouse have made data much more accessible to our campus community. However, in doing so, a new and critical question arises: who has the responsibility for ensuring that the data are properly used and disseminated? The Data Warehouse group can ensure that only users with proper security can access the data, but once the data have been downloaded or otherwise disseminated to individuals or to departments, that tight level of control may be threatened. This issue has always existed, but now, since it is much easier to access and disseminate the data, new questions of security arise. How much of the responsibility for educating our users to understand the proper care and security of the data accessed from the Warehouse lies with the Warehouse team? Are there university guidelines or policies that exist to address this issue? We will explore some of the sensitive data stored in the Warehouse, what we can do to educate users and mitigate risk, and discuss what responsibilities should and should not lie with the Warehouse team after the data have been extracted.
Users may not know much about BI, but they know what they like (Aaron Walz, University of Illinois)
With industry survey after survey showing staggeringly low rates of BI adoption, clearly one of our ongoing challenges is delivering BI solutions that our organizations will actually use. Complicating this is the difficulty of getting good requirements for BI projects: business users with little or no BI experience are at a loss to describe needs for information solutions they’ve never had. How can we deliver successful BI solutions given these challenges? At the University of Illinois, the use of prototypes has helped us to both validate ideas for new solutions, and to get detailed requirements for projects to build them. This presentation will describe the challenges often encountered and examples of the different ways prototypes can be used to help address them. Even where users don’t know much about BI, prototypes help uncover that they really do know what they like.
Using a Data Warehouse for NCATE Tracking and Reporting (Matthew McAuliffe, Salem State University)
The National Council for Accreditation of Teacher Education (NCATE) requires institutions to track and report on detailed assessment data for their Education majors. At Salem State University we have built some custom PeopleSoft tables to allow faculty to enter that assessment data, and we are also leveraging PeopleSoft’s delivered Academic Advisement functionality to track students’ progress through the various stages of the program. In order to analyze and report on this data easily, we wanted to bring it into our Blackboard Analytics (formerly iStrategy) data warehouse. So we engaged with Blackboard Analytics to design and build several new dimension and fact tables to capture this information and integrate it with the wealth of information already included in their delivered data models. This presentation will describe the content of the new data models, the types of reports we can produce from them, and the process involved in designing, developing and testing this new functionality.
Warehouse Functionality without a Real Warehouse (for now) (Wayne Mircoff, Georgetown University)
For many years, without an EDW, GU-Finance has used primitive PC-based tools and developed novel applications to report on tens of millions of rows of data. We are in the unusual position of being extremely familiar with our data, its interrelationships, and our reporting needs, but we are on unfamiliar ground in developing a finance data warehouse. This presentation will encompass two areas: an overview of GU-Finance’s current reporting tools, and an overview of a phased plan to rebuild all of this functionality in a centrally supported (Cognos) warehouse environment in conjunction with the implementation of a new financial system. I hope to share our experience in template development and customer engagement while gaining real-time feedback on our phased plan.
What If We Could Use Siri for Business Intelligence (BI)? (John Rome, Arizona State University)
This presentation looks at future trends of BI and ponders ideas like using an intelligent assistant like Apple’s Siri to answer questions that are important to an institution. In addition to looking at future trends, the presentation will take a historical look (hopefully entertaining) at BI, including the history of BI/Data Warehousing in Higher Education.