Institutional Dashboards for Continuous Quality Improvement

2012 AAMC Annual Meeting

Deb Simpson, Medical College of Wisconsin (Moderator)
Marc Triola, New York University
Mary C. Hill, University of Michigan
Henry Sondheimer, AAMC
Robby Reynolds, AAMC


Background:

Why dashboards? Why now?
• Medical educators generate a LOT of data for different stakeholders
• New accreditation criteria for academic quality (LCME, ACGME, HLC)
• New expectations for data analysis and reporting:
   Link objectives to outcomes
   Link individual, unit, and aggregate data
   Cross-link data


Background:

Why dashboards? Why now?
• New approaches to evaluating and comparing outcomes require new tools
• Business has led the way in developing and using Business Intelligence (BI) technology
• Dashboards are a navigational tool


What is a Dashboard?

An auto dashboard? Or a cockpit instrument panel?


Process & Objectives for Session

Process:
• WHAT: Intro & dashboard examples
  o NYU
  o Michigan
  o AAMC – MBM & ASSET
• SO WHAT: How do dashboards
  o Add value (to whom)?
  o Support a CQI education environment?
• NOW WHAT: Discussion questions
  o What are the key decisions/questions?
  o What data do we need/have/have access to?
  o What are the challenges in dashboarding data?
  o Benchmarking?

Goal & Objectives:
• GOAL: Expand our awareness of available data sets
• OBJECTIVE: Address the questions "you" want answered regarding:
  o Available data sets: what you have, what others have (AAMC, USMLE), what you don't have
  o Benchmarks – criterion

Introduction: Decisions & Questions – Key Concepts & Terms

Business analytics tell key stakeholders, in a multidimensional manner:
• Process: what's happening now
• Outcomes: what's happening relative to a "criterion"
• Prediction: what MAY happen
• Discovery: something interesting

Hammergren TC. Data Warehousing for Dummies. 2009.
Terkla DG. Institutional Dashboards. AIR 2012.


Business Analytics: Key Indicators – THE Critical Information
• Varies by institution/stakeholder/decision-maker
• Indicators must be:
   Easy to understand
   Relevant to the user
   Strategic
   Not used in isolation
   Quantitative, up to date, and reliable

Hammergren TC. Data Warehousing for Dummies. 2009.
Terkla DG. Institutional Dashboards. AIR 2012.


Business Analytics Tell Key Stakeholders Through Visualization of Data
• Multidimensional
• Chronological (past, present, and, if predictive, future)

Key concept: tell me a lot of things, but don't make me work too hard.
• Dashboard
   Presents current information on your operational performance (car, airplane)
   Purpose: manage/guide the organization
• Scorecard
   Shows performance against a plan, set of objectives, or criteria/standards (see the sketch below)

Hammergren TC. Data Warehousing for Dummies. 2009.
Terkla DG. Institutional Dashboards. AIR 2012.
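A minimal sketch of the dashboard/scorecard distinction in Python (the indicator names and targets below are hypothetical, not from the slides): the dashboard view is the current value itself, while the scorecard view grades that value against a plan or criterion.

```python
# Sketch only: hypothetical indicators and thresholds, not an AAMC tool.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    current: float   # latest observed value (the "dashboard" view)
    target: float    # planned value or criterion (the "scorecard" view)

    def status(self) -> str:
        """Scorecard grade: performance against the plan."""
        ratio = self.current / self.target
        if ratio >= 1.0:
            return "green"
        if ratio >= 0.9:
            return "yellow"
        return "red"

indicators = [
    Indicator("Step 1 first-time pass rate (%)", current=94.0, target=96.0),
    Indicator("Course evaluations completed (%)", current=88.0, target=85.0),
]

for ind in indicators:
    # Dashboard: show the number. Scorecard: show the number vs. the plan.
    print(f"{ind.name}: {ind.current} (target {ind.target}) -> {ind.status()}")
```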


Dashboard Examples


The NYU Educational Data Warehouse

Marc Triola, MD
Associate Dean for Educational Informatics
NYU School of Medicine, Division of Educational Informatics

Overview
• Background
• Our Approach
• Implementation
• Lessons Learned and Next Steps

Background
• Who we are
• Our new curriculum: C21
• With the rollout of C21, DEI had a goal to create a Framingham Heart Study-style database of educational data

Needs
• C21
  o Learning analytics at the individual level
  o Curriculum mapping and management
  o Regulatory reporting
• Strategic operational dashboards
  o Admissions, Diversity Affairs, etc.
  o UME, GME, and Biomedical Sciences graduate programs
  o DEI

Benefits of an EduDW
• Integrates metrics from numerous heterogeneous sources and enables analysis across multiple systems & processes
• The EduDW architecture, based on dimensions and facts, promotes exploration (see the sketch below):
  o Provides a single analytic view that is easier for users
  o Ensures high performance
  o Is supported by a variety of query & reporting tools
  o Facilitates creation of multidimensional cubes
• Preserves historical data
• Offloads resource-intensive queries from operational systems
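A minimal sketch of the dimensions-and-facts idea using pandas on made-up data (a production EduDW would live in a relational warehouse; the tables and values here are illustrative): joining a fact table to its dimension tables yields the single analytic view, and pivoting over two dimensions yields a small multidimensional cube.

```python
import pandas as pd

# Dimension tables: descriptive attributes, one row per member (invented data).
students = pd.DataFrame({
    "student_id": [1, 2, 3],
    "cohort": ["2015", "2015", "2016"],
})
courses = pd.DataFrame({
    "course_id": ["ANAT", "PHYS"],
    "phase": ["Pre-clerkship", "Pre-clerkship"],
})

# Fact table: one row per measurable event, keyed to the dimensions.
exam_facts = pd.DataFrame({
    "student_id": [1, 1, 2, 3],
    "course_id": ["ANAT", "PHYS", "ANAT", "PHYS"],
    "score": [82, 91, 74, 88],
})

# Joining facts to dimensions gives the "single analytic view" ...
view = exam_facts.merge(students, on="student_id").merge(courses, on="course_id")

# ... and a pivot over two dimensions is a small multidimensional cube.
cube = view.pivot_table(values="score", index="cohort",
                        columns="course_id", aggfunc="mean")
print(cube)
```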

Education Data Warehouse (architecture diagram): source systems – lecture podcasting, ePortfolio, LMS, SIS, student patient log, learning modules, evaluations, exams, admissions, and simulation – feed through ETL into the EduDW and its data marts, which in turn drive BI reporting and analytics.
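The ETL step in the diagram can be pictured as a small extract-transform-load loop. The sketch below is illustrative only, with hypothetical file names and fields and a SQLite stand-in for the warehouse; the point is that each source's rows are conformed to one fact-table schema before loading, so analytic queries never touch the operational systems.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from one operational system's export."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(row, source):
    """Transform: map each source's fields onto the warehouse schema.
    Field names here are hypothetical."""
    return (source, row["student_id"], row["event_date"], float(row["value"]))

def load(rows, db="edudw.sqlite"):
    """Load: append the conformed rows to a shared fact table."""
    con = sqlite3.connect(db)
    con.execute("""CREATE TABLE IF NOT EXISTS fact_event
                   (source TEXT, student_id TEXT, event_date TEXT, value REAL)""")
    con.executemany("INSERT INTO fact_event VALUES (?, ?, ?, ?)", rows)
    con.commit()
    con.close()

# One batch per source system (export files assumed to exist; names invented).
for source, path in [("lms", "lms_export.csv"), ("exams", "exam_export.csv")]:
    load(transform(r, source) for r in extract(path))
```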

Learning Analytics and Individual Dashboards


Curriculum Mapping


Curriculum Management

Operational Dashboards


Lessons Learned
• Support within the organizational culture
  o Commitment from senior leadership
• Ongoing partnerships between Informatics/IT, faculty, and administration
  o An iterative process!

Dashboards for Continuous Quality Improvement at the University of Michigan

Mary Hill ([email protected])
Director, Data Management Services
University of Michigan Medical School

November 6, 2012

Why? Manage Better and Smarter
• Being "successful" is a matter of survival
• We are all experiencing financial challenges:
  o The doubling of the NIH budget is over
  o Reductions in state support
  o More restrictive funding in a world of higher compliance

Looked at What's Important
• Determined relevant key performance indicators (KPIs) and benchmarks
• Reviewed metrics currently in use
• Ensured consistency with leadership vision and strategy
• Brought transparency and agreement
• Don't worry about getting the goal "right" – worry about being nimble

University of Michigan First Dashboard
• Key constituents in one room
• Pushed for a set of KPIs in 3 months
• Paper first – drew pictures
• War room

Current Status
• Must be able to get to the data
  o Locally available
  o Peers – who are they?
  o Externally available
• Numbers must tie back to something users trust
• Allow drill down (see the sketch below)
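Drill-down is just successively finer grouping of the same trusted detail rows, so each level reconciles with the one above it. A minimal pandas sketch (publication counts invented for illustration):

```python
import pandas as pd

# Detail rows users already trust (invented numbers).
pubs = pd.DataFrame({
    "department": ["Medicine", "Medicine", "Surgery"],
    "division":   ["Cardiology", "GI", "Transplant"],
    "year":       [2012, 2012, 2012],
    "papers":     [40, 25, 18],
})

# Each drill level is a finer grouping of the same rows, so totals always tie back.
print(pubs.groupby("year")["papers"].sum())                               # top-level KPI
print(pubs.groupby(["year", "department"])["papers"].sum())               # drill to department
print(pubs.groupby(["year", "department", "division"])["papers"].sum())   # drill to division
```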

Process
• The process of analysis brought changes to goals
• Different departments, different peers
• Tied performance incentives to goals – brought engagement

Learning Program Metrics
Breakdown by area:
• Undergraduate
• Graduate Basic Science
• Graduate Medical Education
• Continuing Professional Education
Create metrics for:
• Input
• Throughput
• Output

Example: Undergraduate
• Input: % of admitted students also accepted at peer schools
• Throughput: # abstracts/presentations
• Output: % with a primary care focus

Example: Graduate Basic Science
• Input: % diversity
• Throughput: #/$ of students on institutional fellowships
• Output: % in a science-related position five years post-training

Example: Graduate Medical Education
• Input: #/% in-state residents
• Throughput: #/% house officers engaged in quality or patient safety initiatives
• Output: # publications with house officers as first author

Example: Professional Development
• Input: # regularly scheduled series / # of attendees
• Throughput: % CME participants providing post-course evaluations
• Output: # papers published / # external presentations
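One way to hold the four examples above in a single structure is a small registry keyed by program area and pipeline stage, so every program is reported in the same input/throughput/output frame. A minimal Python sketch (metric names taken from the slides; all values invented):

```python
# Sketch only: illustrative values, not University of Michigan data.
metrics = {
    "Undergraduate": {
        "input":      {"% admitted also accepted at peer schools": 62},
        "throughput": {"# abstracts/presentations": 310},
        "output":     {"% with primary care focus": 38},
    },
    "Graduate Medical Education": {
        "input":      {"#/% in-state residents": 41},
        "throughput": {"#/% house officers in QI/patient-safety work": 55},
        "output":     {"# publications with house officer first author": 120},
    },
}

# The same three-stage frame renders every program the same way.
for program, stages in metrics.items():
    print(program)
    for stage in ("input", "throughput", "output"):
        for name, value in stages[stage].items():
            print(f"  {stage:<10} {name}: {value}")
```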

Snapshot


Challenges
• Keeping current – the environment changes
• Explosion of dashboards – everyone is building them
• Trying to keep each metric in one place
• Maintaining a consistent user interface across the Health System

AAMC Mission Management Tool
• 2008 meeting of the Group on Student Affairs
• Challenge to give medical schools something better than USN&WR
• Reviewed the MSAR for medical school missions
• Six missions selected
• First release March 2009


The Missions Dashboard 2012


ASSET Dashboard
Monitor performance on LCME standards annually.

www.aamc.org/medaps

MedAPS: Suite of Services
Provide AAMC member medical schools with the tools necessary to assess, maintain, and fulfill accreditation standards and promote continuous quality improvement.

• ASSET (Accreditation Standards Self-Evaluation Tool)
• Curriculum Inventory & Reports (replacing CurrMIT)

ASSET Dashboard
• Review performance on LCME standards annually
• Compare performance and curricula with national data
• Compare performance and curricula with peer institutions (see the sketch below)
• Link to AAMC tools and solutions to help address deficiencies
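Peer and national comparison reduces to placing one school's value within a distribution. A minimal Python sketch (all numbers invented; this illustrates the general idea, not the ASSET Dashboard's actual method):

```python
from statistics import mean

def percentile_rank(value, population):
    """Share of the population at or below this school's value."""
    return 100.0 * sum(v <= value for v in population) / len(population)

# Hypothetical values for one metric across schools.
national = [71, 74, 78, 80, 82, 85, 88, 90, 93]   # all schools
peers = [78, 82, 85, 88]                           # chosen peer group
ours = 84                                          # this school

print(f"Our value: {ours}")
print(f"National percentile: {percentile_rank(ours, national):.0f}")
print(f"Peer-group mean: {mean(peers):.1f}")
```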


Populating MedAPS: Data Sources (diagram)
• ASSET (1/3 pre-populated) and the ASSET Dashboard
• LCME AQ Parts I-A, I-B, and II
• Curriculum Inventory and Curriculum Inventory Reports
• AAMC Data Warehouse
• Student Record System
• Graduation Questionnaire
• Faculty Database

MedAPS: Timeline (diagram, 2013–2015)
• ASSET: launch
• Curriculum Inventory & Reports, Phase 1: upload school data
• Curriculum Inventory & Reports, Phase 2: launch service
• ASSET Dashboard: launch

Discussion Questions
• How can dashboards help schools monitor their individual missions? How are schools monitoring their missions without dashboards?
• How can dashboards organize vast amounts of data to keep it from becoming overwhelming?
• What data would be the most useful in a dashboard environment?
• What are the data sources for dashboards? What are the challenges to collecting the data?
• What are some questions you would like to be able to answer?


Summary of Key Findings

Next Steps:
• So what?
• Now what?


References & Resources

Alexander M. Excel 2007 Dashboards & Reports for Dummies. Wiley Publishing, Hoboken, NJ, 2007.
Arizona State University: http://www.asu.edu/dashboard/
Arnold KE. Signals: Applying Academic Analytics. EDUCAUSE Quarterly, 2010.
Baldrige Education Criteria for Performance Excellence. http://www.nist.gov/baldrige/
Eckerson WW. Deploying Dashboards and Scorecards. TDWI Best Practices Report, July 2006.
Elias T. Learning Analytics: Definitions, Processes & Potential. 2011.
Fuchs G. Dashboard Best Practices. LogiXML white paper, 2004.
Hammergren TC, Simon AR. Data Warehousing for Dummies. Wiley Publishing, Hoboken, NJ, 2009.
iDashboard: www.idashboard.com
Scheps S. Business Intelligence for Dummies. John Wiley & Sons, Hoboken, NJ, 2008.
Simpson D, Colbert J, Ferguson K, O'Sullivan P. BI & Dashboards. Presented at the SDRME Annual Meeting, 2011, Madison, WI.
Terkla DG, Sharkness J, Cohen M, et al. Institutional Dashboards: Navigational Tools for Colleges & Universities. Association for Institutional Research Professional Files, Winter 2012, #123. http://www.airweb.org/EducationAndEvents/Publications/Pages/ProfessionalFiles.aspx


Disclosure(s)
I affirm that all persons involved in the planning/content development do not have relevant financial relationships with pharmaceutical companies, biomedical device manufacturers or distributors, or others whose products or services may be considered related to the subject matter of the educational activity.

Partial funding for this MCW project was provided by:
1. The Educational Leadership for the Health of the Public Research and Education Initiative fund, a component of the Advancing a Healthier Wisconsin endowment at the Medical College of Wisconsin
2. The Drs. Elsa B. and Roger D. Cohen Children's Hospital of Wisconsin/Medical College of Wisconsin Student Fellowship in Medical Education
