Conference Sessions


Schedule at a Glance (PDF)

Wednesday, 13 November 3:00 - 3:45 pm

Presentation matters - Creating visually appealing graphs, tables, and text

This interactive session shares tips from renowned experts and sources (Edward Tufte, Stephen Few, Presentation Zen, etc.) on effective strategies for presenting text-based (words), tabular/numerical (numbers), and graphical information. Applying these tips and principles makes documents and presentations informative, clear, accessible, and thought-provoking, advancing IR and data-informed decision-making by initiating meaningful conversations that produce actions with results. All too frequently, weak visuals get in the way of important conversations.  Presented by Anne Marie Karlberg, Whatcom Community College

Don’t Leave It to Chance - Get Organized

The Institutional Research Office is often called upon to do much with little, which can leave staff feeling stretched beyond capacity and terribly disorganized. This session offers ideas for staying organized in the world of institutional research, with contributions from the audience encouraged. It is especially designed to help those who are new to the profession.  Presented by Wendy Olson, Whitworth University

Why Students with High Potential Benefits Might Choose to Leave School

A presentation on survey research into why students choose to leave school for reasons that subvert their own cost/benefit calculus. These reasons range from student self-efficacy to loss aversion to the value of work versus education. For two years, Columbia Basin College students have been surveyed about their own concerns (tied to this research), and we have itemized the practical impact of these moderators on student retention and performance.  Presented by Pär Jason Engle, Columbia Basin College

Wednesday, 13 November 4:00 - 4:45 pm

Academic Advising and Guided Pathways: Fall and Spring Entering Student Experiences

Guided pathways (GP) is a rapidly expanding reform being undertaken by community colleges across the country. One of the central tenets of GP is to help students choose a path, enter a path, and stay on the path through completion. Key to this process is academic advising.

Most students enroll for the first time during the fall term; however, analysis of CCSSE data collected during the spring 2017 term reveals that approximately 19.5% of respondents were enrolling for the first time. This study looks at the advising experiences of fall-term entering students from the 2016 SENSE survey (the same academic year as the 2017 CCSSE survey). Additional variables in this analysis will include common items from the Academic Advising and Planning item sets for SENSE and CCSSE as well as common items from the two main surveys. The analysis will examine these experiences by demographic subgroup. Presented by Michael Bohlig, Center for Community College Student Engagement, UT-Austin

Whom did our peers choose in their IPEDS Benchmark report?

This is a demonstration presentation that will show attendees how to utilize IPEDS Data Center resources to build a Power BI report that shows peer institution comparisons.  Presented by Max Kwenda, Gonzaga University

Owning Your Introversion

Calling all introverts! Does the idea of professional networking sound completely exhausting? Do you prefer to work alone, even though you totally rock it when you work in groups? Well, friend, this session is for you! In a culture that rewards extroverts, there is much to celebrate about introversion. This session will highlight the reality and the [many] pros of being an introvert. We will also provide opportunities for you to reflect on your own and with others to learn strategies to be an introverted institutional researcher who is regularly requested to take the center stage. At the end of the session, you will learn how to WORK that introversion!  Presented by Lisa Nguyen, Clackamas Community College and Elizabeth Lee, University of Portland

Thursday, 14 November 10:45 - 11:30 am

When Data Custodians Go Rogue

In a large organization with an entrenched culture of autonomy, there was a strong and immediate need to develop a collaborative process to define and approve metadata for new and existing business terms. To solve this issue, a student data business term vetting and approval process was created from scratch, validated through a proof of concept, and is successfully used today. In this session, you will learn why and how the process was built, how the proof of concept played out, how the process is being used today, and how it is expanding into other subject areas.  Presented by Stephanie Harris and Keith Van Eaton, University of Washington

Doing IPEDS Analysis the Easy Way

Downloading and compiling IPEDS data for longitudinal or cross-institutional analyses is a common pain point for institutional researchers. Although the NCES makes these data readily available, handling the files often necessitates repetitive and unreproducible workflows, particularly when data from several collections need to be combined. This presentation will focus on the use of the “ipeds” R package, a tool developed by institutional researchers that automates much of this tedious work. It will also demonstrate some graphical and statistical applications of the resulting data that could be the basis of peer analyses or related work. Code used in this presentation will be available from the author’s GitHub.  Presented by Michael Smith, Portland State University
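For a flavor of the compile step such tools automate, here is a minimal Python sketch of stacking per-year survey files into one longitudinal table keyed on UNITID. The file contents, column names, and values below are invented stand-ins; the session itself uses the “ipeds” R package rather than this code.

```python
import csv
import io

# Hypothetical stand-ins for per-year IPEDS survey files downloaded from NCES.
files = {
    2018: "UNITID,INSTNM,ENRTOT\n100654,Example A&M,5000\n100663,Example State,12000\n",
    2019: "UNITID,INSTNM,ENRTOT\n100654,Example A&M,5100\n100663,Example State,11800\n",
}

def compile_longitudinal(files_by_year):
    """Stack per-year files into one long table, tagging each row with a YEAR column."""
    rows = []
    for year, raw in sorted(files_by_year.items()):
        for rec in csv.DictReader(io.StringIO(raw)):
            rec["YEAR"] = year
            rows.append(rec)
    return rows

panel = compile_longitudinal(files)
print(len(panel))  # 4 rows: 2 institutions x 2 years
```

The same shape (one row per institution per year) is what longitudinal peer analyses typically start from.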

Ethical Foundations for an Evolving Field: AIR's New Statement

As the field of IR evolves, so too must the ethical foundation that undergirds our work. In response to emerging topics, such as the proliferation of analytics, big data, and vendors who offer myriad solutions to our problems, the AIR Board of Directors sought to examine the Association’s long-standing Code of Ethics and Professional Practice. It became clear that both ethics and best practices were blended in a single code, which may be helpful for professionals in search of a prescriptive document to which they can refer, but the overarching ethical values and principles could get lost in the details. Furthermore, the code was not adaptable in a timely fashion to address new trends. As such, the Board drafted a new Statement of Ethical Principles that is poised to be dynamic and inclusive in the use of data to facilitate insights and improve decision making in pursuit of strengthening higher education. Join us for a conversation about the new statement and the role of ethics in IR. Presented by Michelle Appel & Leah Ewing Ross, AIR

Thursday, 14 November 11:45 am - 12:30 pm

Provocative Questions on Campus: Using Data to Develop Courageous Answers

In higher education, we often fall victim to collecting data with only specific purposes in mind. But the power in an established data lake can go well beyond its original intention. Rather than limiting data use to meet original goals, we should encourage campus stakeholders to think big picture and dare to ask provocative questions that we can then explore and examine. In this presentation, we provide data-based courageous answers to questions that many ponder but few have systematically attempted to explore. We discuss why the question is important, where the data utilized comes from, and what implications initial answers suggest for the future of higher education. Most importantly, we help campuses begin to see how they can start asking their own provocative questions.  Presented by Shannon LaCount, Campus Labs

Recurrent Transfer Studies in BC: Changing with Changing Times

How well do existing transfer pathways function for learners? How well do these learners perform when they transfer between institutions? These metrics are key for learners, post-secondary institutions, and transfer-related agencies. The BC Council on Admissions and Transfer has undertaken three major recurrent reports on transfer and student mobility in BC: The BC Transfer Students Profile and Performance Report (since 1992), The Survey of Mobile Students (since 2012), and The Credits to Graduation Report (since 2010). In addition to sharing findings from the most recent studies, this session will focus on how the needs of the various audiences have evolved, how the research results influenced transfer pathways and transfer students, and how changes in the external environment have led to new data sources and methodologies for measuring transfer system effectiveness in BC.  Presented by Robert Adamoski and Anna Tikina, BC Council on Admissions & Transfer

Natural Language Processing (NLP): Analyzing & Visualizing Qualitative Survey Data

Analyzing and visualizing unstructured data, such as open-ended survey responses, is inherently complex and poses unique challenges. Powerful NLP algorithms are available that allow for the efficient processing of qualitative data. Gonzaga University’s Office of Institutional Research conducted sentiment analysis on open-ended responses in a housing survey to identify the Resident Advisors (RAs) who received the most positive reactions. Additionally, I conducted topic modeling to visualize the most meaningful themes within the responses to each RA. Using the R language and popular NLP packages, I wrote two functions that automated the text processing and visualized the results in Power BI. Participants of this presentation will learn the underlying methodology of each algorithm, how to recognize and mitigate their challenges, and when to take advantage of their strengths.  Presented by Joe Siddons, Gonzaga University
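The sentiment-scoring step described above can be illustrated in miniature with a toy lexicon-based scorer. The lexicon, responses, and RA labels below are invented for illustration; the session’s actual work uses R NLP packages and far richer models.

```python
# Tiny illustrative sentiment lexicon (word -> polarity score).
LEXICON = {"helpful": 1, "friendly": 1, "great": 1, "rude": -1, "unavailable": -1}

def sentiment(text):
    """Sum lexicon scores over lowercased, punctuation-stripped tokens."""
    tokens = [w.strip(".,!?").lower() for w in text.split()]
    return sum(LEXICON.get(w, 0) for w in tokens)

# Invented open-ended responses grouped by Resident Advisor.
responses = {
    "RA_1": ["My RA was helpful and friendly!", "Great events."],
    "RA_2": ["Often unavailable.", "A bit rude, honestly."],
}

# Average sentiment per RA, mirroring a most-positive-reactions ranking.
scores = {ra: sum(map(sentiment, texts)) / len(texts) for ra, texts in responses.items()}
best = max(scores, key=scores.get)
print(best)  # RA_1
```

Real sentiment models weight negation, intensifiers, and context, but the aggregate-then-rank pattern is the same.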

Where Are Our Students Going? Making Clearinghouse Data Accessible at University of Portland

This presentation outlines how the IR office at the University of Portland makes National Student Clearinghouse Data available to stakeholders across campus in an effort to explain where students go after or instead of attending UP. The presenters will describe the process of cleaning Clearinghouse data and building Tableau Dashboards in ways that make that data usable to the campus community.  Presented by Michael Johnson and Elizabeth Lee, University of Portland

Thursday, 14 November 2:45 - 3:30 pm

Transgender in the Academy: Challenges and Opportunities for the Gender Variant Employee and Student

Transgender persons are facing growing acceptance: more and more transgender people are socially transitioning at a younger age, and more and more transgender adults are transitioning in the workplace. However, the murder of transgender persons, especially transgender women of color, continues to be all too common. At a legal level, many states in the United States continue to attempt to pass or successfully pass “bathroom bills” that criminalize the existence of transgender persons. In April 2019 a ban on transgender persons serving in the U.S. military went into effect, even though transgender persons are free to serve in the military in Australia and the United Kingdom. This workshop provides basic knowledge of the challenges and opportunities faced by transgender students and employees in higher education.  Presented by Stephanie Dykes, North Seattle College

Let’s Understand Natural Language Processing for Institutional Research (It’s more than just word clouds!)

Managing open-ended comments from campus surveys, whether from student course evaluations, employee engagement, or alumni outcomes, can be a daunting task. Learn what natural language processing, or text analysis, is and how institutions can benefit from incorporating this field of data science into their survey research. Attendees can expect to gain an understanding of different text analytic techniques, to learn from some real case studies, and to see it in action with some live coding in Python.  Presented by Kevin Chang, Kai Analytics and Survey Research Inc

Helping students FIG-ure it out

Freshman seminars are a ubiquitous offering in U.S. higher education. Though these seminars have been evaluated in numerous studies, most studies have done so without employing matched comparison groups and using data at scale. In this work, we use data on nearly 58,000 students across 18 years at a public U.S. university to examine the impact of first-year interest groups (FIGs) on student graduation and first-year retention. Using rich data from university databases and external sources, we apply propensity score matching to account for selection bias and confounding variables when comparing students. We find that graduation and re-enrollment rates for FIG students were higher than non-FIG students, an effect that was more pronounced for self-identified Hispanic students and self-identified under-represented minority students. Additionally, we analyze survey responses from over 12,500 FIG students to find that social aspects of the seminars, particularly making friends and knowing others taking the same classes, were the most beneficial to students. Interestingly, references to these social aspects were not disproportionately present in the responses of self-identified Hispanic students and self-identified under-represented minority students.  Presented by Lovenoor Aulck, University of Washington
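The matching step at the heart of propensity score matching can be sketched as greedy 1:1 nearest-neighbor matching within a caliper. The scores and unit labels below are invented; in the study above, scores would come from a model of FIG enrollment on student covariates.

```python
def greedy_match(treated, control, caliper=0.05):
    """Match each treated unit to the closest unused control within the caliper.

    treated/control: dicts mapping unit id -> propensity score.
    Returns a list of (treated_id, control_id) pairs.
    """
    pairs, used = [], set()
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        best, best_gap = None, caliper
        for c_id, c_score in control.items():
            gap = abs(t_score - c_score)
            if c_id not in used and gap <= best_gap:
                best, best_gap = c_id, gap
        if best is not None:
            used.add(best)
            pairs.append((t_id, best))
    return pairs

# Invented propensity scores for two FIG students and three non-FIG students.
treated = {"T1": 0.32, "T2": 0.71}
control = {"C1": 0.30, "C2": 0.69, "C3": 0.95}
print(greedy_match(treated, control))  # [('T1', 'C1'), ('T2', 'C2')]
```

Production analyses typically prefer optimal or kernel matching and check covariate balance afterward; greedy matching is just the simplest variant to show.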

Building Partnerships for Analytic Initiatives

In a culture where access to data is democratized, and analytics become decentralized, how should organizations demarcate responsibility for decision support? Portland State University has adopted a collaborative approach to laying the foundation for institutional analytics with contributions from IR, IT and business analysts. This session will discuss how the skills and perspectives from these domains can complement each other in higher education analytics initiatives and how to foster collaborative relationships in your institution.
The relationships that develop from fostering collaboration between these three areas are not only productive, but also fun and provide greater career satisfaction to team members. IT members have the opportunity to see the business impact of their labor. IR members find automated solutions to their pain points when accessing and preparing data. Business analysts have an opportunity to shape enterprise analytics to meet the business needs of their unit.   Presented by David Burgess and Alison Nimura, Portland State University

Thursday, 14 November 3:45 - 4:30 pm

Using IPEDS data to find market comparators

According to a recent Gallup survey, about 43% of American employees think they’re underpaid. Part of Portland State University’s compensation goal when evaluating these concerns is to determine whether we are paying a competitive salary. Unfortunately, our previous list of comparator institutions was aspirational, used to establish targets for where we would like to be in the future in terms of faculty mix, student enrollment, research, degree completion, and so forth. In addition, we wanted to determine and address any market inequities for our existing full-time faculty, and to offer competitive salaries when hiring new faculty. We needed a list of peer comparator institutions in order to determine the market salary rate at universities most similar to us.

The proposed presentation will discuss the steps we went through to establish our peer comparator institutions: using focus groups to help clarify the key factors that make Portland State unique, and analyzing IPEDS data on those factors in order to find a statistically valid comparator list.

Compensation can be an emotional issue. We will discuss not just the technical aspects of defining our comparator group, but also the often more difficult challenge of achieving university buy-in for the group, including stakeholder identification and management, and communication. Finally, we will discuss how we are using the comparator group, in conjunction with CUPA-HR data, to determine market inequities and establish starting salaries for our new hires.  Presented by Nina De Lange and Adis Sehovic, Portland State University
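One common way to turn IPEDS factors into a statistically grounded peer list is to standardize each metric and rank candidates by distance to the home institution. The metrics, values, and institution names below are invented stand-ins, not Portland State’s actual method or data.

```python
import math

# Invented IPEDS-style metrics for a home institution and two candidates.
data = {
    "Home U": {"enroll": 22000, "grad_rate": 0.55, "research_exp": 60},
    "Peer A": {"enroll": 24000, "grad_rate": 0.57, "research_exp": 55},
    "Peer B": {"enroll": 8000,  "grad_rate": 0.80, "research_exp": 10},
}

def zscores(data):
    """Standardize each metric across institutions (population std dev)."""
    keys = next(iter(data.values())).keys()
    out = {name: {} for name in data}
    for k in keys:
        vals = [v[k] for v in data.values()]
        mean = sum(vals) / len(vals)
        std = math.sqrt(sum((x - mean) ** 2 for x in vals) / len(vals)) or 1.0
        for name in data:
            out[name][k] = (data[name][k] - mean) / std
    return out

def rank_peers(data, home):
    """Sort candidate institutions by Euclidean distance to `home` in z-space."""
    z = zscores(data)
    dist = lambda a, b: math.sqrt(sum((z[a][k] - z[b][k]) ** 2 for k in z[a]))
    return sorted((n for n in data if n != home), key=lambda n: dist(home, n))

print(rank_peers(data, "Home U"))  # Peer A is the closer match
```

Variants of this approach weight metrics by importance (e.g., from the focus-group findings) before computing distances.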

Please Bear with Us: Attempting the Transition from Line Cooks to Executive Chefs

While creating excitement around the data you have is important, developing consistent and accurate methods for continued sharing is of even greater importance. After all, what good is the data if you either a) cannot continually access it, or b) cannot depend on its accuracy? Our institutional research office has focused on developing interest in and understanding of our data. We have been focused on helping others realize the usefulness of the data we collect and assisting them in making informed decisions based on it. However, a recent shift has occurred that carried a certain amount of inherent risk. One major risk was losing the enthusiasm surrounding our data as we put requests on hold so that methods could be developed for data accuracy and efficiency. Creating useful dashboards has been a top priority, and we have taken a collaborative approach, both at our institution and across the state, in order to move from the quick-order line cook position to an executive chef role. In our session we will discuss how we have been evaluating our data needs, exploring the various options for creating and maintaining reports, and how we continue to support the excitement around our data.  Presented by Lauren McGuire and Lisa Anh Nguyen, Clackamas Community College

Visualizing College Program Retention and Completion

Langara College’s academic programs frequently ask the Institutional Research department for their student retention rates. A common definition of student retention considers a cohort of students who begin in fall term and return the following fall. However, some Langara programs take less than a full academic year to complete; we have programs with intakes in all terms, not just in fall; and we offer laddering programs with overlapping requirements, so that students can, and do, earn one credential while registered in another program. When the traditional definition doesn’t apply, is there a meaningful way to report retention for all our programs? How do we visualize the data as students switch programs, stop out, and earn credentials? Our goal was to make a self-serve dashboard available so that academic leadership could easily find the retention data they need. This presentation discusses the challenges Langara IR faced in setting definitions, preparing data, and creating interactive visualizations in Tableau. We use area charts to visualize retention and graduation of an incoming cohort over time, and Sankey diagrams to visualize flows between programs from term to term.  Presented by Kyle Schoenfeld and Courtney Fabri, Langara College
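The data-prep step behind a term-to-term Sankey diagram is essentially counting transitions between program states for each student. The states and records below are invented for illustration; real states would come from Langara’s enrollment and credential data.

```python
from collections import Counter

# Invented records: (student id, program state in each consecutive term).
records = [
    ("s1", ["Nursing", "Nursing", "Graduated"]),
    ("s2", ["Nursing", "Business", "Business"]),
    ("s3", ["Business", "Stopped Out", "Business"]),
]

def term_flows(records):
    """Count (from_state, to_state) transitions between consecutive terms."""
    flows = Counter()
    for _sid, states in records:
        for a, b in zip(states, states[1:]):
            flows[(a, b)] += 1
    return flows

flows = term_flows(records)
print(flows[("Business", "Stopped Out")])  # 1
```

These (source, target, count) triples are exactly the link table a Sankey charting tool consumes.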

Broaden Your Connections and Evolve Your Data Experience

Empowering your data begins with enhanced connections across departments and campus. Breaking down silos and building a strong, connected foundation for unified information gives you insights to make better decisions. Join us to discover how Campus Labs can help you unlock your data connections and evolve the communication of your data ecosystem. Sponsor presentation by Michael Weisman, Campus Labs

Friday, 15 November 10:15 - 11:00 am

Data Modeling to Create Operational Efficiency

This presentation details how Gonzaga University’s Office of Institutional Research and the Undergraduate Admissions Operations office worked to create a data model that exposed duplicate records of student prospects, including those from uploads of standardized test scores in our ERP system – Ellucian Banner. I utilized the Power Query platform and DAX language within the data visualization tool, Power BI, to identify several kinds of duplicated records. The developed product allows our Admissions Operations Office to prioritize and target their data cleaning effort. This presentation discusses the role of institutional research in operational practices, explains what constitutes a data model, and shows how to use Power BI for data preparation and cleaning.  Presented by Joel Silvius, Gonzaga University
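A common duplicate-detection pattern, grouping records by a normalized matching key, can be sketched as follows. The field names and records are invented; the model described above was built in Power Query/DAX against Banner data, not in Python.

```python
from collections import defaultdict

def match_key(rec):
    """Normalize name + birth date into a blocking key for duplicate checks."""
    # Lowercase, strip punctuation, and sort name tokens so
    # "Jane Doe" and "DOE, Jane" produce the same key.
    tokens = sorted("".join(ch if ch.isalpha() else " " for ch in rec["name"].lower()).split())
    return (" ".join(tokens), rec["dob"])

def find_duplicates(records):
    """Return only the keys that map to more than one record id."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec["id"])
    return {k: ids for k, ids in groups.items() if len(ids) > 1}

# Invented prospect records, e.g. one keyed entry and one test-score upload.
records = [
    {"id": 1, "name": "Jane Doe",  "dob": "2001-04-05"},
    {"id": 2, "name": "DOE, Jane", "dob": "2001-04-05"},
    {"id": 3, "name": "John Roe",  "dob": "2000-01-02"},
]
dupes = find_duplicates(records)
print(dupes)  # ids 1 and 2 share a key
```

Grouping by key like this lets staff prioritize which clusters to review rather than comparing every record pairwise.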

Not Sharing Institutional Knowledge is No Joking Matter

A registrar, an analyst, and a DBA walk into a <insert beverage retailer of choice here>.  One says, “What do you get when you have an institution that employs more people than the population of the state capital, with about an equal number of student enrollments, and a homegrown data system older than the students whose data it holds?”  Answer: an amazing, talented group of people who use this common data system to respond to a vast range of policy, process, operational, and reporting questions; three different answers to the same question; and frustration that all of these answers are correct.

From this frustration, SDC was born. The Student Data Council (SDC) is a cross-campus, collaborative group of people who range in technical skills from back-end developers creating the data systems, to analysts using the data to help develop policies and drive decisions, to academic advisors who consume the data to improve our students’ experience. Often the primary (only?) thing that these people have in common is that they somehow, someway, somewhere… use student data.

Join us to learn how this grassroots group developed, growing into a cross-functional, knowledge-transfer resource that breaks down silos, shares best practices, builds relationships, and avoids the pitfall of becoming nothing but a gripe session.  Presented by Stephanie Harris and Helen Garrett, University of Washington

BC Student Outcomes: 30 years of Innovation

BC Student Outcomes Research has been surveying students in the province for over thirty years. The information collected has been used to inform prospective students about career opportunities and the quality of education at the public post-secondary institutions through interactive online dashboards. The BC Ministry of Advanced Education, Skills and Training (AEST) also uses these data to set and monitor accountability targets for post-secondary institutions.

This presentation is an introduction to BC Student Outcomes Research, an overview of its publications, and a look at some of the innovations used by BC post-secondary institutions to reinforce internal accountability.  Presented by Stephen Salem, Coast Mountain College and Christine Chan, Capilano University

Using Data to Drive Student Success: Benchmarking and Data Collection for Accreditation

NWCCU's 2020 Standards for Accreditation require institutions to collect data related to student learning and student achievement, and to use those data to drive institutional decision-making to promote student success and reduce equity gaps. This presentation is an opportunity for PNAIRP members and conference participants to provide input into the development of the processes that will guide and support these institutional transformation efforts in the years to come.  Presented by Mac Powell, NWCCU

Friday, 15 November 11:15 am - Noon

Degree of Difference: What Do Learning Outcomes Say About Higher Education?

Learning outcomes statements are, or should be, a fundamental aspect of a student’s higher education experience—the transformative aspiration one opts into when choosing an institution to attend and a program of study. Students enter college with the expectation they will know and do more when they leave than they could on their first day of enrollment—and learning outcomes should define that change. This presentation will include results from analysis of over 15,000 learning outcomes statements to determine if there is a meaningful difference between learning experiences at various institutions and programs. We will discuss the impact of the results and offer action steps for participants to take back to their own campuses.  Presented by Shannon Rose LaCount, Campus Labs

Sanity in Statistics: Finding a healthy balance in a high stress field

As any IR/IE Director or Analyst will tell you, crunching numbers, fulfilling institutional requests for data, and accreditation work can take a toll on mental (and eventually, physical) health. This session will cover the issues many directors and analysts face, the "dos and don'ts" to share with those who work outside of IR/IE, and effective coping strategies for moments of high stress.  Presented by Carissa Coslow, Bastyr University

Living with Limited Access to Financial Aid Data

Recent clarifications and guidance from the Department of Education have required many campuses to re-evaluate their policies and practices around access to student-level financial aid information. Consequently, institutional researchers are finding that data they once relied upon to meet reporting needs, evaluate initiatives, and conduct research are no longer available to them. This presentation will discuss how Portland State University responded to the DoE’s new guidance, detail how its institutional research and financial aid offices modified their processes to cope, and offer suggestions on alternative metrics that can substitute for those that are no longer accessible.  Presented by Michael Smith, Portland State University

PNAIRP Federal Tax ID: 91-1132292
