
2009 Conference

PNAIRP Conference Schedule (PDF)

The Pacific Northwest Association for Institutional Research and Planning was pleased to hold the 2009 Conference in Portland, Oregon. This was our 30th annual conference and took place September 30th through October 2nd at the University Place Hotel and Conference Center.


The PNAIRP annual conference offered a unique opportunity to exchange ideas, share results and discuss concerns with fellow members of our association within a supportive and collegial environment. The theme was:

Informing Organizational Intelligence

The issue of informing organizational intelligence through our work is a broad one, and the presentations were similarly broad in their scope.


The Hotel and Conference Center operated by Portland State University offered rooms for the conference rate of $89 per night. It was located on the edge of Portland's Cultural District (Oregon History Museum, Portland Art Museum, Portland State University, restaurants and the South Park Blocks) and close to the Portland Streetcar line.

For more information about the Hotel and Conference Center see the University Place website.


Wednesday, September 30  

1:15-2:45 pm Pre-Conference Workshops


Fundamentals of Survey Research

Astoria Room

Facilitator: Paul Stern, Washington State University – SESRC

Poorly designed surveys waste time, money, and staff resources. This pre-conference workshop will help participants decide whether a survey is the right way to get at the needed information, and will include the following topics:

1) how to ask good questions and minimize survey error;

2) tricks for getting at difficult issues;

3) the relative merits of various survey modes (web, mail, phone); and

4) other areas participants may want to discuss, including sampling and weighting results.

Bring your questions to this interactive session.



Transforming Raw Data into Visual Communication with Tableau Software

Coos Bay Room


Facilitators: Robin Bunnell, Southwestern Oregon Community College and BJ Nicoletti, Linn-Benton Community College

This pre-conference workshop will demonstrate the use of Tableau software as a tool that can help transform data into highly visual output. Applied examples of fast analytics and visualization, using dashboards and interactive filters, will be demonstrated with data from:

1) the Community College Survey of Student Engagement (CCSSE)

2) Oregon’s Community College Uniform Reporting System (OCCURS)

3) the Survey of Entering Student Engagement (SENSE)




What Kind of Regression Should I Use with My Data?

Wahkeena Falls Room


Facilitator: Gordon Bower, University of Alaska Fairbanks

This pre-conference workshop will introduce and demonstrate extensions and modifications of the linear regression model designed to cope with data issues that we encounter. It will include a discussion of data types and characteristics, and introduce techniques to expand the tools that we have for data analysis. In the second half of workshop those techniques will be applied to several real-world data sets.




Thursday, October 1

10:30 – 11:15 am Concurrent Sessions Block 1


University Economic Contribution Analysis

Astoria Room

William O’Shea, Pacific University

Many colleges and universities are interested in how their economic activity in their state or local area is associated with additional economic activity in that area as the institution’s employees and vendors spend dollars received from the institution.

An important distinction in analyzing an institution’s economic activity is whether the focus is on net (i.e., impact analysis) or gross (i.e., contribution analysis) economic activity. This economic contribution analysis focused on estimating the economic activity in Oregon supported by Pacific University’s operational spending in FY08. The results are useful as an estimate of the gross economic activity in the Oregon economy attributable to Pacific University. What’s more, using specific terminology for the type of analysis provides some transparency regarding the interpretation of the results.



Assessment 101

Coos Bay Room

Anne Marie Karlberg, Whatcom Community College

This presentation will provide an introduction to and overview of assessment for those new to the field. Assessment will be defined; the component parts of successful assessment programs will be identified; and three types of assessment indicators (direct indicators, indirect indicators, and institutional data) will be discussed, with examples of each.

Three levels of assessment will be presented and the two phases in creating effective outcomes assessment will be identified and discussed.



A Stage-based Approach to Identifying Obstacles to Degree Completion

Multnomah Falls Room


Gordon Bower, University of Alaska Fairbanks

When retention rates and time-to-degree are studied only on a “semesters elapsed since first-time freshman” or “number of credits earned” basis, it is impossible to distinguish students who make steady progress toward one degree before changing major or dropping out from those who “wander aimlessly.”

Within a degree program, we identify the longest sequence of courses, each requiring the previous as a prerequisite, that are required to earn that degree. We then examine the enrollment history of each student in that degree program, using completion of each course in that sequence as a milestone. We can then describe how rapidly typical students progress through the sequence and identify points in the program where students are most likely to drop out, change their major, or require several attempts to complete a required course.
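The spine-and-milestone idea above can be sketched in a few lines of Python; the course catalog here is a hypothetical illustration, not data from the actual study:

```python
# Hypothetical catalog: each course maps to the courses that must precede it.
prereqs = {
    "MATH101": [],
    "MATH201": ["MATH101"],
    "MATH301": ["MATH201"],
    "STAT200": ["MATH101"],
    "CAPSTONE": ["MATH301", "STAT200"],
}

def longest_chain(course, prereqs, memo=None):
    """Longest sequence of courses, each a prerequisite of the next, ending at `course`."""
    if memo is None:
        memo = {}
    if course not in memo:
        best = []
        for p in prereqs[course]:
            chain = longest_chain(p, prereqs, memo)
            if len(chain) > len(best):
                best = chain
        memo[course] = best + [course]
    return memo[course]

# The program's "spine": the longest prerequisite chain overall.
spine = max((longest_chain(c, prereqs) for c in prereqs), key=len)

def milestone(completed_courses, spine):
    """How far along the spine a student has progressed, counting consecutive completions."""
    reached = 0
    for course in spine:
        if course not in completed_courses:
            break
        reached += 1
    return reached
```

For this catalog the spine is MATH101 → MATH201 → MATH301 → CAPSTONE, and a student who has completed MATH101 and MATH201 is at milestone 2; a student who skipped MATH101 has not yet reached the first milestone.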




11:30-12:15 pm Concurrent Sessions Block 2


Toward a Cost-Benefit Analysis: Estimating the Increase in Retention and FTE Enrollment from College Survival and Success Courses

Astoria Room


Robert Vergun, Portland Community College

The Office of Enrollment Services at our institution was interested in assessing whether the cost of offering a free one-credit course on College Survival and Success is offset by the FTE revenue generated from the increase in the term-to-term retention of students. As part of the assessment, the Office of Institutional Effectiveness was asked to estimate the increase in FTE enrollment in subsequent terms as a result of that course. This analysis goes beyond simply asking whether FTE enrollment has increased as a result of the course offering, and instead asks to what extent it has increased. The analysis uses a self-selection regression model to measure the increase in FTE enrollment in subsequent terms, controlling for the characteristics and enrollment patterns of students. The goal of this presentation is to offer a step-by-step template for researchers asked to conduct similar cost-benefit analyses.



Who are our successful students?

Astoria Room

Qin Liu, British Columbia Institute of Technology

Serving student success and enhancing the learner experience are clearly stated in the institution’s mission statement and strategic plan. The institution is interested in knowing who its successful students are and what factors have contributed to their success. To answer these questions, this presentation reports findings from an analysis of the 2009 Full-time Student Survey. Student success is measured by five variables: academic achievement, student satisfaction, attainment of educational goals, gains in personal development, and gains in practical competence. These variables are aligned with the dimensions of student success identified by Kuh and his associates (Kuh et al., 2007). Cluster analysis was employed to create groups of successful students and at-risk students.

Preliminary results show that among the groups created, the successful students perceived the highest gains in practical competence and personal development, were most satisfied with their learning experience, and reported the highest level of goal attainment, whereas their academic performance was not significantly different from that of other students. Results also show that the successful students were more academically and socially engaged than the other groups. How these students perceived their academic challenges and institutional support was also examined. The findings will help the institution better understand which students are likely to succeed and which tend to struggle, and will inform institutional decisions to enhance student success. This study also contributes to a better understanding of student success issues in a non-university setting.
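Cluster analysis of the kind described can be illustrated with a minimal k-means sketch; the student data below are synthetic stand-ins, not the survey’s actual variables or results:

```python
def kmeans(points, k, iters=100):
    """Minimal k-means: group students by scores on several success measures.

    Deterministic initialization (first and last points) keeps this sketch
    reproducible; a real analysis would standardize variables and try
    multiple random starts.
    """
    centroids = [points[0], points[-1]] if k == 2 else points[:k]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each student to the nearest centroid (squared distance)
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        # recompute each centroid as the mean of its cluster
        new_centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
        if new_centroids == centroids:  # converged
            break
        centroids = new_centroids
    return centroids, clusters

# Each tuple: (satisfaction, perceived gains) on a 1-5 scale -- synthetic students.
students = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.2),
            (5.0, 5.0), (5.1, 4.9), (4.8, 5.2)]
centroids, clusters = kmeans(students, k=2)
```

With well-separated synthetic data like this, the algorithm recovers the two obvious groups in a single pass; real survey data would need five standardized dimensions and more care with cluster counts.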



How Long Does it Take Students at an Urban University to Complete Their Degrees: Traditional vs. Non-Traditional Students?

Multnomah Falls Room


Lina Lu, Portland State University

Portland State University (PSU) is a four-year urban institution with a high percentage of non-traditional students: roughly 60% transfer students, 40% part-time students, and 60% older students (age > 25). What is the average time for PSU undergraduates to complete their bachelor’s degrees? How do traditional students (first-time freshmen) differ from non-traditional students in time to degree? This study tried to answer these questions.

This study took 2008-2009 graduates as an example to track time to degree completion using indicators such as traditional/non-traditional status, full-/part-time status, first-term GPA, age, and academic major. The study also focused on the relationships of these indicators with time to degree completion. The findings will provide useful information for administrators, instructors, and decision-makers to adjust policies and strategies and to provide better services for students.



2:00 – 2:45 pm Concurrent Sessions Block 3

Multi-institutional Data-sharing: Challenges, Issues, and Lessons Learned from the Urban Transfer Research Network (UTRN)

Astoria Room

Juliette Stoering, Portland State University

Fauzi Naas, Chemeketa Community College

Jim Posey, University of Washington Tacoma

Judy Redder, Clackamas Community College

Ron Smith, Portland Community College

Shanda Diehl, Clark College

This session provides information about the issues and challenges institutional researchers face when partnering in multi-institutional data-sharing research projects. Panelists’ remarks will be based on their experiences as partners in the Urban Transfer Research Network (UTRN). The UTRN is a collaborative multi-institutional research project, funded by Lumina, focused on underrepresented students who begin their postsecondary careers at community colleges with the goal of attaining at least a bachelor’s degree. Over the course of four years, UTRN partners have developed cohort-tracking guidelines, data element specifications, and procedures for matching shared data sets while protecting the confidentiality of student records. Panelists represent diverse perspectives, including community college and university partners as well as founding and new UTRN members.



Automatically Updating Publications: Linking Excel and InDesign

Coos Bay Room


Jonathon Jacobs, Oregon University System

InDesign has an option that allows linking of placed Word or Excel files. Any time content in the source Excel file changes, InDesign alerts the user that the source document has changed and asks whether an update should be performed; accepting it applies the changes automatically. This tool can be useful for any office that regularly develops non-Excel-based reports while using Excel for its spreadsheet and calculation tools. It eliminates the need for cut-and-paste and ensures consistency between the Excel and publication (InDesign) versions of a document.

This session will demonstrate the use of automatic links in InDesign and how they can streamline and automate regularly produced publications. Tips and tricks, formatting, and limitations will be covered. The session will be most useful for those with some exposure to Adobe InDesign publication software; for those who have not used the software, it will demonstrate what is possible with linking.


Google Me This, Batman

Multnomah Falls Room


Maureen Pettitt, Skagit Valley College

The purpose of this presentation is to provide strategies for conducting more successful searches on Google and Google Scholar. Participants will see how modifying search words and characters can change the outcome of the searches, and will also be asked to share their tips and tricks for more effective searching.
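A few of Google's standard, documented search operators illustrate the kinds of query modifications a session like this covers (the queries themselves are made-up examples):

```text
"institutional research" retention               quotes force an exact-phrase match
graduation rates site:nces.ed.gov                restrict results to one site or domain
enrollment report filetype:pdf                   return only PDF documents
assessment -k12                                  a leading minus excludes a term
"learning communities" OR "learning community"   match either phrase
```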





3:15 – 5:00 pm Concurrent Sessions Block 4

Two Sides of the Same Coin: Community College and University Faculty, Staff, and Administrator Views of Factors That Affect Student Transfer and Baccalaureate Attainment

Astoria Room

Juliette Stoering & Peter Collier, Portland State University

This research complements quantitative analyses addressing community college (CC) student transfer and degree completion. The researchers conducted interviews with faculty, staff, and administrators at CCs and the university in an urban region and asked what structural and interpersonal factors facilitate or hinder community college student transfer and baccalaureate completion. Of particular interest were factors associated with success for underrepresented groups. Researchers developed a codebook using ATLAS.ti and analyzed the coded transcripts for differences by institution type. Results are discussed in terms of integration with previous quantitative findings.



Reviewing Programs En Masse and In Depth

Coos Bay Room


Dawn MacDonald, Yukon College

This session will report on a research project designed to review a cluster of related programs in Health, Human Services and Education and to assess the effectiveness of a ‘thematic’ approach to program review.

Programs are classified into six clusters cutting across departments and schools.

Thematic clusters of programs are reviewed on a six-year cycle, and faculty from across the institution are engaged in the process. The review assesses the overall health of programs and makes recommendations for program improvement; programs identified as needing more in-depth treatment are submitted for a targeted Program Evaluation.



Making Sense of Board Goals and Assessing College Efforts to Achieve Them . . . and doing it quickly before they change!

Multnomah Falls Room


Brynn Pierce & Chris Egerston, Central Oregon Community College

This presentation reports on the efforts to build a connection between Board goals, the initiatives undertaken to move toward achieving those goals, and the assessment of those initiatives. It will include discussion of mapping institutional effectiveness reporting to those goals and building institutional capacity to collect and report the information required to measure progress. Information will also be presented on communicating and facilitating the Board’s use of the information for strategic planning.




Friday October 2

9:00 – 9:45 am Concurrent Sessions Block 5


Holes in National Student Clearinghouse Data

Astoria Room


Paul Stern, Washington State University – SESRC

We conduct regular matches of Washington high school graduate data with the National Student Clearinghouse and with public college information in Washington State. By comparing results from these two matching sources, we have identified areas where NSC data is incomplete.

This session will share some of the under-counts we have discovered in NSC and share our thoughts about why those holes exist. We will include time for colleges to share their own successes and challenges working with NSC data.
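The two-source comparison described above amounts to simple set arithmetic once each graduate cohort has been matched against both sources; the IDs below are synthetic placeholders, not real match results:

```python
# Student IDs found enrolled via each matching source (synthetic examples).
# In practice these come from matching a high school graduating cohort
# against state public-college records and against NSC returns.
state_college_matches = {"S01", "S02", "S03", "S04", "S05"}
nsc_matches = {"S02", "S03", "S06"}

# Students the state data shows enrolled but NSC misses:
# a lower bound on the NSC undercount for this cohort.
missing_from_nsc = state_college_matches - nsc_matches
undercount_rate = len(missing_from_nsc) / len(state_college_matches)
```

The symmetric difference is also informative: students NSC finds that state data misses (e.g., out-of-state or private enrollments) are expected, while the reverse direction exposes the holes this session discusses.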



A New Web-Based Tool for Assessing the Student Experience in Learning Communities

Multnomah Falls Room


Maureen Pettitt, Skagit Valley College & Gillies Malnarich, Washington Center for the Improvement of the Quality of Undergraduate Education

This session will focus on a new web-based tool for assessing students’ experiences in Learning Communities. We will describe how the questions were initially developed and the multiple approaches utilized to refine the questions to arrive at a final version. The theoretical bases of the questions will also be reviewed.

Participants will engage in a conversation with the presenters about how the survey might be used at their colleges to assess learning communities and incorporated into their institutional effectiveness models.




10:00 – 10:45 am Concurrent Sessions Block 6


An Office to ADMIRE (Assimilated Data Management, Institutional Research and Effectiveness):

A Synergistic Model to Integrate Institutional Research, Institutional Effectiveness, and Information Management.

Multnomah Falls Room


James T. Posey, University of Washington Tacoma and Gita Wijesinghe Pitter, Florida A&M University

The objective of this presentation is to identify common essential information and data needs of colleges and universities and to suggest a model to integrate these data needs into one office or department that can synergistically create a data-informed institution that eliminates duplication, competition, and antagonism between disparate offices.



PNAIRP: Peering into the Future

Astoria Room


Convenor: Ron Smith, incoming PNAIRP President

Come join a conversation with fellow PNAIRP’ers about where we are, and share your thoughts about what you would like to see PNAIRP do in the next year. What current activities serve you well? What might we consider doing to better serve institutional researchers in the Pacific Northwest?




11:00 – 11:45 am Concurrent Sessions Block 7


Returning to Learning in an Age of Assessment

Coos Bay Room

Chris Jackson, Council for Aid to Education and

Kay Sagmiller, Southern Oregon University

The objective of the presentation will be to inform attendees regarding recent trends in higher education that suggest a move toward performance (or “task”) based curriculum and assessment. Beyond simply describing these trends, the presenters will discuss, in particular, institution-specific case studies of educational improvement tools—using the CLA’s Assessment Services and Education programs as models—that support these efforts through both assessment and teacher training.



Using High School Transcript Data to Assess and Improve COMPASS Placement Outcomes

Multnomah Falls Room


Joe Montgomery & Marielle Parker, Columbia Basin College and

Jeff Harris, Docufide Inc.

In this presentation, we follow up on the lead author’s 2007 PNAIRP presentation, which challenged the validity of COMPASS math scores and the accuracy of course placement based on those scores. We also propose to:

1) describe how CBC initiated the use of Docufide, Inc., software to download high school transcript information;

2) present analyses comparing level of high school performance with placement level; and

3) present several alternatives for incorporating math transcript data with COMPASS scores to improve placement accuracy.
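A multiple-measures placement rule of the kind proposed might look like the following sketch; the cut score, GPA threshold, and course names are hypothetical placeholders, not CBC's actual policy:

```python
def place_math(compass_score, hs_math_gpa, highest_hs_course):
    """Combine a COMPASS math score with high school transcript evidence.

    All thresholds here are illustrative placeholders, not real cut scores.
    """
    if compass_score >= 66:  # hypothetical college-level cut score
        return "college-level math"
    # Transcript override: strong high school math performance can
    # compensate for a low test score.
    if hs_math_gpa >= 3.0 and highest_hs_course in ("precalculus", "calculus"):
        return "college-level math"
    return "developmental math"
```

Under this rule, a student scoring below the cut but with a 3.4 math GPA through precalculus would still place into college-level math, which is the kind of misplacement the transcript data is meant to catch.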




Keynote Speaker

Dr. John Muffo

Dr. John Muffo is a distinguished member and past president of the Association for Institutional Research who specializes in assessment, including the evaluation of academic programs and learning systems. He has been active in promoting professional development among international higher education organizations for several decades and is a frequent contributor to the literature on institutional research.

He currently serves as a consultant on assessment and institutional research matters and has consulted on every continent except Antarctica. Some of his current activities include:

  • Assisting every department in a large research university to assess their programs at the undergraduate and graduate levels
  • Acting as the lead assessment person on grant projects sponsored by the National Science Foundation, the U.S. Department of Agriculture, and the U.S. Department of Education
  • Conducting a comprehensive review of Virginia Tech's European Center in Riva San Vitale, Switzerland
  • Serving on accreditation teams in two regions of the US
  • Assisting programs with disciplinary accreditation such as ABET
  • Reviewing a wide range of new program proposals
  • Acting as the monitor to assure the quality of graduates from an accredited, doctoral-granting university
Dr. Muffo has a B.A. from St. Francis College, an M.Ed. from Ohio University, and an M.B.A. and Ph.D. from the University of Denver.

He is currently a consultant with John A. Muffo & Associates. Previous positions include three years as Administrator of Special Projects for the Ohio Board of Regents; 15 years as Director of the Academic Assessment Program at Virginia Polytechnic Institute and State University; and 14 years in director or assistant director roles at Virginia Polytechnic Institute and State University, the Indiana Commission for Higher Education, Cleveland State University, and the University of Illinois at Urbana-Champaign.

Dr. Muffo is also involved in local politics as a county supervisor, which has given him the opportunity to be on a first-name basis with his governor (who is also the national chairman of the Democratic Party) and his congressman. This has led him to testify twice before Congress in the last year or so, culminating in an invitation to the White House.