UWM & the R1 Reclassification

Draft: Kyle Swanson 4/4/2016
Discussed at APBC 4/7/2016
Post APBC revisions:
IPEDS 2013-14 PhD counts vs. Dept. Profile 2012-2015 running averages (4/8)
Removed professional PhDs from index (4/8)

UWM made the jump from R2 to R1 in Carnegie Classification in 2015, marking a significant milestone in the institution’s history. Given its importance from a marketing and political standpoint, it is vital to understand precisely why this reclassification occurred, and insofar as this designation is deemed strategically important, to use the metrics that underlie the R1 classification as a basis for future decision making by the University.

By way of background, the Carnegie Classification has been the leading framework for recognizing and describing institutional diversity in U.S. higher education for the past four and a half decades. Starting in 1970, the Carnegie Commission on Higher Education developed a classification of colleges and universities to support its program of research and policy analysis. Derived from empirical data on colleges and universities, the Carnegie Classification was originally published in 1973, and subsequently updated in 1976, 1987, 1994, 2000, 2005, 2010, and 2015 to reflect changes among colleges and universities. This framework has been widely used in the study of higher education, both as a way to represent and control for institutional differences, and also in the design of research studies to ensure adequate representation of sampled institutions, students, or faculty.

Under the Carnegie Classification, doctoral universities are assigned to one of three categories based on a measure of research activity. The research activity scale includes the following correlates of research activity:
Research & development (R&D) expenditures in science and engineering (S&E);
R&D expenditures in non-S&E fields;
S&E research staff (postdoctoral appointees and other non-faculty research staff with doctorates);
Doctoral conferrals in humanities fields, in social science fields, in STEM (science, technology, engineering, and mathematics) fields, and in other fields (e.g., business, education, public policy, social work).
Carnegie then statistically combines these data using principal components analysis to create two indices of research activity reflecting the total inter-institutional variation across these measures. It is vital to note that classifications are time-specific snapshots of institutional attributes and behavior based on 2013-14 data.

The first index represents the aggregate level of institutional research activity, and the other captures per-capita research activity using the expenditure and staffing measures divided by the number of full-time faculty within the assistant, associate and full professor ranks. The values on each index are then used to locate each institution on a two-dimensional graph, shown below in Figure 1. Carnegie calculates each institution’s distance from a common reference point, and then uses the results to assign institutions to one of three groups based on their distance from the reference point. Thus the aggregate and per-capita indices are considered equally, such that institutions that are very high on either index are assigned to the “highest research activity” group, while institutions that are high on at least one (but very high on neither) are assigned to the “higher research activity” group. Remaining institutions and those not represented in the data collections are assigned to the “moderate research activity” category. Before conducting the analysis, raw data are converted to rank scores to reduce the influence of outliers and to improve discrimination at the lower end of the distributions, where many institutions are clustered.
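The rank-scoring and distance-from-reference mechanics described above can be sketched in a few lines of code. This is a minimal illustration, not Carnegie's actual implementation; the reference point coordinates are assumptions for the example:

```python
import math

def rank_scores(values):
    """Convert raw measures to ranks (1 = lowest) to blunt the influence of
    outliers at the top of the distribution."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for pos, idx in enumerate(order):
        ranks[idx] = pos + 1
    return ranks

def distance_from_reference(aggregate, per_capita, reference=(-1.0, -1.0)):
    """Distance on the two-index plane from a common low reference point,
    so that being very high on EITHER index yields a large distance.
    The reference coordinates here are illustrative, not Carnegie's."""
    return math.hypot(aggregate - reference[0], per_capita - reference[1])

# Hypothetical R&D expenditures ($M) for five institutions:
print(rank_scores([120.0, 45.0, 980.0, 45.5, 60.0]))  # -> [4, 1, 5, 2, 3]
```

Ranking before the principal components analysis means a single institution with enormous expenditures shifts every other institution's score by at most one rank position.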

Figure 1: Carnegie classification based upon institutional aggregate research activity (abscissa) and per capita research activity (ordinate). Blue: R1 Doctoral Institutions (highest research activity). Green: R2 Doctoral Institutions (higher research activity). Orange: R3 Doctoral Institutions (moderate research activity).

For the purpose of understanding UWM’s situation, it is sufficient to consider only the aggregate research activity index, given the strong linkage between the aggregate and per capita quantities. The aggregate research activity index is based upon institutional rank relative to the 276 doctoral-granting institutions in the US. Statistics for all institutions, as well as the specific values for UWM, are shown in Table 1. Since Carnegie uses rank statistics to develop its overall aggregate research activity index, an institution with “Median” values in all measures would by definition have an aggregate research activity index of 0.

In the absence of detailed data for all institutions, a few reasonable assumptions are necessary. Specifically, we assume an exponential distribution among the various institutions for all research related measures. Under this assumption, the standardized anomaly associated with the various measures is given by the expression

SA = √12 (1/2 − exp[−ln(2)·UWM/Median]).

This expression describes standardized anomalies from the median within a uniform distribution, appropriate for rank statistics. Carnegie then weights these standardized anomalies using weights derived from the principal component analysis to maximize the signal distinguishing the various institutions. Carnegie’s weights for the various research activity measures are also given in Table 1. The estimated Aggregate Research Activity Index for UWM is the sum of the final row in Table 1 (the Aggregate Index Contribution) divided by the sum of the Weights in the penultimate row. This quotient gives UWM an aggregate research activity index (ARAI) of 0.61.
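The calculation just described can be reproduced directly. A minimal sketch, using the SA expression under the exponential-distribution assumption; the medians and weights below are placeholders, not the actual Table 1 entries:

```python
import math

def standardized_anomaly(value, median):
    """SA = sqrt(12) * (1/2 - exp(-ln(2) * value / median)).
    An institution exactly at the national median on a measure scores 0."""
    return math.sqrt(12) * (0.5 - math.exp(-math.log(2) * value / median))

def aggregate_index(values, medians, weights):
    """ARAI estimate: weighted mean of standardized anomalies,
    sum(w_i * SA_i) / sum(w_i)."""
    sas = [standardized_anomaly(v, m) for v, m in zip(values, medians)]
    return sum(w * s for w, s in zip(weights, sas)) / sum(weights)

# Sanity check: median values on every measure give an index within
# floating-point rounding of 0, matching the memo's observation.
print(abs(aggregate_index([10.0, 5.0], [10.0, 5.0], [1.0, 1.0])) < 1e-9)  # -> True
```

Plugging UWM's Table 1 values and Carnegie's weights into `aggregate_index` is what yields the 0.61 estimate quoted below.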

Table 1: Statistics for all institutions, UWM’s values, Carnegie’s weights, and UWM’s aggregate index contributions for each research activity measure.

Returning to Figure 1, an ARAI of 0.61 puts UWM above the R1 cutoff, as the 2015 classification boundary lies at an approximate ARAI value of 0.4 (outer circle intersection with abscissa). While the margin is not razor thin, it is small enough that UWM’s continued classification as an R1 institution should certainly not be taken for granted.

It is worth examining the strategic role that each of UWM’s units plays in contributing towards the R1 designation and also towards sustainability, particularly in light of ongoing resource constraints. Figure 2 focuses on fractional unit contribution to the R1 designation under the Carnegie metric, as well as fractional overall unit expenditures net restricted funds (grants, etc.). L&S is the largest contributor to the R1, responsible for 60% of the activity that led to the R1 designation, followed by CEAS at 12%, CHS at 6%, and CON, SOE and SFS all at 5%. The remaining 6 units together account for the remaining 7%. Expenditures are more evenly distributed across the units, with L&S at 40%, CEAS at 9%, CHS at 7%, CON at 6%, SOE at 7%, SFS at 2% and the other 6 units totaling 29%.


Figure 2: Fractional unit-by-unit contribution to the R1 designation (top panel) and overall unit expenditures net restricted funds (bottom panel). Total expenditures are roughly $200 million.

Units whose fractional R1 contribution exceeds their fractional expenditures contribute more heavily in the relative sense to the R1, while units whose fractional credit hour instruction exceeds their fractional expenditures contribute more heavily to the sustainability of the institution. Figure 3 shows a two-dimensional space highlighting research activity on the abscissa and sustainability on the ordinate. Research activity is measured as outlined above using the Carnegie aggregate metric. FY09 and FY14 unit-by-unit research expenditures are taken from the OSP dashboard, and exclude Fund 101 research expenditures, as the distribution of Fund 101 research dollars is an internal institutional choice. The number of PhDs is taken from the Department Profiles, and the number of postdocs from Human Resources. Sustainability here is simply the fraction of UWM total SCH generated by each unit. Both of these measures are weighted by unit expenditures net restricted funds (i.e., excluding grant funds), with the figures taken from Jerry Tarrer’s Fall budget model presentation. Similar figures for FY09 are extracted from OAIR’s Department Profiles. Removal of restricted funds eliminates grant expenditures from total unit expenditures, i.e., units are not punished for having significant extramurally funded research activity.
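The relative-performance comparison reduces to dividing each unit's fractional R1 contribution by its fractional expenditure share; a ratio above 1 means the unit punches above its budget weight on the Carnegie metric. A sketch using the percentage shares quoted with Figure 2:

```python
# Fractional R1 contribution and expenditure shares (percent), as quoted
# in the Figure 2 discussion; the six smallest units are omitted here.
r1_share  = {"L&S": 60, "CEAS": 12, "CHS": 6, "CON": 5, "SOE": 5, "SFS": 5}
exp_share = {"L&S": 40, "CEAS": 9,  "CHS": 7, "CON": 6, "SOE": 7, "SFS": 2}

# Ratio > 1: a relatively heavy contributor to the R1 designation.
relative_r1 = {u: r1_share[u] / exp_share[u] for u in r1_share}
for unit, ratio in sorted(relative_r1.items(), key=lambda kv: -kv[1]):
    print(f"{unit}: {ratio:.2f}")
```

On these numbers SFS (2.50) and L&S (1.50) contribute most heavily to the R1 relative to their expenditures, while SOE (0.71) contributes least.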


Figure 3: Expenditure weighted sustainability and contribution towards the R1 classification using Carnegie metrics. All quantities are weighted by fractional unit expenditures, i.e., this is a view of unit-by-unit performance relative to the institution as a whole. Higher relative unit performance is upward and to the right. Red indicates approximate unit relative performance during the 2010 Carnegie report period, with the arrows indicating changes in unit performance over the FY09 to FY14 time period. The gray shaded region captures units that are underperforming in relative terms in both sustainability and research metrics.

Consistent with Figure 2, Figure 3 indicates that expenditure-weighted unit-by-unit contributions to R1 status, as measured by the 2015 Carnegie metrics, are dominated by L&S, CEAS, CON, CHS and SFS. However, there is a marked difference in relative unit performance between the FY09 and FY14 periods that respectively underlie the Carnegie 2010 and 2015 reports. Compared to FY09, FY14 research expenditures decreased in the School of Education ($5.9M in FY09 to $0.6M in FY14), the College of Nursing ($3.8M to $1.0M), the School of Social Welfare ($2.2M to $1.1M), and the College of Health Sciences ($3.2M to $1.3M). These reductions occurred in spite of an across-the-board increase in non-research-directed unit expenditures between FY09 and FY14 (SOE +$3.1M; CHS +$2.7M; CON +$3.8M; SSW +$1.8M). For all of these units, the reduction in the Carnegie metric due to smaller research expenditures was in part offset by an increase in the number of PhDs granted.

In contrast, L&S showed a significant increase in research expenditures between FY09 and FY14 (from $10.9M to $19.6M), as did CEAS (from $4.6M to $6.5M). It should be noted that the latter was driven by a significant investment in terms of non-research related unit expenditures ($11.5M in FY09 increasing to $18.6M in FY14), while L&S achieved research expenditure growth with only a very modest increase in non-research unit expenditures ($79.0M in FY09 to $80.7M in FY14). For both of these units, increases in research expenditures were complemented by an increase in the number of PhDs granted. Overall research expenditures were also driven upwards by the addition of the two new schools (SPH and SFS).

Summary: The approximate 50% increase in PhD degrees granted in 2013-14 compared to 2008-09 underlies UWM’s reclassification. Setting strategic priorities in a resource-constrained environment will require making tough, data-informed choices about campus priorities, and the R1 designation and associated Carnegie metrics provide one possible framework to use in making these decisions.
Sources:
Carnegie methodology: http://carnegieclassifications.iu.edu/methodology/basic.php
UWM research expenditures:
http://www4.uwm.edu/grad_school/onedrive-projects/admin_dashboard/
FY14 unit-level expenditures: (pg. 22 and 23) http://uwm.edu/academicaffairs/wp-content/uploads/sites/32/2015/09/budgetmodel-presentation.pdf
PhD numbers: IPEDS 2013-14 counts
Research staff: Payroll data “Employees in Training”
Note that IPEDS doesn’t break out Nursing and Health Sciences Professional PhDs. These PhDs do not count under the R1 metric. We assume these degrees are equally distributed between the two schools.
UWM’s ARAI for the 2010 report was approximately 0.4, slightly under the cutoff.