Response to Chancellor Update: Charting Our Future
While the Chancellor’s campus email was welcome for its transparency regarding administrative planning, AAUP believes it is important to offer arguments that analyze UWM’s challenges in a different way. The following is a point-by-point response that reaches somewhat different conclusions about the state of affairs.
With the graying of the baby boomers, there is no doubt about the large demographic shift taking place in the United States. While people 65 years and older constituted only 15 percent of the population in 2016, they are projected to constitute 21 percent by 2030 and 23 percent by 2060 (census.gov). By comparison, children under 18 will grow in absolute numbers, but their share of the population may decrease: from 22 percent (73.6 million) in 2016 to about 21 percent (75.4 million) in 2030.
For higher education, however, the absolute numbers of younger groups matter more than their share of the total population. As academic institutions, we should be specific and precise, and thus consider Wisconsin’s actual numbers, not national and regional projections that may well turn out otherwise:
The following are the actual enrollment numbers of 12th graders in Wisconsin (WISEdash) compared with UWM’s freshmen enrollments. It’s important to base our current policies not simply on general projections from one source (Nathan Grawe’s book) but on actual Wisconsin data. We do not want to find ourselves barking up the wrong tree while addressing an issue.
| Year | Wisc. 12th Grade Enrollment | UWM New Freshman Fall Headcount |

Wisconsin Resident Total
These data may contain small inaccuracies, but it is clear that the recent decline in UWM’s Wisconsin resident students does not directly correlate with “fewer college-aged students resulting in declining enrollments.” The number of college-aged students in Wisconsin may have dropped 8-10 years ago, but since 2014 it has actually been inching up while our enrollments have been inching down. The decline may have other, more immediate causes, such as the UW System’s support for increased in-state enrollments at Madison. The larger context suggests that enrollment declines are disproportionately impacting smaller schools, and a university like UWM, if properly led, should not lose this competition for enrollments.
Cost of Higher Education
The cost of higher education and student debt are definitely of major national concern. UWM can take the lead by joining other universities in advocating a drastic increase in federal and state funding based on existing research.
Yet UWM is one of the least expensive universities in the region. Ideally, this should allow us to recruit more students, not fewer. It is therefore likely that, as with the demographic shift, our competitive weakness relative to other institutions is not due to the expense of education.
Public questioning of the value of a college degree
While the public has every right to question the growing costs of higher education (though UWM is not part of that constellation), no one questions the importance of higher education as a public good. Indeed, the importance of a college degree is only growing. “By 2027, total undergraduate enrollment is projected to increase to 17.4 million students. In fall 2016, total undergraduate enrollment in degree-granting postsecondary institutions was 16.9 million students, an increase of 28 percent from 2000, when enrollment was 13.2 million students” (https://nces.ed.gov/programs/coe/indicator_cha.asp). It’s counterproductive to relay dubious claims about the college degree. As the institutional make-up of world society grows in complexity, the need for higher education should grow correspondingly. A population that doesn’t understand how the world works cannot help the world work better.
Employers, Skills, and the Purposes of Higher Education
There are two important reasons why AAUP thinks it’s dangerous to try to closely align education with employer expectations.
First, the academy has always responded to stimuli from the world outside. UWM just approved a new major in environmental engineering because of its growing importance in the world. These academic responses have been organic in nature. Because the collective intelligence of research faculty in the country is far greater than the collective intelligence of a small set of college administrators, it is important for administrators not to chase every new industry or skill in search of satisfying local or regional employers. It was not the computer industry that gave rise to computer science; it was academic computer science that gave rise to the computer industry, starting with the proposal by Gottfried Leibniz, a seventeenth-century philosopher and mathematician, to build machines for valid inferences through a “calculus of reason” (calculus ratiocinator). One can apply the same logic to the nuclear and genetic industries. Academic disciplines forge new paths through an internal dynamic while receiving stimuli from the outside world. It’s important for college administrators not to bet on new industries and future job markets, or to meddle with the organic movement of academic disciplines. Let’s give our students fundamental principles and foundational skills, from computer science to communication, so that they become capable of learning new programming languages and new media platforms on their own, as and when needed.
Second, the skills gap argument—that there is a gap between the skills job seekers currently have and the skills employers need—has been debunked by many studies in economics and education. Some skills gap is unavoidable under the best of conditions, as some positions will always remain unfilled even while some workers are unemployed. However, as economist Paul Krugman writes, the belief that America suffers from a severe “skills gap” is “an idea that should have been killed by evidence, but refuses to die.” Were there an actual skills gap, one would expect rising wages in those areas, but no such wage increase is taking place.
Growing competition in higher education and low-cost models
Competition in higher education is a reality. But it’s important for UWM to know the category in which it is competing. Chasing everything, from lower-cost models to new curriculum delivery modes, may not be the answer. We are delivering lower-cost education with online programs already in place. We can capitalize on those strengths but UWM must also figure out who its competitors are. Are community colleges, for-profit colleges, or MOOCs our competitors? AAUP views none of these players as UWM’s competitors.
Dear AAUP Wisconsin members,
I am writing to share an update from our UW-Whitewater chapter regarding the chancellor search that is under way there. This is the first chancellor search to take place anywhere in the UW System since the Board of Regents made changes to the search and screen process for chancellors, once in 2015 and then again in 2017.
As you may know, the 2017 changes incorporated a new statutory requirement prohibiting UW institutions from requiring chancellors to have the customary academic qualifications.
The UW-Whitewater AAUP chapter is circulating a petition asking the search and screen committee to include a “strong preference” for academic credentials in the position description. You can read the petition, and sign your name in support, at the following link:
The petition already has over 300 signatures. I hope that we can show our support from all across the state, and help our Whitewater colleagues show the Regents how important this is.
Much as UW-Stevens Point is the first test case for the new faculty termination policies in RPD 20-24, UW-Whitewater is the first test case for the new chancellor search process in RPD 6-4. As elsewhere, the new process concentrates power in the hands of the Regents. Previously, search and screen committees had to consist of at least half faculty from the institution in question; now, the committee has 10 members, of whom 5 are Regents, and only 2 are faculty.
I encourage you to support our UW-Whitewater colleagues!
Nick Fleisher (UW-Milwaukee) President, AAUP Wisconsin
When the chancellor donates his $50,000 raise to the University Food Pantry,
They put out a spread.
It’s delicious. Everyone gathers to eat it.
The shelves are full. Since there is no more food insecurity among students,
They are free to focus on learning.
When the chancellor donates his $50,000 raise to Norris Health Center,
People from all over Milwaukee are inspired by this display of generosity. They donate
their services. There is acupuncture, massage, talk therapy and tarot reading; it’s all free.
There are Black and brown, queer and Indigenous, white and Asian American care
providers. Everyone feels better.
When the chancellor donates his $50,000 raise to scholarships,
the Wisconsin Board of Regents is ashamed. They drop tuition rates so low
That a working single mother of three can afford to take a class. (There is free childcare
for students.) She gets her degree, makes it big,
donates extravagantly. Going to college becomes an option for everyone in town.
When the chancellor donates his $50,000 raise to the university,
The money magically multiplies. Suddenly, everyone who makes less,
When the chancellor donates his $50,000 raise to the university,
the example of his generosity reminds
faculty and staff across campus that the university is made out of
only love and labor, and that it belongs to everyone, including us.
We dance in our offices and continue the work.
Another University is Possible: it has been there all along.
It hovers in the wings of this cruel austerity:
requiring only courage, only love to take flight.
by Jeffrey Sommers
Professor of African and African Diaspora Studies and Global Studies
Dear Chancellor Mone, Provost Britz, and Colleagues,
I could not attend this week’s Faculty Senate meeting as I was in Riga, where I ran an event designed to thwart corruption, in cooperation with the US Ambassador and the German Friedrich-Ebert-Stiftung. Incidentally, while I was there, everyone from the US Ambassador on down asked me (in astonishment), “What is happening to Wisconsin?” I am in Budapest this weekend, where I am convening an event at Hungary’s Central European University, funded by the Open Society Foundations, for a project I co-direct on threats posed to democracy by authoritarian governments. Such governments have increased their reach in recent years and have typically used (and abused) constitutional procedures to advance and ensconce their power.
As you might know, this week Wisconsin was described as “Hungary on the Great Lakes” by one of the New York Times’s top columnists. Moreover, Wisconsin billionaire ‘job creator’ Sheldon Lubar (with whom I have corresponded this past week) wrote Governor Scott Walker to decry the “conniving” (his word) of the Wisconsin GOP and the Governor’s cooperation with them as they abuse their power in acting against the public will by trying to hamstring the state’s newly elected Democratic governor, Tony Evers. Wisconsin is presently the most gerrymandered state in our republic. And here too, in Budapest, people are asking in disbelief, “What is happening in Wisconsin?” Today, as Governor Walker (against Mr. Sheldon Lubar’s counsel) signed our gerrymandered state legislature’s bills to limit democracy, I received emails from around the world from figures of note asking, “What is happening in Wisconsin?”
The work of academics historically has been to pose difficult, sometimes uncomfortable questions, not in a gratuitous, but in a serious, fashion. The search for “truth” and “improving the human condition” as articulated by UW President Charles Van Hise in 1905 are central to UW’s mission, and extend back to the Greek philosophers of antiquity. My uncomfortable question is: “Might it be incumbent upon us to review all UW policies coming from the System level or higher given what has been revealed as the undemocratic character of our current state government?” It’s not only UWM that is watching how we answer our current crisis of democracy, but the nation and the world. How will we respond? Make no mistake, this is a historic juncture.
by Richard Grusin
Distinguished Professor of English
On Pearl Harbor Day, 2018, the University of Wisconsin Board of Regents dropped its own economic bomb on the people of Wisconsin, approving raises ranging from $14,421 to $72,668 for 10 of the UW System’s 13 chancellors. In the days following the December 7 meeting, social media has exploded with expressions of the emotional damage inflicted by these oversized raises.
Many University of Wisconsin faculty and staff, whose pay has remained static for roughly a decade, and who took de facto pay cuts in 2011 when Act 10 peremptorily increased individual retirement contributions by roughly 7%, filled Facebook and Twitter with complaints, shares, and retweets about these obscenely inflated raises. Over and over again, faculty and staff decried the injustice of chancellors like UW-Madison’s Becky Blank and UW-Milwaukee’s Mark Mone receiving raises ($72,668 and $49,419 respectively) greater than the salaries of many assistant professors and full-time instructors.
Interestingly, this outrage was not shared by the news media, who seemed more concerned with the possible injustice of two chancellors not receiving raises because they were being punished for actions the Regents did not approve. In an article in the Milwaukee Journal-Sentinel, Karen Herzog reported, “The chancellor who hosted the University of Wisconsin Board of Regents on his campus this week has been denied a $25,600 performance raise after his reprimand for inviting a porn star to speak to students during free speech week a month ago. The regents also did not award [a raise to] another chancellor, whose husband was banned from her UW campus and stripped of an honorary, unpaid position after an investigation concluded he had sexually harassed female employees.” No mention was made in the Journal-Sentinel of the unseemliness of the large chancellor raises, nor was there any suggestion that “punishing” misbehaving chancellors was in any way problematic.
This stark divergence between local news coverage and the responses circulated widely on social media is worth examining, in part because both responses overlook what I take to be the fundamental problem with the logic of employee compensation entailed in the Regents’ decisions. For me, the most troubling element of these raises is not their disproportionate size nor the financial punishment of the chancellors who had displeased their superiors. Although I share my fellow UW System faculty and staff’s outrage at the dollar amount of the oversized raises given to 10 of the 13 UW System chancellors, I am not surprised. And you shouldn’t be either.
Why am I not surprised? Because as anyone who has been paying attention knows, the chancellors have been carrying water for UW System President Ray Cross and the Regents for several years now. These outsized raises are financial rewards for their not having opposed or obstructed a single top-down edict from Cross and the Regents–for their having carried out his orders like good soldiers or middle managers are expected to do.
Put differently, what both the raises and the punishment reveal is that these raises are payoffs, ex post facto bribes, or quid pro quo rewards for UW System chancellors having accepted without objection the destruction of tenure and shared governance; repeated massive budget cuts; unfunded tuition freezes; and the break-up and distribution of the UW Colleges and Extension to the four-year, comprehensive, and doctoral campuses, aka the UW System merger.
Why didn’t chancellors object last year to this merger? Could it be because their jumbo-sized raises were made possible by money freed up by the elimination of the UW Colleges/Extension chancellor position upon their top-down dissolution? As Karen Herzog dutifully reported, these raises didn’t require an infusion of new salary money but were funded by dividing up “the $270,774 salary of former UW Colleges and UW-Extension Chancellor Cathy Sandeen, whose position was eliminated in the sweeping UW System merger.”
This might very well explain why UW System chancellors have quietly gone along with the absurdly sped-up timetable for this merger. Could it have something to do with the fact that the funds freed up from eliminating Chancellor Sandeen would be used to reward those very chancellors? You don’t really think that Friday was the first time UW System chancellors heard that those funds would be used this way, do you? I certainly don’t.
What I find most scandalous about these raises is not how grotesquely large they were in the context of the multiple financial needs of a seriously strapped university system, nor how raises were withheld from chancellors who have earned the disapproval of Ray Cross and the Regents. No, what is most troubling to me about the economic logic of these raises is that they reveal once and for all that the role of the chancellor in the University of Wisconsin System is not to represent the interests and needs of his or her university to the UW System, but to carry out the marching orders handed down from above.
Sadly, we now have no other choice but to believe that chancellors like Becky Blank or Mark Mone have not been acting as independent academic leaders, charting the best course for their universities in difficult times. Rather UW chancellors have become little more than well-paid marionettes, whose strings are being pulled from above by Ray Cross and the Walker-appointed Board of Regents. If money indeed talks, these raises speak volumes about the true nature of academic leadership in the University of Wisconsin System.
The following is a post from an anonymous, nontenured author:
We need to talk about salaries.
Last week the UW Regents approved a 3% pay raise for faculty for the next two years – the largest raise in many years. However, it’s difficult to be appreciative when so many of our colleagues have fallen inexcusably far behind the acceptable pay scale. Ray Cross claims that UW Madison faculty are underpaid by 10%; assistant and associate professors in the humanities there typically earn between $70,000 and $100,000 per year. This appears to be Cross’s only concern: that these numbers are too small. But our new colleagues at UWM face much worse problems regarding their compensation.
Many new UWM faculty who have come to us via the merger with the UW Colleges faculty earn less than $50,000 a year. Their bump when promoted to Associate Professor with tenure is only around $1,500. For some context, an assistant manager at Kwik Trip with no required higher education can make $45,000 a year. This means we have hard-working faculty who have earned PhDs, teach a 4-4 load, engage in research–and could probably do better, financially, as gas station managers. What kind of signal does this send to our faculty? What kind of message does this send about higher education?
Our new colleagues from the UW Colleges have long dealt with rock-bottom morale. We are told that when talk of raises would come up in their meetings, the prospect was always quickly shot down. Administrators for the Waukesha and Washington County campuses, however, each received a $20,000 bump in salary this year.
The problem is not a lack of money in the system. It’s that the money is constantly moved away from faculty and into new silos. How did the chancellors get their large pay bumps this last week? It came from leftover money from the UW Colleges that could have helped underpaid faculty. This double standard needs to end.
The UWM annual budget for the 2018-2019 fiscal year is $689,165,710, with $243,334,769 going to salaries. The branch campuses bring in a combined budget of $4,136,764. The salary gap between our UWM faculty and the College of General Studies faculty could be fixed with around $1.2 million. There is no excuse for such inequality among our colleagues. When chancellors get raises larger than many faculty salaries, it shows not only the arrogance of these administrators, but also their lack of interest in faculty compensation, retention, and morale.
Investing in the College of General Studies can only help UWM as a whole. These branches are feeding students into the Milwaukee campus. How can we expect them to advocate for us if they are treated so unfairly? Numerous faculty searches on our new branch campuses have failed because salaries and workload are nowhere near competitive. If we are to make the most of this merger, it is essential to invest in our colleagues. We must welcome the new faculty to the Panther family by treating them fairly–something the UW Colleges administration has failed to do.
Chronicle of Higher Education Data
UWM Faculty Salary Average:
UW Colleges Faculty Salary Average:
Comments from Nancy Mathiowetz, Professor Emerita, UWM
Former President, American Association for Public Opinion Research
Former Editor, Public Opinion Quarterly
It would be useful in reviewing the survey to understand the analytic objectives of the study. What empirical questions are they attempting to address? And how do they want to use these data? That framework would provide a better lens for reviewing the instrument.
Both questionnaire design and sample design are important to review in understanding the quality of a survey. With respect to questionnaire design, one wants wording that is easy to comprehend and does not bias the respondent. The structure of the questions (e.g., Likert scales, open ended, multiple choice) is also important and can contribute to the overall quality of the survey data. A poorly designed questionnaire renders data that may be misleading, biased, or inaccurate.
Similarly, it is important that the sample design – that is, the identification of the population of interest and the means by which to select members of that population – be clearly specified and executed. Once people are selected for inclusion in a study, efforts should be made to encourage their participation so as to have representation across the full diversity of the population of interest. Similar to a poorly designed questionnaire, a poorly designed or executed sample can result in misleading, biased, or inaccurate estimates.
The Mercer Survey
The American Association for Public Opinion Research offers a series of recommended best practices, including recommendations about question wording (see: https://www.aapor.org/Standards-Ethics/Best-Practices.aspx). Specifically with respect to question wording, the website states:
Take great care in matching question wording to the concepts being measured and the population studied.
Based on the goals of a survey, questions for respondents are designed and arranged in a logical format and order to create a survey questionnaire. The ideal survey or poll recognizes that planning the questionnaire is one of the most critical stages in the survey development process, and gives careful attention to all phases of questionnaire development and design, including: definition of topics, concepts and content; question wording and order; and questionnaire length and format. One must first ensure that the questionnaire domains and elements established for the survey fully and adequately cover the topics of interest. Ideally, multiple rather than single indicators or questions should be included for all key constructs.
Beyond their specific content, however, the manner in which questions are asked, as well as the specific response categories provided, can greatly affect the results of a survey. Concepts should be clearly defined and questions unambiguously phrased. Question wording should be carefully examined for special sensitivity or bias. When dealing with sensitive subject matter, techniques should be used that minimize the discomfort or apprehension of respondents (or respondents and interviewers, if the survey is interviewer administered). Ways should be devised to keep respondent mistakes and biases (e.g., memory of past events) to a minimum, and to measure those that cannot be eliminated. To accomplish these objectives, well-established cognitive research methods (e.g., paraphrasing and “think-aloud” interviews) and similar methods (e.g., behavioral coding of interviewer-respondent interactions) should be employed with persons similar to those to be surveyed to assess and improve all key questions along these various dimensions.
In self-administered surveys careful attention should be paid to the visual formatting of the questionnaire, whether that be the layout of a mail survey or a particular eye towards respondents completing a web survey on a mobile device. Effort should be taken to reduce respondent burden through a positive user experience in order to reduce measurement error and break offs.
In reviewing a hard copy version of the questionnaire – one that appears to have been written for faculty members, given its references to research – I see a questionnaire that consists of three distinct types of questions:
- A partial ranking question (Question 1) that asks for the five most attractive aspects of the position at two points in time;
- Five-point Likert rating scales, ranging from Strongly Agree to Strongly Disagree and including a “middle” category of “Neither agree or disagree;” and
- Multiple sets of “maximum difference scales,” which ask respondents to examine sets of employment or benefits attributes and to select the most important and least important within each set.
Some specific comments about each of these types of questions follow.
With respect to Question 1 (the partial ranking question), the format choice is not of major concern – certainly this type of ranking is often used to determine respondents’ preferences. What is of concern is some of the mismatch/sloppiness in the question. First, the question references working for “UW,” but most of the employees answering this question do not work for the UW System but rather at a specific UW facility, so the wording is odd. Second, the question itself asks about what “interested” you most (for the first part of the question) and what is most “important” (for the second part), but the column headings use the term “attractive.” While not a critical inconsistency, it’s a bit sloppy.
The 5-point Likert items have two sets of response options – strongly agree/strongly disagree (Questions 2-11, 15-20, 22-27) or very satisfied/very dissatisfied (Questions 14a through 14r). Of the 22 items that are agree-disagree items, all but two are written in a positive frame, that is, the language indicates a positive point of view. This is not a best practice, and the use of such an approach can lead to “straight-lining,” where individuals simply mark the items in a single column without carefully reading each item. And in general, the field of survey methodology recommends avoiding agree-disagree items, since they often lead to acquiescence bias, that is, the tendency to agree with statements, which leads to exaggerated estimates of endorsement for positively-worded statements.
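The straight-lining pattern described here is straightforward to flag in collected data. The following is a minimal illustrative sketch; the respondent IDs and answer values are hypothetical, invented purely for demonstration:

```python
# Illustrative sketch: flagging possible "straight-lining" in a long
# agree-disagree battery (e.g., the 18 sub-items of Question 14).
# Responses are hypothetical: 1 = Strongly Disagree ... 5 = Strongly Agree.

def is_straightliner(answers):
    """True if a respondent gave the identical answer to every item."""
    return len(set(answers)) == 1

respondents = {
    "r1": [4, 4, 4, 4, 4, 4],  # same column every time: a straight-lining suspect
    "r2": [5, 3, 4, 2, 5, 1],  # varied answers: looks considered
}
flagged = [rid for rid, ans in respondents.items() if is_straightliner(ans)]
# flagged == ["r1"]
```

In practice, analysts also look at near-zero variance and completion speed rather than identical answers alone, but the check above captures the basic concern.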
Although I am reviewing a hard copy questionnaire, I note that Question 14 has 18 sub-questions, all requiring the respondent to use the same five-point scale (as well as a “not applicable” option). Once again, if presented on a single screen, this would not follow best practice and would lead respondents to not fully consider each item individually. In addition, it does not appear that these 18 items are rotated so as to avoid order effects. Once again, this is in contrast to best practices, which recommend randomizing the order of long lists.
Finally, the survey includes two sets of maximum difference (maxdiff) scaling questions, an extension of the method of paired comparisons. In a typical maxdiff scaling question, a respondent will rate between four and six attributes of an entity/product/service. Analysis of the data using a specific statistical technique, hierarchical Bayesian multinomial logit modeling, produces importance estimates for each attribute for each respondent.
The redundant nature of maxdiff questionnaires is one of the drawbacks of the approach, since respondents often feel that they have just answered the question. In the present questionnaire, Question 12 consists of 11 sets and Question 13 consists of 20 sets, each requiring the identification of a most important and a least important item.
What is odd and most disturbing about Question 12 is that the question states “some of these benefits or programs are not current benefits or programs at the university.” But the attributes listed in Question 12 are not all benefits or programs – they are attributes of the actual work environment or characteristics of employment. For example, the question includes attributes such as “Type/variety of work,” “Stable employment,” “Career advancement/professional development,” and “Pay.” These “attributes” are juxtaposed alongside benefits such as “Sick leave,” “Healthcare benefits,” and “Retirement savings plans.” In contrast, the attributes presented in Question 13 appear to be, for the most part, benefits.
I find the mixing of employment attributes and benefits attributes in Question 12 to be atypical of most maxdiff designs. It seems inappropriate to ask respondents to make tradeoff assessments between employment attributes such as pay and benefits attributes such as sick leave. The mix of items – which are attributes of two very different constructs – could result in a misleading set of empirical findings.
And placing two of these maxdiff questions next to each other – thereby forcing the respondent to answer 31 sets of these items consecutively – is not ideal with respect to overall questionnaire design or consideration of respondent fatigue.
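For readers unfamiliar with maxdiff, the counting intuition behind the scoring can be sketched as follows. This is an illustrative simplification with hypothetical attribute names and picks, not the hierarchical Bayesian multinomial logit model used in actual maxdiff analysis:

```python
# Illustrative simplification only: a crude count-based best-worst score.
# Real maxdiff analysis uses hierarchical Bayesian multinomial logit modeling.
# The attribute names and picks below are hypothetical.
from collections import Counter

def best_worst_scores(responses):
    """responses: (best_pick, worst_pick) pairs across maxdiff sets.
    Returns {attribute: best_count - worst_count} as a rough importance score."""
    best = Counter(b for b, _ in responses)
    worst = Counter(w for _, w in responses)
    return {a: best[a] - worst[a] for a in set(best) | set(worst)}

# One respondent's hypothetical picks across three maxdiff sets
picks = [
    ("Pay", "Sick leave"),
    ("Healthcare benefits", "Pay"),
    ("Pay", "Retirement savings plans"),
]
scores = best_worst_scores(picks)
# "Pay" was picked best twice and worst once, so scores["Pay"] == 1
```

Note how the scoring necessarily pushes some attributes below zero: the method is built to produce winners and losers, which is exactly why mixing incommensurable attributes (pay versus sick leave) in one set is problematic.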
It does not appear that a sample has been selected for participation, but rather a census of all benefits-eligible employees. What methodology is being used to ensure diverse participation both across all UW system locations and throughout the ranks of faculty and staff? Although a census allows for all members of the population to voice their opinions, it also means that resources to encourage participation must be spread throughout the population, rather than focused on a specific scientific sample.
The survey included no request for demographic information, location, years working in the UW System, or position. Can we assume that this information will be imported from HR files, given the unique link sent to request participation? At a minimum, a few of these questions should have been asked to ensure that the data were collected from the person intended to be queried.
And why does the survey bear the UW System and UW-Madison logos, but not those of other universities? If a different methodology is involved for the Madison campus as compared to other campuses, how will this impact comparisons across campuses?
 It is unclear if there is a different version of the questionnaire sent to non-faculty staff members.
From Aaron Schutz:
The survey contains forced-choice questions, and you can’t skip any of them. Anyone taking it cannot avoid picking the kinds of benefits they want – and, by implication, those they do not want – from the options. I’m not sure whether it is better to complete it or not, but the survey is clearly designed to inform cuts in benefits. Its very structure means that the report will necessarily illuminate those benefits that few faculty chose as most important.
To put it another way, why would they even bother asking questions that force us to choose the benefits we prefer, instead of simply asking how important each benefit is to us on a Likert scale? The scale would allow respondents to value all of them. The very structure of the report of these data, regardless of what is intended, will imply that some benefits are perceived as less important by us “clients” and, because they are less important, are areas for cuts. Simply answering the questions gives ammunition to those who want to cut benefits, because the structure of the survey requires you to distinguish between most and least important.
Here is the first question that you must choose “one” from:
Healthcare benefits (medical, dental, vision)
Retirement savings plans (WRS, 403b, 457)
Type / variety of work
Many people are actually forced to choose among these things. A stable job without healthcare? A job with healthcare but no retirement system?
So one needs to ask, why would they create a survey with this kind of structure? What does it mean about what they want to do with it? Are they confused, not realizing that the report will inevitably highlight which benefits that faculty don’t choose as important? What if no one chooses healthcare over another benefit? Does that mean we don’t want access to healthcare?
Imagined line from the report: “The survey indicated that healthcare was the least important benefit to faculty.”
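A toy simulation makes this structural worry concrete: even if every respondent is completely indifferent among the benefits, the forced “least important” choice still manufactures a ranking. The benefit names and respondent count below are hypothetical:

```python
# Toy simulation: respondents who value every benefit equally are still
# forced to name a "least important" benefit. Names and counts are hypothetical.
import random
from collections import Counter

random.seed(0)
benefits = ["Healthcare", "Retirement", "Sick leave", "Stable employment"]

least_counts = Counter()
for _ in range(1000):  # 1000 simulated, completely indifferent respondents
    least_counts[random.choice(benefits)] += 1  # forced pick, so effectively random

# Some benefit necessarily tops the "least important" tally in the report,
# even though no respondent actually valued it less than the others.
least_important = least_counts.most_common(1)[0][0]
```

The forced-choice format guarantees a “least important” benefit will appear in the report no matter what respondents actually value, which is precisely the concern raised above.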
I can’t advise anyone else, but I won’t fill it out.
-from a colleague with some expertise who prefers to remain anonymous
“The survey company is paid by the regents, ultimately, and so presumably is acting according to the regents’ strategic goals. This is standard practice for short-term consultants in the business world. In my opinion, the regents want empirical support for the notion that UW employees would gladly relinquish the current level of benefits (pension, income continuation, and family/sick leave) in order to gain other tangible and intangible goods. So the survey asks people to rank the relative worth of benefits against other goods (e.g., salary, job flexibility, the opportunity to perform meaningful work, desirable location, etc.). That is really the basic template of every single question. The questions differ from each other only insofar as they give respondents different opportunities to devalue benefits — many different comparisons (pensions vs. A, B or C) and many different hypothetical scenarios (Why did you choose to work for UW? What keeps you at UW? etc.).
“The survey company will be able to mine this data and present it adroitly in order to support the (foregone) conclusion that UW employees would be willing to trade off benefits for other goods. Well, that’s my cynical reading, but survey design does involve the ‘dark art’ of slanting the questions, in order to run a biased analysis, in order to reach the desired conclusion.”