The Arizona Desert Lamp

Survey says. . .

Posted in Campus, Politics by Evan Lisull on 12 February 2009

Before getting into this, the Office of Student Affairs deserves appreciation for the manner in which access to this survey data was provided. After sending an email asking whether we could access the full survey, and not just the summary, we were quickly informed that a hard copy of the survey would be available for pick-up in the Administration building. No RIPR form was required, no bureaucratic loopholes were thrown in our path. In fact, since our request, the data have been released online for anyone to access (see the raw data here, and the executive summary here — both are PDFs). It would be nice if the rest of the UA bureaucracy took note.

THAT BEING SAID, the ‘noble lie’ under which this survey masquerades as legitimate bears further scrutiny. From the Office of Student Affairs:

Over 6,000 students completed a comprehensive survey and the results were used to establish 2008-09 fee allocation priorities.

Curiously, the ‘background history’ of the survey doesn’t even include the percentage of students that supported the fee, although it is generally accepted that a majority of students approved it. Yet as I’ve said elsewhere, determining legitimate majorities on the basis of a survey is more difficult than running a simple yes-no poll — your sample must reflect the population, and must not be prone to any bias.

The Survey

The executive summary does not get off to a good start in its introduction:

Incentives in the form of prize drawings were offered to encourage participation.

You might almost say that this selects for students who are more likely to like ‘free’ things, who are willing to vote themselves ‘free’ services. Much as they want a free iPod, they also want free computers. And plasma screens. And lunches. (TANSTAAFL!)

The research relied on a convenience sample, as a link was advertised via email to all enrolled students. This type of sample is based on availability and accessibility, and can often produce samples that are quite similar to the population of interest when conducted properly. [Emphasis added – EML]

Oh, really? That’s funny, because my trusty Encyclopedia Britannica (lest any anti-Wiki crusaders doubt the source) says otherwise:

Probability sampling methods, where the probability of each unit appearing in the sample is known, enable statisticians to make probability statements about the size of the sampling error. Nonprobability sampling methods, which are based on convenience or judgment rather than on probability, are frequently used for cost and time advantages. However, one should be extremely careful in making inferences from a nonprobability sample; whether or not the sample is representative is dependent on the judgment of the individuals designing and conducting the survey and not on sound statistical principles. In addition, there is no objective basis for establishing bounds on the sampling error when a nonprobability sample has been used.

Most governmental and professional polling surveys employ probability sampling. It can generally be assumed that any survey that reports a plus or minus margin of error has been conducted using probability sampling. Statisticians prefer probability sampling methods and recommend that they be used whenever possible. [Emphasis added – EML]

To clarify: the Student Services Fee Survey is not professional, and its use is frowned upon by real statisticians; it has its legitimacy determined by the judgment of the authority issuing it; and it offers no objective basis on which it can be shown unrepresentative. I’ve already mentioned one possible bias that might manifest itself in this survey, and I’m sure that my colleague can name another.

Anyway, the oft-alluded-to, never-cited results:

On a question that indicated that $49 per semester would be needed to provide all possible initiatives, approximately half (51.22%) of the respondents indicated that they would be willing to pay between $46 and $85 a semester. However, 40.60% of the respondents indicated that they would be willing to pay $46 to $65 a semester, rather than the higher cost, suggesting that students are willing to pay what is necessary to fulfill the need, but not more than is necessary.

Even accepting this at face value, we have a quite slim majority of students supporting this level of fee — I would call this a tyranny of the majority, except that they didn’t even have the decency to win at the polls. Yet there are problems with even this modest claim, starting with the way the question is phrased. The same section as above, with certain lines elided and others emphasized:

On a question that indicated that $49 per semester would be needed to provide all possible initiatives . . . 40.60% of the respondents indicated that they would be willing to pay $46 to $65 a semester, rather than the higher cost, suggesting that students are willing to pay what is necessary to fulfill the need, but not more than is necessary. [Emphasis added – EML]

Huh? The question stipulates that $49 is all that is necessary for all possible initiatives — so why not just ask, ‘Would you approve a fee at $49 per semester?’ Of course, the problem was that this was exactly what they did at the polls — where they were roundly trounced. When in doubt, deception provides certainty.

Deception, or at the very least unseemliness, certainly applies to the way that the actual question was asked. From the survey itself:

Q17. The per-student/semester fee necessary to fulfill the entire need is $49.00. It is important to note that not implementing a fee can result in the reduction/elimination of services and/or programs. Indicate the largest amount you would support for a Student Services Fee:

The emphasis is as printed (although the original uses italics — and in WordPress block quotes, everything is already in italics), and serves to highlight the outrageousness of the sentence. Such a blatant insertion of slant into the question provides even more of a skew than the sampling selection process. Imagine, for instance, if the radical libertarian Murray Rothbard had designed the survey and the question read as follows:

Q17. The per-student/semester fee necessary to fulfill the entire need is $49.00. It is important to note that supporting a fee represents an increased amount of coercion on your fellow student, and is thus an immoral action. Indicate the largest amount you would support for a Student Services Fee:

Or perhaps, Karl Marx:

Q17. The per-student/semester fee necessary to fulfill the entire need is $49.00. It is important to note that if you do not increase the fee, you will indicate a slavish devotion to the bourgeois mentality, and support for repression of your fellow worker. Indicate the largest amount you would support for a Student Services Fee:

I could go on all day (actually, why don’t you? Best version of the survey question from a celebrity wins . . . a Congressional pony), but you get the point. Such a slant is inherently going to shift responses toward support of the fee; I have no doubt that printing the question

Q17. The per-student/semester fee necessary to fulfill the entire need is $49.00. Indicate the largest amount you would support for a Student Services Fee:

would result in far less support.

Then we have the issue of the categories. Students were allowed to vote for the following categories (with the count and percentage of support attached):

COUNT    PERCENT
544      10.62%    $66 to $85 per semester
2079     40.60%    $46 to $65 per semester
1353     26.42%    $26 to $45 per semester
859      16.77%    $1 to $25 per semester
286      5.58%     I would not support a Student Services Fee
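The arithmetic, at least, checks out internally; a quick sketch recomputing the reported percentages from the raw counts (category labels shortened for brevity):

```python
# Recompute the reported percentages from the survey's own counts, and check
# the summary's "51.22%" claim (respondents willing to pay $46 or more).
counts = {
    "$66 to $85": 544,
    "$46 to $65": 2079,
    "$26 to $45": 1353,
    "$1 to $25": 859,
    "No fee": 286,
}
total = sum(counts.values())  # 5121 responses to Q17
pct = {k: 100 * v / total for k, v in counts.items()}

for label, share in pct.items():
    print(f"{label:12s} {share:6.2f}%")

# The "approximately half" figure from the executive summary:
print(round(pct["$66 to $85"] + pct["$46 to $65"], 2))  # 51.22
```

The reported percentages do match the counts; the trouble, as discussed below, lies in how the categories were drawn, not in the division.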

To clarify, I’m not at all saying that they’re lying; faux statistics are better than that. Instead, they obscure, obfuscate, and generally mold the numbers to whatever suits their interests.

The question says that $49 per semester is required for everything, but offers groups for $26 to $45 and $46 to $65. Suppose you’re willing to support the $49 fee, but don’t want it to go any higher? If the survey had actually wanted to document whether people were willing to go beyond the estimated necessity, the categories would’ve been organized as follows:

$66 to $80
$51 to $65
$26 to $50
$1 to $25
I would not support a Student Services Fee

I suppose that it’s still obscuring those who would support $49 and not $50, but an equally strong case can be made that people get confused about whether it’s an open or closed interval (i.e., is it up to $49, or including $49?).
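Half-open intervals remove that ambiguity entirely: every dollar amount falls in exactly one bin. A minimal sketch (the bin edges are my own, chosen so that the $49 ‘full need’ figure sits strictly inside one category):

```python
from bisect import bisect_right

# Half-open bins: [lo, hi) includes lo but not hi, so no amount can land
# on a boundary and be claimed by two categories at once.
edges = [1, 26, 51, 66, 86]  # bins: [1,26), [26,51), [51,66), [66,86)
labels = ["$1 to $25", "$26 to $50", "$51 to $65", "$66 to $85"]

def bin_for(amount):
    """Return the category label for a whole-dollar amount, or None if out of range."""
    if amount < edges[0] or amount >= edges[-1]:
        return None
    return labels[bisect_right(edges, amount) - 1]

print(bin_for(49))  # "$26 to $50" -- the $49 fee has one unambiguous home
print(bin_for(50))  # "$26 to $50"
print(bin_for(51))  # "$51 to $65"
```

With categories drawn this way, ‘exactly the estimated necessity’ and ‘more than the estimated necessity’ are cleanly separated, which is the distinction the survey claims to be measuring.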

Family Feud

More importantly, would it have been too difficult to make this a sliding scale? The study could even have kept the “$49 = everything” clause and set that as the default point on the scale. Students could then move the scale up or down, depending on how much they would support. This provides data points, rather than bar graphs, a small but genuine improvement.

Going back to the numbers, flawed as they are, you could just as easily make an equally strong case that nearly half of students do not support funding the entire gamut of student services; that nearly half of students support cuts in programs.

Demographics

Almost more important than how students responded is which students responded. Breakdown of turnout in elections is interesting; but in a survey, it is absolutely essential — to be legitimate, the sample must be representative of the population.

The survey justifies its numbers by saying:

“Out of the 32,000 students surveyed, 5,111 complete responses were obtained, for an overall response rate of 16%. Given the response rate and size of the population, a 1.65% margin of error (with 99% confidence) was obtained.” [Emphasis added – EML]
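For what it’s worth, the quoted 1.65% does fall out of the textbook margin-of-error formula with a finite-population correction. But that formula is only valid for probability samples, which is precisely what a convenience sample is not. A sketch of the calculation:

```python
import math

# Reproduce the survey's claimed margin of error under the (unwarranted)
# assumption of simple random sampling, with a finite-population correction.
N = 32000   # population: students emailed the survey link
n = 5111    # complete responses
z = 2.576   # z-score for 99% confidence
p = 0.5     # worst-case proportion (maximizes the margin of error)

fpc = math.sqrt((N - n) / (N - 1))              # finite-population correction
moe = z * math.sqrt(p * (1 - p) / n) * fpc
print(f"{100 * moe:.2f}%")                      # 1.65%
```

So the number itself isn’t fabricated; it’s just meaningless here, since (per the Britannica passage above) there is no objective basis for a sampling-error bound on a nonprobability sample.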

Yet judging the response solely on the basis of its size ignores that it is not the size of the sample that counts, but its composition. If the 5,111 had all been freshmen, you could hardly call the survey ‘representative.’ First, we’ll look at the demographics for gender:

Female                   61.5%     3115
Male                     37.8%     1915
Transgender              0.08%     4
Prefer not to respond    0.61%     31

Since the UA Fact Book doesn’t have “Transgender” or “Prefer not to respond” categories, we’ll have to leave them out. From the UA Fact Book’s gender breakdown:

Female        52.89%
Male            47.11%

I’ve not the time nor the intellectual reserves to go all t-test or F-test on this (but if we have any readers with background in statistics, go forth, and multiply! If you can break these numbers down, I’ll buy you a six-pack. Seriously!), but I can do a quick percentage deviation calculation:

Female: 16.27% (above expected value)

Male: 19.76% (below expected value)
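For any reader tempted by that six-pack, a chi-square goodness-of-fit test on the gender counts (excluding the “Transgender” and “Prefer not to respond” rows, as above) makes the point formally. This is a sketch, not a full analysis; the expected shares are the Fact Book figures quoted above:

```python
# Chi-square goodness-of-fit: does the survey's gender mix match the
# UA Fact Book's population shares?
observed = {"Female": 3115, "Male": 1915}
expected_share = {"Female": 0.5289, "Male": 0.4711}  # UA Fact Book

n = sum(observed.values())  # 5030 respondents with a listed gender
chi2 = sum(
    (observed[g] - n * expected_share[g]) ** 2 / (n * expected_share[g])
    for g in observed
)
print(round(chi2, 1))
# With 1 degree of freedom, the 1% critical value is 6.63; a statistic
# well into the triple digits means the gender mix of the sample is
# wildly unrepresentative of the student body.
```

In other words, the over-representation of women isn’t a rounding quibble; it is enormous by any conventional significance threshold.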

The survey also offers some fun pop sociology:

Females tended to rate initiatives as more important than males, regardless of the content of the initiatives. A notable exception is question 10 (increased availability of free legal advice), which males rated as significantly more important than females. (F=2.741, p<.05)

Despite rating initiatives as less important, males, on average, indicated that they are willing to pay a greater fee per semester (question 17) than females (F=5.393, p<.01)

Commentariat, discuss!

The race/ethnicity breakdown is not as clear, especially since a full 10.99% answered with “multiracial,” “not listed,” or “prefer not to respond.” Where there are percentage deviations, however, they indicate that the sample over-represents White students (by less than a percentage point) and Asian/Pacific Islander students, and under-represents African-American, Hispanic, and Native American/American Indian students.

The class breakdown is dominated by freshmen and graduate students. The UA Factbook doesn’t have a full undergraduate class breakdown, but it does differentiate freshmen from the rest.

Percentage Deviation (freshmen): 50.88% above expected value

Percentage Deviation (other undergraduate): 22.04% below expected value

Percentage Deviation (graduate): 22.64% above expected value

I assume that by this point you won’t be surprised that:

Freshman (and to a lesser extent sophomores) tended to rate services as more important than other students. Further, they endorsed a willingness to pay significantly more per semester than juniors and graduate students (F=2.284, p<.05).

Yet even though graduate students opposed higher amounts, they certainly didn’t oppose allocating a good portion of the total SSF to “student travel.” The Fee makes an attempt to mitigate this Freshman Fury by charging the fools $10 per semester to support “freshman specific programming.” Yet if these programs are actually ‘freshman specific,’ shouldn’t they just charge a user fee?

Finally, we have living status. Without any accurate living status measurements to work with, we’ll have to just look at spending trends:

Initiatives tended to be more important to those students living on campus (defined as those in Greek housing or living in the residence halls), than to those living off campus (either with family or in off-campus housing). Additionally, students living in residence halls were willing to pay significantly more than those living off campus (F=6.909, p<.001).

This makes a lot of sense as well — students who live on campus also tend to live in the unions, and utilize on-campus programs at a greater rate than off-campus students. Why not, then, have students who live on campus pay a, say, $15 per semester “programming fee” that’s attached to rent? This is about as close to a perfect user fee as you’re going to get.

An underutilized statistic was the response to a question on methods of payment. The problem, of course, is that it was a ‘check all that apply’ question, making any sort of breakdown or trend almost impossible. Were they to run the survey again, I would like to see the question rephrased as, “Please indicate the primary method of payment for your college education.” Then, I’d like to see how strong a positive trend existed between those whose primary method was their parents and support for the fee.

The overall lesson from the demographics is that the groups more inclined to spend money – women, freshmen, students living on campus – were over-represented. And this makes intuitive sense — someone who doesn’t want a fee is far less likely to respond to a survey about a fee than someone who wants one.

So, this isn’t just a bad survey — it’s a poorly run, biased, unrepresentative survey. Yet this is the guiding document used to justify taking $80 of YOUR dollars for plasma TVs next year.



