Missing the Marks

Over the last several years, I have been very pleased to see an emphasis on program assessment within the NIRSA community. Because we have taken the time to perform intentional assessment, we have evidence of a positive correlation between students’ use of campus recreation facilities and higher GPAs and retention rates. We have been able to show that our student employees improve their transferable skills by working within our programs and facilities. We have documented evidence that students gain leadership skills by participating in intramural and club sports. Additionally, we can use program assessment on a day-to-day basis to show whether our intended goals for our programs and services are being met, and how we can tweak our programs for continuous improvement.

This is all good stuff. Truly.

Which makes it so much more disappointing when we, as individual members of NIRSA, don’t take the time to evaluate our own conference sessions.

Recently, a new professional and a graduate student from my department gave a presentation together at a regional conference. It was their first time presenting, and they worked hard to make their presentation engaging and relevant. They felt prepared, but when they saw that their presentation room had 100 chairs, they started feeling a bit nervous. Once they began, the chairs were full, with some attendees standing in the back of the room. They finished going through their information, and ended up fielding a number of questions from the audience. Attendees from our school who were in the audience all thought their presentation went well; however, we were looking forward to feedback from other NIRSA members who would not be quite as biased as our colleagues.

The team received eight evaluations, and five of those included written comments. In a room of 100-120 attendees, only eight people provided feedback of any kind.

I have attended NIRSA conferences regularly over the last 15 years, going back to the days when session evaluations were still completed with pencil and paper. I have sat through some really great sessions (including one given by George Brown on Student Learning Outcomes that changed the direction of my career), and I have labored through some really awful presentations that seemed thrown together at the last minute. I have tried to at least give a number rating for every presentation I have attended, and have written comments when I could. That feels like the least I can do when someone has taken the time to put themselves out there and share information with the group.

It is fantastic that we take program evaluation so seriously. But shouldn’t we also take assessment seriously when it comes to our NIRSA conferences? Otherwise, how will we ever get better?

Needs More Cowbell

Photo by Thomas Kohler, licensed CC BY-NC-SA 2.0

As the cost of higher education soars, colleges and universities have come under closer scrutiny by parents and students to determine if they are receiving commensurate value from their tuition dollars. In response, Student Affairs divisions have become much more adept at showing how we support student success through our co-curricular programs and services. The stage has been set, and through our assessment efforts, we have been telling our stories to an audience that expects a high performance standard.

And if intentional assessment has been the platform for our storytelling, then Student Learning Outcomes (SLOs) have become the rock star. The incessant drumbeat from SACS reviews, CAS standards, and Learning Reconsidered that has driven us to create and assess SLOs within each Student Affairs department has steadily taken hold, and we can proudly sing about the skills that students learn by participating in our programs.

In Campus Recreation, as well as in other departments within Student Affairs, Student Learning Outcomes assessment has become the lead singer, receiving most of the effort and attention. But the glitz and glamour of SLOs have taken attention away from perhaps the most important measures that lead to student success: participation numbers and student satisfaction. These operational outcomes—what I’m calling the “cowbell”—are always there, are usually assessed, but are easily ignored in the shadow of Student Learning Outcomes assessment. Yes, SLOs are important, but what we really need, particularly in Campus Recreation, is more cowbell. Here’s why.

  1. Students who participate in recreational programs and services have higher GPAs, and are retained at a higher rate, than the average student population.

Reports from schools such as Michigan State University, the University of Iowa, and the University of Arkansas show a correlation between gym use and GPA, with gym users earning higher GPAs than non-users. The same reports show a correlation between gym use and retention, with users retained at a higher rate than non-users. We have noticed the same correlations at my school, and hope to formalize the findings at the end of this assessment cycle (a sketch of the basic computation follows this list).

  2. Exercise resets and recharges your brain.

Harvard Medical School psychiatrist John Ratey explains the cognitive benefits of exercise in his book, Spark: The Revolutionary New Science of Exercise and the Brain. According to Ratey, “Exercise is the single best thing you can do for your brain in terms of mood, memory, and learning.” Exercise has been shown not only to improve learning, but also to reduce stress and lift depression.
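
For anyone who wants to run these numbers on their own campus data, the core computation is small. Below is a minimal sketch in Python; the file name and column names are hypothetical stand-ins, not taken from any of the reports cited above.

```python
# A minimal sketch (not any school's actual analysis): correlating
# per-student rec center visit counts with GPA. Assumes a
# hypothetical students.csv with "visits" and "gpa" columns.
import pandas as pd
from scipy import stats

df = pd.read_csv("students.csv")  # one row per student

# Pearson correlation between semester visit counts and GPA
r, p = stats.pearsonr(df["visits"], df["gpa"])
print(f"r = {r:.3f}, p = {p:.4f}")
# A positive r shows association, not causation -- active students
# may differ from non-users in other ways, too.
```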

So let’s take a look at our student participation numbers, and make a plan to reach out to non-users to get them involved with our programs and services. Let’s collaborate with other campus partners, such as student health and counseling centers, to provide wellness programs to students who are struggling with health issues that impact their classroom learning. Let’s provide a safe, clean, friendly environment in our facilities that makes students feel at home and keeps them coming back. Let’s collect student feedback on our programs and services, and make changes according to that feedback, so that students know we value their opinions and want to meet their needs. Then let’s assess our plans to improve student participation and satisfaction, see what has worked and what hasn’t, and develop an improvement plan. And let’s keep running the numbers to see whether the correlations between gym use and GPA, and between gym use and retention, continue to hold.

Student Learning Outcomes assessment is great, but studies show that we need to start featuring the cowbell. An increase in student participation in our recreation programs can mean an increase in student success.

I’m telling you. You’re gonna want more cowbell!

Photo by Mike Kline, licensed CC BY 2.0

Closing the Job Skills Gap: Learning Outcomes in Student Employment

Poster advertising student employment

“Why are so many college students failing to gain job skills before graduation?” This is the title of a Washington Post article that caught my attention. It cited a survey, conducted in November 2014 by Hart Research Associates, which found that employers don’t feel that today’s college students are graduating with important skills necessary to be successful in their workplaces. Skills that employers felt were most important for their employees, such as communication, problem solving, and decision making, were also skills they thought college students lacked upon graduation.

Employers have long complained of the disparity between the skills they feel college graduates should have and the skills graduates actually possess. But with the ever-rising cost of higher education, colleges and universities are being held to a higher standard of accountability for delivering an excellent product. That scrutiny provides a perfect opportunity for Student Affairs to shine, since we offer programs and services that teach the very skills employers desire.

One way that Student Affairs can help students develop skills for future success is by hiring them to work campus jobs, and providing intentional development opportunities during their employment. When we provide ongoing training, such as sessions on conflict resolution, decision making in undefined circumstances, and understanding personality differences, and we loosen the reins so our employees can start using the skills we teach them, their job skills are bound to improve. Even if the work is unrelated to their college major, the real-life experience gained through campus employment is transferable across all majors. And with the right assessment in place, we can prove it.

For example, Campus Recreation offers unique opportunities for student employees to improve the skills that employers most value. Below is the current list of skills that four in five employers rate as very important (a rating of eight, nine, or ten on a zero-to-ten scale), along with Campus Recreation job responsibilities that develop these skills:

1. The ability to effectively communicate orally
2. The ability to work effectively with others in teams
3. The ability to effectively communicate in writing
4. Ethical judgment and decision-making
5. Critical thinking and analytical reasoning skills
6. Ability to apply knowledge and skills to real-world settings

Campus Recreation student job responsibilities: Supervising staff and delegating tasks; sports officiating; teaching group fitness classes; providing personal training instruction; providing written and verbal employee evaluations; planning staff development sessions; teaching job duties to new hires; collaborating on policy development; answering patron questions; explaining and enforcing policies; resolving conflicts; responding to emergencies; troubleshooting facility problems; providing ongoing positive and corrective feedback; documenting problems and concerns; requesting procedural changes; developing employee handbooks.

In my department, we assess whether students are learning these skills through both direct and indirect measures. Intramural officials are evaluated by their supervisors and receive written and verbal feedback on a regular basis. Emergency drills are conducted monthly for both recreation center and aquatic center staff, again with written and verbal feedback. Student employees receive informal mid-semester evaluations of their job performance, and written evaluations are completed at the end of each semester.

But my favorite assessment is where the students write about their own learning. Student employees complete exit surveys and blogs to explain how they think their transferable skills have improved as a result of working in Campus Recreation. Here are a few examples taken from assessment results:

“I learned to fine tune ideas and fully plan out the details before presenting them to others.”

“I’ve also learned when to “pick your battles” – knowing when to fight for something, and when to let things go. Sometimes your way isn’t always the best way, and you need to remember the goals and needs of the department over your own.”

“Before coming to college and working at the Johnson Center people just tended to listen and go along with leadership. But in college people start developing their own minds and thoughts and how they want to approach things. It is more realistic to see everyone have their own viewpoint, than for everyone to agree. So being in meetings where collaboration was important taught me its value. I know that a good collaboration environment is open and free from criticism. Everyone’s voice must be welcome and applauded, if it’s not then collaboration will not work.”

“My major at UK is communications. So naturally my communication competency has gone up from my education. But the ability to take what I learn in the classroom and apply it to my supervising at the JC has been great. I now have a better awareness for cultural and personal differences. I know how to navigate different situations with different people. I can do this because I know how to communicate with them on an individual basis.”

“Over the past few years I have handled several emergency situations. After being a part of so many of these incidents, I feel that I can appropriately handle almost any situation.”

“I have watched myself grow as a person and as an employee in my 3 years at the JC, and I have watched the respect and trust of my fellow employees in me grow. Because I have successfully gained the trust of my employees, I feel more confident in everything from group projects to interviews to other jobs.”

By providing ongoing staff training for our student employees, allowing them to be involved in planning and decision making during their employment, and giving them the freedom to apply their skills and make decisions on their own, we can close the gap between what employers expect from graduates, and the actual skills the students acquire before graduation. And by conducting appropriate assessments, we can demonstrate, not only to university administrators and other stakeholders, but to the students themselves, the value of their job experience in Student Affairs.

Learning Reconsidered… Reconsidered

For the past several years there has been a push within Student Affairs to identify and assess student learning outcomes. Five years ago, the idea of assessing learning outside of the classroom was new to me, and I was encouraged to read “Learning Reconsidered 2,” which I eagerly devoured. Reading the book prompted me to look at my work from a different perspective. It shook things up a bit, creating that fizz of excitement that makes everything seem new and interesting.

However, the learning outcomes initiative that created a pleasant fizz seems to have resulted in an undesirable explosion. “Assessment” has come to mean “student learning outcomes assessment.” All other assessment that has nothing to do with learning outcomes, such as attendance numbers and student satisfaction, seems at best to be unimportant, and at worst is pooh-poohed as not being “real assessment.”

I completely agree that we need to assess student learning outcomes within Student Affairs, and we have developed programs in my area that promote and measure student learning. However, in Campus Recreation, most of our programs and services are designed to serve students’ needs, not to measure student learning. And according to current research, this model is completely appropriate for supporting student success.

For example, a 2002 study entitled “The Value of Recreational Sports on College Campuses” indicates that students who participate in physical activity and recreational sports experience improved emotional well-being, reduced stress, and greater happiness. In a spring 2014 survey of 2,500 randomly selected students at my university, 78% of respondents reported that our facilities and programs helped them feel more at home at our university, 90% reported that they enjoyed participating in our activities or utilizing our facilities, and 92% agreed that our facilities and programs improved the quality of life at our university.
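
Percentages like these fall straight out of a frequency count of the survey responses. Here is a minimal sketch of that tabulation, assuming a hypothetical responses.csv with one column per survey item coded 1-5; the column names are illustrative, not our actual instrument.

```python
# Sketch: percent of respondents choosing "agree" or "strongly agree"
# (4 or 5 on a 5-point scale) for each item. responses.csv and the
# column names below are hypothetical.
import pandas as pd

df = pd.read_csv("responses.csv")
for item in ["feel_at_home", "enjoy_participating", "quality_of_life"]:
    pct = (df[item] >= 4).mean() * 100  # share of 4s and 5s
    print(f"{item}: {pct:.0f}% agreement")
```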

Other recent research has shown a correlation between students’ use of campus recreation facilities and their GPAs. In fall 2013, Purdue University found that freshmen who visited their rec center 15 or more times during the semester had a higher average GPA (3.08) than those who did not utilize the facility (2.81). In fall 2010, Michigan State University found that freshmen who purchased memberships at their recreation center had higher GPAs and stayed in school longer than those who did not. Both of these studies support research conducted by John J. Ratey, M.D., author of “Spark: The Revolutionary New Science of Exercise and the Brain,” which explores the brain-fitness connection.
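
A Purdue-style comparison is also straightforward to sketch. The 15-visit threshold below comes from the Purdue report; the file and column names are hypothetical stand-ins for your own campus data.

```python
# Sketch of a Purdue-style comparison: mean GPA for frequent rec
# center users (15+ visits) vs. everyone else. freshmen.csv and its
# columns are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("freshmen.csv")
frequent = df.loc[df["visits"] >= 15, "gpa"]
infrequent = df.loc[df["visits"] < 15, "gpa"]

print(f"15+ visits: mean GPA {frequent.mean():.2f} (n={len(frequent)})")
print(f"under 15:   mean GPA {infrequent.mean():.2f} (n={len(infrequent)})")

# Welch's t-test: is the difference in means statistically significant?
t, p = stats.ttest_ind(frequent, infrequent, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```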

What does this mean? For me, it means that student success, in part, depends on students utilizing our programs and services. Because of this, we need to do everything we can to keep students coming to the recreation center. We also need to understand the factors that prevent students from utilizing our programs and services, and change what we can to limit these impediments.

How do we gauge how well we’re meeting these challenges?

Attendance counts. Satisfaction surveys. Various other operational assessments.

We can impact a small percentage of the student population with programs that promote student learning outcomes, but we can impact student success on our entire campus just by improving the participation rate for our programs and services. Now THAT is something worth assessing!

I would love to hear your comments on student learning outcomes assessment vs. operational assessment.

I Am Still Learning (Thank God)

Photo by Anne Davis, licensed CC BY
 
I just returned from a meeting with my Student Affairs Assessment Committee. We attended a webinar where Campus Labs demonstrated some of their products. Afterwards, the attendees voiced their thoughts about the products, as well as assessment in general. I left the meeting with two equally strong impressions: 

1. Our committee has some highly intelligent and dedicated individuals who are doing everything they can to help their departments produce quality assessments. They see assessment as an integral part not only of their jobs, but of the entire division: a necessary step for continuous improvement and for showing that what we do provides value to our stakeholders.

2. I am just a dumb jock. 

Through most of the hour-and-a-half web presentation, and the hour-long discussion afterwards, I had little clue what any of it meant. Even something as basic (I assume) as SPSS was foreign to me. But then you toss in buzzwords like “key performance indicators” and “regression analysis,” large data sets, and names of educational theorists I’d never heard of (OK, I’d at least heard of Kuh), and my head starts to hurt. I feel like I am in a foreign land where everyone around me talks as if I understand what they’re saying, but in truth I don’t speak one word of the language. I want to stand up and scream, “I don’t understand any of this,” but I am too embarrassed. After serving on the Assessment Committee for the past four years, I have never felt so unqualified to be in the room.

I don’t really think I’m dumb. I’ve just never taken a modern statistics course. There have been many changes in Student Affairs within the past 10 years that I’ve had to learn on my own. Technology, sustainability, assessment: they all come with their own requirements and even their own language. I’ve had to learn about these things either on my own or at conferences, while still performing the daily tasks that my “regular” job requires. Some would argue that these things ARE my regular job, and I would agree, but the level of learning required has kept me late at the office many, many times. And just when I thought I was getting a handle on current trends, I realize that I know almost NOTHING about statistics. And I really, really need to know statistics. I WANT to know statistics. But today, at this moment, the task just seems so daunting. I will be starting from chi-square one while the rest of my committee moves on to more advanced statistics, and I fear I’ll never be on the same page. I thought about that “I Love Lucy” episode where the assembly line goes faster and faster, and Lucy can’t keep up. And just like Lucy, I can’t keep up with my Assessment Committee colleagues. I briefly fantasized about quitting. It was all too much.
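
As it turns out, chi-square one is a smaller step than it felt in that meeting. A toy chi-square test of independence, with counts invented purely for illustration, takes only a few lines:

```python
# Toy chi-square test of independence: is retention associated with
# rec center use? Every count below is invented for illustration.
from scipy.stats import chi2_contingency

#            retained  not retained
table = [[450, 50],    # gym users
         [380, 120]]   # non-users

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
# A small p suggests the variables are associated; it says nothing
# about which way any causation runs.
```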

But the thing is, there will always be something new to learn. Students graduating today with their master’s degree in Higher Education will, ten years from now, find themselves scrambling to keep up with new skills, ideas, and technologies. They will have to learn on the fly, just like me. I don’t want to adopt the attitude of an employee I knew who, when asked to learn how to organize a spreadsheet in MS Excel, replied, “I will be retiring in a few years. I don’t want to learn anything new.” 

So bring it on. Whatever I need to learn to help me stay current in my job, then by god, I’m going to learn it. And, thank god, I WANT to learn it. For me, the worst thing would be to tire of learning. So in the words of Bob Dylan:

Come gather ’round people 
Wherever you roam 
And admit that the waters
Around you have grown
And accept it that soon
You’ll be drenched to the bone

If your time to you 
Is worth savin’
Then you better start swimmin’
Or you’ll sink like a stone
For the times they are a-changin’.

Can’t Get No Satisfaction Assessment

 

With the publication of Learning Reconsidered I & II, there has been a critical push for co-curricular departments within Student Affairs to develop and assess student learning outcomes. As a Campus Recreation professional, I support this completely. For years, those in my profession have “known” that programs we offer provide learning opportunities for students. Now, because of the culture change towards intentional assessment, we have learned how to conduct assessment to show that learning takes place.

But though I am pleased with the number of programs offered through my department that assess student learning outcomes, most of what we do in campus recreation is “operational” in nature. We want to provide the best facilities and services possible so that students can participate in the physical activity of their choice. We strive to create a recreational environment where students feel welcome, where they can exercise without barriers, where they have opportunities during various times of the day to come and relieve stress and play with their friends. When they walk through our doors, we want students to be glad that they chose to attend our university because we have such a wonderful recreational center that meets their needs. We want them to exercise and leave happier, less stressed, and more prepared to face their academic pursuits than when they entered the facility.

How do we assess these operational outcomes? Mainly through satisfaction surveys.

The survey we use is sent via an emailed link to 2,500 randomly selected students. Most questions ask students to select their level of satisfaction with a variety of items related to our programs and services, including hours of operation, cleanliness of the facility, availability of equipment, customer service, and the number and type of group fitness classes and intramural contests offered. Through the survey results, we have received invaluable information and have made improvements to our facilities and services that have increased student satisfaction in subsequent surveys.
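
The random selection itself is easy to script. Here is a sketch; the 2,500 matches our sample size, but the file names are hypothetical stand-ins for however your registrar delivers the enrollment list.

```python
# Sketch: drawing a simple random sample of 2,500 students to receive
# the survey link. emails.txt is hypothetical: one enrolled-student
# email address per line.
import random

with open("emails.txt") as f:
    emails = [line.strip() for line in f if line.strip()]

random.seed(42)  # fixed seed so the draw can be reproduced/audited
sample = random.sample(emails, k=2500)  # sampling without replacement

with open("survey_sample.txt", "w") as out:
    out.write("\n".join(sample))
```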

So imagine my confusion (and frustration) when I recently read the following tweet from a Student Affairs professional:

“Satisfaction surveys is not assessment work… stop it, NOW!”

After further investigation, I found that the tweet was sent during a chat session with the authors of Learning is Not a Sprint: Assessing and Documenting Student Leader Learning in a Co-curricular Environment. I assume that what the author of the tweet meant was “using satisfaction surveys to assess learning is not assessment work…” And I would wholeheartedly agree with that statement: satisfaction surveys do not show that learning has occurred.

In general, though, this is what I find confusing (and frustrating) about Student Affairs assessment: when “assessment” is discussed, the focus is almost exclusively on student learning outcomes, with very little mention of what is perhaps our most important work: delivering quality services to our students. The reason for this focus could be that, until a few years ago, we didn’t understand the need for a co-curricular unit to develop and assess student learning outcomes. Emphasizing student learning outcomes assessment created an awareness of the concept, so that now most Student Affairs units are doing this type of assessment.

But shouldn’t it be time to again start including operational outcomes in the conversation? I love to talk about assessments we’ve conducted showing that our 200 student employees have learned transferable skills; that our 75 weight loss program participants have learned how to design their own weight training program; that our 80 club sports officers have learned how to develop a risk management plan that is customized for the unique needs of their sport. But for me, an even more important assessment question is this: 

Are we meeting the recreational needs of the 3,000 students per day who use our programs and services?

More Than Tossing Out a Ball

As I struggle to become more comfortable with “public” writing, there is one aspect of my blog with which I am pleased: the title. Although “Campus Recreation Reconsidered” is not the catchiest title I’ve seen, still it represents the rekindling of excitement and interest in my chosen profession that began about four years ago. 


Since I started working in Campus Recreation over 25 years ago, I’ve gotten a sense that those outside the profession think that what I do is little more than throwing a basketball out on a court and letting kids play. And honestly, for many years my main goal was just to provide an outlet for college students to be able to pursue their recreational interests and have fun. 

Then the National Intramural and Recreational Sports Association (NIRSA) published “The Value of Recreational Sports in Higher Education,” which showed that “participation in recreational sports programs and activities is a key determinant of college satisfaction, success, recruitment, and retention.” That’s when the conversation really started to change. Terms like “intentional learning” and “assessment”—words that I had never heard associated with my profession—started popping up at conferences. And I started paying attention. 

Then two things happened that set my course for the last four years. First, I attended a NIRSA educational session on Student Learning Outcomes, presented by Dr. George Brown from the University of Alabama. During his presentation he talked about the different areas within Campus Recreation where learning can take place, and the importance of identifying these areas, establishing expected learning outcomes, and assessing those outcomes. During this session he also mentioned the book, “Learning Reconsidered 2.” Dr. Brown’s session intrigued me, so when I returned from the conference, I ordered and read “Learning Reconsidered 2.” (You can read the whole book at this link.) I was inspired.

That’s when the second thing happened to set my course. Right after I finished reading the book, my boss sent an email to the Campus Recreation professional staff asking for volunteers to serve on a Student Affairs Assessment Committee. I jumped at the chance (my Campus Rec colleagues like to say I “dove on the grenade”).

During the first year, everyone on the committee struggled through what for us was uncharted territory. Most of us had not conducted any kind of formal assessment in our areas. We confused “operational assessment” with “student learning outcomes.” We asked for advice from our university’s assessment office, and they seemed almost as lost as we were in trying to determine learning outcomes for non-academic departments. Over the next several years we muddled through the frustration of trying to cram our square peg of “academic support units” into the round hole of student learning outcomes. But we’ve slowly gained understanding, and we’ve continued to pursue intentional assessment within our departments. And even though I’m still not very good at it, I’ve found that I love it.

I know that studies have shown that participation in recreational sports has value for college students. So now the question I strive to answer every day is this: do my recreation department’s programs and services have value for our students? This question has led me to pursue different ways to assess my own area of facilities management to find out. I have developed a Graduate Assistant Development Program, where I have established learning outcomes and hope to show that by working as a facilities graduate assistant, my students will gain the knowledge and skills necessary to succeed in a professional job. Assessing outcomes for experiential learning has been a challenge, so requiring my GAs to write reflective blogs is my latest attempt to use indirect measures to show that learning has taken place. And I’ve joined them in blogging, for my own professional development.

So that “grenade” that I fell on has turned out to be my phoenix. My enthusiasm for my department and my profession is renewed daily. I understand, support, and contribute to the process of looking at our programs and services through the lens of how they help us to accomplish our mission. Campus Recreation, reconsidered. Indeed.