Missing the Marks

likert-scale

Over the last several years, I have been very pleased to see an emphasis on program assessment within the NIRSA community. Because we have taken the time to perform intentional assessment, we have evidence of a positive correlation between students’ use of campus recreation facilities and higher GPAs and retention rates. We have been able to show that our student employees improve their transferable skills by working within our programs and facilities. We have confirmed that students gain leadership skills by participating in intramural and club sports. Additionally, we can use program assessment on a day-to-day basis to show whether our intended goals for our programs and services are being met, and how we can tweak our programs for continuous improvement.

This is all good stuff. Truly.

Which makes it so much more disappointing when we, as individual members of NIRSA, don’t take the time to evaluate our own conference sessions.

Recently, a new professional and a graduate student from my department gave a presentation together at a regional conference. It was their first time presenting, and they worked hard to make their presentation engaging and relevant. They felt prepared, but when they saw that their presentation room had 100 chairs, they started feeling a bit nervous. Once they began, the chairs were full, with some attendees standing in the back of the room. They finished going through their information, and ended up fielding a number of questions from the audience. Attendees from our school who were in the audience all thought their presentation went well; however, we were looking forward to feedback from other NIRSA members who would not be quite as biased as our colleagues.

The team received eight evaluations. Of those eight, five included written comments. In a room with 100–120 attendees, only eight people provided feedback of any kind.

I have attended NIRSA conferences regularly over the last 15 years, since the days when session evaluations were still completed with pencil and paper. I have sat through some really great sessions (one given by George Brown on Student Learning Outcomes changed the direction of my career), and have labored through some really awful presentations that seemed thrown together at the last minute. I have tried to at least give a number rating for the presentations I have attended, and have written comments when I could. That feels like the least I can do when someone has taken the time to put themselves out there and share information with the group.

It is fantastic that we take program evaluation so seriously. But shouldn’t we also take assessment seriously when it comes to our NIRSA conferences? Otherwise, how will we ever get better?

Using Google Quizzes with Online Student Employee Training

bbp_training

Training student staff to work in a recreation facility is an important step in ensuring that customer service, risk management, and job skills are all communicated effectively to employees. Though much of the training we require is more effective when conducted in person, some parts can be delivered just as effectively online. Sometimes, combining online information with in-person training is even better! If students receive the information online first, more in-person time can be spent on the practical application of that information.

If we decide to provide training information online to our student staff, it is important that they understand the training before continuing on to the practical application. After students complete an online training session, we can test understanding using a Google quiz. The following provides step-by-step instructions for creating and using Google quizzes.

First, create a Google form using quiz questions you have written that are based on the training you provided. If you have never created a Google form before, you can find step-by-step instructions here. Most types of questions will work as long as there is an exact answer expected. For example, multiple choice questions work well, but an essay answer using a paragraph box may not be appropriate, since correct answers may differ slightly from the answer key. It would also be a good idea to select “Required” for each question so that students don’t inadvertently skip a question.

create_form

 

Once you have created your Google form, you can convert it to a quiz. In the upper right-hand corner of the form you will see a “gear” icon, which will take you to the “Settings” menu. Select “QUIZZES” from the menu.

select_quizzes

You will then have the option to make the form a quiz by sliding the first button to the right. If you want your students to have immediate feedback on their quiz results, select “Immediately after each submission” under the Release Grade option. You can also select which options you would like your students to see. If students will need to retake the training if they don’t meet the minimum score, you may want to show which answers they missed, but not the correct answers. These options are up to you to either check or uncheck.

quiz_settings

Next, you will need to create an answer key so that student submissions will be automatically scored. Click on the first question in the quiz. You will now see a link at the bottom called “ANSWER KEY.” Click on that link.

create_answer_key

You will then be able to choose the correct answer, and set the point value for that question.

set_answer_key
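Under the hood, the auto-grading is just a comparison of each submission against the answer key, with matching answers earning their point values. Google does this for you, but as a sketch of the logic, here is a minimal Python version using hypothetical questions, answers, and point values:

```python
# Sketch of quiz auto-grading: compare each submitted answer against the
# answer key and sum the point values for matches. Questions, answers,
# and point values below are hypothetical.

ANSWER_KEY = {
    "How often should gloves be changed?": ("Between each patron contact", 25),
    "Where is the spill kit stored?": ("At the front desk", 25),
    "Who do you call first in an emergency?": ("The facility supervisor", 25),
    "How long should you wash your hands?": ("At least 20 seconds", 25),
}

def score_submission(responses):
    """Return (points earned, total points possible) for one submission."""
    earned = sum(
        points
        for question, (correct, points) in ANSWER_KEY.items()
        if responses.get(question) == correct
    )
    total = sum(points for _, points in ANSWER_KEY.values())
    return earned, total

# A student who misses one 25-point question scores 75 out of 100.
responses = {
    "How often should gloves be changed?": "Between each patron contact",
    "Where is the spill kit stored?": "At the front desk",
    "Who do you call first in an emergency?": "The facility supervisor",
    "How long should you wash your hands?": "A quick rinse",
}
earned, total = score_submission(responses)
```

This earned/total pair is essentially what appears on the student’s confirmation screen and in the responses spreadsheet.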

Continue to select the correct answer and choose the point value for all subsequent questions. Once you complete this step, you will want to set up a confirmation message so that students can see their score after they submit their answers. Click the “Settings” icon at the top of the page, and select “PRESENTATION.” Then type in a “confirmation message” that directs students on how to find their score, and instructs them on what to do if they did not earn the minimum score to pass the quiz. Once you have typed the confirmation message, select “SAVE.”

show_score_link

When students complete the quiz and submit their answers, they will receive a confirmation screen containing a link they can select to view their score. Below is an example based upon the confirmation message that was set in the above image.

confirmation

You will now want to choose how to collect quiz responses. Select the “RESPONSES” menu item at the top of your quiz.

choose_responses

A screen will appear that contains a spreadsheet icon. Click on the icon.

select_spreadsheet

You will be prompted to select a response destination. I would recommend selecting the “Create a new spreadsheet” option.

create_new_spreadsheet

The new spreadsheet that will collect all of your quiz responses will open on the screen. The spreadsheet will show the name of the student who submitted the response, the day and time the response was submitted, their score, and the answers they selected for each question.

google_quiz_responses

If you have determined the minimum score necessary for students to pass the quiz, you can easily see if a student has failed by using “conditional formatting.” Conditional formatting allows you to create a rule where if a certain condition exists, then the formatting in that cell will be different from all the other cells. On the top menu, select “Format,” then “Conditional formatting…”.

format

A box will pop up on the right side of the spreadsheet that you will use to create rules for formatting your spreadsheet. First, select the range of cells for the formatting. In this case, you want to format the column that contains the quiz scores.

select_range

Next, choose the condition for formatting the cells. The default selection is “Cell is not empty.” You can change this condition by clicking on the arrows beside this option.

format_cells_if

A drop-down list appears with choices for the formatting condition. Since you want to be alerted if a student fails the quiz, select “Less than.”

less_than

Once “Less than” is selected, you can insert the minimum value needed for a passing grade. In this case, students must score at least 80% in order to pass the quiz. After you determine the formatting condition, you can choose the style you want to use. The default formatting style is to make the cell green. If you would rather use a different color, then select the arrow beside “Default” to bring up other formatting choices.

custom_format

I would like failed scores to show up as red, so I have selected “Custom format.” I then have the ability to fill the cell with a red color if the score is below 80%. Select “Done” when finished.

select_red
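The conditional formatting rule amounts to a single check: highlight the cell whenever the score falls below the passing threshold. As an illustration of that logic (the names and scores here are just examples), a short Python sketch:

```python
# Sketch of the conditional-formatting rule: a cell is highlighted when the
# score is below the passing threshold. Names and scores are examples.

PASSING_SCORE = 80  # minimum percent needed to pass the quiz

def flag_failures(rows, threshold=PASSING_SCORE):
    """Return the names of students whose scores fall below the threshold."""
    return [name for name, score in rows if score < threshold]

rows = [("Joe Schmoe", 100), ("Famous Amos", 60)]
needs_retake = flag_failures(rows)  # ["Famous Amos"]
```

The spreadsheet applies the same test cell by cell, which is why only failing scores turn red.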

Now let’s say that Joe Schmoe and Famous Amos have both taken the quiz. They submit their quiz answers and receive a confirmation message with the link they can choose to view their results. Below are samples of what each will see. Joe has scored 100/100, or 100%, and Famous Amos has scored 60/100, or 60%. Famous Amos will need to retake the training and quiz.

joes_score

famous_score

 

The spreadsheet contains both students’ results, including when they took the quiz, their score, and which answers they missed. As you can see, since Famous Amos scored less than 80%, the cell containing his score is red.

spreadsheet_results

Using Google quizzes and spreadsheets with conditional formatting can help you streamline your training processes, especially when you have a large student employee staff, limited time for in-person training, or new hires coming on in the middle of a semester. You can then follow up the training with practical applications.

Disclaimer: As of October 28, 2016, the steps listed above are, to the best of my knowledge, the ones needed to create and use a quiz in Google forms and to set conditional formatting in the responses spreadsheet. Google occasionally updates their Google Drive, Sheets, Forms, etc., and sometimes when this happens, old documents don’t work the same way with the new updates.


Use “Add Reminders” to Track Certification Expiration Dates

Add_Reminder_Blog

Campus Recreation departments must do their best to maintain a safe environment for their facility users. One step in this process is to monitor required staff safety certifications and keep them up-to-date. Employees may be required to have CPR/AED and First Aid certifications (good for 2 years), bloodborne pathogens certifications (good for 1 year), personal training and group fitness certifications (expirations vary), and possibly other certifications with their own expiration dates. Keeping up with certifications can be a daunting task, depending on the size of your staff and the types of certifications required. Fortunately, Google Sheets has a free add-on, called “Add Reminders,” that will email you a reminder when a certification is getting ready to expire.

Add Reminders can track dates in a spreadsheet column, and send you an email days, months, or even years before a certification will expire. You can set up multiple reminders for the same spreadsheet so you can keep track of different certifications. Here’s how to use the Add Reminders add-on:

  1. Set up a Google spreadsheet that contains the certifications you need to track. Make sure that you have separate columns containing each expiration date that you want to track. All column headers must be in the first row of the spreadsheet. Below is a sample of how your spreadsheet could look.

1_Beginning_spreadsheet

  2. On the top menu bar, click “Add-ons”, and select “Get Add-ons.”

2_Get_Add-ons

  3. A pop-up screen will appear with a search box in the upper right-hand corner. Type “Add Reminders” in the box and hit the “return” key on your keyboard.

3_Search

  4. Add Reminders should appear at the top of the search return list. Select “Free” to access the add-on.

4_Get_it_free

  5. A box will pop up asking for you to allow the app to access your Google account. You will need to grant access in order to use the add-on.

5_Allow

  6. You will now see the Add Reminders add-on when you select the “Add-ons” menu item. Go ahead and select “Add Reminders,” then “Set up/ edit reminders.”

6_set_up_reminder

  7. A window will open on the right side of your spreadsheet. Select “Add Reminders.” If your spreadsheet contains more than one sheet, select the sheet where the reminder should be added.

7_add_reminder

  8. You can only add one reminder at a time. Select one column that contains the expiration dates that you would like to track. In this case, I am tracking CPR/AED expiration dates. I want to be notified one month before expiration, and I also want the student to be notified that their certification is getting ready to expire.

*Note: When setting up column headers, be more descriptive than just naming a column “expires.” Include the certification name as well, so you will be able to select the correct column when you get to this step.

8_set_up_notification

Once you have set up one reminder, select “Done.” You can then add additional reminders for every certification you want to track by selecting “Add reminders” again.

  9. I have added three reminders to this spreadsheet: one for CPR/AED, one for First Aid, and one for Bloodborne Pathogens. You can see them all listed in the “Add Reminders” box.

9_multiple_reminders

  10. The “Add Reminders” add-on will track each date in the column(s) specified, and when the date is one month before the certification expires, an email is sent to the email address associated with the Google account, as well as to the email address associated with the date in the specified row. Once the reminder email is generated, a note is placed on the date indicating that a reminder was sent. The note is indicated by a black triangle in the corner of the cell.

11_reminder_sent

  11. The Google sheet owner and the employee associated with the expiration date will both receive an email indicating which certification is one month away from expiring.
    10_email
  12. NOTE: Once you update the certification that was about to expire, you will need to delete the comment, noted by the black triangle in the cell (see step 10 above). This resets the cell so that Add Reminders can continue to track it.

Add Reminders will track dates and send emails regardless of whether you have your Google Sheets account open.
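The add-on’s core logic can be pictured as a simple date comparison: for each expiration date in the tracked column, send a reminder when today falls within one month of expiry. Here is a minimal Python sketch of that idea, using a 30-day window and hypothetical names and dates (the real add-on handles the scheduling and email delivery itself):

```python
# Sketch of the reminder check: flag any certification whose expiration
# date falls within the reminder window. Names and dates are hypothetical;
# "one month" is approximated as 30 days here.
from datetime import date, timedelta

REMINDER_WINDOW = timedelta(days=30)

def due_for_reminder(expirations, today):
    """Return employees whose certifications expire within the window."""
    return [
        name
        for name, expires in expirations
        if today <= expires <= today + REMINDER_WINDOW
    ]

expirations = [
    ("Joe Schmoe", date(2016, 4, 5)),   # within 30 days of "today" below
    ("Famous Amos", date(2016, 9, 1)),  # months away, no reminder yet
]
to_notify = due_for_reminder(expirations, today=date(2016, 3, 10))
```

Running a check like this once a day against the spreadsheet column is, conceptually, all the add-on needs to do.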

Disclaimer: As of March 10, 2016, the steps listed above are, to the best of my knowledge, the ones needed to set the reminders on your spreadsheet. Google occasionally updates their Google Drive, Sheets, Forms, etc., and sometimes when this happens, old add-ons don’t work the same way with the new products. Hopefully, if that happens, the Add-ons will also be updated.

I would be interested in hearing about how other Google Add-ons, or additional free technology tools, are being used in day-to-day campus recreation operations. Just make a comment below!

Why I Won’t Interview You

misspelling
Photo by jamieanne, license CC BY-ND 2.0

Dear Graduate Assistant Applicant,

This is a very exciting, albeit stressful, time in your life. You have chosen your favorite graduate school programs, and you have just started the process of applying for graduate assistantships, including the assistantship that I have advertised. Your undergraduate experiences have prepared you well to be a contributor to my program. Your future awaits; the road lies before you.

But I will never interview you.

Why not?

It’s because of your resume, or your cover letter, or both. This information is the first chance I have to see who you are, and if your experiences and talents will work well within my program. Your resume helps me to decide if I want to take the next step and talk with you further about the assistantship.

So why do these documents, these first glimpses into who you are, contain numerous misspellings and grammatical errors? Why is the formatting inconsistent? Why haven’t you fully represented your skills and experiences? I’m not talking about just one mistake that can be overlooked; I’m talking about multiple errors. Reading through your cover letter and resume, I don’t see a talented, enthusiastic, hard-working individual shining through these pages. I see a careless student who lacks attention to detail, and who doesn’t meet the minimum requirements that I look for in a graduate assistant candidate.

I know this isn’t really you. But this is what I see when I read your application.

Before sending out one more poorly written resume or cover letter, please consider doing the following:

  1. Proofread everything. Then proofread again. If you had proofread your application materials, or had asked others to proofread them, I would never have received my copy containing blatant misspellings. I wouldn’t have been distracted by inconsistent and odd indentations, missing capitalization and punctuation, and incorrect use of homophones (e.g., their, there, they’re). And by the way, I am not a “sir,” and I do not work at the school that you so adamantly wrote that you would like to attend.
  2. Be sure to list experiences and skills that show how you are qualified to fulfill my particular graduate assistantship posting. Be specific, and match your skills with the job requirements listed for the assistantship. For instance, rather than stating “supervised staff,” you could write “supervised a staff of 50 front desk and fitness employees.” I want to understand all aspects of your past experiences so that I can make a decision about moving forward with an interview.
  3. Ask for help from your current professional staff supervisor or your school’s career center. If you had shown your resume to someone from these two groups, they could have helped you to revise and correct your information.

I hesitated sending you this letter, because I didn’t want to hurt your feelings. And at least your resume was not as bad as these. But I know that you deserve consideration for an assistantship, and first impressions matter. Please correct your resume and cover letter so that you have a chance to compete for your dream assistantship.

NIRSA Students: Tweet THAT!

kristen

I just read a great article on professional development that was tweeted as a link by Kristen Gleason, NIRSA’s Director of Professional Development. It contained some especially good advice for graduate assistants and young professionals who are navigating a professional workplace for the first time. And guess what? I didn’t need to attend a conference or workshop to get the information.

Professional development opportunities are always right at our fingertips. Professionals like Kristen, as well as others from cross-disciplinary fields, are always quick to share great articles or blog posts that are current and relevant. As long as we have an internet connection, there is a wealth of ideas that will challenge us and help us to grow professionally.

But something is missing: the student voice.

We need more idea-sharing from students. After all, students usually know what other students are struggling with. Have you read something lately that has changed your perspective? Why not tweet it? Have you seen a photo that has lifted your spirits? Share it with your fellow students and NIRSA colleagues. Do you have your own blog that you use for reflective writing? Maybe others can relate to your writing and would be grateful for your input. In fact, Erica Estes, the current NIRSA Student Leader, has written several thoughtful articles that would be helpful for students as they become more involved with NIRSA.

So how about it, students? You have an important voice, and a refreshing perspective. Don’t leave it up to the seasoned professionals to feed our industry. Add your own spice to the conversation. And don’t forget to tag your tweets with #recchat, @NIRSAlive, or @NIRSAStudents so that everyone can benefit!

To get you started, I have listed just a few Twitter accounts and hashtags that you can search to find professional articles, applicable to Campus Recreation and higher education. Put all of your professional Twitter contacts on a Twitter list so you can easily find their latest Tweets:

#SAChat
#recChat
@NIRSAlive
@NIRSAStudents
@The_SA_Blog

If you’re looking for some thought-provoking blogs that are share-worthy, try some of these writers:

George Couros, @gcouros
Eric Stoller, @EricStoller
Laura Kennett, @laurakennett

Do you follow an online blog, or someone on Twitter who has made a difference in your life or career? Feel free to add the information in the comments below. Let’s all start feeding each other!

Needs More Cowbell

cowbells
Photo by Thomas Kohler, license CC BY-NC-SA 2.0

As the cost of higher education soars, colleges and universities have come under closer scrutiny by parents and students to determine if they are receiving commensurate value from their tuition dollars. In response, Student Affairs divisions have become much more adept at showing how we support student success through our co-curricular programs and services. The stage has been set, and through our assessment efforts, we have been telling our stories to an audience that expects a high performance standard.

And if intentional assessment has been the platform for our storytelling, then Student Learning Outcomes has become the rock star. The incessant drum beat from SACS reviews, CAS standards, and Learning Reconsidered that has driven us to create and assess SLOs within each Student Affairs department has steadily taken hold, and we can proudly sing about skills that students learn by participating in our programs.

In Campus Recreation, as well as in other departments within Student Affairs, Student Learning Outcomes assessment has become the lead singer, receiving most of the effort and attention. But the glitz and glamour of SLOs have taken attention away from perhaps the most important measures that lead to student success: participation numbers and student satisfaction. These operational outcomes—what I’m calling the “cowbell”—are always there, are usually assessed, but are easily ignored in the shadow of Student Learning Outcomes assessment. Yes, SLOs are important, but what we really need, particularly in Campus Recreation, is more cowbell. Here’s why.

  1. Students who participate in recreational programs and services have higher GPAs, and are retained at a higher rate, than the average student population.

Reports from schools such as Michigan State University, the University of Iowa, and the University of Arkansas show a correlation between gym use and GPA, with gym users earning higher GPAs than non-users. The reports also show a correlation between gym use and retention, with users being retained at a higher rate than non-users. We have noticed the same correlations at my school, and hope to formalize the findings at the end of this assessment cycle.

  2. Exercise resets and recharges your brain.

Harvard Medical School psychiatrist John Ratey explains the cognitive benefits of exercise in his book, Spark: The Revolutionary New Science of Exercise and the Brain. According to Ratey, “Exercise is the single best thing you can do for your brain in terms of mood, memory, and learning.” And while it’s been shown that exercise improves learning, it also reduces stress and lifts depression.

So let’s take a look at our student participation numbers, and make a plan to reach out to non-users to get them involved with our programs and services. Let’s collaborate with other campus partners, such as student health and student counseling centers, to provide wellness programs to students who are struggling with health issues that impact their classroom learning. Let’s provide a safe, clean, friendly environment in our facilities that make students feel at home and will keep them coming back. Let’s collect student feedback on our programs and services, and make changes according to that feedback, so that students know we value their opinions and want to meet their needs. Then let’s assess our plans to improve student participation and satisfaction, see what has worked and what hasn’t, and develop an improvement plan. And let’s keep running the numbers to see if the correlation between gym use and GPA, and gym use and retention, show positive results.

Student Learning Outcomes assessment is great, but studies show that we need to start featuring the cowbell. An increase in student participation in our recreation programs can mean an increase in student success.

I’m telling you. You’re gonna want more cowbell!

more cowbell
Photo by Mike Kline, license (CC BY 2.0)

Closing the Job Skills Gap: Learning Outcomes in Student Employment

Poster advertising student employment

“Why are so many college students failing to gain job skills before graduation?” This is the title of a Washington Post article that caught my attention. It cited a survey, conducted in November 2014 by Hart Research Associates, which found that employers don’t feel that today’s college students are graduating with important skills necessary to be successful in their workplaces. Skills that employers felt were most important for their employees, such as communication, problem solving, and decision making, were also skills they thought college students lacked upon graduation.

Employers have long complained of the disparity between the skills they feel college graduates should have and the actual skills graduates possess. But with the ever-rising cost of higher education, colleges and universities are being held to a higher standard of accountability to deliver an excellent product. This scrutiny provides a perfect opportunity for Student Affairs to shine, since we offer programs and services that allow students to learn the skills that employers desire.

One way that Student Affairs can help students develop skills for future success is by hiring them to work campus jobs, and providing intentional development opportunities during their employment. When we provide ongoing training, such as sessions on conflict resolution, decision making in undefined circumstances, and understanding personality differences, and we loosen the reins on our employees to start using the skills we teach them, then their job skills are bound to improve. Even if the work is unrelated to their college major, real life experiences gained through campus employment are transferable across all majors. And with the right assessment in place, we can prove it.

For example, Campus Recreation offers unique opportunities for student employees to improve the skills that employers most value. Below is the current list of skills that four in five employers rate as very important (a rating of eight, nine, or ten on a zero-to-ten scale), along with Campus Recreation job responsibilities that develop these skills:

1. The ability to effectively communicate orally
2. The ability to work effectively with others in teams
3. The ability to effectively communicate in writing
4. Ethical judgment and decision-making
5. Critical thinking and analytical reasoning skills
6. Ability to apply knowledge and skills to real-world settings

Campus Recreation student job responsibilities: Supervising staff and delegating tasks; sports officiating; teaching group fitness classes; providing personal training instruction; providing written and verbal employee evaluations; planning staff development sessions; teaching job duties to new hires; collaborating on policy development; answering patron questions; explaining and enforcing policies; resolving conflicts; responding to emergencies; troubleshooting facility problems; providing ongoing positive and corrective feedback; documenting problems and concerns; requesting procedural changes; developing employee handbooks.

In my department, assessment on whether students are learning these skills is done through both direct and indirect measures. Intramural officials are evaluated by their supervisors, and written and verbal feedback is given on a regular basis. Emergency drills are conducted on a monthly basis for both recreation center and aquatic center staff, and written and verbal feedback is provided. Student employees receive mid-semester informal evaluations on their job performance, and written evaluations are completed at the end of each semester.

But my favorite assessment is where the students write about their own learning. Student employees complete exit surveys and blogs to explain how they think their transferable skills have improved as a result of working in Campus Recreation. Here are a few examples taken from assessment results:

“I learned to fine tune ideas and fully plan out the details before presenting them to others.”

“I’ve also learned when to “pick your battles” – knowing when to fight for something, and when to let things go. Sometimes your way isn’t always the best way, and you need to remember the goals and needs of the department over your own.”

“Before coming to college and working at the Johnson Center people just tended to listen and go along with leadership. But in college people start developing their own minds and thoughts and how they want to approach things. It is more realistic to see everyone have their own viewpoint, than for everyone to agree. So being in meetings where collaboration was important taught me its value. I know that a good collaboration environment is open and free from criticism. Everyone’s voice must be welcome and applauded, if it’s not then collaboration will not work.”

“My major at UK is communications. So naturally my communication competency has gone up from my education. But the ability to take what I learn in the classroom and apply it to my supervising at the JC has been great. I now have a better awareness for cultural and personal differences. I know how to navigate different situations with different people. I can do this because I know how to communicate with them on an individual basis.”

“Over the past few years I have handled several emergency situations. After being a part of so many of these incidents, I feel that I can appropriately handle almost any situation.”

“I have watched myself grow as a person and as an employee in my 3 years at the JC, and I have watched the respect and trust of my fellow employees in me grow. Because I have successfully gained the trust of my employees, I feel more confident in everything from group projects to interviews to other jobs.”

By providing ongoing staff training for our student employees, allowing them to be involved in planning and decision making during their employment, and giving them the freedom to apply their skills and make decisions on their own, we can close the gap between what employers expect from graduates, and the actual skills the students acquire before graduation. And by conducting appropriate assessments, we can demonstrate, not only to university administrators and other stakeholders, but to the students themselves, the value of their job experience in Student Affairs.

Channeling Gordon Gekko

“The most valuable commodity I know of is information.”
-Gordon Gekko, Wall Street

Last January I spent numerous hours writing blog posts describing technology tools I use for my job. I had found some tools that were especially helpful for managing a university recreation facility, and I wanted to share with others who could use them as well. My plan was to continually add posts as I discovered additional helpful technology tools.

Part of my motivation for attempting to write this series resulted from my frustration with the lack of help and direction that I experienced while trying to find tech tools that I could use. I am not particularly tech-savvy, and I thought that if only someone would show me some nifty tech tools, then I could solve all of my challenges. I complained bitterly that, since I wasn’t an IT person, I had no idea where to begin. My mantra was, “You don’t know what you don’t know.” When I actually stumbled upon a useful tech tool, I felt like one of those blind squirrels who occasionally finds a nut, and I wanted to share my good fortune with everyone whose jobs were similar to mine by writing posts about my find.

It took me only a few months to realize that I was not only a blind squirrel, but naïve as well. How could I possibly have thought that the detailed instructions I wrote in my posts would stay current, even for as long as it took to finish writing the post? Google Docs spreadsheets changed to “Sheets,” Google Apps Script became “Add-Ons,” and the effort I took to post screenshots along with my instructions was all for naught as that version of Google Docs became obsolete.

I’ve reflected on my folly for several months, and I’ve realized that, although it is impractical to do more than write a brief review of a technology tool, my excuse that “You don’t know what you don’t know” was no longer valid. I was no longer a blind squirrel.

When I first started trying to find technology tools, it’s true that I really didn’t know where to look for information. That was my biggest problem. It’s not so much about the one tool you might find: it’s about knowing where to get the information you want. As Gordon Gekko says in the movie Wall Street, “The most valuable commodity I know of is information.” And though Gekko is a villain, he’s still right. You don’t need to already know the information; you just need to know how to get it.

I started with Twitter, following links on professional hashtags such as #SATech, #HigherEd, and #SAChat. The posts from those hashtags led me to websites such as Diigo, Edudemic, and Inside Higher Ed. I signed up to receive posts from the Diigo in Education group, and started reading blogs from tech-savvy professionals such as Ed Cabellon, Eric Stoller, George Couros, and many others. I have now developed a large network of websites and social media sites that I use to discover new tech tools. My latest tech source is Richard Byrne’s blog, “Free Technology for Teachers,” which has a plethora of ideas that can be adapted for use in my job.

New tech tools are being developed every day, and now I feel confident that I have a network of resources that will lead me to the tools I need. So instead of bemoaning the fact that “I don’t know what I don’t know,” I can confidently state, “I know how to find information about what I don’t know.” No more excuses. No more blindly stumbling upon nuts here and there. The most valuable tool—information—is there for the taking.

All Student Employees are Equal…but Some are More Equal than Others

A favorite part of my job as a campus recreation facilities director is working with student employees. They energize and challenge me daily—in a good way. But, as in all places of business, employees may not always perform according to expectations, so clearly stated employee policies and disciplinary processes are needed.

Last spring I met with my graduate assistants and facility supervisors to discuss our employee disciplinary policy. Our goal was to devise a plan that everyone felt was consistent and fair. Our graduate assistants suggested using a point system, where employees would accumulate points based upon the type of infraction. For example, an unexcused missed shift would be worth more points than a dress code violation. Disciplinary meetings would be held if employees accumulated a certain number of points, and if an employee accumulated 10 points, they would be fired.

I had never used a point system before, but I was willing to give this a try. My student leaders seemed to really like the idea, and it looked like many other campus recreation departments within the NIRSA community employed variations of the point system as well.

We have been using the point system for less than one semester. My graduate assistants like it. My supervisors like it.

I do not like it. Here’s why:

  1. It is based on the assumption that all employees are equal.

Our mission is to provide a safe, clean, welcoming environment for our campus community that encourages participation in our programs and services. We hire staff who can help us fulfill this mission. As new employees become more familiar with their job duties and responsibilities, it becomes obvious that some workers contribute more towards fulfillment of our mission than others. Some employees are super friendly and helpful to our patrons, take initiative to do chores and help other staff members with various job duties, and display high interest and enthusiasm for their job. Other employees are minimally engaged with their job.

Now imagine that one employee of each type has been late to work a couple of times. According to the point system, they should both be given the same number of points, and both could be in danger of being fired. But one employee brings so much more to the department than the other. Why should they be judged as equals on a disciplinary point scale?

  2. Disciplinary write-ups can be inconsistent.

It is clear that an employee who is late, or who misses a deadline, should receive a write-up. It is less clear if, or when, an employee should be written up for poor job performance. We have 15 supervisors who work various shifts, and they all may perceive and address employee job performance differently. For example, one supervisor may choose to give verbal corrective feedback to a student employee who may not have been as attentive towards a patron as they should have been, while another supervisor in the same situation may perceive the employee’s behavior as completely unacceptable and may submit a write-up form. This semester, of the 70 student employees on staff, 23 have been written up for a total of 35 infractions—but only 2 write-ups were for poor job performance. It is conceivable that an employee could demonstrate poor job performance on several different shifts, and receive verbal corrective feedback on all of them without receiving one write-up. Yet the employee who usually demonstrates a commitment to our mission, but who may have missed a deadline, will have accumulated points from a write-up. In this case, the point system has captured a minor infraction, while a more egregious behavior problem has gone unreported.

  3. It leaves little room for critical thinking and interpersonal communication.

We are all human. I would much rather discuss employee infractions on an individual basis than be tied to a point system. I have heard more than one person say that an employee who accumulates 10 points is a bad employee and deserves to be fired. That may be true in some, possibly most, situations. However, a point system does not account for intangibles, such as personal struggles, academic rigors, and overall attitude and desire to improve. I would hate to fire the one person who should have been saved, just because they reached the magic number.

So where do we go from here? I will be meeting soon with our student leaders and graduate assistants to talk about this system. Maybe with some tweaking, the point system can still be useful, particularly as an indication of the need for intervention. But we cannot continue to have termination dictated by a system that assigns the same values to all employees, regardless of their personal circumstances and contributions to our department. The point system was originally created in an effort to be fair, to be consistent, and to treat everyone the same, but the bottom line is this: some employees are more equal than others.

How do you handle student employee discipline problems?

Learning Reconsidered… Reconsidered

For the past several years there has been a push within Student Affairs to identify and assess student learning outcomes. Five years ago, the idea of assessing learning outside of the classroom was new to me, and I was encouraged to read “Learning Reconsidered 2,” which I eagerly devoured. Reading the book prompted me to look at my work from a different perspective. It shook things up a bit, creating that fizz of excitement that makes everything seem new and interesting.

However, the learning outcomes initiative that created a pleasant fizz seems to have resulted in an undesirable explosion. “Assessment” has come to mean “student learning outcomes assessment.” All other assessments that have nothing to do with learning outcomes, such as attendance numbers and student satisfaction, seem at best to be unimportant, and at worst are pooh-poohed as not being “real assessment.”

I completely agree that we need to assess student learning outcomes within Student Affairs, and we have developed programs in my area that promote and measure student learning. However, in Campus Recreation, most of our programs and services are designed to serve students’ needs, not to measure student learning. And according to current research, this model is completely appropriate for supporting student success.

For example, a 2002 study entitled “The Value of Recreational Sports on College Campuses” indicates that students who participate in physical activity and recreational sports experience improved emotional well-being, reduced stress, and greater happiness. In a spring 2014 survey of 2,500 randomly selected students conducted at my university, 78% of respondents reported that our facilities and programs helped them feel more at home at our university; 90% reported that they enjoyed participating in our activities or utilizing our facilities; and 92% agreed that our facilities and programs improved the quality of life at our university.

Other recent research has shown a correlation between students’ use of campus recreation facilities and their GPA. In fall 2013, Purdue University found that freshman students who visited their rec center 15 or more times during the semester had a higher GPA (3.08) than those who did not utilize the facility (2.81). In fall 2010, Michigan State University found that freshmen who purchased memberships at their recreation center had higher GPAs and stayed in school longer than those who did not purchase memberships. Both of these studies support research conducted by John J. Ratey, M.D., author of “Spark: The Revolutionary New Science of Exercise and the Brain,” which explores the brain-fitness connection.

What does this mean? For me, it means that student success, in part, depends on students utilizing our programs and services. Because of this, we need to do everything we can to keep students coming to the recreation center. We also need to understand the factors that prevent students from utilizing our programs and services, and change what we can to limit these impediments.

How do we gauge how well we’re meeting these challenges?

Attendance counts. Satisfaction surveys. Various other operational assessments.

We can impact a small percentage of the student population with programs that promote student learning outcomes, but we can impact student success on our entire campus just by improving the participation rate for our programs and services. Now THAT is something worth assessing!

I would love to hear your comments on student learning outcomes assessment vs. operational assessment.