Missing the Marks


Over the last several years, I have been very pleased to see an emphasis on program assessment within the NIRSA community. Because we have taken the time to perform intentional assessment, we have evidence of a positive correlation between students' use of their campus recreation facilities and higher GPAs and retention rates. We have been able to show that our student employees improve their transferable skills by working within our programs and facilities. There is confirmed evidence that students gain leadership skills by participating in intramural and club sports. Additionally, we can use program assessment on a day-to-day basis to show us whether our intended goals for our programs and services are being met, and how we can tweak our programs for continuous improvement.

This is all good stuff. Truly.

Which makes it so much more disappointing when we, as individual members of NIRSA, don’t take the time to evaluate our own conference sessions.

Recently, a new professional and a graduate student from my department gave a presentation together at a regional conference. It was their first time presenting, and they worked hard to make their presentation engaging and relevant. They felt prepared, but when they saw that their presentation room had 100 chairs, they started feeling a bit nervous. Once they began, the chairs were full, with some attendees standing in the back of the room. They finished going through their information, and ended up fielding a number of questions from the audience. Attendees from our school who were in the audience all thought their presentation went well; however, we were looking forward to feedback from other NIRSA members who would not be quite as biased as our colleagues.

The team received eight evaluations, and five of those included written comments. In a room with 100–120 attendees, only eight people provided feedback of any kind.

I have attended NIRSA conferences regularly over the last 15 years, going back to when session evaluations were still completed with pencil and paper. I have sat through some really great sessions (one given by George Brown on Student Learning Outcomes changed the direction of my career), and have labored through some really awful presentations that seemed thrown together at the last minute. I have tried to at least give a number rating for every presentation I have attended, and have written comments when I could. That feels like the least I can do when someone has taken the time to put themselves out there and share information with the group.

It is fantastic that we take program evaluation so seriously. But shouldn’t we also take assessment seriously when it comes to our NIRSA conferences? Otherwise, how will we ever get better?


Using Google Quizzes with Online Student Employee Training


Training student staff to work in a recreation facility is an important step in ensuring that customer service, risk management, and job skills are all communicated effectively to employees. Though much of the training we require is more effective when conducted in person, some parts can be delivered just as effectively online. Sometimes, combining online information with in-person training is even better! If students receive the information online first, more in-person time can be spent on the practical application of that information.

If we decide to provide training information online to our student staff, it is important that they understand the training before continuing on to the practical application. After students complete an online training session, we can test understanding using a Google quiz. The following provides step-by-step instructions for creating and using Google quizzes.

First, create a Google form using quiz questions you have written that are based on the training you provided. If you have never created a Google form before, you can find step-by-step instructions here. Most types of questions will work as long as there is an exact answer expected. For example, multiple choice questions work well, but an essay answer using a paragraph box may not be appropriate, since correct answers may differ slightly from the answer key. It would also be a good idea to select “Required” for each question so that students don’t inadvertently skip a question.



Once you have created your Google form, you can convert it to a quiz. In the upper right-hand corner of the form you will see a “gear” icon, which will take you to the “Settings” menu. Select “QUIZZES” from the menu.


You will then have the option to make the form a quiz by sliding the first button to the right. If you want your students to have immediate feedback on their quiz results, select “Immediately after each submission” under the Release Grade option. You can also select which options you would like your students to see. If students will need to retake the training if they don’t meet the minimum score, you may want to show which answers they missed, but not the correct answers. These options are up to you to either check or uncheck.


Next, you will need to create an answer key so that student submissions will be automatically scored. Click on the first question in the quiz. You will now see a link at the bottom called “ANSWER KEY.” Click on that link.


You will then be able to choose the correct answer, and set the point value for that question.


Continue to select the correct answer and choose the point value for all subsequent questions. Once you complete this step, you will want to set up a confirmation message so that students can see their score after they submit their answers. Click the “Settings” icon at the top of the page, and select “PRESENTATION.” Then type in a “confirmation message” that directs students on how to find their score, and instructs them on what to do if they did not earn the minimum score to pass the quiz. Once you have typed the confirmation message, select “SAVE.”


When students complete the quiz and submit their answers, they will receive a confirmation screen containing a link they can select to view their score. Below is an example based upon the confirmation message that was set in the above image.


You will now want to choose how to collect quiz responses. Select the “RESPONSES” menu item at the top of your quiz.


A screen will appear that contains a spreadsheet icon. Click on the icon.


You will be prompted to select a response destination. I would recommend selecting the “Create a new spreadsheet” option.


The new spreadsheet that will collect all of your quiz responses will open on the screen. The spreadsheet will show the name of the student who submitted the response, the day and time the response was submitted, their score, and the answers they selected for each question.


If you have determined the minimum score necessary for students to pass the quiz, you can easily see if a student has failed by using “conditional formatting.” Conditional formatting allows you to create a rule so that when a cell meets a certain condition, that cell is formatted differently from all the others. On the top menu, select “Format,” then “Conditional formatting…”.


A box will pop up on the right side of the spreadsheet that you will use to create rules for formatting your spreadsheet. First, select the range of cells for the formatting. In this case, you want to format the column that contains the quiz scores.


Next, choose the condition for formatting the cells. The default selection is “Cell is not empty.” You can change this condition by clicking on the arrows beside this option.


A drop-down list appears with choices for formatting. Since you want to be alerted if a student fails the quiz, select “Less than.”


Once “Less than” is selected, you can insert the minimum value needed for a passing grade. In this case, students must score at least 80% in order to pass the quiz. After you determine the formatting condition, you can choose the style you want to use. The default formatting style is to make the cell green. If you would rather use a different color, then select the arrow beside “Default” to bring up other formatting choices.


I would like failed scores to show up as red, so I have selected “Custom format.” I then have the ability to fill the cell with a red color if the score is below 80%. Select “Done” when finished.


Now let’s say that Joe Schmoe and Famous Amos have both taken the quiz. They submit their quiz answers and receive a confirmation message with the link they can choose to view their results. Below are samples of what each will see. Joe has scored 100/100, or 100%, and Famous Amos has scored 60/100, or 60%. Famous Amos will need to retake the training and quiz.




The spreadsheet contains both students’ results, including when they took the quiz, their score, and which answers they missed. As you can see, since Famous Amos scored less than 80%, the cell containing his score is red.
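The scoring and flagging that happen behind the scenes can be sketched in a few lines of Python. The five questions, their answers, and the 20-point weighting are hypothetical placeholders; the names, the scores, and the 80% passing threshold come from the example above.

```python
# Sketch of the grading and pass/fail flagging described above.
# ANSWER_KEY and the 20-point weighting are hypothetical; the 80%
# threshold is the minimum passing score from the formatting rule.
PASS_THRESHOLD = 80  # percent

ANSWER_KEY = {"Q1": "b", "Q2": "d", "Q3": "a", "Q4": "c", "Q5": "b"}
POINTS_PER_QUESTION = 20  # 5 questions x 20 points = 100 total

def grade(responses):
    """Return the score (out of 100) for one submission."""
    return sum(POINTS_PER_QUESTION
               for q, answer in responses.items()
               if ANSWER_KEY.get(q) == answer)

submissions = {
    "Joe Schmoe": {"Q1": "b", "Q2": "d", "Q3": "a", "Q4": "c", "Q5": "b"},
    "Famous Amos": {"Q1": "b", "Q2": "d", "Q3": "a", "Q4": "a", "Q5": "d"},
}

for name, responses in submissions.items():
    score = grade(responses)
    status = "pass" if score >= PASS_THRESHOLD else "retake"
    print(f"{name}: {score}/100 ({status})")
```

Running this prints Joe at 100/100 (pass) and Famous Amos at 60/100 (retake), mirroring the highlighted cell in the spreadsheet.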


Using Google quizzes and spreadsheets with conditional formatting can help you complete training processes, especially when you have a large student employee staff, limited time for in-person training, or a need to hire new students in the middle of a semester. You can then follow up the training with practical applications.

Disclaimer: As of October 28, 2016, the steps listed above are, to the best of my knowledge, the ones needed to create and use a quiz in Google forms and to set conditional formatting in the responses spreadsheet. Google occasionally updates their Google Drive, Sheets, Forms, etc., and sometimes when this happens, old documents don’t work the same way with the new updates.




Why I Won’t Interview You

Photo by jamieanne, license CC BY-ND 2.0

Dear Graduate Assistant Applicant,

This is a very exciting, albeit stressful, time in your life. You have chosen your favorite graduate school programs, and you have just started the process of applying for graduate assistantships, including the assistantship that I have advertised. Your undergraduate experiences have prepared you well to be a contributor to my program. Your future awaits; the road lies before you.

But I will never interview you.

Why not?

It’s because of your resume, or your cover letter, or both. This information is the first chance I have to see who you are, and if your experiences and talents will work well within my program. Your resume helps me to decide if I want to take the next step and talk with you further about the assistantship.

So why do these documents, these first glimpses into who you are, contain numerous misspellings and grammatical errors? Why is the formatting inconsistent? Why haven’t you fully represented your skills and experiences? I’m not talking about just one mistake that can be overlooked; I’m talking about multiple errors. Reading through your cover letter and resume, I don’t see a talented, enthusiastic, hard-working individual shining through these pages. I see a careless student who lacks attention to detail, and who doesn’t meet the minimum requirements that I look for in a graduate assistant candidate.

I know this isn’t really you. But this is what I see when I read your application.

Before sending out one more poorly written resume or cover letter, please consider doing the following:

  1. Proofread everything. Then proofread again. If you had proofread your application materials, or had asked others to proofread them, I would never have received my copy containing blatant misspellings. I wouldn’t have been distracted by inconsistent and weird indentations, lack of capitalization and punctuation, and incorrect use of homophones (e.g. their, there, they’re). And by the way, I am not a “sir” and I do not work at the school that you so adamantly wrote that you would like to attend.
  2. Be sure to list experiences and skills that show how you are qualified to fulfill my particular graduate assistantship posting. Be specific, and match your skills with the job requirements listed for the assistantship. For instance, rather than stating “supervised staff,” you could write “supervised staff of 50 front desk and fitness employees.” I want to understand all aspects of your past experiences so that I can make a decision about moving forward with an interview.
  3. Ask for help from your current professional staff supervisor or your school’s career center. If you had shown your resume to someone from these two groups, they could have helped you to revise and correct your information.

I hesitated sending you this letter, because I didn’t want to hurt your feelings. And at least your resume was not as bad as these. But I know that you deserve consideration for an assistantship, and first impressions matter. Please correct your resume and cover letter so that you have a chance to compete for your dream assistantship.

Needs More Cowbell

Photo by Thomas Kohler, license CC BY-NC-SA 2.0

As the cost of higher education soars, colleges and universities have come under closer scrutiny by parents and students to determine if they are receiving commensurate value from their tuition dollars. In response, Student Affairs divisions have become much more adept at showing how we support student success through our co-curricular programs and services. The stage has been set, and through our assessment efforts, we have been telling our stories to an audience that expects a high performance standard.

And if intentional assessment has been the platform for our storytelling, then Student Learning Outcomes (SLOs) have become the rock star. The incessant drum beat from SACS reviews, CAS standards, and Learning Reconsidered that has driven us to create and assess SLOs within each Student Affairs department has steadily taken hold, and we can proudly sing about the skills that students learn by participating in our programs.

In Campus Recreation, as well as in other departments within Student Affairs, Student Learning Outcomes assessment has become the lead singer, receiving most of the effort and attention. But the glitz and glamour of SLOs have taken attention away from perhaps the most important measures that lead to student success: participation numbers and student satisfaction. These operational outcomes—what I’m calling the “cowbell”—are always there, are usually assessed, but are easily ignored in the shadow of Student Learning Outcomes assessment. Yes, SLOs are important, but what we really need, particularly in Campus Recreation, is more cowbell. Here’s why.

  1. Students who participate in recreational programs and services have higher GPAs, and are retained at a higher rate, than the average student population.

Reports from schools such as Michigan State University, the University of Iowa, and the University of Arkansas show a correlation between gym use and GPA, with gym users earning higher GPAs than non-users. The reports also show a correlation between gym use and retention, with users retained at a higher rate than non-users. We have noticed the same correlations at my school, and hope to formalize the findings at the end of this assessment cycle.

  2. Exercise resets and recharges your brain.

Harvard Medical School psychiatrist John Ratey explains the cognitive benefits of exercise in his book, Spark: The Revolutionary New Science of Exercise and the Brain. According to Ratey, “Exercise is the single best thing you can do for your brain in terms of mood, memory, and learning.” And while it’s been shown that exercise improves learning, it also reduces stress and lifts depression.

So let’s take a look at our student participation numbers, and make a plan to reach out to non-users to get them involved with our programs and services. Let’s collaborate with other campus partners, such as student health and student counseling centers, to provide wellness programs to students who are struggling with health issues that impact their classroom learning. Let’s provide a safe, clean, friendly environment in our facilities that make students feel at home and will keep them coming back. Let’s collect student feedback on our programs and services, and make changes according to that feedback, so that students know we value their opinions and want to meet their needs. Then let’s assess our plans to improve student participation and satisfaction, see what has worked and what hasn’t, and develop an improvement plan. And let’s keep running the numbers to see if the correlation between gym use and GPA, and gym use and retention, show positive results.

Student Learning Outcomes assessment is great, but studies show that we need to start featuring the cowbell. An increase in student participation in our recreation programs can mean an increase in student success.

I’m telling you. You’re gonna want more cowbell!

Photo by Mike Kline, license CC BY 2.0

Closing the Job Skills Gap: Learning Outcomes in Student Employment

Poster advertising student employment

“Why are so many college students failing to gain job skills before graduation?” This is the title of a Washington Post article that caught my attention. It cited a survey, conducted in November 2014 by Hart Research Associates, which found that employers don’t feel that today’s college students are graduating with important skills necessary to be successful in their workplaces. Skills that employers felt were most important for their employees, such as communication, problem solving, and decision making, were also skills they thought college students lacked upon graduation.

Employers have long complained of the disparity between the skills they feel college graduates should have and the skills graduates actually possess. But with the ever-rising cost of higher education, colleges and universities are being held to an ever-higher standard of accountability for delivering an excellent product. With institutions under such scrutiny, Student Affairs has a perfect opportunity to shine, since we offer programs and services that allow students to learn the skills that employers desire.

One way that Student Affairs can help students develop skills for future success is by hiring them to work campus jobs, and providing intentional development opportunities during their employment. When we provide ongoing training, such as sessions on conflict resolution, decision making in undefined circumstances, and understanding personality differences, and we loosen the reins on our employees to start using the skills we teach them, then their job skills are bound to improve. Even if the work is unrelated to their college major, real life experiences gained through campus employment are transferable across all majors. And with the right assessment in place, we can prove it.

For example, Campus Recreation offers unique opportunities for student employees to improve the skills that employers most value. Below is the current list of skills that four in five employers rate as very important (a rating of eight, nine, or ten on a zero-to-ten scale), along with Campus Recreation job responsibilities that develop these skills:

1. The ability to effectively communicate orally
2. The ability to work effectively with others in teams
3. The ability to effectively communicate in writing
4. Ethical judgment and decision-making
5. Critical thinking and analytical reasoning skills
6. Ability to apply knowledge and skills to real-world settings

Campus Recreation student job responsibilities: Supervising staff and delegating tasks; sports officiating; teaching group fitness classes; providing personal training instruction; providing written and verbal employee evaluations; planning staff development sessions; teaching job duties to new hires; collaborating on policy development; answering patron questions; explaining and enforcing policies; resolving conflicts; responding to emergencies; troubleshooting facility problems; providing ongoing positive and corrective feedback; documenting problems and concerns; requesting procedural changes; developing employee handbooks.

In my department, assessment on whether students are learning these skills is done through both direct and indirect measures. Intramural officials are evaluated by their supervisors, and written and verbal feedback is given on a regular basis. Emergency drills are conducted on a monthly basis for both recreation center and aquatic center staff, and written and verbal feedback is provided. Student employees receive mid-semester informal evaluations on their job performance, and written evaluations are completed at the end of each semester.

But my favorite assessment is where the students write about their own learning. Student employees complete exit surveys and blogs to explain how they think their transferable skills have improved as a result of working in Campus Recreation. Here are a few examples taken from assessment results:

“I learned to fine tune ideas and fully plan out the details before presenting them to others.”

“I’ve also learned when to “pick your battles” – knowing when to fight for something, and when to let things go. Sometimes your way isn’t always the best way, and you need to remember the goals and needs of the department over your own.”

“Before coming to college and working at the Johnson Center people just tended to listen and go along with leadership. But in college people start developing their own minds and thoughts and how they want to approach things. It is more realistic to see everyone have their own viewpoint, than for everyone to agree. So being in meetings where collaboration was important taught me its value. I know that a good collaboration environment is open and free from criticism. Everyone’s voice must be welcome and applauded, if it’s not then collaboration will not work.”

“My major at UK is communications. So naturally my communication competency has gone up from my education. But the ability to take what I learn in the classroom and apply it to my supervising at the JC has been great. I now have a better awareness for cultural and personal differences. I know how to navigate different situations with different people. I can do this because I know how to communicate with them on an individual basis.”

“Over the past few years I have handled several emergency situations. After being a part of so many of these incidents, I feel that I can appropriately handle almost any situation.”

“I have watched myself grow as a person and as an employee in my 3 years at the JC, and I have watched the respect and trust of my fellow employees in me grow. Because I have successfully gained the trust of my employees, I feel more confident in everything from group projects to interviews to other jobs.”

By providing ongoing staff training for our student employees, allowing them to be involved in planning and decision making during their employment, and giving them the freedom to apply their skills and make decisions on their own, we can close the gap between what employers expect from graduates, and the actual skills the students acquire before graduation. And by conducting appropriate assessments, we can demonstrate, not only to university administrators and other stakeholders, but to the students themselves, the value of their job experience in Student Affairs.

Channeling Gordon Gekko

“The most valuable commodity I know of is information.”
-Gordon Gekko, Wall Street

Last January I spent numerous hours writing blog posts describing technology tools I use for my job. I had found some tools that were especially helpful for managing a university recreation facility, and I wanted to share with others who could use them as well. My plan was to continually add posts as I discovered additional helpful technology tools.

Part of my motivation for attempting to write this series came from my frustration with the lack of help and direction I experienced while trying to find tech tools that I could use. I am not particularly tech-savvy, and I thought that if only someone would show me some nifty tech tools, then I could solve all of my challenges. I complained bitterly that, since I wasn’t an IT person, I had no idea where to begin. My mantra was, “You don’t know what you don’t know.” When I actually stumbled upon a useful tech tool, I felt like one of those blind squirrels that occasionally find a nut, and I wanted to share my good fortune with everyone whose job was similar to mine by writing posts about my find.

It took me only a few months to realize that I was not only a blind squirrel, but naïve as well. How could I possibly have thought that the detailed instructions I wrote in my posts would stay current, even for as long as it took to finish writing the post? Google Docs spreadsheets changed to “Sheets,” Google Apps Script projects became “Add-ons,” and the effort I took to post screenshots along with my instructions was all for naught as that version of Google Docs became obsolete.

I’ve reflected on my folly for several months, and I’ve realized that, although it is impractical to do more than write my own reviews of technology tools, my excuse that “you don’t know what you don’t know” was no longer valid. I was no longer a blind squirrel.


When I first started trying to find technology tools, it is true that I really didn’t know where to look for information. That was my biggest problem. It’s not so much about that one tool that you might find: it’s about knowing where to get the information you want. As Gordon Gekko says in the movie Wall Street, “The most valuable commodity I know of is information.” And though Gekko is a villain, he’s still right. You don’t need to already know the information, you just need to know how to get it.

I started with Twitter, following links on professional hashtags such as #SATech, #HigherEd, and #SAChat. The posts from those hashtags led me to websites such as Diigo, Edudemic, and Inside Higher Ed. I signed up to receive posts from the Diigo in Education group, and started reading blogs from tech-savvy professionals such as Ed Cabellon, Eric Stoller, George Couros, and many others. I have now developed a large network of websites and social media sites that I use to discover new tech tools. My latest tech source is Richard Byrne’s blog, “Free Technology for Teachers,” which has a plethora of ideas that can be adapted for use in my job.

New tech tools are being developed every day, and now I feel confident that I have a network of resources that will lead me to the tools I need. So instead of bemoaning the fact that “I don’t know what I don’t know,” I can confidently state, “I know how to find information about what I don’t know.” No more excuses. No more blindly stumbling upon nuts here and there. The most valuable tool—information—is there for the taking.

Technology Nuts for Blind Squirrels: Create a Self-Grading Quiz

As technology becomes more infused into every profession, some of us who have been in the recreation field for “a while” find ourselves having to explore the wonders of technology on our own. The problem is that we don’t know what we don’t know. This post is part of a series about technology tools I have stumbled upon that have proven to be the most useful in my job.


I was intrigued by the idea of a self-grading quiz when I saw a link to Flubaroo, an Apps Script tool developed to be used with Google Forms response spreadsheets. However, when using Flubaroo, you first had to receive all of the responses before you could grade the quiz. I wanted each response to be graded automatically as soon as it was submitted. Here’s how I did it:

1. Create a quiz using Google Forms and set the response location as a new spreadsheet.

2. Open the responses spreadsheet. Select the “+” at the bottom to add two more sheets. Click the arrow next to the sheet name to rename the sheets. In this example I used “Form Responses,” “Answers,” and “Grades.”


3. Select the “Answers” sheet tab to open it. Click on the first cell in the spreadsheet (A1). Type the “=” (equals) sign. Then click on the “Form Responses” sheet.


4. Click on the column header labeled “A”. Then click on the “Answers” sheet.


5. Select “Enter” on your keyboard. The formula in each cell of column A will now read ='Form Responses'!A:A.


6. Select cell B1. Type the “=” sign. Click on the “Form Responses” sheet and click on the column header labeled “B”. Click the “Answers” sheet and select “Enter” on your keyboard. The formula in each cell of column B will now read ='Form Responses'!B:B.

7. Repeat this process on the “Answers” sheet until each column in the “Answers” sheet matches each column in the “Form Responses” sheet. To make your spreadsheet easier to read, select the “Text wrap” icon below the spreadsheet menu.


NOTE: This process seems redundant, but it is necessary, since responses will be continually added to the “Form Responses” sheet. Trust me on this.

8. The last sheet will hold your grades, so I have, unsurprisingly, labeled that sheet “Grades.” Select the first cell (A1) again and repeat the steps to copy column A from the “Answers” sheet onto column A in the “Grades” sheet. Repeat for each column that DOES NOT contain an actual quiz question.


9. When you get to the cell in row 1 on the “Grades” sheet that contains the first quiz question, select the cell, type the “=” sign, and click on the “Answers” sheet. Do not select the entire column. Click on the corresponding cell only.


10. Click on the “Grades” sheet and select “Enter” on your keyboard. Repeat this process for each question on the quiz.

11. After you have finished this process for all questions on the quiz, select the next cell in Row 1 on the “Grades” sheet and type “Grade (Percent)”.

12. Now you will want to set the criteria for grading. On the “Grades” sheet, select the cell in Row 2 that is directly under the first question in Row 1. You will want this cell to refer to the cells on the “Answers” sheet, so the formula will contain the following notation:


And the formula will look similar to this:

=if('Answers'!D2="b. Occupational Exposure",100,0)


For your own quiz, replace everything inside the quotation marks (in this case, b. Occupational Exposure) with your own quiz answer. When entering this formula, the answer you type must EXACTLY match the answer on the Google Form: if there is a space or special character on the form itself, it needs to be typed the same way in this formula. Note that in this example there is a space between “b.” and “Occupational Exposure”. Select “Enter” on your keyboard. You should see a “0” in the cell.
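The per-question scoring can be expressed directly in Python (the answer string is the one from the example above). One caveat: Python’s == is always strictly case- and whitespace-sensitive, while spreadsheet text comparison can be more lenient about case, so treating the answer as an exact match is the safe habit either way.

```python
# Python equivalent of the step-12 IF formula: 100 points for an
# exact match against the answer key, 0 otherwise. Case, spaces,
# and punctuation all matter here.
CORRECT = "b. Occupational Exposure"

def score_question(response):
    """Score one answer: 100 for an exact match, 0 otherwise."""
    return 100 if response == CORRECT else 0

print(score_question("b. Occupational Exposure"))   # 100
print(score_question("b. occupational exposure"))   # 0 (case mismatch)
print(score_question("b.Occupational Exposure"))    # 0 (missing space)
```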

13. Copy the formula down the column through as many cells as expected responses. Since I have 70 students, I copied down 90 rows, just in case some needed to retake the quiz.


14. Continue entering the answer formula for each question on the “Grades” sheet. Be sure to copy the formula down in each column.

15. Under the “Grade (Percent)” cell, enter a formula to calculate the average of all questions on the quiz. In my example, the first question started in column D and the last question was in column Q. The formula I entered is “=average(d2:q2)”.
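Underneath the spreadsheet syntax, the grade formula is just the mean of the per-question scores. A minimal Python sketch, using a hypothetical row of five question scores with two missed answers:

```python
# Python equivalent of the average-based grade formula: each question
# contributes 100 or 0, and the grade is the mean across all questions.
# This row of five scores is a hypothetical example.
question_scores = [100, 0, 100, 100, 0]

grade_percent = sum(question_scores) / len(question_scores)
print(grade_percent)  # 60.0
```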


16. Copy this formula down the column through as many cells as expected responses.

17. Select the column containing the Grade (Percent). On the top menu, select “Format,” then “Conditional formatting…”.


18. Click in the blue box and select “is between”. In the first box enter “1” and in the second box, enter the minimum percentage required to pass the quiz. In my example, I required my students to score at least 75% on the quiz. Check the “background” box, and select the color you want the cell to be if students score below the minimum percentage. If you have a lot of students who will be responding, this is an easy way to see at a glance if anyone has failed the quiz.
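The check behind that rule can be sketched in Python. One aside: the spreadsheet “is between” condition is inclusive on both ends, so this sketch uses a strictly-below comparison to match the stated “at least 75% to pass” requirement; an exact 75 is not flagged.

```python
# Sketch of the "is between" highlighting rule: flag grades from 1 up
# to (but not including) the 75% passing mark. The lower bound of 1
# mirrors the rule as entered in the example.
MIN_PASS = 75

def flag_as_failing(grade):
    """True when the cell should be highlighted as a failing score."""
    return 1 <= grade < MIN_PASS

print([g for g in (0, 60, 74, 75, 100) if flag_as_failing(g)])  # [60, 74]
```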


19. To reduce the number of decimal places in the grade, choose “123” located just below the top menu, and select “Rounded.”


That’s it! Now you can send the live form link to your staff and start gathering responses and calculating grades. Add the AutoCrat add-on to your spreadsheet if you want to let your students know immediately how they scored on the quiz. You might want to set a Google Forms notification rule to alert you when a form is submitted, but you will no longer have to spend any time grading the quiz.

Here is the link to a copy of the quiz that I used as my example. It is a “view only” version, but if you open the link and select File→Make a copy… then you will be able to edit the copy you created. You can then experiment with the responses to see how a form submission looks in the response spreadsheets.

If you have other variations of self-grading quizzes, feel free to share them in the comments section.