Office of Teaching Evaluation and Assessment Research
116 College Avenue
Rutgers, The State University of New Jersey
New Brunswick, NJ 08901
Phone: (848) 932-7466
Fax: (732) 932-1845

Frequently Asked Questions (for instructors) - Online SIRS

Survey Results

How do I see the results of my surveys?
For recent results (Summer 2019 on, and for some units Fall 2018 on), instructors can view their results by logging into Blue. Results are made available on the day after the registrar closes the grading period, or approximately three weekdays after the last day of exams. Exact calendar dates vary from year to year, but roughly:
  • Fall Semester: results distributed first week of January
  • Spring Semester: results distributed prior to Memorial Day
  • Summer session: results for all sessions distributed in the last week of August
  • Winter session: results distributed at the start of the Spring semester
SIRS results will remain available in Blue indefinitely, but we strongly recommend saving the PDF version for future reference.

For past semesters, the statistical portion may be viewable at the SIRS results site or by request. The full report (including student comments) was sent as an email attachment to the individual instructor, and a copy was made available to the academic department; OTEAR does not archive the full reports.
Will the online format change the faculty rating? Will students who do not attend class but respond to the survey bias the outcome?
In our studies to date, changes in the average rating are not significant, but we are continuing to collect data to determine if the online system affects the ratings. On average, individual ratings varied by ±0.48 points between Fall 2007 paper surveys and Fall 2008 online surveys. For comparison, individual ratings varied by an average of ±0.40 points between Fall 2006 and Fall 2007, both semesters using paper surveys. Comparisons were limited to instructors teaching the same course for more than one semester (i.e., we did not compare ratings for the same instructor teaching different courses, nor did we compare ratings for the same course taught by different instructors).
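The comparison described above is a paired difference over matched (instructor, course) pairs. A minimal sketch of that calculation, using entirely hypothetical names and ratings (this is not OTEAR's actual data or code):

```python
# Sketch: average absolute change in ratings for instructors teaching the
# same course in two semesters. All names and numbers below are made up.

def mean_abs_change(ratings_a, ratings_b):
    """Average |difference| over (instructor, course) pairs present in both semesters."""
    shared = ratings_a.keys() & ratings_b.keys()  # matched pairs only
    if not shared:
        raise ValueError("no matched instructor-course pairs")
    return sum(abs(ratings_a[k] - ratings_b[k]) for k in shared) / len(shared)

# hypothetical mean ratings keyed by (instructor, course)
fall_2007 = {("Smith", "101"): 4.2, ("Jones", "201"): 3.8, ("Lee", "310"): 4.5}
fall_2008 = {("Smith", "101"): 4.6, ("Jones", "201"): 3.5, ("Lee", "310"): 4.4}

print(round(mean_abs_change(fall_2007, fall_2008), 2))  # 0.27 for this toy data
```

Pairs that appear in only one semester are excluded, mirroring the restriction to instructors teaching the same course in both semesters.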
How are the results used?
University-wide, the survey results are used as part of the faculty promotion and tenure review process. The use of the survey data varies within individual academic units, but it is often part of a review process for improving the curriculum, implementing changes to teaching strategies, and reviewing the reappointment of part-time lecturers and teaching assistants. Many faculty and instructors use the survey data, in particular the comments, to assess and improve their own teaching methods.
Who gets to see the results?
The summary statistics of anonymous student responses for faculty and part-time lecturers are available to the entire university community, Fall and Spring semesters only. Older data is available on CD-ROM at the University Libraries. Data for Teaching Assistants or other student instructors is no longer published because of the requirements of the Family Educational Rights and Privacy Act (FERPA). Student comments are not published.

OTEAR distributes all the reports, including the data for teaching assistants and the student comments, directly to the academic departments and to the individual instructors shortly after the grading period ends. By request, OTEAR also provides the raw, numerical student response data to departments that want to run their own statistical analysis.
Who gets to see the comments?
OTEAR sends the comments directly to the instructors and to the academic departments. The comments are completely anonymous and grouped by question. Comments are not published.
Why are some of the department means on the SIRS web site or my teaching grid different from what I see in Blue?
There are two reasons why the “mean of department” or other comparative means might change. Usually the differences are within a couple hundredths of a point, and not statistically significant:
  1. Corrections, made at the request of the department chair or dean, to other courses in the department. Making these corrections necessitates a recalculation of the department-wide means for all courses in the department.
  2. Differences between the new "Blue" survey platform and the old survey processes on paper and in Sakai that cause some questions in team-taught courses to be weighted differently in each system (per-section versus per-instructor). Importing the data from Blue into our local historical database causes the Blue data to be recalculated using the older weightings (multiplying the responses by the number of instructors in the course). This only affects questions about the course or student, not the questions about the instructor.
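The weighting difference in point 2 can be illustrated with a hedged sketch (hypothetical scores and function names, not OTEAR's actual code): under Blue's section-wide model each response to a course-level question counts once, while the historical weighting multiplies a section's responses by that section's instructor count, which shifts the department-wide mean slightly when team-taught sections are present.

```python
# Sketch: department-wide mean for a course-level question under two
# weighting models. All scores below are hypothetical.

def dept_mean(sections, per_instructor=False):
    """Mean across sections of (scores, instructor count) pairs.

    per_instructor=False: each response counted once (Blue's section-wide model).
    per_instructor=True:  responses multiplied by the section's instructor
                          count (the historical paper/Sakai weighting).
    """
    total = count = 0
    for scores, n_instructors in sections:
        weight = n_instructors if per_instructor else 1
        total += sum(scores) * weight
        count += len(scores) * weight
    return total / count

# (scores for one course-level question, number of instructors) per section
sections = [([5, 4, 4], 1), ([3, 5], 3)]  # second section is team-taught

blue_mean = dept_mean(sections)                             # 21/5  = 4.2
historical_mean = dept_mean(sections, per_instructor=True)  # 37/9 ≈ 4.11
```

The two means differ by less than a tenth of a point here, consistent with the "couple of hundredths" scale of differences described above for real data.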

Student Responses

What is the survey link to give students?
Everyone (faculty, students and staff) can reach the Blue survey system at .
Will the number of students who respond drop?
Please refer to our information about increasing participation.

Instructors who continue to conduct the survey during class time will see very little change in their student participation rates.

The response rate (the number of students who completed the survey divided by the enrollment) for the online surveys averages between 50% and 65% each semester. Compared to paper surveys, this represents a drop for some departments and an increase for others. The change is largely caused by a shift in instructor behavior, from conducting the survey in class to relying on students to complete the survey on their own. We are closely tracking response rates as we implement our online ratings. Based on our own data and on evidence from other universities, the change in response rate does not significantly affect the average ratings for individual instructors or for departments as a whole.

Response rates for individual surveys may decrease or increase depending on how the instructor communicates the details of the survey to the students, and are affected by factors such as class size, attendance policies, and mode of instruction (lecture versus lab, etc.).
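The response rate defined above is simply completed surveys divided by enrollment. A one-line illustration with hypothetical numbers:

```python
# Response rate = completed surveys / enrollment (hypothetical numbers).
completed, enrollment = 24, 40
response_rate = completed / enrollment
print(f"{response_rate:.0%}")  # 60% -- within the typical 50-65% range
```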
If the response rate is very low, can one student unfairly impact my ratings?
Relative to the enrollment in the course, the impact of an outlier may be exaggerated by a low response rate. Although possibly disproportionate, these responses do reflect some student opinions and care must be taken when interpreting the survey. Multiple surveys across courses and semesters should be considered together, and outliers should be recognized as such. Please refer to the guidelines for interpreting the Student Instructional Rating Survey. SIRS is not intended to be the sole determiner in the assessment of teaching; other evidence of teaching ability can and should be included to offset the impact of an outlier.

If the outlier is due to student error (e.g., filling out the wrong survey, reversing the scale), please refer to our policy for requesting corrections.
Can I get a list of students who did or did not respond to the survey?
No, the system is designed to protect student identities and does not report who did or did not respond to the survey.
A student made a mistake on the survey form, can it be corrected?
Prior to the survey due date, students can go back and edit their own responses.

We do not interpret the students' responses on the survey, and we cannot examine student responses while the survey is still running. Any instructor who feels that a student has incorrectly submitted comments for another instructor or reversed the answer scale should communicate this to their department chair. The department chair should request in writing that the survey responses be reviewed and reprocessed.

In many cases students report that they have made a mistake, but they merely misremember the details of filling out the survey. Requests for corrections should only be made after reviewing the survey results and ascertaining that a mistake does in fact exist and affects the outcome. All requests from a department must be submitted together, because each correction may affect the department-wide mean.

Survey Process

Which courses are using the online survey?
As of Spring 2020, all Student Instructional Rating Surveys are conducted online.
Are Biomedical and Health Sciences (legacy UMDNJ) courses included?
As of Fall 2019, all School of Nursing, School of Public Health, and Ernest Mario School of Pharmacy courses use SIRS. For other units, participation in the SIRS process depends in part on which student registration system is used for a course. Courses that were in joint Rutgers/UMDNJ programs, use the Rutgers registration system (WebReg/REGIS), and used SIRS before the merger will continue to use SIRS. RBHS courses that were formerly UMDNJ and continue to use the UMDNJ registration system (Banner) will not be included in SIRS and should continue to use their existing processes (RBHS departments may contact us to discuss the use of SIRS).
What questions are on the survey?
The questions are identical to the ones used for the past twenty-five years. We have an example of the online survey, and we can also provide interested people with a fully functional demonstration. Please contact us to arrange a demonstration.

Some departments add additional questions. To see a full list of all department forms and added questions, please consult our extended question list.
Can I add my own questions?
Yes, instructors can add their own questions (exceptions may be made at the school level). Please refer to the instructions for adding additional questions. Questions added to SIRS should be consistent with the purpose of collecting course feedback, and may be posted publicly at SIRS Results. For other types of questions, instructors may prefer to use alternate methods to run their own questions, such as a poll or an anonymous quiz in Sakai or a Google Docs web form.

By request we can add a standardized set of additional questions if they are to be used department-wide.
Can I see how many students have replied to my surveys?
Instructors can see the real-time response rate on the Blue Response Rate Dashboard, or by logging in to Blue and clicking “Response Rates” in the left-hand column.
How are surveys for lectures with multiple recitations or labs handled?
Partly this depends on how your department lists the course in the Schedule of Classes. If the lecture and labs or recitations are listed separately (e.g., the lecture is listed as "102:01" and the labs are listed as "103:01", "103:02", etc.), we will create a survey for each section independently, as with any other course. If the lecture is made up of multiple sections but does not have its own course number (e.g., "101:01", "101:02", and "101:03" all meet together in a lecture hall one day a week), we will create a survey for each of the individual sections, each of which will include questions for both the recitation/lab instructor (TA) and the lecturer; this is treated similarly to a “team-taught” course. Responses will be sorted into separate reports to reflect the lecture versus the recitations or labs.

If you teach both the lecture and a recitation for the same course you may wish to have two surveys so you can gather feedback about the lecture and the recitation separately. To do this, we will need to alter the survey manually. Please contact us to request the change.

Note that due to the way we combine the sections into reports for lectures, TAs who teach more than one section will similarly receive combined reports for their own sections. We can separate these on request.
How are courses with more than one lecturer (team-taught) handled?
All instructors in a course should be included in the same survey. Blue will repeat some questions for each instructor, so the students may answer individually. A small number of questions are designated as “section-wide” and will only be presented to the student once, with the responses distributed to all instructors who teach the same section, including TAs (note that the section-wide questions represent a change from previous SIRS systems, and may result in small differences to the department-wide comparative means when compared to previous years).

Team-taught courses that wish to stagger the faculty surveys may “split” their course into multiple surveys, one per instructor, that can then be distributed to the students in different weeks of the semester. This split can be created by administrators in each academic department, but you may contact OTEAR for assistance.
What can I do to get more students to take the survey?
Faculty and instructors are essential to ensuring that the students respond to the survey. Above all else, communicate with your students regarding the importance of the survey to improve your own teaching, as well as the importance to the university as a whole. Consider taking the following actions:
  • Set aside some class time and have the students complete the survey in class. This will often result in response rates equal to the paper surveys.
  • While the survey is running, direct your students to . Do not rely solely on email messages, or on the Canvas pop-up message (both of which may sometimes fail) - be sure to give this link to your students via multiple channels.
  • Do not rely on our email reminders - students may not read the email, and we cannot send email to students who do not provide an accurate email address to the university.
  • Include a statement on your syllabus that you expect all students to complete the SIRS survey.
  • Use informal, midcourse surveys throughout the term.
  • When the survey begins, take some class time to discuss the importance of the survey.
  • Give the students personal examples of how you have used prior surveys to improve your teaching.
  • Inform the students that the surveys are used by the University in promotion, tenure, and reappointment decisions.
  • Assure students that their comments and responses will always remain anonymous.
  • Invite students to view survey data from previous semesters at
  • Read more about student participation
How do you enforce student participation?
The survey is voluntary. The system notifies students by email of the availability of the survey; students who do not respond will receive repeated reminders until they respond or until the survey ends. For courses that are taught through Canvas or Blackboard, students will see a pop-up reminder each time they log in to those systems. Additional methods of enforcement cannot be implemented until the university community has an opportunity to discuss the implications and practicality; however, instructors have a great deal of flexibility for encouraging student participation.
Why do students need to log in? Does this violate their anonymity?
Student log-in information is used only to determine which surveys a student can take, and to prevent the students from responding more than once to the same survey. The survey software never reveals the students' identities, and all reports generated by OTEAR only include anonymous, aggregate data. See the privacy statement for more information. It is important that you communicate to your students that you will only see anonymous data, and only after final grades have been submitted.
My students cannot log in to the survey.
If you sent the survey link to your students, please note that two problems frequently break the link:
  1. The period at the end of the sentence may break the link if it directly follows the link. Please put the link on a line by itself with no additional punctuation, or make sure you put a space between the end of the link and the period (e.g., “Please go to .”).
  2. If you use Outlook Web Access, do not copy and paste the link. There is a known bug in some versions of Microsoft Outlook Web / Exchange 2003 that breaks links. You must forward or retype links from Outlook Web instead of copying.

If you are using Canvas:
  1. Some students may see a blank page after clicking the “pop-up reminder” in Canvas. These students should use the direct link - - instead. We are investigating the issue, and believe it affects only a very small number of students.
  2. In many cases, telling the students to clear their browser cache and cookies (or use a different computer) may resolve any issue in Canvas.
Can students who withdraw from the course take the survey?
No. The survey system only allows students who are currently enrolled in the course to take the survey. Roster information is updated daily; however, various university processes and cut-off times may cause a 24-hour delay.

Note that Blue may report total counts that include withdrawn students. These students do not have access to the survey, but we do leave them in the system temporarily to allow for registration problems that are later corrected (students with registration problems will temporarily lose access). This creates a minor discrepancy in the student count that we correct when generating the final results. 

Blue Surveys

What is Blue?
Blue is a new survey tool that replaces both the older Sakai survey component and the EvaluationKit Pilot. As of Summer 2019, all online surveys are now hosted in Blue. Blue has several advantages over the previous systems.
What is DIG?
DIG (“Data Integrity Gateway”) is an administrative feature of the Blue survey system. DIG allows academic departments to verify or correct the course instructor details, modify survey dates, initiate midcourse surveys, and make several other changes before the surveys are sent to instructors or students.
Is Sakai going away?
Sakai will no longer be used for SIRS, but it remains in place for regular coursework for the time being. In Fall 2018, Rutgers announced plans to transition to Canvas as the primary course management system. Faculty are encouraged to move to Canvas as soon as they can, but Sakai will remain available until a transition plan for Canvas is devised and implemented.

The decision to use Blue was made independently, and Blue works well in Sakai as a direct replacement of the previous “built-in” survey component. The only changes that you will see in Sakai are that the current “Survey Dashboard” will eventually be replaced by the equivalent Blue survey dashboard at the same link, and the “All Surveys” item on the “My Workspace” tab may eventually be removed.

Additionally, Blue surveys now appear as a Sakai tool that instructors may choose to add directly to their course sites (look for it in the “LTI Plugin Tools” section of the tools page when creating or editing a Sakai site).
What is the survey link for Blue?
Everyone (faculty, students and staff) can reach Blue at . Students and faculty will all receive the link via email at the start of the survey process.

The link above is preferred, but Blue can also be reached through links in the Canvas and Blackboard systems. Courses in Sakai may use the new SIRS tool or the previous SIRS link, which now shows the Blue surveys.
Will instructors or students need to do anything differently?
No. OTEAR will try to make the transition as seamless as possible. Instructors will see some new features, including:
  • The ability to change survey dates directly in Blue
  • A streamlined method of adding additional questions
How do I add questions to surveys in Blue?
You will receive an email when your survey is available for editing in Blue, and you can choose from pre-existing questions simply by clicking a button. Additionally, several slots are available for entering questions of your own. Full instructions are available for adding questions in Blue.
How do the Canvas, Blackboard, and Sakai integrations work?
When the survey starts, Canvas and Blackboard will prompt the students to complete the survey through a pop-up reminder when they log in, and continue to remind the students to complete the survey each time they visit.

Sakai is not able to send pop-up reminders. Faculty and students who use Sakai will not see significant changes in the way the surveys currently work. Most notably, the “All Surveys” link on the “My Workspace” tab will no longer work, but the main Sakai survey link continues to show a “Survey Dashboard” as it did in previous semesters. Finally, a new tool has been added to Sakai that allows instructors to add the survey dashboard to their own course sites (look for the SIRS tool in the LTI section when setting up your course site).
Does Blue send email reminders?
Yes, Blue will send email reminders to both students and faculty. For Canvas courses, students will get both the email reminders and the in-course prompt to complete the survey.

EvaluationKit Pilot (ended)

What was the EvaluationKit pilot?
OTEAR previously piloted a new survey tool, EvaluationKit. While EvaluationKit was an improvement over the Sakai survey tool, we feel that Blue is a better fit for Rutgers University. Blue has many of the same benefits as EvaluationKit, plus some additional advantages, including a survey dashboard in Sakai (EvaluationKit only directly supported Canvas and Blackboard).





