SIRS Information for Instructors

Frequently Asked Questions (for instructors) – Online SIRS

Survey Results

For recent results (Summer 2019 on, and for some units Fall 2018 on), instructors can view their results by logging in to Blue. Results are made available the day after the registrar closes the grading period, or approximately three weekdays after the last day of exams. Exact calendar dates vary from year to year, but roughly:

  • Fall Semester: results distributed the first week of January
  • Spring Semester: results distributed prior to Memorial Day
  • Summer session: results for all sessions distributed in the last week of August
  • Winter session: results distributed at the start of the Spring semester

SIRS results will remain available in Blue indefinitely, but we strongly recommend saving the PDF version for future reference.

For past semesters, the statistical portion may be viewable at the SIRS results site or by request. The full report (including student comments) was sent as an email attachment to the individual instructors, and a copy was made available to the academic department. OTEAR does not archive the full reports.

In our studies to date, changes in the average rating are not significant, but we are continuing to collect data to determine if the online system affects the ratings. On average, individual ratings varied by ±0.48 points between Fall 2007 paper surveys and Fall 2008 online surveys. For comparison, individual ratings varied by an average of ±0.40 points between Fall 2006 and Fall 2007, both semesters using paper surveys. Comparisons were limited to instructors teaching the same course for more than one semester (i.e., we did not compare ratings for the same instructor teaching different courses, nor did we compare ratings for the same course taught by different instructors).

Universitywide, the survey results are used as part of the faculty promotion and tenure review process. While the use of the survey data varies within individual academic units, it is often used as part of a review process for improving the curriculum, implementing changes to teaching strategies, and reappointment review for lecturers and teaching assistants. Many faculty and instructors use the survey data, in particular the comments, to assess and improve their own teaching methods.

The summary statistics of anonymous student responses for faculty and part-time lecturers are available to the entire university community at the SIRS results site, for Fall and Spring semesters only. Older data is available on CD-ROM at the University Libraries. Data for Teaching Assistants and other student instructors is no longer published because of the requirements of the Family Educational Rights and Privacy Act (FERPA). Student comments are not published.

OTEAR distributes all the reports, including the data for teaching assistants and the student comments, directly to the academic departments and to the individual instructors shortly after the grading period ends. By request, OTEAR also provides the raw, numerical student response data to departments that want to run their own statistical analysis.

OTEAR sends the comments directly to the instructors and to the academic departments. The comments are completely anonymous and grouped by question. Comments are not published.

There are two reasons why the “mean of department” or other comparative means might change. Usually, the differences are within a few hundredths of a point and are not statistically significant:

  1. Corrections, made at the request of the department chair or dean, to other courses in the department. Making these corrections necessitates a recalculation of the department-wide means for all courses in the department.
  2. Differences between the new “Blue” survey platform and the old paper and Sakai survey processes, which weight some questions in team-taught courses differently (per-section versus per-instructor). When we import the Blue data into our local historical database, we recalculate it using the older weighting (multiplying the responses by the number of instructors in the course). This affects only questions about the course or the student, not the questions about the instructor.
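The weighting difference above can be illustrated with a small sketch. All numbers here are hypothetical, and this is not OTEAR's actual recalculation code:

```python
# Illustration of per-section vs. per-instructor weighting for a
# course-level question in a team-taught course. Hypothetical numbers;
# not OTEAR's actual recalculation code.

responses = [4, 5, 3, 4]   # one rating per student for the section
num_instructors = 2        # team-taught course

# Blue weighting: each course-level response counts once per section.
per_section_mean = sum(responses) / len(responses)

# Older weighting: course-level responses are multiplied by the number
# of instructors, so each response counts once per instructor.
weighted = responses * num_instructors
per_instructor_mean = sum(weighted) / len(weighted)

# The mean for this single course is unchanged (4.0 either way), but the
# response counts differ (4 vs. 8), which can shift department-wide
# comparative means that pool counts across many courses.
print(per_section_mean, per_instructor_mean, len(responses), len(weighted))
```

The point of the sketch is that the per-course mean is identical under both weightings; only aggregate means that pool response counts across courses can shift.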

Student Responses

Everyone (faculty, students, and staff) can reach the Blue survey system through the direct link.

Please refer to our information about increasing participation.

Instructors who continue to conduct the survey during class time will see very little change in their student participation rates.

The response rate (the number of students who completed the survey divided by the enrollment) for the online surveys averages between 50% and 65% each semester. Compared with paper surveys, this represents a drop for some departments and an increase for others. The change is largely caused by a shift in instructor behavior, from conducting the survey in class to relying on students to complete the survey on their own. We are closely tracking response rates as we implement our online ratings. Based on our own data and on evidence from other universities, the change in response rate does not significantly affect the average ratings for individual instructors or for departments as a whole.
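As a quick illustration of the definition above (the counts are hypothetical):

```python
# Response rate = respondents / enrollment, per the definition above.
def response_rate(respondents: int, enrollment: int) -> float:
    return respondents / enrollment

# Hypothetical course: 33 of 60 enrolled students responded.
rate = response_rate(33, 60)
print(f"{rate:.0%}")  # 55%, within the typical 50-65% range
```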

Response rates for individual surveys may rise or fall depending on how the instructor communicates the details of the survey to the students, and are also affected by factors such as class size, attendance policies, and mode of instruction (lecture versus lab, etc.).

Relative to the enrollment in the course, the impact of an outlier may be exaggerated by a low response rate. Although possibly disproportionate, these responses do reflect some student opinions, and care must be taken when interpreting the survey. Multiple surveys across courses and semesters should be considered together, and outliers should be recognized as such. Please refer to the guidelines for interpreting the Student Instructional Rating Survey. SIRS is not intended to be the sole determinant in the assessment of teaching; other evidence of teaching ability can and should be included to offset the impact of an outlier.
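A worked example (with made-up numbers) shows how a low response rate magnifies a single outlier:

```python
# Hypothetical course with 60 enrolled students, rated on a 1-5 scale.
# One outlier rating of 1 moves the mean far more when only 5 students
# respond than when 40 do.
low_response = [5, 5, 5, 5, 1]      # 5 of 60 responded
high_response = [5] * 39 + [1]      # 40 of 60 responded

mean_low = sum(low_response) / len(low_response)      # 4.2
mean_high = sum(high_response) / len(high_response)   # 4.9
print(mean_low, mean_high)
```

The single outlier is 20% of the responses in the first case but only 2.5% in the second, even though enrollment is identical.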

If the outlier is due to student error (e.g., filling out the wrong survey, or reversing the scale), please refer to our policy for requesting corrections.

No, the system is designed to protect student identities and does not report who did or did not respond to the survey.

Prior to the survey due date, students can go back and edit their own responses.

We do not interpret the students’ responses on the survey, and we cannot examine student responses while the survey is still running. Any instructor who feels that a student has incorrectly submitted comments for another instructor or reversed the answer scale should communicate this to his or her department chair. The department chair should request in writing that the survey responses be reviewed and reprocessed.

In many cases, students report that they have made a mistake when they have merely misremembered the details of filling out the survey. Requests for corrections should only be made after reviewing the survey results and ascertaining that a mistake does in fact exist and affects the outcome. All requests for a department must be submitted together, because each correction may affect the entire department mean.

Survey Process

As of Spring 2020, all Student Instructional Rating Surveys are conducted online.

As of Fall 2022, all School of Nursing, School of Public Health, School of Dental Medicine, and Ernest Mario School of Pharmacy courses use SIRS. For other units, participation in the SIRS process depends in part on which student registration system is used for a course. Courses that were part of joint Rutgers/UMDNJ programs, use the Rutgers registration system (WebReg/REGIS), and used SIRS before the merger will continue to use SIRS. Rutgers Health courses that were formerly UMDNJ and continue to use the UMDNJ registration system (Banner) will not be included in SIRS and should continue to use their existing processes (departments may contact us to discuss the use of SIRS).

The questions are identical to the ones used for the past twenty-five years. We have an example of the online survey, and we can also provide interested people with a fully functional demonstration. Please contact us to arrange a demonstration.

Some departments add additional questions. To see a full list of all department forms and added questions, please consult our extended question list.

Yes, instructors can add their own questions (exceptions may be made at the school level). Please refer to the instructions for adding additional questions. Questions added to SIRS should be consistent with the purpose of collecting course feedback and may be posted publicly on SIRS Results. For other types of questions, instructors may prefer to use alternate methods to run their own questions, such as a Google Docs web form.

By request, we can add a standardized set of additional questions if they are to be used department-wide.

Instructors can see the real-time response rate on the Blue Response Rate Dashboard, or by logging in to Blue and clicking “Response Rates” in the left-hand column.

Partly this depends on how your department lists the course in the Schedule of Classes. If the lecture and labs or recitations are listed separately (e.g., the lecture is listed as “102:01” and the labs are listed as “103:01”, “103:02”, etc.), we will create a survey for each section independently, as with any other course. If the lecture is made up of multiple sections but does not have its own course number (e.g., “101:01”, “101:02”, and “101:03” all meet together in a lecture hall one day a week), we will create a survey for each of the individual sections, each of which will include questions for both the recitation/lab instructor (TA) and the lecturer; this is treated similarly to a “team-taught” course. Responses will be sorted into separate reports to reflect the lecture versus the recitation or labs.

If you teach both the lecture and a recitation for the same course, you may wish to have two surveys so you can gather feedback about the lecture and the recitation separately. To do this, we will need to alter the survey manually. Please contact us to request the change.

Note that due to the way we combine the sections into reports for lectures, TAs who teach more than one section will similarly receive combined reports for their own sections. We can separate these on request.

All instructors in a course should be included in the same survey. Blue will repeat some questions for each instructor, so the students may answer individually. A small number of questions are designated as “section-wide” and will only be presented to the student once, with the responses distributed to all instructors who teach the same section, including TAs (note that the section-wide questions represent a change from previous SIRS systems, and may result in small differences to the department-wide comparative means when compared to previous years).

Team-taught courses that wish to stagger the faculty surveys may “split” their course into multiple surveys, one per instructor, that can then be distributed to the students in different weeks of the semester. This split can be created by administrators in each academic department, but you may contact OTEAR for assistance.

Faculty and instructors are essential to ensuring that the students respond to the survey. Above all else, communicate with your students regarding the importance of the survey to improve your own teaching, as well as the importance to the university as a whole. Consider taking the following actions:

  • Set aside some class time and have the students complete the survey in class. This will often result in response rates equal to the paper surveys.
  • While the survey is running, direct your students to the survey link. Do not rely solely on email messages or on the Canvas pop-up message (both of which may sometimes fail) – be sure to give the link to your students via multiple channels.
  • Do not rely on our email reminders – students may not read the email, and we cannot send emails to students who do not provide an accurate email address to the university.
  • Include a statement on your syllabus that you expect all students to complete the SIRS survey.
  • Use informal, midcourse surveys throughout the term.
  • When the survey begins, take some class time to discuss the importance of the survey.
  • Give the students personal examples of how you have used prior surveys to improve your teaching.
  • Inform the students that the surveys are used by the University in promotion, tenure, and reappointment decisions.
  • Assure students that their comments and responses will always remain anonymous.
  • Invite students to view survey data from previous semesters at the SIRS results site.
  • Read more about student participation.

The survey is voluntary. The system notifies students by email when the survey becomes available; students who have not responded will receive repeated reminders until they respond or the survey ends. For courses that are taught through Canvas or Blackboard, students will see a pop-up reminder each time they log in to those systems. Additional methods of enforcement cannot be implemented until the university community has an opportunity to discuss the implications and practicality; however, instructors have a great deal of flexibility in encouraging student participation.

Student log-in information is used only to determine which surveys a student can take, and to prevent the students from responding more than once to the same survey. The survey software never reveals the students’ identities, and all reports generated by OTEAR only include anonymous, aggregate data. See the privacy statement for more information. It is important that you communicate to your students that you will only see anonymous data, and only after final grades have been submitted.

If you send the survey link to your students, please note that a period directly following the link may break it. Put the link on a line by itself with no additional punctuation, or make sure there is a space between the end of the link and the period (e.g., “Please go to .”).

If you are using Canvas:

  1. Some students may see a blank page after clicking the pop-up reminder in Canvas. These students should use the direct link instead. This affects only a very small number of students and is generally caused by pop-up blockers or privacy plugins that the students have installed on their own computers.
  2. In many cases, telling the students to clear their browser cache and cookies (or use a different computer) may resolve any issue in Canvas.

No. The survey system only allows students who are currently enrolled in the course to take the survey. Roster information is updated daily; however, various university processes and cut-off times may cause a 24-hour delay.

Note that Blue may report total counts that include withdrawn students. These students do not have access to the survey, but we do leave them in the system temporarily to allow for registration problems that are later corrected (students with registration problems will temporarily lose access). This creates a minor discrepancy in the student count that we correct when generating the final results. 

Blue Surveys

Blue is a new survey tool that replaces both the older Sakai survey component and the EvaluationKit Pilot. As of Summer 2019, all online surveys are now hosted in Blue.

DIG (“Data Integrity Gateway”) is an administrative feature of the Blue survey system. DIG allows academic departments to verify or correct the course instructor details, modify survey dates, initiate midcourse surveys, and make several other changes before the surveys are sent to instructors or students.

Everyone (faculty, students, and staff) can reach Blue through the direct link. Students and faculty will all receive the link via email at the start of the survey process.

The direct link is preferred, but Blue can also be reached through links in Canvas.

No. OTEAR will try to make the transition as seamless as possible. Instructors will see some new features, including:

  • The ability to change survey dates directly in Blue
  • A streamlined method of adding additional questions

You will receive an email when your survey is available for editing in Blue, and you can choose from pre-existing questions simply by clicking a button. Additionally, several slots are available for entering questions of your own. Full instructions are available for adding questions in Blue.

When the survey starts, Canvas will prompt the students to complete the survey through a pop-up reminder when they log in, and will continue to remind them each time they visit. Each course also has a “Student Instructional Rating Survey” menu item; this must remain in place and visible for the Canvas integration to work correctly.

Yes, Blue will send email reminders to both students and faculty. For Canvas courses, students will get both the email reminders and the in-course prompt to complete the survey.

EvaluationKit Pilot (ended)

OTEAR previously piloted a new survey tool, EvaluationKit. While EvaluationKit was an improvement over the Sakai survey tool, we feel that Blue is a better fit for Rutgers University. Blue has many of the same benefits as EvaluationKit, plus additional advantages that allow us to better meet the needs of individual academic units.