Program Review FAQs

Program Review in General

How often is a program reviewed?
Programs are now reviewed on a 4-year cycle.

Why is my program not being reviewed?
At this point, only technical programs are reviewed; academic and CE programs are not. New programs are also given a 2- to 3-year start-up period before they are reviewed so that data can be collected.

The Quality Indicator Tables

What is the timeframe for the quality indicators?
The current year unless otherwise noted.

What do I do if a measure in the quality indicators table does not fit my program's particular situation?
Answer it as clearly as possible and provide an explanation in the Comments field.

What if I cannot access the quality indicators tables?
Participants are given access to the QI tables using their network login and password. If your login and password do not work on the system, contact Christina Todd (christina.c.todd@lonestar.edu) and include your program and campus.

Where do I find the information needed for the quality indicators?
Campus participants answer only the fields in yellow. Do not worry about the fields in white; they will be answered by the System Office. Deans or program chairs might have the answers to the campus fields. In some cases, such as the advisory committee measures, you might need to contact the curriculum team facilitator. If nobody in the program has an answer to one of the measures, explain that in the Comments field. Regardless of the source of the answer, primary contacts at the colleges are responsible for entering the answers in the yellow fields.

Where do I find student enrollment trends?
This data will be provided to the program review contacts and their deans by Curriculum & Instruction.  Root data from Institutional Effectiveness can also be found using the data link on the program review website.

The graduation rate looks wrong. How is it determined?
The graduation rate data is pulled from the THECB's Annual Data Profile (ADP) for each campus. In recent years, the THECB has not produced a timely ADP, so the THECB's IE data from its website is used instead. Note that the THECB reports on previous years, so its data might be older than the most recent available from LSCS. Note also that students must register for graduation; failing to do so on time may lower your program's graduation number until the next year. In addition, the Colleague system defaults to recording a graduate under LSC-North Harris (location code 100) unless the person entering the graduation information specifically enters another campus. Though the numbers already posted in the Annual Data Profile cannot be changed, you can take steps to improve this rate in the future by encouraging your students to file for graduation on time. Lastly, the THECB data includes duplicate graduate counts to the extent that it allows one degree at each level per year (for example, a student who graduates with a certificate and then an AAS in the same year may be counted twice, but a student graduating with two level-one certificates in the same year is counted only once).

The success rate data looks wrong. How is it determined?
This data is also pulled from the THECB's Annual Data Profile (and the IE site above). To determine the success rate, the THECB checks graduates for employment (Unemployment Insurance records), enrollment in higher education in Texas, and military service, all measured five quarters after graduation. Graduates might not be found if they have (a) moved out of Texas or (b) gone into self-employment or consulting work in which they are not covered by Unemployment Insurance. Incarcerated graduates are taken out of the data so that they count neither for nor against a program. Though the numbers already posted in this data cannot be changed, you can improve future rates by following up on graduates not found in the success rates when the System Office sends that information to deans annually.

How do I know if my program is a Tech Prep program and subject to Table II-6D?
Tech Prep programs are indicated in the cluster document (star chart) in the catalog.

Table II-7A refers to a curriculum review process in addition to the advisory committee input. Can you give examples?
Examples of such a review process would be PCAL or DACUM meetings to establish new or update existing degree plans. Other examples might be focus groups (not advisory committee members) or development of skills standards.

What are some examples of real work experience?
Any work experience other than academic experience. This includes any related paid position or volunteer experience, whether full-time, part-time, or project-based. The point of this measure is to determine whether faculty have had recent opportunities to apply their skills and knowledge. Examples include health program instructors participating in health screenings, business instructors mentoring small businesses, and other consulting or project work.

Who are program-specific staff?
Any staff members whose work time is completely devoted to your program. Lab techs are usually included, whether full-time or part-time, unless they work for multiple programs. If a program chair is not also faculty, then he/she should be included here. Division staff who work with several programs should not be included.

How do I know what the required passing rate is? (Table 3, #6)
Contact your accrediting agency.

The Surveys

Who is surveyed and why?
The program review process includes surveys of advisory committee members, current students, graduates, former students (those who left before earning an award), and employers. These groups are asked about their satisfaction with the program: curriculum, core skills and competencies, equipment, preparation for the workforce, etc.

How are the recipients chosen?
The advisory committee survey goes to non-faculty and non-staff members of the advisory committee.  New members are not included because they do not have the background knowledge to answer the questions.  Faculty ask their current students to complete that survey.  Lists of the past year's graduates are pulled from Colleague.  Former student lists include students who were enrolled in the past year but did not return in the current Fall semester, and these are also pulled from Colleague.  Employer contact information is gathered from graduate survey responses, but program heads may choose to add coop/internship/practicum employers to increase the sample size.

How do survey recipients respond?
All surveys are online. When circumstances require it, the advisory committee survey can also be done on paper. For example, some advisory committees answer the survey at a meeting, and their responses are entered by non-program staff.
