August 1999
SRI Project 3421

Special Education Elementary Longitudinal Study (SEELS)
Instrument Field Test Report
The Office of Special Education Programs (OSEP) of the U.S. Department of Education has commissioned a design for the Special Education Elementary Longitudinal Study (SEELS), a study that will provide the first national data on the elementary and middle school experiences of students in special education. SRI International has led the design effort, which has involved development of the conceptual framework, sampling approach, and data collection instruments and procedures.
Data collection instruments designed for SEELS were field tested for clarity and appropriateness with a range of respondents. As presented in Table 1, three school staff questionnaires and a parent interview were included in the field test; a total of 34 interviews and surveys were completed.
Table 1
Numbers of Respondents Participating in Field Test

Instrument | Respondent | Number Completed
Parent Interview | Parent | 9
School Characteristics Survey | Principal | 8
Teacher Survey | Reading/language arts teacher | 9
Student's School Program Survey | Person most knowledgeable about the student's program | 9
It is important that the pilot test reflect the breadth of the SEELS research questions and the diversity of the SEELS sample. Field test participants were selected who could respond about specific children who differed in disability category, grade level, geographic area, and type of school attended.
The field test was successful in capturing this diversity; Table 2 describes the children about whom the participating parents and school staff responded.
The field test attempted to replicate as much as possible the actual data collection methods that will be used in the SEELS study. During the study, the three school-oriented instruments will be mailed to the principal, and the principal will be asked to forward the two school staff instruments to the appropriate respondents. For more than half of the school staff surveys completed in the field test, a principal or another school staff member was mailed the instruments and asked to distribute them. School staff were instructed to complete the surveys before participating in a debriefing phone interview. Parents were interviewed by phone in most cases. The one exception was the mother of a child with a hearing impairment who had a hearing impairment herself; an interpreter was hired to assist in administering her phone interview. However, because of an illness in the respondent's family, much of that interview was conducted by e-mail, with questions and responses being typed. Field test participants were paid $25 for each survey completed as a thank-you for their time.
The results of the field test for the parent and school instruments are described below.
Table 2
Characteristics of Children about Whom Respondents Completed the Survey/Interview

Child Characteristics | Parents (Parent Interview) | Principals (School Characteristics Survey) | Language Arts Teachers (Teacher Survey) | Teacher Most Knowledgeable about Student's Program (Student's School Program Survey)

Child's disability category
  Autism | 1 | 1 | 1 | 1
  Deafness | 1 | 1 | 1 | -
  Emotional disturbance | 1 | 1 | 1 | -
  Learning disability | 1 | 1 | 1 | 2
  Mental retardation (moderate to severe) | 1 | 1 | 1 | 1
  Orthopedic impairment | 1 | 1 | 1 | 1
  Other health impairment and learning disability | 1 | 1 | 1 | 1
  Speech impairment | 1 | 1 | 1 | 1
  Visual impairment | 1 | 1 | 1 | 1
Grade level
  Elementary | 5 | 3 | 5 | 5
    1st-3rd grade | 2 | - | 2 | 2
    4th-6th grade | 3 | - | 3 | 3
  Middle school | 2 | 1 | 3 | 3
  High school | 2 | 1 | 1 | 1
  K-12 | - | 3 | - | -
Geographic area
  Arkansas | 1 | 1 | 1 | 1
  California (northern) | 2 | 1 | 1 | 1
  California (southern) | 1 | 1 | 1 | 1
  District of Columbia | 1 | 1 | - | -
  Maryland | 1 | 1 | 1 | 1
  North Carolina | 1 | 1 | 1 | 1
  New York | 1 | 1 | 1 | 1
  Oregon | 1 | - | - | -
  Texas | 1 | 1 | - | -
  Virginia | 1 | 1 | 1 | 1
  Washington | 1 | - | - | -
Type of school child attends
  Private | 1 | 1 | 1 | 1
  Public | 8 | 8 | 8 | 8
  Residential school | 2 | 2 | 2 | 2
  Special school for students with disabilities | 3 | 3 | 2 | 2
Parent Telephone Interview
The field test with each parent began with a description of the study and reassurance of the confidentiality of responses. In addition, there was a discussion of the parents' role in the pilot test: parents were asked to let us know if anything was unclear, if there were words or terms they didn't understand or questions they didn't feel comfortable with, and whether there were questions or topics that should be added to the interview.
Interviews lasted between 45 and 60 minutes, with an average of 49 minutes. These field test interviews were probably several minutes longer than interviews will be during the SEELS study. The parent interview is designed to be conducted using a computer-assisted approach, with the interview's skip logic programmed into the computer so that each appropriate question automatically appears on the interviewer's screen. Because the field test interviews were conducted without computer assistance, interviewers needed additional time to locate the next appropriate question.
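To make the mechanism concrete, here is a minimal sketch of how an interview's skip logic can be encoded so that the next applicable question is located instantly; the question IDs, wording, and rule structure are hypothetical illustrations, not the actual SEELS CATI program.

```python
# Minimal sketch of computer-assisted skip logic (hypothetical; not the
# actual SEELS CATI program). Each question carries a rule that decides,
# from the answers recorded so far, whether it should be asked.

QUESTIONS = [
    # (question_id, wording, ask_if)
    ("age", "How old is CHILD?", lambda a: True),
    ("mobility", "How well can CHILD get places on their own, without help?",
     lambda a: a.get("age", 0) >= 12),  # an age-dependent item (see below)
    ("reading", "In a typical school-year week, how often is CHILD read to?",
     lambda a: True),
]

def next_question(answers):
    """Return the first unanswered question whose skip rule is satisfied."""
    for qid, wording, ask_if in QUESTIONS:
        if qid not in answers and ask_if(answers):
            return qid, wording
    return None  # interview complete

answers = {"age": 9}
print(next_question(answers))  # the mobility item is skipped automatically
```

A paper instrument forces the interviewer to perform this lookup by hand, which is the source of the extra minutes observed in the field test.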
After each parent interview was completed, responses were reviewed and necessary changes were made to the instrument, so that each subsequent interview field tested a continuously updated version of the interview.
Most of the changes to the parent interview were fairly minor, such as changes to skip instructions or to item wording to better clarify an item's intent. For example, the section of the interview asking parents to describe their child's disability initially included the question "What are CHILD's learning problems or disabilities?" We found that most parents interpreted this as focusing only on issues related to learning and did not describe other types of disability. The wording was changed to "What are CHILD's physical, sensory, learning, or other disabilities or problems?"
The interview was modified more extensively, to reflect what we learned from parents, in the following areas:
The interview was originally designed to be administered during the spring of the school year, but the first administration will instead occur during the summer because of OMB's concern about possible conflict with Census interviews. The wording of some items was changed to reflect this move to summer interviewing. For example, because parents felt that the amount of time they spent reading to their children differed between the summer and the school year, the question "How many times have you or has someone in your family read to CHILD in the past week?" was modified to "In a typical week during the school year, about how many times have you or has someone in your family read to CHILD?"
Parents of children with more severe impairments suggested several additional items to better reflect their experiences. For example, "respite care" was added to the list of services received; "having a sense of humor" and "being sensitive to other people's feelings" were added to a list of children's strengths; and an item was added to the future expectations section about the expectation that children eventually would live away from home on their own, but with supervision.
The appropriateness of individual items was reviewed, and item skip logic was changed for students who live in residential schools or facilities or who receive instruction in alternative settings, such as being home-schooled by a parent or receiving homebound instruction from a professional. For example, parents are not asked whether they help their child with homework if their child attends a residential school, and they are not asked how well their child gets along with others in the class if their child receives homebound instruction. (A sketch of how such applicability rules might be encoded follows this list.)
Parents of younger children are no longer asked about how well their child functions in several areas that are age dependent. For example, only parents of children who are 12 or older will be asked "How well can CHILD get to places outside the home, like to school or to a nearby store or park, on their own, without help?" because some parents felt their children were too young to be allowed to go places on their own.
Several items were modified to improve the ease of responding. An example involves the item on student grades, for which responses were changed from "Mostly As, Mostly Bs, Mostly Cs," etc., to "Mostly As, Mostly As and Bs, Mostly Bs, Mostly Bs and Cs," etc., to better reflect the way parents were responding. A more significant example involves the section with items from the Social Skills Rating System. Parents felt that the list of behaviors was too long, especially because so many of the items on the list were negative behaviors. In addition, they were often frustrated by the 3-point scale of "Never, Sometimes, Very Often," especially for items such as "acting sad" or "appearing lonely." They felt that no one "never" appears this way, but they were uncomfortable selecting the mid-range response of "sometimes" when their child very rarely acted this way. Parents also felt that children acted very differently with siblings than with friends and were unsure which group to answer for regarding "fights with others" or "controls temper when arguing with other children." The phrase "responds appropriately" also was difficult for parents when they were asked whether their child "responds appropriately to teasing from friends or relatives of his or her own age" or "responds appropriately when hit or pushed by other children." On the basis of these concerns, we shortened this section of the interview by including items from only two of the social scales, which also are included in the Teacher Survey. These scales were selected because they avoided some of the issues of negativity and wording clarity.
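As noted earlier in this list, items are switched on or off according to a student's living or instructional arrangement. The sketch below shows one way such applicability rules might be represented; the item names and setting labels are hypothetical, not the actual SEELS item inventory.

```python
# Hypothetical sketch of setting-based item filtering, mirroring the rules
# described above (item names and setting labels are illustrative only).

EXCLUDED_ITEMS = {
    "residential_school": {"helps_with_homework"},
    "homebound_instruction": {"gets_along_with_classmates"},
}

def applicable_items(all_items, setting):
    """Drop items that do not apply to the student's instructional setting."""
    excluded = EXCLUDED_ITEMS.get(setting, set())
    return [item for item in all_items if item not in excluded]

items = ["helps_with_homework", "gets_along_with_classmates", "reads_at_home"]
print(applicable_items(items, "residential_school"))
# -> ['gets_along_with_classmates', 'reads_at_home']
```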
These and other changes are reflected in the attached updated version of the SEELS parent interview.
School-Based Instruments
After school staff completed the mailed questionnaires, a debriefing interview was conducted with each respondent to discuss the respondent's experiences in completing the instrument. Once the debriefing phone call was completed, school staff were asked to return the completed questionnaire to SRI.
On the basis of the field test of the school-based instruments, all questionnaires underwent substantial cuts to reduce survey completion times. In addition, items were shortened, either by eliminating response choices or by changing the format of the question. Redundancies within and across surveys were cut or merged, further shortening some items. Significant revisions also were needed to clarify instructions so that respondents would skip sections that did not pertain to them. Although the field test version contained instructions for skipping sections, several respondents answered questions that did not apply to them, resulting in longer completion times. Making the skip patterns more explicit should shorten the time needed to complete the surveys. Items that were confusing or that produced unexpected answers also were reworded.
For all the school-based surveys, we realized the importance of letting respondents know what information they may want to collect before beginning the survey. For example, the person filling out the Student's School Program Survey may want to have the student's IEP available, look up the student's absences for the month of February, and find out whether the student qualifies for the free/reduced-price lunch program; the principal may want to gather student demographic information, student absence rates, and student disciplinary actions and incidents of violence; respondents to the Teacher Survey may want to locate scores and dates for the most recent student assessments in reading and math. Including these instructions will help respondents complete the questionnaires more quickly.
Student's School Program Survey
The average time to complete this survey was 36.5 minutes, with a range of 15 to 50 minutes. Three teachers completed both this questionnaire and the Teacher Survey. Seven of the field test respondents were special educators, and one was a general education teacher.
This survey was shortened by eliminating items related to respondent background information. Given that this category of items would be of most interest for school staff who are providing instruction to the student, we decided to ask these questions only in the Teacher Survey. In this way, the Student's School Program Survey focuses more on a student's program and less on the respondent completing the survey. Because we anticipate that many school personnel may be given both the Student's School Program Survey and the Teacher Survey, this change substantially reduces respondent burden. In total, 15 items have been deleted from the revised version of the Student's School Program Survey.
The team also discussed whether the instrument would be appropriate for a paraeducator who may be providing language arts, reading, or English instruction to the student on a one-to-one basis. One field test teacher who completed the Teacher Survey might have passed the survey on to the student's one-to-one paraeducator if that option had been available, but explicit directions were given to the principal to give the instrument to a certified teacher. After discussing this issue, the research team decided that in some cases we may want to allow the paraeducator to be selected, because the goal of the survey is to obtain an accurate picture of each student's language arts, reading, or English program. Changes were made to make the instrument appropriate for a paraeducator.
Teacher Survey
The average time to complete the Teacher Survey was 37 minutes, with a range of 20 minutes to 60 minutes. Field test teachers included three who taught in a special education setting, five who taught in a general education setting, and one who taught in an individualized setting.
This survey has the most complicated skip patterns: whole sections are to be completed only by teachers who provide instruction in a special education setting, and a different section only by teachers who provide instruction in a general education setting. Despite instructions, many respondents completed both sections, which increased the time needed to complete the survey. Consequently, directions have been clarified to minimize this tendency. In addition, five items were deleted.
On the basis of responses from teachers of students with more significant disabilities, one item was added to ask about the content of students' language arts programs; for many of these students, there are likely to be substantial modifications to the curriculum and/or activities. One concern was that the field test version of the survey failed to accurately capture a student's language arts program when that program differed substantially from the general education program.
At this time, we are still concerned about the length of this survey, and we will continue to examine individual survey items for the types of information they provide and the importance of that information in answering crucial research questions. The dilemma will continue to be how to balance the time required to complete the survey against the need for as complete a picture as possible of the language arts, reading, or English instruction that special education students receive. Without an observation component, obtaining this information from teachers becomes even more important.
School Characteristics Survey
The average time to complete this questionnaire was 24 minutes, ranging from 15 to 47 minutes. Administrators from five traditional public schools, one residential school, and two special schools completed the field test. In general, field test respondents had very little difficulty completing this survey. However, many found Section B (Student Characteristics) challenging, and several administrators looked up exact numbers. As a result, the instructions were changed to stress that estimated numbers and percentages are acceptable. In addition, many of the revised items allow respondents to report either percentages or numbers, because principals of smaller schools preferred numbers, whereas principals of larger schools tended to prefer percentages. Wording on some items also was changed to "around October 1" so that respondents would not feel that a response must be an exact number from a specific date. Finally, the item asking for the number of students in each disability category was modified to ensure that students are counted only once, because many respondents had included some students in more than one disability category.
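The unduplicated-count requirement amounts to a simple consistency check: the per-category counts should sum to the school's unduplicated special education enrollment. A brief sketch, with hypothetical field names and figures:

```python
# Sketch of the consistency check implied by the revised item: if students
# are reported in more than one disability category, the category counts
# will not sum to the unduplicated special education enrollment.
# (Field names and figures are hypothetical.)

def counts_unduplicated(category_counts, sped_enrollment):
    """True if every student appears in exactly one primary category."""
    return sum(category_counts.values()) == sped_enrollment

counts = {"learning_disability": 40, "speech_impairment": 12, "autism": 3}
print(counts_unduplicated(counts, 55))  # True: 40 + 12 + 3 == 55
print(counts_unduplicated(counts, 50))  # False: some students double-counted
```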
Additional instructions to accommodate schools that serve only special education students also are included in the current revised version of the School Characteristics Survey. Another change was to eliminate two items related to special education students and standardized assessment practices. The field test version asked school administrators to identify the percentage of students with low-incidence and high-incidence disabilities who participated in standardized assessments, but principals in regular schools were unsure what constitutes a high-incidence versus a low-incidence disability. Given that other items addressed the school's policies regarding inclusion of special education students in standardized assessments, these two items were deleted. In total, four items were deleted from the current version of the School Characteristics Survey.
All changes to these questionnaires are reflected in the attached school-based instruments.
SEELS DESIGN DOCUMENTS