
June 6, 2014

Innovations in research: Longitudinal study transitions from paper-and-pencil to mobile

GSE-IT provides new approach to data gathering for Stanford/UC Santa Cruz study of how reclassification affects English language learners.

By Pamela Levine (Digital Initiatives)

Aleshia Barajas, SoYoung Park and Claudia Rivas (Photo courtesy GSE-IT)

This story was written for the spring 2014 issue of the GSE-IT newsletter, Digital Initiatives.

After two years of conducting interviews and collecting and hand-coding paper-and-pencil survey data, Stanford professor of education Claude Goldenberg and UC Santa Cruz faculty associate research scientist Peggy Estrada have embraced a digital, paperless approach to field research. Their longitudinal study examines how English language learners in California are reclassified to fluent English proficient status, and compares the academic and course-taking outcomes of students who reclassify with those of students who remain in EL status.

In the 2012-13 school year, there were approximately 1.5 million English learners in California public schools. Some of these students have met at least some of the minimum requirements for reclassification to fluent English proficient, yet they retain their EL status. The researchers note that schools are, at least ostensibly, providing these ELs with instructional services such as English language development and sheltered instruction.

This study, for which Stanford Graduate School of Education alumna Estrada is principal investigator, poses the questions:

  • Why do students remain English learners when they may be ready to transition to mainstream English instruction, and what are the consequences?
  • What happens when students who qualify for reclassification continue to receive EL services and curricular placements they don’t need?
  • Could those placements restrict their learning opportunities by denying them access to core academic content, the full curriculum, and non-EL peers?

The mixed methods research design seeks to answer these questions by following seven cohorts of students for four years and examining quantitative measures including students’ scores on the California English Language Development Test and academic content standards achievement test; classification and reclassification status; course taking; course credits; and other academic outcomes.

The quantitative data reveal broad patterns. The qualitative data flesh out those patterns through interviews with district staff about EL policies, and with teachers, principals, EL coordinators and other school staff about their awareness of, understanding of, and participation in the reclassification process, along with school procedures, instructional practices and the rationales for student curricular placement.

“It’s probably not a surprise that secondary English learners tend to take the fewest core and advanced courses and the most intervention courses,” says Estrada.

Funded by the Institute of Education Sciences, the research arm of the U.S. Department of Education, the project is a partnership between Stanford, UC Santa Cruz, and SRI International. In addition to Goldenberg and Estrada, the research team includes director of SRI Education Patrick Shields (also a GSE alum), SRI senior researcher Haiwen Wang, UCSC junior research specialist Aleshia Barajas and Stanford doctoral candidates SoYoung Park, Claudia Rivas and Claudia Rodriguez-Mojica.

The researchers used an interview-only format in Year 1 and an interview plus paper-and-pencil survey format in Year 2 to collect field data. “The project is designed so that every year the instruments are revised on the basis of what we discovered previously. The idea is that each year we are going more in depth, so that each set of questions is informed by what we learned from the previous set of questions as well as changes in policy in the district. The process allows us to develop new hypotheses based on what we’ve learned, and through the interview and survey process, test them,” says Estrada.

Barajas says: “When we came back from our field work in Year 2, we were daunted by the task of having to code the paper survey. We had to input everything manually, come up with a coding scheme from scratch, and then check everything. This process took us about three months.” This was when the team began considering an electronic survey to collect participant responses. “There’d be a lot of advantages; for example, it’s all coded immediately,” she adds.
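To see what “coded immediately” means in practice, consider the contrast in the minimal Python sketch below. The coding scheme, item names and respondent IDs are hypothetical (the study’s actual instruments are not published here): a paper response must be transcribed and mapped by hand, while an electronic survey stores the code at the moment of response.

```python
# Hypothetical coding scheme for one survey item; the team's actual
# instruments and codes are assumptions for illustration only.
RECLASS_FAMILIARITY_CODES = {
    "not at all familiar": 1,
    "somewhat familiar": 2,
    "very familiar": 3,
}

def code_paper_response(raw_text: str) -> int:
    """Map a hand-transcribed paper answer to its numeric code.

    Every paper response needs this manual pass, and typos, marginal
    notes, or skipped items all require human judgment to resolve.
    """
    return RECLASS_FAMILIARITY_CODES[raw_text.strip().lower()]

# An electronic survey records the numeric code as the respondent
# answers, so the exported record is analysis-ready with no hand-coding:
electronic_record = {"respondent_id": "T-014", "reclass_familiarity": 2}

print(code_paper_response("  Somewhat familiar "))  # -> 2
```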

However, the team had concerns about maintaining their high response rate with an electronic survey, as well as about the technical and logistical questions inherent in the approach. Barajas recounts, “How would we give this to the teachers? Would we have to send them the link electronically, and would we need their email addresses in advance? How would we keep track of their surveys? A lot of these practical questions started coming up.”

The team consulted with me and Shawn Kim (my colleague in the GSE’s Office of Innovation & Technology) on an approach to electronic data collection in the field — a growing practice among GSE faculty and staff members in their research. We considered the project’s particular logistical and IRB constraints in distributing a survey via email, as well as concerns about response rate. Estrada told us, “We know that our presence and our connection with [participants] motivates them to turn in the survey — we had a 98 percent response rate in the district during Year 2.” We ultimately recommended creating the survey in Qualtrics and using the GSE iPads to collect response data on the spot.

Kim then provided in-depth guidance on survey flow and design to help the team program the survey to adapt to each respondent, based on demographic criteria and earlier responses. She also helped them think beyond setting up the questions to the output each item would produce, and whether it would provide the type and level of information needed during analysis. “At first we were still thinking about the paper survey, so we didn’t fully explore all of the capacities Qualtrics had to offer,” says Barajas. “That’s when Shawn came in, saw our question formats, and said ‘You can try this.’ We learned so much, and it was really helpful to discover what we could really do with Qualtrics. We also started thinking about how to connect our questions to what we wanted in the end in the output.”
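In Qualtrics itself, this kind of adaptation is configured through the survey editor’s branch and display logic rather than written as code. The Python sketch below, with hypothetical roles and question IDs, only illustrates the underlying idea: each respondent is routed to the items that apply to them, and a follow-up block can hinge on an earlier answer as well as on who the respondent is.

```python
# Illustrative sketch of respondent-adaptive survey flow, analogous to
# branch/display logic in a survey tool. Roles and item IDs are hypothetical.
COMMON_ITEMS = ["years_at_school", "reclassification_familiarity"]

ROLE_ITEMS = {
    "teacher": ["el_students_in_class", "sheltered_instruction_use"],
    "el_coordinator": ["reclassification_steps", "parent_notification"],
    "principal": ["placement_rationale", "master_schedule_constraints"],
}

def items_for(role: str, teaches_els: bool) -> list[str]:
    """Return the question IDs a given respondent should see."""
    items = COMMON_ITEMS + ROLE_ITEMS.get(role, [])
    # Adapt on an earlier answer, not just on the respondent's role:
    if role == "teacher" and not teaches_els:
        items.remove("sheltered_instruction_use")
    return items

print(items_for("teacher", teaches_els=False))
# -> ['years_at_school', 'reclassification_familiarity', 'el_students_in_class']
```

Thinking in terms of the record each item emits, as Kim encouraged the team to do, also means choosing item formats whose exports already match the variables the analysis will need.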

For others developing and distributing an electronic survey on iPads, Estrada recommends thorough testing of the instrument and devices within the research team even before a pilot phase. On using mobile devices for data collection, she says, “We would try to reduce the amount of typing required by replacing text fields with touch buttons,” because entering text with the iPad’s built-in keyboard was cumbersome. She also recommends still giving respondents the option of a paper survey.

The team was receptive to using this approach in future projects. “I’ve got this other study going on in Rwanda which also has teacher surveys and several hundred teachers,” says Goldenberg. “We would love to get more efficient data collection.”

Pamela Levine is instructional technology associate in the GSE Office of Innovation & Technology.

Contact

Jonathan Rabinovitz, Director of Communications, Stanford Graduate School of Education: 650-724-9440, jrabin@stanford.edu

 
