Improving Speaker’s Use of Segmental and Suprasegmental Features of L2 Speech


  •  Azza A. M. Abdelrahim    

Abstract

Unlike L1 acquisition, which proceeds automatically, adult L2 learners' acquisition of English phonology relies on mental reflection and conscious processing of information. L2 phonology research has paid limited attention to the contribution of the cognitive/theoretical component of pronunciation training. This study reports on the use of online collaborative reflection to improve students' use of segmental and suprasegmental features of L2 English speech. Ninety participants at the tertiary level at Tabuk University in the Kingdom of Saudi Arabia were divided into two groups, both of which received online instruction. The only difference between the instruction of the experimental group and the control group was that the experimental group spent part of the instruction time on collaborative reflection, while the control group spent this time on routine activities without collaborative reflection; all other activities were identical. The results showed that online collaborative reflection improved the pronunciation of the experimental group. The learners improved their pronunciation of major segmentals (e.g., vowels, consonants, diphthongs), minor segmental features (e.g., manner of articulation), and suprasegmental features (e.g., intonation, stress). The results also showed that students perceived online collaborative reflection as a helpful means of improving their use of L2 English phonological features. The findings have important implications and contribute to our theoretical knowledge of second language acquisition and L2 phonetics instruction research.



