https://doi.org/10.17323/jle.2022.12935
Cognitive Predictors of Coherence in Adult ESL Learners' Writing
Abdul Saeed 1,2, John Everatt 2, Amir Sadeghi 3, Athar Munir 2,4
1 Sukkur Institute of Business Administration University, Pakistan
2 University of Canterbury, New Zealand
3 PEETO, Multicultural Learning Centre, New Zealand
4 Emerson University Multan, Pakistan
ABSTRACT
Background. Coherence is considered one of the most important qualities of written discourse. Despite its fundamental importance, it is still considered a fuzzy and abstract concept in most English as a Second Language (ESL) contexts. Consequently, many ESL learners struggle to produce a coherent text. Morphological, phonological, and orthographic awareness, vocabulary knowledge, and grammatical competence have been identified as predictors of writing quality in novice writers. There is, however, a lack of data to assess whether such linguistic skills also predict coherence in adult ESL learners' writing.
Purpose. The purpose of the study was to examine the relationships between a set of linguistic skills measures, which included morphological, phonological, and orthographic awareness, vocabulary knowledge and grammatical competence, and coherence in adult ESL learners' writing.
Methods. To test the potential predictors of coherence in ESL writing, 126 adult university students were assessed with measures of the linguistic skills mentioned above, in addition to four measures of coherence: two relatively reader-based measures (IELTS and the Holistic Coherence Scale) and two relatively text-based measures (Topical Structure Analysis and Topic Based Analysis). All measures used in the study were shown to be valid and reliable.
Results. The findings revealed that vocabulary knowledge, morphological awareness, and grammatical competence were related to the coherence measures, particularly the reader-based measures. In contrast, measures of phonological and orthographic awareness generally did not correlate with the coherence measures.
Implications. Reasons for the associations among the study variables were discussed and areas for future research were suggested.
KEYWORDS
Coherence, second language writing, predictors of writing quality
INTRODUCTION
Citation: Saeed, A., Everatt, J., Sadeghi, A., & Munir, A. (2022). Cognitive Predictors of Coherence in Adult ESL Learners' Writing. Journal of Language and Education, 8(3), 106-118. https://jle.hse.ru/article/view/12935
Correspondence:
Abdul Saeed,
saeedabdulskr@gmail.com
Received: August 26, 2021 Accepted: September 18, 2022 Published: September 30, 2022
Writing is considered a more complex and difficult skill to learn than listening, speaking, and reading. This is because writing does not come naturally in the way that listening and speaking do. To have good writing skills, one must not only learn them but also practise them regularly; writing is very close to reading in this regard. The situation becomes even more complex for writing in a second language (L2), as the learner has to learn a new set of writing skills that may differ from those of their first language. Faced with the complexities of this process, and to support learners and make the writing process easier in both L1 and L2, researchers have attempted to identify predictors of writing. High proficiency in linguistic skills seems to predict writing quality. Such predictors include, but are not limited to, morphological, phonological, and orthographic awareness, vocabulary knowledge and grammatical competence. Studies of writing predictors have addressed such skills among writers (for example, Berninger et al., 2010; McCutchen et al., 2014; Abu-Rabia, 2001; see the literature review for further details on studies of ESL students), but few studies have considered predictors of measures of coherence among adult ESL learners.
Coherence is an essential construct for assessing the quality of writing (Candelo et al., 2018; Chiang, 2003). Traditionally, coherence has been defined as the semantic relationship in the text whereby all elements are logically joined to give a single unit of meaning (Knoch, 2007). Yet, in spite of its fundamental importance in writing, it is deemed to be a fuzzy and abstract concept in most ESL contexts (Lee, 2002), a misconception that often leads to its neglect in teaching and learning (Attelisi, 2012). One consequence of this deficit is that many adult ESL learners struggle to produce coherent texts (Masadeh, 2019; RahmtAllah, 2020). In addition, coherence has been considered a subjective construct (Van Dijk, 1977). For this reason, most measures of coherence have been seen as subjective, and it has proven difficult to find a completely objective measure of coherence analysis (Todd, 2016).
The purpose of the present study was to investigate a set of linguistic skills (i.e., morphological, phonological, and orthographic awareness, vocabulary knowledge and grammatical competence) as potential predictors of measures of coherence in adult ESL learners' writing. The findings should inform theories about coherence analysis and help language teachers and ESL learners to focus on key skills involved in writing a coherent text. The current research aimed to answer the following questions:
(1) Are there any relationships between linguistic skills (morphological awareness, phonological awareness, orthographic awareness, vocabulary knowledge and grammatical competence) and coherence in adult ESL learners' writing? If there are, what are the best or common predictors of coherence?
(2) Are there any relationships between the sub-component parts of the measures of coherence used in this study and the measures of morphological awareness, phonological awareness, orthographic awareness, vocabulary knowledge and grammatical competence?
LITERATURE REVIEW
Coherence
Coherence is deemed to be one of the most important qualities of writing (Attelisi, 2012; Crossley & McNamara, 2010). It is known as the 'sine qua non' (a thing that is necessary) of written discourse (McCulley, 1985). Its centrality can also be gauged by the fact that it is present in every test of English in which learners' proficiency is assessed, such as the Test of English as a Foreign Language (TOEFL) and the International English Language Testing System (IELTS). Nevertheless, despite its centrality in written language learning, coherence remains a fuzzy and abstract concept to most ESL and EFL teachers, many of whom are unable to define and
practise coherence in their classrooms (Lee, 2002). As a consequence, a great number of ESL/EFL learners struggle to produce highly coherent texts (Khalil, 1989).
Predictors of Writing Quality
The current research investigated potential predictors of coherence in ESL writing. A range of linguistic skills that have been identified as predicting variance in literacy skills (reading and writing) were examined in the current study; these predictors have mostly been suggested in work focused on children. This is because such skills are considered to be still developing in children but may have fully developed in adults. However, the development of such skills may vary across first language and second language contexts. It can be argued that adult ESL learners may still be developing such skills in a second language, which could influence their ability to produce a coherent text. This background review of the literature, therefore, focuses on evidence for the linguistic skills targeted in the study that may predict literacy outcomes, as previously suggested in studies of children.
Morphological Awareness
Morphological awareness, an important element of language learning, is generally understood as a conscious awareness of the morphemic structure of words and the ability to manipulate that structure to make new words. Zhang and Koda (2012) conducted a study to test the role of morphological awareness in the development of second language vocabulary and reading comprehension among 130 students at a university in Shanghai, China. The study concluded that morphological awareness contributed significantly to the development of vocabulary knowledge in English as a second language. In addition, morphological awareness has also been found to have positive effects on learning spelling in novice writers, particularly the spelling of morphologically complex words (Berninger et al., 2010; McCutchen & Stull, 2015).
Phonological Awareness
Phonological awareness has also been documented as a predictor of writing in children. Mackenzie and Hemmings (2014) examined the role of phonemic awareness in the development of English writing performance of children in ten kindergarten classrooms in New South Wales, Australia. Findings indicated a high correlation between oral language performance and writing development in these young children. Additionally, Zhao (2011) tested 339 grade 8 students in China with Chinese as a first language and English as a second language. The participants were measured on their morphological, phonological, and orthographic awareness skills. Results revealed orthographic awareness as the main contributor to spelling in Chinese (the first language), whereas phonological awareness was the main predictor of spelling in English.
Orthographic Awareness
Abu-Rabia (2001) found orthographic awareness to be a predictor of spelling ability in both English and Russian. The participants were ESL university students in Israel between the ages of 25 and 30. Findings suggested that orthographic awareness may improve learners' spelling, which can in turn support the development of their writing skills. Harrison et al. (2016) conducted research to assess potential predictors of spelling and writing in grade 3 native and ESL learners. The participants were given rapid naming, phonological awareness, single-word fluency, text spelling, handwriting fluency, and paragraph writing fluency tasks. For native speakers, vocabulary and rapid naming predicted variance in writing, whereas rapid naming and syntactic awareness predicted writing quality in ESL learners. Orthographic awareness was found to be a predictor of spelling in native as well as ESL learners.
Vocabulary Knowledge
Diamond et al. (2008) described vocabulary knowledge as the comprehension of the meaning of words in different contexts. To convey a message effectively, a learner needs to have good vocabulary knowledge. In other words, poor text generation and comprehension could both be caused by a lack of vocabulary knowledge (Lee, 2003). To examine the relationship between lexical diversity, as an indicator of vocabulary knowledge, and holistic markings of a composition, Roessingh et al. (2015) asked 77 third-grade students in Canada to write a composition on a given prompt. Lexical diversity was assessed by corpus-based analysis of high- and low-frequency words. For holistic scoring, a trait-based rubric (HALT) was used. The findings highlighted a correlation between lexical diversity, which is indicative of vocabulary size, and writing quality.
Grammatical Competence
Grammar knowledge is believed to give the learner a sense of correct and incorrect use of the language (Wang et al., 2015). Grammar teaching has always been an inseparable part of second language teaching, and there is a large body of research on different aspects of grammar teaching and their effects on language learning in both first and second language contexts (Berninger et al., 2011; Kim et al., 2013; McCutchen & Stull, 2015; O'Brien et al., 2006; Wong, 2012). As an example, Jones et al. (2013) argued, on the basis of findings from several schools in England, that the teaching of grammar improves learners' understanding of the writing system. Their data indicated a positive effect of grammar teaching on writing, with skilled writers benefitting more than less skilled writers. Such studies support the view that higher levels of grammar knowledge should predict writing skills.
METHODS
Participants
Participants were recruited from six different government universities in Punjab, Pakistan. Following informed consent, 129 university students volunteered. Three participants were excluded from the data. Two students did not complete the measures: one missed a session due to illness and another had to leave the test session due to a personal emergency. The third was a student who scored zero on the English vocabulary measure. Of the remaining 126, 58 were male and 68 were female, with their ages ranging from 18 to 27 years (M = 21 years).
Background information was provided by the participants via a demographic questionnaire. All students were multilingual speakers, with the majority (82) being Punjabi speakers, though a minority used Saraiki as their first language. They all were able to communicate in Urdu, the national language of Pakistan, as well as in English, and for most, these were also the languages of reading - a minority could also read in their home language.
All participants had learned English since their first year of schooling in Pakistan. Additionally, they reported having used English in verbal communication for between 5 and 22 years (M = 14 years). All were university students enrolled in their second year of study, with English being the language of instruction according to the standards set by the Pakistan National Curriculum¹ (Ministry of Education, 2006). The National Curriculum for English Language grades I-XII (Ministry of Education, 2006, pp. 127-131) requires students to be able to display all necessary writing skills, such as brainstorming, mind mapping, transitional devices, key ideas, grammar, syntactic maturity and variety of sentences. Hence, the participants were assumed to know how to produce a well-connected, coherent text. Additionally, since the participants were second-year university students, they had also experienced writing their assignments in English at the tertiary level for at least one year. A majority of these students would fall under the B1 category of the Common European Framework of Reference for Languages, which describes independent users of English who can use language for a variety of purposes, read extended texts, use correct and varied sentence structures, and produce well-connected text. The criterion established by the Higher Education Commission for students meeting the academic standards described earlier in this paragraph can serve as further validation. As these students were in the second year of their degree and had completed their first year, it is assumed that a majority of participants had a B1 level of proficiency.
1 Ministry of Education. (2006). National curriculum for English language grades I-XII. Government of Pakistan. https://bisep.com.pk/downloads/curriculum/Grades-I-XII/pk_al_eng_2006_eng.pdf
Data Collection
To identify relationships between these language skills and measures of coherence, the participants were asked to write an essay of 250-300 words on a given topic. The topic was given via a short 150-word story about 'wildlife' and individuals who study wildlife. The participants read the story before the writing task, and they were given an hour to complete the task. They were also told to use this time in whatever way they chose (planning, writing, revising). The students were told that the essays would be assessed on several factors including meaningfulness and appropriate use of English. The topic was selected as one that participants should be at least partially familiar with. The purpose of the story was to give all participants the same topic and range of ideas for completing their essays, thereby avoiding major variations in essays due to topic knowledge and/or choice. Essays were handwritten on a sheet of paper and then transcribed onto a computer for assessment. The transcript was an exact copy of the original handwritten essay to allow an analysis of grammatical errors, organisation, and content, but the computer copy made it easier for two raters to assess the essays using the four measures of coherence.
Measures and Procedures
All measures were developed for the current study. The measures were piloted with participants from a similar background to those participating in the main study. Two or three sessions were used to administer the measures: the majority of the data were gathered over the course of two sessions, though a third session was used when students were occupied with other tasks or had a different schedule. Language tests and a background questionnaire were completed during the first session. The background questionnaire was administered by the researcher, who offered assistance as needed. The language tests were also given to the participants in the same session. The writing task was completed by the participants in the second session. To ensure adequate distance between participants, tests were conducted in lecture halls with groups of no more than 20 students.
Language Measures
The morphological processing measure aimed to assess participants' understanding of how words are broken down into smaller meaningful units, including roots, prefixes, and suffixes, and how words can be derived from root forms. In the current study, participants were given two tasks. Task one required the participants to choose the correct form to complete a sentence. For example, 'Geography involves the study of different (country)' and 'I (start) my new school last week', with the task being to write the correct form of the word in brackets that completed the sentence; 'countries' and 'started' in these examples. There were 20 items in this task. The second task required the participants to write a
word based on the rule given in the example. For example, 'sing - singer; read - ....', where the correct answer would be 'reader'; or 'boy - boys; man - ...' where the correct answer would be 'men'. There were 20 items in this task, too. Examples were provided to show what was required and each item was given one mark for a correct answer, making a potential maximum score of 40. The mean for the measure was 22.06, with a standard deviation (SD) of 8.06, and a range from 6 to 36. Cronbach's alpha was around .9.
The phonological awareness measure assessed the student's ability to use the sound pattern of the language and translate a written form into that sound pattern. Participants were given pairs of made-up words (pseudo-words) and were asked to choose the item that sounded like a real English word. For example, 'nale pult' and 'warg dore' were presented; the participants should select 'nale' and 'dore' as these sound like the English words 'nail' and 'door'. There were 17 pairs of pseudo-words in the test and each correct answer was given one mark. The mean for this measure was 12.02, SD = 2.79, and a range from 2 to 17. Cronbach's alpha was over .7.
An orthographic choice task was used to assess orthographic awareness. This task focused on the ability to identify correct spellings based on their orthographic features. The task comprised 18 pairs of letter strings, one of which was an incorrect spelling while the other was correct: for example, 'monk, munk' or 'goat, gote'. Items were selected so that both members of a pair produced the same word-sound if converted by simple English letter-sound conversion rules, meaning that choosing the correct item required recalling the orthographic features of the word. The participants were asked to choose the correct spelling in each pair and were given one mark for each correct answer. The mean for the measure was 8.87, SD = 2.78, and a range from 2 to 16. Cronbach's alpha was over .7.
A vocabulary test was used to assess the participants' knowledge of words. In the present study, the participants were given 40 words, each followed by one possible meaning and three distractors. The participants were asked to choose the closest meaning to the given word. For example, the word 'rich' was followed by '(i) no money at all, (ii) have a lot of money, (iii) feel happy, and (iv) feel sad', with the participant expected to choose item (ii) as the correct answer. Each correct answer scored one mark. The mean for this measure was 23.27, SD = 5.85, and a range from 9 to 37. Cronbach's alpha was over .8.
Grammatical knowledge was assessed via 22 items that required the participant to identify the correct form of grammar, use of articles, and subject-verb agreement. There were three tasks in this measure to assess different aspects of grammatical understanding. In the first task, the participants were presented with short sentences in which four parts of the sentence were underlined, one of which comprised an error based on its context. For example, in the sentences 'I am
going to an Indian restaurant for a lunch. Will you go with me? It's not too far away. It serves the best food, I believe.', the participants were required to indicate which of the underlined sections was incorrect ('a lunch' in this case). In the second task, sentences were again used, but this time with gaps, and the participants were asked to indicate one word/ phrase from four options that completed the sentence. For example, the sentence might be 'The distinct geology of the
island began ___ about 20,000 years ago', for which the
options would be 'i) formed; ii) form; iii) to form; iv) was forming'. A final task involved a passage of text that contained grammatical errors which the participant was expected to correct. For example, for the sentences 'I could see the water from my window. the boat sailed over the waves silent.', the participant should change 'the' to 'The' and 'silent' to 'silently'. Each correct answer was given a mark. The mean for the measure was 8.37, SD = 3.50, and a range from 1.5 to 18. Cronbach's alpha was .7 or above.
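Cronbach's alpha values are reported for each of the language measures above. The analysis software is not stated in the paper; as a minimal illustration of the arithmetic behind such a reliability coefficient, the Python sketch below computes alpha from a participants-by-items matrix of item scores (the matrix shown is invented purely for demonstration and does not come from the study data).

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (participants x items) matrix of item scores."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                                  # number of items
    item_variances = scores.var(axis=0, ddof=1)          # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)      # variance of participants' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 0/1 scores for 5 participants on a 4-item task (illustration only)
demo = np.array([[1, 1, 0, 1],
                 [1, 0, 0, 1],
                 [0, 0, 0, 0],
                 [1, 1, 1, 1],
                 [1, 1, 0, 0]])
print(round(cronbach_alpha(demo), 2))
```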
Coherence Measures
For the assessment of coherence, four measures of coherence were used. These include (i) the part of the International English Language Testing System (IELTS) measure that analyses coherence (IELTS, 2019), (ii) the Holistic Coherence Scale (HCS) developed by Bamberg (1984), (iii) the Topical Structure Analysis (TSA) developed by Lautamatti (1978), and (iv) the Topic Based Analysis developed by Todd (2016). To ensure consistency, two raters marked the essays. The first rater was the first author; the second was an experienced ESL teacher with approximately ten years of experience in teaching and assessment. For each essay, the raters followed the criteria described below for each coherence assessment to calculate a mark; the raters familiarised themselves with the assessment procedures prior to calculating marks to ensure they understood the methods required. Correlations between the two raters' scores were .75 for the IELTS scores, .82 for the Holistic Coherence Scale, .81 for the Topical Structure Analysis, and .71 for the Topic Based Analysis. Given this reasonable level of agreement, the first rater's scores were used in the analyses, as they were more familiar with the coherence methods used in the study.
The four coherence measures were selected because of the evidence for their validity and reliability provided by researchers over the years (for IELTS, see Moore, 2007; Müller & Daller, 2019; Schoepp, 2018 - for the Holistic Coherence Scale, see Connor & Lauer, 1985; McKenna, 1988 - for Topical Structure Analysis, see Ghazanfari et al., 2011; Kılıç et al., 2016; Knoch, 2007 - for Topic Based Analysis, see Todd, 2016). These measures of coherence provided a mixture of relatively old and new measures, indicating the development of the topic over the years: the Holistic Coherence Scale and Topical Structure Analysis were developed in the 1970s
and 1980s, whereas Topic Based Analysis and the coherence scale of IELTS were comparatively new. The four measures also provided a combination of text-based and reader-based measures of coherence. The IELTS and Holistic Coherence Scale measures focused more on reader-based perspectives, whereas the Topical Structure Analysis and Topic-Based Analysis focused more on text-based methods of assessing coherence.
The International English Language Testing System (IELTS) is a high-stakes English language proficiency test for international students and migrants (Alsagoafi, 2013). Candidates' writing skill is assessed on four criteria: 'Task Achievement (in Task 1) and Task Response (in Task 2), Coherence and Cohesion, Lexical Resource, and Grammatical Range and Accuracy' (Pearson, 2018). Marks are given to the writing task on a 9-point scale (1 is the lowest band for poor output and 9 is the highest; a score of 0 was not used as this indicates no attempt). Given the focus of the current study, only the Coherence and Cohesion component was used. This analysis of coherence included an assessment of cohesion, which has been considered one of the constructs of coherence (Halliday & Hasan, 2014); though equally, coherence is not simply the use of cohesive devices, and evidence has shown that a text can be coherent with few cohesive ties (Kim & Crossley, 2018). Hence, the coherence assessment also included elements of organisation, or the compositional structure of a text (an appropriate beginning, middle and ending), and the progression or development of ideas via the logical insertion of new information related to the main topic. A score of 9 was given to texts that logically organised information and ideas, with evidence of a clear progression throughout, that used cohesion well and showed good use of paragraphing. A lower score of about 6 was given to texts that showed overall progression of ideas and information and used cohesive devices effectively, though these may have been faulty or mechanical, with paragraphs that were not always logically connected. A score of 4 was given to texts that presented ideas and information but in which these were not well arranged, with a lack of clear progression, basic use of cohesive devices, and possibly confusing paragraphing. The lowest mark was given to those who failed to convey any message. The assessment led to the participants achieving a mean score of 4.77, SD = 1.30, and a range from 1 to 8.
From IELTS. (2019). IELTS scoring in detail. https://www.ielts.org/-/media/pdfs/writing-band-descriptors-task-2.ashx?la=en
Bamberg (1984) developed the Holistic Coherence Scale to assess coherence for a large number of essays. This used a 4-point rubric scale, with a score of 4 indicating the highest level of coherence, termed fully coherent. This suggests that the writer identifies the topic without shifting or digressing, orients the reader by creating a context or situation, organises details according to a discernible plan that is sustained throughout the essay, skilfully uses cohesive ties such as lexical cohesion, conjunction, reference, etc. to link sentences and/or paragraphs together, and often concludes with a statement that gives closure. Additionally, the discourse flows smoothly, with few or no grammatical and/or mechanical errors interrupting the reading process. Lower scores, however, indicate less coherence, with 3 suggesting partial coherence and 2 suggesting that the text is incoherent; that is, the reader is unlikely to be able to infer the topic, the writer digresses frequently and provides little orientation or organisational plan, the writer uses few cohesive ties to link the text, and there is no sense of closure, with reading being interrupted by frequent mechanical and grammatical errors. A score of 1 indicates that the text is incomprehensible. Connor and Lauer (1985) argued that the descriptors of the scale could be divided into six sub-components of coherence: Focus, Context, Organisation, Cohesion, Closure and Grammar error. Each of these sub-components is also marked on a 4-point scale, with 4 being the highest score and 1 the lowest. Both the overall coherence scale of Bamberg and the six sub-components were calculated for each essay in the present study, though the raters started with the components before giving an overall mark for the essay. Based on the description of the components, Focus in this context means there should be no irrelevant topic in the text: sentences should be developed logically or sequentially, one after the other, with no abrupt changes in topic. Context refers to a social, physical or psychological setting, and this should be clear throughout the text. Organisation suggests that the writer should start with the focus of the text and move towards a clear end. Closure should typically involve a reiteration of the writer's purpose: the writer normally concludes the argument developed throughout the body of the text. Cohesion should be used to connect the text grammatically: i.e., the use of ties such as reference, substitution, ellipsis, conjunction, collocation and lexical reiteration to connect the text together as a cohesive unit. In the case of Grammar, the writer should avoid mistakes in tenses, subject-verb agreement and punctuation. Marks from all sub-components also showed reasonable correlations (r = .7 or greater) between the raters. The assessment led to the participants being marked with a mean score of 2.63, SD = 0.64, and a range from 1 to 4.
Lautamatti (1978) proposed Topical Structure Analysis to analyse coherence at the sentence, paragraph and discourse level. It was developed to examine how topics repeat, shift, and return to earlier topics in a discourse to maintain coherence. As part of the analysis, determining topics and progressions between topics is vital. For present purposes, texts were divided into t-units, or the 'minimal terminable unit', defined as an independent clause with all its dependent clauses (Hewings & North, 2006). The analysis used t-units as a measurement unit because of their flexibility to recognise more than one topic in compound sentences. In addition, t-units have been used by researchers to analyse learners' writing quality (Knoch, 2007; Witte & Faigley, 1981). For each t-unit, the analysis then determined the t-unit topic, or theme, and identified information about the topic,
or rheme. The topic was always the semantic topic of the sentence, rather than the grammatical subject of a phrase, to allow for an analysis of coherence. The theme should be information already known to the reader, while the rheme should provide new information, with the juxtaposition of old/new or known/unknown information providing the basis on which the discourse topic is developed. This development was determined via different types of progressions through the text.
Four main types of progressions were considered in the present analysis. The first type was Parallel progressions, where the topics in successive sentences were either the same or synonyms and/or pronouns were used to link the topics. The second type, Sequential progressions, indicated semantically related but different topics in successive sentences. Typically, this occurred when a rheme part of a preceding sentence became the topic or theme of the following sentence. The third type was Extended parallel progressions, where two semantically identical topics were interrupted by at least one occurrence of a Sequential progression. These three progressions support the thematic development of the text. Parallel progressions give depth to the topic: by repeating the same topic in consecutive sentences, the writer provides additional and detailed information about the topic under discussion. Sequential progressions provide a way to extend the text by introducing new but related topics, and Extended parallel progressions remind the reader about the main topic by repeating a previously used topic. Extended parallel progressions can also be used to present a closing statement. Thus, these three progressions work to give depth, width and closure to the text. The final type used in the current analyses was Unrelated progressions, in which the topic of a sentence was not related to the theme or rheme part of either the preceding or the successive sentences. These types of progressions indicate problems with the coherence of the text. Combining the differing types of progressions between t-units in the text provided a basis on which to assess coherence. (Note that correlations in determining t-units and types of progressions between raters were .9 or greater.)
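The progression coding just described was carried out manually by the raters. Purely as an illustration of the decision logic, and not as the authors' procedure, the sketch below assumes each t-unit has already been reduced by a rater to a canonical topic label and a set of rheme concepts (synonyms and pronouns resolved), and it applies simplified, exact-match versions of the four progression definitions.

```python
from typing import Dict, List

def classify_progressions(t_units: List[Dict]) -> List[str]:
    """Label the progression between each pair of consecutive t-units.

    Each t-unit is assumed to be pre-coded as
    {"topic": canonical concept label, "rheme": set of concept labels}.
    Semantic relatedness is reduced here to exact label matches, which
    simplifies the manual judgements described in the text.
    """
    labels: List[str] = []
    for i in range(1, len(t_units)):
        prev, curr = t_units[i - 1], t_units[i]
        if curr["topic"] == prev["topic"]:
            labels.append("Parallel")
        elif curr["topic"] in prev["rheme"]:
            labels.append("Sequential")
        else:
            # Extended parallel: the topic re-appears after at least one
            # intervening Sequential progression since its last occurrence.
            earlier = [j for j in range(i - 1) if t_units[j]["topic"] == curr["topic"]]
            if earlier and "Sequential" in labels[earlier[-1]:]:
                labels.append("Extended parallel")
            else:
                labels.append("Unrelated")
    return labels

# Hypothetical, hand-coded fragment of a wildlife essay (illustration only)
units = [
    {"topic": "wildlife", "rheme": {"habitats"}},
    {"topic": "habitats", "rheme": {"forests", "rivers"}},
    {"topic": "habitats", "rheme": {"threats"}},
    {"topic": "wildlife", "rheme": {"protection"}},
]
print(classify_progressions(units))  # ['Sequential', 'Parallel', 'Extended parallel']
```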
The specific form of this assessment was based on the marking scale developed by Knoch (2007). According to Knoch, this five-point scale has been assessed for its validity, in comparison with professional raters, and for its reliability. The scale has also been used with second language learners of English (Knoch, 2007). Scores range from 4 to 8, with 8 being the highest mark, reflecting frequent Sequential progressions, infrequent but supportive Parallel progressions, few but appropriate Extended parallel progressions and no Unrelated progressions. Lower scores on the scale reflected more mixed use of progressions, particularly an increase in Unrelated progressions. The lowest score represented frequent use of Unrelated progressions and infrequent use of Sequential progressions. The assessment led to the participants being marked with a mean score of 5.88, SD = 0.64, and a range from 4 to 8.
Topic Based Analysis was developed by Todd (2016). It analyses coherence by dividing the text into t-units, tracing out references, identifying key concepts and linking these concepts through moves in the text. Todd (2016) found a high correlation between experienced teachers' marks for coherence and the number of moves per t-unit assessed via Topic Based Analysis. The same calculation was used in the present study. As above, t-units were determined as an independent clause with all of its dependent clauses. Moves were used to show the change of concept from old to new, meaning that fewer moves would be indicative of fewer concepts in the text and, hence, greater unity in the text. To identify moves, concepts within the text needed to be determined. A concept was taken as a psychological construct representing some entity in the world. The frequency of occurrence represented the importance of a concept in the text: higher frequency concepts were considered the topics of the text (de Beaugrande & Dressler, 1981). Once t-units, concepts within t-units and moves between concepts were each determined, coherence was calculated by dividing the total number of moves in the text by the total number of t-units. (Correlations between raters were .92 for t-units and .81 for the number of moves.) A higher score, therefore, represented more moves and less unity: higher scores meant less coherence. The assessment led to the participants being marked with a mean score of 1.40, SD = 0.33, and a range from 0.75 to 2.50.
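The Topic Based Analysis score itself is a simple ratio of moves to t-units, with lower values indicating greater unity. A minimal sketch of that calculation is given below; the counts used are hypothetical, since in the study the segmentation into t-units and moves was performed by the raters.

```python
def topic_based_score(num_moves: int, num_t_units: int) -> float:
    """Topic Based Analysis coherence score: moves divided by t-units.
    Lower values indicate fewer concept changes and hence greater unity."""
    if num_t_units == 0:
        raise ValueError("An essay must contain at least one t-unit.")
    return num_moves / num_t_units

# Hypothetical essay with 30 t-units and 42 moves between concepts
print(topic_based_score(42, 30))  # 1.4, matching the sample mean reported above
```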
RESULTS
The aim of the study was to investigate the potential cognitive linguistic predictors of coherence. First, correlations between the language measures were calculated to ensure there was variability between the skills assessed by each of the measures; that is, that they were not simply determined by English usage/proficiency alone. Correlations were all significant (as expected) and ranged from .34, for grammatical knowledge and phonological awareness, to .7, for grammatical knowledge and morphological processing. Morphological processing also produced a correlation with vocabulary of .67 but was less related to phonological and orthographic awareness (.43 and .55 respectively). These correlations were consistent with common elements between the measures but indicated that they were not simply determined by proficiency. Similar correlational analyses for the coherence scales also indicated variability, with the largest correlation being between the IELTS scores and the Holistic Coherence Scale (r = .59). There was a reasonable relationship between the Topic Based Analysis and the measures of Topical Structure Analysis (-.44) and the Holistic Coherence Scale (-.45), with the other correlations ranging in size from .32 to .36. Again, these were indicative of some common construct, but variability in the way the construct was conceptualised.
Relationships between Language Skills Measures and Coherence Measures
Table 1 displays the correlations between the five language measures and the four measures of coherence used in the study. The results indicated that measures of vocabulary size, morphological processing, and grammatical knowledge overall produced higher correlations than those found for the phonological and orthographic awareness measures. However, the size of the correlations varied across the four coherence measures utilised in the study. Morphological processing, vocabulary size and grammatical knowledge showed medium-sized correlations (i.e., around .3 to .5) with the IELTS assessment and the Holistic Coherence Scale. Morphological processing produced the largest correlation with the IELTS scores (.51) and also produced the second largest with the Holistic Coherence Scale (.46), though the correlation between the Holistic Coherence Scale and the grammatical knowledge measure was almost identical (r = .45). Scores produced via the Topical Structure Analysis and the Topic Based Analysis, however, showed small correlations (less than .3) with morphological awareness, vocabulary knowledge and grammatical competence (note that negative correlations with Topic Based Analysis are because lower scores were indicative of higher levels of coherence). The measures of phonological and orthographic awareness also showed generally small correlations (less than .3) with the coherence measures, except for the correlations with the Holistic Coherence Scale (.33 and .35), and the correlations with the Topical Structure Analysis were near zero. These findings suggest that coherence in adult ESL learners' writing is more likely associated with morphological processing, vocabulary size and grammatical knowledge than with phonological and orthographic awareness.
Table 1
Correlations between the Language Skills Measures and the Coherence Measures
                             Morphological    Phonological    Orthographic    Vocabulary    Grammatical
                             Processing       Awareness       Awareness       Size          Knowledge
IELTS                        .51              .22             .27             .35           .44
Holistic Coherence Scale     .46              .33             .35             .38           .45
Topical Structure Analysis   .20              .00             -.02            .21           .19
Topic Based Analysis         -.20             -.11            -.19            -.08          -.18
Note: Correlations in bold are significant at the .01 level, those in italics are significant at the .05 level
These correlational results were consistent with the findings of regression analyses assessing the level of prediction offered by the five language skills measures for each of the coherence analyses. The results (see Table 2) present the total variability explained by the five language measures along with the standardised beta scores and associated t and p values for each measure. Total variability explained was larger for the two more reader-based assessments (27-28%) but provided little explanation for the two more text-based analyses (10% for the Topical Structure Analysis, and not significantly greater than zero for the Topic Based Analysis). The morphological processing measure showed the largest beta score, for the IELTS scores. Grammatical knowledge also showed a significant beta score for the Holistic Coherence Scale, but generally beta scores were around .2 or less.
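The paper does not state which statistical package produced the regressions reported in Table 2. As a sketch of the kind of analysis implied, assuming the 126 participants' scores sit in a pandas DataFrame with hypothetical column names, the code below standardises all variables so the fitted coefficients are comparable to standardised betas, and checks variance inflation factors as in the collinearity note to Table 2.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical column names for the five language measures
PREDICTORS = ["morphology", "phonology", "orthography", "vocabulary", "grammar"]

def standardised_regression(df: pd.DataFrame, outcome: str):
    """Regress a coherence score on the five language measures using z-scores,
    so the fitted coefficients correspond to standardised betas."""
    cols = PREDICTORS + [outcome]
    z = (df[cols] - df[cols].mean()) / df[cols].std(ddof=1)   # standardise all variables
    X = sm.add_constant(z[PREDICTORS])
    model = sm.OLS(z[outcome], X).fit()
    vifs = {p: variance_inflation_factor(X.values, i + 1)     # i + 1 skips the constant
            for i, p in enumerate(PREDICTORS)}
    return model.rsquared, model.params[PREDICTORS], vifs

# Usage (df would hold one row per participant; the outcome name is hypothetical):
# r2, betas, vifs = standardised_regression(df, "ielts_coherence")
```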
Relationships between Language Skills Measures and Sub-components of Coherence Measures
Table 3 shows the correlations between the five language measures and the sub-components of the measures of coherence used in this study (note that the IELTS score was a whole score and did not have sub-components). The results indicated that both sub-components of the Topic Based Analysis (number of moves and t-units) produced small or near-zero correlations with the language skills measures.
The findings also revealed relatively small correlations (less than .3) between the sub-components of the Topical Structure Analysis and the language skills measures, with the Unrelated progression measure producing the largest correlation, with vocabulary size (r = -.29). This suggests a trend for those with smaller vocabularies to produce more Unrelated progressions in their written text. The largest correlations overall were found with the sub-components of the Holistic Coherence Scale. These were again largest for the measure of morphological processing, though vocabulary size and grammatical competence also produced reasonably sized correlations with the different sub-components. Focus, Organisation and Closure showed correlations with at least one language measure above .4, whereas Context, Cohesion and Grammar errors generally showed small correlations with all the language skills measures. Overall, these correlations support the argument for associations between several sub-components of this coherence scale and the more meaning-based language skills (vocabulary and morphology), along with a grammatical understanding of English. However, generally, the findings again suggest small relationships between the five language skills assessed in the current study and the different aspects of coherence assessed by the four coherence analyses performed.
DISCUSSION
The findings suggested stronger relationships between morphological awareness and vocabulary knowledge, on the one hand, and coherence, on the other.
Table 2
Results of Regression Analyses for the IELTS Scores, the Holistic Coherence Scale (HCS), the Topical Structure Analysis (TSA) and the Topic Based Analysis (TBA)
                              IELTS                  HCS                    TSA                    TBA
Total variability explained   R2 = .28               R2 = .27               R2 = .10               R2 = .06
                              F = 9.15, p < .001     F = 8.72, p < .001     F = 2.52, p = .03      F = 1.52, p = .19
Morphological Processing      Beta = .41             Beta = .18             Beta = .12             Beta = -.17
                              t = 3.14, p = .002     t = 1.37, p = .17      t = 0.84, p = .40      t = -1.14, p = .26
Phonological Awareness        Beta = .02             Beta = .12             Beta = -.07            Beta = -.00
                              t = 0.16, p = .87      t = 1.24, p = .22      t = -0.69, p = .49     t = -0.03, p = .97
Orthographic Awareness        Beta = -.05            Beta = .05             Beta = -.21            Beta = -.13
                              t = -0.49, p = .63     t = 0.50, p = .62      t = -1.84, p = .07     t = -1.12, p = .27
Vocabulary Size               Beta = .01             Beta = .08             Beta = .20             Beta = .13
                              t = 0.05, p = .96      t = 0.68, p = .50      t = 1.61, p = .11      t = 1.07, p = .29
Grammatical Knowledge         Beta = .17             Beta = .22             Beta = .13             Beta = -.07
                              t = 1.51, p = .13      t = 2.00, p = .05      t = 1.07, p = .29      t = -0.52, p = .61
Note: Collinearity statistics suggested no problems with multicollinearity in any of the analyses (i.e., tolerance scores were 0.35 or greater and VIF scores were all less than 3)
Table 3
Correlations between the Language Skills measures and the Sub-components of Coherence Measures
                                 Morphological    Phonological    Orthographic
                                 Processing       Awareness       Awareness
Focus                            .51              .33             .33
Context                          .33              .26             .24
Organisation                     .49              .30             .21
Closure                          .41              .21             .19
Cohesion                         .34              .18             .23
Grammar                          -.33             -.10            -.19
Parallel progressions            .23              .18             .23
Sequential progressions          .23              .05             .01
Extended Parallel progressions   -.10              -.10            -.09
Unrelated progressions           -.22             -.11            -.05
Number of t-units                .05              .02             .11
Number of moves                  -.05             -.05            .00
Number of words                  .26              .15             .20
Note: Correlations in bold are significant at the .01 level, those in italics are significant at the .05 level
The relationship between morphological awareness and the development of coherence in adult ESL learners' writing may arise because both are involved in the processing or production of meaning, and perhaps in the awareness of how to process or produce meaning from different units of meaning: affixes and roots versus concepts, phrases, sentences, and paragraphs. Several studies have addressed the role of morphological awareness in the development of spelling (Berninger et al., 2010; McCutchen et al., 2014), vocabulary knowledge (Mochizuki & Aizawa, 2000), and text generation (McCutchen & Stull, 2015; Northey et al., 2016). One potential relationship here is that morphologically constructed words, such as 'firstly' and 'secondly', may help writers to organise the development of text from sentence to paragraph, and on to the discourse level.
Vocabulary knowledge was also found to be a small but potentially significant predictor of different aspects of coherence, after morphological awareness. Vocabulary knowledge helps learners to express a message using a variety of words. Like morphological awareness, vocabulary knowledge should also help learners produce a meaningful text. Thus, both coherence and vocabulary are meaning-related, and this may be the reason for the significant correlation between them. These findings also seem consistent with the correlation between Sequential progression and vocabulary knowledge: for Sequential progressions, writers need a good range of vocabulary in order to describe the same topic using new words. It is worth noting that the correlation between Sequential progressions and morphological awareness is similar to that between Sequential progressions and vocabulary knowledge and, therefore, there may be alternative explanations for these inter-relations that future research could identify.
Again, considering the overall correlation results, grammatical competence seems to show the next largest relationship with the measures of coherence, after morphological awareness and vocabulary knowledge. Grammatical competence is the knowledge needed to produce well-structured sentences, which helps readers comprehend meaning with ease. Under these assumptions, grammatical competence may be expected to show relationships with coherence, as it may help to organise the text. The present study, as well as past studies such as Ahmad et al. (2019), Garing (2014) and Saeed (2020), found that the organisation of the text bears a high correlation with coherence.
Schleppegrell's (2004) work may help to explain why grammatical knowledge correlates with coherence at a lower level than morphological awareness. She asserted that the grammar of everyday communication differs from that of academic discourse: the latter is more formal, better organised, and more elaborated through nouns and noun phrases rather than clause structures. The academic form of the language may develop through cognitive functioning and language/writing practice. However, the ESL participants of this study may not have been proficient enough in the academic form of writing for this to support the development of grammatical competence. Hence, even those with good coherence scores may not have scored well on the grammar tasks used in the current study, leading to smaller correlations between the writing and grammatical tasks. The prediction here would be that with improved academic skills, more experience of complex sentence structure will follow, which will in turn support grammatical competence. Hence, it is likely that a reciprocal relationship exists such that grammar skills will
support coherent writing, but equally text experience will help improve grammar skills.
However, from the results charted in the present study, phonological and orthographic awareness exhibited smaller correlations across the four measures of coherence than morphological awareness, vocabulary knowledge and grammatical competence. One possible reason might be related to the methods of teaching English in government institutions in Pakistan (where the research was conducted). Most Pakistani government institutions teach English predominantly through a grammar-translation method (Ahmed, 2019; Shamim, 2008). Given that most English teaching takes place in the local language, learners have less chance of listening to the target language, a factor which may consequently affect their learning of sound patterns in the target language, and their development of phonological awareness in that language. If a skill is not developed, then it is unlikely to be used in writing production.
The current research also considered the relationships between the linguistic skills measures and the sub-component parts of each measure of coherence. The sub-component parts of the Holistic Coherence Scale that showed mainly medium-sized correlations with morphological awareness, vocabulary knowledge and grammatical competence were Focus, Organisation and Closure. A good range of vocabulary related to the main topic is required to keep the text unified, and given that morphological awareness and vocabulary knowledge are both related to meaning, complementary interactions between these skills and Focus are likely. Organisation of the text refers to the step-by-step description of events, which makes the text logical and meaningful, as compared to a random collection of events which makes no sense. Morphological awareness supports correct word structure through knowledge of affixation and thus enhances vocabulary knowledge. For example, discourse markers such as 'finally' and 'consequently' help to organise the text and require the proper use of affixation.
As a measure of coherence, Topical Structure Analysis exhibited stronger relationships with morphological awareness, vocabulary knowledge and grammatical competence than with phonological and orthographic awareness. Its sub-components, Sequential progression and Unrelated progression, produced larger correlations with morphological awareness and vocabulary knowledge, whereas Parallel progression showed larger correlations with phonological and orthographic awareness. Most of these correlations were small (the largest being .31 between vocabulary and Unrelated progression), with many being non-significant. Based on these findings, it is possible to suggest that the language skills assessed in the current study do not substantially support the production of these types of progressions. The stronger association of Sequential progression and Unrelated progression with morphological awareness and vocabulary knowledge indicates that these two progressions are more related to meaning, whereas the stronger association of Parallel progression with Cohesion and Organisation suggests that Parallel progression is more related to the connectivity and structure of the text than to meaning.
Limitations of the Study
The present study is limited in that only descriptive essays were used. Previous studies, such as that of Ghazanfari et al. (2011), have documented the effects of different genres on coherence analysis, pointing out that different genres of writing employ different structures. Therefore, using other genres of essay may produce different results.
The present study administered a general vocabulary measure, which showed the highest level of correlation with coherence. Subsequent studies might usefully investigate depth and breadth of vocabulary as separate predictors of coherence. This would help clarify whether coherence is more associated with in-depth understanding of words or with the number of words known.
Gender has been documented to have effects on language learning; female learners have been found to be better language learners than males (Saeed et al., 2011). Future studies could address the effect of gender differences on coherence in adult ESL learners' writing and the reasons behind any such differences; for instance, the two genders may differ in the strategies they employ to write a coherent text.
CONCLUSION
The purpose of the study was to identify predictors of coherence in adult ESL learners' writing. The study identified mainly medium-sized correlations between the reader-based measures of coherence and the language skills of morphological awareness, vocabulary knowledge and grammatical competence in adult ESL learners' writing. However, the coherence-language skill correlations were generally small in the analyses involving the text-based measures. Phonological and orthographic awareness measures showed mainly small relationships with all coherence measures used in the study. These findings suggest that more meaning-related language skills were more likely to support coherent text production than those focused on linking written and spoken language, though the variability in relationships across the types of coherence measures warrants further research. Some sub-components of the coherence measures did produce medium-sized relationships with these same meaning-related language skills, but again the main relationships were with sub-components of the reader-based coherence measure. These differences suggest that components of coherence measures may be somewhat independent of those language skills associated with writing ability. Further research
identifying the skills that may lead to better scores in coherence assessments, and which support the development of these skills in second language learning contexts, would seem necessary to inform practices used in writing classes.
ACKNOWLEDGEMENT
This study was funded entirely by the researchers themselves.
AUTHOR CONTRIBUTIONS
Abdul Saeed: conceptualization, data curation, investigation, methodology, project administration, validation, visualization, writing-original draft, formal analysis.
John Everatt: conceptualization, formal analysis, methodology, project administration, supervision, validation, visualization, writing-original draft, writing-review & editing.
Amir Sadeghi: conceptualization, methodology, project administration, resources, supervision, validation, writing-review & editing.
Athar Munir: data curation, investigation, methodology, project administration, resources, validation, writing-review & editing.
DECLARATION OF COMPETING INTEREST
None declared.
REFERENCES
Abu-Rabia, S. (2001). Testing the interdependence hypothesis among native adult bilingual Russian-English students. Journal of Psycholinguistic Research, 30(4), 437-455. https://doi.org/10.1023/A:1010425825251
Ahmed, F. E. Y. (2019). Errors of unity and coherence in Saudi Arabian EFL university students' written paragraph - A case study of College of Science & Arts, Tanumah, King Khalid University, Kingdom of Saudi Arabia. European Journal of English Language Teaching, 4(3), 125-155. https://dx.doi.org/10.5281/zenodo.321555
Ahmad, M., Mahmood, M. A., & Siddique, A. R. (2019). Organisational skills in academic writing: A study on coherence and cohesion in Pakistani research abstracts. Languages, 4(4), 1-26. https://doi.org/10.3390/languages4040092
Alsagoafi, A. A. (2013). An investigation into the construct validity of an academic writing test in English with special reference to the academic writing module of the IELTS test [Unpublished doctoral thesis]. University of Exeter. https://ore.exeter.ac.uk/repository/handle/10871/10121
Attelisi, A. A. S. (2012). The impact of teaching topical structure analysis on EFL writing with special reference to undergraduate students in Libya [Unpublished doctoral thesis]. Newcastle University. https://theses.ncl.ac.uk/jspui/handle/10443/1619
Bamberg, B. (1984). Assessing coherence: A reanalysis of essays written for the National Assessment of Educational Progress, 1969-1979. Research in the Teaching of English, 18(3), 305-319. https://www.jstor.org/stable/40171021
Berninger, V. W., Abbott, R. D., Nagy, W., & Carlisle, J. (2010). Growth in phonological, orthographic, and morphological awareness in grades 1 to 6. Journal of Psycholinguistic Research, 39(2), 141-163. https://doi.org/10.1007/s10936-009-9130-6
Berninger, V. W., Nagy, W., & Beers, S. (2011). Child writers' construction and reconstruction of single sentences and construction of multi-sentence texts: Contributions of syntax and transcription to translation. Reading and Writing, 24(2), 151-182. https://doi.org/10.1007/s11145-010-9262-y
Candelo, J. E., Soto, J. D., Torres, L., Schettini, N., Calle, M., Garcia, L., & de Castro, A. (2018). Coherence and cohesion issues in argumentation documents written by engineering students. In Proceedings of the 2018 IEEE Global Engineering Education Conference (EDUCON) (pp. 156-160). IEEE. https://doi.org/10.1109/EDUCON.2018.8363222
Chiang, S. (2003). The importance of cohesive conditions to perceptions of writing quality at the early stages of foreign language learning. System, 31(4), 471-484. https://doi.org/10.1016/j.system.2003.02.002
Connor, U., & Lauer, J. (1985). Understanding persuasive essay writing: Linguistic/rhetorical approach. Text, 5(4), 309-326. http://hdl.handle.net/1805/2662
Crossley, S.A., & McNamara, D.S. (2010). Cohesion, coherence, and expert evaluations of writing proficiency. In R. Catrambone & S. Ohlsson (Eds.), Proceedings of the 32nd annual conference of the Cognitive Science Society (pp. 984-989). Cognitive Science Society.
de Beaugrande, R. A., & Dressler, W. U. (1981). Introduction to text linguistics. Longman.
Diamond, K. E., Gerde, H. K., & Powell, D. R. (2008). Development in early literacy skills during the pre-kindergarten year in Head Start: Relations between growth in children's writing and understanding of letters. Early Childhood Research Quarterly, 23(4), 467-478. https://doi.org/10.1016/j.ecresq.2008.05.002
Garing, A. G. (2014, March). Coherence in argumentative essays of first-year College of Liberal Arts students at De La Salle University. In DLSU Research Congress (pp. 1-15). DLSU press.
Ghazanfari, M., Alavi, S. Z., & Ghabanchi, Z. (2011). The relationship between types of paragraphs and topic progression used in paragraphs written by Iranian EFL students. Journal of International Education Research, 7(4), 39-46. https://doi.org/10.19030/jier.v7i4.8003
Halliday, M. A. K., & Hasan, R. (2014). Cohesion in English. Routledge.
Harrison, G. L., Goegan, L. D., Jalbert, R., McManus, K., Sinclair, K., & Spurling, J. (2016). Predictors of spelling and writing skills in first- and second-language learners. Reading and Writing, 29(1), 69-89. https://doi.org/10.1007/s11145-015-9580-1
Hewings, A., & North, S. (2006). Emergent disciplinarity: A comparative study of Theme in undergraduate essays in geography and history of science. In R. Whittaker, A. McCabe, & M. O'Donnell (Eds.), Language and literacy: Functional approaches (pp. 266-281). Continuum.
Jones, S., Myhill, D., & Bailey, T. (2013). Grammar for writing? An investigation of the effects of contextualised grammar teaching on students' writing. Reading and Writing, 26(8), 1241-1263. https://doi.org/10.1007/s11145-012-9416-1
Khalil, A. (1989). A study of cohesion and coherence in Arab EFL college students' writing. System, 17(3), 359-371. https://doi.org/10.1016/0346-251X(89)90008-0
Kılıç, M., Genç, B., & Bada, E. (2016). Topical structure in argumentative essays of EFL learners and implications for writing classes. Journal of Language and Linguistic Studies, 12(2), 107-116. https://dergipark.org.tr/en/pub/jlls/issue/36115/405544
Kim, M., & Crossley, S. A. (2018). Modeling second language writing quality: A structural equation investigation of lexical, syntactic, and cohesive features in source-based and independent writing. Assessing Writing, 37, 39-56. https://doi.org/10.1016/j.asw.2018.03.002
Kim, Y.-S., Al Otaiba, S., Sidler, J. F., & Gruelich, L. (2013). Language, literacy, attentional behaviors, and instructional quality predictors of written composition for first graders. Early Childhood Research Quarterly, 28(3), 461-469. https://doi.org/10.1016/j.ecresq.2013.01.001
Knoch, U. (2007). 'Little coherence, considerable strain for reader': A comparison between two rating scales for the assessment of coherence. Assessing Writing, 12(2), 108-128. https://doi.org/10.1016/j.asw.2007.07.002
Lautamatti, L. (1978). Observations on the development of the topic in simplified discourse. Afinla-import, 8(22), 71-104.
Lee, I. (2002). Teaching coherence to ESL students: A classroom inquiry. Journal of Second Language Writing, 11(2), 135-159. https://doi.org/10.1016/S1060-3743(02)00065-6
Lee, S. H. (2003). ESL learners' vocabulary use in writing and the effects of explicit vocabulary instruction. System, 31(4), 537-561. https://doi.org/10.1016/j.system.2003.02.004
Mackenzie, N., & Hemmings, B. (2014). Predictors of success with writing in the first year of school. Issues in Educational Research, 24(1), 41-54. https://search.informit.org/doi/10.3316/informit.352676063609817
Masadeh, T. S. (2019). Cohesion and coherence in the writings of Saudi undergraduates majoring in English. Journal of Social Sciences and Humanities, 5(3), 200-208. http://www.aiscience.org/journal/paperInfo/jssh?paperId=4522
McCutchen, D., & Stull, S. (2015). Morphological awareness and children's writing: Accuracy, error, and invention. Reading and Writing, 28(2), 271-289. https://doi.org/10.1007/s11145-014-9524-1
McCutchen, D., Stull, S., Herrera, B. L., Lotas, S., & Evans, S. (2014). Putting words to work: Effects of morphological instruction on children's writing. Journal of Learning Disabilities, 47(1), 86-97. https://doi.org/10.1177/0022219413509969
McCulley, G. A. (1985). Writing quality, coherence, and cohesion. Research in the Teaching of English, 19(3), 269-282. http://www.jstor.org/stable/40171050
McKenna, M. J. (1988). The development and validation of a model for text coherency. https://eric.ed.gov/?id=ED302830
Mochizuki, M., & Aizawa, K. (2000). An affix acquisition order for EFL learners: An exploratory study. System, 28(2), 291-304. https://doi.org/10.1016/S0346-251X(00)00013-0
Moore, T. J., & Morton, J. (2007). Authenticity in the IELTS academic module writing test: a comparative study of task 2 items and university assignments. In L. Taylor & P. Falvey (Eds.), Studies in language testing 19: IELTS collected papers (pp. 197 - 248). Cambridge University Press.
Müller, A., & Daller, M. (2019). Predicting international students' clinical and academic grades using two language tests (IELTS and C-test): A correlational research study. Nurse Education Today, 72, 6-11. https://doi.org/10.1016/j.nedt.2018.10.007
Northey, M., McCutchen, D., & Sanders, E. A. (2016). Contributions of morphological skill to children's essay writing. Reading and Writing, 29(1), 47-68. https://doi.org/10.1007/s11145-015-9579-7
O'Brien, I., Segalowitz, N., Collentine, J., & Freed, B. (2006). Phonological memory and lexical, narrative, and grammatical skills in second language oral production by adult learners. Applied Psycholinguistics, 27(3), 377-402. https://doi.org/10.1017/S0142716406060322
Pearson, W. S. (2018). Written corrective feedback in IELTS writing task 2: Teachers' priorities, practices, and beliefs. TESL-EJ, 21(4), 1-32. https://eric.ed.gov/?id=EJ1172568
RahmtAllah, E. A. E. (2020). EFL students' coherence skill in writing: A case study of third-year students of bachelors in English language. English Language Teaching, 13(8), 120-126. https://doi.org/10.5539/elt.v13n8p120
Roessingh, H., Elgie, S., & Kover, P. (2015). Using lexical profiling tools to investigate children's written vocabulary in grade 3: An exploratory study. Language Assessment Quarterly, 12(1), 67-86. https://doi.org/10.1080/15434303.2014.936603
Saeed, A. (2020). Cognitive predictors of coherence in adult ESL learners' writing [Unpublished doctoral dissertation]. University of Canterbury. http://dx.doi.org/10.26021/2710
Saeed, A., Ghani, M., & Ramzan, M. (2011). Gender difference and L2 writing. International Research Journal of Arts & Humanities (IRJAH), 39(39). https://sujo-old.usindh.edu.pk/index.php/IRJAH/article/view/1148/1064
Schleppegrell, M. J. (2004). The language of schooling: A functional linguistics perspective. Routledge. https://doi.org/10.4324/9781410610317
Schoepp, K. (2018). Predictive validity of the IELTS in an English as a medium of instruction environment. Higher Education Quarterly, 72(4), 271-285. https://doi.org/10.1111/hequ.12163
Shamim, F. (2008). Trends, issues and challenges in English language education in Pakistan. Asia Pacific Journal of Education, 28(3), 235-249. https://doi.org/10.1080/02188790802267324
Todd, R. W. (2016). Discourse topics. John Benjamins Publishing Company.
van Dijk, T. A. (1977). Semantic macro-structures, knowledge frames, and discourse comprehension. In M. A. Just & P. Carpenter (Eds.), Cognitive processes in comprehension (pp. 3-32). Erlbaum.
Wang, Y., Yin, L., & McBride, C. (2015). Unique predictors of early reading and writing: A one-year longitudinal study of Chinese kindergarteners. Early Childhood Research Quarterly, 32, 51-59. https://doi.org/10.1016/j.ecresq.2015.02.004
Witte, S. P., & Faigley, L. (1981). Coherence, cohesion, and writing quality. College Composition and Communication, 32(2), 189-204. https://doi.org/10.2307/356693
Wong, A. S. C. (2012). An investigation of the predictors of L2 writing among adult ESL students [Unpublished doctoral thesis]. University of Canterbury. http://dx.doi.org/10.26021/9887
Zhang, D., & Koda, K. (2012). Contribution of morphological awareness and lexical inferencing ability to L2 vocabulary knowledge and reading comprehension among advanced EFL learners: Testing direct and indirect effects. Reading and Writing, 25(5), 1195-1216. https://doi.org/10.1007/s11145-011-9313-z
Zhao, J. (2011). Spelling English words: Contributions of phonological, morphological and orthographic knowledge in speakers of English and Chinese [Unpublished doctoral dissertation]. Texas A&M University.