Self-assessment of Writing in a Portfolio Program: A Case of Iranian EFL Learners (Research Paper)

Document Type : Original Article

Authors

1 Department of English, Farhangian University, Tehran, Iran

2 Department of English Language, University of Isfahan, Isfahan, Iran


Self-assessment of Writing in a Portfolio Program: A Case of Iranian EFL Learners

[1]Behrooz Ghoorchaei*

[2]Mansoor Tavakoli

IJEAP- 1812-1318

 

Abstract

With the move away from the psychometric and integrative language testing paradigms toward the communicative language testing paradigm, experts in writing and assessment have been concerned with creating conditions under which learners can experience and display writing authentically, so that their writing resembles what they produce in non-test situations. As an alternative assessment option, portfolio-based writing assessment has recently received considerable attention in the field of ESL/EFL writing. However, self-assessment, a cornerstone of portfolio pedagogy, has received scant empirical attention in the second/foreign language teaching literature. The present study therefore compared teacher assessment and students' self-assessment of writing in a portfolio program in the Iranian EFL context. It also aimed at eliciting students' perceptions of self-assessment. To analyze the data, independent samples t-tests were used. The results showed a significant difference between the teacher's assessment and students' self-assessment at both the beginning and the end of the portfolio program. Students' perceptions were also elicited through interviews. The results seem to have implications for curriculum development and for the teaching and assessment of L2 writing.

Keywords: Alternative assessment, Essay writing, Portfolio assessment, Self-assessment

1. Introduction

With the dissemination of process approaches to writing, there has been a shift of interest from traditional norm-referenced summative assessment of writing to formative, learner-directed assessment, from bureaucratic testing to democratic assessment, from outcomes-based grading to process-based assessment in which assessment is aimed at improving learning and teaching. Traditional tests are incongruent with current practices in EFL writing classes. As Barootchi and Keshavarz (2002) argue, such tests fail to provide rich descriptive information about the product and process of learning. Because of the incompatibility of process learning and product assessment and the difference between information needed and the information derived through traditional tests, alternative ways of assessment are essential and a paradigmatic shift in assessment is required (Gottlieb, 1995).

Among alternative assessment options, portfolios have received considerable attention in the field of education. However, little research has been done on the use of portfolios with college students in the EFL context (Yang, 2003). Similarly, self-assessment as a measure of language learning has been the focus of increasing interest in the field of language education. Although there is ample research in first language pedagogy suggesting that self-assessment is pedagogically beneficial, few EFL/ESL studies have brought to the fore the question of how effectively students can function as raters (Matsuno, 2009). Moreover, there is a scarcity of empirical research on self-assessment in portfolio-based writing programs. Thus, this study dealt with self-assessment as a cornerstone of portfolio pedagogy to examine the difference between self- and teacher assessment of writing in a portfolio-based writing program. Attempts were also made to find out students' opinions about self-assessment in the program.

 

2. Review of the Related Literature

Constructivist learning is popular in the world of education today because it is perceived as a more natural, relevant, productive, and empowering framework (Buyukduman & Sirin, 2010). In constructivist education, learning is constructed by learners rather than transmitted by teachers. In the past, learning was regarded as the acquisition of knowledge and skills, but it is now considered socially derived and situated, constructed and developed in interactions with others (Vygotsky, 1986). In Williams and Burden's (1997) social constructivist model, four key factors influence the learning process: teachers, learners, tasks, and contexts, which interact with one another as part of a dynamic process.

If the purpose of instruction and assessment is meaningful learning, learning has to be about constructing knowledge rather than the acquisition of knowledge (Mayer, as cited in Struyven, Dochy, Janssens, Schelfhout & Gielen, 2006). Although constructivism has been considered a learning theory, several pedagogical principles are drawn from it to facilitate learning and assessment (Buyukduman & Sirin, 2010). Jonassen (1991) introduced some points related to appropriate assessment in constructivist learning theory:

  1. Technology can and will force the issue of constructivism.
  2. Assessment will have to be outcome-based and student-centered.
  3. Assessment techniques must be developed which reflect instructional outcomes.
  4. Grades must be contracted where they are required.
  5. There must be non-graded options and portfolio assessment.
  6. There must be self- and peer evaluation as well as teacher assessment.
  7. Performance standards must be developed.
  8. A grading system must be developed which provides meaningful feedback.        
  9. Students will be videotaped as they work as part of their portfolio; and
  10. The focus must be on originality rather than regurgitation; it is important to evaluate how the learner goes about constructing his or her own knowledge rather than the product.  

The last decade of the second millennium witnessed widespread experimentation with learner-centered alternative assessment methods. The concern of assessment is now the ongoing assessment of students' efforts and contribution to the learning process (Ross, 2005). Assessment is viewed as a dynamic, ongoing process inextricably linked to teaching and learning. In constructivist education, assessment occurs simultaneously with the learning process (Marlowe & Page, 2005); it is part of teaching and learning, not the end result of them.

2.1. Traditional vs. Alternative Assessment

As the role of writing in second/foreign language education increases, there is a greater need to develop more reliable and valid ways of gauging writing ability (Weigle, 2002). In the past, an individual's writing ability was judged indirectly through multiple-choice tests of grammar and usage, but as Hughes (2003) argues, the best way to test an individual's writing ability is to get them to write, i.e., direct testing of writing ability.

Traditional approaches to writing assessment focused mainly on the cognitive aspects of writing. However, writing should be viewed not only as a product but also as a "social and cultural act" (Weigle, 2002, p. 19); learning to write involves more than learning vocabulary and grammar or the rhetorical forms of academic writing (Weigle, 2002). Thus, there has been a trend in language assessment away from highly decontextualized test designs toward alternative assessment procedures like self- and peer-assessment, journals, portfolios, and conferences, which are "more authentic in their elicitation of meaningful communication" (Brown, 2001, p. 405). Because tests make students compete with one another, they do not afford peer learning. Also, students have limited time to perform on tests, which makes them nervous and anxious. As a result, alternative assessment options like portfolios have come to the fore to remedy the deficiencies of tests (Ustunel & Deren, 2010). These new modes of assessment tell students that their success depends not on how much but on how well they have learned (Hargreaves, as cited in Struyven et al., 2006).

2.2. Portfolio Assessment

The portfolio is viewed as one of the most popular alternatives in assessment, especially within the framework of communicative language teaching (Brown, 2004). In the field of foreign language teaching and learning, it is regarded as an alternative assessment tool used to provide opportunities for absorbing language authentically and actively, and for assessing learner progress (Delett, Barnhardt & Kevorkian, 2001). According to Hyland (2003), in the ESL writing context, portfolios are a response to testing situations in which students produce one-shot pieces of writing with no opportunity to select the topic or revise, disadvantaging L2 writers who need longer to perform such tasks. Hyland asserts that "portfolio evaluation reflects the practice of most writing courses where students use readings and other sources of information as a basis for writing and revise and resubmit their assignments after receiving feedback from teachers or peers" (p. 233).

According to Tiwari and Tang (2003), there is an assumption in the literature that during the process of preparing portfolios, learning is fostered as students are encouraged to reflect on their practice, identify learning needs, and initiate further learning. To realize the full potential of portfolios, as the authors argue, the assumption should be supported with empirical evidence.

2.3. Self-assessment in the EFL/ ESL Writing Context

Self-assessment denotes students' evaluation of their own performance at various points in a course (Coombe, Folse & Hubley, 2007). There has been increasing interest in self-assessment in higher education since the turn of the century. Boud (2000) points out that for the development of lifelong learning skills, assessment must move from the exclusive domain of assessors into the hands of learners (p. 151). Little (2005) mentions three reasons for engaging learners in self-assessment. First, in a learner-centered curriculum, students should be involved not only in decisions about the content of the curriculum but also in the process of assessing curriculum outcomes, including their learning achievement. Second, in learner-centered pedagogies aimed at developing learner autonomy, self-assessment plays a major role in shaping and directing the reflective process. Third, to the extent that languages learned in formal contexts are to be used beyond the classroom, accurate self-assessment is an essential part of the program that allows learners to turn occasions of target language use into opportunities for further explicit learning.

Janssen-van Dieten (2000) reports the results of a study investigating the validity of self-assessment carried out by migrants with a low educational level. The students were trained to carry out self-assessment of their progress. The results showed that training led to more accurate assessment and also to higher overall scores on the writing test; however, there was no statistically significant difference between the control and experimental groups. In a longitudinal study, El-Koumy (2004) investigated the effect of self-assessment of writing product and process on EFL students' writing. He found that self-assessment of the writing process significantly improved the students' quantity of writing, as measured by a simple count of total words in each composition, and that self-assessment of the writing product significantly improved the students' quality of writing, as measured by the number of words in error-free T-units.

2.4. Portfolios and Student Self-assessment

Self-assessment and reflection are viewed by Gottlieb (2000) as the cornerstones of learner-directed assessment. Delett et al. (2001) maintain that the use of portfolios leads to an interactive assessment process that involves both students and teachers and forms a partnership in the learning process. When used interactively, portfolios provide students with a sense of involvement in and control over their learning (Genesee & Upshur, 1996). According to Genesee and Upshur, students are viewed not as objects of evaluation but as agents of reflection and evaluation: portfolios encourage learners to reflect upon their own learning, to assess their strengths and weaknesses, and to identify their own learning goals. Wu (2005) reckons that portfolio assessment emphasizes long-term learning and multiple assessment, and encourages self-assessment and reflection, learner-teacher interaction, and reader evaluation and interaction. Similarly, Kuo and Popham, both cited in Chang (2008), consider self-assessment one of the most important elements of portfolio assessment.

Self-assessment in the literature on portfolio pedagogy is of two types, which can be referred to as formative and summative. Formative self-assessment refers to students' reflections on and self-evaluation of their progress during the implementation of portfolios. As Gottlieb (2000) notes, multiple opportunities for student self-assessment within instruction help students become independent learners while acquiring English. Also, Camp and Levine (1991) point out that reflection reveals hidden aspects of learning which are unknown even to the student writers themselves. Discussing portfolio-based reflection, D'Aoust (1992) argues that "reflection is the act of pausing to see oneself as a writer. It enables a writer to celebrate his or her strengths as well as identify areas to be developed" (p. 43). As an example of formative self-assessment in a portfolio program, one could refer to Elahinia's (2004) study, in which the participants self-assessed their writing pieces during the course using a self-assessment checklist developed by the researcher. The results showed a high correlation (.73) between teacher assessment and self-assessment in the portfolio program. In another experimental study, Sajedi (2014) investigated the effect of self-assessment on EFL students' composition skills. The experimental group assessed their writing based on a self-assessment guide which asked them to rate their compositions on subskills including organization, content and structure, and grammar. The results showed that students in the experimental group significantly outperformed those in the control group in terms of composition.

Summative self-assessment is done after the portfolio is complete. Learners can be asked to include - for instance in an introductory letter to their portfolios - their evaluative account of the portfolio experience (Hirvela & Pierson, 2000). The letter and the portfolio provide the students with "a valuable opportunity to review their effort and progress during the course, and also a time to reflect on the learning styles that brought them to this point" (p. 124). Also, they might be asked to rate their portfolios against a checklist which was used to guide the portfolio process from the beginning (Little, 2005). As Little points out, this approach is certainly limited to the immediate context of learning and may apply criteria that have little relation to the criteria of external assessment. As an example of summative self-assessment, one could refer to Ballard (1992) who used portfolios to teach and assess high school seniors’ writing. Students were required to self-assess their own writing and progress by means of an end-of-term written assignment which included the following steps:

STEP 1: The students rank their papers in order of most to least effective with a brief rationale for what they think are the good and bad points and what they have learned through the assignment.

STEP 2: They discuss what they have learned about writing as a result of this course and the way they go about writing.

STEP 3: They describe how they feel at this point about writing and how they view writing now, as opposed to before taking the course (p. 46).

As Ballard asserts, the students had insight into their own strong and weak points and were willing to be honest about their efforts. They told what they had learned from particular assignments, ranging from technical problems, such as overcoming sentence structure errors, to more personal insights, like realizing the importance of "thinking" at every step of the writing process. They also came to realize that writing is a difficult process, although writer's block became less and less of a problem throughout the semester. Many of the students also came to believe in the importance of revision and peer review in the writing program and reported that they were confident enough to try any type of writing.

As mentioned before, there is scant empirical research related to self-assessment of writing in the portfolio context. Therefore, the present research aimed at investigating the following research questions:

Research Question One: Is there a significant difference between the teacher’s assessment and students’ self-assessment of writing at the beginning of the portfolio program?

Research Question Two: Is there a significant difference between the teacher’s assessment and students’ self-assessment of writing at the end of the portfolio program?

Research Question Three: What are students’ perceptions about the use of self-assessment in the portfolio program?

3. Methodology

3.1. Participants

The participants of the study were 30 undergraduate EFL students at the University of Isfahan who were majoring in English literature at the time of the study. They were juniors who had passed general courses such as reading comprehension, conversation, and grammar, as well as a course on paragraph writing. A bio-data questionnaire given at the beginning of the semester showed that the participants were aged between 20 and 24 and that none of them had experienced portfolio assessment before, i.e., they had all been assessed through traditional assessment in their previous classes.

3.2. Design

Both quantitative and qualitative methods were used to address the research questions. The study used a one-group pretest-posttest design to investigate the differences between the teacher's assessment and students' self-assessment of writing at the beginning and end of the treatment, i.e., the portfolio program. As for the qualitative data, inductive thematic analysis was utilized and the themes that emerged from the data were discussed.

3.3. Instrumentation

Multiple data collection methods were used to obtain comprehensive and useful data. The data were collected from a bio-data questionnaire, writing tasks, and oral interviews, explained as follows:

Bio-data questionnaire: The students were given a simple researcher-made questionnaire at the beginning of the study to obtain information about their age, gender, and previous experience in learning English in general and English writing in particular (see Appendix A).

Writing tasks: The participants were given five writing tasks throughout the program. The writing samples were rated analytically using the modified version of Wang and Liao's (2008) writing scoring rubric (see Appendix B) and were rated by the teacher twice in order to ensure intra-rater reliability of the ratings. They were also assessed by the students as part of the portfolio program to examine the nature of self-assessment in the Iranian university EFL context. To investigate the hypotheses of the study, the first and the last writing tasks were used as data.

3.4. Procedure

A group of thirty students at the University of Isfahan took a course on essay writing. The class met once a week for ninety minutes. The teacher explicitly taught them the structure of the essay, including how to develop the thesis statement, body paragraphs, and conclusion, as well as outlining, coherence, and unity. The students were required to write five essays of different genres (i.e., example, classification, cause-and-effect analysis, comparison and contrast, and argumentative essays) during the term on general topics which did not require expert knowledge. The students were asked to reflect on, redraft, and revise their essays. They were also asked by the teacher to self-assess their writing based on Wang and Liao's (2008) writing scoring rubric. The evaluation of their writing ability was based on their portfolios.

The portfolio process involved all the stakeholders, i.e., the teacher, the students, and their peers. At the beginning of the course, the participants were provided with an explanation of the nature, purpose, and design of portfolio assessment. Having received the first draft of the students' essays, the teacher read them carefully. Then, under each assignment, he wrote his comments on the focus, elaboration, organization, conventions, and vocabulary of the students' writing, so the students gained information about their strengths and weaknesses in these aspects of their essays. The students were asked to reflect on their writing in the classroom and evaluate their strengths and weaknesses. Then, they self-assessed the first draft of every writing type using the modified version of Wang and Liao's (2008) writing scoring rubric, receiving some guidance and assistance from their teacher. The teacher did not grade the drafts so that students could develop their writing ability in an anxiety-free environment. In short, the portfolio project required students to write essays of different genres; they revisited, reflected on, and revised the essays in response to teacher and peer feedback during the term. The result was multi-drafted essays informed by teacher feedback, peer feedback, and student reflections.

As mentioned earlier, the analytic rating scale was a modified version of Wang and Liao's (2008) writing scoring rubric. A previous version of the rubric was sent to a number of scholars for expert scrutiny, and some amendments and revisions were made on the basis of their judgments. As shown in Appendix B, the rubric has five subscales, namely focus, elaboration, organization, conventions, and vocabulary, each with five levels. The subscales receive equal weight, and the lowest and highest scores in each subscale are 1 and 5, respectively. The ratings were made twice by the teacher to ensure intra-rater reliability. In line with Lam and Lee's (2010) study, to foster a close connection between teaching and assessment, the assessment criteria covering the five aspects were made explicit and clear to the participants before they compiled their portfolios.
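To make the scoring arithmetic concrete, the rubric just described can be sketched in a few lines of code. This is only an illustrative sketch, not the authors' instrument: five equally weighted subscales, each rated from 1 to 5, are summed to a total between 5 and 25.

```python
# Illustrative sketch of the analytic scoring scheme described above
# (not the authors' instrument): five equally weighted subscales,
# each rated 1-5, summed to a total between 5 and 25.
SUBSCALES = ("focus", "elaboration", "organization", "conventions", "vocabulary")

def total_score(ratings):
    """Sum the five subscale ratings, validating the 1-5 range."""
    for name in SUBSCALES:
        value = ratings[name]
        if not 1 <= value <= 5:
            raise ValueError(f"{name} must be rated 1-5, got {value}")
    return sum(ratings[name] for name in SUBSCALES)

# A hypothetical rating of one essay draft:
sample = {"focus": 4, "elaboration": 3, "organization": 4,
          "conventions": 3, "vocabulary": 4}
print(total_score(sample))  # 18
```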

4. Results

The main aim of the study was to investigate the use of self-assessment in a portfolio program. To this end, a number of research questions were posed and examined in a classroom research study. In order to investigate the first two research questions, the following null hypotheses were formulated:

Hypothesis One: There is no significant difference between the teacher’s assessment and students’ self-assessment of writing at the beginning of the portfolio program.

Hypothesis Two: There is no significant difference between the teacher’s assessment and students’ self-assessment of writing at the end of the portfolio program.

To test the null hypotheses, the first drafts of the students' first and last writing tasks were assessed by both the students themselves and the teacher using the modified version of Wang and Liao's (2008) writing scoring rubric. To foster the reliability of self-assessment, all the terms within the rubric were explained to the students at the very beginning of the semester. The teacher also rated the writing samples twice to ensure intra-rater reliability; the intra-rater reliability coefficients for the teacher's ratings of the first and the last essays were .87 and .84, respectively (as shown in Appendices C and D). Independent samples t-tests were used to investigate the difference between the teacher's assessment and students' self-assessment of the written tasks at the beginning and end of the portfolio program. To investigate the third research question, 8 students were interviewed in order to elicit their perceptions about the use of self-assessment in the portfolio program.

4.1. Results Concerning the Research Hypotheses

To test the first null hypothesis, an independent samples t-test was run to find out whether the difference between the means of the teacher's rating and student self-assessment at the beginning of the program was statistically significant. Table 1 below shows that the means of the teacher's rating and the students' self-assessment are 15.63 and 18.20, respectively. In fact, students tended to rate their writing performance much higher than the teacher did.

Table 1: Independent Samples Statistics for Teacher's Rating and Self-assessment

          Group   N    Mean      Std. Deviation   Std. Error Mean
T1 ass    1.00    30   15.6333   2.93003          0.53495
          2.00    30   18.2000   2.79655          0.51058

Notes: a. T1ass = Task 1 assessment; b. Group 1 = Teacher assessment; c. Group 2 = Self-assessment

 

As displayed in Table 2 below, the p value (.001) is much less than .05. This shows that the difference between the means is statistically significant.

Table 2: Independent Samples t-test

Levene's Test for Equality of Variances: F = .000, Sig. = 1.000

T1 ass                         t        df       Sig. (2-tailed)   Mean Difference   Std. Error Difference   95% CI of the Difference
Equal variances assumed        -3.741   58       .001              -2.5666           .7395                   [-4.0469, -1.0864]
Equal variances not assumed    -3.741   57.874   .001              -2.5666           .7395                   [-4.0470, -1.0863]

Note: T1ass = Task 1 assessment

Based on the aforementioned analyses, it was shown that there was a significant difference between teacher’s rating and student self-assessment of the first essay at the beginning of the portfolio program. Therefore, the first null hypothesis was rejected. To see if there was a significant difference between teacher’s rating and students’ self-assessment of the last essay in the portfolio program, another independent samples t-test was run.

 

Table 3: Independent Samples Statistics for Teacher's Rating and Self-assessment

          Group   N    Mean      Std. Deviation   Std. Error Mean
T2 ass    1.00    30   17.0333   2.77282          .50624
          2.00    30   19.0000   2.99425          .54667

Notes: a. T2ass = Task 2 assessment; b. Group 1 = Teacher assessment; c. Group 2 = Self-assessment

 

As shown in Table 3 above, the mean scores of the teacher's rating and the students' self-assessment were 17.03 and 19.00, respectively. Again, the students tended to assess their writing higher than the teacher did. As shown in Table 4 below, the p value (.011) is much less than .05, meaning that the difference between the means is statistically significant. In other words, there was a significant difference between the teacher's rating and the students' self-assessment of their last essay in the portfolio program. Hence, the second null hypothesis was also rejected.

 

Table 4: Independent Samples t-test

Levene's Test for Equality of Variances: F = .998, Sig. = .322

T2 ass                         t       df       Sig. (2-tailed)   Mean Difference   Std. Error Difference   95% CI of the Difference
Equal variances assumed        -2.64   58       .011              -1.96667          .74507                  [-3.4580, -.4752]
Equal variances not assumed    -2.64   57.661   .011              -1.96667          .74507                  [-3.4582, -.4750]

Note: T2ass = Task 2 assessment

 

The aforementioned statistical analyses showed that both null hypotheses of the study were rejected. In other words, there was a significant difference between teacher’s assessment and students’ self-assessment of essays at the beginning and end of the portfolio program.
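For readers who wish to verify the computations, the last-essay t-test can be reproduced from the summary statistics reported in Table 3 alone, without the raw scores. The sketch below uses SciPy's from-stats helper; the figures are taken directly from Table 3.

```python
# Recomputing the Task 2 (last essay) independent samples t-test from the
# summary statistics in Table 3; the raw scores are not needed.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(
    mean1=17.0333, std1=2.77282, nobs1=30,  # teacher assessment
    mean2=19.0000, std2=2.99425, nobs2=30,  # student self-assessment
    equal_var=True,
)
print(round(t, 2), round(p, 3))  # t = -2.64, p = 0.011, matching Table 4
```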

4.2. Results of Qualitative Data: Student Perceptions about Self-Assessment in the Portfolio Program

To investigate the third research question, the first author interviewed 8 students and elicited their ideas about the use of self-assessment in the portfolio program. Transcripts of the interviews were analyzed inductively and the themes that emerged from the data were identified. The major extracted themes were “Merits” and “Demerits” as detailed below.

As most of the students said, using reflection in the writing class was a new and interesting experience for them. They commented on the usefulness of reflection and self-assessment in the writing portfolio program. They believed that, as writers, they should not rely solely on the teacher's or peers' comments; instead, they should develop a critical, reflective approach to their own writing. Interestingly, the students also found reflection useful in their real lives. As one of them said, "It is better for someone evaluate herself before anyone else do[es] this for her." Another noted that "The reflection and self-assessment of my writing not only helped me learn how to assess my writing, but also evaluate myself, to assess myself in my whole life." In fact, self-assessment of the essays, as a main element of student-centered classes, was well received by most of the students. As one of them said, "when I get back to my own essay I understand in which parts I have problems and I try to improve my essay".

In contrast to these positive views on reflection and self-assessment in the portfolio program, two interviewees said they did not like them: "I as a writer can’t distinguish my mistakes and errors. If I did know them, then there was no need to revise it for my writing was perfect" and "I think students can’t comment on their own or their peer's writing. There must be a superior person with more knowledge [i.e. the teacher] to correct us. Our information [i.e. knowledge] is limited; we can’t find mistakes to comment on them". It seems that these students could not get used to the practice of reflection and self-assessment. Notably, many of the students held this view at the beginning, but gradually came to appreciate the importance of reflection in improving writing.

 

 

5. Discussion

Today, one can find university EFL students who cannot write even a simple coherent English sentence after four years of study at the university level, and the grades they receive in the norm-referenced assessment culture lack accountability. Based on these grades, teachers cannot provide evidence that the intended learning outcomes have been achieved. As Hirvela and Pierson (2000) note, grades do not account for the processes by which students produce their products, nor do they ask students to reflect upon their learning and form a comprehensive picture of what they have learned. The problems of Iranian EFL learners might be rooted in classroom practices: focusing mainly on one-shot essay tests might deprive students of the benefits of a process-oriented approach to writing. Perhaps this problem could be addressed by implementing portfolio assessment, which by definition focuses on process, in writing classes.

Two independent samples t-tests were run to examine the difference between self-assessment and teacher assessment of writing in the writing portfolio context. The results showed a significant difference between students’ self-assessment and teacher assessment in the first and last writing tasks. In line with Little (2005), learners whose experience of formal instruction was traditional and teacher-led cannot be expected to assess themselves accurately. Our students were guided in self-assessment, but the duration of one semester might not be long enough for them to fully develop self-assessment strategies, which may explain the significant difference between self- and teacher assessment. The findings contrast with Elahinia’s (2004) finding of a significant positive relationship between self- and teacher assessment in the portfolio context. The difference might be due to the self-assessment checklists used: Elahinia’s participants filled in a checklist requiring yes/no answers, whereas the participants of this study selected, on a Likert-type scale, the statement they thought matched their writing ability. It should be noted that, as displayed in Tables 2 and 4, the difference between the means of the teacher’s ratings and the students’ self-assessments shows that students rated themselves higher than the teacher did. This finding contrasts with Matsuno’s (2009) finding that Japanese students assessed their writing lower than predicted, an underestimation that, as Matsuno suggests, might be rooted in the tendency of many Japanese to display a degree of modesty. These contradictory findings stress the need for further studies exploring self-assessment as a tool for assessing language ability in EFL contexts.
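The comparison reported above can be sketched as an independent samples t-test. The scores below are purely hypothetical illustrations (the study's actual data are not reproduced here); the direction of the invented difference simply mirrors the reported pattern of self-ratings exceeding teacher ratings.

```python
# Hypothetical illustration of the analysis described above: comparing
# students' self-assessed writing scores with the teacher's ratings
# using an independent samples t-test. All numbers are invented.
from scipy import stats

teacher_ratings = [12, 13, 11, 14, 12, 13, 10, 12, 13, 11]   # hypothetical
self_ratings    = [14, 15, 13, 15, 14, 16, 13, 14, 15, 13]   # hypothetical

t_stat, p_value = stats.ttest_ind(self_ratings, teacher_ratings)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Significant difference between self- and teacher assessment")
```

With these illustrative numbers, the self-ratings sit roughly two points above the teacher's ratings, so the test flags a significant difference, as the study's analysis did for its own data.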

Analysis of the qualitative data suggests that students have a positive attitude toward self-assessment. The findings are in line with Sharifi and Hassaskhah (2011), who found that students had a positive attitude toward portfolio-based learning, and with Farahian and Avarzamani (2018), who found in an experimental study that students viewed writing portfolios positively. However, a few students disliked the idea of self-assessment. This resistance might be rooted in the educational culture of many Asian countries, in which students are not used to assessing their own learning; there is usually an authority in the class (i.e., the teacher) who evaluates their achievement. There appears to be a need to provide students with more training in self-assessment and collaborative learning strategies to ensure the quality of self-assessment in portfolio programs.

In line with Tavakoli and Ghoorchaei (2009), self-assessment and teacher assessment should be used jointly in assessing students’ language abilities. To assess students’ writing in portfolio programs, teachers could draw on multiple sources of information, including teacher, peer, and student self-assessment. Based on the aforementioned findings, self-assessment per se is not a valid measure of student achievement, but it could be used in low-stakes decisions such as placement, where, as Bachman (2004) asserts, the “decision errors are easy to correct” (p. 12). Meanwhile, self-assessment, which is at the heart of portfolio pedagogy, should not be neglected by language teachers, because it can serve as a pedagogical tool to improve students’ writing.
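One way the suggestion to combine multiple assessment sources could be operationalized is a weighted average. The function and weights below are illustrative assumptions, not a procedure prescribed by the study.

```python
# Illustrative sketch (not the study's procedure): combining teacher, peer,
# and self-assessment scores, all on the same scale, into one portfolio
# writing score. The weights are hypothetical.
def combined_score(teacher: float, peer: float, self_score: float,
                   weights: tuple = (0.6, 0.2, 0.2)) -> float:
    """Weighted average of three assessment sources.

    The default weights give the teacher's rating the largest share,
    reflecting the caution above about relying on self-assessment
    alone for high-stakes decisions.
    """
    w_teacher, w_peer, w_self = weights
    return w_teacher * teacher + w_peer * peer + w_self * self_score

print(combined_score(16, 18, 20))  # 0.6*16 + 0.2*18 + 0.2*20
```

Shifting the weights toward the self-assessment component would be one concrete way to give students a larger stake in assessment as they become more practiced self-assessors.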

6. Conclusion and Implications

The present study delved into self-assessment and teacher assessment of writing in the portfolio context. The results showed a significant difference between students’ self-assessment and the teacher’s assessment of writing. Furthermore, students’ perceptions about self-assessment were elicited through semi-structured interviews; most of the students had a positive attitude toward reflection and self-assessment in the portfolio program.

The study has some implications for teaching, assessment, and curriculum development in the Iranian EFL context. First, teachers should bear in mind that assessment through timed essays is inconsistent with recent process-based approaches to teaching writing. Portfolio assessment can be used in EFL classrooms to evaluate both the product and the process of English writing, and it can serve in classroom settings as a mechanism whereby learning, teaching, and assessment are linked. Also, there should be a shift away from the authoritarian power of teachers to assign grades: assessment should be viewed as an enterprise involving all stakeholders, i.e., students, peers, and teachers. Involving students in the assessment process might pave the way for learner autonomy.

In developing curricula and writing syllabi, special attention should be given to students’ self-assessment of their learning. This will not only give the teachers a full portrait of students’ learning but also will be a motivating factor for students to pursue their learning more meaningfully. It could be suggested that portfolios be embedded in the writing curriculum of EFL college students. The implementation of portfolio assessment in EFL classes needs careful planning and adequate training of teachers so that it becomes an effective teaching, learning, and assessment tool.

Students’ activities and attitudes toward portfolios are of prime importance in the success or failure of any portfolio program. Therefore, there is a need for constant tutorial support during the implementation of portfolio programs. Given the central role of students in constructivist education, they should be trained in collaborative learning and self-assessment.

The study had some limitations. The participants were only a sample of EFL students studying at the University of Isfahan and cannot be claimed to represent the whole EFL student population, which might limit the strength of the findings. Also, due to shortage of time and institutional constraints, students completed their portfolios beyond the classroom context; their work is thus subject to "academic dishonesty" (Zhang, 2009, p. 108). A further limitation is the one-semester duration of the program, which might not be long enough for significant changes to take place; for example, students’ self-assessment strategies may develop over a longer period. Extending the essay writing course to two semesters was impractical, given the two-credit essay writing course in the syllabus for English majors at state universities in Iran.

Few studies have examined self-assessment of writing in portfolio programs, and there has been scant research dealing with the issue in EFL contexts. Therefore, several research avenues could enrich our understanding of the implementation of self-assessment in such contexts. Studies with students of various writing proficiency levels and at different universities could consolidate the findings. In this study, a modified version of Wang and Liao’s (2008) writing scoring rubric was used to measure students’ essay writing ability; other scales may be used in further studies. Finally, the implementation of self-assessment in online portfolios could be addressed in further research.

 

 

References

Bachman, L. F. (2004). Statistical analysis for language assessment. Cambridge: Cambridge University Press.

Ballard, L. (1992). Portfolios and self-assessment. English Journal, 81, 46-48.

Barootchi, N. & Keshavarz, M. H. (2002). Assessment of achievement through portfolios and teacher-made tests. Educational Research, 44 (3), 279-288.

Boud, D. (2000). Sustainable assessment: Rethinking assessment for the learning society. Studies in Continuing Education, 22 (2), 151-167.

Brown, H.D. (2001). Teaching by principles: An interactive approach to language pedagogy. New York: Longman.

Brown, H. D. (2004). Language assessment: Principles and classroom practices. White Plains, New York: Longman.

Buyukduman, I. & Sirin, S. (2010). Telling ELT tales out of school, learning portfolio to enhance constructivism and student autonomy. Procedia Social and Behavioral Sciences, 3, 55-61.

Camp, R., & Levine, D. S. (1991). Portfolios evolving. In P. Belanoff & M. Dickson (Eds.), Portfolios: Process and product (pp. 194-205). Portsmouth, NH: Boynton/Cook.

Chang, C.C. (2008). Enhancing self-perceived effects using web-based portfolio assessment. Computers in Human Behavior, 24, 1753-1777.

Coombe, C., Folse, K., & Hubley, N. (2007). A practical guide to assessing English language learners. Michigan: The University of Michigan Press.

D'Aoust, C. (1992). Portfolios: Process for students and teachers. In K. B. Yancey (Ed.), Portfolios in the writing classroom (pp. 39-48). Urbana, IL: National Council of Teachers of English.

Delett, J.S., Barnhardt, S., & Kevorkian, J.A. (2001). A framework for portfolio assessment in the foreign language classroom. Foreign Language Annals, 34 (6), 559-568.

Elahinia, H. (2004). Assessment of writing through portfolios and achievement tests. (Unpublished M.A thesis). Kharazmi University, Tehran, Iran.

El-Koumy, A. S. (2004). Effect of self-assessment of writing processes versus products on EFL students’ writing. Paper presented at the Tenth EFL Skills Conference, the American University in Cairo, Center for Adult and Continuing Education. Retrieved from https://files.eric.ed.gov/fulltext/ED490559.pdf

Farahian, M. & Avarzamani, F. (2018). The impact of portfolio on EFL learners’ metacognition and writing performance. Cogent Education, 5 (1), 1-21.

Genesee, F. & Upshur, J. A. (1996). Classroom-based evaluation in second language education. Cambridge: Cambridge University Press.

Gottlieb, M. (1995). Nurturing student learning through portfolios. TESOL Journal, 5(1), 12-14.

Gottlieb, M. (2000). Portfolio practices in elementary and secondary schools. In Ekbatani, G. & Pierson, H. (Eds.), Learner directed assessment in ESL (pp. 89-104). Mahwah, NJ: Erlbaum.

Hirvela, A. & Pierson, H. (2000). Portfolios: Vehicles for authentic self-assessment. In G. Ekbatani & H. Pierson (Eds.), Learner directed assessment in ESL (pp. 105-126). Mahwah, NJ: Lawrence Erlbaum.

Hughes, A. (2003). Testing for language teachers. Cambridge: Cambridge University Press.

Hyland, K. (2003). Second language writing. Cambridge: Cambridge University Press.

Janssen-van Dieton, A. (2000). Alternative assessment: Self-assessment beyond the mainstream. Melbourne Papers in Language Testing, 9, 18-29.

Jonassen, D.H. (1991). Evaluating constructivist learning. Educational Technology, 31, 28-33.

Lam, R. & Lee, I. (2010). Balancing the dual functions of portfolio assessment. ELT Journal, 64 (1), 54-64.

Little, D. (2005). The Common European Framework and the European Language Portfolio: involving learners and their judgments in the assessment process. Language Testing, 22 (3), 321–336.

Marlowe, B.A., & Page, M.L. (2005). Creating and sustaining the constructivist classroom (2nd ed.). California: Corwin Press.

Matsuno, S. (2009). Self-, peer-, and teacher-assessments in Japanese university EFL writing classrooms. Language Testing, 26 (1),75-100.

Ross, S. (2005). The impact of assessment method on foreign language proficiency growth. Applied Linguistics, 26 (3), 317-342.

Sajedi, R. (2014). Self-assessment and portfolio production of Iranian EFL learners. Procedia - Social and Behavioral Sciences, 98, 1641-1649.

Sharifi, A. & Hassaskhah, J. (2011). The role of portfolio assessment and reflection on process writing. Asian EFL Journal, 13(1), 193-223.

Struyven, K., Dochy, F., Janssens, S., Schelfhout, W., & Gielen, S. (2006). The overall effects of end-of-course assessment on student performance: A comparison between multiple choice testing, peer assessment, case-based assessment and portfolio assessment. Studies in Educational Evaluation, 32, 202-222.

Tavakoli, M. & Ghoorchaei, B. (2009). On the relationship between risk-taking and self-assessment of speaking ability: A case of freshman EFL learners. Journal of Asia TEFL, 6(1), 1-27.

Tiwari, A. & Tang, C. (2003). From process to outcome: The effect of portfolio assessment on learning. Nurse Education Today, 23, 269-277.

Ustunel, E., & Deren, E. (2010). The effects of e-portfolio based assessment on students' perceptions of educational environment. Procedia Social and Behavioral sciences, 2, 1477-1481.

Vygotsky, L. S. (1986). Thought and language (A. Kozulin, Ed. & Trans.). Cambridge, MA: MIT Press.

Wang, Y.H. & Liao, H.C. (2008). The application of learning portfolio assessment for students in the technological and vocational education system. Asian EFL Journal, 10 (2), 132-154.

Weigle, S.C. (2002). Assessing writing. Cambridge: Cambridge University Press.

Williams, M. & Burden, R. (1997). Psychology for language teachers: A social constructivist approach. Cambridge: Cambridge University Press.

Wu, Y.I. (2005). Portfolio assessment. Educational test and assessment: From classroom view point. Taipei: Wu Nan.

Yang, N.D. (2003). Integrating portfolios into learning strategy-based instruction for EFL college students. IRAL, 41, 293–317.

Zhang, S. (2009). Has portfolio assessment become common practice in EFL classrooms? Empirical studies from China. English Language Teaching, 2(2), 98-118.

Appendix A: Bio-data Questionnaire

Your teacher would be grateful if you kindly and carefully provide the following personal information. The information will be kept confidential.

Name:                                               Gender:                   Age:

  1. Do you have any learning experience in language institutes? If yes, for how long?
  2. Have you ever had an opportunity to visit or stay in an English-speaking country? If yes, for how long?
  3. Have you ever passed any course on English writing? If yes, which courses? Explain how the instructor taught writing. How did they assess your writing ability?
  4. Have you ever had any experience with writing portfolios? If yes, please explain what you did in the class. How did you like it?

 

 

Appendix B: The Writing Scoring Rubric modified from Wang and Liao (2008)

The rubric comprises five criteria, each scored from 1 to 5; the descriptor number corresponds to the score awarded.

Focus

1. Having problems with focus or failing to address the writing task.
2. Inadequately addressing the writing task.
3. Addressing the writing task adequately but sometimes straying from the task.
4. Addressing most of the writing task.
5. Specifically addressing the writing task.

Elaboration/Support

1. Using few or no details, or irrelevant details, to support topics or illustrate ideas.
2. Using inappropriate or insufficient details to support topics or illustrate ideas.
3. Using some details to support topics or illustrate ideas.
4. Using appropriate details to support topics or illustrate ideas.
5. Using specific and appropriate details to support topics or illustrate ideas.

Organization

1. The logical flow of ideas is not at all clear and connected.
2. The logical flow of ideas is seldom clear and connected.
3. The logical flow of ideas is often clear and connected.
4. The logical flow of ideas is usually clear and connected.
5. The logical flow of ideas is always clear and connected.

Conventions

1. Standard English conventions (spelling, grammar, and punctuation) are poor, with frequent errors.
2. Standard English conventions (spelling, grammar, and punctuation) are inappropriate, with obvious errors.
3. Standard English conventions (spelling, grammar, and punctuation) are fair, with some minor errors.
4. Standard English conventions (spelling, grammar, and punctuation) are almost accurate.
5. Standard English conventions (spelling, grammar, and punctuation) are perfect or near perfect.

Vocabulary

1. Little knowledge of English vocabulary, idioms, and verb forms.
2. Frequent errors of word/idiom form, choice, and usage; meaning confused or obscured.
3. Occasional errors of word/idiom form, choice, and usage, but meaning not obscured.
4. Almost effective word/idiom form, choice, and usage; almost appropriate register.
5. Effective word/idiom form, choice, and usage; appropriate register.

Appendix C: The Intra-Rater Reliability of Teacher's Ratings of Essay 1

 

 

 

Appendix D: The Intra-Rater Reliability of Teacher's Ratings of Essay 5

 

 

 



[1] Assistant Professor (Corresponding Author), behroozghoorchaei@gmail.com; Department of English, Farhangian University, Tehran, Iran

[2] Professor, mr.tavakoli14@gmail.com; Department of English Language, University of Isfahan, Isfahan, Iran
