Exploring the Attitude of ESP Learners towards Using Automated Writing Evaluation to Assess their Writing


  • Tamer Gamal Abd El Rasoul, Arab Academy for Science, Technology and Maritime Transport, Alexandria, Egypt.
  • Abeer Refky, Arab Academy for Science, Technology and Maritime Transport, Alexandria, Egypt.
  • Maerwa Adel Aboelwafa, Arab Academy for Science, Technology and Maritime Transport, Alexandria, Egypt.


The aim of the current study is to explore the attitudes of ESP learners towards using automated writing evaluation (AWE) to assess their writing. A mixed-methods approach combining qualitative and quantitative data was employed. The sample consisted of 201 second-year students from the College of Engineering at the Arab Academy for Science, Technology and Maritime Transport, Egypt. A post-experiment questionnaire was administered to investigate the students' attitudes towards using AWE to assess their writing. The results revealed that the students held positive attitudes towards the AWE software Grammarly, since it encouraged them to self-correct their errors and to revise their writing before submitting it to their teachers. Based on these findings, further research is recommended on the pedagogical use of AWE tools in writing classes and on writing instructors' attitudes towards using AWE tools in their classes.

