Innoeduca. International Journal of Technology and Educational Innovation
Vol. 11. No. 1. June 2025 - pp. 108-133 - ISSN: 2444-2925
DOI: https://doi.org/10.24310/ijtei.111.2025.20521

The effectiveness of Grammarly application and teacher feedback for undergraduate EFL students’ writing skills

RECEIVED 25/11/2024 ACCEPTED 24/02/2025 PUBLISHED 01/06/2025
Delsa Miranty
Universitas Sultan Ageng Tirtayasa, Indonesia
Utami Widiati
Universitas Negeri Malang, Indonesia
Bambang Yudi Cahyono
Universitas Negeri Malang, Indonesia
Tengku Intan Suzila Tengku Sharif
Universiti Teknologi MARA, Malaysia
ABSTRACT
This research investigates the efficacy of the Grammarly application and teacher feedback in enhancing the writing skills of Indonesian undergraduate EFL students. A switching-replications design was used with 78 first-year students enrolled in a writing course. The study examines the impact of these feedback methods on writing skill enhancement and on student responses. Data were gathered via pre- and post-tests and a closed-ended questionnaire and analyzed using One-Way ANCOVA and descriptive statistics in SPSS 20. The results show that Grammarly, when combined with teacher feedback, enhances diction, language use, and mechanics, while teacher feedback has more effect on content and organization. The first experimental group surpasses the second, attaining a moderate N-Gain score. Students react favorably to both feedback modalities, with Grammarly commended for its automated recommendations. The research concludes that integrating Grammarly and teacher feedback significantly improves writing skills, particularly in diction, language use, mechanics, content, and organization.

KEY WORDS Applications; Grammarly; N-Gain score; One-Way ANCOVA; teacher feedback.


1. INTRODUCTION

Writing in English is essential for foreign language learners. To produce a piece of writing, they should use appropriate vocabulary, form meaningful sentences of different text types, and choose proper arrangements to link sentences (Nasser, 2018) through the writing processes of planning and organizing ideas followed by drafting, reviewing, and editing to improve the writing quality before arriving at the final version (Albesher, 2022; Bai et al., 2022; Martínez et al., 2020; Mirzae & Shamsudin, 2023; Teng et al., 2022). Technology must be utilized in writing classes to deal with the writing processes (Aljameel, 2022; Loncar et al., 2023; Taskiran & Goksel, 2022; Wong et al., 2022; Wu, 2022; Yuniar et al., 2019; Zhang & Zou, 2022).

Recent technological advances in artificial intelligence (AI) have paved the way for improved electronic writing tools, as well as the development of innovative ones. These writing support systems provide human-like sentence completion and text generation, making them essential for many writers and students (Alharbi, 2023). Alharbi showed that students increasingly use AI-powered writing tools to improve their writing. Other studies found that deep learning may improve human-AI learning community performance (Song & Song, 2023; Wang et al., 2023). In sum, new AI-powered writing tools have improved writing performance.

Technology has served as a medium for the electronic corrective feedback that EFL teachers give on their students’ compositions (Almusharraf & Alotaibi, 2023; Barrot, 2020; Lei, 2020; Liu et al., 2023; Miranty et al., 2023; Mohsen & Alshahrani, 2019; Parra & Calero, 2019; Tang & Rich, 2017; Zou et al., 2023), fostering EFL students’ writing achievement (Andina et al., 2020). One of the new language-learning technologies is Automated Writing Evaluation (AWE), an outstanding support for providing diagnostic feedback on aspects of writing (Ranalli et al., 2017). Bai and Hu (2017) stated that AWE offers users consistent explanations and immediate feedback. These two characteristics have allowed students to overcome time constraints, develop writing skills at their own pace, and participate and interact independently in language classes (Liao, 2016; Zhang et al., 2020). In addition, the consistency and objectivity of AWE allow students to improve their writing mechanics and accuracy (Tian & Zhou, 2020).

In writing courses, students can use AWE tools to plan, write, get automated feedback, revise, and improve their writing (Roscoe et al., 2017). AWE can also promote learner autonomy (Tang & Rich, 2017) and save teachers’ time (Palermo & Wilson, 2020). Moreover, students can access sample writing and online dictionaries through some AWE tools, and teachers may have access to additional tools like plagiarism detection (Alharbi, 2023; Ariyanto et al., 2021; Barrot, 2023; Hockly, 2019; Joo, 2021; Shadiev & Feng, 2023).

Much literature has shown that the use of Grammarly, one of the AWE tools, helps students understand grammatical rules and their writing tasks (Agustin & Wulandari, 2022; Cavaleri & Dianati, 2016; Estacio et al., 2022; Fitriana & Nurazni, 2021). Grammarly can correct grammaticality, assess the correctness and readability of writing, and suggest vocabulary enhancements (Lei, 2020). It also has a built-in plagiarism detector and can check style beyond the sentence level (Grammarly, 2019). Thus, the core rationale for using Grammarly is that electronic corrective feedback can help EFL students improve their writing.

The use of Grammarly has been empirically proven beneficial. Grammarly encourages learner autonomy by requiring students to evaluate their work using the system’s feedback (Qassemzadeh & Soleimani, 2016). Cavaleri and Dianati (2016) and O’Neill and Russell (2019) found that students in both studies liked Grammarly’s ease of use and valuable feedback. It can deal with writing complexity in different genres, allowing teachers to spend more time with students and providing direct, indirect, and metalinguistic feedback (Bailey & Lee, 2020). O’Neill and Russell (2019) found Grammarly valuable because it allows students to choose their preferred feedback strategy based on their needs. Using Grammarly software in EFL writing helps students make fewer errors in terms of vocabulary usage, language usage, and mechanics (Ghufron & Rosyida, 2018). Grammarly can be a proofreading service for spelling, punctuation, vocabulary, and plagiarism (Barrot, 2023; O’Neill & Russell, 2019). In addition, Grammarly has proved helpful for L2 writers right after they finish their first draft because the feedback lets them see where they might need to improve their writing before turning it in to the teacher (Almusharraf & Alotaibi, 2023). In sum, Grammarly is a useful tool; it provides students with some of the autonomy and motivation elements necessary in writing classes.

However, studies have found different results and essential gaps in our understanding of how Grammarly can help improve writing skills. Grammarly adds to teacher feedback without clarifying who is responsible for fixing higher- and lower-order issues (Koltovskaia, 2023). Therefore, it is still unclear how well this integration improves writing quality overall. Research comparing Grammarly’s corrections with teacher feedback shows the tool’s value while suggesting that Grammarly cannot replace human evaluation (Khushk et al., 2024). Grammarly considerably reduces mistakes, but its effectiveness depends on how well the learners already know the language (Jomaa & Jibroo, 2024). These results show that more research is needed to fully understand Grammarly’s role in various settings and its relation to teacher feedback.

In this study, Grammarly is complemented with teacher feedback, which educators have long used in workbooks, exams, and lessons. Teacher corrective feedback impacts content and organization more than diction, language use, and mechanics (Ghufron & Rosyida, 2018). The best-known form of teacher feedback is written feedback on students’ writing pieces (Sermsook et al., 2017). In sum, teacher feedback has been viewed as an essential part of improving the performance of L2 writers. The present study explores how Grammarly’s automated feedback can be combined with teacher feedback. In Indonesia’s collectivist culture, where teacher-student trust and respect affect learning outcomes, the present study stresses the relational aspect of feedback. Boud and Dawson (2023) found that teacher feedback literacy, the ability of teachers to design and deliver feedback, can improve learning, as also highlighted by Heron et al. (2023).

Zheng et al. (2023) showed that verification, scaffolding, and teacher praise improved SRL strategies. EFL education in Indonesia emphasizes independent learning, making this finding relevant. In Indonesia, feedback delivery may be inconsistent, especially in large classrooms. Also, teacher feedback should balance praise and constructive criticism for writing development. In conclusion, combining feedback from Grammarly with teacher feedback may improve writing development.

Studying how students interact with teachers and automated feedback tools like Grammarly has expanded research on feedback’s impact on writing. One study found that Hungarian university students engaged moderately to low with teachers and automated feedback, focusing on form rather than meaning, which affected their writing revision strategies (Thi et al., 2022). Then, research on AWE tools like Grammarly, ChatGPT, and Quillbot shows that they improve students’ feedback literacy, depending on their feedback-seeking behaviors (Gozali et al., 2024). Moreover, using these technologies in writing courses has necessitated frameworks to help educators integrate AWE tools while considering digital literacy and ethical AI use (Arredondo & Laurens, 2023). New technologies like the pandemic-era metaverse could improve educational delivery by creating virtual learning ecosystems that change how students interact with feedback and digital tools.

This study employed a Switching Replication Design in two writing classes, providing two different types of feedback, Grammarly and teacher feedback, in a different order, a research design different from the previous studies about Grammarly and teacher feedback, such as the ones by Karyuatry et al. (2018), Ventayen and Ventayen (2018), and O’Neill and Russell (2019).

In the context of teaching argumentative writing, many studies have examined the effect of various strategies and methods on argumentative writing skills (Awada & Diab, 2023; Backman et al., 2023; Jumariati & Sulistyo, 2017; Landrieu et al., 2023; Olson et al., 2023; Wang & Chiu, 2024). Ghufron and Rosyida (2018) discovered that EFL student teachers with limited writing experience struggled with vocabulary and grammar when composing an argumentative writing task. The most consistent finding in these studies is that students, regardless of their stage of development, struggle to write argumentative texts and that it is necessary to provide planned educational tools to ensure students’ success in the writing course. Using AWE may help overcome students’ problems in writing (Fitria, 2021). Students using Grammarly report increased confidence and writing quality due to the grammar and spelling correction features (Setyani et al., 2023). Despite some inaccuracies in certain features, such as plagiarism detection, students find the app helpful in increasing their self-awareness and vocabulary (Setyowati et al., 2024). Less-proficient EFL students mainly show positive engagement with Grammarly despite their low cognitive engagement (Anastasia et al., 2024). This suggests that integrating Grammarly into the writing process, in addition to traditional teacher feedback, significantly impacts students’ writing skills (Inayah & Apoko, 2024). In sum, Grammarly’s effectiveness in improving writing skills is evident, especially with proper guidance and an understanding of its limitations.

However, only a few studies have examined the effectiveness of AWE for undergraduate students, Grammarly in particular as electronic feedback, together with teacher feedback, on the writing performance of Indonesian undergraduate EFL students (Fahmi & Cahyono, 2021; Karyuatry et al., 2018; Miranty & Widiati, 2021). Therefore, this study aims to determine the effectiveness of Grammarly followed by teacher feedback in the writing classes of Indonesian undergraduate EFL students with two experimental groups. Three research questions are formulated in this study:

  1. How does Grammarly followed by teacher feedback affect the writing skill of Indonesian undergraduate EFL students?
  2. To what extent does Grammarly followed by teacher feedback affect the writing skill of Indonesian undergraduate EFL students?
  3. How do the Indonesian undergraduate EFL students respond to the use of Grammarly followed by teacher feedback?

2. LITERATURE REVIEW

2.1. Automated writing evaluation (AWE) as technology in the class

With advancements in educational technology and a growing reliance on technology, several studies have demonstrated the efficacy of using AWE in the L2 classroom. AWE enables different interactions among technology, learners, teachers, and peers in the writing process. For instance, students can use the system to plan, write, receive automated feedback, review their work, and improve their writing (Wilson & Roscoe, 2019). AWE significantly improved students’ writing quality across genres, with argumentative writing outperforming academic and mixed writing genres (Zhai & Ma, 2023).

As a well-known technology, AWE is adapting to various cultural practices and reducing teachers’ marking load. AWE enables students to adjust their writing style to specific cultural practices, leading to improved writing skills (Zhai & Ma, 2023). Stevenson (2016) analyzed how AWE is used in the writing classroom as a teaching method and summarized its classroom applications. AWE is a popular educational technology that reduces the amount of feedback teachers must write by hand (Roscoe et al., 2017). AWE is also more effective than traditional methods for developing writing skills, including classroom, teacher-led, and peer interaction (Li, 2023). Moreover, in academic writing, one effective strategy for encouraging students to take responsibility for their improvement could be to emphasize the constant need to draft and revise. In sum, technology is needed to provide feedback on students’ written texts, and teachers should make use of electronic feedback tools such as Grammarly.

2.2. Sources of feedback

Generally, feedback encompasses corrective feedback, focusing on formal aspects of learners’ language and improving linguistic accuracy. It can be delivered automatically as computer-generated information. In this study, the sources of feedback are electronic feedback (Alsmari, 2019; Barrot, 2020; Mohsen & Alshahrani, 2019) and teacher feedback (Lie & He, 2017; Wu et al., 2023; Yang et al., 2023).

Electronic feedback is a strategy to provide automated feedback from the computer to draw attention to features of students’ writing. One electronic feedback tool that can be applied to students’ texts is Grammarly. Grammarly is a popular educational technology that reduces the amount of corrective feedback teachers need to write, and it helps students enhance their writing (Fitriana & Nurazni, 2022; Roscoe et al., 2017). This is in line with the studies conducted by several researchers (Dewi, 2023; Karyuatry et al., 2018; O’Neill & Russell, 2019), who also found that students perceived Grammarly as a powerful tool for quickly checking for possible errors in grammar and style. Additionally, they discovered that students were more satisfied with Grammarly’s feedback than with teacher feedback; as a result, writing quality can be improved. Numerous studies have confirmed Grammarly’s beneficial effect (Ebyary & Windeatt, 2010; Fitria, 2022; Liao, 2016; Saman et al., 2023). In sum, Grammarly provides electronic writing feedback to students and is helpful for quickly checking grammar and stylistic errors.

Teacher feedback is commonly viewed as an essential part of improving the performance of L2 writers. The best-known form of teacher feedback is written feedback on students’ writing pieces (Sermsook et al., 2017). To serve as a source of feedback, teachers must conduct a needs assessment and analyze which components of writing to address, how much feedback to give, and how it should be provided. Many studies on feedback focus on teacher feedback, primarily examining how feedback is given and the efficacy of particular feedback types (Elumalai, 2019; Pearson, 2018). Moreover, the teacher’s written feedback needs to be easy to follow, practical, precise, and easy to understand (Kurnia, 2022). In sum, teachers’ written feedback should be concise, focused, and practical, and it is crucial for enhancing L2 writing proficiency.

Grammarly and teacher feedback work well together to address various aspects of writing, particularly lower-order concerns (LOCs) and higher-order concerns (HOCs). Grammarly provides immediate and consistent corrections for mechanical issues like grammar, spelling, and vocabulary, helping students improve their writing skills (Ayan & Erdemir, 2023). However, it struggles to provide meaningful feedback on complex areas such as content, organization, and coherence, which require nuanced and context-specific insights (Bulatović et al., 2024). In contrast, teacher feedback is critical in guiding students through higher-order aspects, providing detailed advice on argument structure, idea development, and logical flow (Shum et al., 2023). While Grammarly promotes mechanical accuracy, teacher feedback is critical for improving the depth and clarity of content and organization. In conclusion, while Grammarly effectively handles technical corrections, teacher feedback remains critical for fine-tuning content and structure.

FIGURE 1. Theoretical Framework of Grammarly in ELT Classroom


As shown in Figure 1, the reason for using Grammarly in this study is that Grammarly focuses on academic writing in higher education. This tool combines machine-based artificial intelligence and natural language processing with deep learning algorithms to deliver real-time, quick, comprehensive writing evaluation results. Teacher feedback is still crucially needed to provide the human interaction among students that machines cannot replace.

2.3. Argumentative text writing

The essay is a standard writing unit in college writing classes, with the argumentative essay being the most common type. In argumentative texts, the writer defends their position and attempts to rebut the counter-arguments (Balta, 2018). An argumentative essay is thus a piece of writing that argues for one’s position while refuting opposing viewpoints (Özdemir, 2018). In argumentative writing, writers can convey their opinions but should use objective sources (Landrieu et al., 2022). As a result, argumentative essays rely on cohesion to connect sentences, which assists students in developing coherent arguments for scientific papers.

Writing an argumentative essay is among the most crucial general skills in higher education. Students in higher education need even more guidance regarding their academic writing, primarily argumentative writing (Kleemola et al., 2022). First-year students in higher education must learn argumentative writing to meet their educational requirements (Ghanbari & Salari, 2022). Moreover, idea generation, topic-oriented writing, and learner autonomy are essential components of argumentative writing for university students (Wu & Wang, 2023). In conclusion, argumentative writing is a critical competence for university students; to meet academic requirements, college freshmen must learn how to write argumentative texts and receive feedback on their argumentative writing.

In this study, the students were asked to write argumentative texts on topics given by the teacher. Through argumentative writing, the students were asked to provide their opinions, criticism, and ideas related to the issue, supported by evidence and facts. Moreover, the students were allowed to use Grammarly, followed by teacher feedback.

3. METHOD

3.1. Research Design

This study employed a Switching-Replications (SR) design, a robust hybrid experimental framework, to compare Grammarly’s electronic feedback and teacher feedback on undergraduate EFL students’ writing skills (Williams & Lowman, 2018). Williams and Lowman add that SR is one of the most potent hybrid experimental designs. The design split English Education Department students into two experimental groups (G1 and G2), each of which received both treatments under the SR schedule. The SR design ensured that all participants received both types of feedback, preventing resentment in either group and reducing social threats to internal validity. There was no control group; the first and second experimental groups served as the main subjects. The first experimental group (G1) received Grammarly feedback first and then teacher feedback, while G2 received teacher feedback first and then Grammarly feedback, as shown in Table 1.

TABLE 1. Switching replication design

Group Pre-test Independent Variable 1 Independent Variable 2 Post-test
G1 Y1 X O Y2
G2 Y1 O X Y2

G1: First Experimental Group (G1)
G2: Second Experimental Group (G2)
Y1: Pre-test
Y2: Post-test
X: Grammarly
O: Teacher Feedback

As shown in Table 1, the Switching Replication Design was applied in this study with feedback from Grammarly and teacher feedback for two experimental groups, but with a different order of feedback when given to the students. The first experimental group started with Grammarly, followed by teacher feedback, but the second one started with teacher feedback, followed by Grammarly. In sum, both types of feedback were given to the students but in a different order to determine their effectiveness in improving the undergraduate EFL students’ writing skills.

3.2. Population and sample

This research was conducted at a public university in Banten, Indonesia, where English is a foreign language. The participants were first-year writing-course students of the 2020/2021 Teachers’ Training and Education Faculty. We sampled two intact classes totaling 78 students: 40 in one class and 38 in the other. The sample size was determined by the research design and the statistical analyses.

The participants were EFL writing students who used Grammarly and were willing to take part. Cluster sampling selected the 78 second-semester students from a population of 113, with 40 in the first experimental group and 38 in the second. The inclusion criteria were (1) enrollment in an EFL writing class taught with Grammarly and (2) willingness to participate. The students were involved because one of us (the first author) was a faculty member of the Department; it was therefore easy to access the students to conduct the study and distribute the closed-ended online questionnaire.

3.3. Research instruments

In this study, validators verified the test and questionnaire instruments. Two writing tests were given, before and after the treatments. Argumentative writing tests were used to assess the students’ writing skills; the students wrote two argumentative texts on the given topics (Yamanishi et al., 2019). The scoring rubric, adapted for this study from the ESL Composition Profile of Jacobs et al. (1981), focused on content, organization, diction (choice of words), language use, and mechanics. Essays are graded on a 100-point scale: content (30 points), organization (20), vocabulary (20), language use (25), and mechanics (5). Each set of criteria generates interval scores.

The study used a closed-ended online questionnaire with a 5-point Likert scale (strongly disagree to strongly agree) to address the third research question on students’ responses to Grammarly. The questionnaire, consisting of 25 questions on effectiveness, source-based writing instruction, and usage frequency, was distributed via Google Forms in both English and Bahasa Indonesia to ensure clarity. After a validator review and suggestions, the questionnaire was linked to Grammarly’s effectiveness in Indonesian undergraduate EFL writing classes. Reliability and validity were assessed through Pearson correlation and Cronbach’s alpha in SPSS 20, which was also used to test normality and homogeneity. The closed-ended questionnaire was adapted from Parra and Calero (2019) and Zhang et al. (2020).
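The reliability analysis mentioned above (Cronbach’s alpha over 5-point Likert items) can also be computed outside SPSS. The following is a minimal Python sketch using hypothetical response data, not the study’s actual questionnaire results:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 respondents answering 4 Likert items (scale 1-5)
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(round(cronbach_alpha(scores), 3))  # → 0.936
```

A value above 0.7 is conventionally taken as acceptable internal consistency for a Likert questionnaire.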

3.4. Data collection procedure

This study involved first-year English Education students at a public university in Banten, Indonesia, from April to August 2021. It included a pre-test, four assignments on different topics, a post-test, and a questionnaire. The two experimental groups received Grammarly and teacher feedback but in different orders. The study aimed to assess whether Grammarly, followed by teacher feedback, improved writing performance in the first group and which writing aspects improved with this sequence. The first author taught the classes, with two raters scoring the tests and assignments. Moreover, the null hypothesis (H0) was tested in this study: “There is no significant effect of the use of Grammarly followed by teacher feedback in the writing classes of Indonesian undergraduate EFL students.”

3.5. Data analysis

This section presents the analysis of the students’ scores before and after the study and the analysis of the questionnaire data from the two classes.

3.5.1. Data analysis from the students’ tests

In this study, to measure the effectiveness of Grammarly and teacher feedback, we calculated the gain score (g), followed by a One-Way ANCOVA and t-tests. One-Way ANCOVA was employed to compare two or more groups while controlling for a continuous covariate that may affect the dependent variable (Howell, 2012). Howell adds that it is an extension of One-Way ANOVA that eliminates a covariate’s impact before examining group differences.

Next, to measure the extent to which Grammarly and teacher feedback affect the writing skills of Indonesian undergraduate EFL students, t-tests were conducted. A t-test was used to compare the first and second experimental groups on each indicator individually and as a group (Ghufron & Rosyida, 2018). First, the pre- and post-test results were analyzed using a paired-samples t-test to see how the AWE programs affected the students’ writing ability. Second, an independent-samples t-test was used to determine whether there was a difference between the two feedback orders.
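These two t-tests can likewise be sketched with scipy on simulated data; the numbers below are illustrative assumptions, not the study’s scores:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Paired-samples t-test: one group's pre- vs. post-test scores
pre = rng.normal(70, 6, 40)
post = pre + rng.normal(8, 4, 40)  # assumed average gain of ~8 points
t_paired, p_paired = stats.ttest_rel(post, pre)
print(f"paired t = {t_paired:.2f}, p = {p_paired:.4f}")

# Independent-samples t-test: post-test scores of the two groups
g1 = rng.normal(87, 3, 40)  # Grammarly-then-teacher-feedback group (simulated)
g2 = rng.normal(83, 3, 38)  # teacher-feedback-then-Grammarly group (simulated)
t_ind, p_ind = stats.ttest_ind(g1, g2)
print(f"independent t = {t_ind:.2f}, p = {p_ind:.4f}")
```

A p-value below 0.05 in the paired test indicates a significant pre-to-post gain; in the independent test, a significant difference between the two groups.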

3.5.2. Data analysis of the questionnaire

In terms of quantitative data, we described students’ responses towards using Grammarly and teacher feedback on their writing quality to answer the third research question. Descriptive statistics were used for this purpose, with the aim of determining whether students responded positively to Grammarly and teacher feedback. SPSS version 20 was used to calculate the data.

4. RESULTS

4.1. Effects of Grammarly and teacher feedback on the writing skills of Indonesian undergraduate EFL students

4.1.1. Gain score

In this study, after the mean scores were calculated, the gain scores of the first and second experimental groups were computed; the result is shown in Figure 2.

FIGURE 2. N-Gain score (first & second experimental groups)


As shown in Figure 2, the N-Gain score in the first experimental group was higher than in the second experimental group: 55.256 (moderate criteria) versus 46.228 (moderate criteria). These data show an average improvement in the students’ argumentative texts in both classes when Grammarly was used as automated feedback. In sum, using Grammarly as automated feedback was more effective in the first experimental group than in the second.
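The N-Gain score reported here follows Hake’s normalized gain, g = (post - pre) / (max - pre), averaged across students and expressed as a percentage. The sketch below uses hypothetical scores, and the banding thresholds are a common convention assumed for illustration, not necessarily the study’s exact criteria:

```python
import numpy as np

def n_gain_percent(pre, post, max_score=100.0):
    """Mean normalized gain (Hake's g), expressed as a percentage."""
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    gains = (post - pre) / (max_score - pre)
    return gains.mean() * 100

def gain_category(g_percent):
    # Common banding (assumption): < 30 low, 30-70 moderate, > 70 high
    if g_percent < 30:
        return "low"
    return "moderate" if g_percent <= 70 else "high"

# Hypothetical pre/post scores for four students
pre = np.array([60.0, 70.0, 65.0, 75.0])
post = np.array([82.0, 85.0, 80.0, 88.0])
g = n_gain_percent(pre, post)
print(round(g, 2), gain_category(g))  # → 49.96 moderate
```

Under this banding, both reported group scores (55.256 and 46.228) fall in the moderate range, consistent with the figure.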

Then, based on the gain scores from the pre-test and post-test results, the mean of the first experimental group was higher than that of the second experimental group. Accordingly, the null hypothesis that the use of Grammarly followed by teacher feedback has no significant effect on the writing skills of Indonesian undergraduate EFL students can be rejected. Therefore, it can be concluded that using Grammarly followed by teacher feedback in the writing classes of Indonesian undergraduate EFL students was effective in this study.

4.1.2. One-Way ANCOVA

In this study, there were two hypotheses. H0: there is no significant effect of using Grammarly followed by teacher feedback in the writing classes of Indonesian undergraduate EFL students. Ha: there is a significant effect of using Grammarly followed by teacher feedback in the writing classes of Indonesian undergraduate EFL students. Based on the SPSS output, the One-Way ANCOVA has a sig. value of 0.000 < 0.05. As a result, H0 was rejected and Ha was retained, indicating a difference in learning outcomes between using the Grammarly application first and receiving teacher feedback first.

TABLE 2. Tests of Between-Subjects Effects

Dependent Variable: Posttest
Source Type III Sum of Squares df Mean Square F Sig.
Corrected Model 484.145a 2 242.072 19.412 .000
Intercept 1865.755 1 1865.755 149.614 .000
Pretest 298.729 1 298.729 23.955 .000
Group 172.525 1 172.525 13.835 .000
Error 935.287 75 12.470
Total 571482.140 78
Corrected Total 1419.432 77

a. R Squared = .318 (Adjusted R Squared = .300)

An analysis of covariance (ANCOVA) was performed using the first test score as a covariate to see if the means of the second test scores for the two groups differed statistically significantly. The significance level was established at p = 0.05 for all analyses. The findings confirmed the validity of the normality assumption, as shown in Table 3.

TABLE 3. Test of Normality

Group Kolmogorov-Smirnov Shapiro-Wilk
Statistic df Sig. Statistic df Sig.
Residual for Posttest Grammarly-Teacher Feedback .122 40 .139 .968 40 .311
Teacher Feedback -Grammarly .096 38 .200* .989 38 .960


The normality test yielded significance values of 0.311 and 0.960; both classes thus had significance greater than 0.05. Levene’s test for homogeneity of variance showed that the assumption of equality of variance was not violated, as shown in Table 4, with a significance value of 0.221 > 0.05.
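
The assumption checks reported above correspond directly to SciPy’s `shapiro` and `levene` functions. A minimal sketch on simulated residuals (the residual distributions are assumptions, not the study’s data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical post-test residuals for the two class orderings.
resid_g1 = rng.normal(0.0, 3.5, 40)   # Grammarly -> teacher feedback
resid_g2 = rng.normal(0.0, 3.5, 38)   # teacher feedback -> Grammarly

w1, p_norm1 = stats.shapiro(resid_g1)              # Shapiro-Wilk normality test
w2, p_norm2 = stats.shapiro(resid_g2)
stat, p_levene = stats.levene(resid_g1, resid_g2)  # homogeneity of variance

# p > .05 means the assumption (normality / equal variances) is not rejected.
print(p_norm1, p_norm2, p_levene)
```

As in the SPSS output, normality is checked per group on the residuals, while Levene’s test compares the variances of the two groups against each other.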

The significance value was determined from the SPSS output of the One-Way ANCOVA test in Table 5: 0.000 < 0.05. Consequently, there is a difference between learning through the Grammarly application followed by teacher feedback and receiving teacher feedback followed by using the Grammarly application.

TABLE 4. Levene’s Test of Equality of Error Variances

Dependent Variable: Posttest
F df1 df2 Sig.
1.521 1 76 .221

TABLE 5. Tests of Between-Subjects Effects

Dependent Variable: Posttest
Source Type III Sum of Squares df Mean Square F Sig.
Group 172.525 1 172.525 13.835 .000

TABLE 6. Tests of Group

Dependent Variable: Posttest
Group Mean Std. Error 95% Confidence Interval
Lower Bound Upper Bound
Grammarly-Teacher Feedback 86.940a .558 85.827 88.052
Teacher Feedback-Grammarly 83.963a .573 82.822 85.105


Based on the descriptive analysis in Table 6, the adjusted mean in the class that received the Grammarly treatment followed by teacher feedback is 86.940, while the group that received teacher feedback first and then used the Grammarly application has a mean of 83.963. Thus, the class that used Grammarly first outperformed the class that received teacher feedback first, indicating that the learning treatment beginning with Grammarly had a significant effect.

4.1.3. The t-Test

Grammarly and teacher feedback were two types of feedback given to the students in this study. On the one hand, Grammarly deals with three aspects of writing: Diction, language use, and mechanics. On the other hand, teacher feedback may cover the five writing aspects: Content, Organization, Diction, Language Use, and Mechanics. These feedback types were given in the first and second experimental groups.

This study’s t-test determined whether the two groups wrote at a similar level before addressing the first research question. Pre- and post-test writing scores ranged from 1 to 5. A significant difference in post-test scores was observed between the group that received teacher feedback first (M = 4.0137, SD = .46362) and the group that used Grammarly first (M = 4.2310, SD = .61832), Sig. = .008 < .05. Indonesian undergraduate EFL students thus benefited more from using Grammarly first, followed by teacher feedback, in their writing classes.
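
An independent-samples comparison like the one above maps onto `scipy.stats.ttest_ind`. The sketch below simulates scores around the reported means and standard deviations (the raw data are not available, so the simulated samples are assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated post-test scores on the 1-5 scale, mirroring the reported M and SD.
grammarly_first = rng.normal(4.2310, 0.61832, 40)
teacher_first = rng.normal(4.0137, 0.46362, 38)

# Welch's t-test (does not assume equal variances between the two groups).
t, p = stats.ttest_ind(grammarly_first, teacher_first, equal_var=False)
print(f"t = {t:.3f}, p = {p:.4f}")
```

With `equal_var=True` this becomes the classic Student's t-test; Welch's variant is the safer default when the two groups' variances differ, as the reported SDs here suggest.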

Using Grammarly first, followed by teacher feedback, thus proved more effective for Indonesian undergraduate EFL students than teacher feedback followed by Grammarly. Therefore, H0 was rejected and H1 was accepted: Grammarly followed by teacher feedback was effective in the writing classes of Indonesian undergraduate EFL students.

4.2. The extent to which Grammarly and teacher feedback affect the writing skills of Indonesian EFL undergraduate students

As shown in Table 7, for teacher feedback in terms of content, the mean of the first experimental group was 27.08, higher than the second experimental group’s 22.78. The data were normally distributed and homogeneous. For the t-test, the calculated t value of 9.189 exceeded the t-table value of 2.040, indicating a significant difference between the first and second experimental groups (p = .003).
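
The t-table value of 2.040 quoted above can be reproduced with the inverse t-distribution. The study does not state the degrees of freedom used; df = 31 reproduces the quoted two-tailed critical value at α = .05, so that choice is an inference:

```python
from scipy import stats

alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=31)   # two-tailed critical value, ~2.040
print(round(t_crit, 3))

# Decision rule applied to the calculated t for the content scores.
t_calculated = 9.189
decision = "reject H0" if t_calculated > t_crit else "fail to reject H0"
print(decision)   # reject H0, since 9.189 > 2.040
```

With the full pooled sample (df = 40 + 38 − 2 = 76) the critical value would be smaller, about 1.99, so the reject decision is unchanged either way.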

TABLE 7. The summary feedback from Grammarly and teacher feedback

Items Group N Mean Score Normality Test Homogeneity Test t-test t-table (α= 0.05) Conclusion
Content First Experimental Group 40 27.08 Normal Homogenous 9.189 2.040 t-test > t-table; Ho is rejected
Second Experimental Group 38 22.78 Normal Homogenous
Organization First Experimental Group 40 17.1842 Normal Homogenous 8.369 t-test > t-table; Ho is rejected
Second Experimental Group 38 13.55 Normal Homogenous
Diction First Experimental Group 40 18.2632 Normal Homogenous 11.434 t-test > t-table; Ho is rejected
Second Experimental Group 38 16.82 Normal Homogenous
Language Use First Experimental Group 40 19.6645 Normal Homogenous 3.033 t-test > t-table; Ho is rejected
Second Experimental Group 38 19.50 Normal Homogenous
Mechanics First Experimental Group 40 3.99 Normal Homogenous 3.363 t-test > t-table; Ho is rejected
Second Experimental Group 38 3.9539 Normal Homogenous

TABLE 8. The result of the t-test (First and Second experimental group)

Group N Mean Normality Homogeneity t-test t-table
First Experimental Group 40 103.87 Normal Homogenous 1.001 1.696
Second Experimental Group 38 99.48 Normal


Next, as shown in Table 8, several assessments in the two experimental classes support the use of Grammarly and teacher feedback in the writing course. The pre-test and post-test results showed that both tools can improve students’ writing. The data met the normality and homogeneity criteria, and the overall t-test result of 1.001 was less than the t-table value of 1.696, indicating no significant difference between the two groups’ total scores. Both sequences of Grammarly and teacher feedback thus improved student writing.

4.3. Responses of Indonesian undergraduate EFL students to Grammarly and teacher feedback

The online questionnaire was sent to eighty students from the two classes, but only 78 completed it. It was administered through Google Forms, which made it easy for the students to read and respond to the statements.

The students’ responses to Grammarly in the writing classes were generally positive. Moreover, this questionnaire consisted of 25 statements divided into three parts: The Effectiveness of Grammarly, Grammarly as an Instructional Tool to Help with Writing from Sources, and the Frequency of Using Grammarly, as shown in Figure 2.

Figure 3 shows that statements 1, 2, and 12 had the highest scores in the first part of the first experimental group’s questionnaire. Statement 1 was “Grammarly’s feedback positively affects my writing text.” Statement 2 was “Grammarly boosted my confidence as I handed in the length of argumentative texts,” and Statement 12 was “Grammarly is a user-friendly program.” These three statements had the highest percentage of respondents strongly agreeing (92.5%).

Then, the result of the questionnaire from the first experimental group (Second Part: Grammarly as an Instructional Tool to Help with Writing from Sources) is shown in Figure 4.

FIGURE 3. Questionnaire from the first experimental group (First part: The effectiveness of Grammarly)

FIGURE 4. Questionnaire from the first experimental group (Second part: Grammarly as an instructional tool to help with writing from sources)

FIGURE 5. Questionnaire from the first experimental group (Third part: frequency of using Grammarly)

As shown in Figure 5, Statement 24 received the highest score in the third part of the first experimental group’s questionnaire: “I started to use Grammarly since I was a student in the English Education Department.”

5. DISCUSSION

5.1. Effects of Grammarly and teacher feedback on the writing skills of Indonesian undergraduate EFL students

This study divided the tests into two sections: pre-test and post-test. The pre-test showed that the first-year students, all of whom had completed semester one, were at a comparable level, so both groups started equally. After this test of basic EFL writing skills, students were assigned a topic. For the post-test, the two groups received different treatment sequences. The first experimental group used Grammarly for the first two writing tasks, followed by teacher feedback for the last two. Students received Grammarly’s corrected and original reports, which included revision comments for easy evaluation of their writing.

This study examined how Grammarly affected Indonesian EFL students. The first experimental group performed better than the second: it scored 68.61% on the N-Gain effectiveness interpretation, a moderately effective outcome, whereas the second experimental group’s average N-Gain score was 46.675, or 46.68 percent, a less effective outcome. Indonesian undergraduate EFL learners thus performed better in the group that used Grammarly first and then received teacher feedback.

This study extends previous research on Grammarly and teacher feedback. The students’ successful integration of both kinds of input into their revisions, and their increased post-test scores, showed that they used the feedback and improved their writing, demonstrating its pedagogical potential (Thi & Nikolov, 2022). Nova (2018) reported that Grammarly’s feedback helps students learn better, and O’Neill and Russell (2019) stated that Grammarly provides more feedback, faster, than traditional methods. The findings suggest using machine feedback in writing instruction to supplement teacher feedback. Ghufron and Rosyida (2018) also found that students who use Grammarly make fewer errors than those who receive the teacher’s indirect corrective feedback. Moreover, Grammarly’s user-friendliness and accessibility make it simple for students to employ, resulting in favorable reviews and enhanced writing practices (Raskova, 2023). In sum, Grammarly’s use in this study was effective, particularly for the argumentative writing skills of Indonesian EFL undergraduates.

Previous research has demonstrated that Grammarly’s feedback feature assists students with writing improvement and error identification (Qassemzadeh & Soleimani, 2016). Daniels and Leslie (2016) also reported that Grammarly is language-learning software that can evaluate EFL writing and improve language skills. This is consistent with earlier findings that electronic feedback such as Grammarly helps students overcome their apprehension about grammar (Saadi & Saadat, 2015). Grammarly is also a helpful instrument for learning English, providing many tools and challenging exercises that keep students involved (Galingging et al., 2023). Moreover, it offers valuable features such as a grammar, punctuation, and spelling checker, helping students save time and develop confidence in academic writing (Faisal & Carabella, 2023). In sum, Grammarly’s feedback feature helps students improve their writing by identifying mistakes and by providing an abundance of tools and challenging exercises that keep them engaged.

5.2. The extent to which Grammarly and teacher feedback affect the writing skills of Indonesian EFL undergraduate students

The second question of the study investigates to what extent Grammarly affects the writing skills of Indonesian undergraduate EFL students. The investigation of using Grammarly on students’ texts was calculated using Diction, Language Use, and Mechanics. Then, teacher feedback was on Content, Organization, Mechanics, Language Use, and Diction. This study found that Grammarly software has more effect on mechanics (spelling and punctuation), language use (grammar), and diction but less on the content and organization. However, teacher feedback has more effect on content and organization but less on mechanics, language use, and diction.

Several reports have shown that students who use Grammarly to evaluate their work make significantly fewer errors than students who rely on the teacher’s evaluation, and that writing teachers could use it regularly or encourage students to use it independently (Ghufron & Rosyida, 2018; Thi & Nikolov, 2022). Ghufron and Rosyida add that Grammarly reduces errors in diction, language use, and writing mechanics. Dizon and Gayed (2021) likewise reported, based on descriptive statistics and t-test results, that students made fewer grammatical errors when they used Grammarly to help them write in the L2. Next, Grammarly emphasizes the significance of technology and self-directed learning in modern education by enabling writers to track learning objectives and engagement (Wardatin et al., 2022). Moreover, Grammarly has the potential to be a helpful tool for facilitating students’ learning and assessment of source-based writing techniques (Dong & Shi, 2021). In sum, in this study, using Grammarly significantly affected the mechanics, diction, and language use of the students’ argumentative texts rather than their content and organization.

In the first experimental group, Grammarly was used to evaluate and correct students’ writing for the first two texts, with students trained to use it independently. Teachers then provided feedback on the third and fourth texts. Grammarly reports, submitted alongside the corrected documents, included comments that helped teachers assess revisions. Students were evaluated on writing mechanics (spelling, punctuation, grammar) and diction.

For the third and fourth assignments, teachers reviewed student work, reading it aloud and taking notes on content, organization, diction, mechanics, and language use. Corrections were made and returned via Google Drive. Grammarly effectively reduced errors in mechanics, diction, and language use, identifying mistakes such as missing spaces and punctuation while offering correction suggestions. This aligns with Daniels and Leslie (2016), who describe Grammarly as a software application that can aid language learning, particularly in evaluating EFL writing. Grammarly’s feedback feature also helps students identify errors and improve their writing (Qassemzadeh & Soleimani, 2016; Thi & Nikolov, 2022).

Grammarly has minimal impact on writing organization and content, as it cannot assess topic relevance or paragraph coherence. While it detects sentence movement, it overlooks logical flow. The tool prioritizes mechanical accuracy—spelling, punctuation, and grammar—over argumentation and coherence (Muhammad, 2024; Resiana et al., 2024). It may also flag contextually appropriate sentences as incorrect, emphasizing form over substance. This reliance on technical correctness can limit critical thinking and complex writing, potentially hindering academic success.

Next, the limitation of Grammarly is that it cannot give students personalized feedback that meets their learning needs (Hasby et al., 2024). Therefore, without tailored guidance, students may receive the same feedback regardless of their writing strengths or weaknesses (Ding & Zou, 2024). Moreover, Grammarly may not help writers improve due to its lack of personalized support (Ebadi et al., 2022). In conclusion, Grammarly can provide general corrections but not customized feedback to help students improve their writing.

There are other issues and limitations in using Grammarly in writing classes. First, some students do not have computers, tablets, or a stable internet connection with which to use Grammarly (Sanderson & Stephens, 2023); without devices or reliable internet, students cannot use the tool to correct their grammar, making automated feedback less useful. Second, students may not be comfortable using Grammarly or other digital tools; software novices may struggle with its features, which may frustrate them or prevent them from using it (Giray, 2024). Third, there are the limits of the free versus the premium version: Grammarly’s free version may not give students as much feedback as the paid version, hindering their writing improvement (Setyowati et al., 2024), and students who cannot afford the premium version may miss out on advanced features, creating disparities. While Grammarly supports writing, its accessibility, usability, and cost barriers must be addressed. Nevertheless, although previous studies have shown these limitations, this study found that it is more effective to begin with Grammarly and then provide teacher feedback than to provide teacher feedback first. This sequence fostered greater autonomy among students and motivated them to check their assignments: Grammarly affected mechanics, diction, and language use, while teacher feedback addressed content and organization.

Concerning the second research question, this study has shown that the present results are significant in at least two crucial respects. The findings in this study align with Barrot (2020), who states that Grammarly is an effective tool for students to use in their writing classes because it helps detect mechanical errors. Next, the findings in this study support the previous study from Fitria (2022) that Grammarly automatically verifies typed work based on several factors and reveals grammatical and mechanical student writing errors. In this study, Grammarly has more effect on diction, language use, and mechanics but less on content and organization. On the other hand, teacher feedback has more effect on content and organization but less effect on diction, mechanics, and language use.

5.3. Responses to Grammarly and Teacher Feedback

Regarding the effectiveness of Grammarly, most students found its suggestions helpful in improving their papers, and half of the students believed it helped them achieve a higher grade. In the first section of the survey, most students agreed that Grammarly was a user-friendly program that helped them improve their writing skills. These findings indicate that students have positive attitudes toward using AWE tools to improve their writing: overall, students found Grammarly practical and valuable, with more than 85% rating its practical and helpful aspects positively. This is in line with Perdana et al. (2021), who found the Grammarly app helpful for many linguistic issues in academic writing and beyond the academic world.

The findings of this study coincide with those of Miranty et al. (2023), who state that Grammarly is perceived as a helpful AWE tool by students across all year levels because they recognize the need for proofreading services. Next, O’Neill and Russell (2019) reported that Grammarly students scored higher on 9 out of 15 survey items and were happier with the grammar advice they received than non-Grammarly students. Research by Bailey and Lee (2020) found that students using Grammarly as automated feedback can increase their confidence and save time, incrementally improving their compositions when writing in a second language because of fewer accuracy mistakes. Moreover, undergraduate EFL students responded positively regarding the potential advantages of AWE tools such as Grammarly in enhancing their writing skills (Miranty et al., 2023).

Concerning the third research question, based on the result from the questionnaire, there were positive responses from learners on using Grammarly as automated feedback instead of teacher feedback for submission in their writing course. Grammarly positively influenced the learners, gave feedback, and corrected their mistakes before submission, thus saving instructor effort and developing self-assisted learning styles among EFL learners.

6. CONCLUSIONS

This study examined the use of Grammarly and teacher feedback in undergraduate EFL writing classes in Indonesia. These two types of feedback were applied because Grammarly can reduce the time teachers spend checking student assignments, make students independent learners, and give immediate writing feedback. However, Grammarly cannot replace teachers, because ELT classrooms require student-teacher interaction. In EFL writing, Grammarly reduces errors in mechanics (spelling and punctuation), language use (grammar), and diction (vocabulary), but not in content and organization. Teacher feedback affects content and organization more than diction, language use, and mechanics: teachers quickly spotted gaps between topic and content and were sensitive to paragraphs lacking coherence. Moreover, the incorporation of Grammarly and teacher feedback into students’ revisions showed that students used the feedback effectively, which improved their writing skills. In sum, Grammarly and teacher feedback are valuable tools for enhancing the EFL writing skills of Indonesian undergraduate students.

The study indicates two primary implications: first, Grammarly aids large classes in conserving time by delivering instantaneous feedback on mechanics, whereas teacher feedback is crucial for enhancing content and organization. Second, the research indicated that employing Grammarly before receiving teacher feedback proved more effective. The study’s dependence on closed-ended questions constrained its breadth, necessitating future research to include open-ended questions for enhanced understanding. Subsequent research should investigate various contexts, the influence of automated feedback on self-editing, and the responses of L1 and EFL learners to both automated and instructor feedback. Further investigation is required to validate the effectiveness of automated feedback and its overall influence on writing enhancement.

Declaration of conflicting interest

There is no conflict of interest in this work.

7. FUNDING

We want to express our sincere gratitude to the Research and Community Service Unit of Universitas Negeri Malang with contract number 19.5.1104/UN32.20.1/LT/2022 for funding our research, which was used as a basis to write this article.

8. REFERENCES

Agustin, R., & Wulandari, S. (2022). The Analysis of Grammatical Errors on Students’ Essay Writing by Using Grammarly. Jurnal Pendidikan Bahasa Inggris Proficiency, 4(1), 39-46. https://doi.org/10.32503/proficiency.v4i1.2247

Albesher, K. B. (2022). Teachers’ Views on Using the Writing Process Approach to Improve ESL Learners’ Writing Skills. International TESOL & Technology Journal (ITTJ), 17(2), 76–95. https://connect.academics.education/index.php/itj/issue/view/31

Alharbi, W. (2023). AI in the Foreign Language Classroom: A Pedagogical Overview of Automated Writing Assistance Tools. Education Research International. https://doi.org/10.1155/2023/4253331

Aljameel, I. H. (2022). Computer-Assisted Language Learning in Saudi Arabia: Past, Present, and Future. International Education Studies, 15(4), 95. https://doi.org/10.5539/ies.v15n4p95

Almusharraf, N., & Alotaibi, H. (2023). An error-analysis study from an EFL writing context: Human and Automated Essay Scoring Approaches. Technology, Knowledge and Learning, 28(3), 1015-1031. https://doi.org/10.1007/s10758-022-09592-z

Alsmari, N. A. (2019). Fostering EFL Students’ Paragraph Writing Using Edmodo. English Language Teaching, 12(10), 44. https://doi.org/10.5539/elt.v12n10p44

Anastasia, W., Murtisari, E., & Isharyanti, N. (2024). Less-proficient EFL students’ use of Grammarly in writing: Behavior, cognition, and affect. The JALT CALL Journal, 20(1), 1-25. https://doi.org/10.29140/jaltcall.v20n1.1089

Andina, D. M., Cahyono, B. Y., & Widiati, U. (2020). How English Foreign Language Students’ Autonomy and Digital Competence Relate to Their Writing Achievement. Tadris: Jurnal Keguruan Dan Ilmu Tarbiyah, 5(1), 77-86. https://doi.org/10.24042/tadris.v5i1.5760

Ariyanto, M. S. A., Mukminatien, N., & Tresnadewi, S. (2021). College Students’ Perceptions of an Automated Writing Evaluation as a Supplementary Feedback Tool in a Writing Class. Jurnal Ilmu Pendidikan (JIP), 27(1), 41-51. https://doi.org/10.17977/um048v27i1p41-51

Arredondo, L. L. A., & Laurens, L. (2023). Metaversity: Beyond Emerging Educational Technology. Sustainability, 15(22), 15844. https://doi.org/10.3390/su152215844

Awada, G. M., & Diab, N. M. (2023). Effect of online peer review versus face-to-Face peer review on argumentative writing achievement of EFL learners. Computer Assisted Language Learning, 36(1-2), 238–256. https://doi.org/10.1080/09588221.2021.1912104.

Ayan, A. D., & Erdemir, N. (2023). EFL Teachers’ Perceptions of Automated Written Corrective Feedback and Grammarly. Ahmet Keleşoğlu Eğitim Fakültesi Dergisi, 5(3), 1183-1198. https://doi.org/10.38151/akef.2023.106

Backman, Y., Reznitskaya, A., Gardelli, V., & Wilkinson, I. A. G. (2023). Beyond Structure: Using the Rational Force Model to Assess Argumentative Writing. Written Communication, 40(2), 555-585. https://doi.org/10.1177/07410883221148664

Bai, B., Wang, J., & Zhou, H. (2022). An Intervention Study to Improve Primary School Students’ self-regulated strategy use in English Writing through e-learning in Hong Kong. Computer Assisted Language Learning, 35(9), 2265-2290. https://doi.org/10.1080/09588221.2020.1871030

Bai, L., & Hu, G. (2017). In the face of fallible AWE feedback: How do students respond? Educational Psychology, 37(1), 67-81. https://doi.org/10.1080/01443410.2016.1223275

Bailey, D., & Lee, A. R. (2020). An Exploratory Study of Grammarly in the Language Learning Context: An Analysis of Test-Based, Textbook-Based and Facebook Corpora. TESOL International Journal, 15(2), 4–27. https://www.tesol-international-journal.com/volume-15-issue-2-2020/

Balta, E. E. (2018). The Relationships Among Writing Skills, Writing Anxiety and Metacognitive Awareness. JEL, 7(3), 233-241. https://doi.org/10.5539/jel.v7n3p233

Barrot, J. S. (2020). Integrating Technology into ESL/EFL Writing through Grammarly. RELC Journal, 1–5. https://doi.org/10.1177/0033688220966632

Barrot, J. S. (2023). Trends in automated writing evaluation systems research for teaching, learning, and assessment: A bibliometric analysis. Education and Information Technologies, 7155-7179. https://doi.org/10.1007/s10639-023-12083-y

Boud, D., & Dawson, P. (2023). What feedback literate teachers do: An empirically-derived competency framework. Assessment & Evaluation in Higher Education, 48(2), 157-171. https://doi.org/10.1080/02602938.2021.191092

Bulatović, V. V., Mirović, I., & Kaurin, T. (2024). Analyzing grammarly software for corrective feedback: Teacher’s perspective on affordances, limitations and implementation. Focus on ELT Journal, 6(1), 74–86. https://doi.org/10.14744/felt.6.1.6

Cavaleri, M., & Dianati, S. (2016). You want me to check your grammar again? The usefulness of an online grammar checker as perceived by students. Journal of Academic Language and Learning, 10(1), 223-236. https://journal.aall.org.au/index.php/jall/article/view/393

Daniels, P., & Leslie, D. (2016). Grammar software ready for EFL writers? OnCue Journal, 9(4), 391-401. https://jaltcue.org/journal_9.4

Dewi, U. (2023). Grammarly as Automated Writing Evaluation: Its Effectiveness from EFL Students’ Perceptions. Lingua Cultura, 16(2), 155-161. https://doi.org/10.21512/lc.v16i2.8315

Ding, L., & Zou, D. (2024). Automated writing evaluation systems: A systematic review of Grammarly, Pigai, and Criterion with a perspective on future directions in the age of generative artificial intelligence. Education and Information Technologies, 29, 14151-14203. https://doi.org/10.1007/s10639-023-12402-3

Dizon, G., & Gayed, J. (2021). Examining the impact of Grammarly on the quality of mobile L2 writing. The JALT CALL Journal, 17(2), 74-92. https://doi.org/10.29140/jaltcall.v17n2.336.

Dong, Y., & Shi, L. (2021). Using Grammarly to support students’ source-based writing practices. Assessing Writing, 50, 100564. https://doi.org/10.1016/j.asw.2021.100564

Ebadi, S., Gholami, M., & Vakili. (2022). Investigating the Effects of Using Grammarly in EFL Writing: The Case of Articles. Computers in the Schools, 40(1), 85-105. https://doi.org/10.1080/07380569.2022.2150067

Ebyary, K. E., & Windeatt, S. (2010). The impact of computer-based feedback on students’ written work. IJES, 10(2), 121-142. https://doi.org/10.6018/ijes/2010/2/119231.

Elumalai, K. V. (2019). Teacher Constructed Corrective Feedback Enhancing Students Writing Skills in EFL Classroom. Advances in Language and Literary Studies, 10(5), 103. https://doi.org/10.7575/aiac.alls.v.10n.5p.103

Estacio, D. L., Valencia, N. M., & Abdala, M. P. L. T. (2022). Development and Evaluation of Academic Learning Modules by the Faculty of Architecture and Fine Arts of Bulacan State University Using Grammarly Software by the Library Service. International Journal of Research Publications, 97(1), 335-344. https://doi.org/10.47119/IJRP100971320223002

Fahmi, M. A., & Cahyono, B. Y. (2021). EFL students’ perception on the use of Grammarly and teacher feedback. JEES (Journal of English Educators Society), 6(1), 18-25. https://doi.org/10.21070/jees.v6i1.849

Faisal, F., & Carabella, P. A. (2023). Utilizing Grammarly in an Academic Writing Process: Higher-Education Students’ Perceived Views. Journal of English Language Teaching and Linguistics, 8(1), 23-42. https://doi.org/10.21462/jeltl.v8i1.1006

Fitria, T. N. (2021). Grammarly’ As A Teachers’ Alternative in Evaluation Non-EFL Students Writings. Leksema: Jurnal Bahasa Dan Sastra, 6(2), 141-152. https://doi.org/10.22515/ljbs.v6i2.3957

Fitria, T. N. (2022). Identifying Grammatical and Mechanical Errors of Students’ Writing: Using “Grammarly” as an Online Assessment. Lingua Didaktika: Jurnal Bahasa Dan Pembelajaran Bahasa, 16(2), 169-184. https://doi.org/10.24036/ld.v16i2.116824

Fitriana, K., & Nurazni, L. (2021). Exploring English Department Students’ Perceptions on Using Grammarly to Check the Grammar in their Writing. Journal of English Teaching, 8(1), 15-25. https://doi.org/10.33541/jet.v8i1.3044

Fitriana, K., & Nurazni, L. (2022). Exploring Students’ Perception of Using Grammarly to Check Grammar in Their Writing. JET (Journal of English Teaching), 8(1), 15-25. https://doi.org/10.33541/jet.v8i1.3044

Galingging, C. K., Sipayung, K. T., Silitonga, H., & Pardede, S. (2023). The Effectiveness Of Grammarly Application On Writing Descriptive Text Tenth Grade SMA Negeri 1 Lau Baleng. Journal on Education, 06(01), 2891-2904. https://jonedu.org/index.php/joe

Ghanbari, N., & Salari, M. (2022). Problematizing Argumentative Writing in an Iranian EFL Undergraduate Context. Frontiers in Psychology, 13, 862400. https://doi.org/10.3389/fpsyg.2022.862400

Ghufron, M. A., & Rosyida, F. (2018). The Role of Grammarly in Assessing English as a Foreign Language (EFL) Writing. Lingua Cultura, 12(4), 395. https://doi.org/10.21512/lc.v12i4.4582.

Giray, L. (2024). “Don’t Let Grammarly Overwrite Your Style and Voice:” Writers’ Advice on Using Grammarly in Writing. Internet Reference Services Quarterly, 28(3), 293-303. https://doi.org/10.1080/10875301.2024.2344762

Gozali, I., Wijaya, A. R. T., Lie, A., Cahyono, B. Y., & Suryati, N. (2024). Leveraging the potential of ChatGPT as an automated writing evaluation (AWE) tool: Students’ feedback literacy development and AWE tools integration framework. The JALT CALL Journal, 20(1), 1-22. https://doi.org/10.29140/jaltcall.v20n1.1200

Grammarly. (2019). About Grammarly. Retrieved from https://app.grammarly.com.

Hasby, N., Perdana, I., & Kodriyah, L. (2024). A Systematic Literature Review on Grammarly in English Studies: Advantages that Dazzle and Disadvantages that Fazzle. AJELP: The Asian Journal of English Language & Pedagogy, 12(2), 139-154.

Heron, M., Medland, E., Winstone, N., & Pitt, E. (2023). Developing the relational in teacher feedback literacy: Exploring feedback talk. Assessment & Evaluation in Higher Education, 48(2), 172-185. https://doi.org/10.1080/02602938.2021.1932735

Hockly, N. (2019). Automated Writing Evaluation. ELT Journal, 73(1), 82-88. https://doi.org/10.1093/elt/ccy044

Howell, D. C. (2012). Statistical Methods for Psychology (8th ed.). Cengage Learning.

Inayah, T. M., & Apoko, T. W. (2024). Exploring Students’ Perspectives on the Use of Grammarly in Writing Analytical Exposition Text. JLE: Journal of Literate of English Education Study Program, 5(1), 73-83. https://doi.org/10.47435/jle.v5i1.2802

Jacobs, H. L., Zinkgraf, S. A., Wormuth, D. R., Hartfiel, V. F., & Hughey, J. B. (1981). Testing ESL Composition: A Practical Approach. Rowley, MA: Newbury House.

Jomaa, N., & Jibroo, H. (2024). Corrective Feedback of Grammarly in Enhancing L2 Writing by EFL Kurdish Students. Bulletin of Advanced English Studies, 9(1), 1-15. https://doi.org/10.31559/BAES2024.9.1.1

Joo, M. (2021). The Influence of Users’ Satisfaction With AWE on English Learning Achievement through Self-Efficacy: Using PLS-SEM. Journal of Digital Convergence, 19(9), 1-8. https://doi.org/10.14400/JDC.2021.19.9.001

Jumariati, J., & Sulistyo, G. (2017). Problem-Based Writing Instruction: Its Effect on Students’ Skills in Argumentative Writing. Arab World English Journal, 8(2), 87-100. https://doi.org/10.24093/awej/vol8no2.6

Karyuatry, L., Rizqan, M. D., & Darayani, N. A. (2018). Grammarly as a Tool to Improve Students’ Writing Quality: Free Online-Proofreader across the Boundaries. JSSH (Jurnal Sains Sosial Dan Humaniora), 2(1), 83-89. https://doi.org/10.30595/jssh.v2i1.2297

Khushk, M., Masroor, H., & Naeem, A. (2024). Evolution and Use of Grammarly in English Language Teaching and Learning in The Classroom Context. Al-Aijaz Research Journal of Islamic Studies & Humanities, 8(2), 72-78.

Kleemola, K., Hyytinen, H., & Toom, A. (2022). The Challenge of Position-Taking in Novice Higher Education Students’ Argumentative Writing. Frontiers in Education, 7. https://doi.org/10.3389/feduc.2022.885987

Koltovskaia, S. (2023). Postsecondary L2 writing teachers’ use and perceptions of Grammarly as a complement to their feedback. ReCALL, 35(3), 290-304. https://doi.org/10.1017/S0958344022000179

Kurnia, A. (2022). EFL Students’ Problems In Dealing With Teacher Written Feedback. Jurnal Pendidikan Indonesia, 3(05), 485-492. https://doi.org/10.36418/japendi.v3i05.1143

Landrieu, Y., De Smedt, F., Van Keer, H., & De Wever, B. (2022). Assessing the Quality of Argumentative Texts: Examining the General Agreement Between Different Rating Procedures and Exploring Inferences of (Dis)agreement Cases. Frontiers in Education, 7, 784261. https://doi.org/10.3389/feduc.2022.784261

Landrieu, Y., De Smedt, F., Van Keer, H., & De Wever, B. (2023). Argumentation in collaboration: The impact of explicit instruction and collaborative writing on secondary school students’ argumentative writing. Reading and Writing, 37, 1407-1434. https://doi.org/10.1007/s11145-023-10439-x

Lei, J. I. (2020). An AWE-Based Diagnosis of L2 English Learners’ Written Errors. English Language Teaching, 13(10), 111-119. https://doi.org/10.5539/elt.v13n10p111

Li, R. (2023). Still a fallible tool? Revisiting effects of automated writing evaluation from activity theory perspective. British Journal of Educational Technology, 54(3), 773-789. https://doi.org/10.1111/bjet.13294

Liao, H. C. (2016). Using automated writing evaluation to reduce grammar errors in writing. ELT Journal, 70(3), 308-319. https://doi.org/10.1093/elt/ccv058

Liu, C. C., Liu, S. J., Hwang, G. J., Tu, Y. F., Wang, Y., & Wang, N. (2023). Engaging EFL students’ critical thinking tendency and in-depth reflection in technology-based writing contexts: A peer assessment-incorporated automatic evaluation approach. Educ Inf Technol, 28, 13027-13052. https://doi.org/10.1007/s10639-023-11697-6

Loncar, M., Schams, W., & Shing, L. (2023). Multiple technologies, multiple sources: Trends and analyses of the literature on technology-mediated feedback for L2 English writing. Computer Assisted Language Learning, 36(4). https://doi.org/10.1080/09588221.2021.1943452

Martínez, J., López-Díaz, A., & Pérez, E. (2020). Using Process Writing in the Teaching of English as a Foreign Language. Revista Caribeña de Investigación Educativa (RECIE), 4(1), 49-61. https://doi.org/10.32541/recie.2020.v4i1.pp49-61

Miranty, D., & Widiati, U. (2021). An automated writing evaluation (AWE) in higher education. Pegem Journal of Education and Instruction, 11(4), 126-137. https://doi.org/10.47750/pegegog.11.04.12

Miranty, D., Widiati, U., Cahyono, B. Y., & Suzila, T. I. (2023). Automated writing evaluation tools for Indonesian undergraduate English as a foreign language students’ writing. International Journal of Evaluation and Research in Education (IJERE), 12(3), 1705. https://doi.org/10.11591/ijere.v12i3.24958

Mirzae, F., & Shamsudin, S. (2023). Process Oriented Writing Approach by UTM Iranian Students: Difficulties and Features. International Journal of Advanced Research in Education and Society, 5(2), 248-260. https://doi.org/10.55057/ijares.2023.5.2.23

Mohsen, M. A., & Alshahrani, A. (2019). The Effectiveness of Using A Hybrid Mode of Automated Writing Evaluation System on EFL Students’ Writing. Teaching English with Technology, 19(1), 118-131. https://tewtjournal.org/volume-2019/issue-1/

Muhammad, G. (2024). The Effectiveness of Grammarly Features in Building Arguments in Writing Essays. IKOMTI, 5(3), 41-47. https://doi.org/10.35960/ikomti.v5i3.1664

Nasser, S. M. (2018). Iraqi EFL Students’ Difficulties in Writing Composition: An Experimental Study (University of Baghdad). International Journal of English Linguistics, 9(1), 178. https://doi.org/10.5539/ijel.v9n1p178

Nova, M. (2018). Utilizing Grammarly in Evaluating Academic Writing: A Narrative Research on EFL Students’ Experience. Premise: Journal of English Education and Applied Linguistics, 7(1), 80-96. https://ojs.fkip.ummetro.ac.id/index.php/english/issue/view/April%20Edition2018

Olson, C. B., Maamuujav, U., Steiss, J., & Chung, H. (2023). Examining the Impact of a Cognitive Strategies Approach on the Argument Writing of Mainstreamed English Learners in Secondary School. Written Communication, 40(2), 373-416. https://doi.org/10.1177/07410883221148724

O’Neill, R., & Russell, A. (2019). Grammarly: Help or hindrance? Academic Learning Advisors’ perceptions of an online grammar checker. Journal of Academic Language & Learning, 13(1), 88-107. https://journal.aall.org.au/index.php/jall/article/view/591

O’Neill, R., & Russell, A. (2019). Stop! Grammar time: University students’ perceptions of the automated feedback program Grammarly. Australasian Journal of Educational Technology (AJET), 35(1), 42-55. https://doi.org/10.14742/ajet.3795

Özdemir, S. (2018). The Effect of Argumentative Text Pattern Teaching on Success of Constituting Argumentative Text Elements. World Journal of Education, 8(5), 112. https://doi.org/10.5430/wje.v8n5p112

Palermo, C., & Wilson, J. (2020). Implementing Automated Writing Evaluation in Different Instructional Contexts: A Mixed-Methods Study. Journal of Writing Research, 12(1), 63-108. https://doi.org/10.17239/jowr-2020.12.01.04

Parra, G. L., & Calero, S. X. (2019). Automated Writing Evaluation Tools in the Improvement of the Writing Skill. International Journal of Instruction, 12(2), 209-226. https://doi.org/10.29333/iji.2019.12214a

Pearson, W. S. (2018). Written Corrective Feedback in IELTS Writing Task 2: Teachers’ Priorities, Practices, and Beliefs. The Electronic Journal for English as a Second Language, 21(4), 1-32. https://tesl-ej.org/wordpress/issues/volume21/ej84/

Perdana, I., Manullang, S. O., & Masri, F. A. (2021). Effectiveness of Online Grammarly Application in Improving Academic Writing: Review of Experts Experience. International Journal of Social Sciences, 4(1), 122-130. https://doi.org/10.31295/ijss.v4n1.1444

Qassemzadeh, A., & Soleimani, H. (2016). The Impact of Feedback Provision by Grammarly Software and Teachers on Learning Passive Structures by Iranian EFL Learners. Theory and Practice in Language Studies, 6(9), 1884-1894. http://dx.doi.org/10.17507/tpls.0609.23

Ranalli, J., Link, S., & Chukharev-Hudilainen, E. (2017). Automated writing evaluation for formative assessment of second language writing: Investigating the accuracy and usefulness of feedback as part of argument-based validation. Educational Psychology, 37(1), 8-25. https://doi.org/10.1080/01443410.2015.1136407

Raskova, L. (2023). Integrating Grammarly Tools to Enhance Writing Efficiency in Senior High School. Jadila: Journal of Development and Innovation in Language and Literature Education, 3(1), 92-106. https://doi.org/10.52690/jadila.v3i1.389

Resiana, A. T., Zamzam, A., Putera, L. J., Amrullah, & Arfah, H. (2024). The Effectiveness of Grammarly Application on the Students’ Argumentative Writing Progress. Journal of English Education Forum (JEEF), 4(3), 153-159. https://doi.org/10.29303/jeef.v4i3.722

Roscoe, R., Wilson, J., Johnson, A. C., & Mayra, C. R. (2017). Presentation, expectations, and experience: Sources of student perceptions of automated writing evaluation. Computers in Human Behavior, 70, 207-221. https://doi.org/10.1016/j.chb.2016.12.076

Saadi, Z. K., & Saadat, M. (2015). EFL Learners’ Writing Accuracy: Effects of Direct and Metalinguistic Electronic Feedback. TPLS, 5(10), 2053-2063. http://dx.doi.org/10.17507/tpls.0510.11

Saman, E., Gholami, M., & Vakili, S. (2023). Investigating the Effects of Using Grammarly in EFL Writing: The Case of Articles. Computers in the Schools, 40(1). https://doi.org/10.1080/07380569.2022.2150067

Sanderson, I. J., & Stephens, O. S. (2023). The Frequency and Accuracy of Prompts and Suggestions Made by Free Grammarly on Thai University Students’ English Writing Errors. Chiang Mai University Journal of Humanities, 24(3), 257-294.

Sermsook, K., Liamnimitr, J., & Pochakorn, R. (2017). An Analysis of Errors in Written English Sentences: A Case Study of Thai EFL Students. English Language Teaching, 10(3), 101-110. https://doi.org/10.5539/elt.v10n3p101

Setyani, E. D., Bunau, E., & Rezeki, Y. S. (2023). The Influence of Grammarly towards Indonesian EFL Students’ First-Degree Thesis Writing Confidence. Elsya: Journal of English Language Studies, 5(1), 54-67. https://doi.org/10.31849/elsya.v5i1.6773

Setyowati, Y., Priyambudi, S., & Wijayanti, G. C. (2024). Students’ Reflections on Grammarly as a Tool for Academic Writing Support: Perceived Knowledge and Challenges in Higher Education. SCOPE: Journal of English Language Teaching, 9(1), 577-587. https://doi.org/10.30998/scope.v9i1.24854

Shadiev, R., & Feng, Y. (2023). Using automated corrective feedback tools in language learning: A review study. Interactive Learning Environments, 32(6), 2538-2566. https://doi.org/10.1080/10494820.2022.2153145

Shum, S. B., Lim, L.-A., Boud, D., Bearman, M., & Dawson, P. (2023). A comparative analysis of the skilled use of automated feedback tools through the lens of teacher feedback literacy. International Journal of Educational Technology in Higher Education, 20(1), 40. https://doi.org/10.1186/s41239-023-00410-9

Song, C., & Song, Y. (2023). Enhancing academic writing skills and motivation: Assessing the efficacy of ChatGPT in AI-assisted language learning for EFL students. Frontiers in Psychology, 14, 1260843. https://doi.org/10.3389/fpsyg.2023.1260843

Stevenson, M. (2016). A Critical Interpretative Synthesis: The Integration of Automated Writing Evaluation into Classroom Writing Instruction. Computers and Composition, 42, 1-16. http://dx.doi.org/10.1016/j.compcom.2016.05.001

Tang, J., & Rich, C. (2017). Automated Writing Evaluation in an EFL Setting: Lessons from China. The JALT CALL Journal, 13(2), 117-146. https://doi.org/10.29140/jaltcall.v13n2.215

Taskiran, A., & Goksel, N. (2022). Automated Feedback and Teacher Feedback: Writing Achievement in Learning English as A Foreign Language at A Distance. Turkish Online Journal of Distance Education, 23(2), 120-139. https://doi.org/10.17718/tojde.1096260

Teng, M. F., Wang, C., & Zhang, L. J. (2022). Assessing self-regulatory writing strategies and their predictive effects on young EFL learners’ writing performance. Assessing Writing, 51, 100573. https://doi.org/10.1016/j.asw.2021.100573

Thi, N. K., & Nikolov, M. (2022). How Teacher and Grammarly Feedback Complement One Another in Myanmar EFL Students’ Writing. The Asia-Pacific Education Researcher, 31(6), 767-779. https://doi.org/10.1007/s40299-021-00625-2

Thi, N. K., Nikolov, M., & Simon, K. (2022). Higher-proficiency students’ engagement with and uptake of teacher and Grammarly feedback in an EFL writing course. Innovation in Language Learning and Teaching, 17(3), 690-705. https://doi.org/10.1080/17501229.2022.2122476

Tian, L., & Zhou, Y. (2020). Learner engagement with automated feedback, peer feedback and teacher feedback in an online EFL writing context. System, 91, 102247. https://doi.org/10.1016/j.system.2020.102247

Ventayen, R. J. M., & Ventayen, C. C. O. (2018). Graduate Students’ Perspective on the Usability of Grammarly® in one ASEAN State University. Asian ESP Journal, 14(7.2), 9-30. https://www.asian-esp-journal.com/volume-14-issue-7-2-december-2018/

Wang, X., Liu, Q., Pang, H., Tan, S. C., Lei, J., Wallace, M. P., & Li, L. (2023). What matters in AI-supported learning: A study of human-AI interactions in language learning using cluster analysis and epistemic network analysis. Computers & Education, 194, 104703. https://doi.org/10.1016/j.compedu.2022.104703

Wang, Z., & Chiu, M. M. (2024). Multi-discourse Modes in Student Writing: Effects of Combining Narrative and Argument Discourse Modes on Argumentative Essay Scores. Applied Linguistics, 45(1), 20-40. https://doi.org/10.1093/applin/amac073

Wardatin, F. N., Setiawan, S., Mustofa, A., & Nugroho, H. A. (2022). Integrating self-directed learning in facilitating writers engagement through Grammarly: Exploring the perceptions of premium users. EnJourMe (English Journal of Merdeka): Culture, Language, and Teaching of English, 7(1), 32-46. https://doi.org/10.26905/enjourme.v7i1.6849

Williams, J. S., & Lowman, R. L. (2018). The efficacy of executive coaching: An empirical investigation of two approaches using random assignment and a switching-replications design. Consulting Psychology Journal: Practice and Research, 70(3), 227-249. https://doi.org/10.1037/cpb0000115

Wilson, J., & Roscoe, R. D. (2019). Automated Writing Evaluation and Feedback: Multiple Metrics of Efficacy. Journal of Educational Computing Research, 58(1), 87-125. https://doi.org/10.1177/0735633119830764

Wong, W. L., Muhammad, M. M., Chuah, K. P., Ma’arop, A. H., & Elias, R. (2022). Did you Run the Telegram? Use of Mobile Spelling Checker on Academic Writing. Multilingual Academic Journal of Education and Social Sciences, 10(1), 20. http://dx.doi.org/10.46886/MAJESS/v10-i1/7379

Wu, P., Yu, S., & Luo, Y. (2023). The development of teacher feedback literacy in situ: EFL writing teachers’ endeavor to human-computer-AWE integral feedback innovation. Assessing Writing, 57, 100739. https://doi.org/10.1016/j.asw.2023.100739

Wu, X. (2022). Dynamic evaluation of college English writing ability based on AI technology. Journal of Intelligent Systems, 31(1), 298-309. https://doi.org/10.1515/jisys-2022-0020

Wu, Y. T., & Wang, L. J. (2023). Advancing University EFL Students’ Argumentative Essay Writing Performance through Knowledge-Building-based Holistic Instruction. Educational Technology & Society, 26(3), 115-128. https://doi.org/10.30191/ETS.202307_26(3).0009

Yamanishi, H., Ono, M., & Hijikata, Y. (2019). Developing a scoring rubric for L2 summary writing: A hybrid approach combining analytic and holistic assessment. Language Testing in Asia, 9(13), 1-22. https://doi.org/10.1186/s40468-019-0087-6

Yang, L. (Francoise), Zhang, L. J., & Dixon, H. R. (2023). Understanding the impact of teacher feedback on EFL students’ use of self-regulated writing strategies. Journal of Second Language Writing, 60, 101015. https://doi.org/10.1016/j.jslw.2023.101015

Yuniar, R. F., Widiati, U., & Astuti, U. P. (2019). The Effect of Using Wattpad on Process-Genre Approach towards Writing Achievement in Tertiary Level. Jurnal Pendidikan: Teori, Penelitian, dan Pengembangan, 4(7), 897-905. http://dx.doi.org/10.17977/jptpp.v4i7.12631

Zhai, N., & Ma, X. (2023). The Effectiveness of Automated Writing Evaluation on Writing Quality: A Meta-Analysis. Journal of Educational Computing Research, 61(4), 875-900. https://doi.org/10.1177/07356331221127300

Zhang, J., Ozer, H. Z., & Bayazeed, R. (2020). Grammarly vs. Face-to-Face Tutoring at the Writing Center: ESL Student Writers’ Perceptions. Praxis: A Writing Center Journal, 17(2), 2-17. http://dx.doi.org/10.26153/tsw/8523

Zhang, R., & Zou, D. (2022). Types, features, and effectiveness of technologies in collaborative writing for second language learning. Computer Assisted Language Learning, 35(9). https://doi.org/10.1080/09588221.2021.1880441

Zheng, X., Luo, L., & Liu, C. (2023). Facilitating Undergraduates’ Online Self-Regulated Learning: The Role of Teacher Feedback. The Asia-Pacific Education Researcher, 32(6), 805-816. https://doi.org/10.1007/s40299-022-00697-8

Zou, D., Xie, H., & Wang, F. L. (2023). Effects of technology enhanced peer, teacher and self-feedback on students’ collaborative writing, critical thinking tendency and engagement in learning. Journal of Computing in Higher Education, 35(1), 166-185. https://doi.org/10.1007/s12528-022-09337-y