Parts


APA format, in-text citations, and references included.

Part 1: 1 page

What is the difference between a marketing plan and a business plan?

A business plan covers the overall elements of business, including the strategic plan, financial plans, target markets, sales, products and services, and operations. The business plan also contains information on how all of these elements relate to each other.

A marketing plan, in contrast, focuses on the marketing strategy for specific products and services. Essentially, the marketing plan identifies potential market areas and addresses how to deliver appropriate marketing messages for those products or services to target populations.

Therefore, both marketing and business plans cement the foundations of how the organization or business will operate. They identify which populations are served and which products or services will most likely contribute to the viability of the business or organization. Specific to the health care administrator, the marketing and business plans should focus on effective health care delivery and capitalize on the unique health care services offered by individual health care organizations.

– To do: Explain how a misalignment between marketing plans, business plans, and strategic plans might affect the success of health care organizations, and why.

Part 2:

1. Read the article (attached).

2. Write one sentence that states an argument you can make about the topic based on the article. This sentence will serve as the main idea in your eventual MEAL plan paragraph. Then write 2-4 sentences that explain why you have chosen this argument as a critical reader.

3. What is evidence you can use from the article to support your main idea sentence? Find two pieces of evidence from the article and paraphrase them. Be sure to include an APA citation for any sentences that include paraphrased material. Additionally, reflect on the process of paraphrasing the evidence. Pose any questions and/or explain challenges that came up during the process.

International Journal of Teaching and Learning in Higher Education 2018, Volume 30, Number 2, 195-206
http://www.isetl.org/ijtlhe/ ISSN 1812-9129

Postsecondary Online Students’ Preferences for Text-Based Instructor Feedback

Joseph J. Gredler
Walden University

Misalignment between student preferences and instructor practices regarding writing feedback may
impede student learning. This sequential explanatory mixed-methods study addressed postsecondary
online students’ preferences and the reasons for their preferences. A survey was used to collect 93
responses from postsecondary students attending a large private online university; data collection
included interviews with a subsample of 4 participants. Findings indicated students preferred
proximal, detailed, supportive feedback to enhance their writing skills and to understand deductions
assessed by instructors. Findings may increase instructor awareness of students’ preferences and
enhance collaboration in the feedback process to promote writing skill development and improve
academic outcomes.

Researchers have explored postsecondary students' preferences for various types of instructor feedback
including written, audio recorded, and video recorded
(Bilbro, Iluzada, & Clark, 2013; Crews & Wilkinson,
2010; Ice, Swan, Diaz, Kupczynski, & Swan-Dagen,
2010). However, most of the research has been done
with students attending brick-and-mortar institutions.
Several researchers affirmed the importance of
instructor feedback to student learning in the
postsecondary setting (Johnson & Cooke, 2015;
Mirzaee & Hasrati, 2014; Van der Kleij, Feskens, &
Eggen, 2015). Instructor feedback could undermine
learning if the tone and content are not perceived by
students to be supportive (Carless, 2006). Also,
discrepancies in belief systems between teachers and
students could disrupt the learning process (Schulz,
2001). Ferguson (2011) acknowledged the occasional
dissatisfaction reported by students regarding feedback
and asserted that instructors’ understanding of students’
preferences is essential to the learning process. Schulz
(2001) agreed that instructors should explore students’
feedback preferences and should address conflicts that
could impede learning. Instructors need not strive to
please their students (Smith, 2008); however,
instructors may increase the likelihood of student
learning by using strategies that enhance student
engagement such as demonstrating awareness of
students’ feedback preferences. Given the increasing
number of students matriculated in online programs
(Cavanaugh & Song, 2014), describing online students’
preferences for electronic feedback delivered via
software applications such as Microsoft Word may help
instructors serve students’ learning needs more
effectively (Nicol & Macfarlane-Dick, 2006).

Background

Numerous studies have addressed postsecondary students' perceptions and preferences regarding instructor feedback. Several researchers reported that postsecondary students preferred clear, detailed comments (Ferguson, 2011; Glover & Brown, 2006; Mulliner & Tucker, 2015),
directive comments (Can, 2009; Rae & Cochrane, 2008;
Treglia, 2008), electronic feedback (Can, 2009; Rae &
Cochrane, 2008), prompt feedback (Mulliner & Tucker,
2015; Poulos & Mahony, 2008), and a balance between
positive and negative comments (Duncan, 2007; Smith,
2008; Weaver, 2006). Studies also indicated that active
students were more inclined to review and apply
instructor feedback than passive students (Wingate,
2010; Zacharias, 2007). Students preferred feedback that
aligned with assignment criteria (Ferguson, 2011;
Weaver, 2006; Wolsey, 2008) and enhanced their
performance on upcoming assignments (Orsmond &
Merry, 2011). Studies done with English as a foreign
language (EFL) students indicated that students’
preferences appeared to be associated with their literacy
levels (Boram, 2009; Tabatabaei & Ahranjani, 2012).
However, most of the studies done on postsecondary
students’ feedback preferences addressed students
attending brick-and-mortar institutions. Few studies
addressed online students’ preferences (Cavanaugh &
Song, 2014; Gallien & Oomen-Early, 2008).

Detailed, meaningful instructor feedback adds
value to the learning process, and instructors working in
an online environment should consider how their
feedback may enhance their students’ writing skills
(Crews & Wilkinson, 2010). Wolsey (2008) and
Nordrum, Evans, and Gustafsson (2013) agreed that
instructor feedback plays an important role in the
formative learning process that occurs within individual
writing projects and also in the development of skills
that students will employ in future assignments.
Feedback is the most personal, specific, and direct way
in which students are given writing instruction
(Szymanski, 2014). Weaver (2006) agreed that
feedback stimulates student reflection and development
and is an essential part of the learning process. Weaver
also noted that identifying students’ strengths and
weaknesses may facilitate their self-assessment and
application of feedback to future writing assignments.

Gredler Text-Based Instructor Feedback 196

Purpose, Framework, and Research Questions

The purpose of this study was to describe
undergraduate- and graduate-level online students’
preferences for instructor feedback delivered
electronically via software applications such as
Microsoft Word. The purpose also included
describing reasons why students prefer certain types
of feedback rather than others. An additional purpose
had been to test for variation among online students’
preferences based on age, grade level, online
experience, and English-language status; however,
due to the lower than expected sample size and the
disproportionate representation of graduate students,
native English speakers, and experienced online
learners in the self-selected sample, this third
purpose could not be satisfied.

Vygotsky’s (1978) social-constructivist theory
provided a suitable framework for the study.
Vygotsky argued that learning promotes internal
developmental processes that occur only when the
student is collaborating with individuals in his or
her environment. The current study applied social-
constructivist principles by encouraging instructor
recognition of the significance of students’
preferences in the instructor-student relationship
(Benko, 2012) and by exhorting instructors to
engage with students in the recursive writing
process by embracing their preferences as essential
to their writing skill development (Budge, 2011;
Ferguson, 2011). Instructor feedback was situated
as a scaffolding tool used to move students through
their zone of proximal development as emerging
academic writers (Benko, 2012; McCarthy, 2015).
Instructor feedback increases students’ self-
regulation as writers and thinkers (Treglia, 2008)
and promotes learning by enhancing students’ self-
regulation, improving their motivation, and
reducing their anxiety (McVey, 2008). Szymanski
(2014) supported the use of professional-genre
assignments that promote undergraduate students as
apprentice writers and encourage their self-
regulation as emerging scholars. When viewed
through a social-constructivist lens, the purpose of
the current study was to describe online students’
preferences for different levels of scaffolding and to
explore their reasons for preferring certain types of
feedback rather than others. The study addressed
the following research questions:

1. What types of electronic feedback in word-
processing software do postsecondary online
students prefer?

2. What reasons do postsecondary online students
give for preferring certain types of electronic
feedback but not others?

Method

The study included a sequential explanatory
mixed-methods design with a survey questionnaire
containing closed and open-ended questions followed
by interviews with participants to probe their
preferences more deeply (Patton, 2002). Survey
questions were adapted from those used by Budge
(2011) and Wolsey (2008); permission was obtained
prior to the study. Survey data came from 93
undergraduate and graduate students attending a large
private online university in the Midwestern United
States. Four participants who completed the survey also
participated in semistructured interviews. Interview
participants came from different programs (psychology,
education, nursing, and public policy) to enhance
disciplinary representation in interview data.

The survey instrument contained 17 quantitative
questions and two qualitative questions (Appendix A).
The first 12 quantitative questions addressed students’
preferences for online feedback delivered via software
applications such as Microsoft Word. Silva (2012) noted
that “electronic feedback via Microsoft Word
comments…affords the reader nearly an infinite amount
of space to provide commentary” (p. 3). Silva conceded
that video technology provides similar advantages but
expressed concern about instructors’ willingness to spend
extra time on video feedback and cautioned that the size
of video files may limit delivery options. Silva
acknowledged that audio comments may be used to
personalize the feedback process; however, technology
issues may impede students’ reception of audio feedback.
In addition, the lack of proximity of audio comments to
essay text may reduce the impact of audio feedback on
student revisions and learning. Given the predominant
use of text-based feedback in online programs,
quantitative survey questions addressed students’
preferences for text-based feedback. However, two open-
ended questions were included to allow students to report
their preferences for other types of feedback, including
video and audio. The survey also included five questions
addressing participants’ age, grade level, online
experience, English-language status, and area of study.
Interview questions (Appendix B) were aligned with
survey questions to explore participants’ feedback
preferences and the reasons for their preferences.

Data Analysis

Descriptive frequencies were used to report quantitative survey data findings. Analysis of open-ended
survey questions involved a structured yet flexible
approach consistent with Miles, Huberman, and Saldaña's
(2014) recommendation to use both deductive coding
based on the conceptual framework and inductive coding
to identify unanticipated themes that emerged from the data analysis. Preliminary analysis included provisional
codes borrowed from Aliakbari and Toni’s (2009) study
comparing the influence of different types of error-
correction techniques on postsecondary EFL students’
grammatical accuracy: (a) direct coded, (b) indirect coded,
(c) direct uncoded, and (d) indirect uncoded.

Quantitative Results

Demographic data indicated most participants (95.6%) identified as graduate students. When asked
whether English was their first language, most
participants (89.0%) answered yes. Regarding area of
study, most participants selected social sciences
(36.3%), health sciences (24.2%), or other (33.0%). In
this third category, most participants (23) identified
education as their area of study. Additional categories
included business (3.3%), humanities (2.2%), and
information technology (1.1%). When asked how many
online courses they had taken, most participants
(84.6%) answered four or more. Most participants
(76%) were between the ages of 30 and 54.

Participants strongly agreed (63.4%) or slightly
agreed (20.4%) with having instructors correct errors
using track changes. Participants also agreed (95.7%)
with having online instructors include comments to
explain their corrections. Most participants (77.4%)
preferred balloon comments in the margins of the
paper, with less than a quarter (20.4%) preferring
comments typed within the essay text. Most participants
were neutral (34.4%) or strongly disagreed (19.4%)
with the use of grammar codes. Participants (92.4%)
preferred that instructors include both comments and
corrections in their feedback. Most participants (58.1%)
preferred comments inserted throughout the paper, and
over a third (37.6%) preferred comments inserted
throughout the paper and at the end.

Participants (91.4%) reported that they always
review their online assignments for feedback from their
instructor. In addition, participants strongly agreed
(67.7%) or slightly agreed (15.1%) that electronic
feedback provided by online instructors had been
helpful in developing their writing skills. Results were
mixed in response to Survey Question 9, “Considering
the types of instructor comments listed below, which
ones do you prefer?” Participants were allowed to
choose more than one response. The most popular
choices were explorations (85.0%), corrections to
content (81.7%), and complex affirmations (73.1%).
The least popular choices were personal reflections
(24.7%), simple affirmations (32.3%), and observations
(43%). Table 1 shows a breakdown of participants’
responses to this question.
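The reported percentages can be reproduced from the raw counts in Table 1 over the 93 survey respondents. A minimal sketch (the counts come from Table 1; the helper name `pct` is illustrative, not from the study):

```python
# Reproducing the Table 1 percentages from raw response counts.
# N = 93 survey respondents; respondents could select more than one
# comment type, so the percentages do not sum to 100.

N = 93  # total survey respondents

counts = {
    "Simple affirmations": 30,
    "Complex affirmations": 68,
    "Explorations": 79,
    "Personal reflections": 23,
    "Clarifications": 58,
    "Observations": 40,
    "Questions": 59,
    "Corrections to content": 76,
    "Corrections to mechanics": 57,
}

def pct(count, total=N):
    """Percentage of respondents selecting an option, to one decimal place."""
    return round(100 * count / total, 1)

for label, n in counts.items():
    print(f"{label}: {n} ({pct(n)}%)")
# e.g., simple affirmations: 30 of 93 respondents, i.e., 32.3%;
# corrections to content: 76 of 93, i.e., 81.7%
```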

Most participants (82.8%) preferred online
instructors to include grading rubrics with their
feedback. In addition, most participants strongly agreed

(51.6%) or slightly agreed (24.7%) that their
instructors’ electronic feedback had been consistent
with the grading rubric. Most participants strongly
agreed (64.1%) or slightly agreed (25.0%) that their
English writing skills were very good.

Qualitative Survey Results

Nearly all of the 93 survey participants responded to the two open-ended survey questions. Major themes
contained 20 or more participant comments, and minor
themes contained at least two but not more than 19
participant comments. Major themes included the desire
to improve writing skills and the preference for
proximal, detailed, supportive feedback.
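The major/minor threshold described above can be expressed as a simple rule. A minimal sketch, using counts taken from Table 2 (the function name `classify` is illustrative, not from the study):

```python
# Illustrating the study's theme-classification rule: themes with 20 or
# more participant comments were "major"; themes with 2-19 were "minor".

def classify(n_comments):
    """Apply the article's stated thresholds for major vs. minor themes."""
    if n_comments >= 20:
        return "major"
    if n_comments >= 2:
        return "minor"
    return "unthemed"  # fewer than 2 comments did not constitute a theme

# Counts below are taken from Table 2 of the article.
theme_counts = {
    "Desire to improve skills": 61,
    "Proximal feedback": 53,
    "Clear, detailed feedback": 37,
    "Constructive, supportive feedback": 28,
    "Electronic feedback": 18,
    "Rubrics included": 11,
    "No grammar codes": 2,
}

for theme, n in theme_counts.items():
    print(f"{theme} ({n}): {classify(n)}")
# The first four themes classify as major; the remainder as minor.
```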

Theme 1: Desire to Improve Skills

The dominant theme from the qualitative data was
desire to improve as academic writers. Participants
expressed an interest in using instructor feedback to
develop their writing skills. Data showed 61 responses
included a comment reflecting a desire to improve. One
participant reported, “Feedback is how students learn and
grow in their writing and understanding of information. I
cannot become a better writer and learn if I do not receive
feedback that helps me do both of these things.” A second
participant commented, “I like to know what I am doing
wrong with recommendations to improve,” and indicated,
“I appreciate feedback that is meaningful. For example, if I
make a mistake or do something wrong, I need to know
about it so that I can improve.”

Theme 2: Proximal Comments

Many participant responses (53) indicated that
instructor comments should be located near related essay
text. Approximately one fourth (14) of these responses
indicated that proximity was important but did not specify
the desired location (e.g., marginal balloons or within
paragraph text). One participant reported, “I prefer to
receive electronic feedback from my online instructor within
the body of my essay.” Another observed, “With comments
not associated with a specific part of my paper, I am not sure
what the instructor is talking about. It helps to have the
comment be located in the location being referenced.”
According to a third participant, “It is important for me to
have feedback posted throughout the paper rather than a
long comment at the end. This makes the comments and
corrections more concise and clear and easier to follow.” A
fourth participant commented, “I prefer the feedback
directly adjacent to the error or the section being referred to
in order to avoid confusion.”

Nearly half (26) of the responses in Theme 2 indicated a clear preference for marginal balloon comments. Only one of the 93 participants indicated a preference for in-paragraph comments rather than balloons. Ten responses in this theme indicated a preference for both in-text comments and a long comment at the end. Two responses indicated a preference for comments only at the end.

Table 1
Preferences for Types of Instructor Comments

Response Number Percent
Simple affirmations 30 32.3
Complex affirmations 68 73.1
Explorations 79 85.0
Personal reflections 23 24.7
Clarifications 58 62.4
Observations 40 43.0
Questions 59 63.4
Corrections to content 76 81.7
Corrections to mechanics 57 61.3

Theme 3: Clear, Detailed Feedback

Many participant responses (37) indicated a
preference for instructor feedback that is easily
comprehended and substantive. One participant reported,
“I dislike simple feedback that does not provide a
substantive critique of my work. A ‘good job’ or ‘it
needs work’ does nothing to improve my comprehension
or writing skills.” Another participant commented, “I
would like that my online instructor’s feedback was
substantial, productive, encouraging, clear, concise, and
precise.” A third participant added, “It is essential to
have detailed feedback when working at the doctoral
level. This feedback should include specific detail to
errors, content that needs additions and/or omissions, and
simply learning from the instructor’s expertise.”

Theme 4: Constructive, Supportive Feedback

The fourth major theme (28 comments) was that
instructor feedback should be delivered with a supportive
tone. One participant insisted that instructors should
“eliminate value loaded bias comments. Give me
direction, not insult. Let me use my own mind—nudge me
the right way so I learn.” Another participant reported, “I
believe various instructors take liberties to insult and
complain. I do not want to be the recipient of someone’s
bad day.” A third participant commented, “It is important
for me to know that my instructors care about my learning
and growing rather than how many errors they can find.”

Minor Themes

Several responses (18) indicated support for
electronic feedback delivered as attachments or links within courses or via e-mail. Participants described the
convenience and efficiency of electronic feedback.
Eleven responses indicated a preference for rubrics to
clarify how the grade was determined, and seven
comments reflected a preference for track changes
delivered via Microsoft Word to promote error
correction and skill development. Seven responses
indicated that feedback should be delivered in a timely
manner, and five comments indicated that instructor
feedback should include information explaining why
points were deducted. Four responses indicated that
instructors should include examples with their
feedback, and three comments indicated that
substantive feedback is needed even though a good
grade was given. Three responses indicated that
instructors should avoid personal reflections in their
feedback. No qualitative survey comments indicated a
preference for video or audio feedback. Table 2 shows
the number of comments associated with major and
minor themes.

Interview Results

Consistent with a sequential explanatory mixed-
methods design (Creswell & Plano Clark, 2011; Teddlie
& Tashakkori, 2009), interview transcripts were
analyzed using survey data codes as provisional codes.
Provisional codes preselected from Aliakbari and
Toni’s (2009) study were abandoned in the analysis of
survey data. However, provisional codes that emerged
from the survey data analysis were useful in the
examination of interview data.

Interview data supported all four major themes
from the qualitative survey data. Interview responses
also supported four of the minor themes, including
rubric feedback, timely feedback, feedback needed to
justify deductions, and feedback needed despite a good
grade. In addition, two new themes emerged from the
interview data: (a) include references to external
resources, and (b) provide evidence that the instructor
read the paper. One participant commented, "What has helped is when they refer me in their comments to other research or back to the literature of the course." A second participant noted, "What I found most helpful were very specific references. A couple of professors were very good with specific reference citations especially when it has to do with APA." Another participant mentioned, "It's helpful when you see the comments that they actually looked at the paper."

Table 2
Themes From Qualitative Survey Data

Theme Number of responses
Desire to improve skills 61
Proximal feedback 53
Clear, detailed feedback 37
Constructive, supportive feedback 28
Electronic feedback 18
Rubrics included 11
Track changes used 7
Timely feedback 7
Feedback to justify deductions 5
Examples included 4
Feedback needed despite good grade 3
No instructor personal reflections 3
No grammar codes 2

Discussion

Misalignment between instructor practices and student preferences in the writing feedback process may
impede student learning (Schulz, 2001). Minimal
research on postsecondary online students’ preferences
for text-based feedback prompted the current study.
Findings showed that qualitative survey results were
consistent with quantitative survey results. Qualitative
responses indicated that participants preferred proximal,
detailed, supportive feedback including rubrics, track
changes, and examples to help them improve their
writing skills, but participants did not want grammar
codes or instructors’ personal reflections. Qualitative
survey results also indicated that feedback is needed even
when the grade is good and to justify deductions.
Quantitative findings showed that participants preferred
proximal comments, rubric feedback, and the use of track
changes for corrections. Quantitative results reinforced
the preference for detailed feedback provided via
complex rather than simple affirmations. Interview
findings supported survey findings. Interview
participants commented that detailed feedback is needed
to provide evidence that the paper had been read and to
improve writing performance on upcoming assignments.
Interview participants also reported that instructor
feedback should identify resources such as websites
students can access to promote their skill development.

Most of the themes aligned with results from
previous studies. The preference for clear, detailed
feedback was consistent with findings from Can (2009),
Duncan (2007), Ferguson (2011), Glover and Brown
(2006), Mulliner and Tucker (2015), Rae and Cochrane
(2008), and Zacharias (2007). Riddell (2015) noted the
significant body of research supporting detailed
feedback as more effective than general feedback in
enhancing writing performance. Students who received
personalized feedback scored significantly higher and
were more satisfied with the course than those who
received collective feedback (Gallien & Oomen-Early,
2008). Personalized feedback on related assignments
may be especially helpful in enhancing skill
development (Vardi, 2012, 2013). According to Poulos
and Mahony (2008), effective feedback is timely and
specific to the student’s individual needs.

A strong preference for supportive feedback
aligned with findings from previous studies. Mulliner
and Tucker (2015) found that feedback should be
delivered in a constructive, supportive manner. Weaver
(2006) noted that tutors should monitor their response
styles and balance positive feedback with critical
feedback while ensuring that comments are aligned
with assessment criteria and learning objectives.
Weaver also observed that, according to student
participants, tutors did not provide enough feedback
and did not include enough positive comments. Poulos
and Mahony (2008) observed that negative feedback
had a demoralizing impact on students’ motivation and
learning. Other studies indicated support for balance
between positive and negative comments (Can, 2009;
Ferguson, 2011; Treglia, 2008).

Participants’ preference for exploratory comments,
questions, and complex affirmations was consistent with
findings from several studies that indicated a preference
for suggestive rather than directive feedback (Can, 2009; Mulliner & Tucker, 2015; Rae & Cochrane, 2008; Treglia,
2008). Some studies showed that instructors pay attention
to micro-level issues rather than content issues and use a
directive rather than suggestive approach (Stern &
Solomon, 2007; Szymanski, 2014). This type of feedback
does not support students’ preference for content-oriented
feedback delivered via explorations and questions, as
reported by participants in the current study. However, the
self-selected sample of primarily graduate-level native
English speakers may account for this preference, which
was consistent with Wolsey’s (2008) findings.

Participants’ preference for rubric feedback aligned
with Nordrum et al.’s (2013) finding that rubric-
articulated feedback helped students understand general
issues with their writing and techniques for approaching
future writing assignments. Nordrum et al. also found
that rubric feedback was not as useful as in-text
feedback, which served a corrective function as
opposed to the evaluative function of rubric feedback.
Students in Ferguson’s (2011) study reported a
preference for customized, criteria-oriented comments
explaining how grades were determined, which was
consistent with findings from the current study. Riddell
(2015) noted that providing students with a clear
understanding of how their work will be assessed may
increase the likelihood of students meeting assignment
expectations. Although Riddell did not specify rubrics
as a means of enhancing assessment awareness, this
tool is often used for that purpose in postsecondary
education. One major theme from the current study
(desire for proximal feedback) was not widely reported
in the literature. The preference for proximal feedback
echoed Wolsey’s (2008) finding that most students
preferred comments located near relevant essay text.

Participants’ preference for supportive, detailed
feedback aligned with social constructivist theory,
which provided the theoretical framework for the study.
Instructor feedback was situated as a scaffolding tool
intended to move students through their zone of
proximal development from other regulation to self-
regulation. Participants’ preference for exploratory
comments and questions suggested their desire for
feedback that promotes independent thinking and
encourages greater self-regulation as academic writers.
Overall, participants’ preference for proximal feedback
suggested a desire for moderate scaffolding. Although
participants supported the use of track changes to
designate corrections, the preference for exploratory,
suggestive comments indicated a desire for less
intrusive scaffolding.

Constructivist regard for students’ preferences
should be examined in the context of instructor
workload. Postsecondary instructors face a persistent
challenge to “balance their desire to provide
personalized, meaningful feedback with the limited
time they can allot to each paper” (Bilbro et al., 2013,

p. 47). Instructors experience pressure to provide
prompt, detailed feedback to high numbers of students
in postsecondary courses (Lunt & Curran, 2010).
Riddell (2015) argued that increasing the number of
feedback loops involving drafts, feedback, and
revisions may enhance students’ metacognitive
awareness and promote development of academic
writing skills; however, Riddell cautioned against
burdening instructors with an unmanageable workload.
Postsecondary instructors should accommodate student
preferences whenever possible and find ways to balance
their workload when providing scaffolding feedback to
promote writing skill development.

Limitations and Recommendations

High self-efficacy may have been a factor in motivating students to volunteer for the study, as
suggested by the percentage of participants who
strongly agreed (64.1%) or slightly agreed (25.0%) that
their writing skills were very good. Wingate (2010)
found that students with low self-efficacy as academic
writers were less likely to value instructor feedback.
Other researchers observed that active students were
more inclined to study and apply instructor feedback
than passive students (Duncan, 2007; Rae & Cochrane,
2008; Wingate, 2010; Zacharias, 2007). Most
participants in the current study reported that they
always read instructor feedback, which may limit
generalizability of findings. Future studies should
include more data from students with low self-efficacy,
although gathering these data may be challenging.

None of the survey participants in the current study
reported a preference for audio and video feedback
when responding to the open-ended questions, and none
of the interview participants reported having received
these types of feedback in their online courses. One
interview participant reported that these types of
feedback would probably not be helpful, but another
indicated that audio feedback would be better than
“great job.” The other interview participants did not
report a preference or lack of preference for audio or
video feedback. More research should be done
exploring postsecondary online students’ preference for
audio and video feedback, as these types gain broader
acceptance and use in postsecondary education.

The study was further limited by participant
self-selection in that most participants were
graduate-level native English speakers who had
considerable online learning experience. Future
studies could include multiple data collection sites
(both public and private postsecondary institutions),
more data from undergraduate students, and more
data from inexperienced online students. A larger
sample would allow researchers to test for variation
in preferences based on demographic variables,
including age, grade level, online experience, and
English-language status. Findings from these
studies may help instructors further customize their
feedback and follow a constructivist approach when
promoting writing skill development among
postsecondary online students.

References

Aliakbari, M., & Toni, A. (2009). On the effects of
error correction strategies on the grammatical
accuracy of the Iranian English learners. Journal of
Pan-Pacific Association of Applied Linguistics,
13(1), 99-112. Retrieved from
http://www.paaljapan.org/conference/journals.html

Benko, S. (2012). Scaffolding: An ongoing process to
support adolescent writing development. Journal of
Adolescent and Adult Literacy, 56(4), 291-300.
doi:10.1002/JAAL.00142

Bilbro, J., Iluzada, C., & Clark, D. E. (2013).
Responding effectively to composition students:
Comparing student perceptions of written and
audio feedback. Journal on Excellence in College
Teaching, 24(1), 47-83. Retrieved from
http://celt.muohio.edu/ject/index.php

Boram, K. (2009). Proficiency level and the relative
effects of different corrective feedback options
on EFL student writing. English Teaching,
64(4), 203-222. Retrieved from
http://www.kate.or.kr/Contents/Publications/Article/list.asp

Budge, K. (2011). A desire for the personal: Student
perceptions of electronic feedback. International
Journal of Teaching and Learning in Higher
Education, 23(3), 342-349. Retrieved from
http://www.isetl.org/ijtlhe/pdf/IJTLHE1067.pdf

Can, G. (2009). A model for doctoral students’
perceptions and attitudes toward written feedback
for academic writing (Doctoral dissertation).
Retrieved from ProQuest Dissertations & Theses
Global. (305012078)

Carless, D. (2006). Differing perceptions in the
feedback process. Studies in Higher Education,
31(2), 219-233. doi:10.1080/03075070600572132

Cavanaugh, A. J., & Song, L. (2014). Audio feedback
versus written feedback: Instructors’ and students’
perspectives. Journal of Online Learning and
Teaching, 10(1), 122-138. Retrieved from
http://jolt.merlot.org/vol10no1/cavanaugh_0314.pdf

Creswell, J. W., & Plano Clark, V. L. (2011).
Designing and conducting mixed methods
research. Thousand Oaks, CA: Sage Publications.

Crews, T., & Wilkinson, K. (2010). Students’ perceived
preference for visual and auditory assessment with e-
handwritten feedback. Business Communication
Quarterly, 73(4), 399-412.
doi:10.1177/1080569910385566

Duncan, N. (2007). “Feed forward”: Improving
students’ use of tutors’ comments. Assessment &
Evaluation in Higher Education, 32(3), 271-283.
doi:10.1080/02602930600896498

Ferguson, P. (2011). Student perceptions of quality
feedback in teacher education. Assessment &
Evaluation in Higher Education, 36(1), 51-62.
doi:10.1080/02602930903197883

Gallien, T., & Oomen-Early, J. (2008). Personalized
versus collective instructor feedback in the
online courseroom: Does type of feedback
affect student satisfaction, academic
performance and perceived connectedness with
the instructor? International Journal on E-
Learning, 7(3), 463-476.

Glover, C., & Brown, E. (2006). Written feedback for
students: Too much, too detailed or too
incomprehensible to be effective? Bioscience
Education, 7, 1-16. doi:10.3108/beej.2006.07000004

Ice, P., Swan, K., Diaz, S., Kupczynski, L., & Swan-
Dagen, A. (2010). An analysis of students’
perceptions of the value and efficacy of instructors’
auditory and text-based feedback modalities across
multiple conceptual levels. Journal of Educational
Computing Research, 43(1), 113-134.
doi:10.2190/EC.43.1.g

Johnson, M., & Cooke, A. (2015). Self-regulation
of learning and preference for written versus
audio-recorded feedback by distance education
students. Distance Education, 1-14.
doi:10.1080/01587919.2015.1081737

Lunt, T., & Curran, J. (2010). “Are you
listening please?” The advantage of electronic
audio feedback compared to written feedback.
Assessment & Evaluation in Higher Education,
35(7), 759-769. doi:10.1080/02602930902977772

McCarthy, J. (2015). Evaluating written, audio and
video feedback in higher education summative
assessment tasks. Issues in Educational Research,
25(2), 153-169. Retrieved from
http://www.iier.org.au/iier25/mccarthy.html

McVey, M. (2008). Writing in an online environment:
Student views of “inked” feedback. International
Journal of Teaching and Learning in Higher
Education, 20(1), 39-50. Retrieved from
http://www.isetl.org/ijtlhe/pdf/IJTLHE365.pdf

Miles, M. B., Huberman, A. M., & Saldana, J. (2014).
Qualitative data analysis: A methods sourcebook
(3rd ed.). Thousand Oaks, CA: Sage Publications.

Mirzaee, A., & Hasrati, M. (2014). The role of written
formative feedback in inducing non-formal
learning among masters students. Teaching in
Higher Education, 19(5), 555-564.
doi:10.1080/13562517.2014.880683

Mulliner, E., & Tucker, M. (2015). Feedback on feedback
practice: Perceptions of students and academics.
Assessment & Evaluation in Higher Education, 1-23.
doi:10.1080/02602938.2015.1103365

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative
assessment and self-regulated learning: A model
and seven principles of good feedback practice.
Studies in Higher Education, 31(2), 199-218.
doi:10.1080/03075070600572090

Nordrum, L., Evans, K., & Gustafsson, M. (2013).
Comparing student learning experiences of in-text
commentary and rubric-articulated feedback:
Strategies for formative assessment. Assessment &
Evaluation in Higher Education, 38(8), 919-940.
doi:10.1080/02602938.2012.758229

Orsmond, P., & Merry, S. (2011). Feedback alignment:
Effective and ineffective links between students’ and
tutors’ understanding of coursework feedback.
Assessment & Evaluation in Higher Education, 36(2),
125-136. doi:10.1080/02602930903201651

Patton, M. Q. (2002). Qualitative research and
evaluation methods (3rd ed.). Thousand Oaks, CA:
Sage Publications.

Poulos, A., & Mahony, M. J. (2008). Effectiveness of
feedback: The students’ perspective. Assessment &
Evaluation in Higher Education, 33(2), 143-154.
doi:10.1080/02602930601127869

Rae, A. M., & Cochrane, D. K. (2008). Listening to
students: How to make written assessment feedback
useful. Active Learning in Higher Education, 9(3), 217-
230. doi:10.1177/1469787408095847

Riddell, J. (2015). Performance, feedback, and revision:
Metacognitive approaches to undergraduate essay
writing. Collected Essays on Learning and
Teaching, 8, 79-96.

Schulz, R. A. (2001). Cultural differences in student
and teacher perceptions concerning the role of
grammar instruction and corrective feedback:
USA: Colombia. Modern Language Journal, 85(2),
244-258. doi:10.1111/0026-7902.00107

Silva, M. L. (2012). Camtasia in the classroom: Student
attitudes and preferences for video commentary or
Microsoft Word comments during the revision
process. Computers and Composition, 29(1), 1-22.
doi:10.1016/j.compcom.2011.12.001

Smith, L. J. (2008). Grading written projects: What
approaches do students find most helpful? Journal
of Education for Business, 83(6), 325-330.
doi:10.3200/JOEB.83.6.325-330

Stern, L. A., & Solomon, A. (2006). Effective faculty
feedback: The road less traveled. Assessing
Writing, 11, 22-41. doi:10.1016/j.asw.2005.12.001

Szymanski, E. A. (2014). Instructor feedback in upper-
division biology courses: Moving from spelling and
syntax to scientific discourse. Across the Disciplines,
11(2), 1-13. Retrieved from
http://wac.colostate.edu/atd/articles/szymanski2014.cfm

Tabatabaei, M. A., & Ahranjani, A. K. (2012). The
comparative study of Iranian monolingual and
bilingual university EFL students’ preferences for
different types of written feedback. International
Journal of Academic Research, 4(2), 67-69.

Teddlie, C., & Tashakkori, A. (2009). Foundations of
mixed-methods research: Integrating quantitative and
qualitative approaches in the social and behavioral
sciences. Thousand Oaks, CA: Sage Publications.

Treglia, M. O. (2008). Feedback on feedback: Exploring
student responses to teachers’ written commentary.
Journal of Basic Writing, 27(1), 105-137.

Van der Kleij, F. M., Feskens, R. C., & Eggen, T. J. (2015).
Effects of feedback in a computer-based learning
environment on students’ learning outcomes: A meta-
analysis. Review of Educational Research, 85(4), 475-
511. doi:10.3102/0034654314564881

Vardi, I. (2012). The impact of iterative writing and
feedback on the characteristics of tertiary students’
written texts. Teaching in Higher Education, 17(2),
167-179. doi:10.1080/13562517.2011.611865

Vardi, I. (2013). Effectively feeding forward from one
written assessment task to the next. Assessment &
Evaluation in Higher Education, 38(5), 599-610.
doi:10.1080/02602938.2012.670197

Vygotsky, L. S. (1978). Mind in society: The
development of higher psychological processes.
Cambridge, MA: Harvard University Press.

Weaver, M. (2006). Do students value feedback?
Student perceptions of tutors’ written responses.
Assessment & Evaluation in Higher Education,
31(3), 379-394. doi:10.1080/02602930500353061

Wingate, U. (2010). The impact of formative feedback
on the development of academic writing.
Assessment & Evaluation in Higher Education,
35(5), 519-533. doi:10.1080/02602930903512909

Wolsey, T. D. (2008). Efficacy of instructor feedback on
written work in an online program. International
Journal of E-Learning, 7(2), 311-329.

Zacharias, N. T. (2007). Teacher and student attitudes
toward teacher feedback. RELC Journal, 38(1), 38-
52. doi:10.1177/0033688206076157

____________________________

JOSEPH GREDLER is a dissertation editor in Walden
University’s Writing Center. Dr. Gredler conducts
form-and-style reviews, chapter edits, and dissertation
intensives to support students during their capstone
studies. Dr. Gredler also serves as lead faculty in
Walden’s Academic Skills Center where he teaches
graduate courses on scholarly writing and workshops
on proposal, literature review, and postproposal writing.
He recently served as subject matter expert in writing
the curriculum for an academic integrity course.

Gredler Text-Based Instructor Feedback 203

Appendix A

Survey Questions

1. I prefer to have online instructors correct my errors using track changes. (Choose one)

a. Strongly agree
b. Slightly agree
c. Neutral
d. Slightly disagree
e. Strongly disagree

2. I prefer to have online instructors include comments to explain their corrections. (Choose one)

a. Strongly agree
b. Slightly agree
c. Neutral
d. Slightly disagree
e. Strongly disagree

3. I prefer to have online instructors’ comments appear: (Choose one)

a. Within my essay text
b. In balloons in the margin of my paper
c. Neither

4. I prefer to have online instructors use grammar codes when identifying errors in my assignments. (Choose one)

a. Strongly agree
b. Slightly agree
c. Neutral
d. Slightly disagree
e. Strongly disagree

5. I prefer to have online instructors include the following when grading my assignments. (Choose one)

a. Corrections only
b. Comments only
c. Corrections and comments
d. Neither corrections nor comments
e. Highlighted errors but no corrections or comments
f. Other (please describe ________________ )

6. I prefer to have an online instructor: (Choose one)

a. Insert comments throughout my paper
b. Type a long comment at the end
c. Neither
d. Both


7. I always review my online assignments for electronic feedback from my online instructor. (Choose one)

a. Strongly agree
b. Slightly agree
c. Neutral
d. Slightly disagree
e. Strongly disagree

8. I have found that the electronic feedback provided by online instructors has been helpful in developing my
writing skills. (Choose one)

a. Strongly agree
b. Slightly agree
c. Neutral
d. Slightly disagree
e. Strongly disagree

9. Considering the types of instructor comments listed below, which one(s) do you prefer? (Choose as many as
apply)

a. Simple affirmations (e.g. Good point! Nice job!)
b. Complex affirmations (e.g. You made a great point here because….)
c. Explorations (e.g. You might also consider….)
d. Personal reflections (e.g. Your point reminded me of an experience I had….)
e. Clarifications (e.g. Studies actually show that…. I think the author was trying to say….)
f. Observations (e.g. I wasn’t aware of this…. I came to the same conclusion….)
g. Questions (e.g. Do you mean…? What about…?)
h. Corrections to content (e.g. This point is confusing because…. Please develop your ideas here by….)
i. Corrections to mechanics such as spelling, grammar, punctuation, capitalization, etc.

10. I prefer online instructors to include completed grading rubrics with their electronic feedback. (Choose one)

a. Yes
b. No

11. In my online courses, the instructor’s electronic feedback is consistent with the grading rubric. (Choose one)

a. Strongly agree
b. Slightly agree
c. Neutral
d. Slightly disagree
e. Strongly disagree

12. I consider my English writing skills to be very good. (Choose one)

a. Strongly agree
b. Slightly agree
c. Neutral
d. Slightly disagree
e. Strongly disagree

13. In your own words, please explain how you prefer to receive electronic feedback from your online instructors in
your writing assignments.


14. In your own words, please explain why you prefer certain types of electronic feedback from instructors but not
others.

15. How much experience have you had receiving electronic feedback in online courses? (Choose one)

a. 1 course
b. 2-4 courses
c. More than 4 courses

16. I am the following: (Choose one)

a. Undergraduate student
b. Graduate student

17. English is my first language. (Choose one)

a. Yes
b. No

18. My age is: (Choose one)

a. 18-20
b. 21-24
c. 25-29
d. 30-34
e. 35-39
f. 40-44
g. 45-49
h. 50-54
i. 55-59
j. 60-64
k. 65+

19. My area of study is: (Choose one)

a. Business
b. Information Technology
c. Health Sciences
d. Social Sciences
e. Humanities
f. Other (please indicate __________________ )


Appendix B

Interview Questions

1. One of the survey questions asked you how you feel about instructors correcting your writing errors by editing
them with track changes. How do you like to have your errors addressed electronically? Why?

2. Please describe where you like instructor comments to appear in your papers. What are the reasons you like that
approach?

3. One of the survey questions asked about your preference for grading rubrics, which describe how well you met
assignment expectations in categories such as content, organization, grammar, and style. How do you feel
about the use of grading rubrics?

4. In your survey, you indicated that you liked certain types of comments but not others (e.g. simple affirmations,
questions, corrections). Please explain why you like some types of comments but not others.

5. Please describe a positive experience you had with an instructor’s electronic feedback in an online course. Why
did you find the feedback helpful?

6. Please describe a negative experience you had with an instructor’s electronic feedback in an online course. Why
did the feedback seem unhelpful?

7. The survey focused primarily on text-based feedback such as track changes and comments. What other types of
electronic feedback do you prefer (for example, audio comments, video files, or something else)? Why do
you like this type of feedback?

8. When you think about your development as an academic writer, how has your online instructor’s electronic
feedback helped you improve your skills? What types of feedback have not been helpful? Why?

Copyright of International Journal of Teaching & Learning in Higher Education is the
property of International Society for Exploring Teaching & Learning and its content may not
be copied or emailed to multiple sites or posted to a listserv without the copyright holder’s
express written permission. However, users may print, download, or email articles for
individual use.
