Grading ACT Essays

The writing test is one of the five sections that make up the ACT. Each student’s essay is evaluated against the ACT essay scoring rubric, which covers four areas, or domains: ideas and analysis; development and support; organization; and language use and conventions. The scores a student receives in each of these domains contribute to the total score on the essay.

Let’s examine the scoring process for the writing test and take a closer look at the ACT essay scoring rubric:

The Scoring System for the ACT Essay
Each student’s essay is evaluated by two readers who are trained on the ACT essay rubric. Each reader awards one to six points in each of the four domains of the ACT writing rubric, and the two readers’ scores are added together to produce a total score for each domain. If the individual scores of the two readers differ by more than one point, a third reader is brought in to re-evaluate the student’s essay. Otherwise, the essay receives a total score based on the domain scores awarded by the two readers.
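The scoring flow described above can be sketched in a few lines of code (a minimal illustration; the function and variable names are our own, not ACT’s):

```python
def needs_third_reader(reader1, reader2):
    """True if the two readers' 1-6 scores differ by more than one point in any domain."""
    return any(abs(a - b) > 1 for a, b in zip(reader1, reader2))

def domain_totals(reader1, reader2):
    """Add the two readers' scores in each domain, yielding a 2-12 total per domain."""
    return [a + b for a, b in zip(reader1, reader2)]

# Two readers who agree within one point in every domain:
r1, r2 = [4, 3, 4, 3], [4, 4, 4, 3]
print(needs_third_reader(r1, r2))  # False, so no third reader is needed
print(domain_totals(r1, r2))       # [8, 7, 8, 6]
```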

Ideas and Analysis
The first item in the ACT essay rubric concerns ideas and analysis. Essay graders evaluate a student’s ability to understand and express the ideas contained in the given issue. In order to achieve a high score on the essay, students must also be able to understand the different perspectives offered on the issue. An essay should contain relevant ideas expressed in a clear, succinct fashion.

Development and Support
Students who achieve a high score in this domain offer solid evidence to support their points of view. In fact, they provide specific examples that help to support their perspectives. Students are able to convey their ideas in a way that is easy to understand. They take their audience into account as they craft their arguments. At the end of the essay, the reader should be able to see a student’s way of thinking regarding the given issue.

Organization
Students receive a score for the way they organize their essay. Their ideas should be arranged in a logical order that aids the reader’s understanding, with smooth transitions from idea to idea. An essay should have a clear purpose and end with a conclusion that sums up the student’s thoughts on the issue. A typical format for an ACT essay includes an introduction, three or four body paragraphs, and a solid conclusion.

Language Use and Conventions
Essay graders evaluate a student’s skill at using written language to clearly express ideas. A student’s grammar, spelling, and mechanics all play a part in a grader’s final evaluation of the essay. Incorrect punctuation and misspellings are a distraction for essay readers. A student who can use vocabulary, phrasing, and sentence style to convey ideas in an effective way will receive a high score in this domain.

Tips for Writing an ACT Essay
Students who want to excel on the ACT writing test should practice their essay-writing skills on a regular basis. Practice is all the more effective if a student also studies high-scoring ACT essays; that way, they can rehearse including all of the components necessary for an essay worthy of a high score.

Another tip for writing a convincing ACT essay is to learn new vocabulary words. Students can use these vocabulary words to fully express the ideas in their essay. Plus, learning these words can also be useful in answering questions in the reading section of the ACT. Students can also benefit from making practice outlines. A solid outline can help students organize all of their ideas and supporting evidence. Furthermore, an outline is a helpful guide if a student loses their train of thought while writing the essay on test day.

Our encouraging instructors at Veritas Prep can provide students with guidance on the essay portion of the ACT. Also, we can advise them on the various components of the ACT essay rubric. We hire instructors who achieved a score of at least 33 on the ACT: Veritas Prep students learn from tutors who have real-life experience with the exam! Choose from our in-person or online prep courses and gain the confidence you need to ace the ACT.

Still need to take the ACT? We run a free online ACT prep seminar every few weeks. And be sure to find us on Facebook, YouTube, Google+ and Twitter!


Will colleges use ACT’s concordance or calculate the average domain score? Will the results always be the same?

By trying to give a variety of ways of thinking about Writing scores, ACT seems to be confusing matters more in its “5 Ways to Compare 2015–2016 and 2016–2017 ACT Writing Scores” white paper. If there are so many ways to compare scores, which one is right? Which one will colleges use? Why don’t they all give the same result?

Most students are familiar with the concept that different raw scores on the English, Math, Reading, or Science tests can produce different scaled scores. The equating of forms can smooth out any differences in difficulty from test date to test date. When ACT introduced scaling to the Writing test, it opened up the same opportunity. In fact, we have seen that the same raw score (8-48) on one test can give a different result on another test. Not all prompts behave in the same way, just as not all multiple-choice items behave in the same way. This poses a problem, though, when things are reversed. Suddenly ACT is saying to “ignore all that scaling nonsense and just trust our readers.” Trusting the readers helped get ACT into this mess, and ignoring the scaling is hard to do when an estimated one million students have already provided scaled Writing scores to colleges.

Because of the peculiarities of scaling and concordances, the two comparison methods that ACT suggests (calculating a new 2-12 score from an old score report, or using a concordance table) can produce differing results.

On the April 2016 ACT, a student with reader scores of {4, 3, 4, 3} and {4, 4, 4, 3} would have a raw score of 29 and would have received a scaled score of 21. To compare that score to the “new” score range, we could simply take the rounded average of the four domain scores: 29/4 = 7.25, which rounds to 7.
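The arithmetic above can be reproduced directly (an illustrative sketch; the variable names are ours):

```python
reader1 = [4, 3, 4, 3]
reader2 = [4, 4, 4, 3]

# Sum the two readers' scores in each domain: [8, 7, 8, 6]
domain_scores = [a + b for a, b in zip(reader1, reader2)]

raw_score = sum(domain_scores)          # 29
rounded_average = round(raw_score / 4)  # 29/4 = 7.25, which rounds to 7

print(raw_score, rounded_average)  # 29 7
```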

An alternative provided by ACT is to use the concordance table (see below). We could look up the 21 scaled score the student received and find that it concords to a score of 8.

Same student, same test, same reader scores, different result. This is another place where percentiles can give false readings. The difference between a 7 and an 8 is the difference between the 59th percentile and the 84th percentile. That’s a distressing change for a student who already thought she knew exactly where she had scored.

It would seem as if directly calculating the new 2-12 average would be the superior route, but this neglects the fact that some prompts are “easier” than others, which is the whole reason the April scaling was slightly different from the scaling in September or December. There is no psychometrically perfect solution; reverting to a raw scale has certain trade-offs. We can’t unring the bell curve.

Below is the concordance that ACT provides to translate from 1-36 scaled scores to 2-12 average domain scores.

Scaled 1-36 Score | Concorded 2-12 Score
1-3   | 2
4-7   | 3
8-10  | 4
11-13 | 5
14-17 | 6
18-20 | 7
21-24 | 8
25-27 | 9
28-30 | 10
31-33 | 11
34-36 | 12
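The concordance above can be encoded as a simple lookup, which makes the 7-versus-8 discrepancy from the April 2016 example easy to see (an illustrative sketch, not an official ACT tool; the values are transcribed from the table):

```python
# Scaled 1-36 score -> concorded 2-12 score, transcribed from the table above.
CONCORDANCE = {}
for scaled_range, concorded in [(range(1, 4), 2), (range(4, 8), 3),
                                (range(8, 11), 4), (range(11, 14), 5),
                                (range(14, 18), 6), (range(18, 21), 7),
                                (range(21, 25), 8), (range(25, 28), 9),
                                (range(28, 31), 10), (range(31, 34), 11),
                                (range(34, 37), 12)]:
    for scaled in scaled_range:
        CONCORDANCE[scaled] = concorded

# The student from the April 2016 example:
raw_score = 29     # sum of the four 2-12 domain scores
scaled_score = 21  # as reported on the old 1-36 scale

print(CONCORDANCE[scaled_score])  # 8 via the concordance table
print(round(raw_score / 4))       # 7 via the rounded domain average
```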
