

B. Essay Questions (Short and Extended Response)


Essay questions are a more complex version of constructed response assessments. With essay questions, there is one general question or proposition, and the student is asked to respond in writing. This type of assessment is very powerful -- it allows students to express themselves and demonstrate their reasoning about a topic. Essay questions often demand higher-level thinking skills, such as analysis, synthesis, and evaluation.

Essay questions may appear to be easier to write than multiple choice and other question types, but writing effective essay questions requires a great deal of thought and planning. If an essay question is vague, it will be much more difficult for students to answer and much more difficult for the instructor to score. A well-written essay question defines the task clearly enough that students know what is expected and responses can be scored consistently.

Essay questions are used both as formative assessments (in classrooms) and summative assessments (on standardized tests). There are two major categories of essay questions -- short response (also referred to as restricted or brief) and extended response.

Short Response

Short response questions are more focused and constrained than extended response questions. For example, a short response might ask a student to "write an example," "list three reasons," or "compare and contrast two techniques." The short response items on the Florida assessment (FCAT) are designed to take about 5 minutes to complete, and the student is allowed up to 8 lines for each answer. Short responses are scored using a 2-point rubric: a complete and correct answer is worth 2 points, and a partial answer is worth 1 point.

Sample Short Response Question 
(10th Grade Reading)

How are the scrub jay and the mockingbird different? Support your answer with details and information from the article.

Extended Response

Extended responses can be much longer and more complex than short responses, but students should be encouraged to remain focused and organized. On the FCAT, students have 14 lines for each answer to an extended response item, and they are advised to allow approximately 10-15 minutes to complete each item. Extended responses are scored using a 4-point rubric: a complete and correct answer is worth 4 points, and a partial answer is worth 1, 2, or 3 points.
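
The two rubrics differ only in scale. As a minimal illustration (the names and structure here are hypothetical, not any real FCAT tool), the scoring rules above could be represented as simple data:

    # Hypothetical sketch of the two FCAT rubrics described above.
    # Names are illustrative; this is not an actual FCAT scoring tool.
    FCAT_RUBRICS = {
        "short_response": {"max_points": 2},     # complete = 2, partial = 1
        "extended_response": {"max_points": 4},  # complete = 4, partial = 1-3
    }

    def is_valid_score(item_type: str, points: int) -> bool:
        """Return True if `points` is a legal score for this item type."""
        return 0 <= points <= FCAT_RUBRICS[item_type]["max_points"]

    assert is_valid_score("short_response", 2)
    assert not is_valid_score("extended_response", 5)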

Sample Extended Response Question 
(5th Grade Science)

Robert is designing a demonstration to display at his school’s science fair. He will show how changing the position of a fulcrum on a lever changes the amount of force needed to lift an object. To do this, Robert will use a piece of wood for a lever and a block of wood to act as a fulcrum. He plans to move the fulcrum to different places on the lever to see how its placement affects the force needed to lift an object.

Part A: Identify at least two other actions that would make Robert’s demonstration better.

Part B: Explain why each action would improve the demonstration.

New Florida Writing Test Will Use Computers To Grade Student Essays


Florida writing tests will be graded by a human and a computer program, according to bid documents for the new test. And just 2 percent of students will take a pencil and paper exam in 2015.

A computer program will grade student essays on the writing portion of the standardized test set to replace the FCAT, according to bid documents released by the Florida Department of Education.

The essays will be scored by a human and a computer, but the computer score will matter only if it differs significantly from the human reviewer’s score. If that happens, the documents indicate, the essay will be scored by a second human reviewer.
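
In other words, the computer acts as a check on the human reader rather than assigning the reported score itself. A minimal sketch of that adjudication rule, with a hypothetical scoring interface and a discrepancy threshold the documents do not actually specify:

    # Sketch of the human-plus-machine adjudication described in the bid
    # documents. Function names and the threshold are assumptions; the
    # documents do not define "significantly different" numerically.
    DISCREPANCY_THRESHOLD = 1  # assumed: more than 1 point apart = disagreement

    def score_essay(essay, human_score, machine_score):
        first_read = human_score(essay)   # primary score: first human reader
        check = machine_score(essay)      # computer provides a check score
        if abs(first_read - check) <= DISCREPANCY_THRESHOLD:
            return first_read             # machine agrees; human score stands
        return human_score(essay)         # disagreement: second human reader

On this reading, the computer never determines the reported score directly; it only decides whether a second human read is needed.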

Florida writing tests are currently graded by two human scorers, and the state has never used computerized grading on the exam.

The Florida Department of Education announced Monday it chose the nonprofit American Institutes for Research to produce new tests tied to Florida’s Common Core-based math and language arts standards. Spokesmen for the agency and AIR said they had yet to sign a contract and were still working out the details, and they declined to comment on the specifics of the new test.

“It’s speculative at this point to think about what is on the assessments,” said Joe Follick, communications director for the Florida Department of Education.

But the bid documents show that using computers to grade the state writing test will save $30.5 million over the course of the six-year, $220 million contract with AIR. The change was part of a list of cost reductions that trimmed more than $100 million from AIR’s initial proposal.

The documents also indicate Florida will license its test items from Utah in 2015, the first year the new Florida test will be given. AIR will create Florida-specific questions by the time the test is administered in 2016, saving $20.4 million in licensing fees.

Florida would save another $14.5 million by limiting the number of pencil and paper tests in favor of online exams. The documents call for just 2 percent of tests to be delivered by pencil and paper in the first two years, and 1 percent in future years.

That would put more pressure on school districts to ensure they have the bandwidth and computers necessary to administer the new test.

And Florida will eliminate all paper reporting of test results, saving $14 million.

The use of computer-graded essays may become a necessity, said University of Akron researcher Mark Shermis, because Common Core-tied exams will expand the number of students taking writing exams each year.

Currently, Florida students in grades four, eight and ten take the FCAT writing exam. Under Common Core, students take a writing exam every year.

Florida and 44 other states have fully adopted the Common Core. The standards outline what students should know at the end of each grade.

“Even if you had the money,” Shermis said, “you wouldn’t have the people to do the vast amount of grading required under the Common Core State Standards.”

Shermis found that computer programs, including AIR’s AutoScore, performed at least as well as human graders in two of the three trials conducted so far. His research concluded computers were reliable enough to be used as a second reviewer for high stakes tests.
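
Agreement in these trials is typically summarized with quadratic weighted kappa, which penalizes large human-machine disagreements more heavily than near misses. A minimal sketch of that comparison, assuming scikit-learn and made-up score vectors:

    # Quadratic weighted kappa, a standard agreement statistic in the
    # machine-scoring literature. The scores below are invented for
    # illustration; 1.0 would indicate perfect human-machine agreement.
    from sklearn.metrics import cohen_kappa_score

    human_scores = [4, 3, 2, 4, 1, 3, 2, 4, 3, 2]
    machine_scores = [4, 3, 3, 4, 1, 2, 2, 4, 3, 1]

    qwk = cohen_kappa_score(human_scores, machine_scores, weights="quadratic")
    print(f"quadratic weighted kappa: {qwk:.2f}")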

But while the technology is improving, Shermis said districts need to study whether computer-graded essays put any class of students at a disadvantage.

Other researchers are less bullish on the technology.

“Of the 12 errors noted in one essay, 11 were incorrect,” Les Perelman of the Massachusetts Institute of Technology told our colleagues at StateImpact Ohio in 2012. “There were a few places where I intentionally put in some comma errors and it didn’t notice them. In other words, it doesn’t work very well.”

Many states and the two multi-state consortia developing Common Core-tied tests said they are watching computerized essay grading.

Utah has used computer essay grading since 2010, said Utah Department of Education spokesman Mark Peterson. The state trusts the technology enough that computers provide the primary scoring for the state’s writing exams. Peterson said state reviews have found fewer biases in computer grading than in human grading.

Utah uses Measurement Incorporated technology to grade essays and will switch to AIR when the current contract runs out, Peterson said.

Smarter Balanced spokesman Jackie King said the test would use only human grading on the writing portion, but that the technology is promising. Officials with the Partnership for Assessment of Readiness for College and Careers, or PARCC, said they have not yet made a decision about the use of computerized grading.

