Writing Samples: Employee Profiles

Dino Anzures Builds Scoring Capacity (2007)

"It is not just a test booklet, it is a real live person who we are trying to help by accurately assessing his or her strengths and weaknesses in a given area of learning," says Armando "Dino" Anzures, Measured Progress director of scoring services. "So it is extremely important that we do a good job of scoring for every student."

Anzures has led the Measured Progress scoring department since 2004. His experienced team, now consisting of 30 staff members, manages and trains thousands of temporary employees at locations across the country to accurately and consistently score student responses to open-response test items in state assessments.

Anzures came to Measured Progress from CTB/McGraw-Hill. "I was looking around for something to do after leaving the Air Force." He joined CTB as a scorer and was soon promoted to table leader and then supervisor. Eventually, as scoring manager, he was responsible for opening the Indiana scoring facilities in Indianapolis and Clarksville. He also directed the development and implementation of standard operating procedures.

Anzures's experience in setting up large operations stems from his international management background as a senior military officer responsible for the formulation, implementation, and application of communications computer systems worldwide. He managed a professional staff that supported more than 500 unit organizations and approximately 120,000 military personnel.

At Measured Progress, Anzures once again started with a small team and built an extensive operation. "The most exciting achievement during my time at Measured Progress has been the growth of our scoring capacity and our staffing.
We currently have 1,175 computer scoring workstations that we use in two shifts at four sites: Dover, New Hampshire; Troy, New York; Longmont, Colorado; and Louisville, Kentucky."

Measured Progress recently opened the Colorado and Kentucky facilities to accommodate the company's growth and its multi-year, multi-million-dollar contracts, such as the Massachusetts Comprehensive Assessment System, the New England Common Assessment Program, and the Kentucky Commonwealth Accountability Testing System. Computer stations for 432 scorers are set up in the 30,000-square-foot Kentucky facility. The Colorado facility provides 35,000 square feet of space for scoring operations, with 455 scoring workstations. During peak hours, each facility will employ more than 800 test readers. The addition of these facilities has more than tripled the number of test reader workstations. Anzures expects to fill all of these workstations to capacity in both daytime and evening shifts.

"In 2004, we scored around four million test questions," he says. "We scored 25 million in 2006 and expect to score more than 30 million student responses in 2007."

"In addition, Measured Progress now has the option to add more workstations through our iScore Web-based scoring system," says Anzures, citing seasonal facilities such as the State University of New York, the Christian Brothers Academy in Sento, and others. "During our peak work period this year, we will have 1,600 to 1,700 scorers working in two shifts, which over the course of the year accounts for more than 3,000 individual temporary staffing assignments."

"In order to support our increased capacity and large number of temporary employees, it has been necessary to increase our full-time staffing," Anzures adds. "We are very fortunate to have a great leadership staff. I feel blessed to have a very talented, versatile, and dedicated group of team players who pull together to meet our mutual goals and objectives.
It's not a me thing, it's a we thing!"

While managing daily scoring activities, Anzures has been involved in the successful development and implementation of complex scoring technologies. During his first years in educational testing, "I started with paper-and-pencil systems, and went through the whole development of electronic scoring."

Clients continually ask for faster delivery of test results. Online testing can now provide instant results for multiple-choice test questions. Students will soon be able to type their responses to open-response questions into an online form, allowing immediate delivery to a scorer's computer and eliminating the delays of shipping and scanning paper response booklets. "We can start scoring immediately," says Anzures. "And the digitized form of student responses will soon allow us to utilize artificial intelligence scoring. We are looking at various systems available on the market."

As Measured Progress scoring operations continue to grow and improve, Anzures can rely on an experienced scoring leadership team. "We are here to provide insight, improve processes, and make the workflow more efficient. I feel comfortable about the competence of our staff and the capacity of our systems," he says. "Every student response we score accurately leads to correct test results and improved student learning."

Scoring Staff and Operations at Measured Progress (sidebar, second page)

The scoring director responds to requests for proposals and is involved in the initial planning and budgeting processes for each contract. He works closely with program management, test development, measurement, and information technology staff, as well as our clients, to develop and maintain high-quality scoring guides and benchmarking, training, and qualification materials during the scoring process. He leads efforts to implement state-of-the-art scoring practices, procedures, and technologies.
The iScore administrators create the contract database and user authorizations in the iScore system, our patented software program that allows computer-based scoring of student responses. They are also responsible for data cleanup after scoring is complete, as well as the export of scoring results to data processing.

Chief readers, assistant chief readers, and quality assurance coordinators are content experts in science, math, writing, arts and humanities, and social studies who prepare benchmarking materials and lead the review of those materials, working closely with test development staff and our clients. After the client approves the benchmarking and training materials, they train, qualify, and monitor scorers during the scoring process.

Scoring center managers oversee all scoring activities at their site, coordinate the recruitment of all temporary scoring staff, manage scoring assignments, monitor scoring progress for all grades and content areas (e.g., math and writing) assigned to their location, and approve millions of dollars in payroll for accounts payable.

Scoring project managers, one assigned to each contract, oversee the contract from a scoring perspective and act as liaisons with contract management staff, data analysis staff, and the client while managing the content experts (chief readers, quality assurance coordinators, etc.) assigned to each contract. They also prepare scoring specifications at the beginning of the project, provide reports to the client during the scoring process, ensure that scoring progresses on schedule, monitor labor costs, and produce a final report at the end of the project.

The scoring production manager identifies the number of scorers required to score all grades and content areas of each contract within the predetermined scoring window and assigns the scoring workload across all scoring centers based on talent, prior experience, and capacity.
Administrative support staff at each of our locations help with all support requirements: supplies, equipment, purchasing, scanning, copying, mailing, ID badges, timekeeping for payroll, travel arrangements, assigning charges to contracts, researching unique scoring needs, and a multitude of other activities.

Our scorers (readers), trained by our chief readers, are qualified temporary staff members who spend a scoring season evaluating and scoring student responses according to the guidelines provided for each contract, grade level, and content area.