The Internet as a Means of Assessing the NJ Core Content Standards in Music: What Happened?
By Dr. James Frankel
As you may recall from an article I wrote a few years ago, my doctoral research project centered on using the Internet to alternatively assess the Core Content Standards in Music, and on whether the Internet was a viable assessment tool for these particular standards. The project emerged directly from my frustration with the original state-proposed assessment of the NJSCCC Standards in Music in October of 1999, a multiple-choice-only examination. As a member of the Curriculum Framework Writing Team for the standards, I did not believe that the state-proposed assessment followed the philosophy in which both the standards and the curriculum frameworks were written. I then looked for a viable method of alternatively assessing a large number of students on the NJSCCC Standards in Music. After a brief review of the literature, the idea of Internet-based assessment of the NJSCCC Standards in Music was born. It is my belief that the website created, www.musicassessment.com, could make a significant contribution to the field of music education in the State of New Jersey. This research project set out to determine whether that belief was justified.
Eight months later, in June of 2000, www.musicassessment.com went "live," opening to the participants on March 27, 2001. Participants were recruited through articles written in Tempo and through two presentations at NJMEA In-Service Conferences in February of 2000 and 2001. An accidental sampling of twelve teachers participated in the study. The participants filled out an Initial Teacher Questionnaire to provide me with some preliminary information; it showed that they represented a wide array of experience, geographic locations, and levels of technical expertise.
Following their agreement to participate in the study, all e-mail communications between the participants and the researcher were saved, coded, and analyzed. The messages sent by the participants were primarily requests for assistance in submitting student work.
At the conclusion of the project, the teacher participants were asked to complete a 30-question, online Likert-style survey about the project. The questions were divided into four categories: Attitudes towards the Standards and their Assessment, General Attitudes towards Assessment, Attitudes towards the Internet, and Attitudes towards the Website. All 12 of the teacher participants completed the survey. The results showed overall support for the rationale behind the creation of the website and for its viability as an assessment device. Results were mixed, however, as to its viability at the statewide level.
After the students completed their assessment activities, they were asked to complete a twenty-question, online Likert-style survey about their experience using the website and completing the assessment activities. The questions were divided among three categories: Attitudes towards the Internet, Attitudes towards the Assessment Activities, and Attitudes towards the Website. Of the one hundred seventy-four students who submitted work during the project, eighty-six (49.42% of the sample) completed the survey. The results show that students were very comfortable using the Internet and enjoyed the activities on the website. They also preferred taking tests on the Internet over traditional pencil-and-paper tests. The students did, however, recommend more attention to graphics and the inclusion of more game-like activities.
The student participants submitted 340 completed assessments from 174 different students, representing a great variety of work. The majority of the submissions (71.18%) received a score of proficient or advanced. These results speak well of both the quality of instruction these students have received from their teachers and the student participants' overall musical knowledge and ability.
The results of the four systematic web reports collected throughout the research project show that the website had no technical difficulties while it was online.
During the last week of June 2001, I contacted all of the participating teachers by telephone to ask eight questions that could not be answered effectively by the online survey. The participants were asked via e-mail to set up a convenient time for the phone interview. The results of the Teacher Phone Interviews supported the findings of the online survey and gave a richer description of the teachers' attitudes towards the website. The teachers generally felt that the website was a viable assessment device for the NJSCCC Standards in Music, but they expressed the feeling that the website, in its current form, would not be viable on the statewide level in terms of 100% participation. The teachers also noted that time played a major role in their selection of the activities to complete; they opted for the activities that took the least amount of time to administer.
The infrastructure of the Internet currently in place makes an Internet assessment more than viable. Many universities, and standardized tests such as the GMAT, already use web assessment as a regular part of their operation. With the advanced web authoring software now readily available, including Macromedia's add-on to Dreamweaver called CourseBuilder, creating web-based assessment is quite simple. This study, however, focused primarily on the second definition of viability: whether teachers would use this site comfortably and regularly on a statewide basis.
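To give a sense of how little machinery an auto-scored online item actually requires, here is a minimal sketch of the scoring logic behind a web-based multiple-choice quiz. This is a hypothetical illustration written in plain JavaScript, not the output of CourseBuilder or the code used on www.musicassessment.com; the function name and data are invented for the example.

```javascript
// Score a student's multiple-choice answers against an answer key.
// Returns the percentage correct, rounded to the nearest whole number,
// which a site could then map onto levels such as "proficient" or "advanced."
function scoreQuiz(answers, key) {
  let correct = 0;
  for (let i = 0; i < key.length; i++) {
    if (answers[i] === key[i]) correct++;
  }
  return Math.round((correct / key.length) * 100);
}

// Example: a hypothetical four-question listening quiz.
const key = ["b", "a", "d", "c"];
console.log(scoreQuiz(["b", "a", "d", "a"], key)); // 75
```

In practice the answers would arrive from an HTML form submission, but the grading itself reduces to a comparison loop of this kind, which is why the authoring tools mentioned above can generate such assessments automatically.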
It is clear from the data collected that teachers are able to use the site when given clear directions, but they are not completely comfortable with it. Many of the teachers needed extra guidance in attaching files to e-mail messages. Others steered clear of activities that required extra technical expertise, such as using a scanner or a digital camera. The most commonly administered assessments required students either to take a multiple-choice test or to write an essay. Only two of the teachers chose the activity that required them to create sound files of their students performing. Not one teacher submitted a MIDI file or a HyperCard stack. Teachers also avoided activities that required students to research a given topic and write a short essay about it. This leads one to believe that although teachers feel traditional means of assessment cannot accurately measure a student's musicianship skills, those means are certainly easier to administer and therefore more common. It was a surprise that teachers who signed up for a project dedicated to alternatively assessing the standards would choose the activities closest to traditional means of assessment. Perhaps one reason is the teachers' lack of technology experience and training.
This website is fully functional, but from the data collected it is clear that it is not quite ready for "prime time." In order to ensure a place for online assessment in the future, more training is necessary, as well as a stronger commitment to training pre-service teachers at the undergraduate level.
This project has uncovered some interesting questions that require further research. The first is whether training teachers to use technology more comprehensively would have affected the viability of this project. The teachers who agreed to participate in this study must have had an inherent interest in technology. What about the teachers who do not share that interest? Does this website accommodate all of the various levels of technical expertise, so that teachers with little or no technology background could use it effectively? What changes should be made to the website to ensure that all teachers in the State of New Jersey would be able to use the site comfortably?
Perhaps a more salient question concerns the most commonly administered activities and why teachers chose them over the others. Many of the teachers stated that they chose the activities that were easiest to administer because of the amount of time taken away from their normal curricular objectives. How did time affect the student work and the teachers' selection of the assessments to administer? Why did some teachers select the more difficult activities? Was it because of their technical expertise? Why didn't high school music teachers participate in this study? Are there differences between primary and secondary music teachers when it comes to technology? Finally, it is recommended that a feasibility study be conducted to determine the exact cost of running this site on a statewide basis.
The content of the assessment activities was created by a team of 20 music educators from around the State of New Jersey to implement the standards, and it is not necessarily reflective of my perspective. These activities were used to provide continuity between the standards and the assessment; because the State is basing its assessment on these activities, so did this project. It is more important to note that what was being studied in this project was whether the Internet serves as a viable alternative assessment device for the standards.
The relatively small number of schools participating in the field test (12) does not indicate whether the Internet can be used as an assessment device on a larger scale. Because I served as the only evaluator in this study, and because of the vast amount of information that was sent for assessment, it was necessary to restrict the study to a small number of schools. It is possible, however, to use the same site to evaluate every student in the State of New Jersey. To make this happen, there would need to be a major upgrade in memory, and many more music teachers would need to serve as evaluators.
In the advertising for the website, there was a specific call for "technology-minded" music educators. While some of the schools involved in the study did not possess a great deal of technology, the music educators involved all shared the opinion that technology plays an important role in the music classroom. For the purposes of this field test of an assessment website, I considered the educators "technology-minded" by virtue of their agreeing to participate. While this factor might play a role in the widespread utilization of the Internet as an assessment device, this particular study investigated only the viability of an assessment website as an alternative to the multiple-choice format of the ESPA, GEPA, and HSPA.
Reliability of Student Sample:
Because each teacher chose the activities that his or her students completed, it is difficult to generalize the responses. In addition, factors such as race, gender, previous musical and technology experience, and socioeconomic status were not measured in this study.
The Internet will revolutionize large-scale assessment in the future. Music educators have long been at the forefront of utilizing technology in the classroom setting. Today, music online is one of the fastest-growing elements of the World Wide Web. In order to stay on the leading edge of instructional and educational reform, music educators must make an effort to use the Internet in their classrooms effectively. This project came about, in part, to create such an effective use of the Internet in the music classroom. It is my sincere hope that it serves that purpose, and that the music educators of New Jersey and the nation now have a viable alternative for assessing their students' musical learning online.
A very special thank you goes to all of the teacher-participants who were involved in this study. Your help with this project was sincerely appreciated.
If you would like to visit the site, it is still online. Just visit www.musicassessment.com.