The Associate Faculty Development Institute from the lens of Henderson and Gornik
The assumptions of this evaluation differ substantially from those of the first two concepts and are much more in line with the approach proposed by Johnson and La Salle, though not as initially focused on data-derived evaluation as their work.
Henderson and Gornik discuss evaluation in chapter 6, "Evaluating 3S Education," of their work Transformative Curriculum Leadership.
Among the important questions that they raise are these:
1. Who decides what will be assessed and evaluated? (Henderson & Gornik, p. 159) This assessment covers only a part of the curriculum: the presentation made for one section of the elective material called Complex Teaching Discussions, which covered five weeks of topics. The curriculum for this class was to discuss, in a group setting, topics of the teacher's choice. The topics were decided in advance by the curriculum committee; the presenter had the choice of how the topics were presented.
2. What questions are to be answered? The central question, from the viewpoint of the curriculum committee, is how the teachers will use the material presented in class to improve their students' understanding as it relates to the five areas of student engagement highlighted in the CCSSE report.
3. How might data be gathered and analyzed? Data were gathered from staff reports of their learning experience and from the instructor's notes of the students'/teachers' responses and questions.
4. What criteria will be used to interpret and judge data? We will look at the data from two directions: first, the number of areas in which the different sessions "map" to the areas of student engagement; and second, how the teachers' comments provide insight into their feelings toward the process and how they might use the knowledge gained from the sessions.
5. Who analyzes data, makes judgments, and uses judgments? The data will be analyzed by the investigator, who will map the areas of student engagement to the areas of the written and presented curriculum that address these interests. This mapping will help in the construction of the survey instruments that will be used to determine the success of the program. This will be done with the assistance of the curriculum committee and other faculty members as deemed necessary by the committee.
Data Analysis: Using the instructor's notes of student responses, I compared the subject matter discussed to the general areas listed for improvement in the student survey. The different sessions mapped primarily to the areas of academic challenge, active and collaborative learning, and student-faculty interaction. The analysis of the student/teacher responses showed interest in the topics presented, as well as a belief among teachers that they should not focus on student retention, which they regarded as an issue between the student and the school administration.
I note that the initial thrust of the 3S concept is not the initial concern of the curriculum committee. The emphasis of the data collection will be on raising student graduation rates, not on the ideas of self-understanding by teacher and student. This orientation may change as the evaluation and feedback are analyzed by the committee, but for the immediate future it is not being measured.
Conclusion:
There is no perfect curriculum, and success in one area may not mean success in another. At present, the focus is on improving student graduation success by addressing a range of factors that influence student engagement. In fairness to this approach, I have read a detailed examination of these factors, studied in a community college setting, that supports it. I also note that the other successful community college faculty development programs the College studied as models have used the "learning community" approach with great success.