Directing Technology/Evaluate


Set Goals

Establish and train the evaluation committee

Participants

A thorough evaluation requires the involvement of all constituents. In addition to involving students, parents, and teachers in the information-gathering phase of the evaluation, a committee should be formed that includes as many participants as possible. The composition of the committee will vary from district to district and will reflect that district's goals and priorities.
These may include the following:
  • Administrators: principals, department chairs, business office staff
  • Technology staff: network engineers, instructional technologists
  • Teachers from various departments
  • Students
  • Parents
  • School board members
  • Community members

Training

Once the committee members have been chosen, it is important to train them on how the evaluation process works.
  • Orientation to the training program
  • Goals and milestones for the process
  • Initial responsibilities assigned
  • Timeline and benchmarks for completion

Goals

The goals for the evaluation will differ depending on the school district. Most evaluation processes will serve to check the status of existing programs.

Possible goals:

  • Yearly technology evaluation
  • Evaluating a pilot program
  • Evaluating parts of the program such as teacher training or hardware implementation
  • Interim evaluation of a grant program

Formulate and review evaluation questions

After the initial training, the committee will meet to formulate key evaluation questions and to create indicators and rubrics for those questions. In many cases it is easier to break into smaller subcommittees to work on individual questions.

Key questions

There can be many reasons for an evaluation, and the beginning of the process will involve deciding its scope. Yearly technology evaluations, professional development programs, laptop program implementations, and incremental grant evaluations are some examples of programs you may be evaluating.
Some key questions to ask might be:
  • What have we done up until now?
  • Have we completed the tasks we set out to do?
  • Where are we now?
  • What is working well?
  • What needs improvement?
  • What do we need to do for the future?
  • Do we need to revamp our schedules?
  • Are there tasks we need to complete?

Indicators

Reviewing your district's technology plan and the implementation plans for professional development or laptop programs, and checking the grant guidelines, will help you create indicators for checking progress. You should have a schedule for each program and benchmarks for progress appropriate to the evaluation.
Indicators can take several forms:
  • Yes or no questions
  • Detailed narratives
  • Checklists
  • Rubrics

Subcommittees

Subcommittees can be formed to develop key evaluation questions and create indicators.

Develop indicator rubrics

Rubrics are a useful graphical way to gather information for the evaluation process. A rubric can combine many kinds of information; what you include will depend on the goals of your institution and exactly what you are evaluating.

Some possible uses for rubrics:
  • At the beginning of a training session, to assess participants' skills or comfort levels
  • Re-scored at the end of the session, to evaluate program effectiveness
  • At the end of a school year, to check the faculty's yearly progress
  • From year to year, in teacher evaluations or as data for technology program evaluations

Question editing

Question editing is important for fine-tuning the evaluation. Be as specific and succinct as possible when editing questions. You will want the most accurate information possible, so the rubrics must be easy to understand and fill out. Rubrics with short phrases are easier to read than ones with full sentences.

Additional input

It is helpful to include an area on the rubric form where the participant can write comments. You could also include a section for the evaluator's comments after interviewing a participant.

Finalized rubrics

  • Rubrics in the form of grids are easy to complete.
  • The far-left column lists the points of information you are scaling.
  • The columns to the right show the degrees of understanding or completion.
  • A scoring line at the bottom makes gathering data easier.
Here is an example of a simple rubric.

Category                 | Beginner                     | Intermediate                      | Expert
Basic Computer Operation | I do not use a computer.     | I can use the computer sometimes. | I use a computer all the time.
File Management          | I do not save any documents. | I save some documents.            | I save all files.
Totals                   |                              |                                   |
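
To make the scoring line concrete, here is a minimal sketch in Python of how a completed grid might be totaled. The point values per level (Beginner = 1, Intermediate = 2, Expert = 3) and the sample responses are assumptions for illustration, not part of any published rubric.

    # Hypothetical scoring of a completed self-assessment rubric.
    # Point values per level are assumed: Beginner=1, Intermediate=2, Expert=3.
    LEVEL_POINTS = {"Beginner": 1, "Intermediate": 2, "Expert": 3}

    def score_rubric(responses):
        """Return the total for the scoring line at the bottom of the grid."""
        return sum(LEVEL_POINTS[level] for level in responses.values())

    # Sample responses for the two categories shown above (hypothetical).
    responses = {
        "Basic Computer Operation": "Intermediate",
        "File Management": "Expert",
    }
    print(score_rubric(responses))  # -> 5

Keeping one point value per column makes the bottom Totals line a simple sum, which in turn makes year-to-year comparisons straightforward.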


Teacher Self-Efficacy Rubric [1]

Teacher Self-Evaluation Rubric Information: Beginner

Teacher Self-Evaluation Rubric Information: Advanced

Collect and Analyze Data

Collect data

It is important to have data that shows how technology is used to enhance teaching and learning. Within the context of the evaluation process, data collection is based on the indicator rubrics developed earlier. The main purpose is to collect detailed information in order to answer the questions and measure performance against the rubrics created by the committee. There are various tools useful for technology evaluation data collection, such as:

  • surveys
  • focus group interviews
  • observations
  • review of teacher/student work

Surveys

Surveys are the most common tool for collecting user opinions. Unique surveys can be designed for teachers, administrators, students, and/or community members. Survey formats can be online or hard copy. Online surveys usually have advantages over hard copy: they can produce immediate results that can be viewed online or downloaded in a variety of file formats for later analysis. Online surveys can be an efficient way of collecting data from a large number of teachers, especially when the district has a well-developed network. If the district does not have its own network, a few free or low-cost online survey tools are available on the market (e.g., Zoomerang). When conducting a survey, whether online or hard copy, a common, dedicated time for respondents to complete their survey is important. A dedicated ten minutes at the beginning or end of a staff meeting is a good way to complete a survey. If a paper survey is distributed in teachers' mailboxes with a note to "return it to the principal's office," it will probably produce a low response rate.
Here is a sample teacher survey that investigates teacher technology proficiencies, use of specific technologies at school, and teachers' perceptions of technology's impact on teaching and learning.
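
As a sketch of what "downloaded in a variety of file formats for later analysis" can look like in practice, the snippet below computes a simple response rate from an exported survey file. The file name responses.csv and the staff count are hypothetical placeholders, not part of any particular survey tool.

    # Minimal sketch: compute a response rate from an exported survey file.
    # "responses.csv" (one header row, one row per completed survey) and the
    # number of staff surveyed are hypothetical placeholders.
    import csv

    STAFF_SURVEYED = 120

    with open("responses.csv", newline="") as f:
        completed = sum(1 for _ in csv.DictReader(f))

    print(f"{completed} of {STAFF_SURVEYED} responded "
          f"({completed / STAFF_SURVEYED:.0%} response rate)")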

Focus group interviews

A meaningful evaluation will draw on data from several different sources. Besides surveys, focus groups are one of the most common and valuable data collection methods. As with surveys, you can collect more or less data by scheduling more or fewer focus groups, which can be made up of teachers, administrators, or technology staff.
Here is a sample teacher focus group interview that suggests an interview process and questions focusing on student access and use, teacher fluency, teacher vision and strategy, and teacher access and professional development. The wording of these interview questions can be modified for administrator or technology staff focus groups.

Observations

Sometimes the best way to collect information about people’s behavior is to watch them.[2] Observation is a way of collecting data without disturbing the people who provide the information. Building and classroom observations are the third and often most detailed method of data collection. Most evaluators spend time in schools and classrooms throughout the district. The goal is not only to observe teachers and students using technology, but also to observe classroom setups, teaching styles, and student behaviors. The information collected is then used to determine how technology impacts teaching and learning.
Here is a sample school building data sheet and a sample classroom observation template that can be used to observe where, what, and how technology is used in schools.

Review of teacher/student work

An assessment protocol for teacher and student technology work should be developed. This assessment should focus on examining how students of different grade levels and subject areas have used technology to enrich content-area learning, as well as how teachers have used technology to create and enhance their teaching materials and environment. The evaluation committee and school administrators should assemble a representative sample of student work for this assessment.
Kathy Schrock's Guide for Educators – Assessment and Rubric Information contains many subject-specific and general rubrics for reviewing and evaluating teacher and student work.

Others

Other data collection methods can also be used, such as collecting existing school records (attendance, grades, test scores) and holding public meetings to discuss a community's goals and concerns related to technology implementation.

***Remember that although data may be collected at the individual level, it should never be reported at that level. Our mission is to measure the progress of the district (or school) as a whole in meeting its goals, not to assess individual achievement. If individual assessment is important, there should be a separate evaluation to serve that purpose.

Analyze Data

Data should be summarized in a format that supports a range of statistical analyses and is easy for people other than evaluators to read. Visual aids are always desirable. Summary tables, bar charts, and before-and-after graphs are formats regularly used for data analysis.
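
As an illustration of such a format, here is a minimal sketch in Python using the pandas and matplotlib libraries. The file survey_results.csv and its school and score columns are assumptions made for the example, not a prescribed data layout.

    # Minimal sketch: summarize scores and chart them by school.
    # "survey_results.csv" and its "school"/"score" columns are hypothetical.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("survey_results.csv")

    # Summary statistics that readers other than evaluators can scan quickly.
    summary = df.groupby("school")["score"].agg(["count", "mean", "median"])
    print(summary)

    # A simple bar chart of mean scores per school, saved for the report.
    summary["mean"].plot(kind="bar", ylabel="Mean score",
                         title="Survey scores by school")
    plt.tight_layout()
    plt.savefig("scores_by_school.png")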

Recommend and Report

Score Rubrics Using Collected Data

Create a report that explains how scores were derived and the rationale behind them

The report must first summarize clearly where the data has led us and how each score was derived. For example, if your school has embarked on a pilot program to evaluate performance gains when an entire class is given laptops for course work, there must be a measure of what counts as an acceptable gain or difference-maker. It cannot be just a hunch or a perceived increase in student performance, nor a technical director offering anecdotal evidence. If you can't measure it, it does not count.
Once the scoring system is understood, including why certain results are weighted a certain way, it becomes clear which results were needed for the pilot to be deemed a success. Becoming more familiar with computers might be an outcome, but if no points are awarded for it because it carries no importance, then it does not matter to the technical director, as the sketch below illustrates. This is the part that details the rationale for why the pilot was structured a certain way. Likewise, something like a 10% gain in the class SAT Math score might be a highly desired result, might have above-average points assigned, and is measurable. That would be explained in the Rationale section, and it is the kind of result that would drive purchasing more laptops beyond just the pilot number.
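
To make the weighting idea concrete, here is a minimal sketch in Python. The outcome names, point weights, and achievement fractions are all hypothetical illustrations, not values from any actual pilot.

    # Minimal sketch: weight measurable outcomes by their assigned importance.
    # All names, weights, and achievement fractions are hypothetical.
    weights = {
        "SAT Math gain of 10%": 40,      # highly desired, above-average points
        "Homework submission rate": 30,
        "Computer familiarity": 0,       # no points assigned, so no importance
    }

    # Fraction of each target actually achieved, measured from collected data.
    achieved = {
        "SAT Math gain of 10%": 0.8,
        "Homework submission rate": 1.0,
        "Computer familiarity": 1.0,
    }

    total = sum(weights[k] * achieved[k] for k in weights)
    print(f"Pilot score: {total:.0f} of {sum(weights.values())} possible")

A scheme like this makes explicit that an unweighted outcome, however pleasant, contributes nothing to whether the pilot is deemed a success.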

Detail and documentation

The actual findings will need to be explained in the Detail section: in a sense, what the data we have collected actually means. This could be the total range of scores (high, low, median, etc.) and where they originated. It is also strongly encouraged that you present some of the key data in an easy-to-grasp bar chart, or express trends in a before-and-after graph. Remember, the data will not be helpful to the average layperson if they cannot understand its key meaning or outcome. Make it simple, keep it simple. Stay away from impressive explanations that rely heavily on statistical terms such as standard deviations or chi-square values, as the audience is not likely to be a research-oriented or academic group. The end result should be to make something complex easy to understand, not the other way around.

Final report must fully support Findings section

The report must contain only those recommendations that can be traced directly back to the data and what the Findings section tells us. Again, a hunch or perceived finding cannot be part of a recommendation. If the data is not there to support a conclusion, then it cannot be included. In this way, what is reported is based on a scientific process rather than introduced casually, and it must be fully documented.

Present Findings and Recommendations in a Final Report

Adapt or change recommendations for how to Improve

Beyond just findings, the final report should provide a path for how the district can adapt or change existing practices to achieve higher levels of performance in succeeding years. For example, if a particular pilot project showed a noticeable gain after one year of use, there should be a recommendation detailing the expected gains over several years if the same method is repeated. Introducing a 4th grade class to their first use of laptops for submitting homework assignments might be repeated with every 4th grade class over ten years to reach a specific goal. Likewise, using BlackBerrys in every 7th grade class may have shown no gains, and the report should advise that the practice be discontinued even if the BlackBerrys are free.
All recommendations must follow from the findings and must not be incongruent with the district's desired outcomes, as documented in the earlier indicator rubrics.

Review final report recommendations with Key Individuals who provide input

Prior to issuing a final report, the findings and the recommendations that follow from them must be reviewed with the district evaluation committee and the overseeing administrator; no one likes surprises. This is also a good step for ensuring that the report does not move forward with errors such as incorrectly referenced titles or buildings and misspellings, which would detract from the credibility of the findings. Additionally, if the report will reflect negatively on an individual, it is best to confirm that such language is permitted to stay; otherwise, there can be a full-scale attack on the report, and the recommendations will not be welcomed. The key individuals will be given the opportunity to modify, edit, tone down the language, change an adjective, or suggest additional information. After this input is received, the final report is ready. You will also have obtained the buy-in of the key individuals and know that nothing they see later will be a surprise.

Issue final report after reviewing last chance input

Provide a written copy of the report to the key individuals 24 hours before issuing it in written form to all other parties. Note in the report that there will be a presentation (by you, the Technical Director) and the day, time, and place scheduled for that event. Questions will be welcome at that time, and generally you should give everyone a minimum of five working days to digest the written report. Wear a nice suit and eat a good breakfast, as you'll need both.

Disseminate the report in a formal presentation

Presentation of final report for Reflection and Positive Change

The presentation of the report is the most critical stage of a formative evaluation effort because it establishes a common knowledge base for reflection. If you come across as arrogant or scolding in even a small way, it will hurt the message and inhibit positive change. Think of yourself as a trusted advisor when presenting, and mentally picture yourself as a member of the district team rather than an outsider or, worse, a hired change agent. Use phrases such as "we found," not "I uncovered," and be sensitive to the egos in the room as you speak about the findings in a neutral, these-are-the-facts manner. State diplomatically that an evaluation that is never shared with the community it evaluates never results in reflection, and no reflection means no positive change.

Use a professional font and storyboard theme in a PowerPoint presentation that expresses the three to five key points you wish to make, and include a final Summary slide of those points. Do not include animation, witty jokes, or any other casual humor to make light of the subject. For something based on scientific data and reporting, those will generally backfire and can be in poor taste, no matter how uniquely clever you might think they will be.

Be professional when presenting the slides (while standing, if possible) and ask a colleague to assist with advancing each slide (just say "next slide" when ready). Rehearse with a dry run in the actual room, using the same loaded laptop and projector, and time the practice runs. Ensure that you have enough time to speak adequately on each point, and allow sufficient time for questions. Practice, practice, practice.

Summarize key report Elements - Process, Findings, and Recommendations

At the conclusion of the PowerPoint presentation, it should be clear to everyone what key Process, Findings, and Recommendations resulted from the work; emphasize these again verbally after the last Summary slide has been shown. If the district has a functioning website, you may also indicate that the presentation will be archived there, provided the overseeing administrator has agreed in advance that you may say so. You should ask upfront, and in fact know for certain, whether any newspaper or press individuals will be present; even if they are not, keep in mind that you may be quoted. Excerpts from your report could become part of an article in the local newspaper, so be careful with your comments during the presentation itself and when answering questions. Nothing should be offered off the cuff or without advance thought. You are there to deliver the report findings, not to launch controversies, so stick to the subject matter at hand.

Lastly, thank everyone for their time and review of the written report.

References

  1. An Educator's Guide to Evaluating the Use of Technology in Schools and Classrooms, U.S. Department of Education, 1998. http://www.ed.gov/pubs/EdTechGuide/appc-6.html
  2. An Educator's Guide to Evaluating the Use of Technology in Schools and Classrooms, U.S. Department of Education, 1998.