Monday, April 22, 2013

Professional Development…again? -_-


Those are usually the words and faces we make when we hear about another teacher training, or what we call in Broward: Professional Development. Interestingly enough, through all my years of teaching, I have always questioned the effectiveness of our professional development teams, especially with Instructional Technology. All too often our trainings on “innovative technology” (yes…I intentionally used quotations) are mostly lectures and demonstrations of dated software and web apps that, at some point in time, were cutting-edge or innovative, but those days have long since passed. These trainings are usually staffed by one person facing a room of 40-50 teachers with varying computer skills, instructional preferences, and content areas, all forced to sit through the training when REALLY they’d rather be in their rooms getting caught up on grades for their classes. This may or may not pertain to me :) But really…how effective can these trainings be? We get credit when we sign in, sit through it, and complete the online evaluation at the end, which doesn’t ask questions relevant enough to provide meaningful feedback to the trainer.

There are a number of elements missing from these trainings that inhibit their effectiveness. The major piece missing: successful use of an Instructional Design Model. In fact, I doubt one is used to begin with. Comparing the steps of the ASSURE Model (http://www.instructionaldesign.org/models/index.html) with the trainer’s delivery of instruction, much is missing, as summarized below:

Analyze Learners: An analysis of the learner usually consists of nothing more than a “Raise your hand if you’ve used…” or “How many of you have experience with…” What? How is that going to give anyone real data on their audience?

State Objectives: IF the trainer mentions any learning objectives at all, they are often broad and vague rather than directly stating what the learner should be able to do by the end of the training.

Select Methods, Media and Materials: The “go-to” materials, media and methods are usually a projector and computer with a PowerPoint.

Utilize Methods, Media, and Materials: Often the trainers opt for direct lecture/“instruction” for the majority of the instructional time. Interesting how we must teach in “student-centered” environments, but that expectation does not hold true when the teacher becomes the student.

Require Learner Participation: Learner participation is often not required; again, the audience’s focus is on watching and listening rather than any type of hands-on activity. Furthermore, there is never anything to submit or turn in as evidence of participation or completion. Often, I’ve observed colleagues “checking out” of the trainings, mostly because they feel the training has little value to them and their instructional practice.

Evaluate and Revise: The only element used 100% of the time is the evaluation at the end. Participants are required to complete the evaluation to receive In-service points, so I question the accuracy of the evaluation results as well as how they might be used for revision. I have not seen much change, which leads me to believe little to no revision has been made based on these evaluation results.


So how do we change this? I suggest teaching more relevant topics with regard to technology. iPad/tablet computing, Web 2.0 tools, interactive whiteboards, and maybe even Digital Storytelling are topics more current and relevant to today’s teaching. I also suggest doing away with the “mandatory” nature of these technology-based staff developments. That method of teaching does not generate results; neither does having a disinterested/unmotivated learner endure such training. Why not create different learning modules and lessons for teachers to complete (similar to an online course), where they proceed through the modules at their own pace, complete various activities relevant to the task, and conclude with some form of deliverable that must be submitted as evidence that the learner did in fact participate and learn something? Additionally, I propose offering a variety of these courses from which teachers can choose based on their own skill levels and professional interests. Why make a teacher sit through an entire presentation on creating a podcast in GarageBand if they can barely upload photos to their Dell desktop?
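For anyone who likes ideas made concrete, here is a minimal, purely hypothetical Python sketch of the self-paced structure I’m describing. The class names, course titles, and completion rule are all my own assumptions for illustration, not any district system.

```python
# Hypothetical sketch of a self-paced PD course: teachers work through modules
# at their own pace and earn credit only after submitting each deliverable.
# All names and example data below are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Module:
    title: str
    activities: list      # tasks relevant to the module's topic
    deliverable: str      # evidence of participation that must be submitted
    submitted: bool = False

@dataclass
class Course:
    topic: str            # e.g. "Web 2.0 tools", "Digital Storytelling"
    skill_level: str      # lets teachers self-select instead of being mandated
    modules: list = field(default_factory=list)

    def is_complete(self) -> bool:
        # Credit is granted only when every deliverable has been turned in.
        return all(m.submitted for m in self.modules)

podcasting = Course(
    topic="Creating a podcast in GarageBand",
    skill_level="intermediate",
    modules=[Module("Recording audio", ["record a 2-minute clip"], "uploaded audio file")],
)
print(podcasting.is_complete())  # False until the deliverable is marked submitted
```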

Interesting thoughts…right? What are yours?

-Mark O

Friday, April 19, 2013

Highs and Lows of Learner Response Systems

As with everything ed-tech related, using learner response systems (LRSs) for instruction can be viewed as a double-edged sword. A learner response system uses wireless technology to connect hand-held devices (used by the students) to the teacher's computer. These devices are predominantly used to facilitate student interaction and engagement with a lesson. In my classes, we use these devices mostly for formative assessment during a lesson and occasionally for self-paced quizzes, if I've properly planned and prepared.

So why the double-edged sword comment? I'll begin with the negatives and end on a high note. The only issue I've had with using LRSs during lessons deals with student immaturity. CLASSROOM MANAGEMENT IS A MUST! As with most technology in the classroom, when classroom management slips, so do the quality and effectiveness of the lesson and, subsequently, student learning. I first used LRSs during my second year of teaching. While eager to try new things, I had not properly "planned for the worst" and arranged for the devices to be used in the most effective way. I let the students choose whichever device they wanted and respond via text to some of the questions. As you could imagine, with lower-level freshmen and sophomores mixed with high levels of anonymity, the words that popped up as responses were...let's just say...less than desirable and pretty offensive.

So how does one best use these devices? First (now I know), assign a student to each device. The truth is, they REALLY like using these devices. It becomes a game, and often a race to see who can answer the fastest and, of course, most accurately. Assigning each device to a specific student increases accountability, which minimizes immature choices from the students. My next suggestion is to cater your questions to the level of the audience. KNOW YOUR STUDENTS! Can they handle texting in an answer, or would it be best for them to respond numerically, with true/false, or with multiple choice? Lastly, be sure to assign some sort of value to their responses. Inevitably, the novelty will wear off, and students will want to know how you will use their responses. Points for participation, quiz grades, or maybe even some form of game where the highest-scoring team wins something? Again, I feel your audience must dictate how you use these devices. Push them and hold them to a higher level, yes, but keep the goals attainable.
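If it helps to picture the record-keeping behind those suggestions, here is a minimal, purely hypothetical Python sketch: every device ID is tied to one student, and each response earns a participation point plus a bonus for accuracy. The roster, device names, and point values are assumptions of mine, not any vendor's actual LRS software.

```python
# Hypothetical sketch only: illustrates assigning each clicker to a student
# and scoring their responses. Not based on any real LRS vendor's API.

ROSTER = {                 # device ID -> assigned student (example data)
    "clicker-01": "Alice",
    "clicker-02": "Brian",
    "clicker-03": "Carmen",
}

def score_question(correct_answer, responses, scores):
    """Give 1 point for participating and 2 more for a correct answer."""
    for device_id, answer in responses.items():
        student = ROSTER.get(device_id)
        if student is None:     # unassigned device -> anonymous answers don't count
            continue
        scores[student] = scores.get(student, 0) + 1
        if answer == correct_answer:
            scores[student] += 2
    return scores

# One multiple-choice round where the correct answer is "B".
scores = score_question("B", {"clicker-01": "B", "clicker-02": "D"}, {})
print(scores)  # {'Alice': 3, 'Brian': 1}
```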

Once YOU are set and ready, using LRSs is SO much fun! When used properly, they encourage engagement in all students, even the most shy and timid learners. In my experience, they provide another means of taking the focus off you as the instructor and putting it back on the content and material, where it belongs. Once you have properly designed and implemented LRSs in your lesson, most will be impressed by the increased level of student engagement in the lesson and the class. No doubt!

Good luck and have fun with them!

-Mark O

Friday, April 5, 2013

Storyboard for a math-related digital story.

Greetings!

As mentioned in past posts, I am working on creating a lesson that uses digital storytelling in a mathematics context. The example I created is for my 9th/10th grade Geometry Honors students. While this example focuses on calculating the area of polygons, the lesson could theoretically be applied to other mathematical topics, not just those in geometry. My ask is that you look through the storyboard below and provide me with feedback on what you see: likes, dislikes, strengths, weaknesses, critiques, errors, etc. Any thoughts or comments you feel would improve the example I will produce would be greatly appreciated.

Thank you for your support!

-Mark O