Thursday, November 24, 2011

Reflecting on the TPACK professional development (PD) design process

Recently I was involved in designing a TPACK professional development (PD) program for elementary natural science teachers in the Indonesian context. For this project, I worked elbow to elbow with two Indonesian friends. Everything, from the design process, to working with TPACK, to finding ways to stimulate teachers to integrate technology in education, to thinking about the feasibility of our design in the Indonesian context, was very challenging; but it later turned out to be a crucial eye-opener on the road to becoming a better TPACK PD designer. From this design experience, I gained insight into the many important factors that need to be considered before, during and after the design process. I summarize these factors into five categories, namely: context, literature, design and implementation, assessment and evaluation, and finally feasibility and up-scaling.

To start with, our design experience allowed me to immediately establish the significance of the context in guiding the development of an effective design. Under the context, we analyzed teachers, students, schools, government policy and the environment surrounding the school. For example, before designing, we made attempts to understand the teachers' knowledge (TPACK), skills, beliefs and attitudes in Indonesia. With regard to students, we analyzed their learning goals, preferences and backgrounds. In the school, curriculum content, school leadership style(s), access to technology tools and the school's infrastructure facilities guided our analysis, just to mention a few. The results of the analysis in turn informed our design decisions and choices. For example, the context helped us to identify the problem and, in turn, the goal of the TPACK PD. In addition, it aided us in choosing the technology to use during the workshop phase with teachers in that Indonesian context. In short, the complexity of the technology chosen was entirely context dependent.

Again, from that design experience, I can now define the context in my own words as the environment in which the PD design will be implemented to bring about the desired change. I also want to highlight that the context is a crucial factor that needs deeper analysis before you begin to design, because it will directly determine the level of impact during implementation of your design. In other words, you can be an excellent designer, but the moment your TPACK PD design fails to align with the context, rest assured that your design is slowly heading down the road to failure, if it escapes being branded irrelevant at all.

Finally, the design activity enabled me to understand better why the first TPACK framework proposed by Mishra and Koehler (2006) deserved the scathing criticism it received for failing to consider the context. Now I know why the two proponents of TPACK had to backtrack and include the dotted line around the TPACK framework to represent the context. Surely you cannot take a TPACK PD that works, for instance, in the Kenyan context and export it for use in the Dutch context. Similarly, a designer cannot adopt the design that works for school A for use in school B. That would be tantamount to professional suicide and a bad career move because of the differences in contexts. Therefore, since each context has its own unique characteristics, each design must be drawn up based on the uniqueness of that particular context. Designing with TPACK, to me, is not like a computer's cut-and-paste operation: each design must stand independently in its own context.

Lastly, my advice to other TPACK PD designers is that a thorough understanding of the context for which you are designing is a big milestone towards the success of your design.

Literature
At this level of the design we went about reviewing relevant literature that helped us (the designers) choose the right models to ground our design: for example, the TPACK model by Mishra and Koehler (2006), and TPACK measurement instruments such as the TPACK survey (Schmidt et al., 2009) and the TPACK rubric. The literature review also helped us find additional information about the context, as well as about how previous TPACK PDs were designed and implemented successfully. For those that failed, the literature can also give you an account of what caused the failure and the possible ways to remedy it. The literature thus became an important tool that informed some of our decisions and choices before, during and after the design process.

Design and Implementation
To me, this stage was made easier by our deeper understanding of the context and the literature review. Without the two, proceeding to this stage would be almost impossible. Therefore, armed with knowledge of the context and the literature, what now mattered was simply translating our prior analysis into a concrete plan of activities: for example, allocating time for all the activities, which in our case were workshops and lesson study, and choosing the best technology (MS PowerPoint) and pedagogical approach (collaborative problem solving). Perhaps what I found difficult in this part was choosing which activities to include or leave out, how they should follow each other logically, when to conduct them and how much time to allocate to each activity. This was a big challenge. In short, safeguarding a smooth logical flow of activities to promote the participants' cognitive construction of knowledge was no bed of roses for us. Furthermore, helping teachers develop and operationalize their TPACK also turned out to be difficult; it took us several days to get this aligned. From the design experience, I am now aware that any slight disorganization in the logical flow can be detrimental.

I suggest that, in order to guard against wasting money and time, all designers need to be careful with this stage, since it is another potential source of TPACK PD design failure if not handled well.

Assessment and evaluation
Professional development in science education is grounded in continuous improvement (Blank & de las Alas, 2008). Right from the start, I noticed how difficult it was to come up with one final, excellent TPACK PD design, because, being human, a designer may overlook certain factors or issues that emerge later on and affect the impact of the design. Your design therefore needs to be flexible enough to allow for revisions during the implementation process. For example, the views of teachers that arise during the workshop and implementation stages will need to be accommodated to help improve your design. It is for this reason that on-going assessment and evaluation became a key aspect in informing our revisions. In our case, we designed pre- and post-assessment instruments for continuous use throughout the process; for instance, the TPACK survey and the TPACK rubric were to be used. We also developed a TPACK lesson observation instrument based on the standards of the International Society for Technology in Education (2007) and the Partnership for 21st Century Skills (2008). Also catered for was a summative evaluation using the TPACK survey (Schmidt et al., 2009), which was meant to enable us to measure the degree of TPACK knowledge and skills acquired by the teachers at the end.
It is my belief that the knowledge gathered from the evaluation of the PD program, coupled with on-going monitoring, would indeed have influenced my thinking to a large extent about the design framework and about how designers collect data to improve their TPACK PD programs.

Feasibility and up-scaling
To me, a TPACK PD design is worthless if it comes to a halt after implementation and fails to consider feasibility and up-scaling. Feasibility is ensuring that the design is doable, while up-scaling is spreading the tentacles of TPACK PD knowledge and skills to other teachers within the district and beyond who, for one reason or another, did not get the opportunity to learn from the workshop. In our design we considered these factors well in advance, although I later came to realize that we had overlooked some of them. For instance, budget estimates, sources of funding, and motivation for participants such as certificates and the provision of lunch, just to mention a few, were overlooked and could slightly disturb the feasibility and up-scaling of the design. Therefore, whoever is tasked with designing a similar program should not repeat the same mistakes we made, but should learn from them.

Integrating technology into classroom teaching and learning is complex according to the TPACK framework. For teachers to effectively integrate technology in their teaching; they must synthesize their knowledge of curriculum content, teaching strategies and the affordances and constraints of technological tools and resources. Layered behind and underneath these three intersecting domains of knowledge is the context of the classroom, including social, political, and cultural factors, plus student learning styles and preferences, among many other considerations during the design process of a TPACK PD for teachers.

A better way to help teachers plan for technology integration is to base it primarily upon their students' curriculum-based learning needs, grounded in the TPACK framework. Our emerging understanding of how teachers plan for instruction, and of the contextual constraints teachers face daily in their classrooms, should guide the design of the TPACK PD. Only then can we design Learning Activity Types (LATs) to assist teachers in connecting curriculum-based learning goals with content-area-specific learning activities and complementary technology tools.


1. Blank, R. K., & de las Alas, N. (2008). Current models for evaluating effectiveness of teacher professional development: Recommendations to state leaders from leading experts. Washington, DC: Council of Chief State School Officers.
2. Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A new framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.
3. Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009). Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42(2), 123-149.