Scale

SCALE is the principle that enables institutions to offer their best educational value to learners and to achieve capacity enrollment. Institutional commitment to quality, together with finite resources, requires continuous improvement policies for developing and assessing cost-effectiveness measures and practices. The goal is to control costs so that tuition is affordable yet sufficient to meet development and maintenance costs and to provide a return on investment in startup and infrastructure. Metrics may compare the costs and benefits of delivery modes by discipline and educational level; faculty salary and workload; capital, physical plant, and maintenance investments; equipment and communications technology costs; scalability options; and/or various learning processes and outcomes, such as satisfaction levels and retention rates. These comparisons enable institutions to develop better strategic plans for market demand and capture, achieve capacity enrollment, develop brand recognition, and secure long-term loyalty among current and prospective constituents. Practices for scale help to leverage key educational resources while offering new online learning opportunities to students and faculty.

Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Brichaya Shah
Author(s): 
David E. Stone
Author(s): 
Derrick Sterling
Author(s): 
Kathryn C. Morgan
Institution(s) or Organization(s) Where EP Occurred: 
Instructional Design Unit: Office of Faculty Support and Development
Institution(s) or Organization(s) Where EP Occurred: 
Southern Polytechnic State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The Teaching Academy for Distance Learning (TADL) was created to provide faculty with a formal certification process and to help them develop quality online courses. The program began in fall 2008 as a 47-hour, three-part training course for Southern Polytechnic State University (SPSU) faculty, focused mostly on technology tools and their use in instruction. As part of an ongoing quality improvement process, TADL has continued to enhance the online learning capabilities at SPSU.

The program’s dependency on any particular software has decreased as it has evolved. The Instructional Design Unit (IDU) has worked with faculty to develop online courses across academic disciplines. Its balance between pedagogy and technology means the program format requires minimal changes as new technologies emerge. Key to the program is its team-based approach, which brings in expertise from instructional design, instructional technology, and digital media.

The TADL program is housed within the faculty-driven Center for Teaching Excellence (CTE). The CTE has explored a broad range of teaching and learning activities and has built strong relationships across campus. Faculty value the CTE, and its partnership helps validate the activities of the Teaching Academy for Distance Learning.

TADL evolved from a single face-to-face program into three formats: face-to-face, online, and blended. Within the program, faculty from across campus are brought together to build online courses and to discuss issues related to online learning. This has created a community of practice around online learning that supports informal learning networks within the institution and has allowed online learning to grow.

Description of the Effective Practice
Description of the Effective Practice: 

TADL is not just a faculty development program; it is a hands-on program that allows faculty to learn new skills and acquire the knowledge to design, develop, and deliver quality online courses at SPSU. Components of the course include weekly meetings (face-to-face or online), multimedia-rich learning modules, interactive learning objects that address different learning styles, and assignments that allow faculty to apply their newly acquired skills. While completing the course, participants have full access to a diverse team of instructional designers, digital media specialists, and an instructional technology specialist.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Upon completion of TADL, all newly developed courses were sent out for external review by instructional design professionals working in the field. This process has resulted in a 100% pass rate over the five years TADL has been offered. As the program matured, it moved from an informal process to a more formal review of course objectives, modular objectives, and course alignment. This review includes a subject matter expert reviewer from each participant’s department.

Course components are developed during the TADL program and feedback is given to participants as they progress. This continuous review, in combination with the external review of the developed courses, provides multiple opportunities for feedback.

TADL was developed for several reasons. First, single workshops and short-term training sessions were not valued as significant professional development for faculty. Second, there was demand for a more in-depth exploration of online learning and course development. Since its development, the program has become recognized and supported by several deans and department chairs, who insist that new hires go through the program and that their departments adopt some of the practices TADL instills in its participants.

How does this practice relate to pillars?: 

The TADL instructors’ practice aligns with the pillars of “Faculty Satisfaction,” “Learning Effectiveness,” and “Scale.” Faculty are empowered by the TADL experience and develop a support network with their peers, which allows for continuing discussion and learning outside of TADL. TADL is now offered in multiple formats (Hybrid, Fully-Online Instructor-Led, and Self-Paced) to accommodate faculty schedules and learning preferences, and many TADL resources are reused across formats. As a result of the TADL experience, some departments have developed standard templates for their courses that have unified the student experience throughout their academic programs. SPSU instructors exhibit learning effectiveness because, after successful completion of TADL, participants can continue to develop quality online courses that improve as new technologies become available.

Equipment necessary to implement Effective Practice: 

Many of the resources necessary to build this program would already exist at most institutions. We have made use of a classroom equipped with computer stations and common university software. Infrastructure required includes the learning management system, as well as a desktop/web conferencing solution.

Estimate the probable costs associated with this practice: 

We provide a small stipend ($1,000) for TADL participation, and departments often pay faculty for developing the course built as part of TADL, with the amount at the department’s discretion; it is often roughly what adjunct faculty are paid to teach a course.

References, supporting documents: 

An extensive description of the program, along with videos and TADL materials, is available online at:
http://spsu.edu/instructionaldesignsupport/TADL/index.htm

Contact(s) for this Effective Practice
Effective Practice Contact: 
Brichaya Shah
Email this contact: 
bshah@spsu.edu
Effective Practice Contact 2: 
Kathryn C. Morgan
Email contact 2: 
kmorgan@spsu.edu
Effective Practice Contact 3: 
David E. Stone
Email contact 3: 
dstone@psu.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Rick Lumadue, PhD
Author(s): 
Rusty Waller, PhD
Institution(s) or Organization(s) Where EP Occurred: 
Texas A&M University-Commerce
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Programmatic student-learning outcomes of an online master’s degree program at a regional university in Texas were assessed in this study. An innovative use of emerging technology provided a platform for the study, and the Astin Model provided the framework for the evaluation. The study offers a model for conducting well-informed instructional and programmatic assessments of student-learning outcomes. The results demonstrated that emerging technology can provide a platform for students to both showcase and preserve their ability to meet programmatic student-learning outcomes.

Description of the Effective Practice
Description of the Effective Practice: 

This online master’s degree program is taught using a fully interactive online format in a primarily asynchronous delivery model. Asynchronous activities used in the program included threaded discussions, video and audio presentations, written lectures linked to video and audio presentations embedded in the course management system, Voicethreads, faculty-developed MERLOT web pages created using the MERLOT Content Builder, e-textbooks, and the like.
The Astin Model (1993) provided a framework for this assessment. In the Astin Model, quality education not only reaches established benchmarks but also is founded upon the ability to transition students from where they are to reach intended competencies. An innovative use of MERLOT Content Builder combined with emerging technology provided a means for assessing the seven student-learning outcomes in an online master’s program at a regional university in Texas.
Two full-time faculty and one adjunct faculty used rubrics to evaluate each of the programmatic student-learning outcomes by assessing a random sample of student assignments from courses.
The goal of this study was to help students reach the intended learning outcomes for metacognition, digital fluency, communication, cultural fluency, global fluency, servant leadership, and commitment to life-long learning. These learning outcomes are defined as follows:
• Metacognition: Students will evidence metacognition by demonstrating the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading.
• Digital fluency: Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations.
• Communication: Students will be able to communicate ideas and content to actively engage participants.
• Cultural fluency: Students will evidence understanding of generational and cultural learning styles.
• Global fluency: Students will develop instructional materials appropriate for a global perspective.
• Servant leadership: Students will practice the principles of servant leadership as espoused by Robert Greenleaf in his work The Servant as Leader (1984). According to Greenleaf, “The servant-leader is servant first. It begins with the natural feeling that one wants to serve first. Then conscious choice brings one to aspire to lead.”
• Commitment to life-long learning: Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials.
Digital education presents many challenges. Barnett-Queen, Blair, and Merrick (2005) identified perceived strengths and weaknesses of online discussion groups and subsequent instructional activities. Programmatic assessment is required for all institutions accredited by the Council for Higher Education Accreditation or the US Department of Education. Walvoord (2003) indicated that good assessment should focus on maximizing student performance. The following questions rise to the forefront: (1) Have graduates mastered programmatic expectations? (2) What relationships exist between student performance and other factors? (3) How can faculty improve the program based upon analysis of student performance? Walvoord further stresses the importance of direct assessment in determining student performance. Indirect measures may provide evidence of student-learning, but direct assessment is widely viewed as more valid and reliable.
Brandon, Young, Shavelson, Jones, Ayala, Ruiz-Primo, and Yin (2008) developed a model for embedded formative assessment. The model was collaborative and stressed embedded assessment. Their study highlighted the challenges of broad-based collaboration, given the difficulty of formally identifying partners and spanning large geographic distances. Price and Randall (2008) demonstrated the importance of embedded direct assessment in lieu of indirect assessment. Their research revealed a lack of correlational fit between indirect and direct assessment of the same aspect of student-learning within the same course in a pre- and post-test design. They documented a difference between students’ perceived knowledge and their actual knowledge. These findings further underscore the importance of direct assessment of student-learning. Walvoord’s (2003) findings further indicated the need for embedded direct assessment of student-learning owned and supported by those who will implement the change, including program faculty and students.
Gardner (2007) found that education has long wrestled with defining and assessing life-long learning. Though loosely defined as the continued educational growth of the individual, lifelong learning is rapidly rising to the forefront of 21st century education, assuming a more prominent place than it held in the 20th century. Brooner (2002) described the difficulty of assessing the intention to pursue learning beyond the completion of a program. Intention and subsequent performance are affected by many different factors including, but not limited to, normative beliefs and motivation. Educational programs have often been encouraged to avoid assessment of behavior beyond the point of graduation, as such behavior has been viewed as beyond the control of program educators (Walvoord, 2003). The question arises as to the importance of future behavior as an indicator of current learning.
Astin (1993) pointed out that educators are inclined to avoid assessment of the affective domain, viewing it as too value-laden. Accordingly, the cognitive domain became the de facto assessment area, even though affective assessment more closely paralleled the stated aims and goals of most institutions of higher education. The avoidance of assessment in the affective domain is well documented by Astin. The advent of social media tools coupled with e-portfolios offers some intriguing possibilities for assessment in the affective behavioral domain. Astin pointed out that a change in the affective domain should translate into changed behavior.
Secolsky and Wentland (2010) found many advantages to portfolio assessment that transcend regular assessment practices by providing a glimpse into non-structured behavioral activities. Behavior beyond the classroom can be captured and documented within a properly designed portfolio. Behavior that has not been directly observed by the teacher can be measured in light of portfolio submissions via a broad collection of relevant and targeted information. Established performance criteria can be assessed to measure student-learning and determine specific areas for programmatic improvement. Though Secolsky and Wentland point out that reliability and validity concerns still exist with portfolio measurement, they concur that portfolio assessment potentially gauges authentic student performance outside the educational environment. With the development of a portfolio transportable beyond program enrollment and across the life experience, the opportunity exists to assess the impact of the instructional experience upon real-time student performance. Evaluation of life-long portfolios promises to provide meaningful insight into the real-life impact of the educational experience. Astin (1993) viewed changed behavior over time as the real evidence of affective enlightenment.
An interesting finding from this study was the creative manner in which some students layered or nested other Web 2.0 technologies into their MERLOT web pages. Examples of layering or nesting included embedded student-developed Voicethread presentations, embedded open-ended discussion Voicethreads used to promote participation and feedback, embedded YouTube videos, embedded Prezis, and the like.
The integration of MERLOT GRAPE Camp peer review training into this master’s degree program has provided an additional platform for further research relative to the assessment of all seven programmatic learning outcomes of the program. For example, metacognition may be assessed as it relates to MERLOT’s peer reviewers serving as content experts in assessing materials that pertain to one’s field. Communication may be assessed through interaction with peers and peer reviews. Digital fluency is required to contribute to MERLOT. Cultural fluency may be demonstrated through peer reviewing submissions from MERLOT’s international community of partners. Global fluency may be measured through the development and contribution of appropriate content for use in a global community of learners. Servant leadership is reflected in MERLOT’s motto, “Give a Gift not a Burden!” (Gerry Hanley, 2010). Finally, the development of students into lifelong learners will help to establish the identity of the program. Student performance outside the program is one of the best measures of student-learning, and the MERLOT Content Builder, along with MERLOT peer reviews, is a tremendous platform for measuring student-learning outcomes.
Lifelong learning may be assessed by current and former students’ contributions of materials to MERLOT and by their peer reviews of materials contributed to MERLOT. As a benefit of being a MERLOT partner, the dashboard report provides information on contributions made by members of the partner organization. Contributions and/or peer reviews completed by students who have graduated from the program will be recorded in the dashboard report, making it a tremendous tool for measuring commitment to lifelong learning. Ultimately, this study has demonstrated that the MERLOT platform, combined with emerging technology, is integral to assessing student-learning outcomes in an online master’s program at a regional university in Texas. Other online degree programs should seriously consider the MERLOT Content Builder’s potential to help them assess student-learning outcomes.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

The Online Master of Science in Global eLearning equips specialists in education for practice in public education, private education, business, industry, and non-profit organizations. Learning and technology are intertwined as we develop the next generation of enhanced training, development, and teaching to engage learners with key components of instructional technology. Technology provides access to all forms of education and this program will teach educators how to implement technology across curricula and classrooms of all kinds. With a blend of theory and technical skills, this program will prepare teachers and corporate trainers alike.

Metacognition – Students will demonstrate the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading.
Five journal entries will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Metacognition rubric. Scores will be deemed acceptable at an average of 4.0 or higher on a 5-point scale in each of the areas of context & meaning, personal response, personal reflection, and interpretive skills.

The assessment was conducted by two full-time faculty and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Context & Meaning 4.27
Personal Response 4.13
Personal Reflection 4.40
Interpretive Skills 4.47

All standards were met.
Though all standards were met, the faculty noted that the personal response section scored the lowest at 4.13. Accordingly, the course EDUC 595 Research Methodology was expanded to include more opportunities for students to provide self- and peer-evaluation feedback on projects and assignments. Two assessments were recommended for AY 2013-2014: one course will be assessed in the fall and one in the spring.
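Because the same sample-and-threshold procedure repeats for each outcome below, a minimal sketch may help make the check concrete. The data, function name, and scoring workflow here are hypothetical illustrations, not the program's actual tooling; only the 4.0-on-5 cutoff, the four metacognition criteria, and the five-artifact sample mirror the text above.

```python
# Hypothetical sketch of the sampling-and-threshold check described above:
# draw five artifacts at random, average each rubric criterion across all
# reviewer ratings, and flag any criterion that falls below the standard.
import random
from statistics import mean

THRESHOLD = 4.0  # acceptable average on the 5-point metacognition rubric
CRITERIA = ["Context & Meaning", "Personal Response",
            "Personal Reflection", "Interpretive Skills"]

def assess(scores, sample_size=5, seed=0):
    """scores maps artifact id -> criterion -> list of reviewer ratings."""
    sampled = random.Random(seed).sample(sorted(scores), sample_size)
    report = {}
    for criterion in CRITERIA:
        ratings = [r for art in sampled for r in scores[art][criterion]]
        avg = round(mean(ratings), 2)
        report[criterion] = (avg, "met" if avg >= THRESHOLD else "NOT met")
    return report

# Demo with made-up ratings from three reviewers for eight journal entries.
demo = {f"entry{i}": {c: [random.uniform(3.5, 5.0) for _ in range(3)]
                      for c in CRITERIA}
        for i in range(8)}
print(assess(demo))
```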

Communication – Students will communicate ideas and content to actively engage participants.
Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Assessment of Digital Student Presentation Rubric by the full-time bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 42 on a 50-point scale in each of the five areas of purpose, organization, content, language, and voice & tone. The assessment was conducted by two full-time faculty and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Purpose 45.33
Organization 46.67
Content 46.00
Language 44.00
Voice & Tone 44.67
Technology 45.33

All standards were met, though faculty noted that language scored the lowest. The faculty decided to conduct two assessments for the next cycle, one in the fall and one in the spring.

The faculty modified an assignment in EDUC 515 Intercultural Education to give students an opportunity to develop their language skills through a project designed to heighten sensitivity to language that might be offensive in other cultures.

Two assessments were recommended for AY 2013-2014.

Digital Fluency - Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations.
Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Assessment of Digital Student Presentation Rubric by the full-time bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 45 on a 50-point scale in the area of technology.

The assessment was conducted by two full-time faculty and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Technology 45.33

The standard was met.
The faculty noted that students tended to use more familiar software and avoid emerging tools. Accordingly, EDUC 510 Utilizing Effective Instructional Technology was modified to require the use of at least one Web 2.0 program to complete an assignment.

The faculty will conduct two evaluations in AY 2013-2014.

Cultural Fluency – Students will evidence understanding of generational and cultural learning styles.

Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Cultural Fluency Rubric by the full-time bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 3.0 on a 4-point scale in the areas of knowledge & comprehension, analysis & synthesis, and evaluation.

The assessment was conducted by two full-time faculty and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 3.53
Analysis & Synthesis 3.07
Evaluation 3.67

The standard was met. The faculty noted that analysis & synthesis scored lowest. Accordingly, the curriculum for EDUC 552 Global Fluency was expanded to include group projects on the education systems of other cultures.

The faculty will also conduct two evaluations in AY 2013-2014.

Global Fluency – Students will develop instructional materials appropriate for a global perspective.

Five group project entries will be selected at random from a course offered in Summer 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Global Fluency Rubric. Scores will be deemed acceptable at an average of 2.8 or higher on a 4-point scale in each of the areas of knowledge & comprehension, application, and evaluation.

The assessment was conducted by two full-time faculty and one external faculty member on July 22, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 2.87
Application 3.00
Evaluation 2.87

The standards were met.

Faculty found student performance in this area to be adequate, though some challenges were noted in the use of stereotypes when identifying people from other cultures. In response, EDUC 515 Intercultural Education will be expanded to include a project in which students interview someone from a different culture to discover differing worldviews and share these findings in a forum with classmates.

Servant Leadership – Students will practice the principles of servant leadership as espoused by Robert Greenleaf.

Five student group project self-assessment packets will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Servant Leadership Rubric by the full-time bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 40 on a 50-point scale in each of the five areas of purpose, organization, content, language, and voice & tone.

The assessment was conducted by two full-time faculty and one external faculty member on July 22, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Servant Leadership 41.33
Strategic Insight & Agility 39.33
Building Effective Teams & Communities 44.00
Ethical Formation & Decision Making 43.33

The standard was NOT met for Strategic Insight & Agility.

Faculty noted problems with the effectiveness of feedback in the peer-evaluation assignment. Accordingly, the group peer assessment process has been expanded to include MERLOT GRAPE Camp training on conducting peer evaluations. All students will be required to complete MERLOT GRAPE Camp training, and these changes will be enacted in all new course sections.

Commitment to Life-Long Learning – Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials. Five portfolio entries will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Commitment to Life-long Learning rubric. Scores will be deemed acceptable at an average of 3.0 or higher on a 4-point scale in each of the areas of production of educational materials, publications, and presentations, including personal response, personal evaluation, and interpretive skills.
The assessment was conducted by two full-time faculty and one external faculty member on July 22, 2013; the external reviewer was added to strengthen the review. Results were as follows:

MERLOT Web Pages 3.4
Presentations 3.8
Peer Evaluations 3.60

The standard was met, though faculty noted that the MERLOT Web pages scored the lowest. The faculty decided to conduct two assessments for the next cycle, one in the fall and one in the spring.

The faculty modified an assignment in EDUC 528 Intro. to Presentation Design to make the MERLOT Web page a requirement rather than an option.

Two assessments were recommended for AY 2013-2014.

How does this practice relate to pillars?: 

1) Leveraging the MERLOT Content Builder with emerging technology to assess programmatic student-learning outcomes is scalable because it encourages more online instructors and instructional designers to consider integrating this model to measure how effectively assignments meet the goals of institutional effectiveness planning.

2) Increases access by providing open access, using MERLOT’s Content Builder combined with emerging technology, to showcase learning outcomes that students and faculty can assess regardless of location, as long as they have an internet connection.

3) Improves faculty satisfaction by providing faculty with open access to evaluate student assignments and assess programmatic student-learning outcomes for institutional effectiveness planning.
Since this model was used to complete a recent Institutional Effectiveness Plan for an online master’s degree program in preparation for a regional accreditation visit, other instructors can easily replicate it to evaluate their own programs.

4) Improves learning effectiveness by providing instructors with effective online strategies that are supported by empirical data from assessments of random samples of student assignments.

5) Promotes student satisfaction by providing valuable opportunities for interaction with the instructor and other students. Students work together on group projects for both synchronous and asynchronous presentations, and they are also assigned group and individual projects to evaluate the work of their peers and provide feedback. Rubrics are embedded in the grade book of the LMS to evaluate student assignments, and an evaluation tool for the programmatic student-learning outcome tied to each assignment is included in the grade book to assess the level of student understanding. Students regularly comment on how valuable these practices are to their learning experience.

Equipment necessary to implement Effective Practice: 

The only components strictly necessary are an internet connection and an LMS. In our program, students also used Camtasia, QuickTime, and Captivate to create videos for some of their individual projects. Group projects were completed using Google+ Hangouts, Skype, Voicethread, and Adobe Connect. Students also created MERLOT web pages, MDL 2 courses, and digital portfolios.

Some of the tools we used have costs associated with them. Here is a list of some of them:

• Synchronous tools: Adobe Connect, Google Hangouts, Google chats, Skype
• Asynchronous tools: Voicethread, MERLOT Content Builder, Prezi, MERLOT GRAPE Camp, Peer Review Workshop and Discussion Forums in LMS
• Reflective tools: Journals, self-assessments, and digital portfolios

Estimate the probable costs associated with this practice: 

The only additional cost would be optional and would involve the use of some emerging technologies that are not open source. All other resources used in this project were open source and we did not incur additional costs using them. There was essentially no budget for this project.

References, supporting documents: 

Astin, A. (1993). Assessment for Excellence. Westport, CT: Oryx Press.

Barnett-Queen, T., Blair, R., & Merrick, M. (2005). Student perspectives of online discussions: Strengths and weaknesses. Journal of Technology in Human Services, 23(3/4), 229-244.

Brandon, P., Young, D., Shavelson, R., Jones, R., Ayala, C., Ruiz-Primo, M., & Yin, Y. (2008). Lessons learned from the process of curriculum developers’ and assessment developers’ collaboration on the development of embedded formative assessments. Applied Measurement in Education, 21, 390-402.

Gardner, P. (2007). The ‘life-long draught’: From learning to teaching and back. History of Education, 36(4-5), 465-482.

Greenleaf, R. A. (2008). The Servant as Leader. Westfield, IN: The Greenleaf Center for Servant Leadership.

Price, B., & Randall, C. (2008). Assessing learning outcomes in quantitative courses: Using embedded questions for direct assessment. Journal of Education for Business, 83(5), 288-294.

Secolsky, C., & Wentland, E. (2010). Differential effect of topic: Implications for portfolio assessment. Assessment Update, 22(1), Wilmington, DE: Wiley Periodicals.

Walvoord, B. (2003). Assessment in accelerated programs: A practical guide. New Directions for Adult & Continuing Education, 97, 39-50.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Rick Lumadue
Email this contact: 
proflumadue@gmail.com
Effective Practice Contact 2: 
Rusty Waller
Email contact 2: 
rusty.waller@tamuc.edu
Award Winner: 
2013 Sloan-C Effective Practice Award
Author Information
Author(s): 
Kelvin Thompson, Ed.D.
Author(s): 
Baiyun Chen, Ph.D.
Institution(s) or Organization(s) Where EP Occurred: 
University of Central Florida
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The faculty development programs, instructional designers, and individual teaching faculty of the University of Central Florida have found affordances in integrating into their work the online teaching practices codified in the Teaching Online Pedagogical Repository (TOPR). The faculty development programs, instructional designers, and individual teaching faculty of other institutions can just as readily benefit from integrating TOPR entries into their work as enhancements to existing faculty development strategies. TOPR is freely available online under the terms of a Creative Commons BY-NC-SA 3.0 unported license at http://topr.online.ucf.edu.

Description of the Effective Practice
Description of the Effective Practice: 

The University of Central Florida (UCF) is one of the fastest-growing universities in the country, currently ranked as the second-largest public institution in the US with approximately 60,000 students. To meet students’ needs, over 30% of the university's student credit hours are generated by online and blended courses and nearly three-fourths of all UCF students take one or more online courses every year. As a result, the need for faculty development for online teaching has been increasing in recent years. The Center for Distributed Learning (CDL) at UCF provides a variety of faculty development offerings to meet these needs, including semester-long training programs, webinars, individual instructional design consultations, self-directed learning objects and others. It is a challenge to keep the professional development materials updated and streamlined. Further, as the number of individual faculty teaching online and blended courses at UCF and the associated number of instructional designers serving them has grown, it has been challenging to identify and disseminate emergent effective teaching practices. One initiative, the Teaching Online Pedagogical Repository (TOPR), is an effort to solve these challenges: http://topr.online.ucf.edu.

The Teaching Online Pedagogical Repository (TOPR) is a public resource for online faculty and instructional designers seeking inspiration from online teaching strategies that have proven successful for others. At UCF, we took the teaching practices that we endorse to our faculty members in our professional development programs and featured them in TOPR. These strategies are updated regularly by collaborating contributors, and we link to them in our faculty development programs. TOPR has also become a handy resource for UCF's instructional designers to use in individual consultations with faculty and in email responses. After instructors hear about TOPR, they bookmark the resource and come back to these strategies when they need new ideas for online teaching.

In the Teaching Online Pedagogical Repository (TOPR), each entry describes a strategy drawn from the pedagogical practice of online teaching faculty, depicts the strategy with artifacts from actual courses, and is aligned with findings from research or professional practice literature. Emphasis is placed upon practices that are impactful and replicable. TOPR entries are tagged with relevant keywords to aid discovery, and site visitors may also find entries by searching or by browsing a topical index. The index of published teaching practices from TOPR is available at: http://topr.online.ucf.edu/index.php/Pedagogical_Practice.
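As a rough, illustrative sketch only (the field and function names below are assumptions, not TOPR's actual schema or software), a tagged repository entry of the kind described above, with a simple keyword lookup, might look like this:

```python
# Illustrative only: a minimal data model for a repository entry of the
# kind described above (strategy description, course artifacts, supporting
# literature, keyword tags) plus a basic keyword/tag search.
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Entry:
    title: str
    strategy: str          # description of the teaching strategy
    artifacts: List[str]   # links to examples from actual courses
    literature: List[str]  # supporting research or practice citations
    tags: Set[str]

def find(entries: List[Entry], keyword: str) -> List[Entry]:
    """Return entries whose title or tags contain the keyword (case-insensitive)."""
    k = keyword.lower()
    return [e for e in entries
            if k in e.title.lower() or any(k in t.lower() for t in e.tags)]

# Demo with two made-up entries.
repo = [
    Entry("Use a discussion rubric",
          "Grade online discussions against an explicit rubric.",
          ["example-course-rubric.pdf"], ["(placeholder citation)"],
          {"discussion", "assessment"}),
    Entry("Social media icebreaker",
          "Open the term with a social introduction activity.",
          ["example-activity.html"], ["(placeholder citation)"],
          {"community", "engagement"}),
]
print([e.title for e in find(repo, "discussion")])
```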

The Teaching Online Pedagogical Repository (TOPR) is offered within a wiki which makes contribution and collaboration very easy, and all entries are provided as open resources under the terms of a Creative Commons license. Thus, faculty development programs, instructional designers, and faculty from other institutions can readily adopt and adapt TOPR entries for their needs.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Specific entries from the Teaching Online Pedagogical Repository (TOPR) are linked to from within UCF’s internal faculty development materials (e.g., LMS-based materials for UCF’s award-winning IDL6543 faculty development course) and external resources (e.g., the publicly accessible http://BlendedLearningToolkit.org site presented by UCF and the American Association of State Colleges and Universities). UCF instructional designers report sharing resources from TOPR routinely in consultations with teaching faculty. Anecdotally, some individual instructors have noted consulting TOPR for ideas. However, it is perhaps more telling to look at some of the evidence that has emerged outside of the UCF context.

The Teaching Online Pedagogical Repository (TOPR) was presented to a national audience for the first time at the 2011 Sloan-C ALN Conference. The theoretical underpinning and development background were presented in that presentation and may be reviewed at: http://ofcoursesonline.com/?p=132. Since then, promotion of TOPR has continued, and an editorial board comprised of leaders in online and blended learning from the US and Canada has been formed. (See http://topr.online.ucf.edu/index.php/Board.) The following evidence of TOPR use has emerged since that time.

While the exact number of institutions and individual faculty connecting to TOPR is infeasible to determine, it is clear that TOPR is proving useful beyond UCF. For instance, some other institutions include links to TOPR in their online resources. (See http://www.wabashcenter.wabash.edu/resources/teach-web-result.aspx?pid=3..., http://teach.granite.edu/?p=8983, and http://edtech.uvic.ca/edci335/wiki.) The statistics page for one custom URL for one TOPR entry reveals that the entry has been accessed hundreds of times from multiple countries. (See https://bitly.com/discussion_rubrics+.) Easily citable TOPR entries have even appeared in research articles (e.g., http://www.westga.edu/~distance/ojdla/winter154/eskey_schulte154.html) and at least one dissertation (i.e., http://ufdcimages.uflib.ufl.edu/UF/E0/04/43/67/00001/JOHNSON_M.pdf).

As of August 2013, the most popular of the 33 public TOPR entries (e.g., related to discussion rubrics, social networking, and discussion prompts) have each received tens of thousands of page views. (See http://topr.online.ucf.edu/index.php/Special:Statistics.)

The evidence above would seem to suggest that leveraging the online teaching practices codified in TOPR in faculty development materials, instructional designer consultations, or individual instructor inquiry is both replicable and potentially effective in supporting the Sloan-C pillars.

How does this practice relate to pillars?: 

Integrating practices from the Teaching Online Pedagogical Repository (TOPR):
1) Enables scale by allowing more online instructors and instructional designers to learn about effective online strategies.

2) Increases access by providing an open access online compendium for faculty development. Other institutions can link to TOPR in their professional development programs; instructional designers can recommend strategies to instructors with concrete examples; instructors can also use TOPR as a just-in-time resource whenever they need new strategies for their classroom.

3) Improves faculty satisfaction by providing faculty with open access to professional development resources that they can use in their daily classroom teaching. Since each strategy includes a detailed description and artifacts to support how the strategy is used in real classes, instructors can easily replicate these strategies in their own teaching.

4) Improves learning effectiveness by providing instructors with effective online strategies that are supported by literature.

Equipment necessary to implement Effective Practice: 

There are no extraordinary equipment costs associated with this practice. UCF maintains the Teaching Online Pedagogical Repository (TOPR). Contributors offer entries under the terms of a Creative Commons BY-NC-SA 3.0 unported license. Other institutions, instructional designers, and instructors can use TOPR with computer and internet access.

Estimate the probable costs associated with this practice: 

Costs associated with replicating this practice are negligible and equate to the opportunity costs of one's time in searching for practices of relevance within the Teaching Online Pedagogical Repository web site and applying them to one's work.

References, supporting documents: 

See attached and the links included within the evidence section.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Kelvin Thompson, Ed.D.
Email this contact: 
kelvin@ucf.edu
Effective Practice Contact 2: 
Baiyun Chen, Ph.D.
Email contact 2: 
baiyun.chen@ucf.edu
Author Information
Author(s): 
Dr. Marija Franetovic
Author(s): 
Dr. Richard Bush
Institution(s) or Organization(s) Where EP Occurred: 
Lawrence Technological University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The Master Course Shell practice reshaped the way courses were developed, archived, and delivered. It involved redesigning the institutional online course look and feel and the terminology used for both the front end of the course and the course repository. All Master courses were redesigned to fit the Master Course Shell, which included other standard components such as a syllabus template, an orientation module, and a uniform weekly look and feel for modules. Each Master course was reviewed to ensure the core course content aligned with the course’s learning objectives. In that way, Master courses could be reused by different faculty and across multiple sections.
The Master Course Shell practice included redesign by the course developer and faculty member within a development shell prior to teaching; the course developers then moved the courses into the Master course, and each semester the Master courses were pushed to active courses by IT. For quality improvement, course audits were completed three times a semester, and course content and delivery were reevaluated toward the end of the semester. The practice has improved the partnership between IT, course developers, faculty, and students; eliminated inconsistencies in processes, course design, and course delivery; allowed faculty to focus more on teaching; and is ultimately believed to improve student learning and satisfaction.

Description of the Effective Practice
Description of the Effective Practice: 

Problem. The basic look and feel of courses, the terminology, and the core course content were continually being changed by different faculty and from semester to semester. Course developers, faculty, and students were challenged in accessing core course content and navigating courses. There were inconsistencies in how faculty members were being trained and in course quality, both in alignment to course objectives and in course delivery. The existing course development practice of exporting and importing courses resulted not only in messy courses but also in a great deal of unnecessary time spent cleaning up courses and dealing with artifacts from past course iterations. Meanwhile, demand for online programs was increasing.

Solution. A review of the education literature found many articles on standardization, universal design, and templates, but very few references to a Master Course Shell practice. Using the Master Course Shell practice, a standard look and feel was applied to the courses, covering not only the structure of the course and the course repository but also the core content of each course. All existing and new courses were standardized in terms of a syllabus template, an orientation module, presentation guidelines for theory, core activities for practice, and an emphasis on aligned major assignments and assessments.
In addition to the initial redesign or design of new courses, courses were audited three times a semester against quality standards. Course developers worked with faculty on quality improvement based on course audit results and on mid-semester and end-of-term student evaluations. Overall, the Master Course Shell practice conserved cost, resources, time, and effort.
The Master Course Shell practice can be replicated easily, but it does require dedicated IT staff, course developers, and faculty; a learning management system and server space; and approximately a school year for the full development and first iteration of the intervention. A review of existing practices found a few universities and community colleges with a similar practice in place, some of which are listed or mentioned in the references.

Outcomes. There were a number of positive outcomes. The course structure redesign and the emphasis on identifying core course content brought about standardization and streamlined processes. This, in turn, affected other instructional design processes, such as the collaboration between IT, the course developers, and faculty, bringing about a greater focus on course quality improvement. Because of the standard look and feel of courses, course developers could also assist faculty from different colleges, improving response times and allowing high-touch solutions to faculty and student inquiries. With a greater emphasis on instructional design prior to teaching, faculty could devote more time to teaching and enriching the learning experience.
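As an illustration only, the development-shell-to-Master-to-sections workflow and the periodic audits described above might be sketched roughly as below. The function names, course code, and required components are hypothetical placeholders, not the institution's actual tooling; a real implementation would hinge on the learning management system and its APIs.

```python
# Hypothetical sketch of the Master Course Shell workflow: a reviewed
# development shell is promoted to a Master course, pushed out to the
# semester's active sections, and each copy is audited against the
# standard components. All names here are illustrative placeholders.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Course:
    code: str
    content: Dict[str, object] = field(default_factory=dict)

def promote_to_master(dev_shell: Course) -> Course:
    """Course developer moves the finished development shell into a Master course."""
    return Course(code=f"MASTER-{dev_shell.code}", content=dict(dev_shell.content))

def push_to_sections(master: Course, section_ids: List[str]) -> List[Course]:
    """Each semester, IT pushes the Master course out to every active section."""
    return [Course(code=sec, content=dict(master.content)) for sec in section_ids]

def audit(course: Course,
          required=("syllabus", "orientation", "modules")) -> List[str]:
    """Periodic audit: report any standard components missing from a course."""
    return [part for part in required if part not in course.content]

# Demo: one development shell (hypothetical course code) promoted and pushed
# to two sections, then audited.
dev = Course("MGT6013", {"syllabus": "...", "orientation": "...", "modules": ["week 1"]})
master = promote_to_master(dev)
for section in push_to_sections(master, ["MGT6013-01", "MGT6013-02"]):
    print(section.code, "missing:", audit(section))
```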

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

The following data sources are varied, including documents, reflections, interview quotes, and survey data from ranking reports and evaluations. The data come from multiple sources, including reflections from the course developer and the administrator, quotes from adjunct and full-time faculty, and student evaluations. The time period for this practice includes a semester of testing and development during summer 2012 and a year of development and intervention spanning fall 2012, spring 2013, and summer 2013. Because the practice is at a relatively early stage of development, further research is planned, especially with regard to learning effectiveness, student satisfaction, and retention rates.

Learning Effectiveness. Since the practice is still in its early stages, conclusive evidence of learning effectiveness does not yet exist; however, there are plans to evaluate learning outcomes across a representative set of courses, both online and traditional, before and after the Master Course Shell intervention.

Scale and Access. Reflection from the authors of this article is relevant here, as one brings the course developer perspective and the other the administrator perspective. The authors see the Master Course Shell practice as an effective way to develop, archive, and deliver courses. The working space is centralized, and internal processes between IT, course developers, and faculty are more streamlined. Other universities are encouraged to adopt the Master Course Shell, as it can improve practice in all Sloan-C pillars.

The following data indicate that a) institutional support for and b) student choice of online delivery of LTU courses steadily increased from before the intervention to after it:
• In Summer 2013, LTU Online accounted for 21% of the university total credit hours produced whereas prior to the intervention in Summer 2011, it accounted for 15%.
• In Spring 2013, LTU Online accounted for 10% of the university total credit hours produced whereas prior to the intervention in Spring 2011, it accounted for 8%.
• In Fall 2012, LTU Online accounted for 8% of the university total credit hours produced whereas prior to the intervention in Fall 2010, it accounted for 7%.

Faculty Satisfaction and the Other Four Pillars. Representative quotes from full-time and adjunct faculty surveyed for their feedback on the Master Course Shell practice are listed below. These data directly illustrate faculty satisfaction and, as a secondary data source, point to the practice’s effectiveness in terms of the other Sloan pillars: learning effectiveness, scale, access, and student satisfaction. Faculty satisfaction includes, but is not limited to, satisfaction with support and course developer consultations, an appreciation of the decreased workload while teaching (allowing more focus on teaching itself), and the sharing of an effective practice for the benefit of student learning.

“Keep the Master Course Shell…it is a structure that is standardized for all courses and very helpful. It kept me on track while developing the course and it gave students a similar look and feel. It eliminates scrambled courses, getting lost, and keeps students focused.” – FA_I1.1

“We didn't have the Master Course Shell when I developed my first online course, and I found myself constantly rearranging and redoing material after discussions with the Course Developer. The second time I developed a course it was much easier because of the standardized organization of the Master Course Shell. The feedback I got from the first course initially was that the students were somewhat confused as to where they should look to find content. The second course's feedback was very positive right from the start. Using the Master Course Shell also helped me communicate ideas and changes with my Course Developer.” – FA_I2.1

“Having a Master Shell for the course is really necessary if the course is not taught every semester. It allows the instructor to focus on instruction by refamiliarizing theirselves with the well documented content structure instead of having to redevelop the basic plan of instruction. ” – FT_I3.1

“In the pre-course phase, the Master Course Shell allowed me to place my course materials in one place. When I taught the course, this allowed me to focus on the teaching aspect of the course and not on developing course materials. In the post course phase, the Master Course Shell allowed me to reevaluate my course materials with ease and to make changes in order to improve the effectiveness of the course." – FT_I4.1

"...not only does the instructor need to be knowledgeable with the subject being taught and with the technology used to deliver the instruction, but also need to have a good structured design for the online course. A design that is characterized with consistency and alignment of the critical elements of teaching, and illustrates uniformity with how a course is delivered. Such a design requires knowledge and experience that many teachers/instructors may not have. I personally did not have such skills and knowledge until I have worked with our Course Developer...we started our process by reevaluating the core content of the courses including outcomes, presentation/demonstration of information, practice/feedback of skills and knowledge, and assessment of the overall outcome...this provided the consistency and standards that allow for students to know where to go and what to do, which resulted in helping them to focus on learning. In addition, it will enable future instructors to focus more on the teaching the concepts of the course instead of worrying about how to use the technology to effectively present the contents to the students. Moreover, sharing the same objectives with the Course Developer and being able to exchange ideas and brain storm the issues while we were developing six Master courses resulted in a good solid academic relationship based on respect and trust." – FA_I5.1

Student Satisfaction. In terms of student satisfaction, there may be an indirect link between the Master Course Shell practice and our 2012/13 school year rankings. In the U.S. News and World Report, Lawrence Tech University ranked 6th nationally for online undergraduate education and 1st in the nation for undergraduate online engagement. This could be due to our efforts in redesigning the look and feel of courses, our processes for capturing core content, and the improved consultations between course developers and faculty, which together created the opportunity for overall course quality improvement and enhanced student engagement.
Across fall 2012, winter 2013, and spring/summer 2013, our average student evaluation completion rate was 34%, which may indicate that students were willing to engage with evaluations compared to other universities with lower completion rates. A representative set of evaluations indicates that students reported being satisfied with their courses. A factor that may have negatively affected the response rate is that we were simultaneously piloting different evaluation software. There are plans to review student evaluation data in greater detail once the new evaluation software is in place, starting in fall 2013.

How does this practice relate to pillars?: 

Learning Effectiveness. When redesigning or creating the course with the Master Course Shell practice, the faculty members and the course developers reevaluate the core content of the course aligning it to program outcomes and accreditation bodies. They ask the question as to what is ‘essential to know’ versus what is ‘nice to know’. The latter may depend on each semester and individual instructor. They also consider whether the course objectives align with the content presentation, course activities and course assessments. They review whether the most appropriate technologies are being used and how they may maximize student to student, student to faculty, student to content, and student reflection interactions.
The intricacies of developing a Master course contribute to the quality of course design, teaching, and delivery, which in turn are believed to influence learning effectiveness. LTU Online management courses utilize case studies, and our architecture courses utilize the studio model for online sessions. Our online science courses facilitate laboratory experience through simulations and creative approaches to experiments. Our online humanities courses use multimedia to enrich the learning experience and ask open-ended, reflective questions to engage students. Students in our online math, computer science, and engineering courses solve real-world problems and engage in team projects.

Scale. The Master Course Shell has allowed for more streamlined processes between IT, course developers, and faculty. Creating the Master Course Shells delivered for the fall 2012, spring 2013, and summer 2013 terms required additional time up front; however, that investment has already begun to yield more work cycles for enhancing individual course quality between course developers and faculty members.
The practice is also scalable because Master courses developed by subject-matter-expert faculty of record are now in place and ready for whoever teaches the course. In addition, each Master course can be pushed to however many sections are offered. The Master course is reusable from semester to semester yet flexible enough to accommodate updates and changes at the time of teaching and review. Though there is still room for improvement in documentation, learning management system functionality, and processes in general, the institution as a whole has a more effective and efficient approach to developing, archiving, and delivering online courses.
Additionally, the partnership among IT, course developers, faculty members, and students improved in terms of communication, coordination, and response times. For example, a course developer with whom a faculty member is not working directly can assist faculty or students with greater ease because courses are structured and delivered uniformly.

Access. Access is often evaluated with how many people can access the materials, the ease of access, and the resources available. In this case, the Master Course Shell practice is effective because the same core content is delivered to all sections of the course, which means that all students have access to the same core content learning resources. The IT department and course developers have the same centralized access to the development shells, Master Course Shells and active courses. Similarly, the course developers and faculty have the same centralized access to the development shell, used to create the Master Course Shell, and the active course shell.
With uniform, intuitive course navigation and a standardized look and feel for the weekly learning modules, students readily know where and how to access course materials. In addition, the Master Course Shell design includes an orientation module, which provides students with success strategies for online learning and technology help guides. Within the learning management system, students also have easy access to the library, tutoring, the help desk, and other student services. When reevaluating courses and providing support, course developers and faculty take student evaluation data and feedback seriously: they continually improve the functionality and reliability of the learning management system, university-supported tools and methodologies, and the resources available.

Faculty Satisfaction. Faculty report that standardizing the look, feel, and core content of the course before it is taught allows them to focus more on teaching instead of recreating course content and figuring out technology. The practice of redesigning or designing a new course in the Master Course Shell format requires development prior to the semester of teaching, so the faculty member and course developer can evaluate the course and work out any design issues before it is taught. In addition, course audits, which evaluate courses against the Master Course Shell practice, give course developers and faculty concrete quality improvement targets for teaching presence, such as weekly announcements, discussion board interaction, and grading feedback.

Student Satisfaction. A consistent course structure lets students know where to go and what to do, which in turn helps them focus on learning. As students grow accustomed to familiar course elements, they provide feedback in course evaluations when the quality standards are not followed. Instructors can focus on orienting students to the course, setting weekly expectations, bringing in timely real-world examples, and creating a learning community. All of these strategies are encouraged through consultations with course developers informed by a) course audits, which support the Master Course Shell practice, and b) student evaluations.

Equipment necessary to implement Effective Practice: 

The equipment necessary to implement the effective practice would include the learning management system and the server space used.

Estimate the probable costs associated with this practice: 

The probable costs associated with this practice would depend on:
a) the learning management system and server space used;
b) the wages of the information technologist(s) and course developer(s); and
c) at least a year of time for the full redesign (four semesters: one for testing, and the others for the fall, spring, and summer courses).

Contact(s) for this Effective Practice
Effective Practice Contact: 
Dr. Marija Franetovic
Email this contact: 
mfranetov@ltu.edu
Effective Practice Contact 2: 
Dr. Richard Bush
Email contact 2: 
rbush@ltu.edu
Author Information
Author(s): 
Margaret Reneau, PhD, RN
Institution(s) or Organization(s) Where EP Occurred: 
Saint Xavier University, School of Nursing
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Hiring quality online adjunct faculty is essential for student learning success. Online faculty members influence student satisfaction and ultimately student retention. The virtual nature of hiring online faculty makes finding quality online adjunct faculty challenging. One way to assess online teaching behaviors, prior to hiring, is conducting a mock online class as part of the interview process. By observing online teaching practices of potential online faculty candidates, the university has a behavioral gauge from which to make sound hiring decisions. Quality online adjunct faculty influence one of the pillars of quality online education, student satisfaction.

Description of the Effective Practice
Description of the Effective Practice: 

Saint Xavier University (regionally accredited by the Higher Learning Commission, North Central Association), School of Nursing offers an online Masters in Nursing Science (MSN) degree program with two tracks, clinical nurse leader and nursing administration. The MSN program is professionally accredited by the Commission on Collegiate Nursing Education (CCNE) and is one of only nine nursing programs to hold the honor of Center of Excellence designation from the National League for Nursing.

The size of the university limits resources; consequently, the university seeks to hire quality online adjunct faculty who require little to no training on the LMS or on effective practices of online teaching. Human resource experts (Hammons & Gansz, 2004; Training and Development, 2008) find that past behavior is the best predictor of future behavior. The dilemma is how to assess potential online adjuncts' teaching behaviors prior to hiring. Faculty members who consistently use effective practices of online teaching, and who understand the learning management system with minimal support, are ideal. Short of being able to access a candidate's prior online courses for virtual observation, past-behavior assessment might be limited to how the candidate presents during a phone or Skype interview.

One way to assess online teaching behaviors is to conduct a mock online class as part of the interview process. Class participation is part two of the interview process, the first part being a phone or Skype interview. During the mock online class, candidates participate in a week-long online class with a facilitator. The mock class runs for one week, Monday through Sunday, and includes:

• four modules of discussion
• a mock student paper to grade
• identification of course teaching preferences via referral to a website with the course catalog for review
• creation of a biography (content demonstrating expertise in a class they prefer to teach)
• development of a welcome announcement (for a class they prefer to teach)
• a final exam to assess knowledge of online teaching effective practices and university policies contained in required course reading material throughout the four modules.

The mock class, or part two of the interview process, requires rubrics for all candidate assignments in order to set clear expectations for participating faculty candidates. A grading rubric for the mock student paper is also included. Candidates must earn at least 90% to be considered for a teaching assignment. Even when they meet the 90% threshold, candidates only become eligible to teach; hiring takes place when a candidate's skills fit well with university needs.
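To make the threshold concrete, the sketch below shows one way the weighted total and the 90% eligibility cutoff could be computed. It is a minimal Python illustration under assumed component names and weights, not the university's actual rubric.

    # Hypothetical component names and weights; the actual rubric weights may differ.
    WEIGHTS = {
        "discussions": 0.40,           # four modules of discussion
        "graded_mock_paper": 0.20,     # grading of the mock student paper
        "biography": 0.10,
        "welcome_announcement": 0.10,
        "final_exam": 0.20,
    }

    def total_score(scores):
        """Combine per-component percentage scores (0-100) into a weighted total."""
        return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

    def eligible_to_teach(scores, threshold=90.0):
        """Meeting the threshold makes a candidate eligible, not automatically hired."""
        return total_score(scores) >= threshold

    # Example candidate: strong in discussions, weaker on the final exam.
    candidate = {"discussions": 95, "graded_mock_paper": 92, "biography": 90,
                 "welcome_announcement": 88, "final_exam": 85}
    print(round(total_score(candidate), 1), eligible_to_teach(candidate))  # 91.2 True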

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

The virtual nature of hiring online adjunct faculty makes finding quality faculty challenging. Filling online adjunct faculty positions occurs, literally, "sight unseen". For many, if not most, institutions, hired faculty begin an orientation phase after hire, with considerable time and money already spent on the hiring process. More resources are spent if a lengthy orientation occurs. Even after a two- to eight-week training/orientation, there is no guarantee about the quality of the hired online adjunct's interactions with students. The effective practice of behavioral interviewing to hire quality online adjunct faculty saves these precious resources and increases the likelihood of positive online faculty-student interactions.

A screening process of over 100 resumes, resulting in 25 telephone interviews, yielded 12 online adjunct faculty candidates, who then enrolled in part two of the interview process, the mock online course. Two candidates withdrew within the first two days of the course; after experiencing the mock class for a couple of days, both indicated their schedules could not support the amount of time it would take to teach online. Two more candidates did not reach the 90% threshold. Of the eight remaining candidates, the top three performers were selected to fill the available online adjunct faculty positions.

The three hired candidates consistently demonstrated online teaching effective practices throughout the mock course. The five other candidates demonstrated online teaching best practices inconsistently, yet still earned total course scores ranging from 90% to 96%. Those five candidates were notified that the positions had been filled by the candidates who best met the needs of the university and that, unfortunately, they had not been selected. The top three scoring candidates received notices of successful interview completion and an offer of employment.

After one month, the three selected faculty continue to demonstrate online teaching effective practices in their virtual classrooms. They function independently within the LMS, and faculty support consists of weekly positive reinforcement of their activities in the online classrooms. Plans are to analyze student satisfaction with faculty effectiveness at the end of the semester. Anecdotally, behavioral interviewing using a mock online class produces quality online adjunct hires who require minimal oversight; what they do require is continual positive reinforcement and prompt responses from the university to resource requests in order to maintain faculty satisfaction.

How does this practice relate to pillars?: 

Online faculty members influence student satisfaction (a pillar of quality online education) and ultimately student retention (Drouin, 2008; Thompson, 2011; Hoskins, 2012). The implications of poor-quality online adjunct faculty members are numerous. Faculty interactions with online students are among the top three factors influencing online student satisfaction (Herbert, 2006). Poor student evaluations commonly result from poor-performing faculty, and students dissatisfied with faculty may question their decision to continue in an online program.

Online adjunct faculty who consistently utilize effective practices contribute to the pillars of quality in online education. Effective practices for hiring quality online adjunct faculty include the ability to assess online teaching behaviors prior to direct student interactions. Past behavior predicts future behavior. Consequently, hiring quality online adjunct faculty includes a way to observe online teaching practices, using a mock online class as part of the interview process.

The practice of using a mock online course as part of the interview process also contributes to operational scale and is flexible. Interview courses can be ongoing or periodic, based on the number of positions to fill. Enrollment in the mock course can range from as few as three candidates to as many as 20. Large programs can use multiple facilitators to conduct several mock online courses concurrently if needed; small programs can use annual or biannual mock online courses to assist with the interview process. The frequency of mock online classes as part two of the interview process is driven by institutional need and the availability of facilitators.

Equipment necessary to implement Effective Practice: 

No additional equipment is necessary to implement the behavioral interviewing effective practice for hiring quality online adjunct faculty. Existing course development resources and the current learning management system are utilized. Approximately 10-15 hours of facilitation time is required for the one-week mock online course. Like any online course, the time required to facilitate is enrollment dependent.

Estimate the probable costs associated with this practice: 

The costs include approximately four hours of instructional design time to develop the mock online course and 10-15 hours of facilitator time for the week, including grading. Facilitator time depends on the number of online adjunct faculty candidates enrolled in the mock course.

References, supporting documents: 

Anticipated growth in behavioral interviewing. (2008). Training and Development, 62, 4.

Drouin, M. A. (2008). The relationship between students' perceived sense of community and satisfaction, achievement, and retention in an online course. Quarterly Review of Distance Education, 9, 267-284.

Hammons, J. O., & Gansz, J. L. (2004). Selecting faculty with behavioral-based interviewing. Community College Journal, 75, 38-43.

Herbert, M. (2006). Staying the course: A study in online student satisfaction and retention. Online Journal of Distance Learning Administration, 9(4). Retrieved from http://www.westga.edu/~distance/ojdla/winter94/herbert94.htm

Hoskins, B. J. (2012). Connections, engagement, and presence. Journal of Continuing Higher Education, 60, 51-53.

Thompson, J. T. (2011). Best practices in asynchronous online course discussions. Journal of College Teaching & Learning (TLC), 3(7).

Other Comments: 

The effective practice of behavioral interviewing to hire quality online adjunct faculty was also used at a large national academic institution for at least a year; however, no purposeful follow-up of faculty occurred. Anecdotally, candidates from that institution who earned 85-90% in the mock online class as part two of the interview scored lower on instructor effectiveness in student evaluations than candidates who scored greater than 90% in the mock online class.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Margaret Reneau
Email this contact: 
Reneau@sxu.edu
Award Winner: 
2013 Sloan-C Effective Practice Award
Collection: 
Student-Generated Content
Author Information
Author(s): 
Allison P. Selby
Author(s): 
Julie Frieswyk
Institution(s) or Organization(s) Where EP Occurred: 
Kaplan University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

A virtual internship program forged international connections among a Peace Corps volunteer, a faculty member, and students at the Kaplan University School of Information Technology. Virtual internships and international partnerships provide high-impact experiential learning opportunities for students while giving non-governmental organizations (NGOs) a means to build capacity and cultural bridges. This type of program allows non-traditional adult students in particular to maintain their family responsibilities and continue their full-time jobs while working on projects overseas in an online capacity. The program has led to increased student confidence in their skill sets as they developed their assigned projects for the NGO. Students also gained exposure to cultural diversity and international collaboration atypical of the average IT class.

Description of the Effective Practice
Description of the Effective Practice: 

Project iNext exemplified an institutional partnership between the Kaplan University School of Information Technology and a Peace Corps Volunteer (PCV), Julie Frieswyk. Julie reached out to Kaplan University on behalf of her partner non-governmental organization (NGO), Pro-Business Nord (PBN), located in the Republic of Moldova. PBN is directly funded by the United States Agency for International Development (USAID). One of PBN's key goals was to develop a new social enterprise model for a sustainable Women Career Development Program in northern Moldova.

Kaplan University's virtual internship program was implemented to connect Information Technology students with the NGO. The partnership goals were to provide expert advice on updating the older version of the NGO's website, to test server security, and to help develop a new website for PBN's new social enterprise, ProBizNord, a regional Business Resource Center.

The partnership with Pro-Business Nord (PBN) in Moldova was led by Allison Selby, Kaplan Information Technology faculty, and Julie Frieswyk, Peace Corps Volunteer. Frieswyk ensured the internship project goals were in alignment with the priorities of PBN and the goals of the Peace Corps. Selby ensured the weekly outcomes were being met by the students and that the students were receiving the necessary assets to complete their assigned tasks. The partnership was also important for the very practical concern of language translation: while the PBN team spoke English well, Frieswyk was able to translate Russian to English as necessary.
The project provided excellent opportunities for students to apply their knowledge, skills, and abilities in an authentic context. They negotiated schedules, timeframes, and project outcomes, and learned to clearly communicate the assets needed to progress to subsequent stages of the project. Students participated in conversations that quickly became a mix of Russian and English, spanned multiple time zones, and developed materials for people with whom they discovered they had much in common. Students greatly appreciated the exposure to cultural diversity, businesses, and lifestyles.

At the end of the ten-week experience, two students exceeded expectations and one student did not perform to expectations. Two fully functional websites were developed and met PBN's requirements. The students applied new skills in site development, learned the process of client interaction and revision requests, and practiced final presentation skills. The third project, conducting security forensics, was never fully completed. Many factors could have contributed to this outcome: conducting security forensics as a team may be more effective; a mentor with strong expertise and practice in forensics would be an asset; and projections of some of the possible testing outcomes would have provided a stronger set of parameters for experimentation. The fact that one of these projects was not wholly successful was just as valuable to us as we continued to evaluate the program.

The overall outcome included engaged students with opportunities to gain authentic work experience and international exposure. PBN received considerable student-conducted training on the WordPress platform, marketing skills including Search Engine Optimization (SEO), and practice in project planning and implementation. The Moldovan NGO gained exposure to new skills and up-to-date technology, building its own capacity, while continuing to build cultural bridges through its experience with the Peace Corps. The NGO also became a co-educator of the students (Holland, 1997), while the students learned how to interact professionally, accept constructive criticism, and design for the clients' aesthetic taste rather than their own.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

This international partnership involved a small sample of student participants; as such, the evidence is largely anecdotal and based upon student and NGO team feedback. The students remarked that this was a unique international opportunity to learn the process of web development for clients, to work with international clients in reviewing risks and suggestions, and to experience real-world project management. The NGO appreciated being part of something innovative and learning more about our school system. Skills transfer and global understanding were repeated themes in the feedback and discussions.

As a high-impact experiential activity, the student and NGO partnership provides a type of global, community-based opportunity for the students' worldview and perceptions to transform (Cress, 2004). The NGO benefits from the partnership by gaining access to resources and networks (Ferman & Hill, 2004) while collaborating with the students to build social change. The ‘mutually beneficial agenda’ (Holland, 1997), collaborative effort, and shared gains in knowledge and practice become a transformative relationship (Bushouse, 2005), which in turn provides a sense of purpose that motivates student engagement and learning (Colby & Sullivan, 2009).

How does this practice relate to pillars?: 

This practice relates to the pillars through the learning gains made by the students, as evidenced by implementing and customizing the WordPress platform, participating in professional dialogues with the partners, demonstrating project management skills to stay on task, and interacting with a culturally diverse team. Student survey feedback indicated that this experience was not something they could have had in a typical classroom and that they gained confidence and abilities throughout the process. They completed the program knowing they possess professional skills they can apply in an authentic context.

Virtual internship partnerships could involve studies in social entrepreneurship, micro-finance, marketing, business administration, and design. Virtual internships create problem-solving activities with the potential to build real-world skills such as collaboration for problem-solving, technology proficiency, presentation skills, and a greater appreciation for intercultural diversity (Humphreys, 2009). This opportunity provides an international experience to students who may otherwise be limited by finances, work responsibilities, family obligations, or physical limitations. In addition, there are considerable cost savings compared to studying abroad for the same amount of time: a virtual internship program incurs only regular tuition fees, and no additional costs are required of the student.
Students enjoyed the experience overall and valued the new addition to their resumes and credentials. We learned a great deal about how to support the students more efficiently. This type of project benefits tremendously from considerable advance preparation. Using project charters to outline weekly outcomes and deliverables is very important; defining the exact scope of the deliverables, the assets that may be needed, and the key stakeholders were all important topics to clarify. Synchronous weekly team meetings with the clients over Skype gave the students a vested interest and motivation to succeed, and having the students train the clients in site maintenance gave them ownership of the process and pride in their success. It was exciting, engaging, and could be accomplished by other institutions with great success.

Equipment necessary to implement Effective Practice: 

The only strictly necessary equipment is an internet connection and email. In our program, the students also used Captivate to create videos presenting the finished products and instructional materials for the clients; Jing would be a reasonable free alternative for short presentations under five minutes. The students also used WordPress, installed the framework on the client's server, and used free themes for both WordPress sites.

All other tools enhance the experience and few have costs associated with them. We recommend the following:

• Synchronous tools: Adobe Connect, Google Hangouts, Google chats, Skype
• Asynchronous tools: email, discussion board in LMS
• Reflective tools: Blog, journals, status reports

Estimate the probable costs associated with this practice: 

The only additional cost would be optional and would involve the use of Adobe Connect. All other resources were open source, and we did not incur additional costs using them. The client already had server space, and the students used free WordPress themes. There was essentially no budget for the site, so our costs for this project were minimal.

References, supporting documents: 

Bushouse, B. K. (2005). Community nonprofit organizations and service-learning: Resource constraints to building partnerships with universities. Michigan Journal of Community Service Learning, 32-40.
Colby, A., & Sullivan, W. M. (2009). Strengthening the foundations of students' excellence, integrity, and social contribution. Liberal Education, 22-29.
Cress, C. M. (2004). Critical thinking development in service-learning activities: Pedagogical implications for critical being and action. Inquiry: Critical Thinking Across the Disciplines, 87-93.
Cuban, S., & Anderson, J. B. (2007). Where's the justice in service-learning? Institutionalizing service-learning from a social justice perspective at a Jesuit university. Equity & Excellence in Education, 144-155.
Ferman, B., & Hill, T. L. (2004). The challenges of agenda conflict in higher-education-community research partnerships: Views from the community side. Journal of Urban Affairs, 241-257.
Holland, B. (1997). Analyzing institutional commitment to service: A model of key organizational factors. Michigan Journal of Community Service Learning, 30-41.
Humphreys, D. (2009). College outcomes for work, life, and citizenship: Can we really do it all? Liberal Education, 14-21.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Allison Selby
Email this contact: 
aselby@kaplan.edu
Effective Practice Contact 2: 
Julie Frieswyk
Email contact 2: 
juliefrieswyk@gmail.com
Author Information
Author(s): 
Carol A. McQuiggan, D.Ed., Manager & Senior Instructional Designer, Penn State Harrisburg
Author(s): 
Laurence B. Boggess, Ph.D., Director of Faculty Development and Support, World Campus/Academic Outreach
Author(s): 
Brett Bixler, Ph.D., Lead Instructional Designer, IT Training Services
Author(s): 
Wendy Mahan, Ph.D., Senior Instructional Designer, The College of Health and Human Development
Institution(s) or Organization(s) Where EP Occurred: 
The Pennsylvania State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The Faculty/Staff Engagement and Professional Development subcommittee is composed of faculty, faculty developers, and learning designers from throughout the University, representing multiple colleges, campuses, and support units. They collaboratively identify, complete, and disseminate projects that have the potential to promote excellence in online teaching and learning, to increase faculty interest in online teaching activities, and to pursue collaborative endeavors within and outside the university to continue to build a strong foundation for faculty engagement in online teaching. This unique, cross-campus, interdisciplinary, and multi-unit approach provides multiple perspectives and addresses common needs in providing quality online teaching and learning experiences.

Description of the Effective Practice
Description of the Effective Practice: 

The Faculty/Staff Engagement and Professional Development subcommittee sits within a Penn State Online structure composed of the Penn State Online Steering Committee and the Penn State Online Coordinating Council. Understanding this structure is important for putting the effective practice into the context of a large multi-campus university, while recognizing that the practice could be implemented at an institution of any size.

The Penn State Online Steering Committee serves as the governing body for Penn State Online, reporting to the Provost. The Steering Committee has strategic leadership responsibility for Penn State Online, serves as the policy board for the e-Learning Cooperative and the World Campus, and as the governing board for the Penn State Online Coordinating Council, through which it encourages effective cross-unit coordination of several key functions. These key functions include the effective use of course development resources, professional development, establishment of standards, and innovation and research.

The Penn State Online Coordinating Council reports to the Steering Committee. It includes representatives of the key central University units involved in e-learning—Teaching and Learning with Technology, Undergraduate Programs, University Libraries, and the World Campus—and college-based e-learning development and support units. It meets bimonthly to develop University-wide best practices, standards, and procedures that will facilitate the growth of high-quality e-learning at Penn State.

The Coordinating Council is responsible for identifying opportunities for collaboration, promoting the effective coordination of resources across organizational units to achieve synergy and create capacity to address strategic priorities, and developing common standards to guide work across units. In some cases, the Council responds to requests from the Steering Committee; in other instances, the Council identifies issues and proposes remedies to the Steering Committee; and in other situations, the Council addresses operational issues and simply reports the results to the Steering Committee.

The Faculty/Staff Engagement and Professional Development subcommittee reports to the Coordinating Council at its bimonthly meetings. Its completely volunteer membership of faculty, faculty developers, and learning designers includes representation from six campuses, nine colleges, and three support units, all of whom have some responsibility for and/or interest in online teaching and learning. It is co-chaired by the director of World Campus Faculty Development, maintaining a direct relationship with Penn State’s online campus.

Project ideas trickle down from the Steering Committee and the Coordinating Council, and also trickle up from the needs and practices of the subcommittee members and the faculty and administrators with whom they work. Using a team approach, the projects are designed for wide use and adaptability across the University and beyond.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Resources developed by the Faculty/Staff Engagement and Professional Development subcommittee include the Penn State Quality Assurance e-Learning Design Standards; the Hiring Guidelines for Online Program Lead Faculty, Online Course Authors, and Online Course Instructors; the Faculty Self-Assessment for Online Teaching; the Peer Review for Hybrid Courses; and the Faculty Competencies for Online Teaching. All are accessed via the Weblearning site (http://weblearning.psu.edu) by selecting the “Resources” tab and then “Penn State Online Resources” (http://weblearning.psu.edu/resources/penn-state-online-resources/).

Penn State Online has adopted the Quality Assurance e-Learning Design Standards (http://weblearning.psu.edu/resources/penn-state-online-resources/quality...), providing a measure of quality assurance for online courses in order to best serve the e-learning needs of our students. For each of the twelve standards there is a link to a short description of the standard, a list of the required evidence that the standard has been met, suggested best practices, and resources to learn more.

The Hiring Guidelines (http://weblearning.psu.edu/resources/penn-state-online-resources/hiring-...) are used to help guide the hiring process for online program lead faculty, course authors, and course instructors. The subcommittee is currently using these guidelines to develop suggested interview questions that will accompany each guideline document in the near future.

The Faculty Self-Assessment for Online Teaching (https://weblearning.psu.edu/FacultySelfAssessment/) tool was packaged as open source and licensed with Creative Commons in order to share as broadly as possible. The tool was the result of a literature review, focus group input, and usability testing, and was vetted at a well-attended Sloan-C Conference presentation. To date, the tool has been shared with over thirty colleagues representing academies, community colleges, state colleges, and universities throughout the United States. It has also been shared with three doctoral students for potential use in their dissertations. We are hoping that all of our tools can be shared as broadly.

The Faculty Competencies for Online Teaching (http://weblearning.psu.edu/resources/penn-state-online-resources/faculty...) were derived from research conducted by a team at Penn State’s World Campus, which the subcommittee used to develop a document detailing those competencies alongside additional guidelines, examples, and best practices. They provide faculty and administrators with a better understanding of the instructional requirements of online teaching.

The Peer Review for Hybrid Courses (https://www.e-education.psu.edu/facdev/hybridpeerreview) is based on the “Seven Principles for Good Practice in Undergraduate Education,” a summary of fifty years of higher education research that addressed good teaching and learning practices. This process was designed, implemented, and assessed by the subcommittee based on a need shared by the campus learning designers and faculty.

The Web Learning @ Penn State (http://weblearning.psu.edu) site continues to evolve and grow, with a newly revised site just launched on August 29th. A member of the Faculty/Staff Engagement and Professional Development subcommittee is a contact for each of the site’s webpages.

Together, these resources, collaboratively designed and shared broadly, have provided access to tools that increase the quality of our online courses. They identify and share best practices for online teaching and learning, identify and share competencies for online teaching success, and establish and share guidelines for creating quality online courses and hiring qualified instructors.

How does this practice relate to pillars?: 

Learning Effectiveness: The effective practices supported by the subcommittee in the area of learning effectiveness are most evident in the Faculty Self-Assessment for Online Teaching, which stresses the skills needed by online instructors to be effective online teachers. This will be enhanced when the redesign of the tool is completed to align with the Faculty Competencies for Online Teaching. The Competencies also provide an opportunity for faculty professional development, as do the Peer Review tools. Two current projects, the New Instructor Orientation to Online Teaching Checklist, and the New Faculty Manual, will contribute to faculty development and the core elements of quality online learning.

Scale: Some of our newest tools contribute to the scale pillar. The Checklist for Administrator Review and Approval of Online Courses and the Course Revision Worksheet both contribute to continuous improvement. The Checklist for Administrator Review builds administrative awareness of the scope of faculty authoring of online courses and of how new courses fit within a program of study. It also addresses the need for a course not to depend on a single faculty member to teach it, thereby building faculty capacity. The Course Revision Worksheet creates more awareness of the personnel who need to be involved in a revision and the overall scope of work required.

Access: The “Resources” (http://weblearning.psu.edu/resources/penn-state-online-resources/) that the Faculty/Staff Engagement and Professional Development subcommittee has built and provided on the Web Learning @ Penn State (http://weblearning.psu.edu) site give the Penn State University community access to materials that promote best practices in online learning and set standards for excellence in hiring online faculty. Because it is a public site, access also extends more broadly beyond the University. Because these tools are built and provided broadly, all e-learning design units at Penn State have access to the same tools, which overlap in their communication of quality standards, providing students with online courses designed within the same quality framework. We hope to learn more about how these units are using the tools and, even more important, how the tools are shaping design considerations. Then we would like to learn how those design considerations affect student learning and students' access to quality online courses.

Faculty Satisfaction: Our subcommittee and the broader committee structure in which we are nested serve the “support” and “institutional study/research” aspects of the institutional factors related to faculty satisfaction. We provide all of these institutional supports in a unique, cross-campus, interdisciplinary, and multi-unit approach. We provide opportunities for innovation by asking online instructors to engage in self-improvement. As more instructors teach online and more administrators are responsible for hiring and developing them, we need a way to ensure that self-learners have optimal materials at their disposal for just-in-time learning. The tools we create and disseminate do this.

Equipment necessary to implement Effective Practice: 

There is no special equipment necessary to implement this Effective Practice. The Faculty Engagement subcommittee uses the resources already available within their various units. The Web Learning @ Penn State (http://weblearning.psu.edu) site is used for broad dissemination.

Estimate the probable costs associated with this practice: 

There are no direct costs associated with this practice. The indirect costs are the time the subcommittee members spend on specific projects, but they are offset by the opportunities they provide for their own professional development and by the tools they create that afford new efficiencies and quality processes. No one person or unit could have created these tools alone, but by collaborating across units, common needs are being met with resounding success.

References, supporting documents: 

As the transition is made to our newly designed website (http://weblearning.psu.edu), we plan to gather web analytics on its traffic. We are also planning to survey the various Penn State eLearning design units as to their use of the various tools, both to increase awareness and to learn of implementations. Within implementations, we hope to dig more deeply to learn about the effectiveness and transformative possibilities of the various tools. If possible, we would even like to link design and teaching/learning decisions made based on tool use, and improvements in student learning. We will identify research partners and submit proposals to Penn State’s Center for Online Innovation in Learning (COIL).

A number of our tools support and/or are aligned with the research-based "Competencies for Online Teaching Success" (http://sites.psu.edu/wcfacdev/2013/05/15/competencies-for-online-teachin...).

Other Comments: 

New projects marinate in the subcommittee and extend into the University and beyond, just as the Faculty Self-Assessment for Online Teaching tool did. This is additional evidence that the subcommittee has a proven track record of innovation, honing best practices, and then generously disseminating them - traits Sloan-C supports.

Projects that have been completed and will be added to the Web Learning site very soon:
1. The Checklist for Administrator Review and Approval of Online Courses was created to guide an administrator through an initial review of an online course that has been developed by a faculty member from their unit in collaboration with a learning designer. It is yet another tool to ensure quality review; in this case, by program administrators. It gives the administrators a checklist of items to review, and a feedback loop with the learning designer and faculty author.
2. The Course Revision Worksheet is intended for use by course development teams to communicate the reasons for a course revision, the specific course items in need of revision, the percentage of revision needed for each course item, the personnel who need to be involved in those revisions, and the total percentage of effort that will be required. This information can then be used to assign the appropriate resources to the course revision project. This is also a learning document in that it creates an awareness of the needs for revision and the effort required by a potential team of people.

Our work continues on these current projects:
1. Asteria - This will be a decision support tool for faculty to use to match pedagogical purpose with an educational technology. Two approaches for use are planned. Users may approach this tool with an unfocused search in which they simply want to explore different tools. There will also be a focused approach available in which faculty who know what they want to do pedagogically can search for the appropriate tool. The intent is to have the focus on the user’s pedagogical purpose, and not be tool-driven.
2. An update of the Faculty Self-Assessment for Online Teaching tool to align with the Faculty Competencies for Online Teaching - The current tool will be revised to align with the Faculty Competencies for Online Teaching.
3. New Instructor Orientation to Online Teaching Checklist - A number of Penn State’s eLearning units have a new instructor orientation. They are comparing their orientations in order to develop a checklist of core elements to share as a basis for different units to use and adapt. The checklist will also be usable by new units developing their own orientations. The end users will most likely be learning designers and faculty developers.
4. New Faculty Manual - The Faculty Manual will provide faculty new to online teaching with a comprehensive manual to which they can refer as they are teaching online. The team will use the World Campus faculty manual to create a common manual for faculty that can be adapted by individual units.
5. Online Mentoring Program Pilot - Through a partnership with the Schreyer Institute for Teaching Excellence, a mentoring program is being developed to provide instructional support for those teaching online, and to create opportunities for networking with others teaching online. They are reading about the Community of Mentoring practices, and are putting a research lens on the project as they plan to move forward.
6. New Online Faculty Interview Questions based on the Hiring Guidelines - This will be a logical companion piece to the Hiring Guidelines already available on the Web Learning site.
7. Best Practices Examples - Listening to faculty needs, the subcommittee is collecting examples of best practices (instructor introduction, discussion rubric, team project structure, various learning activities, providing effective feedback, flipped classroom design, etc.) and plans to host them on the Web Learning site for all to access and use.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Carol A. McQuiggan
Email this contact: 
cam240@psu.edu
Effective Practice Contact 2: 
Larry Boggess
Email contact 2: 
lbb150@psu.edu
Award Winner: 
2013 Sloan-C Effective Practice Award
Collection: 
Student-Generated Content
Author Information
Author(s): 
Michael Wilder
Institution(s) or Organization(s) Where EP Occurred: 
University of Nevada, Las Vegas
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

For the last four years, undergraduate seniors in UNLV's journalism department have been participating in an innovative approach to learning digital convergence. These students have been engaged in transitioning from traditional methods of gathering, delivering, and marketing news from print media to digital media by actively manipulating online technologies. At the core of this technique is an effective combination of second-generation blogging environments.

Although blogging is not new to education, current improvements to blogging systems have enabled this technology to be applied in innovative and creative ways. Instead of serving simply as mechanisms for individual reflection or announcement, open-source blogging systems with newer features can become full-fledged virtual communities that enable sophisticated social interaction, collaboration, and peer evaluation.

Participants in this educational approach have expressed high satisfaction through course surveys and testimonials.

Description of the Effective Practice
Description of the Effective Practice: 

A combination of the open-source blogging software WordPress with the free BuddyPress plugin creates an environment in which students can create academic publications with a full range of contemporary word-processing features (including the addition of images, video, and podcasts), share resources, work collaboratively, and comment on and evaluate each other's work. At the same time, this system allows students to have direct control of their learning environment (through customized themes), to participate in Facebook-like social interactions ("friending," "liking," commenting, user profiles), to integrate social media (such as Twitter, Facebook, Pinterest, YouTube, Flickr), and to collaborate (via document sharing, group support, and wikis). In addition, this educational practice allows mobile access (via smartphones and tablets) as well as badges and gamification.
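For instructors exploring a similar setup, the core WordPress REST API (exposed at /wp-json/wp/v2/ on modern WordPress installations) can be used to get a quick sense of how much peer commenting is happening on student posts. The Python sketch below is a minimal illustration under an assumed site URL, not part of the UNLV course; BuddyPress-specific activity (friending, groups) is not covered here.

    import requests  # third-party HTTP client (pip install requests)

    SITE = "https://example-class-blog.org"  # placeholder: your WordPress site URL
    API = SITE + "/wp-json/wp/v2"             # core WordPress REST API base path

    def recent_posts(per_page=10):
        """Fetch the most recent published posts."""
        resp = requests.get(API + "/posts", params={"per_page": per_page})
        resp.raise_for_status()
        return resp.json()

    def comment_count(post_id):
        """Count comments on a post (first 100 only) as a rough proxy for peer feedback."""
        resp = requests.get(API + "/comments", params={"post": post_id, "per_page": 100})
        resp.raise_for_status()
        return len(resp.json())

    if __name__ == "__main__":
        for post in recent_posts():
            print(post["title"]["rendered"], "-", comment_count(post["id"]), "comments")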

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

* Student evaluations:
Student evaluations of the course indicate a high degree of satisfaction with the course and the learning environment. In a recent end-of-term student evaluation, ninety-two percent of all students strongly indicated satisfaction (n=12). Eighty-three percent of the students felt that the course increased their interest in the subject area at an above-average level (or higher). One-hundred percent of the students felt that the course increased their knowledge in the subject area at an above-average level (or higher).

* Student testimonials:
Students are asked to reflect on their experience participating in the course. Over the last four years of teaching with this technique, these reflections have been overwhelmingly positive.

Some examples:

"Learning to blog is fun and a great way to express yourself. Knowing how to use a blog is helpful in many ways such as bringing out creativity and design. Before having a blog I wrote only when I had writing assignments in school, but now I can write about anything that comes to mind. Thanks to this Interactive Media Design class I’m able to create stories with videos and podcasts that I couldn’t do before.

"It’s great to learn new things and use them to reach out to my peers. It feels great accomplishing things that seemed out of reach before. I had an idea of how blogging worked before, but not like this. Having videos and images in a story adds more to the story and makes it approachable to others. YouTube was fun and new to me, but with a little practice I ended up surprising myself."

"The aspect I was most enthralled to take on was simply learning the basics of WordPress, a place I had no knowledge whatsoever. Overall stepping in this world that is so new to me has been a very positive experience, and I am eager to take on this new challenge."

"I now feel like I have a chance in this high tech world. Not only do I feel I may be able to keep up my own blog in a manner that might be described as competent, I no longer see tasks like getting video onto YouTube, or starting a podcast, as so daunting."

"I never would have called myself a technologically savvy person, but now I have a fully working blog with all the add-ons. If you had said the word "widget" four weeks to me I would have thought you were speaking a foreign language. Now I am fully schooled on the blogging language and all the details it entails."

"After taking part in this class I have learned that a relative novice, like myself, is able to do some really cool thing on a site like WordPress that not only looks clean and professional, but also can do some really creative things that will separate your blog from the other ones on the Internet."

* Student product
In addition to developing their own portfolios, students produce at least nine full-length articles that are peer reviewed and evaluated. Once the semester is over, all students migrate their work to off-campus, free blogging systems. Many of these students go on to use the skills they learned in this course to continue professional blogging and to become employed in online journalism and communication.

How does this practice relate to pillars?: 

Access
Due to the stringent accessibility standards of WordPress, students with learning disabilities are able to access the online system. WordPress itself provides advice and resources that continue to enable this technology to be accessible. "Make WordPress Accessible" (http://make.wordpress.org/accessibility/) is the official blog for the WordPress accessibility group - dedicated to improving accessibility in core WordPress and related projects. Furthermore, course content is fully accessible to mobile devices.

Faculty satisfaction
Enabling students to interact with each other and provide peer evaluation reduces the reliance on faculty as the sole provider of support and feedback. Students develop instructional content that extends learning beyond what the instructor may have provided. As a result, faculty express appreciation and satisfaction.

Learning effectiveness
Before the current blogging practice was enacted, course offerings had no mechanism for students to collaborate and shape their learning environment. Once the practice was implemented, learning outcomes with the multiuser blogging system far exceeded those achieved by previous methods of teaching the course.

Scale
Due to the nature of this open-source technology, this practice is both inexpensive and scalable. Cost is minimal, and the technology scales to accommodate thousands of students. Dozens of institutions of higher learning (such as SUNY, Texas A&M, Penn State, and Yale) are currently using combinations of these technologies to serve educational blogging communities.

Student Satisfaction
Surveys of student satisfaction both during and after the course indicate a high degree of satisfaction with the practice. Furthermore, alumni repeatedly return to the course and participate through guest blogs, communicating that participation in the course has impacted their lives positively.

Equipment necessary to implement Effective Practice: 

A Web server (or host)
A connection to the Internet
WordPress blogging software
BuddyPress plugin software

Estimate the probable costs associated with this practice: 

Assuming that an educational institution already has a Web server and a connection to the Internet, the set-up cost would be minimal. The software and associated plugins are open source (free). Configuration and maintenance may require time and technical expertise from a salaried technician.

The cost to an individual instructor is also relatively low. In addition to the free open-source software, an instructor may need a Web-hosting service (~$100 a year or less) and domain-name service (~$10 a year).

References, supporting documents: 

Brescia, W., & Miller, M. (2006). What's it worth? The perceived benefits of instructional blogging. Electronic Journal for the Integration of Technology in Education, 5, 44-52.
Bruning, R., Schraw, G., Norby, M., & Ronning, R. (2004). Cognitive psychology and instruction. New Jersey: Pearson Education, Inc.
Dickson, K., Wiggins, M. & Harapnuik, D. (2010). WordPress as a Mobile Learning Environment. In D. Gibson & B. Dodge (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2010 (pp. 2212-2213). Chesapeake, VA: AACE. Retrieved August 30, 2013 from http://www.editlib.org/p/33691.
Downes, S. (2004). Educational blogging. EDUCAUSE Review, 39(5), 14-26.
Ellison, N., & Wu, Y. (2008). Blogging in the classroom: a preliminary exploration of students attitudes and impact on comprehension. Journal of Educational Multimedia and Hypermedia, 17(1), 99-122.

Farmer, B., Yue, A. & Brooks, C. (2008). Using blogging for higher order learning in large cohort university teaching: A case study. Australasian Journal of Educational Technology, 24(2), 123-136.

Glogoff, S. (2005). Instructional blogging: Promoting interactivity, student-centered learning, and peer input. Innovate: Journal of Online Education, 1(5).
Jenkins, H., Purushotma, R., Clinton, K., Weigel, M. & Robison, A. J. (2006). Confronting the Challenges of Participatory Culture: Media Education for the 21st Century, John D. and Catherine T. MacArthur Foundation, Chicago, IL.
Kerawalla, L., Minocha, S., Kirkup, G., & Conole, G. (2008). An empirically grounded framework to guide blogging in higher education. Journal of Computer Assisted Learning, 25, 31-42.
Mezirow, J. (1997). Transformative learning: Theory to practice. In P. Cranton (Ed.), Transformative Learning in Action: Insights from Practice. New Directions for Adult and Continuing Education (pp. 5-12). San Francisco, CA: Jossey-Bass.
Paulus, T., Payne, R., & Jahns, L. (2009). Am I making sense here? What blogging reveals about undergraduate student understanding. Journal for Interactive Online Learning, 8(1), 1-22.
Vygotsky, L. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Xie, Y., Ke, F., & Sharma, P. (2008). The effect of peer feedback for blogging on college students' reflective learning processes. Internet and Higher Education, 11, 18-25.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Michael Wilder
Email this contact: 
michael.wilder@unlv.edu
Author Information
Author(s): 
Dr. Janet Welch
Author(s): 
Anwen Burk
Author(s): 
Sherri Fricker
Author(s): 
Carol Tonhauser
Author(s): 
Kim Peacock
Institution(s) or Organization(s) Where EP Occurred: 
University of Alberta, Faculty of Education
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

In 2012 the mandatory educational technology undergraduate course in the Faculty of Education at the University of Alberta underwent a significant redesign. Approximately 800 students go through this course in one year. The redesign was driven by a number of goals. First, the course content needed to meet the needs of 21st century teachers. Second, we wanted to increase flexibility in how our students interacted with and moved through the course. Third, we wanted to build a course in which we modelled effective technology use in education. In order to meet these goals, we worked as a team with a variety of educational and professional backgrounds to build a course that reduces face to face class time, deepens student interaction with the course content, and uses a team approach in its delivery.

Description of the Effective Practice
Description of the Effective Practice: 

EDU 210: Introduction to Educational Technology (formerly called EDIT 202) is a mandatory undergraduate course in the Faculty of Education at the University of Alberta. A recent redesign of the course incorporates the International Society for Technology in Education’s (ISTE) NETS for Teachers and Alberta Education’s Information and Communication Technology Program of Studies outcomes to prepare students to meet international and local standards for technology use in their Introductory Professional Term (IPT), Advanced Professional Term (APT), and careers as educators.

The face to face section of EDU 210 has been redesigned as a blended course. The original course consisted of 3 hours/week of face to face lecture and 3 hours/week of lab time.

The face to face lecture time has been reduced to 1.5 hours/week. The remaining 1.5 hours/week is spent on asynchronous activities such as discussion boards, interactive digital stories, pre-reading, and peer assessment. The face to face lectures are called Interactive Lectures, and students are required to bring their own devices. We have moved away from traditional lectures and make use of polling software, social media, and other web 2.0 tools to run group activities, test understanding, and gather student feedback.

The extensive incorporation of technology into the lectures has enabled us to make large classes interactive, gather feedback from students in real time, check on student understanding and progress, and model successful integration of technology into a synchronous classroom. The other 1.5 hours of face to face lecture time has been replaced primarily by resource exploration, dialogue through discussion forums, and knowledge checks through interactive digital stories. This has led to an increase in student to student interactions. It also means that students need to do some of their own research on the lecture topic and demonstrate that they have more than a superficial understanding of it. Online groups are formed that are replicated in the face to face environment, which allows students to build community online and carry it through to the live environment.

The 3 hours/week of lab time has been replaced by FlexLabs. The FlexLabs are asynchronous activities that require exploration of, creation with, and reflection on a variety of tools and types of technology that may be used in the K-12 classroom. They are designed to take approximately 3 hours/week and are matched with the topics and content from the Interactive Lectures.

Support for the FlexLab activities is provided in an open lab space by members of the EdTech Services team; students can also receive help via live chat. This has replaced the need for scheduled lab time, freed up lab space on campus, and allowed the students who need the most help to receive one-on-one assistance.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Responses from a survey of students in the course revealed the following; see the tabulation sketch after these results.

Compared to typical face-to-face courses they had taken:

43% of respondents indicated increased engagement in this course
75% of respondents indicated that this course required more time and effort
62% of respondents indicated the course improved their understanding of key concepts
44% of respondents indicated an increase in their interaction with other students
43% of respondents indicated improved quality of interactions with other students
81% of respondents indicated that this course offered the convenience of not having to come to campus as often
62% of respondents indicated that the course allowed them to reduce their total travel time each week and related expenses

In addition:

75% of respondents indicated that they would choose a blended course format over a fully online or fully face-to-face format
82% of respondents indicated that the course experience improved their opportunity to access and use the class content
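
For readers who want to produce this kind of summary for their own course, the short Python sketch below shows one way agreement percentages could be tallied from a raw survey export. It is illustrative only: the file name, column layout and response wording are assumptions, not the instrument or data actually used for EDU 210.

# Illustrative sketch: the CSV name, columns and response wording below are
# assumptions, not the actual EDU 210 survey data.
import csv
from collections import defaultdict

AGREE = {"agree", "strongly agree"}  # responses counted as agreement

def agreement_rates(path):
    counts = defaultdict(lambda: [0, 0])  # question -> [agree count, answered count]
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # one row per respondent
            for question, answer in row.items():
                if not answer:
                    continue  # skip unanswered items
                counts[question][1] += 1
                if answer.strip().lower() in AGREE:
                    counts[question][0] += 1
    return {q: round(100 * agree / total) for q, (agree, total) in counts.items()}

for question, pct in agreement_rates("edu210_survey.csv").items():
    print(f"{pct}% of respondents agreed: {question}")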

An article written for the Faculty of Education magazine ("EDIT 202 students learn skills that will help them engage the 21st century learner") contains interviews with two students who took the course. Here are two quotes from these students.
“Once I started seeing what the EDIT 202 course was about, once I realized the impact it was going to have on my future teaching career, I began to get excited. I remember sitting in the first class and being handed an iPad to use for an assignment and feeling this rush of excitement. I was sitting on the edge of my seat. Suddenly everything was interesting again. Good lessons do that to you.” - Shaun

“I know that learning how to use tools like this is a big step in the right direction in terms of connecting with students in my future classroom. I am filled with confidence and excitement when I think about integrating technology into my future lessons. I have EDIT 202 to thank for that.” - Samantha

How does this practice relate to pillars?: 

This practice relates most strongly to the pillars of Student Satisfaction, Learning Effectiveness and Scale.

Student Satisfaction
The survey results and anecdotal evidence indicate that in many ways students preferred, and were more engaged in, the blended delivery of the course. This suggests that the integration of online components has not just met the standards set by face-to-face learning but exceeded them. Student feedback also indicates that the outcomes for the course are being met and that students can see how they can use technology in their own teaching.

Learning Effectiveness
Again, the survey results and anecdotal evidence suggest that the blended format for this course often exceeded the expectations students had for their face-to-face learning. We made extensive use of our Learning Management System (Moodle) to design a clear course path and provide a “one-stop shop” for students.

Scale
One of the things we are especially excited about is that we have managed to reduce face-to-face class time while improving the quality of student engagement, even in large classes. We have also been able to leverage the support of teaching assistants and other staff in the department to build capacity and support instructors.

Equipment necessary to implement Effective Practice: 

Learning Management System
Access to Web 2.0 tools

Estimate the probable costs associated with this practice: 

The most significant cost consideration is staff: both the redesign of the course and the support team while the course is running need to be budgeted for. No additional equipment or software needed to be purchased, with the exception of Poll Everywhere (approximately $700/year).
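
As a rough, illustrative estimate only (assuming the single licence covers the full annual cohort of approximately 800 students), the Poll Everywhere cost works out to about $700 ÷ 800 ≈ $0.88 per student per year.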

References, supporting documents: 

Government of Alberta. 2013. Alberta Education - Information and Communication Technology. [online] Available at: http://education.alberta.ca/teachers/program/ict.aspx [Accessed: 28 Aug 2013].

Brandon, D. 2013. EDIT 202 students learn skills that will help them engage the 21st century learner | Faculty of Education University of Alberta. [online] Available at: http://beditionmagazine.com/edit-202-students-learn-skills-that-will-hel... [Accessed: 29 Aug 2013].

ISTE. 2013. NETS Standards. [online] Available at: http://www.iste.org/standards [Accessed: 28 Aug 2013].

Contact(s) for this Effective Practice
Effective Practice Contact: 
Dr. Janet Welch
Email this contact: 
jewelch@ualberta.ca
Effective Practice Contact 2: 
Sherri Fricker
Email contact 2: 
stfricke@ualberta.ca
Effective Practice Contact 3: 
Anwen Burk
Email contact 3: 
anwen@ualberta.ca