The Evergreen Education Group and Christensen Institute launch project to find and publicize examples of success in blended learning

If your public school or district fits the criteria below, please visit http://ow.ly/Cldag to fill out the short survey telling us about your success.

Today the Evergreen Education Group and Christensen Institute are launching a project to find and publicize examples of success in blended learning.

We believe that proof points will help practitioners demonstrate to a variety of stakeholders that blended learning can succeed in settings those stakeholders are familiar with. We therefore intend to publish case studies that collectively cover a range of geographic areas, school and district sizes, and urban, suburban, and rural settings.

Specifically, we are seeking:

  • Examples from traditional public schools and districts, not including specialized or charter schools.
  • Blended learning implementations that can demonstrate improved student achievement, as measured by assessments, course grades, or other outcomes.
  • A range of implementation types, geographic areas, student populations, grade levels, and subject areas.

We will review all submissions and, through evaluations of the survey responses and follow-up interviews, identify the examples that best represent an assortment of blended learning successes. Case studies will be developed based on these examples. We will invite the schools selected to be featured in the case studies to co-present with us at the November 2015 iNACOL Blended and Online Learning Symposium. Evergreen and Christensen will pay a portion of expenses to defray travel costs.

The survey is short and should take about 10-15 minutes. If your program fits what we are seeking, we look forward to hearing from you! The survey will remain open through Sunday, October 19.

Evergreen Education Group and Christensen Institute seeking examples of success in blended learning

Early next week, we and the Christensen Institute will launch a survey seeking examples of blended learning success in traditional public schools. Based on the survey results and subsequent research and interviews with selected schools, we will publish a set of case studies demonstrating a range of blended learning successes.

Why are we doing this? We believe that proof points showing blended learning success will help practitioners demonstrate to a variety of stakeholders that blended learning can succeed in settings those stakeholders are familiar with. We therefore intend to publish case studies that collectively cover a range of geographic areas, school and district sizes, and urban, suburban, and rural settings.

If you run a blended learning program in a traditional public school and have evidence of success, we hope you will take the time to fill out the survey (it won’t take long). If you are with a content, technology, or other provider, we hope that you will pass along the survey to your colleagues and clients who are demonstrating success with blended learning.

Look for the formal announcement, and the survey to go live, on Monday, October 6.

MOOCs in K-12 education: more hype than substance

A recent headline reads "Massive Online Classes Expand into K-12." In a narrow sense, the headline is correct: MOOCs are moving into K–12 education. But the move is extremely limited so far, with few courses and relatively few students passing them, and media reports tend to overplay the extent to which MOOCs are being used for credit by high school students.

As we report in the upcoming Keeping Pace 2014 annual report (to be released at the iNACOL Symposium during the first week of November), MOOCs have received extensive attention in the media in recent years, as the largest of the courses have attracted tens of thousands of students. Some postsecondary institutions have partnered with MOOC providers to offer remedial or credit-bearing courses, creating additional awareness of the potential—and the current drawbacks—of MOOCs.

But media coverage similar to the headline at the start of this post obscures the fact that MOOCs have not yet had a significant impact on K–12 education.

Within K–12 education, in fact, it is not clear that MOOCs should be considered a category separate from online learning from a policy or practice perspective. There are key differences between most MOOCs and many online courses, including the role of teachers and the level of student interaction and data integration. Teacher involvement and student interaction, in particular, tend to be lower in MOOCs than in other online courses. But the two categories overlap rather than divide cleanly. Some online courses have a very limited teacher role, as do most MOOCs, and some MOOCs are being used with online or onsite teachers.

Florida is among the very few states formally examining whether MOOCs should be among the educational options available to K–12 students. HB7029 (2013) required the Florida Department of Education (DOE) to develop the Florida Approved Courses and Tests (FACT) initiative by SY 2015–16, to expand student choice and online course options, explicitly including MOOCs. In addition, the law required the creation by the DOE of a new approval process for MOOC providers (and other online course providers). This approval process was submitted to the legislature in February 2014.

When the legislation was passed in 2013, some Florida schools responded by formally offering MOOCs to their students. Students in Pinellas County (FL) are taking advantage of a series of three MOOCs offered by St. Petersburg College to help high school students prepare for college-level courses. As of November 2013, 1,100 students had enrolled in the first class, a math MOOC, and 130 had completed it. Broward College offers a similar course, which combines reading, writing, and mathematics to prepare students for college; it had 3,200 students enrolled worldwide as of May 2014. These are small numbers in a state in which students accounted for about 400,000 online course enrollments, mostly from Florida Virtual School along with other providers.

Evaluating education research

In an earlier blog post about a Harvard study of FLVS, I mentioned that results of education research are often oversimplified, and in passing linked to an Atlantic article, How to Read Education Data Without Jumping to Conclusions. Several points in that article are worth reviewing in more detail.

Researchers are sometimes accused of operating in the ivory tower and being disconnected from real, on-the-ground conditions. That is sometimes the case, but it’s also true that advocates and policymakers sometimes misinterpret or misuse research results in ways not supported by the studies. As we’ve discussed in previous blog posts, anyone citing research should have some knowledge of what the study says, and of its limitations.

Some of the article’s key points, and some of my own thoughts related to educational studies, include:

"Absence of evidence does not equal evidence of absence." The oft-cited example is that studies have yet to find evidence of life on other planets, yet that doesn't prove such life doesn't exist. Within education, if someone says "we have no evidence supporting this," the question in response is "how much effort has been made to find such evidence?"

Sample size should always be examined. Small samples can lead to misleading results; all other things being equal, a larger sample makes for a more robust study. But media reports rarely include sample sizes, and often give equal weight to a study of 30 students and a study of 30,000.
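
To make the sample-size point concrete, here is a small illustrative simulation; the scores and sample sizes are hypothetical and not drawn from any study. Repeated samples of 30 students produce averages that swing noticeably, while samples of 3,000 stay close to the true population average.

```python
import random

random.seed(1)

# Hypothetical population of 100,000 student test scores centered near 70.
population = [random.gauss(70, 15) for _ in range(100_000)]
true_mean = sum(population) / len(population)

def sample_mean(n):
    """Average score from one random sample of n students."""
    return sum(random.sample(population, n)) / n

# Draw five samples at each size to show how much small samples wander.
for n in (30, 3000):
    means = [round(sample_mean(n), 1) for _ in range(5)]
    print(f"sample size {n:>5}: five sample means {means} (true mean {true_mean:.1f})")
```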

And finally, correlation does not imply causation. If statistics appear to show a correlation between a treatment and an outcome, a theory of action for why that treatment would cause that outcome is necessary. This is particularly important for blended learning implementations that show success. These programs often involve digital content, tablets or laptops, and, perhaps most importantly, extensive professional development tied to a significant change in the instructional model. The successful outcome is likely the result of all aspects of the implementation together, but the story of success is often simplified to "the school started using tablets and its scores went up the next year." Although that statement may be technically true, it leaves out the pedagogical changes tied to tablet use that are critical to success. It may be that tablet use and the other changes together account for the improvement, or it may be that pedagogical changes only partly related to the technology are responsible.

Cartoon taken from http://imgs.xkcd.com/comics/correlation.png. 
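
As a purely illustrative sketch of that last point (the schools and numbers below are simulated, not from any study), the following simulation builds score gains that depend only on a pedagogy change; because tablet adoption and the pedagogy change usually arrive together, tablet schools still show higher average gains.

```python
import random

random.seed(2)

schools = []
for _ in range(200):
    # Hypothetical schools: tablet adopters usually also overhaul pedagogy.
    tablets = random.random() < 0.5
    new_pedagogy = tablets if random.random() < 0.9 else not tablets
    # The score gain depends only on the pedagogy change, never on tablets.
    gain = (5 if new_pedagogy else 0) + random.gauss(0, 2)
    schools.append((tablets, gain))

def average_gain(has_tablets):
    gains = [g for t, g in schools if t == has_tablets]
    return sum(gains) / len(gains)

print(f"average gain, tablet schools:    {average_gain(True):.1f}")
print(f"average gain, no-tablet schools: {average_gain(False):.1f}")
# Tablets look 'effective' even though only the pedagogy change moved scores.
```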

These and other issues related to education research don't make such research less important or less valuable. But findings should be used appropriately, and in particular, those who publicize the results of studies should include all necessary conditions and caveats. In my experience, researchers usually do this, but when results are reported by the general media and advocates, the details often get lost in translation.

Harvard study shows Florida Virtual School courses to be “about the same or somewhat better” than traditional courses

A recently released paper from the Harvard Kennedy School claims to provide the first "credible evidence on the quality of virtual courses," and concludes that students at the Florida Virtual School (FLVS), in courses from school year 2008-09 and prior, "perform[ed] about the same or somewhat better on state tests once their pre-high-school characteristics are taken into account."

Virtual Schooling and Student Learning: Evidence from the Florida Virtual School reports the following:

“FLVS students…perform about the same [as non-FLVS students] or somewhat better on state tests once their pre-high-school characteristics are taken into account. We find little evidence of treatment effect heterogeneity across a variety of student subgroups, and no consistent evidence of negative impacts for any subgroups. Differences in spending between the sectors suggest the possibility of a productivity advantage for FLVS.”

In addition, because the authors believe that a “student may be more likely to take a course through FLVS if the teacher of that course at their local school is known to be lower quality, and more likely to take in-person courses with higher quality teachers…our estimates are likely lower bounds of the true FLVS effect.”

In addition to this most-reported aspect of the research, which shows that digital learning can produce results similar to traditional education, the study provides a useful description of the ways in which online learning can add value. First, online courses can increase access for students who otherwise would not be able to take those courses. Second, online learning might increase achievement even for students who have access to similar courses in a face-to-face, onsite format. Third, it might reduce the cost of providing education.

Regarding the first goal, the study points out that "Virtual schools meet the first goal, almost by definition, in that they provide a variety of courses that students can take from anywhere and at any time." The researchers provide a related data point, noting that during school year "2008-09… at least 1,384 AP courses (916 unique students) were taken by students enrolled in high schools where those courses were not offered."

This point has been overlooked in some of the media reports about the study. For example, "Online Learning at Least Not Terrible, Says Study" completely misses this aspect of the research when it says the "study does not show an advantage for online instruction." The research most certainly does show an advantage provided by FLVS: the courses that were (and are) available to students who otherwise would have no access to them.

As we discussed in a previous blog post, the U.S. Department of Education reports that nationwide, only half of our high schools offer calculus and slightly more offer physics; among high schools with the highest percentages of black and Latino students, a quarter do not offer Algebra II and a third do not offer chemistry.

The Harvard study nails this point, and mentions it in both the body and the conclusion. It is disappointing that more reports on the study aren’t highlighting the critical issue of equal access.

Demonstrating that achievement in online courses is equal to or better than achievement in traditional courses should be a finding that is celebrated, because it demonstrates the path to equality of educational opportunity for all students. It is now possible for every student with Internet access to have a similar level of educational access to the students in the wealthiest districts.

It’s also important to note that the fact that FLVS courses showed outcomes similar to traditional courses doesn’t mean all online courses will automatically produce similar outcomes. Although this point may seem obvious, it is worth repeating given how often the results of education research are oversimplified. FLVS has a long history of experience, has benefited from a relatively high level of investment by the state, and has previously demonstrated success. These conditions are certainly not true of all online course providers.

In the course of reviewing the study, I got into an email conversation with Julie Young, who was CEO of FLVS at the time of the study. In that conversation she reminded me that although the data (from SY 2008-09) are a bit old, "the results clearly demonstrate the teaching and learning methodology used to support the virtual experience was spot on." She also added that "the trend is extremely positive," and I expect that with an additional six years of experience, results from FLVS, along with other experienced online course providers, are likely even better now.

Disclosure: FLVS has been a Keeping Pace sponsor and an Evergreen client, and Julie Young has been a client and colleague for many years.
