Virtual School Meanderings

February 27, 2017

[AECT] March 8th Webinar

Notice of another webinar.

AECT Webinar
“What are we preparing our students for? An argument for alt-format dissertations”

March 8th at 4:00 PM EST
[Webinar Registration]

Hosted by: Feng-Ru Sheu, Kent State University
Presented by: Rick West, Brigham Young University

Most scholars agree that the main purposes of the dissertation are to train students in proper research methodology and to contribute original findings to research. However, some worry that the traditional dissertation format is not conducive to either of these goals. Research has shown that dissertations rarely get disseminated into academic journals, and academics rarely cite dissertations that have not been published as articles. Additionally, some scholars argue that the traditional dissertation format is a poor training tool because it does not prepare scholars for future professional pursuits. Many departments, including mine, now offer alternative-format dissertations, including the option of defending a series of articles. In this webinar, Dr. West will share some of the research about alternative-format dissertations and our experience at BYU. He will discuss what lessons we have learned, and engage you in a discussion about the strengths and weaknesses of the alternative-format dissertation and how it might be used to improve scholarship in our field.

Click Here to Learn More and Register.

Free AECT RTD Webinar on April 5 [Dr. Charles Reigeluth]

Note that this webinar may be of interest to some…

The AECT Research & Theory Division would like to invite you to join us for this FREE professional development webinar.


Research to Make a Difference


Dr. Charles Reigeluth
Professor Emeritus, Instructional Systems Technology
Indiana University Bloomington

Time & Date:

2:00pm (EST), Wednesday, April 5, 2017



Could you improve the kinds of research you are doing?  Your chances of getting published?  Your impact on the real world?

Dr. Reigeluth will discuss a variety of topics that may help you to answer these questions, which may be crucial to the success of your career.

  • Different purposes for your research (e.g., to improve practice or to advance theory),
  • Different kinds of knowledge you may want to produce for each purpose (descriptive theory or design theory) and the appropriate forms of such knowledge (e.g., conditional and probabilistic),
  • Different kinds of research methods for producing each kind of knowledge (research to prove or research to improve – confirmatory or exploratory),
  • When in the history of a given focus area each kind of research method is most useful,
  • Different methodological issues for each kind of research (e.g., validity or preferability),
  • Specific methods you might find helpful (e.g., design-based research, formative research, grounded theory development).

This discussion will focus on highly practical issues for improving the research you do – ones that you likely have not encountered in research courses or projects.


Questions: Contact Feng-Ru Sheu, webinar coordinator, at

Enilda Romero-Hall, Ph.D.

Assistant Professor

College of Social Science, Mathematics, and Education
The University of Tampa
Department of Education | Plant Hall, Room 439
(813) 257-3372

News from the NEPC: Center for American Progress Receives NEPC’s 2016 Bunkum Award For Shoddy Research

From the inbox late last week.

Center for American Progress receives 2016 Bunkum Award for shoddy research for its report: Lessons From State Performance on NAEP.

William J. Mathis: (802) 383-0058,

BOULDER, CO (February 23, 2017) – The 89th Academy Awards will be celebrated this weekend, which means it’s also time to announce the winner of the 2016 National Education Policy Center Bunkum Award. We invite you to enjoy our 11th annual tongue-in-cheek salute to the most egregiously shoddy think tank report reviewed in 2016.

This year’s Bunkum winner is the Center for American Progress (CAP), for its report, Lessons From State Performance on NAEP: Why Some High-Poverty Students Score Better Than Others.

The CAP report is based on a correlational study whose key finding is that high standards increase learning for high-poverty students. The researchers compared changes in states’ test scores for low-income students with changes in those states’ standards-based policy measures, as judged by the researchers themselves. They concluded that high standards lead to higher test scores and that states should adopt and implement the Common Core.

Alas, there was much less than met the eye.

In choosing the worst from among the many “worthy” contenders competing for the Bunkum Award, our judges applied evaluation criteria from two guidelines for how to understand research, Five Simple Steps to Reading Research and Reading Qualitative Educational Policy Research.

Here’s how the CAP report scored:

  • Was the design appropriate?
    No: The design was not sensitive, so they tossed in “anecdotes” and “impressions.”
  • Were the methods clearly explained?
    No: The methods section is incomplete and obtuse.
  • Were the data sources appropriate?
    No: The variables used were inadequate and were aggregated in unclear ways.
  • Were the data gathered of sufficient quality and quantity?
    No: The report uses just state-level NAEP scores and summary data.
  • Were the statistical analyses appropriate?
    No: A multiple correlation with just 50 cases rests on too small a sample.
  • Were the analyses properly executed?
    Cannot be determined: The full results were not presented.
  • Was the literature review thorough and unbiased?
    No: The report largely neglected peer-reviewed research.
  • Were the effect sizes strong enough to be meaningful?
    No: Effect sizes were not presented, and the claims are based on the generally unacceptable 0.10 significance level.
  • Were the recommendations supported by strong evidence?
    No: Their conclusion is based on weak correlations.
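The reviewers' points about sample size and the 0.10 significance level are easy to demonstrate with a quick simulation. As a rough sketch (the 0.235 cutoff below is the approximate two-tailed critical value of r at the 0.10 level for df = 48, computed from the t distribution; it is not a figure from the report), this code correlates pairs of pure-noise samples of 50 cases each and counts how often chance alone clears that bar:

```python
import math
import random

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

random.seed(1)
n, trials = 50, 2000

# Correlations between pairs of pure-noise samples of size 50.
rs = []
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(n)]
    ys = [random.gauss(0, 1) for _ in range(n)]
    rs.append(abs(pearson_r(xs, ys)))

# With df = 48, |r| >= ~0.235 reaches the 0.10 (two-tailed) level.
spurious = sum(r >= 0.235 for r in rs) / trials
print(f"share of pure-noise correlations 'significant' at 0.10: {spurious:.2f}")
```

Roughly one in ten pure-noise correlations looks "significant" at the 0.10 level, which is why the reviewers treat that threshold, combined with only 50 cases, as weak evidence for a broad policy claim.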

The fundamental flaw in this report is simply that it uses inadequate data and analyses to make a broad policy recommendation in support of the Common Core State Standards. A reader may or may not agree with the authors’ conclusion that “states should continue their commitment to the Common Core’s full implementation and aligned assessments.” But that conclusion cannot and should not be based on the flimsy analyses and anecdotes presented in the report.

Watch the 2016 Bunkum Award video presentation, read the Bunkum-worthy report and the review, and learn about past Bunkum winners and the National Education Policy Center’s Think Twice Think Tank Review project:

About the Think Twice Think Tank Review Project:

Many organizations publish reports they call research – but are they? These reports often are published without having first been reviewed by independent experts – the “peer review” process commonly used for academic research.

Even worse, many think tank reports subordinate research to the goal of making arguments for policies that reflect the ideology of the sponsoring organization.

Yet, while they may provide little or no value as research, advocacy reports can be very effective for a different purpose: they can influence policy because they are often aggressively promoted to the media and policymakers.

To help the public determine which elements of think tank reports are based on sound social science, NEPC’s “Think Twice” Think Tank Review Project has, every year since 2006, asked independent experts to assess strengths and weaknesses of reports published by think tanks.

Few of the think tank reports have been found by experts to be sound and useful; most, however, are found to have little, if any, scientific merit. At the end of each year NEPC editors sift through the reviewed reports to identify the worst offender. We then award the organization publishing that report NEPC’s Bunkum Award for shoddy research.

The National Education Policy Center (NEPC) Think Twice Think Tank Review Project provides the public, policymakers, and the press with timely, academically sound reviews of selected publications. The project is made possible in part by support provided by the Great Lakes Center for Education Research and Practice:

The National Education Policy Center (NEPC), housed at the University of Colorado Boulder School of Education, produces and disseminates high-quality, peer-reviewed research to inform education policy discussions. Visit us at:

Copyright © 2017 National Education Policy Center. All rights reserved.

February 25, 2017

Statistics In Education For Mere Mortals – MOOC Offered Again By Lloyd Rieber – February 27-April 3, 2017

Note that this MOOC – which has run (and been advertised here) several times in the past – begins again on Monday. For those interested in understanding, collecting, and manipulating statistics (including K-12 distance, online, and/or blended learning practitioners), I would highly recommend this MOOC.

Hi everyone,
I am again offering my MOOC on introductory uses of statistics in education. This section will run from February 27 to April 3, 2017. Here is the link to the course sign-up page:
The course is free.
I made a short 3-minute ‘mash-up’ of a selection of the course’s videos to give people a taste of the course:
Also, all of the course videos are available on YouTube – here is a link to the playlist:
I designed the course for “mere mortals,” meaning that I designed it for people who want to know about and use statistics as but one important tool in their work, but who are not — and don’t want to be — mathematicians or statisticians. A special note that I also designed it with doctoral students in mind, especially those who are about to take their first statistics course. It could also be good for those students who just finished a statistics course, but are still fuzzy on the details.
However, this course would be useful to anyone who wants a good, short, hands-on, friendly introduction to the most fundamental ideas of statistics in education.
Here’s my approach: I provide a short presentation or two on each statistics topic, followed by a video tutorial where you build an Excel spreadsheet from scratch to compute the statistic. Then, I ask you to take a short quiz — sometimes consisting of just one question — where I ask you to plug some new data into your spreadsheet and then copy and paste one of your new calculations as your answer. (And yes, there is also a short final exam at the end on the conceptual stuff.)
Examples of specific skills to be learned include the scales of measurement, measures of central tendency, measures of variability, and the computation of the following: mean, median, and mode; standard deviation; z (standard) scores; the Pearson product-moment correlation coefficient (r); the correlated-samples t test (i.e., dependent t test); the independent-samples t test; and a one-way analysis of variance (ANOVA).
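The course builds these statistics in Excel spreadsheets, but the same computations can be sketched in any language. As an illustration only (the data and variable names below are made up for this example, not course materials), here is how the mean, sample standard deviation, z scores, and Pearson r from the list above are computed:

```python
import math

scores = [72, 85, 90, 64, 78, 88, 95, 70]
hours  = [5, 8, 9, 3, 6, 8, 10, 4]

n = len(scores)
mean = sum(scores) / n
# Sample standard deviation (n - 1 denominator), as in Excel's STDEV.S.
sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))
# z (standard) scores: distance from the mean in standard-deviation units.
z = [(x - mean) / sd for x in scores]

# Pearson product-moment correlation coefficient (r) between hours and scores.
mh = sum(hours) / n
r = (sum((h - mh) * (s - mean) for h, s in zip(hours, scores))
     / math.sqrt(sum((h - mh) ** 2 for h in hours)
                 * sum((s - mean) ** 2 for s in scores)))

print(f"mean={mean:.2f}  sd={sd:.2f}  r={r:.3f}")
```

Checking hand computations like these against Excel's AVERAGE, STDEV.S, STANDARDIZE, and CORREL functions is a good way to verify a spreadsheet built from scratch.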
This will be the ninth time I have taught this free online course. Almost 6,000 people worldwide have enrolled in it to date.
The course is next scheduled to be offered in July 2017. The plan is to offer it twice a year.
* Lloyd P. Rieber
* Professor of Learning, Design, & Technology
* 203 River’s Crossing
* The University of Georgia
* Athens, Georgia  30602-4809  USA
* Phone: 706-542-3986
Lloyd’s LiveCode Blog:
Check out Lloyd’s Video Analysis Tool in the Mac App Store:

February 24, 2017

Congratulations Michael, You Reached A Milestone

From one of my open scholarship networks.

Your publication has a new achievement:
Building a Better Mousetrap: How Design-Based Research was Used to Improve Homemade PowerPoint Games