The fourth session that I am blogging here at SITE 2017 related to K-12 Online and Blended Learning is:
Online Course Design: Incorporating Data Analytics for Improved Course Delivery
Presider: Michelle Bartlett, North Carolina State University, United States
Information presented in this paper describes the many factors considered through data analytics that help support the development, delivery and assessment of online course design. While course content plays an important role in determining how instructional components are delivered in an online course, many factors that provide feedback on course delivery and design should be developed and implemented through applied course logistics to provide data analytics that reveal effectiveness, satisfaction, and design measures determined by students who complete the online course. Examples of data analytic factors, such as questionnaires, LMS tracking features, assignment performance, and system-wide observations, are considered. Suggestions for designing and developing online courses that employ data analytics are presented.
While this session was tagged as being relevant to the K-12 Online Learning SIG, it was strictly focused on a higher education example. Having said that, data or learning analytics is a general topic that could have a great impact within the K-12 distance, online and blended learning environment – so I stayed in the room.
The session – which was largely read by the presenter from notes that he had – began with some of the general types of data analytics that may be available in online course design. The presenter then transitioned into some of the basics of planning out the nature and actual content of an online course.
The first study reported looked at a five-module online course at Texas Tech University with 20 students. The presenter provided a model through which to examine the online course design – but then presented the findings, which were based on student perceptions gathered with a researcher-created Likert scale instrument, to determine what students liked and didn't like. The presenter indicated that the students generally liked the course, but the table that was displayed with the results – and really all of the text throughout – was too small to read any of the results.
The second study focused on a specific assignment – a photographic assignment. I'm not sure where the data for this one came from, but the students had varying success with, and opinions of, the different tools used to complete the assignment.
The third study was based on an end-of-course survey of students about the effectiveness of the design. Again, the text was too small to read, but the presenter did indicate that the course design was found to be adequate, although many aspects were found to be not challenging enough.
Basically, this was largely a wasted session – as the data analytics were largely based on student surveys of what they liked and didn't like. So it didn't really offer much guidance from an actual data analytics or learning analytics perspective.