2015 – Poughkeepsie, USA
The first LAK Hackathon, held in 2015, focused on the Apereo Open Dashboard, with data sourced from an xAPI (Experience API, 2013) Learning Record Store. It illustrated how the concept of an Open Learning Analytics architecture was developing, but also shone a light on some structural weaknesses: a shortage of usable data for demonstration, development, and quality assurance, and something of a gulf between different stakeholders' conceptions of what a learning analytics dashboard should contain. Subsequent work by workshop organizers has begun to develop repeatable methods for generating synthetic data to help address the first weakness (Berg et al., 2016). The second has been the topic of ongoing research (see below).
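The xAPI data underlying such a dashboard follows the actor–verb–object statement structure defined in the specification cited in the references. A minimal sketch of generating one synthetic statement of this shape might look like the following; the learner address, verb vocabulary, and activity IDs are illustrative assumptions, not the actual generator of Berg et al. (2016):

```python
import json
import random
import uuid
from datetime import datetime, timezone

# Illustrative placeholder vocabulary, not the hackathon generator's own.
VERBS = {
    "completed": "http://adlnet.gov/expapi/verbs/completed",
    "attempted": "http://adlnet.gov/expapi/verbs/attempted",
}

def synthetic_statement(learner_email: str, activity_id: str) -> dict:
    """Build one synthetic xAPI statement (actor / verb / object)."""
    verb_name, verb_iri = random.choice(sorted(VERBS.items()))
    return {
        "id": str(uuid.uuid4()),
        "actor": {"objectType": "Agent", "mbox": f"mailto:{learner_email}"},
        "verb": {"id": verb_iri, "display": {"en-US": verb_name}},
        "object": {"objectType": "Activity", "id": activity_id},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = synthetic_statement("learner1@example.edu",
                           "http://example.edu/course/101/quiz/3")
print(json.dumps(stmt, indent=2))
```

A generator along these lines can be looped to fill a Learning Record Store with plausible traffic for dashboard development and testing.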
2016 – Edinburgh, UK
The second hackathon, in 2016, continued to explore the practicalities of Open Learning Analytics. Using Jisc's emerging learning analytics architecture (Sclater et al., 2015) as a reference point, with some data generated using the synthetic data methods the first hackathon stimulated, participants scrutinized Jisc's interoperability recipes, tested the interoperability of learning record stores, learning analytics processors, and dashboards, and assessed the learning analytics standards landscape. The hackathon had a lasting effect: it led to numerous improvements to Jisc's interoperability recipes, and a strong message from the LAK community in favor of greater integration of the emerging learning analytics standards, xAPI and Caliper, contributed to the cooperation of ADL and IMS from mid-2016.
2017 – Vancouver, Canada
The third hackathon built upon three assets: previous workshops, recent research, and recently developed software. The first comprises the previous two LAK hackathons, the 2015 LAK workshop "Visual Aspects of Learning Analytics" (Duval et al., 2015), and the 2016 LAK workshop "Data Literacy for Learning Analytics" (Wolff et al., 2016). We set the scene for the workshop using recent research on actionable analytics (Pardo et al., 2016), student feedback (Khan and Pardo, 2016), and embedding learning analytics in pedagogic practice (Kitto et al., 2016). We introduced Jisc's student app, which was being piloted with students across the UK after extensive consultation and design activities, as a stimulus for discussion of the student perspective.
2018 – Sydney, Australia
Last year's LAK Hackathon saw a continuation, expansion, and documentation of previous themes (Bakharia, Kitto, Pardo, Gašević, & Dawson, 2016). The outcomes are discussed in Section 5, Objectives and Outcomes. Challenges such as goal setting for portfolios and employability, multimodal learning analytics, the data literacy playground, algorithmic transparency, ethical workflows, and "hacking the hackathon" were all addressed by different groups.
2019 – Tempe, USA
The fifth hackathon revolved around three main challenges. The Interoperability Challenge sought synergies between xAPI and Caliper profiles. The Game-Based Analytics Challenge aimed to create a process for integrating learning analytics into game-based assessment (Kim et al., 2019) and asked how to detect when students are stuck or disengaged. The third, the Curriculum Analytics Challenge, envisioned a markup language to describe blended learning courses; it produced a JSON markup which can describe a curriculum in both distance and in-lab learning scenarios.
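The exact schema of that curriculum markup is not reproduced here. A hypothetical sketch, assuming fields for course units and a per-unit delivery mode (distance versus lab), might be serialized as follows; all field names are illustrative assumptions, and the JSON produced at the hackathon may differ:

```python
import json

# Hypothetical curriculum document: field names are illustrative only.
curriculum = {
    "course": "Introductory Data Science",
    "units": [
        {"title": "Python basics", "mode": "distance", "weeks": 2},
        {"title": "Sensor lab", "mode": "lab", "weeks": 1},
    ],
}

def modes(doc: dict) -> set:
    """Collect the delivery modes used across a curriculum document."""
    return {unit["mode"] for unit in doc["units"]}

print(json.dumps(curriculum, indent=2))
print(modes(curriculum))  # a blended course mixes both delivery modes
```

Qualifying each unit with a delivery mode is what lets a single document describe blended courses spanning both distance and in-lab scenarios.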
Berg, A.M. et al. 2016. The Role of a Reference Synthetic Data Generator within the Field of Learning Analytics. Journal of Learning Analytics. 3, 1 (2016), 107–128.
Duval, E. et al. eds. 2015. VISLA 2015, Visual Aspects of Learning Analytics. CEUR Workshop Proceedings (2015).
Experience API v1.0.1. 2013. http://www.adlnet.gov/wp-content/uploads/2013/10/xAPI_v1.0.1-2013-10-01.pdf.
Khan, I. and Pardo, A. 2016. Data2U. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge – LAK ’16 (New York, New York, USA, 2016), 249–253.
Kitto, K. et al. 2016. Incorporating student-facing learning analytics into pedagogical practice. Proceedings of the Annual ASCILITE Conference (2016).
Pardo, A. et al. 2016. Generating actionable predictive models of academic performance. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge – LAK ’16 (New York, New York, USA, 2016), 474–478.
Sclater, N. et al. 2015. Developing an open architecture for learning analytics. EUNIS Journal of Higher Education. (2015).
Wolff, A. et al. 2016. Data literacy for learning analytics. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge – LAK ’16 (New York, New York, USA, 2016), 500–501.