ALASI 2015: Australian Learning Analytics Summer Institute
ALASI is the main forum in Australia for the exchange of knowledge, innovation, and experiences in the area of Learning Analytics. The event offers a great opportunity to learn about analytics in education, showcase your activities and benchmark results through exposure to the thinking, experiences and feedback of other practitioners and researchers in the field.
Learning analytics has been gaining momentum as a community and research discipline. University leaders now consider that analytics has high potential to improve the quality of the student experience and outcomes. However, realising this potential requires complex transdisciplinary relationships within and between institutions.
The use of analytics can provide institutions with improvements in admissions, the student learning experience, retention, curriculum design, and overall teaching and learning quality. However, institutions are now facing the challenge of bridging the gap between these desired improvements and the current organisational structures and infrastructures. Some of the pertinent questions at present are: How can data about student progress in a unit of study be made easily available to instructors and to students? What is the impact of using data in relation to a learning design? How can students be better supported throughout their entire educational experience? What type of tools and methods are being used within institutions to bridge this gap?
At ALASI 2015, many participants contributed to this collective intelligence event to tackle the challenge!
The event was hosted by the Educational Innovation Team (formerly known as the Institute for Teaching and Learning), the School of Electrical and Information Engineering, and the Human Centred Technology Cluster at the University of Sydney.
Learning analytics research groups at local universities:
- UNSW Learning Analytics & Data Science in Education Research Group
- UTS Connected Intelligence Centre
- Melbourne Learning Analytics Research Group
- Sydney Learning Analytics Research Group
Australia and NZ learning analytics groups:
- Victorian and Tasmanian Learning Analytics Network. Contact Linda Corrin to join the Google Group, and for general queries.
- Australian Society for Computers in Learning in Tertiary Education (ascilite) Learning Analytics SIG
International learning analytics groups:
- Australian Learning Analytics Summer Institute (ALASI): Sydney 2015, UTS 2014, Macquarie 2013
- ascilite Conference
- International Conference on Learning Analytics & Knowledge (LAK)
Why come to ALASI?
- It is the most established Australian forum to connect with the nation's leading practitioners and researchers in learning analytics.
- ALASI is a highly interactive event. The program consists mostly of workshops, tutorials and panels; there are no long presentations apart from two keynotes designed to engage and provoke thought and discussion.
- Previous ALASIs have provided an ideal forum to make new collegial connections in the area.
- ALASI focuses on how to bridge the distance between the disciplines of pedagogy, educational design, technology, and business intelligence, in order to foster an interdisciplinary approach to learning analytics.
Prof Peter Reimann
Professor in the Faculty of Education and Social Work at the University of Sydney, and Senior Researcher in the Centre for Research on Computer-supported Learning and Cognition – CoCo.
Peter received his Masters and PhD from the University of Freiburg, in Psychology. He has worked at the University of Freiburg's Psychology department, in the Learning Research and Development Centre in Pittsburgh, and at the University of Heidelberg, where he was Professor for Educational Psychology before he moved to Sydney. His primary research areas have been cognitive learning research with a focus on educational computing, multimedia-based and knowledge-based learning environments, e-learning, and the development of evaluation and assessment methods for the effectiveness of computer-based technologies. Current research activities comprise among other issues the analysis of individual and group problem solving/learning processes and possible support by means of ICT. Concerning methods and methodology, he has a special interest in cognitive modelling, computational analysis of process data, and application of e-research methods to learning research.
There is nothing as practical as a theory: Connecting learning analytics to learning research.
Listen to Prof Peter Reimann's Keynote below and download Peter's presentation here (pdf).
Learning Analytics is typically seen as ‘applied’ research, as serving mainly practical purposes. But why be so humble? In my presentation, I will look at LA as a potential tool (or toolbox, rather) in the hands of learning researchers, and as a (partially) new tool at that. I will be addressing two questions: What is the potential of LA to innovate learning research? And what must LA and learning research become like to realize this potential? The gist of my argument will be that LA has to move beyond an event- and activity-centric ontology of learning to one that is informed by a view of learning as a complex phenomenon with emergent properties. Then LA can change learning research, contributing to theories of learning that are not only explanatory, but also practical. More precisely: They are practical because they are explanatory. I will illustrate these points by examples referring to my work on using process-mining methods in the context of learning in small groups and for analysing data on self-guided learning.
Dr Lyn Alderman
Associate Director, Academic Quality and Standards, Chancellery, QUT.
With over 20 years' experience in higher education, including 10 years focused on evaluation and learning analytics, Dr Lyn Alderman has a wealth of understanding of institution-wide evaluation frameworks, evaluation of teaching, learning analytics and performance models, and how to build broad and rich stakeholder engagement to inform curriculum decision-making. As sole investigator of an illuminative evaluation into Australian Government policy borrowing and implementation, lead investigator researching Post Occupancy Evaluations (POEs) of education facilities, external evaluator examining the quality assurance framework of an international university, and consultant reconceptualising the student evaluation framework for a national university, Lyn is well placed to present and disseminate her research and experience in higher education and evaluation. Lyn is the President of the Australasian Evaluation Society (2014 - current) and an Editor of the Evaluation Journal of Australasia (2012 - current).
The application of learning analytics to improve student learning: Answering questions from the academy
Watch Dr Lyn Alderman's Keynote below and download Lyn's presentation here (ppt).
Since 2007, close collaboration between the Learning and Teaching Unit's Academic Quality and Standards team and the Department of Reporting and Analysis' Business Objects team has resulted in a generational approach to reporting through which QUT established a place of trust: one where data owners are confident in how data is stored, how its integrity is maintained, and how it is reported and shared. While the Department of Reporting and Analysis focused on the data warehouse, data security and the publication of reports, the Academic Quality and Standards team focused on applying learning analytics to answer academic research questions and improve student learning, addressing questions such as:
- Are all students who leave course ABC academically challenged?
- Do the students who leave course XYZ stay within the faculty, university or leave?
- When students withdraw from a unit do they stay enrolled on full or part load or leave?
- If students enter through a particular pathway, what is their experience in comparison to other pathways?
- With five years historic reporting, can a two-year predictive forecast provide any insight?
In answering these questions, the Academic Quality and Standards team developed prototype data visualisations through curriculum conversations with academic staff. Where these enquiries were applicable more broadly, the information was brought into standardised reporting for the benefit of the whole institution. At QUT, an annual report to the executive committees allows all stakeholders to review the performance and outcomes of all courses as a snapshot in time, or to use this live report at any point during the year. This approach to learning analytics won the 2014 ATEM/Campus Review Best Practice Award in Tertiary Education Management (The Unipromo Award for Excellence in Information Technology Management).
ALASI accepted proposals for the following session types:
Panels (1 hour)
Panels aim to inform attendees about an important or controversial issue through a 1 hour debate among experts. These experts must come from different institutions to widen the debate and provide breadth of perspective.
Tutorials (2 or 4 hours)
Tutorials present a software tool, methodology or procedure through a 2 or 4 hour hands-on participatory session. The objective is to introduce attendees to new practical skills so that by the end of the session they are ready to apply them in their own context.
Workshops (2 or 4 hours)
Sessions of 2 or 4 hours for in-depth exploration of a topic. We strongly recommend hands-on, highly participatory sessions that maintain the engagement of the participants.
Posters & Demos
Show your ideas or how a software product works during the conference evening reception while having extended conversations with fellow delegates.
Are you a vendor in the learning analytics space? We offer the opportunity to participate in ALASI and showcase your product during a special session of 15 minute presentations exclusively on analytics products. You can also demonstrate your product in the Posters & Demos session if you wish (see above).
ALASI program outline
|FREE PRE-WORKSHOP on Learning analytics and retention: WEDNESDAY 25 NOVEMBER|
|12noon - 5pm||Click here for the full workshop program - The event is free, but separate registration is required.|
The ALASI Conference will be held in the Peter Nicol Russell Building.
|DAY 1: THURSDAY 26 NOVEMBER|
|8am||Registration opens and early morning tea/coffee|
|8.45am||Acknowledgement of Country – Kathryn Bartimote-Aufflick, Co-chair ALASI 2015 Organising Committee
Opening and welcome – Professor Pip Pattison, Deputy Vice-Chancellor (Education)
Chair – Kathryn Bartimote-Aufflick|
|9am||Keynote – The application of learning analytics to improve student learning: Answering questions from the academy, Dr Lyn Alderman|
|10.30am||Parallel sessions 1|
|Learning Studio 310||Learning Studio 311||Learning Studio 315|
|Identifying and contacting disengaged students in MOODLE
Jean-Christophe Froissard & Danny Liu|
|Collaboration, Conversation and Exploration – leveraging collective intelligence for enhanced performance
Craig Napier, Olivera Marjanovic, Kate Shanahan & Elizabeth Hu|
|1.30pm||Parallel sessions 2|
|Learning Studio 310||Learning Studio 311||Learning Studio 315|
|Building Dashboards for Massive Open Online Courses
Andrew Clayphan, Lorenzo Vigentini, Lisa Zhang & Catherine Zhao|
|Down and Dirty with Data for Social Learning Analytics
Kirsty Kitto, Aneesha Bakharia, Abelardo Pardo, Simon Buckingham Shum, Shane Dawson & Grace Lynch (Part 1)|
|3.45pm||Parallel sessions 3|
|Learning Studio 310||Learning Studio 311|
|Down and Dirty with Data for Social Learning Analytics (Part 2)|
|5.45pm||Close of parallel sessions|
Evening reception and interactive posters/demos - School of IT Wintergarden 122
- Prototype of a mobile app to monitor and enhance student engagement - Jason Chan
- Norm referencing rides again: learning analytics, assessment and standards - Phillip Dawson
- Mapping analytics data to the learning design cycle to promote evidence-based teaching practice - Cathy Gunn and Claire Donald
- The Connected Learning Analytics Toolkit - Aneesha Bakharia and Kirsty Kitto
- Customisable and actionable data for personalised learning - Adam Bridgeman, Danny Liu & Charlotte Taylor
- What's the right answer when there's no right answer? CAPTAIN: Comprehensive Auditory-Perceptual Training for Speech Pathologists - Cate Madill, Sonya Corcoran, Elizabeth Murray, Alison Purcell, Terry So, Tricia McCabe
- Interactive Surfaces and Learning Analytics - Roberto Martinez-Maldonado & Simon Buckingham Shum
- Short answers to deep questions: insights from tutorial dialogue - Jenny McDonald, Amal Zouaq, Michel Gagnon, Rebecca Bird
- Pacing through MOOCs: course design or teaching effect? - Lorenzo Vigentini, Andrew Clayphan
|7.00pm||Dinner – make your own arrangements|
|DAY 2: FRIDAY 27 NOVEMBER|
|8am||Early morning tea/coffee|
|9am||Chair – Dr Abelardo Pardo, Co-chair ALASI 2015 Organising Committee
Keynote – There is nothing as practical as a theory: Connecting learning analytics to learning research, Professor Peter Reimann|
|10.30am||Parallel sessions 4|
|Learning Studio 310||Learning Studio 311|
|Scaling Instructor-driven Personal Support Actions
Abelardo Pardo & Jurgen Schulte|
|Using data for targeted student interventions - lessons learned and issues raised|
|Transferring Theories: The Case of Learning Analytics for MOOCs|
|12.30pm||Plenary panel and close|
|2.00pm||Birds of a feather|
Free Pre-Workshop: LA and Retention
Learning analytics in Oz: Where are we at, where are we heading, and how can we get there?
This interactive workshop will report and respond to findings from the recently released Office for Learning and Teaching (OLT) commissioned report Learning analytics and retention: a framework for advancement. The research presented in this report scrutinised learning analytics implementations around Australia to glean insight into the factors that appear to afford or mediate them, and that will ensure their sustainability.
Venue: The University of Sydney, School of IT Building (J12), Lecture Theatre 123 J12.123
Date: 25 November 2015
Time: 12noon - 5pm
Details: Download (pdf)
Registrations are now closed.
|OLT WORKSHOP: WEDNESDAY 25 NOVEMBER|
|12noon||Registration and luncheon|
|1.10pm||Overview of report’s findings|
|1.40pm||Short rejoinders/introductions for interactive workshops|
|1.50pm||Short break and room change|
|Room 1||Room 2|
|2.00pm||Technological Readiness||Stakeholder Engagement|
|2.20pm||Retention and Learning Analytics||Leadership for Learning analytics|
|3.00pm||Developing learning analytics capacity|
|Room 1||Room 2|
|Workshop 1: Interactive workshop designed around a systems model of LA (as presented in the report). The workshop will use a simulation, presented as a ‘game’, that participants can play (byo laptop or large tablet) to consider how different strategies for implementing LA practice across an institution might work, and how elements of the overall system might produce synergies (or not!)||Workshop 2: This interactive workshop will focus on the Model of Strategic Capability presented in the report and apply it to ‘synthetic’ institutional case studies designed to highlight learning analytics implementations as situated and multidimensional. Participants will be encouraged to consider how learning analytics implementations are mediated by a complex intersection of conceptual, contextual and logistical affordances, and reminded of the imperative to be cognisant of these elements when designing implementations and related interventions.|
|4.20pm||Charles Darwin University presentation on OLT commissioned learning analytics research|
|4.50pm||Close and drinks|
Abelardo Pardo, The University of Sydney (Co-chair)
Kathryn Bartimote-Aufflick, The University of Sydney (Co-chair)
Simon Buckingham Shum, University of Technology Sydney
Shane Dawson, University of South Australia
Tim Rogers, University of South Australia
Grace Lynch, RMIT
Negin Mirriahi, UNSW Australia
Lori Lockyer, Macquarie University
Linda Corrin, University of Melbourne
David Gibson, Curtin University
Marcel Lavrencic, University of Queensland