1. Conceptualisations of LA
The data suggested not only that institutions differed in how LA was operationalised and ‘looked’ within each institution, but also that these differences stemmed from diverging conceptualisations and understandings of LA. Simply put, how an institution understands the drivers and purpose of LA appears to mediate how LA is operationalised, both in the present and the future. The strong relationships between the Concept, Readiness and Context variables highlight the need to consider LA implementation as involving more than tangible, overt structures, and reinforce the critical role of underlying epistemological, ontological and contextual elements in shaping learning analytics deployments.
Leadership was revealed to influence learning analytics deployments through both its structure and its scope of influence. Both distributed and centralised structures were adopted across institutions, with 18 of the 32 institutions adopting a distributed leadership model. Distributed leadership models are based on empowered and distributed systems of ownership and leadership, and are strongly advocated in the emerging leadership literature as ideal for organisational contexts that are complex, emergent and dynamic.
Leadership scope of influence was operationalised to incorporate multiple elements (including an ability to secure an organisational mandate, and evidence of knowledge of the field of LA) and highlighted the importance of leadership that is both informed by and grounded in knowledge of the field. However, it is difficult to determine whether this knowledge was a precursor to, or a product of, effective LA leadership. Nevertheless, the findings suggest that leadership should be afforded more discriminating and nuanced scrutiny in future research, to better understand how different leadership processes, structures and styles mediate LA implementation outcomes.
An institution’s stage of strategy development typically correlated with the strength and scope of leadership exhibited within it, and ultimately with the institution’s level of readiness for, and conceptualisation of, LA. It is worth noting that the strategic positioning of LA (operationalised as where and how LA was situated and structured within an organisation) varied across institutions. For example, some institutions embedded LA in existing functions, while others created independent LA units. This report did not find a significant relationship between strategic positioning and LA outcomes, possibly owing to the relatively nascent stage of LA within institutions. As LA grows in profile and scope within institutions, decisions about how LA should be positioned and structured may take on further significance.
Stakeholder feedback and capacity building emerged as significant dimensions. Institutions appeared to engage in one of two levels of consultation: either primarily focused on senior management, or involving a broader institutional profile (such as academics, students and professional staff).
The mediating potential of organisational (stakeholder) capacity was affirmed. Institutions that were actively addressing capacity through the development and implementation of targeted programs and initiatives typically conceptualised and operationalised LA as a mechanism for growing teaching and learning capacity.
Sound technical infrastructure was found to be critical to the success of any LA implementation. Educational Data Warehouses (EDWs) emerged as particularly salient. Only 10 institutions in the sample (n=32) had not implemented an EDW, and of these, 8 reported no LA program. The absence of an EDW therefore appears to constrain the establishment, or at least the commencement, of institutional LA projects.
The data capacity of the EDWs diverged across the population sample. While some institutions had linked their data warehouse to dynamic LMS data and other forms of student-experience data from across the campus (such as library usage), in other institutions the EDW hosted a more limited range of data, principally student demographic and progress data. There appeared to be a relationship between the types of data accommodated in EDWs and broader conceptualisations and operationalisations of LA. Compared with the other ‘readiness’ dimensions of organisational capacity and organisational culture, technology readiness was more developed. These findings suggest that technology readiness takes precedence over organisational culture and capacity in the early stages of LA implementations. Interestingly, technology enhancements included products developed in house as well as externally to the organisation, with the majority of initiatives incorporating elements of both. However, the data suggested that the adoption of vendor products in Cluster 2 was accompanied by a more critical, circumspect awareness of their capacities and limitations.
For the purposes of the current study, context was operationalised as institutional profile data, notably retention and student success data, and institutional category or group (for instance, Go8 or ATN membership). While these dimensions did not emerge as statistically significant, the data did suggest a relationship between them and LA outcomes. Typically, institutions with lower retention and success rates were more likely to adopt LA implementations with a strong focus on retention. By contrast, institutions with higher retention and student success outcomes were more likely to pursue LA activity with an emerging learning focus. It is therefore possible that, for some institutions, broader institutional contextual factors were shaping LA implementations. That a relationship was suggested even under such a narrow operationalisation of context is significant, and it is argued that future research could adopt a broader conceptualisation of context that embraces the social and institutional structures in which LA is situated.