Limitations of Social Network Analysis in Teaching and Learning

If the goal of Learning Analytics is to enhance students’ learning, what’s missing for me is how Social Network Analysis (SNA) contributes to that goal: how it benefits the human learner, and how it can make any significant contribution to optimizing student learning. Athabasca’s own George Siemens states that a real danger in any assessment system is when “the target becomes the object of learning, rather than the assessment of learning”. Even a cursory acknowledgement of the Hawthorne Effect supports the idea that data collected through SNA may reflect performative behaviour rather than represent anything authentic. Research in a workplace environment reports a reduction in productivity when people are under constant observation, and an increase in productivity in areas of the workplace that were more private [1].

Even when social media is used intentionally as a tool to augment online (LMS) learning, there is little evidence to support the claim that its use fosters richer collaboration, better engagement, or enhanced opportunities for learning [2]. The results of Veletsianos’ study also indicate, in purely quantifiable terms, that fewer than half of social media users were learners, and that fewer still actually contributed. It follows that SNA fails to represent all learners.

SNA also fails to accurately represent even a subset of learners. In the education space, where access, safety, and privacy are (or should be) a primary concern, learners with accessibility requirements that make it more difficult to engage on a social network, or that prevent meaningful engagement online, would not be represented in SNA. Any data derived from SNA would also fail to represent learners whose personal safety is at risk should the information they share on a social network expose them to danger. Where participation in a social network is optional, learners who choose not to take part would be excluded from the dataset. Learners whose access to the internet is difficult, expensive, or non-existent would likewise be absent from the data. Nor are social networks immune from inundation by bots, which intentionally manipulate online metrics with the intended effect of distorting perception [2]. Social media as a data source has significant limitations and, more importantly, is vulnerable to bias.

The assertion that SNA has value as a tool for educators is a questionable proposition. A case study of a master’s student who failed an online module challenges assessment practices in higher education that monitor learning, reduce students to data points, and claim an enhanced learning experience [3]. I can see, from the perspective of an institution or of the software company providing the service, how a tool like this might be sold as serving an administrative benefit. But that pitch carries a set of assumptions about how engagement is (or can be) measured, ignores what else the tool has been and could be used for, and says nothing about the quality of information exchanged between people.

Where learning science is given proper consideration prior to the design of a learning analytics system, Marzouk et al. recommend extending social network graphs with a text mining feature, which would give insight into the quality of the information students contribute to online discussions [4].
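The gap that recommendation targets can be illustrated with a minimal sketch. Everything below is invented for illustration (the names, posts, and metrics are not from Marzouk et al.’s implementation): a reply graph captures only who talked to whom, while even a crude text measure like average post length begins to distinguish a substantive contribution from a one-word “+1”.

```python
from collections import Counter, defaultdict

# Invented sample discussion data: (author, reply_to, text).
posts = [
    ("ana", None,  "I think the reading overstates the effect of peer feedback."),
    ("ben", "ana", "Agreed, the sample size was small."),
    ("cam", "ana", "+1"),
    ("ana", "ben", "A follow-up study replicates it with a larger cohort, though."),
]

# Structural view (what a social network graph captures): replies
# received, i.e. in-degree in the reply graph.
replies_received = Counter(reply_to for _, reply_to, _ in posts if reply_to)

# Text-mining view: average words per post, a deliberately crude
# proxy for contribution substance; a real system would use far
# richer features (topic models, argument mining, etc.).
word_counts = defaultdict(list)
for author, _, text in posts:
    word_counts[author].append(len(text.split()))
avg_words = {a: sum(ws) / len(ws) for a, ws in word_counts.items()}

# Combining both views distinguishes "cam" (connected, but a
# one-word post) from "ben" (fewer ties, a substantive post).
for author in sorted(avg_words):
    print(author, replies_received.get(author, 0), avg_words[author])
```

The structural metric alone would rank “cam” and “ben” identically; only the text measure separates them, which is precisely the “quality of information” that a bare social graph cannot show.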


  1. E. S. Bernstein, “The Transparency Paradox: A Role for Privacy in Organizational Learning and Operational Control,” Adm. Sci. Q., vol. 57, no. 2, p. 181, 2012.
  2. G. Veletsianos, “Toward a generalizable understanding of Twitter and social media use across MOOCs: who participates on MOOC hashtags and in what ways?,” J. Comput. High. Educ., vol. 29, no. 1, pp. 65–80, Apr. 2017.
  3. C. Watson, A. Wilson, V. Drew, and T. L. Thompson, “Small Data, Online Learning and Assessment Practices in Higher Education: A Case Study of Failure?,” Assess. Eval. High. Educ., vol. 42, no. 7, pp. 1030–1045, Jan. 2017.
  4. Z. Marzouk et al., “What if learning analytics were based on learning science?,” Australas. J. Educ. Technol., vol. 32, no. 6, pp. 1–18, Nov. 2016.


Brad Payne is currently the lead developer for the Open Textbook Project, whose work focuses on open source software using PHP (LAMP). When not contributing to other developers’ projects on GitHub, he builds his own. Through exploiting APIs, and with a penchant for design patterns, he helps BCcampus implement new technologies for post-secondary institutions. Prior to his current position at BCcampus, Brad worked in IT at Camosun College and the BC Ministry of Education.
