A Biological Approach to Change Leadership for Education

Macquarie University

September 2015

“When the rate of external change exceeds the rate of internal change, it’s bad for business” (Dan Cathy, 2009).



Change management is a recent phenomenon, developed by consulting firms in the United States in the late 1980s and not fully adopted as a field of business until 2011 (Prosci, 2015). The growth of this industry mirrors the growth in market volatility over the same period. Between 1985 and 2006, Standard and Poor’s risk ratings shifted from 41% low risk and 35% high risk to 13% low risk and 73% high risk (Colvin, 2006). Bill Ford of Ford Motor Company poignantly summarized the implications of this changing environment: “The business model that sustained us for decades is no longer sufficient to sustain profitability.” Unshielded from the impact of a demand-driven economy, educational institutions face a similar scenario, one that requires them to redefine their understanding of change as a process.

Although it may be more widely recognized today, grappling with the implications of change is not a recent phenomenon. Ancient philosophers recognized the significant impact of change upon the identity of objects and individuals. Kirk (1951) used Plato’s reference to Heraclitus to show the idea that all things are in a constant state of change. Heraclitus is the philosopher credited with saying “you cannot step into the same river twice.” He was attempting to prove that despite its ever-changing nature, the river was still a river. Kumar (2003) showed that Buddhist thought holds strongly to this idea of impermanence, though without the requirement of a linear model. As Mortensen (unpublished) described, within Buddhist philosophy, “identity over time does not exist.” Variability within existence is inevitable and follows a cyclical pattern, like the water cycle. Similarly, as a condition of its existence, an organization or organism must remain in a constant state of change.

Charles Darwin attempted to apply this concept to biology through his observation of microevolution taking place among the Galapagos finches. Though these birds shared a common ancestry with other finches, they had adapted to their various environments in ways that ensured their survival (Darwin, 2009). Unfortunately, on a macro scale this explanation becomes overly simplistic. At higher levels of change, “unpredictability appears as an emergent property of systems that are predictable on a lower level” (Auyang, 1998). Likewise, Bennett (2010) cited several studies to show that “the link between environmental change and evolutionary change is weak – not what Darwinists might have predicted.” In fact, he showed, within complex ecosystems the internal DNA of the organism may provide a more useful explanation of change than the external circumstances.

Borrowing from Bennett’s model, organizational change may depend more heavily on random chaos than on predictable variables of adaptation. Bennett provided three reasons for this: changes to one part of a system affect the other parts; the type of response depends upon initial conditions; and, from the fractal perspective, it is possible to trace a change backward but not to predict it forward, because the model will be different. Fortunately, “predicting chaos is hard, controlling chaos is easy” (Hübler, Foster, & Phelps, 2007). Thus, Colvin (2006) recommended that organizations embrace the challenge of getting comfortable with chaotic growth.

Shona Brown, Google’s senior vice president for business operations, would agree. Her recommended style of leadership is “to avoid creating too much structure, but not to add too little either” (Lashinsky, 2006). Unlike the change management described above, this leadership style moves beyond simply trying to manage chaos to creating an environment that welcomes it as a means to positive change (Kotter, 2011).

Educational organizations that wish to survive in the competitive world of technology must cultivate the ability to thrive in unpredictable environments. Mossberg recommended the chaos theory of management for educational institutions as early as 1993. Cummings, Phillips, Tilbrook, and Lowe (2005) supported a response to external change based on empowering individuals in the middle of the organization. Others hearken back to the biological research on the importance of DNA in the process of change.

The DNA of the organization can be found in the individuals and the organizational identity (mission) rather than structural models (Karp & Helg, 2008). By embracing the inevitability of change and building a strong network of human relationships and organizational identity (the DNA), one can effectively lead and manage the non-linear, chaotic adaptation and change required for the ongoing existence and success of elearning in the educational organization.




Auyang, S. (1998). How science comprehends chaos. Talk at Harvard University. Retrieved from http://www.creatingtechnology.org/papers/chaos.pdf

Bennett, K. (2010). The Chaos Theory of Evolution. New Scientist 2782. Retrieved from https://www.newscientist.com/article/mg20827821.000-the-chaos-theory-of-evolution/

Cathy, D. (2009). Presentation to Cedarville University. Personal notes.

Colvin, G. (2006). Managing in Chaos. Fortune Magazine 154(7). Retrieved from http://archive.fortune.com/magazines/fortune/fortune_archive/2006/10/02/8387417/index.htm

Cummings, R., Phillips, R., Tilbrook, R., & Lowe, K. (2005). Middle-out approaches to reform of university teaching and learning: champions striding between the top-down and bottom-up approaches. The International Review of Research in Open and Distributed Learning, 6(1).

Darwin, C. (2009). The origin of species by means of natural selection: or, the preservation of favored races in the struggle for life. W. F. Bynum (Ed.). AL Burt.

Hübler, A. W., Foster, G. C., & Phelps, K. C. (2007). Managing chaos: Thinking out of the box. Complexity, 12(3), 10-13. Retrieved from http://server17.how-why.com/blog/ManagingChaos.pdf

Karp, T., & Helg, T. I. (2008). From change management to change leadership: Embracing chaotic change in public service organizations. Journal of change management, 8(1), 85-96.

Kirk, G. S. (1951). Natural change in Heraclitus. Mind, 35-42.

Kotter, J. (2011). Change Management vs. Change Leadership. Forbes Magazine. Retrieved from http://www.forbes.com/sites/johnkotter/2011/07/12/change-management-vs-change-leadership-whats-the-difference/

Kumar, S. M. (2003). An introduction to Buddhism for the cognitive-behavioral therapist. Cognitive and Behavioral Practice, 9(1), 40-43.

Lashinsky, A. (2006). Chaos by Design. Fortune Magazine 154(7). Retrieved from http://archive.fortune.com/magazines/fortune/fortune_archive/2006/10/02/toc.html

Mortensen, C. (unpublished). Change and inconsistency. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Fall 2015 ed.), forthcoming. URL: http://plato.stanford.edu/archives/fall2015/entries/change/

Mossberg, B. (1993). Chaos on Campus: A Prescription for Global Leadership. Educational record, 74(4), 49-55.

Prosci (2015). Change Management History. [web page] Retrieved from http://www.prosci.com/change-management/change-management-history/



The Significance of Instruction for Technology-based Learning

Abstract: Market-driven transformation of higher education has focused on content development and technology integration, enabling students to access vast amounts of information and individuals for interaction. However, the increase in activity has not led to an increase in learning. Overwhelmed with too much information and lacking the ability to use technology for learning, students engage only at the surface level of information transfer. Cognitive overload offers insight into why this takes place and provides the premise for suggesting a greater emphasis on instruction in the technology conversation. Instruction is necessary to provide students with the context and skills they need to make the tool of technology useful for learning.


Keywords: pedagogy, content, technology, TPACK, connectivism, cognitive overload, information, instruction, interaction


Though it once occupied the central place in the university experience, instruction seems to have been displaced by an emphasis on outputs in conversations about change in higher education. The pursuit of measurable outputs has led to an emphasis on business operations (Ernst & Young, 2011), the development of content and programs for employability (Knight & Yorke, 2003), and, most importantly, the use of technology (Barnes & Tynan, 2007) to meet the expected demands of Generation Y students for a digital learning experience (Sternberg, 2012). A recent survey of professors revealed that the operational principles governing their work clearly demonstrate that the value of instruction has been marginalized in favor of more measurable outputs like research (Field, 2015).

Meanwhile, educational institutions continue the trend toward differentiation in terms of technology use despite numerous reports that technology does not necessarily have a positive impact on student performance. The British Journal of Educational Technology recently published an analysis of a range of data measuring the impact of various technologies on student outcomes, which found no consistent relationship (Lei, 2010). Instead, the data showed that technology integration produced a range of effects from positive to negative and had no direct correlation with GPA. Some unknown factor determined whether or not technology use would result in learning for the students.

Literature Review (The Problem)

One reason that technology integration has not automatically enhanced student performance is that students and teachers alike struggle to use technology for the purpose of learning. Assuming that students expect novel and extensive use of technology in their learning experiences, universities have embraced technology in a way that disregards consistent evidence that students don’t know how to use it effectively for learning (Sternberg, 2012). In terms of connectivist pedagogy, this is a dangerous scenario. The rapidly changing nature of knowledge has made the student’s ability to access and process information stored in digital repositories more valuable than the development of a broad knowledge base that will soon be outdated (Siemens, 2005). Depending on how it is used, the internet can be an excellent platform where students can develop these learning skills, but it can also devolve into little more than a channel for transferring information (Schank, 1998). Unfortunately, the emphasis of technology integration has been on content and administration rather than on learning purposes (Brown, Dehoney, & Millichap, 2015), and students are expected to be familiar with using technology for the purposes of learning (Diaz, 2010) even though teachers are still unsure of what this would look like (Bull et al., 2008). Thus, the dramatically increased level of access to information and interaction in the online environment has not led to a comparable increase in the amount of learning taking place (Swan, 2002).

On the other hand, widespread use of Web 2.0 technology has altered the way students think in ways that may not benefit the learning process. The social, connected, bite-size, instant-access stream of information that constantly bombards and distracts its users eventually damages their ability to participate in deep, focused, reflective learning (Carr, 2011; Trapnell, Sinclair, & Immordino-Yang, 2012). Unable to think in a linear fashion, students enter the online or offline learning environment expecting constant change, distraction, and stimulus (Ally, 2004). Individuals who experience this constant demand on their attention may completely lose their capacity for deep engagement or fail to develop this cognitive ability altogether (Immordino-Yang, Christodoulou, & Singh, 2012).

In a non-linear learning context, students have only the time or interest to ask ‘what happened’ or ‘how to do this’ before moving on to the next bit of information (ibid.). This is representative of Piaget’s early stages of cognitive engagement, which do not move beyond concrete representations to abstraction and personalization (Piaget, 1964). Sadly, the design of the learning experience seems to follow the downward progression of student ability rather than the other way around, as course facilitators upload the content, activate the discussion boards, and open the class to learners who have no idea how to use these resources effectively. The “quick burst” transactions that result enable a kind of rote learning identified with the lowest of the six levels in Bloom’s taxonomy (Mayer, 2002).

Given the inevitable constraints of time and mental endurance, even students who have the ability to evaluate or reflect on the meanings and implications of information will find it hard to resist the temptation to skim across the surface of multiple resources and interactions rather than diving deeply into a few. This trade-off of depth for breadth of learning often results in less-than-meaningful experiences (Mayer, 2002). Studies have indicated a significant but negative relationship between course completion and the amount of interaction between learners (Grandzol & Grandzol, 2010). The number of discussions that students must pay attention to is also negatively correlated with their satisfaction in the learning environment. Some other factor is needed to transform greater access to information and interaction into greater amounts of learning.

Literature Review (Potential Solutions)

If education is meant to be anything more than the simple transfer of information, then students must somewhere encounter the chance to develop skills for deeper levels of thinking and higher levels of cognitive engagement. Information literacy, as defined by the American Library Association (ACRL, 1989), goes beyond the simple ability to identify words. It requires some ability to screen and evaluate the text, the meaning, the context, and the implications of what is written. The internet provides students with access to a vast pool of resources for information and interaction, but students need some kind of instructional guidance to apply these toward a productive educational outcome (Woo & Reeves, 2007).

In response to the chaotic nature of progressive schools, Dewey (1938) suggested that constructivist learning was not meant to exist independently of all oversight. Rather, instructors were needed to guide the process of exploration in a positive and worthwhile direction. Depending on their level of ability, students may need greater levels of instruction to take advantage of learning opportunities (Kirschner, Sweller & Clark, 2006). The amount of instructional support that students will need to manage the excessive amount of information and interaction available online will vary from person to person. Some people are better prepared to handle greater amounts of information and ambiguity than others (Mulder, de Poot, Verwij, Janssen & Bijlsma, 2006). Additionally, some technology environments require less cognitive capacity from learners than others (Reeves, 1999).

The question that educators must consider is whether they must adapt their teaching methods to the ability of students (Barnes & Tynan, 2007) or whether they can design their approach to help students develop their abilities to engage in the reflective and linear thinking required for deep understanding. A useful starting point for the design of learning experiences that require students to exercise higher levels of cognitive complexity may be found in Bloom’s Taxonomy. Mayer (2002) used this framework to suggest 19 kinds of cognitive processes that teachers can use to guide students toward the creative end of the learning spectrum where ‘meaningful learning’ takes place.

Additional suggestions for the design of an effective technology-based learning experience include interactive instructors, dynamic discussions, and clear, consistent course structures (Swan, 2002). Yet, as previous research demonstrates, the opportunities for information and interaction should not be overwhelming. Further suggestions include the importance of giving students the opportunity to reflect on, develop abstract understanding of, and personally apply the information they encounter if they are to engage more deeply (Immordino-Yang, Christodoulou & Singh, 2012). The value of instruction is its ability to limit the scope of student effort in the most productive manner.

These features require additional time and come at the necessary expense of some informational breadth. However, as recognized previously, developing the process of learning is more important than mastering the information set. The content and technology are not as important as the use that students make of these resources (Woo & Reeves, 2007). The cognitive processes developed through deeper levels of engagement will retain their value much longer than a knowledge base of rapidly expiring content. However, the development of these processes will require some level of instruction.

The Future

As online learning technology continues to enhance, extend, and empower the education process (Smyth, n.d.), it is vital for educators and students to consider the importance of pedagogy in making this tool useful for learning. It would be detrimental for higher education to continue making the mistake of confusing information and interaction for education. Fortunately, it is quite likely that the trend toward an overreliance on content and technology is coming to an end. Progressive educators like Dewey (1938) and Montessori (2004) saw applications of their ideas that resulted in chaotic classrooms, and they effectively argued that experiential learning with social and didactic tools still requires some kind of overarching structure in order to be effective. Today, several voices have recognized the chaos of online learning separated from instruction and suggested models like TPACK to provide an overarching structure for the use of technological tools (Jenson, 2015; Mishra & Koehler, 2006).

Instruction plays a significant role in enabling students to use technology for learning. This guidance can be embedded within the technology itself (Reeves, 1999) or within the design of technology-based instruction (Swan, 2002). Students who are not part of a formal institution of higher education may find it valuable to connect with a learning community in which their use of technology can be guided in a coherent fashion toward increasing levels of cognitive complexity. In other words, “The role of the teacher is essential to facilitating the process and providing the learners with the resources and kinds of activities that will help them to build knowledge collaboratively, using the internet” (Harasim, 2012, no page). Perhaps it is time to re-emphasize the important role of instruction for students who want to maximize their use of technology for learning both online and in the classroom.

Learning Activity

The causes of ineffective learning are not limited to students in higher education. It is quite possible that you experience information overload on a regular basis. If you wish to personalize some of the concepts in this report, please proceed through the following steps.

Activity A

  • Read This Article for an overview of some features of information overload
  • Complete This One Minute Survey to find out what features of information overload others struggle with. Be sure to view the results at the end.
  • Think about your most frustrating experience with information overload, then read through the “Potential Solutions” section again. How can you apply these ideas to overcome your next encounter with information overload?

Activity B

Search for an online learning course (e.g. “learn to code”). Then consider the following:

  • How many search results did you get back?
  • How will you decide which results are worth looking at?
  • How will you make your decision about which resource to use?
  • Compare this process to asking an expert which resource to use. What are the pros and cons of this alternative?
  • What kind of guidance would you need to make this process simpler or more accessible? (This is the most important question, as it should help you discover some instructional steps you can apply to helping others learn with technology.)


ACRL. (1989). Presidential Committee on Information Literacy: Final Report. Retrieved from http://www.ala.org/acrl/publications/whitepapers/presidential

Ally, M. (2004). Foundations of educational theory for online learning. In Anderson, T. & Elloumi, F. (Eds), Theory and practice of online learning (pp 3-31). Athabasca, Canada: Creative Commons: Athabasca University.

Barnes, C. & Tynan, B. (2007). The adventures of Miranda in the brave new world: learning in a Web 2.0 millennium. Research in Learning Technology, 15(3), 189-200.

Brown, M., Dehoney, J., & Millichap, N. (2015). What’s next for the LMS? EDUCAUSE Review, 50(4). Retrieved from http://er.educause.edu/articles/2015/6/whats-next-for-the-lms

Bull, G., Thompson, A., Searson, M., Garofalo, J., Park, J., Young, C., & Lee, J. (2008). Connecting informal and formal learning experiences in the age of participatory media. Contemporary Issues in Technology and Teacher Education, 8(2), 100-107. Retrieved from http://www.citejournal.org/vol8/iss2/editorial/article1.cfm

Carr, N. (2011). The shallows: What the Internet is doing to our brains. San Francisco, CA: WW Norton & Company.

Dewey, J. (1938). Experience and education. New York, NY: Macmillan.

Diaz, V. (2010). Web 2.0 and emerging technologies in online learning. New Directions for Community Colleges, 2010(150), 57-66.

Ernst & Young. (2011). Higher education and the power of choice. Retrieved from http://www.ey.com/Publication/vwLUAssets/Higher_education_and_the_power_of_choice_Australia/$File/Higher%20education%20and%20the%20power%20of%20choice%20Australia.pdf

Immordino-Yang, M. H., Christodoulou, J. A., & Singh, V. (2012). Rest is not idleness: Implications of the brain’s default mode for human development and education. Perspectives on Psychological Science, 7(4), 352-364.

Grandzol, C. J., & Grandzol, J. R. (2010). Interaction in online courses: More is not always better. Online Journal of Distance Learning Administration, 13(2), 1-18.

Harasim, L. (2012). Learning theory and online technology. New York, NY: Routledge.

Jenson, K. (2015). Behind the Screens. Unpublished manuscript. Retrieved from http://www.academia.edu/12279274/Behind_the_Screens_Developing_a_Digital_Learning_Literacy

Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.

Knight, P. T., & Yorke, M. (2003). Employability and good learning in higher education. Teaching in Higher Education, 8(1), 3-16.

Lei, J. (2010). Quantity versus quality: A new approach to examine the relationship between technology use and student outcomes. British Journal of Educational Technology, 41(3), 455-472.

Mayer, R. E. (2002). Rote versus meaningful learning. Theory Into Practice, 41(4), 226-232.

McArthur, J. (2011) Reconsidering the social and economic purpose of higher education. Higher Education Research and Development, 30(6), 737-749.

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.

Mitra, S. (2007, February). Sugata Mitra: Kids can teach themselves [Video file]. Retrieved from http://www.ted.com/talks/sugata_mitra_shows_how_kids_teach_themselves?language=en

Mokhtar, I. A., Majid, S., & Foo, S. (2008). Information literacy education: Applications of mediated learning and multiple intelligences. Library and Information Science Research, 30(3), 195-206. doi:10.1016/j.lisr.2007.12.004

Montessori, M. (2004). The Montessori method: The origins of an educational innovation: Including an abridged and annotated edition of Maria Montessori’s The Montessori method (G. Gutek, Ed.). Lanham, MD: Rowman & Littlefield.

Mulder, I., de Poot, H., Verwij, C., Janssen, R., & Bijlsma, M. (2006, November). An information overload study: using design methods for understanding. In Proceedings of the 18th Australia conference on Computer-Human Interaction: Design: Activities, Artefacts and Environments (pp. 245-252). ACM, 2006.

Oliver, M. (2005). The problem with affordance. E-Learning and Digital Media, 2(4), 402-413.

Pacific Northwest National Laboratory (n.d.). Compfight [photo]. Retrieved from https://www.flickr.com/photos/36016325@N04/6310387725/

Piaget, J. (1964). Development and learning. In R.E. Ripple & V.N. Rockcastle (eds.), Piaget Rediscovered  (pp.7-20). Retrieved from http://www.psy.cmu.edu/~siegler/35piaget64.pdf

Reeves, W. (1999). Learner-centred design: A cognitive view of managing complexity in product, information, and environmental design. Thousand Oaks, CA: Sage Publications.

Schank, R. C. (1998). Horses for courses. Communications of the ACM, 41(7), 23-25.

Siemens, G. (2005). Connectivism: A learning theory for the digital age. International Journal of Instructional Technology and Distance Learning, 2(1), 3-10.

Smyth, K. (n.d.). 3E Framework [Blog post]. 3E Education. Retrieved from http://3eeducation.org/3e-framework/

Strauss, V. (2012). Does Khan Academy know how to teach? The Washington Post. Retrieved from: https://www.washingtonpost.com/blogs/answer-sheet/post/how-well-does-khan-academy-teach/2012/07/27/gJQA9bWEAX_blog.html

Swan, K. (2002). Building learning communities in online courses: The importance of interaction. Education, Communication & Information, 2(1), 23-49.

Trapnell, P., Sinclair, L., & Immordino-Yang, M. H. (2012). Can too much texting make teens shallow? In K. Doheny (Ed.), report presented at the Society for Personality and Social Psychology 13th Annual Meeting (Jan 26-28, 2012), San Diego, CA. Retrieved from http://teens.webmd.com/news/20120203/cantoomuchtextingmakeyouthshallow?print=true

Vincent-Lancrin, S. (2004). Building futures scenarios for universities and higher education. An international approach. Policy Futures in Education, 2(2), 245-263.

Woo, Y., & Reeves, T. C. (2007). Meaningful interaction in web-based learning: A social constructivist interpretation. The Internet and Higher Education, 10(1), 15-25.

eLearning Systems Management

The August Collegium is an ongoing experiment to redesign and reposition higher education as a self-directed, democratic model for developing learning fluency (Jenson, 2015). Students collaborate with each other, with learning coaches, and with subject experts to design an individualized learning experience that is somewhat reminiscent of Bishop and Verleger’s (2013) cooperative or problem-based learning. In this context, education transcends simple content mastery and thus requires a high-level use of technology for purposes beyond administration and information transfer.

Documentation like this strategic plan is only effective at describing functions of technology, not at initiating them (Stensaker, Maasen, Borgan, Oftebro, & Karseth, 2007). However, initiation requires an understanding not only of the starting point but also of the objectives that must be reached (Gosper, Malfroy, & McKenzie, 2013). The success of the August Collegium depends upon the leadership and management practices that follow for the development and maintenance of a high-quality and sustainable eLearning environment.

Developing an eLearning environment

The eLearning environment provided by the August Collegium is built around a strong core of institution-wide support for data management, connectivity, and integration. In higher education, the learning environment must be designed to support the needs of students (Biggs, 2012), and current trends indicate that this requires support for a learning experience that is open, social, collaborative, and mobile (White, 2008). These objectives are largely reached by a decentralized integration of Web 2.0 technologies (Brown, 2010) into learning design and experiences by teachers and students as needed. Though it is more difficult to manage, this distributed approach may improve student participation because it aligns more closely with students’ needs (Gosper et al., 2013). Nevertheless, research into current uses of technology indicates that users need a high level of support to use it effectively for learning (Gosper et al., 2013; Stensaker et al., 2007; Way & Webb, 2007).

Core technology services provide integration, connectivity, and data management, leaving the choice of applying these resources to the students and departments. Users are expected to provide their own devices for personal access and use of these core technologies (Gosper et al., 2013). This focus on the development and maintenance of core technologies rather than peripheral applications is expected to drive higher-level use of technology for learning and teaching (Way & Webb, 2007).

The challenge of scalability is somewhat abated by the affordances of cloud computing and Web 2.0 technologies; however, it is still difficult to respond to the changing motivations, applications, and impact of using the eLearning environment (Way & Webb, 2007). Reaching a certain stage of technological maturity does not reduce this challenge, it simply changes it, and there are few good models to follow (Brown, 2010). What good examples and best practices exist are difficult to apply, since technology operates in a complex environment where “the whole is far more than the sum of its parts” and one little change can lead to dramatic results (Snowden & Boone, 2007). Thus, the key challenge in developing this eLearning environment has been, and continues to be, establishing the vision, outcomes, organizational relationships, and thought processes that will support its creation and use (Stensaker et al., 2007).

Leading and Managing Educational Change

Change leadership within the August Collegium is driven primarily by “people and pedagogy” requirements (Stensaker et al., 2007). Students, learning coaches, and subject experts (key human factors in the August Collegium) all require different kinds of support from technology. Under the direction of an ICT representative, members from each of these groups will form a single governing body to develop and update the strategic vision for the eLearning environment. The strength of this diverse approach to leadership is that it bypasses the organizational silos separating vision from implementation from use. It represents the real needs of users (Holt et al., 2013; Puzziferro & Shelton, 2009) but this comes at the cost of time and the need for strong leadership and communication skills to promote understanding among diverse groups of people (Bryson, 1995).

The process of transition into a technology-rich environment is managed from the bottom up in terms of governance, but from the top down in terms of implementation. As the strategy is developed by the governing body, a task force of ICT members and the governing body work together on the process of implementation (Bryson, 1995). These cooperative groups are designed around the initiation of projects, but ongoing maintenance is the responsibility of the dedicated ICT staff. From an HR and budgeting standpoint, long-term costs for maintenance and training must be considered for the lifetime of any proposed technology change. This helps to ensure that development and support of core technologies are emphasized over peripheral applications that are not crucial to the institution as a whole (Gosper et al., 2013). It may also encourage an iterative approach of testing technology solutions before the entire strategy has been developed (Bryson, 1995).
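The lifetime-budgeting principle above can be made concrete with a small sketch. The function and figures below are hypothetical, intended only to show why recurring maintenance and training costs belong in any proposal's comparison, not drawn from any actual August Collegium budget:

```python
# Hypothetical total-cost-of-ownership sketch. All names and figures are
# illustrative assumptions, not real budget data.

def lifetime_cost(initial, annual_maintenance, annual_training, years):
    """Initial outlay plus recurring maintenance and training costs
    over the expected lifetime of the technology."""
    return initial + years * (annual_maintenance + annual_training)

# A tool that looks cheap up front can cost more over its lifetime than a
# better-supported alternative with a higher purchase price.
cheap_tool = lifetime_cost(initial=2000, annual_maintenance=1500,
                           annual_training=500, years=5)   # 2000 + 5*2000 = 12000
core_tool = lifetime_cost(initial=8000, annual_maintenance=300,
                          annual_training=200, years=5)    # 8000 + 5*500 = 10500
print(cheap_tool, core_tool)  # 12000 10500
```

Comparing the two totals rather than the two purchase prices is precisely the discipline the budgeting process above is meant to enforce.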

Accompanying proposals for new technology should be a plan for their use (Brown, 2010; Holt et al., 2013). A balance must be found between constant input and revision, to stay ahead of the curve, and enough structure and consistency to remain useful. Planning and transparency in communication are important to bring these diverse aspects together (Holt et al., 2013). However, there will likely be some level of misunderstanding between the developers of the technology and its users. This creates a challenge of ensuring alignment among strategic plans, comprehensive plans, and the devices used to implement the plans (Bryson, 1995). Crucial to successful resolution of such disagreements is consideration of the mission of the organization and how it relates to the changes that are proposed (Bryson, 1995).

Infrastructure for eLearning

Cloud technology providing an open, mobile, collaborative, and social learning experience has shifted the focus of internal infrastructure development toward management, integration, and connectivity support rather than hardware maintenance (Hayes, 2008; White, 2008). External Web 2.0 and open education resources are preferred to internal resources as they enable wider access to the learning environment (White, 2008) and enjoy dedicated support. They also enjoy the benefits of “peer review” and “transparency of process,” resulting in “better quality, higher reliability, more flexibility, lower cost” and no “vendor lock-in” (White, 2008, p. 6).

Some local hardware is still necessary for connectivity: power outlets, spaces for technology use, wireless internet coverage and printing, scanning, and communication services required for flexible and consistent access to the cloud-based resources (Gosper et al., 2013). All these kinds of inputs and outputs that students cannot conveniently provide on their own are considered part of the infrastructure for eLearning.

In the digital environment, students should enjoy a seamless experience with every aspect of the institution across multiple platforms (Marshall, 2004). However, it will be a challenge to integrate data from external systems that is essential for tracking student progress and designing the personalized learning models. This challenge of inter-operability requires some kind of common standard to “codify the boring so the exciting can happen on top of them” (Cooper & Kraan, 2009). The simplest method to follow will be to store data separately from the applications that use them and to create interoperability standards and APIs with which multiple programs can access the core user, information, and activity details for purposes of learning and teaching or administration.
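The separation described above can be sketched in miniature: a single data layer holds the core user and activity records, and independent applications (teaching, administration) read them only through one shared API. This is purely an illustrative sketch of the design principle, not a proposed implementation; all class and function names here are hypothetical.

```python
# Illustrative sketch: one data layer, one shared API, multiple consumers.
# All names are hypothetical.

class LearningDataStore:
    """Core store for user and activity records, kept independent of
    any single application that uses it."""

    def __init__(self):
        # Each record: {"user": str, "resource": str, "minutes": int}
        self._activities = []

    def record_activity(self, user, resource, minutes):
        """Write-side of the shared API."""
        self._activities.append(
            {"user": user, "resource": resource, "minutes": minutes}
        )

    def activities_for(self, user):
        """Read-side of the shared API, used identically by every
        consuming application."""
        return [a for a in self._activities if a["user"] == user]


def teaching_view(store, user):
    """A teaching application tracks progress from the same records."""
    return sum(a["minutes"] for a in store.activities_for(user))


def admin_view(store, user):
    """An administrative application reuses the identical API."""
    return len(store.activities_for(user))


store = LearningDataStore()
store.record_activity("alice", "algebra-module-1", 30)
store.record_activity("alice", "algebra-module-2", 45)

print(teaching_view(store, "alice"))  # total minutes studied: 75
print(admin_view(store, "alice"))     # number of recorded activities: 2
```

Because both views depend only on the `activities_for` interface, the underlying storage could later be replaced (for example, by an external cloud service) without rewriting the applications, which is the point of “codifying the boring.”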

These types of institution-wide technology provisions are suggested and designed by the ICT department to support the strategic needs identified by the representative body described above. Examples of this technology might include fundamental services like database maintenance, Internet access, technical support, security, and digital rights management. Library, finance, records, timetable, and in-classroom systems will also be managed at this level because of their sensitivity and their potential for working together more effectively (Cooper & Kraan, 2009). On the other hand, academic resources for research and scholarship that apply to a specific discipline or department (Stensaker et al., 2007; White, 2007) will only be considered for development as internal infrastructure as determined by the governing board.

Curriculum Design

The emphasis on core infrastructure is purposeful: it is intended to “promote innovation in teaching” (Johnson et al., 2013). In order to encourage use of the systems for these creative ends, Stensaker et al. (2007) recommend investing in communication about the application of systems, not just in their development. A strong infrastructure and model for change only becomes valuable for learning when users can create “authentic activities” or meaningful experiences (Herrington & Kervin, 2007). Students receive guidance in the effective use of this learning environment (Biggs, 2012) from learning coaches and subject experts.

The “user-centric” model of the August Collegium “emphasizes participation (e.g., creating, re-mixing) over presentation” (White, 2008, p. 7). Learning coaches are responsible for identifying and screening learning resources, and subject experts do the same for information resources. A central repository of these curriculum resources then allows any member of the collegium to add rankings, use details, and related resources or comments. Platforms for inter-departmental communication are an important part of encouraging collaboration across departments and transparency among users for improving the process of curriculum design (Puzziferro & Shelton, 2009). Insights from technology, like “learning patterns” and “interactions,” should be incorporated into the redesign of curriculum (White & Larusson, 2010).

Student and Staff Capability

Studies have demonstrated that students are interested in greater use of technology in the education process, but only if they perceive it has some strategic value for their learning experience (Gosper et al., 2013). Students want tools to be efficient, reliable, and established (Gosper et al., 2013). However, there is potential to expand the depth of interaction that takes place in terms of content (White & Larusson, 2010) and social interactions (Gilbert & Dabbagh, 2005).

This expansion is difficult because a different set of features will appeal to the expectations of an early adopter, who is needed to drive change, versus the mainstream user, who is needed to accept the change (White, 2007). Staff proficiency and digital help centres may help the mainstream users overcome their hurdles to accepting the technology and support them in using it effectively (Biggs, 2012).

Because students and staff have responsibility for developing the learning experience on a regular basis, they have a dedicated interest in knowing a broader range of functionality than that which they use regularly. The responsibility for this ongoing education may be given to HR (Marshall, 2004) or to a group of staff who are interested in new applications of technology. Borrowing from the flipped classroom model (Bishop & Verleger, 2013), information and discussion resources are available asynchronously, and classroom time will focus on discussion and application. However, the effectiveness of strategies for building staff capability in ICT depends on the incentives that are in place (Stensaker et al., 2007). Teacher perceptions of innovation are crucial and may be informed by opportunities for recognition or reward (Way & Webb, 2007). In regard to students, Biggs (2012) suggested constructive alignment of objectives, activities, and assessment. This should also apply to staff through performance recognition, incentives, and assessment around the objectives for technology use and activity (Marshall, 2004). Evaluation of staff should focus on actual opportunities for improvement, revealing what the teacher does not know and should try to discover (Puzziferro & Shelton, 2009).

Quality and Sustainability

Current models for ICT are focused on short-term gains, but sustainability requires a long-term focus for critical interventions (Marshall, 2004). There are three critical areas for ongoing development. First, the proprietary software database that manages user, information, and activity data must be updated in response to changes in use and demand. Second, learning coaches and subject experts must remain familiar with the uses they can make of this software. Finally, a need that can be expected to arise on a regular basis is the integration between the core infrastructure and peripheral applications.

I have extensively studied the implications of learning and teaching in online environments and have proposed a theory of digital learning literacy derived from multiple sources on the subject (Jenson, 2015). This theory has informed the design of the August Collegium and the structure of its online learning environment. However, my understanding of the technology to support the teaching and learning process is limited to a basic functional theory. Before launching this eLearning environment, I will need to partner with individuals who have a comprehensive understanding of the design and maintenance of technology resources. Those who join the organization will find it necessary to develop an ability to discover and apply new resources to learning and teaching on a regular basis.

In the complex environment of technology, it is not possible to prepare a best response for every circumstance (Snowden & Boone, 2007). However, it is possible to prepare people to respond effectively by providing structure, strategy, policy, procedure, and tactics like those outlined in this document (White, 2007). The objectives for the eLearning environment proposed here will need to be updated as the ideas become actions with unexpected implications.

“The founders [of YouTube] could not possibly have predicted all the applications for streaming video technology that now exist. Once people started using YouTube creatively, however, the company could support and augment the emerging patterns of use” (Snowden & Boone, 2007). Preparation for this kind of response requires an understanding of the entire ecosystem in which the technology operates as well as a clear vision for the use of technology to support learning and teaching (Bryson, 1995). Because no change in technology is entirely predictable, management should be prepared to accept an iterative process whereby proposals are partially funded and tested before full development (Cummings, Phillips, Tilbrook, & Lowe, 2005). If there is need for this particular feature, users will be attracted to it and it will grow on its own (Snowden & Boone, 2007).



Biggs, J. (2012). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 31(1), 39-55.

Bishop, J. L., & Verleger, M. A. (2013, June). The flipped classroom: A survey of the research. In ASEE National Conference Proceedings, Atlanta, GA. Retrieved from: http://www.studiesuccesho.nl/wp-content/uploads/2014/04/flipped-classroom-artikel.pdf

Brown, S. (2010). From VLEs to learning webs: The implications of Web 2.0 for learning and teaching. Interactive Learning Environments, 18(1), 1-10.

Bryson, J. M. (1995). The strategy change cycle. In Strategic planning for public and non-profit organizations: A guide to strengthening and sustaining organizational achievement (Rev. ed.). San Francisco, CA: Jossey-Bass.

Cummings, R., Phillips, R., Tilbrook, R., & Lowe, K. (2005). Middle-out approaches to reform of university teaching and learning: Champions striding between the top-down and bottom-up approaches. The International Review of Research in Open and Distributed Learning, 6(1). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/224/307

Cooper, A., & Kraan, W. (2009). JISC business case for standards (S. Holyfield, Ed.). JISC CETIS. Retrieved from http://www.jisc.ac.uk/publications/briefingpapers/2009/bpbusinesscaseforstandards.aspx

Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for meaningful discourse: A case study. British Journal of Educational Technology, 36(1), 5-18.

Gosper, M., Malfroy, J., & McKenzie, J. (2013). Students’ experiences and expectations of technologies: An Australian study designed to inform planning and development decisions. Australasian Journal of Educational Technology, 29(2).

Herrington, J., & Kervin, L. (2007). Authentic learning supported by technology: Ten suggestions and cases of integration in classrooms. Educational Media International, 44(3), 219-236.

Holt, D., Palmer, S., Munro, J., Solomonides, I., Gosper, M., Hicks, M., … & Hollenbeck, R. (2013). Leading the quality management of online learning environments in Australian higher education. Australasian Journal of Educational Technology, 29(3). Retrieved from http://www.ascilite.org.au/ajet/submission/index.php/AJET/article/view/84/271

Jenson, K. (2015). Behind the screens: Developing a digital learning literacy. Retrieved from http://www.academia.edu/12279274/Behind_the_Screens_Developing_a_Digital_Learning_Literacy

Johnson, L., Adams Becker, S., Cummins, M., Freeman, A., Ifenthaler, D., & Vardaxis, N. (2013). Technology outlook for Australian tertiary education 2013-2018: An NMC Horizon Project regional analysis. Austin, Texas: The New Media Consortium.

Marshall, S. J. (2004, December). Leading and managing the development of e-learning environments: An issue of comfort or discomfort. In 21st ASCILITE Conference, Perth Australia (pp. 5-8). Retrieved from http://www.ascilite.org.au/conferences/perth04/procs/marshall-keynote.html

Puzziferro, M., & Shelton, K. (2009). Supporting online faculty: Revisiting the seven principles (a few years later). Online Journal of Distance Learning Administration, 12(3).

Snowden, D. J., & Boone, M. E. (2007). A leader’s framework for decision making. Harvard Business Review, 85(11). Retrieved from https://hbr.org/2007/11/a-leaders-framework-for-decision-making

Stensaker, B., Maassen, P., Borgan, M., Oftebro, M., & Karseth, B. (2007). Use, updating and integration of ICT in higher education: Linking purpose, people and pedagogy. Higher Education, 54(3), 417-433.

Way, J., & Webb, C. (2007). A framework for analysing ICT adoption in Australian primary schools. Australasian Journal of Educational Technology, 23(4).

White, B., & Larusson, J. A. (2010). Strategic directives for learning management system planning. Educause. ECAR Research Bulletin. Retrieved from: http://www.educause.edu/library/resources/strategic-directives-learning-management-system-planning

White, G. (2008). ICT trends in education. Digital Learning Research, 2. Retrieved from: http://research.acer.edu.au/digital_learning/2

White, S. (2007). Critical success factors for e‐learning and institutional change—some organisational perspectives on campus‐wide e‐learning. British Journal of Educational Technology, 38(5), 840-850.

Behind the Screens – Developing a Digital Learning Literacy

Kevin Jenson

Colorado State University

Dr. Cynthia Stevens

May, 2015


Lifelong learning, the goal of higher education and the foundation of a free society, is inaccessible to many individuals around the world because they lack the literacy skills necessary to leverage the numerous resources available through digital learning technologies. Online learning experiences require individuals to have some level of learning fluency in order to maximize the opportunity. Lacking this, the explosion of information online has become more of a barrier than a bridge to the development of literacy skills among those who need them most. A solution to this problem would be to shift the focus of formal education onto the development of student learning fluency. Scaffolded development of digital learning literacies would empower students to maximize their online learning opportunities and develop the ability to continue learning on their own. Teaching students to fluently relate to information, technology, other people, themselves, and the learning process empowers them to transform any kind of external circumstance into a positive learning experience. Educators who aspire to this end must understand that technology is not a deus ex machina that solves every problem it encounters. Rather, student success in the online learning environment is all about what goes on behind the screens.

Keywords: literacy, technology, lifelong learning, higher education, education reform, information technology, education, internet, constructivism, self-directed learning, scaffolding, online learning, information, technology, interpersonal, intrapersonal, academic, learning fluency, accessibility, curriculum development

Download Link: jenson-k-behind-the-screens-developing-a-digital-learning-literacy

Behind the Screens:

Developing a Digital Learning Literacy

With the advent of the Internet and modern software, access to information and learning opportunities has exploded around the globe. However, the results of this overwhelming flood of information have left researchers wondering whether it really provides a solution to the problem of illiteracy. This synthesis explores the problems of online learning and proposes a solution: the development of a digital learning literacy.

Some have looked to technology as a deus ex machina solution to the problems of illiteracy, but studies show that its tendency is to accentuate the disparity that already exists between educated and uneducated individuals. Technology access, difficulties in learning the technology, and problems with the reliability and manageability of information have transformed learning technology from a bridge to a better future into a wall that jealously guards access to lifelong learning from many of the individuals who need it most. Increasing the amount and diversity of information an individual can access is helpful, but only when it is accompanied by knowledge of how to do something useful with that information.

Individuals require a unique form of digital learning literacy in order to succeed as students in an online environment. Research suggests five categories of learning fluencies that could be used to train students in digital learning literacy. Information fluency involves training students to relate to information resources effectively and is sometimes thought of as basic literacy. Technology fluency has evolved as a distinct field from information literacy that focuses on the unique skills required for students to develop the various learning fluencies in a digital environment. Interpersonal fluency describes the student’s ability to navigate relationships and communication as foundational to success in learning. Intrapersonal fluency revolves around cognitive, emotional, physical, and spiritual development of the student as the framework for the learning experience. Finally, academic fluency applies all other learning fluencies toward the evolution of a self-motivated, self-guided, self-assessed, lifelong learning experience.

Equipping students with such competencies has been shown to make them more effective learners and researchers have used this to argue for a stronger emphasis on developing these skills through the formal education process. Most commonly, teachers are encouraged to incorporate these fluencies into the regular classroom instruction, but some have begun to suggest that the entire learning experience needs to revolve around developing these fluencies. They are essential to the creation of lifelong learners and should be at the center and not the periphery of the student educational experience.

Learning technologies on their own may not provide a solution to the problem of illiteracy, but when paired with the development of a digital learning literacy they show great potential. After all, education is not just about words showing up on the screens of connected individuals around the world; it’s about what the individual students have learned to do with them by developing learning fluency behind the screens.

Lifelong Learning and Technology

Importance of Lifelong Learning

In the information age, where knowledge is constantly updated, the ability to continuously learn is vital to individuals, business, and citizenship (ACRL, 1989; Candy, 2002). To be effective, lifelong learning must become more than just a passive process of consuming information, knowledge, or experiences. “In times of rapid and pervasive change, an existing or static body of knowledge does not equip people with the ability to cope, much less to thrive and advance” (Candy, 2002, p. 3). Rather, the knowledge base of an individual must be under constant development in an ever-changing environment. Recognizing the need to equip individuals to survive in this dynamic environment, the ALA (2000) has stated, “Developing lifelong learners is central to the mission of higher education institutions.”

Learning and Literacy

According to UNESCO (2006), literacy and lifelong learning are interconnected: “Literate societies enable the free exchange of text-based information and provide an array of opportunities for lifelong learning” (p. 159). A report by Young, Macrae, Cairns, and Pia (no date) supported this claim by demonstrating the benefits of literacy education for social justice, health, economic development, and lifelong learning. Probert (2009) added that schools that want to prepare students for lifelong learning must have a strong focus on developing student proficiency with information literacy skills.

The most commonly used definition of information literacy was presented by the Association of College and Research Libraries in 1989: “To be information literate, a person must be able to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information.” Where individuals have mastered these skills, they can enjoy a “dynamic community…that exchanges ideas and engages in debate” (UNESCO, 2014). On the other hand, UNESCO continued, “Illiteracy…is an obstacle to a better quality of life, and can even breed exclusion and violence.” Clearly, strong literacy training that equips adults for effective life-long learning can have a wide-reaching impact.

The Achievement Gap

Unfortunately, for many individuals, a solution to illiteracy has been impossible to find. Literacy, even in its early days, was a tool that separated people into different categories of learning: those who could use the tools and those who had to rely on their own methods (Candy, 2002). In recent decades the gap in literacy access has continued to grow, including in the United States. “The evidence suggests,” said Candy, “that those people (and peoples) who fail to keep up with developments are likely to fall progressively further and further behind…” (2002, p. 4). Instead of serving as a bridge to a new life, for many people literacy has become a blockade preventing access to engagement in economic, civic, and social functions.

One example of the increasing gap in access comes from data published by the Higher Education Research Institute in 2015. Analysis shows that the number of students enrolled in higher education in the United States increased between 1980 and 2013. The percentage of A students increased from 26% to 52% of this number, while the percentage of C students declined from 15% to 3% during the same period. Meanwhile, the number of students who identified higher education as a means to financial stability increased from 62% to 82% over the same time period. These data confirm the increasing impact that a student’s ability to learn has upon his or her potential to earn a reasonable living. College marketing programs proudly display statistics about the difference in lifetime earnings between individuals with a college degree and those without. Sadly, this only accentuates the fact that students in the United States who fall behind in high school are at a greater risk of losing the opportunity for higher education and economic success than their parents were only two decades earlier.

The Golden Hand of Technology

Many individuals have begun looking toward learning technology as the answer to this problem of accessibility. Candy (2002) identified the impact of technology on learning through globalization, automatization, work requirements, family and community relationships, and the explosion of information. Much of this information has been curated by individuals and organizations offering informal learning experiences through Massive Open Online Courses (MOOCs). One of these, Connexions, was founded by Richard Baraniuk, who delivered an address in 2006 celebrating the way in which online information helped to remove the barriers students faced in accessing information. With access to the Internet, all the knowledge of the world is at the fingertips of any individual. Furthermore, Ghaffari (2008) identified this digitalization of information as helping students to focus on learning and participation rather than on searching for information.

Students themselves have gravitated toward online learning opportunities in search of accessibility (Agarwal, 2013). For some, accessibility means that the learning opportunity fits with their lifestyle, unlike local schools that require specific meeting times and locations (Ghaffari, 2008). Others appreciate the self-paced nature of many courses (Agarwal, 2013; ALA, 2000; Codreanu & Vasilescu, 2013). For others still, the financial price of online learning is more accessible than other learning opportunities. In the case of YouTube and other video lecture series, it might even be free. Although their focus is on accessibility, many of these students would be glad to know that online lectures are a good substitute for face-to-face lectures, according to a study by Wieling and Hofman (2010).

Besides accessibility, students give a variety of other reasons for choosing online educational options. Personalized, instant feedback is important to many students (Agarwal, 2013; Koller, 2012). However, Wieling and Hofman (2010) could not find any significant correlation between instant automated feedback and student performance. Peer support and interaction (Agarwal, 2013; Codreanu & Vasilescu, 2013) are other valuable aspects of the online learning experience and blend together with the idea of instant feedback to spur the recent development of gamified learning experiences.

Howard Gardner’s theory of Multiple Intelligences may explain some of the student interest in online learning opportunities. The greater range of teaching styles and learning experiences available online naturally appeals to a wider diversity of learning styles (Codreanu & Vasilescu, 2013). A study by Bhatti and Bart (2013) showed a significant relationship between learning styles and academic achievement. This is similar to an observation by Dewey (1938) that “those to whom the provided conditions were suitable managed to learn” (p. 53). If one free online lecture is not a suitable condition, it is very easy for students to find a more relevant teacher somewhere else online.

Problems in Online Learning

The preceding examples demonstrate that technology has revolutionized the learning experience in terms of access to information and opportunities to learn. Individuals who have the right background may be able to maximize the experience of online learning. On the other hand, it is impossible to say “go” and expect an inexperienced swimmer to figure out how to win a race. They will be too busy trying not to drown to achieve a goal.

Similarly, Kirschner, Sweller, & Clark (2006) have observed that students need guidance if they are going to do anything useful with the opportunities presented to them by online learning. Accessibility issues run much deeper than access to a computer and the Internet, information and knowledge are not the same thing as education, and most online learning experiences rarely transcend the lower levels of Bloom’s Taxonomy. Mere access to the rapidly expanding amount of information online does not guarantee that an individual will have a helpful learning experience. There must be some guiding metanarrative to the individual’s encounter with online learning opportunities.

John Dewey, in Experience and Education (1938), defined a positive constructivist learning experience as one that gradually builds on itself over time. Most online learning does not meet this criterion. Only a small percentage takes place in a formal environment (Merriam & Bierema, 2014), and this has led to a fragmented experience for many students. For these reasons and others, Young, Macrae, Cairns, and Pia (no date) have questioned whether e-learning holds the potential to include a greater number of individuals in education.

Technology Access

For many students around the world the most basic issue facing their access to learning is the lack of technology. For nearly a decade, education researcher Sugata Mitra conducted “Hole in the Wall” experiments. These involved placing one computer in an impoverished location in India and watching to see what happened over time. The results of this study showed that children were able to share their resources and develop their own opportunities for learning outside of a formal educational context (E. G. West Centre, 2015). Thus, a formal educational context is not necessarily a requirement for learning with technology. However, the experiment was conducted with children and may be more difficult to duplicate with adult learners.

Unfortunately, access to technology is not the only problem for many individuals in developing countries. Those who do have the technology available may not know how to use it or may find that it functions poorly. Al-Suqri’s study of university students in Oman (2007) showed limited accessibility of full-text resources, poor Internet connections where they were available, and relatively few sources in Arabic. But even in developed countries, training is not always available, as demonstrated by a survey of nursing students in the UK in 2004 (Kaminsky, Switzer, & Gloeckner, 2009). Though the situation has obviously changed since then, a study by Healey (2015) reiterated that access to technology resources is still limited.

Another issue of access was discovered in the poor areas of Scotland by Young, Macrae, Cairns, and Pia (no date). Their investigation showed that even when offered access to literacy education, individuals who need it do not necessarily value it and either avoid it altogether or drop out before making progress. Cieslik and Simpson (2006) also recognize that opportunity does not necessarily mean people who need help will take it. A study by Heath and Miller (2012) showed that perceptions of the benefits of online learning can vary based on culture, environment, and age. Not everyone is looking for the same thing. Furthermore, by the time latecomers arrive on the scene eager to participate in online learning, there are already hierarchies and power structures in place that may be difficult, if not impossible, for them to navigate (Charlton, no date). For this reason, Mitra (2007) has recommended that technology be introduced in advance of the ideology surrounding its use so that it can develop naturally rather than along predefined pathways.

Despite the growth in access to learning experiences through technology (Agarwal, 2013), it remains to be seen whether this can be used to develop information literacy instead of becoming yet another barrier to learning for those who need it the most. According to the ACRL (1989), those who most need the information are least likely to have the resources to access it. Even though a growing number of individuals have received access to technology, and now information, the OECD has recognized that this access must be complemented by certain technical skills if students are to be able to use the information that becomes accessible (Virkus, 2003). Without these skills, the sheer volume of information may only serve to make literacy access more complicated and confusing.

Technology Use

A study cited by Merriam and Bierema showed that “[students feel] overwhelmed and excluded…in a Higher Education culture-a culture that speaks a ‘foreign language,’ which follows procedures that are unfamiliar and not understood (Smears, p. 108)” (p. 129). It is in the middle of this bewildering context that students are expected to acquire fluency in information literacy and technology (Candy, 2002). “Most online settings are characterized by minimal guidance, which require learners to be more autonomous and self-directed…” (Rienties, Giesbers, Tempelaar, Lygo-Baker, Segers, & Gijselaers, 2012). This has the tendency to accentuate the difference between learners who are fluent with technology and those who are not.

DeValenzuela (2001) observed that in the regular classroom, learners who struggle to develop the necessary literacy skills are often left behind by their teachers. Individuals with cognitive impairments are treated as having no potential for development. The same thing happens to students who lack the technical skills to keep up with the online learning experience, but their absence is less visible. Thus, Mokhtar, Majid, and Foo cited studies to show that "furnishing schools with advanced technology does not necessarily mean the students and teachers are sufficiently information literate to use those tools effectively (Grafstein, 2007 and Usluel, 2007)." This becomes a glaring problem given that a study by Holder, Jones, Robinson, and Krass (2006) showed that the level of student literacy could serve as an effective predictor of academic performance in a university degree program.

Surprisingly, Probert (2009) found that very few teachers know how to develop their students' skills in information literacy, and Candy (2002) found that those who do offer information literacy training seldom go beyond teaching basic reading or technology user skills. To counteract this lack of training, Morgan (2010) tried to link classroom education methods with online experiences that students may already be familiar with. A similar approach by the University of Chester Business School led to the discovery that even a distraction like Facebook can be used as a learning tool (Page and Webb, 2013). However, as the dominant modality of learning communication shifts from written to visual (Charlton, no date), such training may become more difficult. As the ASI (2015) reported, technology now outpaces learning skills, making it nearly impossible for teachers or students to depend on any particular type of fluency.

Information Problems

A third problem with online learning is that more information does not necessarily translate into the ability to use information (ALA, 2000). Much of the information created and shared online comes to individuals through multiple "unfiltered" channels (ALA, 2000; Candy, 2002). In 1989 the ACRL noted: "in an attempt to reduce information to easily manageable segments, most people have become dependent on others for their information" (ACRL, 1989). In the same year that statement was published, the Internet was born, leading to an exponential increase in the amount of available information, much of it of questionable reliability.

This may be part of what prompted the OECD to emphasize the importance of skills in “accessing, handling and using data” rather than just remembering information from certain fields of knowledge (Virkus, 2003). Mokhtar, Majid, and Foo (2008) supported this conclusion by saying,

The information explosion has created the need for more guidance in the evaluation, selection, and use of information (Foo, Chaudhry, Majid, & Logan, 2002). Thus, even with the widespread availability of the Internet, students still need guidance and coaching on how to use online information effectively (Halttunen & Jarvelin, 2005).

One example of this is the need for students to recognize the economic, legal, and social issues surrounding their use of information (ALA, 2000).

Unfortunately, the legal framework for information management has not kept pace with developments in technology and information use. In his 2006 presentation, Baraniuk noted that contemporary models of collaborative development and sharing are actually not allowed under the current legal framework for information ownership. This reinforces the scenario in which information technology creates an "increasingly fragmented information base…components of which are only available to people with money and/or acceptable institutional affiliation" (ACRL, 1989). Baraniuk, however, looks optimistically to the development of fair use as a potential solution to this barrier to accessibility.

Information is Not Education

If students are able to make it past the hurdles of accessibility, technology use, and information problems, they discover that information does not equal education. In the days before Google, the two were more closely aligned. However, Ghaffari (2008) said, "The information age has changed the role of educators from one of disseminators of facts and theories to one of facilitators who assist students to become lifelong learners." A similar argument could be made for learning platforms. Their value consists not in their ability to distribute information, but in their ability to provide a learning experience that reproduces on-campus results (ALA, 2000).

In order to accomplish this, many colleges have attempted to replicate the offline learning experience in an online environment (Codreanu & Vasilescu, 2013). “Education really hasn’t changed in the past 500 years,” said Agarwal (2013), but now Codreanu and Vasilescu (2013) have suggested that the e-learning environment challenges the traditional practice of teaching and learning as conducted in an offline environment. Zucca (2013) echoed this sentiment by criticizing the attempt of universities to recreate the classroom experience online. As an alternative, Zucca argued that working adults need a unique online learning experience.

This recommendation is in line with an article by Steffens (2015) arguing that the purpose of learning is far deeper than simply knowing. To support this argument, he cited the European Commission, which said, "…Learning to know encompasses more than acquiring knowledge, it also includes the desire to know, a positive attitude towards learning and learning to learn" (p. 54). Parents, educators, and business leaders all want better "thinkers, problem solvers, and inquirers," said the ACRL (1989). An online learning environment that is built around distributing information will not produce these kinds of results. It can produce workers with knowledge and technical skills, but in the workplace these account for only 25% of success (Shawn Achor as cited in Barker, 2014).

It is obvious from these difficulties of accessibility, technology, information, and education that technology is not necessarily the solution to the problem of lifelong learning and literacy development. According to Healey (2015), “Technology does not create good rubrics; a knowledgeable teacher creates good rubrics” (p. 57). Similarly, technology does not create a good student; a good student makes good use of technology. According to Schocken, in an online environment “educators don’t necessarily have to teach. Instead, they can provide an environment and resources that tease out your natural ability to learn on your own. Self-study, self-exploration, self-empowerment: these are the virtues of a great education” (2012). Regrettably, students can only take advantage of this great education if they already have some skill in literacy – particularly in digital learning literacy.

Digital Learning Literacy:

A Possible Solution

Digital Learning Literacy can be Taught

According to Harvard Psychologist Shawn Achor, “Most people accept that they’re just born some way and that’s how they’re going to be the rest of their life, and whatever they were last year is what they’re going to be this year. I think positive psychology shows us that that doesn’t actually have to be the case” (as cited in Barker, 2014). However, early external factors like the accessibility of learning experiences can influence future attitudes and academic performance of students (Cieslik & Simpson, 2006). As noted previously, students who struggle with literacy in the early stages of their education generally tend to fall further and further behind.

Unfortunately, the structure of education is such that students who fall behind are not given the opportunity to improve their skills. Cieslik and Simpson (2006) provide an example of a dyslexic learner who was excluded from learning opportunities because of conditions outside of her control. In order to succeed, she needed personalized training to overcome her barrier to learning. Similarly, every student brings individual obstacles to the learning environment. With support and training, they can overcome these. However, they must first experience a mindset shift.

Carol Dweck of Stanford University has said,

In a fixed mindset students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or anyone can be Einstein, but they believe everyone can get smarter if they work at it. (as cited in Clear, 2015)

Students' abilities to learn in an online environment are not fixed; learners simply have different starting points and strengths. Online learning partially caters to this with its variety of learning opportunities and teaching styles. Student skill levels may limit access to these opportunities, but "…skill is something you can cultivate, not merely something you're born with" (Clear, 2015). Supporting this idea from a cognitive perspective, Mokhtar, Majid, and Foo (2008) cited Feuerstein's postulation "that intelligence is dynamic and variable, not static or fixed from birth […] intelligence can be modified through a mediator if given the right stimulation and environment (Feuerstein, 1980)." They also pointed to an improvement in test scores when students were exposed to pedagogy that catered to Gardner's multiple intelligences.

Finally, Murray (2000) showed that high-achieving students demonstrate high levels of self-regulation. However, low-achieving students, when properly motivated, show an increase in self-regulation. This indicates that internal factors like self-regulation can be trained. In an online learning environment that depends heavily on such skills, this is good news. It is possible to teach students how to maximize the opportunities to learn that come with access to digital technology, and the results of doing so have been very positive.

Results of Training

Self-directed and constructivist learning theories advocate for the process of learning by discovery, but studies have shown that this kind of learning is only successful when students have previous experience or information to build on (Kirschner et al., 2006). Similar observations by Rienties et al. (2012) showed that certain students performed well in a less structured environment (e.g., online) while others had greater difficulty. When the structure was increased in their experimental study, the first group of students began to disengage from the learning experience, but the second group finally had the support they needed to learn from the experience.

A similar example by the ACRL (1989) described a public library in which individuals can access a wide variety of information. Students who are familiar with the process of research would not enjoy the supervision of a librarian. It would restrict their experience. However, new researchers would welcome such personalized guidance until they had developed their own internal abilities. The role of the public library, according to the ACRL is not just to provide information, but also to provide people “with the knowledge necessary to make meaningful use of existing resources” (1989).

Some teachers have applied this idea to their classrooms, creating a blend of online and in-person learning experiences called a "flipped classroom." Research has demonstrated that this kind of learning experience produces better results than either online-only or classroom-only learning models. An example of a flipped classroom is one in which information is accessed and practiced online, then applied in the classroom with guidance, interaction, and access to resources. In one test comparing flipped classrooms to general classrooms, failure rates in a particular class fell from 40% to 9% (Agarwal, 2013). Notably, although Agarwal first suggested transitioning learning entirely online, the evidence he offered to support the suggestion demonstrated the benefits of blended instruction.

Another study by Codreanu and Vasilescu (2013) concluded that adult learners needed previous training in order to effectively maximize the online learning environment. However, they also suggested that the learners be given more opportunity to personalize the experience. Like the observations of Cieslik and Simpson (2006), they recognized that every student has a different obstacle to overcome in order to effectively utilize technology for lifelong learning.

Digital Literacy

As the picture has unfolded so far, lifelong learning has been shown to be heavily influenced by an individual’s fluency with information literacy skills. “Information literacy and lifelong learning are inextricably intertwined,” said Candy (2002, p. 6). The age of the Internet has made information more widely accessible, but not necessarily the literacy skills to make good use of it. Online education has not successfully transitioned beyond a platform for the presentation of information and so requires that students bring a certain amount of technological and information literacy to the learning experience. It is possible to train students to develop the literacy skills they need to maximize the online learning experience. However, these skills have not been clearly defined.

In 1989, the ACRL pointed out that there were so many different kinds of literacy that the former definition, knowing how to read, had to be expanded. Since then, developments in technology have pushed the definition of literacy even further. Some have grouped all the various literacy types (cultural, information, technology, media, etc.) under "information literacy," but many others have created separate categories for information literacy and the various other kinds. The International Panel on ICT Literacy that convened in the United States defined ICT (information and communication technologies) literacy as "using digital technology, communications tools and/or networks to access, manage, integrate, evaluate, and create information in order to function in a knowledge society" (ICT Literacy Panel, 2002, p. 2, as cited in Candy, 2002). While this and other definitions like it describe the results of literacy, fewer studies identify the process of developing literacy.

According to Clear (2015), identity-based habit development has more potential for long-term transformation than results-based methods. Education is a results-based environment, but this orientation can be counterbalanced by the development of internal factors. As seen previously, if students are going to be empowered to learn online, they need to begin with internal change by challenging their mindsets.

Simon Schocken (2012) has said that students also need a set of tools they can experiment with in the online learning environment. According to Nazari and Webber (2012), however, these tools cannot be rigidly defined, because the complex and evolving online environment will not conform to generic models. Instead, they argued for teaching students how to develop their own set of literacy skills to meet their particular needs as individual learners.

Kirschner (1999) created a list of some competencies that students may choose from. These include:

…the ability to operate in ill-defined and ever-changing environments, to deal with non-routine and abstract work processes, to handle decisions and responsibilities, to work in groups, to understand dynamic systems, and to operate within expanding geographical and time horizons. In other words, competencies are a combination of complex cognitive skills (that encompass problem solving, qualitative reasoning, and higher-order skills such as self-regulation and learning-to-learn), highly integrated knowledge structures (e.g., mental models), interpersonal skills and social abilities, and attitudes and values. In addition, competencies assume the ability to flexibly coordinate these different aspects of competent behavior. (as cited in Virkus, 2003)

The skills listed here and others besides them can be divided into five basic categories of fluency that the literature identifies as necessary to student success in life-long learning.

These categories are information fluency, technology fluency, interpersonal fluency, intrapersonal fluency, and academic fluency. Unsurprisingly, their application to the online learning environment is matched by their ability to empower students to learn in any environment. As demonstrated earlier, it is not the environment that determines the student’s ability to engage. It is the student that determines the value of the environment to create a positive learning experience. A study published by Rienties et al. (2012) concluded that students needed some level of training in online learning at the beginning of the experience, but that it could be gradually removed over time to allow more autonomy in learning. As time progresses, the learner should be able to function in increasingly diverse and unstable environments both online and off. This kind of digital learning literacy can be developed by teaching students to master the five learning fluencies.

Five Learning Fluencies

Information Fluency

Information literacy and information fluency are used interchangeably throughout the literature and are often substituted with other terms. Occasionally researchers use the term information fluency, or IL (information literacy), to refer to all five of the learning fluencies identified here. Most often, however, information fluency refers to the ALA's 1989 definition of information literacy: recognize a need for information, identify what information is needed, find and evaluate information, and apply it to some purpose (ACRL, 1989; ALA, 2000). Webber and Johnson expanded on this by defining literacy as,

effective information seeking; informed choice of information sources; information evaluation and selection; comfort in using a range of media to best advantage; awareness of issues to do with bias and reliability of information; and effectiveness in transmitting information to others. (Candy, 2002, pp. 6-7)

This last definition transcends the limits of information fluency to touch on aspects of interpersonal and academic fluency, demonstrating that none of the categories are mutually exclusive. Aspects of each one are interrelated with aspects of the others. In general, however, digital learning literacy skills tend to fall mostly into one of the five categories identified here. Information fluency deals with the student's relationship to information and data, as distinct from the student's relationships to technology, other people, the self, and the learning process.

In addition to the descriptions given above, information literacy research tends to include skills clustered around the observation (Merriam & Bierema, 2014) and organization (ACRL, 1989; ALA, 2000; Huvila, 2011; Al-Suqri, 2007) of information.

The usefulness of observation is made clear by an example from the ACRL that showed how a company saved itself from a lawsuit by finding information about patents relevant to its product. Another example occurs frequently when organizations spend months or even years searching for a solution to a problem that has already been solved. Sadly, much time and money is wasted because "many companies do not know how to find and use such information effectively" (ACRL, 1989).

Information fluency also includes such basic literacy skills as reading ability (ASI, 2005), research skills (Candy, 2002), finding resources (Candy, 2002), knowledge and skill in accessing information (ALA, 2000; ACRL, 1989), and screening this information for reliability (ALA, 2000; ACRL, 1989). Screening depends heavily on critical thinking, an aspect of organizing information, and is vital to the self-governance of a people (ACRL, 1989). In a democracy, the process of finding reliable information cannot be outsourced.

Critical thinking may be the most significant aspect of information literacy, as its importance is emphasized by multiple studies (ALA, 2000; Hsiao, Chen, & Hu, 2013; Kaminsky et al., 2009; Mokhtar, Majid, & Foo, 2008; Sousa, 2011). Critical thinkers ask questions like "How do you know that?" "What evidence do you have for that?" "Who says?" and "How can we find out?" (ACRL, 1989). Hsiao, Chen, and Hu (2013) have integrated critical thinking with technology fluency by proposing that critical thinking and problem solving be used as grading factors in online discussion.

Two other aspects of information fluency identified by the landmark 1989 study by the Association of College and Research Libraries are the ability to manage large amounts of information effectively and the ability to integrate information. Other authors include higher levels of Bloom's Taxonomy of learning, such as the application (ACRL, 1989; ALA, 2000; Candy, 2002; Sousa, 2011), synthesis (Baraniuk, 2006; Kaminsky et al., 2009), and creation (Huvila, 2011; Probert, 2009; Sousa, 2011) of information, in their definitions of information fluency. These are the ultimate goals of academic fluency and are explored further under that category.

Literacy in general, and information literacy in particular, has been linked by many studies to improvements in academic performance (Mokhtar, Majid, & Foo, 2008). Other factors like knowledge acquisition and application, as well as cognitive complexity, are identified by Kuh, Kinzie, Buckley, Bridges, and Hayek (2006) as having an impact on student performance after college. As seen earlier, lifelong learning and literacy skills are closely related, so education on any aspect of literacy development should demonstrate these kinds of results.

Technology Fluency

Technology fluency is so foundational to ongoing student success that some universities have begun to require proficiency in information technology skills for graduation (Kaminsky et al., 2009). Technology is the crossroad where the other four fluencies meet and find their functionality impaired or empowered. Thus, the International Society for Technology in Education (ISTE) standards have suggested that teachers "encourage leadership by demonstrating a vision of technology infusion, participating in shared decision making and community building, and developing the leadership and technology skills of others" (as cited in Healey, 2015, p. 55). Technology fluency will impact every other aspect of learning fluency.

Regrettably, the research discussed earlier demonstrates the impossibility of limiting the definition of technology fluency to one particular skill set. Nearly every field of study has its own unique requirements for technological mastery and function within the learning process. However, there are some basic processes that apply in a majority of situations, for example, the use of library databases as part of information fluency or of communication platforms as part of interpersonal fluency. In these examples technology is positioned as an addition to the other fluencies. However, it is often very difficult to transfer skills from the offline world online (Candy, p. 8). This may be part of the reason why information literacy and technology literacy are usually treated separately in research reports like Mokhtar, Majid, and Foo's 2007 study on the integration of IT into Singapore schools.

The integration of information technology into the learning process consists of two distinct processes. Part one is all about learning how to use the technology. It includes basic IT, computer use, file management, word processing, spreadsheets, databases, presentations, information, and communication (Candy, 2002). According to Oliver and Towers (2000), the aspect of learning how to use technology is a necessary skill in itself (as cited in Candy, 2002). Healey (2015) built on this idea by identifying the transfer of this ability to multiple platforms as another significant step toward technology fluency.

The second part of technology fluency deals with applying skills of technology use to the learning process. The ALA (2000) cited a 1999 report from the National Research Council, which underscored the difference between knowing how to use technology and knowing how to apply that knowledge toward some educational purpose. Kaminsky et al. (2009) highlighted this same contrast in their study on student fluency with information technology. On the one hand, it is assumed that students know more than faculty about technology. On the other hand, it is assumed that faculty do not have time to teach basic information skills. The marriage of these two problematic assumptions leads both to unprepared students and to faculty who expect higher levels of technical performance.

However, just because a student can use technology does not mean he or she can use that technology to make a positive impact on the learning experience. A study by Akhter (2013) demonstrated that it is not enough simply to know how to use the Internet. In fact, there is a negative correlation between excessive Internet use and academic performance. Other aspects of technology fluency like sociability, utility (ability to function with a given tool), and reciprocity (need for engagement) all affect the user's ability to maximize online learning (Codreanu & Vasilescu, 2013). Thus, Kuh, Kinzie, Buckley, Bridges, and Hayek (2006) have warned that it is important to gauge learner readiness to benefit from engagement in technology-intensive learning experiences.

As with all five of the learning fluencies, educators should never assume that their students have a certain level of proficiency. They should not only provide opportunities to learn the skills, but should also stretch students’ technological abilities by requiring students to use them regularly (Virkus, 2003). Online learning experiences that require some level of technical fluency may become more effective by offering training in this particular skill.

Interpersonal Fluency

Interpersonal fluency deals with relationships between individuals and can be an important predictor of academic achievement (Barker, 2014). The "Hole in the Wall" experiment by Sugata Mitra showed that children could learn how to use technology with no instruction, but only in the context of relationships with others. A study by Andretta (2007) supported this significant role of interpersonal fluency by showing how the relational aspects of learning were vital to developing a practice of lifelong learning. Merriam and Bierema (2014) offered further explanation by showing how interaction between the person and the social world creates a connection to emotion. This connection may be one reason why Codreanu and Vasilescu (2013) noted that the social roles of adults are a strong indicator of their motivation to learn.

Relationships are an important aspect of learning, especially when individuals encounter new cultural environments and perspectives. Building connections between cultures may be part of Baraniuk’s reference to teaching as the process of making connections between disparate ideas (2006). Not only do individuals need to understand the influence that their culture has on their learning experience (Gruber & Barron, 2011), they need to understand and navigate the various cultural differences they will encounter in their diverse social, academic, and institutional relationships (Mokhtar, Majid, and Foo, 2008; Richardson et al., 2012).

Digital communication technologies have made cultural navigation even more significant by giving individuals' communication a broader and longer-lasting reach. Email, graphics, web design, VOIP (Voice over IP), and other avenues of digital communication have made technology use an indispensable aspect of interpersonal fluency (Candy, 2002; Charlton, no date).

Relational interaction and communication are important factors of student success as identified by ACRL (1989) and Sousa (2011). Barker (2014) offered one possible explanation for this link by showing how social connection is necessary for optimism and stress reduction. Vann (1996) applied this idea to human resource development programs adding that individuals with a strong network of relationships demonstrate greater levels of self-direction.

Finally, a report by The Boyer Commission ties multiple ideas of interpersonal fluency into a pedagogy design that engages students in “the framing of a significant question…research or creative exploration to find answers, and the communication skills to convey the results” (as cited in ALA, 2000). The purpose of this design is to give students experience with interpersonal fluency that they can apply to learning experiences beyond the educational environment.

Intrapersonal Fluency

Transformative and humanist learning theories focus the educational experience on the development of the individual. This focus is helpful, since individuals' mindsets toward the learning process and toward their own abilities play a significant role in their success at developing literacy skills and becoming lifelong learners.

Eric Barker (2014) cited research by Shawn Achor that suggested,

If we can get somebody to raise their levels of optimism or deepen their social connection or raise happiness, turns out every single business and educational outcome we know how to test for improves dramatically.

Likewise, Kurbanoglu (2003) discovered a correlation between student self-efficacy and "beliefs regarding information literacy and computers…."

Richardson et al. (2012) identified perceptions of self-efficacy as predictive of student performance. Their study identified five key measures of personality that may affect academic achievement: "conscientiousness, openness, agreeableness, neuroticism, extraversion…." Among these, the researchers identified conscientiousness as one of the most influential personality factors on academic performance as measured by GPA (2012). The study confirmed previous conclusions that effort regulation was one of the strongest correlates of GPA and that procrastination may be the greatest negative correlate of academic achievement.

Other significant aspects of intrapersonal fluency include reflection (Andretta, 2007; Brookfield, 2013; Merriam & Bierema, 2014), development of cognitive processes (Kirschner et al., 2006; Richardson et al., 2012), responsibility (ASI, 2005), purpose (ASI, 2005; Codreanu & Vasilescu, 2013), motivation (Codreanu & Vasilescu, 2013; Inan, 2013; Richardson et al., 2012), and persistence (ASI, 2005). Sousa (2011) has argued that the arts should be considered a fundamental part of the learning experience because of their profound impact on the emotional, mental, cultural, and physical development of the individual (p. 217). Research studies showed that meditative practice in higher education can lead to "the enhancement of cognitive and academic performance, the management of academic-related stress, and the development of the 'whole person'" (Shapiro, Brown & Astin, 2011, p. 496, as cited in Merriam & Bierema, 2014, p. 138). Finally, Villavicencio and Bernardo (2013) studied the impact of emotions on academic performance and found that joy and pride can moderate a student's self-regulating behavior and subsequently impact the student's academic achievement.

Academic Fluency

The last of the five learning fluencies, academic fluency deals with the student’s relationship to the learning process. It is the catalyst that moves the student from lower order to higher order thinking skills as exemplified by Bloom’s Taxonomy (ALA, 2000). It is partly influenced by Kirschner’s recognition of a difference between the processes required to learn a subject and the processes required to perform within that field (as cited in Kirschner et al., 2006). Academic fluency ties all the other learning fluencies together in a process of learning how to learn.

According to the ALA (2000) and Sousa (2011), students need to learn how to direct their own process of discovery and education. Inan (2013) agreed on the basis of a study in Turkey that showed a significant positive correlation between self-regulated learning strategies and higher grade point averages. Chief factors influencing student performance included motivation, planning and goal setting, and strategies for learning and assessment. Richardson et al. (2012) also observed that self-regulation was an aspect of academic performance that could be enhanced. Additionally, they found that goal setting, an aspect of self-regulation, was one of the strongest correlates of GPA. Self-regulation is also important to developing the habit of lifelong learning (Clear, 2015). Demonstrating the significant interaction among the various learning fluencies, self-regulation is greatly influenced by emotions (Villavicencio & Bernardo, 2013) and relationships (Vann, 1996).

Other aspects of academic fluency include adaptability to multiple environments (ACRL, 1989), curiosity applied to some end (ACRL, 1989; Codreanu & Vasilescu, 2013), goal setting (Inan, 2013; Richardson et al., 2012), self-assessment (Codreanu & Vasilescu, 2013; Inan, 2013), self-engagement (ACRL, 1989), and problem solving (ACRL, 1989; ALA, 2000; Candy, 2002; Sousa, 2011). The ACRL (1989) identifies problem solving as the driving force behind the process of information literacy: “…information literate people know how to find, evaluate, and use information effectively to solve a particular problem or make a decision.” Craik and Lockhart (1972) promoted a similar idea years earlier, called active learning, in which individuals increase their retention through interaction and depth of mental processing (as cited in Agarwal, 2013).

Mokhtar, Majid, and Foo (2008) created a model of developing academic fluency that culminates in a process of specialization in one area after another. This development of mastery proceeds from the practical to the technical, contextual, and then constructive (or creative) levels. Creativity is a valuable skill for the lifelong learner in the 21st century, according to Probert (2009) and Sousa (2011). Huvila (2011) supported this claim by decrying information literacy’s focus on finding information rather than on creating it, identifying this last process as a central aspect of information literacy. Along the same lines, Koller (2012) suggested, “maybe we should spend less time at universities filling our students’ minds with content by lecturing at them, and more time igniting their creativity, their imagination and their problem-solving skills by actually talking with them.” Adding its voice to the call for a focus on creativity, one of the ISTE Standards draws upon Bloom’s Taxonomy and Gardner’s Multiple Intelligence Theory to support this emphasis (Healey, 2015). Creativity is an essential aspect of academic fluency.

Building on this, Baraniuk (2006) presented a model of learning through technology as an ongoing process of collaboration. In this process students discover ideas, break them apart, restructure them, share them, and build upon them in an ongoing evolution of ideas and information. To participate in this kind of advanced model of learning, students must develop their academic fluency. Educators who are aware of the importance of this process could begin to guide students in how to participate in the conversation of learning through creativity.

Academic fluency is unique in that it highlights those skills that students need to transition from the guided experience of formal learning to the self-directed experience of life-long learning. None of the fluencies, though, can be fully developed on their own. Students do not suddenly begin to demonstrate the learning fluency required to develop a digital learning literacy. Usually they require a catalyst like a dedicated learning environment or perhaps individualized guidance from someone who is experienced with the process of learning.

Implications and Recommendations

Students Need Training

In the days when beakers and Bunsen burners were the latest classroom technology, chemistry professors believed that their students needed instruction on how to make the most of these instruments. Today, when online platforms for communication and information distribution are the latest classroom technology, professors should not expect their students to figure things out for themselves. The internet is not as fragile as a glass beaker, but it can be just as complicated, and just as hazardous, to use effectively in the learning process. Lack of training in basic fluency skills like these prevents students from continuing the learning process once they leave formal education. It has also contributed to a college readiness crisis identified by Kuh et al. (2006), in which many high school graduates are unprepared for success at the college level.

It is not enough to expect students to know how to learn. Kirschner et al. (2006) argued against the popular trend toward constructivist, problem-based, experiential, and inquiry-based teaching on the basis of the cognitive processes involved in learning. Students should not be placed in settings for which they have no cognitive framework to explore. In fact, empirical research has shown that instructional techniques offering minimal guidance are less effective than those “designed to support the cognitive processing necessary for learning” (p. 76). Although this runs counter to the popular trend toward independent, self-directed online learning, the research supports their conclusion. Students need to learn how to learn in order to turn their use of technology into an educational experience. This is not a new argument.

In 1989, the Association of College and Research Libraries made the argument for restructuring the education experience around teaching students how to use information effectively across multiple contexts. Their report declared that teaching students how to “gather, synthesize, analyze, interpret, and evaluate information…should be central, not peripheral…” to the learning experience (ACRL, 1989). More recently, UNESCO has embraced a mission of “training teachers to sensitize them to the importance of information and media literacy in the education process, enable them to integrate information and media literacy into their teaching and provide them with appropriate pedagogical methods and curricula” (UNESCO, 2014). In order to ensure access to lifelong learning, higher education curricula must incorporate training in learning fluency.

Redesign Formal Education

Numerous studies have indicated the need to develop this kind of student proficiency in learning, and a few have focused on the methods teachers use to communicate these skills. However, teachers are often limited in this exploration by the requirements of their context. Even students who seem to be gifted academically do not necessarily receive further training in learning fluency. Kettler (2014), for example, showed that students in gifted programs did not demonstrate higher levels of critical thinking than other students. As discussed previously, critical thinking skills are essential to student success, but even the program for gifted students did not specifically focus on developing this skill. Kettler recommended including explicit instruction in this learning fluency for gifted students.

Usually, if education programs include such teaching, it is incorporated into other instruction or provided incidentally at some point in time. Mokhtar, Foo, and Majid (2007) found that this was not sufficient: students need ongoing exposure and guidance in information literacy if they are to master such skills. Kaminsky et al. (2009) demonstrated that students’ perceptions of their technological ability increased for functions that were a regular part of their educational experience but decreased for those they had not practiced. This shows the importance of consistent practice with learning fluency if students are to develop the confidence to use it on their own. Kaminsky et al. (2009) recommended that professors require students to expand their technology proficiency through assignments.

Yet, integrating the practice of learning fluency into the study of other subjects may not be the only way to effectively impart such skills to students. Webber and Johnston (2000) have suggested a more overt focus on learning fluency than that provided by the curriculum integration model. They suggested creating an academic discipline around the subject of learning fluencies (as cited in Virkus, 2003). In this model, students study the art of learning along with their other subjects.

Institutions that are not ready to add another discipline to their list of offerings may find a happy compromise in the suggestion of scaffolded training. Larkin (2002) proposed this kind of instruction that provides “…a supportive environment while facilitating student independence.” As students begin to demonstrate mastery of the learning process, the teacher gradually provides less support. A study by Barron (2004) showed how this might be effective by observing that experienced students used a greater number of learning resources to achieve their educational goals (as cited in Kaminsky et al., 2009). As students become more experienced they depend less upon the teacher as a resource and more upon their own ability to discover and develop resources to guide and support their learning.

If given the opportunity to master the five learning fluencies in the context of in-person learner-teacher relationships, the students who leave institutions of higher education will be far better equipped to maximize the opportunities for lifelong learning available online. By the time they graduate, students could be functioning almost entirely on their own, able to apply their learning ability to nearly any environment.

The Role of the Teacher

A change in the design of the learning experience implies a simultaneous change in the role of educators. As the availability of online resources grows, the function of the teacher has begun to shift away from “subject expert” to “learning expert.” As Codreanu and Vasilescu (2013) suggested, the teacher is becoming less of a lecturer and more of a “facilitator and resource person.” In a world in which the growth of information outpaces the human ability to learn, such a transition is increasingly urgent.

Several models are available for teachers who want to attempt this transition. The first was explored by the Great Books programs for liberal arts education in the early 1900s and continues to this day in several small pockets of education (Rivendell, 2014). In this model, teachers guide students through an encounter with information presented by the authors of the writings that influenced Western civilization (Jenson, 2014). They help the students master their information, interpersonal, intrapersonal, and academic skills in the process. Developments in technology have added a fifth dimension to the experience of learning and have helped to streamline the process.

Another model is offered by the Swedish education system, which partners with librarians to help students develop information literacy. Norway has also begun to change the function of its library system by repositioning it as a learning center rather than just an information resource (Virkus, 2003). Researchers from Singapore have suggested a similar function for librarians that focuses specifically on helping students develop technology fluency (Mokhtar, Foo, & Majid, 2007). They recommend creating an entirely new teaching position focused on training students in learning fluency.

These teachers will be supported by subject experts like Simon Schocken who “laid out everything on the Web and invited the world to come over, take whatever they need, and do whatever they want with it” (2012). His online learning experience gave students all the information and instructions to learn. Those who had the right background were able to use it effectively. Those who did not might wish that they had been given the opportunity to develop their learning fluency as defined above.

Numerous online resources for sharing information, communicating with students, and facilitating learning experiences will also support the new teacher of the 21st century. Classroom administration, social platforms, communication technologies and assessment management tools will provide in-depth interaction around student progress and help to create personalized learning plans. Driven by this vision, a team from Singularity University recently designed an assessment platform called Cookie to analyze student performance from multiple angles and offer personalized recommendations on improvement (Singularity University, 2014).

Without teachers and students who know how to use it effectively, such technology has limited value. However, when paired with teachers and students who understand the importance of mastering the learning process, technology like this has the potential to surpass the performance already demonstrated by the flipped classroom. Developments like this reach toward the beginnings of a personalized, customized, and highly effective education experience that never ends. Instead, as students become more confident in their ability to learn, the supports of formal education are removed and they continue the process guided by their own internal ability to learn.

Future Research

Learning fluency may be the key to unlocking the potential of the information and learning opportunities available through online technology. Each of the learning fluencies identified above has received support from numerous studies showing its impact on student performance. Beyond the aspects of learning fluency presented here, though, many other factors have been shown to influence student success. Those included in this study were identified as having a significant impact on student literacy or performance in an online learning setting. The purpose of including them here was to provide evidence for the usefulness of the five categories of learning fluency. However, more research is needed to create an exhaustive list of the features included in each category.

Thorough searching did not uncover any experimental studies on the impact that teaching these learning fluencies as a whole set could have on student academic performance in an online environment. Perhaps this is due to the novelty of designing a curriculum around them. If such a curriculum were designed, it should be scientifically tested to discover whether it has the potential to improve student performance. If a significant correlation were found between mastery of these learning fluencies and student performance, it could pave the way for a more comprehensive model of learning and assessment that takes the whole student into account. It may also introduce a model of education that reduces the disparity in student performance rather than emphasizing it. The author is currently conducting a pilot study to discover whether student mastery of learning fluency can help to reduce the achievement gap in student performance.

Resources used for this particular synthesis were primarily focused on adult learning and higher education, but there are several indicators that it would be more valuable to teach these skills to younger students. Experiments by Sugata Mitra, whose studies were mentioned previously, show incredible promise for helping children master basic literacy and learning skills through a blend of local and internet technology (E. G. West Centre, 2015). The Montessori Method (Montessori, 2004) is still employed with preschool children around the world and continues to prepare young children for academic success. It would be incredibly helpful to discover whether learning fluency might be even more valuable to students if it is taught at a younger age.

In order to put the ideas of this study into practice, more focused research is needed to pinpoint those aspects of learning most crucial to developing the learning fluency needed for ongoing learning outside the classroom. Practical resources are needed for teachers to incorporate learning fluency into curriculum or to begin teaching it as its own discipline. Supporting research is needed to encourage administrators and policy makers to begin repositioning the education system around the development of individual life-long learners rather than the transfer of content. Finally, online learning opportunities might make themselves more competitive and useful if they incorporate the development of learning fluency into the experience. Perhaps some could be designed entirely around helping students develop the internal skills of digital learning literacy so they can effectively use the other resources available through the internet.


In summary, the relationships that students have with information, technology, other people, themselves, and the process of learning should not be left to develop on their own. Whether they constitute a discipline of their own or play a greatly expanded role in the current structure of higher education, these learning fluencies need to take their place at the heart of the education experience. Without the development of this kind of digital learning literacy, students will be unable to maximize the opportunity presented by the explosion of information and learning programs available through the internet.

Although access to information is an important step to developing literacy, individuals need guidance on how to use this information to create a positive, connected educational experience. Educators can train students in the various learning fluencies as a way to develop their digital learning literacy. Information fluency, technology fluency, interpersonal fluency, intrapersonal fluency, and academic fluency can provide the foundation students need to become independent, self-directed, lifelong learners in any environment.

Until this happens, the Internet will continue to function effectively as an access point to an overwhelming repository of content and information. The technology resources proliferating throughout the online world are great tools for those who can use them effectively, but their power is only truly harnessed by those who understand that online learning begins behind the screens.





ACRL. (1989). Presidential Committee on Information Literacy: Final Report. Retrieved from http://www.ala.org/acrl/publications/whitepapers/presidential

Agarwal, A. (2013, June). Anant Agarwal: Why massive open online courses (still) matter [Video file]. Retrieved from http://www.ted.com/talks/anant_agarwal_why_massively_open_online_courses_still_matter?language=en

Al-Suqri, M. (2007). Information needs and seeking behavior of social science scholars at Sultan Qabos University in Oman: A mixed-method approach (Doctoral dissertation). Retrieved from ProQuest. (Order No. AAI3294687)

ALA. (2000). Information literacy competency standards for higher education. Retrieved from http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm

Akhter, N. (2013). Relationship between internet addiction and academic performance among university undergraduates. Educational Research And Reviews, 8(19), 1793-1796. Retrieved from http://www.academicjournals.org/journal/ERR/article-abstract/292023541377

Andretta, S. (2007). Phenomenography: A conceptual framework for information literacy education. ASLIB Proceedings, 59(2), 152-168. doi:10.1108/00012530710736663

Applied Scholastics. (2015). What is study technology (Study tech?) [Web page]. Retrieved from http://www.appliedscholastics.org/study-tech.html

Applied Scholastics International (2005). Study Technology for College Success. Retrieved from http://f.edgesuite.net/data/www.appliedscholastics.org/files/study-tech-at-community-college.pdf

Barker, E. (2014, September 28). New Harvard research reveals a fun way to be more successful [Web log post]. Retrieved from http://www.bakadesuyo.com/2014/09/be-more-successful/

Baraniuk, R. (2006, February). The birth of the open-source learning revolution [Video file]. Retrieved from http://www.ted.com/talks/richard_baraniuk_on_open_source_learning

Bhatti, R., & Bart, W. M. (2013). On the Effect of Learning Style on Scholastic Achievement. Current Issues In Education, 16(2). Retrieved from http://cie.asu.edu/ojs/index.php/cieatasu/article/view/1121/498

Candy, P. C. (2002). Information literacy and lifelong learning. White Paper prepared for UNESCO, the U.S. National Commission on Libraries and Information Science, and the National Forum on Information Literacy, for use at the Information Literacy Meeting of Experts, Prague, The Czech Republic. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi:

Charlton, M. (n.d.). Gunther Kress’s Literacy in the New Media Age [Review of the book Literacy in the new media age, by G. Kress]. Retrieved from http://www2.bgsu.edu/departments/english/cconline/reviews/charlton/Kress.htm

Cieslik, M., & Simpson, D. (2006). Skills for Life? Basic Skills and Marginal Transitions from School to Work. Journal Of Youth Studies, 9(2), 213-229. doi:10.1080/13676260600635656

Clear, J. (2015). Fixed mindset vs. growth mindset: How your beliefs change your behavior [Web log post]. Retrieved from www.jamesclear.com/fixed-mindset-vs-growth-mindset

Codreanu, A., & Vasilescu, C. (2013). E-learning behaviors and their impact on andragogy. Conference paper for the 9th international scientific conference eLearning and software for Education Bucharest. Retrieved from http://www.researchgate.net/publication/256254358_E-LEARNING_BEHAVIORS_AND_THEIR_IMPACT_ON_ANDRAGOGY

De Valenzuela, J. S. (2001). Definitions of Literacy. [Website]  Retrieved from http://www.unm.edu/~devalenz/handouts/literacy.html

Dewey, J. (1938). Experience and education. New York: Macmillan.

E.G. West Centre. (2015). Self organised learning [Web page]. Retrieved from http://egwestcentre.com/research/holeinthewall/

Ghaffari, M. (2008). Digitized lectures: Making education more “student centered”. International Journal of Technology, Knowledge and Society, 4(5), 107-118. Retrieved from http://search.proquest.com/docview/868010305?accountid=10223

Gruber, S., & Barron, N. (2011). New learning and participatory citizenship: Creating knowledge communities in educational environments. International Journal of Technology, Knowledge and Society, 7(2), 101-118. Abstract retrieved from http://search.proquest.com/docview/1037881963?accountid=10223

Healey, D. (2015, February). Review 2 [Review of the book A constructivist approach to the national educational technology standards for teachers, by V. N. Morphew] Language Learning & Technology, 19(1) pp. 54-58. Retrieved from http://llt.msu.edu/issues/february2015/review2.pdf

Heath, P. & Miller, R. (2012). Cultural background and the benefits of e-learning. INTED2012 Proceedings, pp. 5938-5947. Abstract retrieved from http://library.iated.org/view/HEATH2012CUL

Higher Education Research Institute (2015). College Freshmen-Summary Characteristics: 1980 To 2013 [Sex, High School Grades, Political Orientation, Probable Field Of Study, Personal Objectives, And Family Income, Selected Years, As Of Fall] ProQuest Statistical Abstract of the U.S. 2015 Online Edition. Retrieved from  http://statabs.proquest.com.ezproxy2.library.colostate.edu:2048/sa/abstract.html?table-no=290&acc-no=C7095-1.4&year=2015&z=6201280A12C31CC1E4CC8F1BD3F5C61FA6AA612D

Holder, G. M., Jones, J., Robinson, R. A., & Krass, I. (2006). Academic literacy skills and progression rates amongst pharmacy students. Higher Education Research and Development, 18(1), 19-30. doi:10.1080/0729436990180103

Hsiao, W., Chen, M. W., & Hu, H. (2013). Assessing online discussion: Adoption of critical thinking as a grading criterion. International Journal of Technology, Knowledge and Society, 9(3), 15-25. Retrieved from http://search.proquest.com/docview/1641424331?accountid=10223

Huvila, I. (2011). The complete information literacy? Unforgetting creation and organization of information. Journal of Librarianship and Information Science. doi:10.1177/0961000611418812

Inan, B. (2013). The Relationship between self-regulated learning strategies and academic achievement in a Turkish EFL setting. Educational Research And Reviews, 8(17), 1544-1550. Abstract retrieved from http://academicjournals.org/journal/ERR/article-abstract/318A4BC5992

Jenson, K. (2014). The great books: Hutchins, Adler, and the liberal arts. Retrieved from https://www.academia.edu/8609009/The_Great_Books_Hutchins_Adler_and_Liberal_Arts

Kaminsky, K., Switzer, J., & Gloeckner, G. (2009). Workforce readiness: a study of university students’ fluency with information technology. Computers & Education, 53(2), 228-233. doi:10.1016/j.compedu.2009.01.017

Kettler, T. (2014). Critical thinking skills among elementary school students: comparing identified gifted and general education student performance. Gifted Child Quarterly, 58(2), 127-136. doi:10.1177/0016986214522508

Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86, doi:10.1207/s15326985ep4102_1

Koller, D. (2012, June). What we’re learning from online education [Video file]. Retrieved from http://www.ted.com/talks/daphne_koller_what_we_re_learning_from_online_education?language=en

Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2006). What matters to student success: A review of the literature. Commissioned report for the National Symposium on Post secondary Student Success. Retrieved from http://nces.ed.gov/npec/pdf/Kuh_Team_Report.pdf

Kurbanoglu, S. S. (2003). Self‐efficacy: a concept closely linked to information literacy and lifelong learning. Journal of Documentation, 59(6) 635-646. doi: 10.1108/00220410310506295

Larkin, M. (2002). Using scaffolded instruction to optimize learning. Retrieved from http://www.vtaide.com/png/ERIC/Scaffolding.htm

Merriam, S.B. & Bierema, L.L. (2014). Adult learning: Linking theory and practice. San Francisco, CA: Jossey-Bass.

Mitra, S. (2007, February). Sugata Mitra: Kids can teach themselves [Video file]. Retrieved from http://www.ted.com/talks/sugata_mitra_shows_how_kids_teach_themselves?language=en

Mokhtar, I. A., Foo, S., & Majid, S. (2007). Bridging between information literacy and information technology in Singapore schools: an exploratory study. Education, Knowledge and Economy, 1(2). doi:10.1080/17496890701372749

Mokhtar, I. A., Majid, S., & Foo, S. (2008). Information literacy education: Applications of mediated learning and multiple intelligences. Library and Information Science Research, 30(3), 195-206. doi:10.1016/j.lisr.2007.12.004

Montessori, M. (2004). The Montessori method: The origins of an educational innovation: Including an abridged and annotated edition of Maria Montessori’s The Montessori method (G. Gutek, Ed.). Lanham, Md.: Rowman & Littlefield.

Morgan, B. (2010). New literacies in the classroom: digital capital, student identity, and third space. International Journal of Technology, Knowledge and Society, 6(2), 221-240. Retrieved from http://search.proquest.com/docview/754042053?accountid=10223

Murray, B. (2000). Teaching students how to learn. Monitor, 31(6). Retrieved from http://www.apa.org/monitor/jun00/howtolearn.aspx

Nazari, M., & Webber, S. (2012) Loss of faith in the origins of information literacy in e-environments: Proposal of a holistic approach. Journal of Librarianship and Information Science, 44(2), 97-107. doi:10.1177/0961000611436095

Page, S., & Webb, P. (2013). Facilitating research methods pedagogy through Facebook: addressing the challenge of alternate learning styles. International Journal of Technology, Knowledge and Society, 9(3), 125-139. Retrieved from http://search.proquest.com/docview/1641423809?accountid=10223

Probert, E. (2009). Information literacy skills: teacher understandings and practice. Computers & Education, 53(1), 24-33. doi:10.1016/j.compedu.2008.12.018

Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’ academic performance: A systematic review and meta-analysis. Psychological Bulletin, 138(2), 353-387. doi:10.1037/a0026838

Rienties, B., Giesbers, B., Tempelaar, D., Lygo-Baker, S., Segers, M., & Gijselaers, W. (2012). The role of scaffolding and motivation in CSCL. Computers and Education, 59(3), 893-906. doi:10.1016/j.compedu.2012.04.010

Rivendell (2014). Rigorous academics [Web page]. Retrieved from http://rivendell.sdcc.edu/our-program/the-five-pursuits/rigorous-academics/

Schocken, S. (2012, June). The self-organizing computer course. [Video file]. Retrieved from http://www.ted.com/talks/shimon_schocken_the_self_organizing_computer_course?language=en

Singularity University. (2014). Cookie [Video file]. Retrieved from https://vimeo.com/104346284

Sousa, D. A. (2011). How the Brain Learns. Fourth Edition. Thousand Oaks, CA: Corwin.

Steffens, K. (2015). Competences, learning theories and MOOCs: recent developments in lifelong learning. European Journal of Education, 50(1), 41-59. doi:10.1111/ejed.12102

UNESCO. (2006). Understandings of literacy (Education for All Global Monitoring Report). Chapter 6, pp. 147-159. Retrieved from http://www.unesco.org/education/GMR2006/full/chapt6_eng.pdf

UNESCO. (2014). Literacy. [Website] Education. Retrieved from http://www.unesco.org/new/en/education/themes/education-building-blocks/literacy/

UNESCO. (2014). Media and Information Literacy. [Website] Communication and Information. Retrieved from http://portal.unesco.org/ci/en/ev.php-URL_ID=15886&URL_DO=DO_TOPIC&URL_SECTION=201.html

Vann, B. A. (1996). Learning self-direction in a social and experiential context. Human Resource Development Quarterly, 7(2), 121-130. Retrieved from http://search.proquest.com/docview/61490247?accountid=10223

Villavicencio, F. T., & Bernardo, A. I. (2013). Positive academic emotions moderate the relationship between self-regulation and academic achievement. British Journal Of Educational Psychology, 83(2), 329-340. doi:10.1111/j.2044-8279.2012.02064.x

Virkus, S. (2003). Information literacy in Europe: A literature review. Information Research, 8(4), paper no. 159. Retrieved from http://informationr.net/ir/8-4/paper159.html

Wieling, M. B., & Hofman, W. H. A. (2010). The impact of online video lecture recordings and automated feedback on student performance. Computers & Education, 54(4), 992-998. doi:10.1016/j.compedu.2009.10.002

Young, S., Macrae, C., Cairns, G., & Pia, A. (n.d.). Adult literacy and numeracy in Scotland [PDF]. Retrieved from http://www.gov.scot/Resource/Doc/158952/0043191.pdf

Zucca, G. (2013). Classroom course model: A different model needed for adult online students? International Journal of Technology, Knowledge and Society, 9(4), 99-107. Retrieved from http://search.proquest.com/docview/1641423813?accountid=10223


The Impact of Learning Fluency on The Achievement Gap

Kevin J

May, 2015
For Dr. Jeffrey Foley

Colorado State University


Author Note

This is a pilot study for research testing the feasibility of designing higher education around teaching students how to learn. To contribute to future study, see results, or view details on the program in development, please visit http://www.listenlovelead.com/education.


Students who know how to learn demonstrate higher levels of academic performance. This pilot study tested five categories of learning fluency identified in the literature to discover their impact on student performance as measured by GPA. A selective sample of students in higher education completed a survey measuring demographic variables, GPA, and self-perceived proficiency in information, technology, interpersonal, intrapersonal, and academic fluency. Students reported that they did not necessarily receive direct training in learning fluency, but their GPAs had improved overall (M = 3.33) since the beginning of the program (M = 2.375). Students with higher levels of learning fluency reported higher grades. However, a self-perception bias was present, and a two-tailed independent-samples t-test did not show a statistically significant difference in means for the small sample (t(14) = 1.69, p = .11). Similarly, there was no statistically significant correlation between higher levels of learning fluency and greater change in GPA (r = 0.18, p = .52) or current GPA (r = 0.40, p = .12). Although the findings were not statistically significant at p < .05, they align with the literature reporting a positive relationship between learning fluency and academic performance.
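The two statistics reported in this abstract, an independent-samples t-test and Pearson correlations, can be computed directly from raw survey scores. The following pure-Python functions are an illustrative sketch only: the function names and example data are hypothetical and are not drawn from the pilot study itself.

```python
from math import sqrt
from statistics import mean, variance

def independent_t(a, b):
    """Student's t statistic for two independent samples (pooled variance).

    Returns the t statistic and its degrees of freedom, n1 + n2 - 2.
    """
    n1, n2 = len(a), len(b)
    # Pooled sample variance across the two groups
    pooled = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    t = (mean(a) - mean(b)) / sqrt(pooled * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var_x = sum((xi - mx) ** 2 for xi in x)
    var_y = sum((yi - my) ** 2 for yi in y)
    return cov / sqrt(var_x * var_y)
```

A reported value of t(14) implies two groups totaling 16 respondents; the two-tailed p value is then read from the t distribution with 14 degrees of freedom.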


Keywords: higher education, learning fluency, achievement gap, academic performance, information literacy, curriculum design, academic mobility

Download Link – The Impact of Learning Fluency on the Achievement Gap

Impact of Learning Fluency on the Achievement Gap

“There is a danger of a new elite developing in our country: the information elite” (ACRL, 1989). These words by Terrel Bell, former US Secretary of Education, accompanied the definition of information literacy (or information fluency) most commonly cited among research on the subject. According to the American Library Association, information literacy is a set of abilities requiring individuals to “recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information” (ACRL, 1989).

The same year this definition was published, the Internet was born, and its development would eventually lead to redefining information literacy to include a technology component. Less than two decades later, studies conducted in Scotland, in Singapore, and by UNESCO all recognized the importance of training students in both information literacy and technology skills (Young et al., n.d.; UNESCO, 2006; Mokhtar, Foo, & Majid, 2007). Though access to technology makes learning opportunities more accessible to students of multiple backgrounds (Schocken, 2012), not all of them are prepared to thrive in this self-directed environment (Rienties et al., 2012).

Students need more than just information and technology fluency to succeed in today’s learning context. Richardson et al. (2012) conducted a study associating various psychological attributes with student performance. Other studies have identified motivation (Codreanu & Vasilescu, 2013), creativity (Bloom, 1956), critical thinking (Brookfield, 2013), and many other factors as crucial to student success in learning. Some researchers group all these aspects under the heading of information fluency, but others identify them separately. The following literature review outlines five distinct aspects of learning that are considered vital to student performance. Mastery of these learning skills plays such an important role in student success that some have suggested establishing information fluency (or learning fluency) as its own discipline instead of simply trying to incorporate its ideas into the general curriculum of a school (Mokhtar, Foo, & Majid, 2007; Virkus, 2003).

The Research Problem

Although many studies show that teaching students how to learn can have a positive impact on their performance, it is nearly impossible to find a study that references the impact of learning fluency on the achievement gap. Maria Montessori observed that her mentally challenged students were able to outperform the public school students after experiencing her education system, and assumed that this improvement would be even more significant for students who were not mentally challenged (Montessori, 2004). A more recent study by Catterall, Chapleau, and Iwanaga showed a dramatic increase in academic performance among students of low economic status who were given musical training (as cited in Sousa, 2011). However, Sousa did not report whether this was compared with the performance of students in other economic brackets. No study has yet correlated training students in aspects of information fluency with a subsequent reduction in the gap between high- and low-performing students.

Justification of the Problem

A reduction in the achievement gap could be identified through academic mobility. Although the term has been hijacked to refer to students moving from one school district to another, its meaning should be similar to that of social mobility: students moving from a less-desirable academic status to a more desirable one. According to the theory of multiple intelligences (Gardner, 1995), this mobility should be taking place naturally as some students are more competent at certain subjects. However, the only way to achieve a 4.0 is to be good at every class – or perhaps, as Dewey suggested, to be part of a system that is “suitable” to one’s learning needs (Dewey, 1938, p. 45). Student-centric learning has attempted to make the classroom more suitable to a diversity of students, but research on learning fluencies suggests that training students to thrive in a diversity of environments may be a more effective alternative. Richardson et al. (2012) have shown that while some aspects of an individual are more “stable” or unchanging, other aspects have great potential for development. This suggests that the skills required for academic mobility can be taught.

Purpose Statement

The purpose of this study is to identify whether training students how to learn can lead to a reduction in the achievement gap. This is done by examining the relationship between self-reported proficiency in the five learning fluencies and changes in academic performance for students at a local higher education institution that offers some aspect of training in learning fluency. The relationship between these two factors could indicate whether shifting the function of formal education toward an emphasis on developing students’ learning abilities could lead to an improvement in student performance and academic mobility, thereby narrowing the achievement gap.

Research Questions and Personal Interest

An overview of the literature outlines the various factors that other studies have identified as aspects of learning fluency. This study tests the correlation between these factors and student performance as measured by GPA. Despite some downsides, no better indicator has yet been found to replace GPA as a measure of student performance (Richardson, et al., 2012). This study compares self-identified learning fluency levels with GPA scores to see if higher levels of fluency are related to higher GPAs or greater amounts of change in GPA. If the relationship is significant, the author has a personal interest in further study to see whether student mastery of a subject may be inferred by measuring student mastery of learning fluencies.

Audiences that will Benefit from the Study

This study identifies the potential benefits of devoting classroom time to developing student learning fluency. The results are relevant to school administrators and policy makers who must decide between incorporating learning fluency into the curriculum or offering it as an independent subject of study. Remedial program coordinators and curriculum designers may find the insights useful in focusing their strategies on those aspects of learning that provide the greatest return on investment. Teachers may find this study useful in designing assessments and opportunities for students to improve. Professional students and life-long learners who want to improve their academic performance may find it helpful to understand the role that learning fluency can play toward this end. Finally, the learning centers that participate in this study and future research will gain insight into what their students perceive to be the most valuable differentiating factors that the school might contribute to their academic success.


Literature Review

A review of the literature has identified multiple factors of learning that can be classified into five different categories: information, technology, interpersonal, intrapersonal, and academic fluency. These five categories of learning fluency have been shown to have an impact on student success. Not every student has the same level of capability in these fluencies, but according to Sousa (2011), at least some of the fluencies can be developed. Candy (2006) demonstrated the importance of this intentional development by saying that the information explosion has made self-directed learning a required skill that cannot simply be trusted as an accidental byproduct of education. Although each field of learning has its own particular requirements, the skills commonly required by the self-directed life-long learner guided the selection of factors in this study.

Information Fluency

Within the literature, these skills are sometimes identified under a broad category called “information literacy.” The American Library Association developed the most commonly used definition in 1989: information literacy is “a set of abilities requiring individuals to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information” (ALA, 2000). Virkus (2003) identified multiple terms that are often substituted for information literacy, ranging from very narrow to very broad definitions. For purposes of this study, information literacy is limited to the definition provided by the ALA and is measured through concrete skills like reading comprehension, reading speed, mastery of research methods, and skill in evaluating resources.

Technology Fluency

Recent studies by UNESCO have called for an expansion of the definition of literacy to include a plethora of learning skills, but most importantly those related to technology (UNESCO, 2006). According to the International Society for Technology in Education (2015), developments in technology have changed how we learn. Today, students need to learn how to use the computer before they can use the computer to learn about something else. Mokhtar, Majid, & Foo (2008) argued that despite the technical ability of modern students, training is still required for students to use technology for educational purposes. Many students know how to use Facebook and search engines, but this does not automatically translate into fluency with the creation and exchange of information through databases (Virkus, 2003). Common examples of technology fluency are website design, basic IT skills, use of academic databases, MS Office proficiency, graphic design experience, and familiarity with communication technology.

Interpersonal Fluency

Among constructivist circles, interpersonal fluency plays a very important role in learning. A study by Andretta (2007) showed that using a relational approach to learning and teaching students how to learn through information literacy could provide a framework for life-long learning. Learning methods imported from ancient Greece include the trifecta of logic, rhetoric, and dialectic, which was dominated by interpersonal communication skills. The ALA (2000) identifies the continuing importance of these skills by highlighting fluency in speaking, debating, writing, asking questions, presenting, and communicating across cultures.

Intrapersonal Fluency

Merriam and Bierema (2014), in their textbook Adult Learning: Linking Theory and Practice, identified multiple aspects of intrapersonal learning in their overview of educational theories. Chief among these are critical reflection, personal experience, physical and spiritual aspects of learning, motivation, critical thinking, and cultural understanding. These are not untested ideas; they are important dynamics of the major learning theories. However, literature directly testing the impact of training students in these aspects of learning was not readily available. Many teachers are aware of the impact of learning styles, personality types, and the external environment on student learning, yet their classroom time is often limited to presenting information rather than teaching students how to manage the learning process more effectively (Palmer, 2007).

Academic Fluency

A fifth aspect of learning fluency identified from the literature is specifically academic. Huvila (2011) declared that ideas of information literacy were not complete without first helping students develop skills in the creation and organization of information. Sousa (2011) suggested using Bloom’s Taxonomy as a guide to moving students toward higher levels of learning. Developing a study plan, applying ideas (Dewey, 1938), observation skills (Montessori, 2004), innovation, creative thinking (Bloom, 1956), and more fall into this category. Keen (as cited in Virkus, 2003) identified academic competencies like these as transferable across multiple environments. Academic fluency can be summarized as the ability to apply the previous four learning fluencies in a specifically educational context.

Deficiencies in the Literature

Although the five learning fluencies are fairly well developed in the literature (for a full synthesis of this research, please see Jenson, 2015), research on their application has almost exclusively focused on their integration into the teaching of subject material. The ALA (2000) recommended teaching these skills by incorporating them into multiple subjects rather than on their own. More recently, Mokhtar, Foo, & Majid (2007) recommended creating a new position for a teacher-librarian whose responsibility is to educate students on information technology and literacy skills. However, the researcher could find no study testing the impact of teaching all five learning fluencies as a stand-alone subject to equip students for self-directed life-long learning. Studies have shown that individual aspects of learning have an effect upon student performance, but none have explored their combined impact on the achievement gap.

Research Methods

Data for this quantitative study was collected through the use of a survey and analyzed using Microsoft Excel and SPSS. The objective of the research was to discover the relationship between self-reported levels of learning fluency (as defined by the literature) and changes in GPA between high school, first term, and current term. Change in GPA was calculated by taking the difference between current GPA and either high school or first-term GPA. Scores were calculated for each of the five learning fluencies, and these were added together to calculate the student’s total learning fluency score. Change in GPA, current GPA, and the learning fluency score were analyzed for each student. High levels of change in GPA or a reduction of variance in GPA would indicate a reduction in the achievement gap, and this would be related to self-reported levels of learning fluency.
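The calculations described above can be sketched in a few lines. The sketch below uses hypothetical student data and illustrative field names, not the study’s actual records (which were processed in Excel and SPSS).

```python
def change_in_gpa(current_gpa, baseline_gpa):
    """Change in GPA relative to a baseline (high school or first-term GPA)."""
    return current_gpa - baseline_gpa

def fluency_score(ratings_by_category):
    """Total learning fluency score: the sum of all Likert ratings (1-5)
    across the five fluency categories."""
    return sum(sum(ratings) for ratings in ratings_by_category.values())

# Hypothetical student record (field names and values are illustrative)
student = {
    "current_gpa": 3.4,
    "first_term_gpa": 2.9,
    "ratings": {
        "information": [4, 3, 4],
        "technology": [2, 3],
        "interpersonal": [4, 4],
        "intrapersonal": [3, 4],
        "academic": [4, 5],
    },
}

print(round(change_in_gpa(student["current_gpa"], student["first_term_gpa"]), 2))  # 0.5
print(fluency_score(student["ratings"]))  # 40
```

In the study itself, 35 rated statements contributed to each student’s total score, and the baseline for change in GPA was either high school GPA or first-term GPA.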

Data Collection Instrument

To collect data for analysis, a survey was designed specifically for this pilot study, which included multiple variables for use in future research. The first part of the anonymous survey form covered demographic information, information on previous educational experience, academic performance, and several influencing variables that can be used in more advanced statistical analysis in follow-up studies. A variety of question types were included, and students were asked to submit ratio-level data wherever possible. The entire survey is included in the appendix.

The second part of the survey asked students about their familiarity with various aspects of the five learning fluencies from the literature review. Questions from each area of learning fluency were kept together rather than scattered in order to develop a general sense of student ability in each category. Students rated their agreement with statements by using the following Likert scale. They also had the option of circling S or T to indicate their opinions on the importance and accessibility of each skill. These are the instructions they received on the survey form.

Please indicate how much you agree or disagree with the statements on the next page by circling a number on the following scale:


Completely Disagree   1   2   3   4   5   Completely Agree


ALSO circle   S   and/or   T   whenever appropriate.

S = I believe this skill is important to my SUCCESS as a student

T = my school offers TRAINING on this skill


Example: I enjoy taking surveys!   1   2   3   4   5   S   T

Student enjoys taking surveys, does not believe this skill to be important to success as a student, but is aware of training offered by her school.


Validity and Reliability of the Instrument

Because this instrument was new, steps were taken to ensure its validity and reliability (Creswell, 2015). Survey questions were reviewed for clarity and phrased for consistency. For example, students who believed themselves proficient in a skill would always indicate this by selecting a number closer to 5. Specific data was preferred over general data, and all but one question had a predefined range of answers. The procedures for administering the survey were the same for every student, and students had access to the researcher for any questions while completing the survey.

Measures of validity were built into the design of the survey. For example, students who identified their learning style on the front completely agreed that they knew their learning style when asked about this on the back. Those who did not identify a learning style indicated no agreement or partial agreement. A visual overview of the data showed that student responses within each category tended to vary less than they did between categories. If a student self-identified a low level of proficiency in one aspect of interpersonal fluency, most other aspects of interpersonal fluency would also be low. This consistency demonstrates the validity of the questions to examine each category of learning fluency.

Another measure of validity comes from the difference in agreement levels for objective measurements compared with subjective measurements. On average, 9.5 students agreed on whether training was offered (an objective measure), while 8.1 agreed about the importance of a given factor to their success as a student (a subjective measure). The objective measure produced a narrower range of opinions, showing that the survey was able to quantify student opinions somewhat effectively.

Survey Administration

The target population for the survey consisted of students not in their first term of study at an intensive liberal arts program at a local school. This target was selected on the basis of three factors. First, personal interviews with the staff identified a significant change in performance for disadvantaged students. Second, students in this program are admitted on variables besides GPA, providing a variety of GPA starting points to analyze. Finally, students in the target population complete the program of study with the same materials, order of presentation, classmates, professors, and learning environment. This provided control for the influence of confounding variables like class size, instructional methods, and subject of study.

The data collection instrument was administered during class time by permission of the school and oversight of the professors. Students received an overview of the research project, guidance on how to answer the questions correctly, and a disclaimer that they did not need to participate or complete any questions if they did not feel comfortable doing so. The population of students was stratified into three groups based on their time in the program and one group was selected for the data collection process. Students in their final term (Group 3) were not available and students in their first term (Group 1) would not be able to provide data to measure the change in GPA. For this reason, Group 2 was chosen to provide the sample data.

From these students, 16 surveys were collected. This number is within the sample range recommended by Creswell (2015) for an experimental study but is below the 30 recommended for a correlational study. If the study had been designed as an experimental study with a pre-test, post-test, and verified training in the measured aspects of learning fluency, the sample size may have been appropriate. However, there was only one point of data collection, and students were unsure whether training was available in learning fluency, so the number of samples was too small. No additional students who met the criteria for participation were available.



Demographic Data

Sixteen students participated in the survey: 10 males and 6 females. Fifteen of the sixteen were between ages 18 and 21. They had an average of 1.4 years of higher education experience, including the 1 year they had completed as part of their current program. Most students did not know their learning styles, but the four who did reported one kinesthetic, one visual, and two aural learners. Three attended public school, six attended private school, three were homeschooled, and four attended a combination of these for their high school experience. For half of them, high school was a challenge, and all but one student had families who believed in the importance of academics. Future study interests ranged from music, liberal arts, dancing, writing, and psychology to business, law enforcement, sports, architecture, and the biological and actuarial sciences. All but four wanted to see the results of this research.

GPA Data

Most important for the purpose of this study, student GPAs were reported for high school, the first term of higher education, and the current term to date (Table 1). The data show that average GPA declined dramatically in the students’ first year and then began to rise to its current level. Because GPA is capped at 4.0, the highest scores for each group did not change. Although the range of reported GPAs did not change between the first year and the current year, the total variance in GPA scores was smaller. Combined with the rise in overall GPA, this indicates that a greater number of students were beginning to achieve higher scores – a reduction in the achievement gap between high and low scoring students. When compared with high school GPA data, this reduction is not significant, but the later data also span a much greater range of scores because of an outlying GPA of 1.8.

Table 1
GPA Data
Statistic            High School   1st Year   Current
Average              3.459         2.375      3.331
Low                  2.8           1.8        1.8
High                 4.0           4.0        4.0
Range                1.2           2.2        2.2
Standard Deviation   0.427         0.667      0.621
Variance             0.182         0.444      0.384
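The summary statistics in Table 1 were produced with Excel and SPSS; the same measures can be sketched with Python’s standard statistics module (sample statistics, matching Excel’s STDEV.S and VAR.S). The GPA list below is hypothetical, for illustration only.

```python
import statistics

def gpa_summary(gpas):
    """Summary statistics matching the rows of Table 1."""
    return {
        "average": statistics.mean(gpas),
        "low": min(gpas),
        "high": max(gpas),
        "range": max(gpas) - min(gpas),
        "sd": statistics.stdev(gpas),           # sample SD, like Excel STDEV.S
        "variance": statistics.variance(gpas),  # sample variance, like Excel VAR.S
    }

# Hypothetical first-term GPAs for illustration
summary = gpa_summary([1.8, 2.0, 2.3, 2.5, 2.6, 3.0, 3.4, 4.0])
print(summary)
```

Sample (n − 1) rather than population statistics are used because the 16 respondents stand in for a larger population of students.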


Learning Fluency Data Overview

Along with GPA, learning fluency data was the most important aspect of the survey. Students were asked three questions about their perspectives on learning fluency: the first measured their proficiency; the second, their beliefs about its importance to success in learning; and the third, whether they believed their institution offered training in the subject. The last two provided reliability testing for the data collection instrument as well as a few insights into student perceptions of their institution.

Training data. Students tended to disagree over whether the school offered training in academic fluency, with 6-8 students voting either way in the true/false test. However, they agreed that there was little or no training provided in technology fluency (10-16 students for each factor). On average, 9.5 students agreed on whether any given factor was taught. Students were more likely to agree that training was not available for any randomly selected factor, leading to an overall average of 37% agreement that training was offered for a given variable. Where training was not offered, students tended to believe the skill was less important to their success in the program.

Success data. Out of the 18 factors ranked most important to student success, only 5 had little or no training offered by the school. These factors were listening skills, learning style, eating habits, motivational skills, and the use of MS Office. On the other hand, for every fluency factor except two, students were more likely to believe a factor was important to their success (S) than to believe their institution offered training (T).

This is significant because students scored lower on subjective vs. objective measurements in the control questions. The importance of a factor to student success is a subjective opinion and should therefore have been lower than students’ beliefs about whether training was offered. The exceptions to this were debate and speech, which are key parts of the learning experience at this institution. For these, students were more likely to agree that training was offered than to agree over its significance.

Fluency score. The average fluency score for students was calculated at 118. This was done by adding together the ranks (from 1-5) students gave to each factor of learning fluency. If students had completely disagreed with all statements of proficiency, they would have scored a 35. If they had completely agreed with every statement of their proficiency, they would have scored a 175. Actual student scores ranged from 94 to 155. The range was 61 with a variance (Excel: Var.S) of 240.8 and standard deviation of 15.
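The score bounds above follow from simple arithmetic over the 35 proficiency statements, each rated on a 1-5 Likert scale:

```python
# Bounds on the total fluency score, given 35 proficiency statements
# each rated on a 1-5 Likert scale.
NUM_FACTORS = 35
MIN_RATING, MAX_RATING = 1, 5

min_score = NUM_FACTORS * MIN_RATING  # complete disagreement on every item
max_score = NUM_FACTORS * MAX_RATING  # complete agreement on every item
observed_range = 155 - 94             # highest minus lowest observed score

print(min_score, max_score, observed_range)  # 35 175 61
```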

Sub-scores for each category were calculated but are not directly comparable because the number of factors in each category differed. Instead of calculating an average for each section by adding the factors together, the average proficiency rank for each category was calculated for every student. The average proficiency identification for students within each category is shown in Figure 1. Students ranked themselves above average for every category except technology. The low score in this category has a distinct impact on the average student score.

Students who disagreed with statements of their proficiency received a score of 1. Those who completely agreed with proficiency statements received a score of 5. The first question students answered measured perception of their ability in taking surveys. On average, students rated their survey taking ability at 3.6, which is slightly higher than the average rank they gave to their fluencies (3.4). Students with lower GPAs rated their survey taking ability lower (2.6) than students with higher GPAs (4.0), indicating that grades may have influenced self-perception of ability.



Table 2 shows that students in this class believe they are academically proficient. They do not believe they are proficient in technology, but as noted previously, they do not receive training in this area or believe it to be significant to their success in the program. Interpersonal ranking showed the least amount of variance in student selection (0.17) compared to the average variance of 0.47 (for all data from the table). Benchmarks for the variance are provided by the objective question about learning (variance = 0.16) and subjective question of survey taking ability (variance = 0.65).


Table 2
Student Proficiency Rankings
Factor                  M      Sum     Range   Variance   SD
Surveys                 3.63   58      3       0.65       0.81
Learn                   4.81   77      1       0.16       0.40
Information Fluency     3.58   57.2    2.8     0.63       0.80
Technology Fluency      2.46   39.43   3       0.65       0.80
Interpersonal Fluency   3.72   59.5    1.67    0.17       0.42
Intrapersonal Fluency   3.65   58.38   2.75    0.61       0.78
Academic Fluency        3.85   61.67   2       0.40       0.63
Averages                3.67   58.74   2.32    0.47       0.66

Comparison of the Mean t-Tests

On the basis of their proficiency scores, students were divided into two categories: those whose proficiency score met or exceeded the average of 118 and those whose score was below 118. Current GPA and change in GPA were measured for students in each category to see if there was a significant difference in the mean. The null hypothesis stated that there was no difference between the two groups, H0: µA − µB = 0. The alternative hypothesis for each of these tests was a difference in the means, Ha: µA − µB ≠ 0.

The mean change in GPA for students with higher than average levels of fluency was .15 (SD = .14), and the mean change in GPA for students with lower than average levels of reported fluency was -.50 (SD = .88). A two-tailed, independent samples t-test (Table 3) led to the conclusion that there was not enough evidence to suggest a difference between the two groups, t(7.3) = .645, p = .53. The difference in variance between the two groups was significant (F = 7.33, p = .017). The same t-test led to the conclusion that there was no statistically significant difference in the mean of current GPAs for students of the two groups, t(14) = 1.69, p = .11, though here the difference in variance was not significant. It is important to note that fluency scores had a stronger relationship to current GPA (t = 1.69, p = .11) than to first-term GPA (t = 1.53, p = .15) or high school GPA (t = 1.15, p = .27).
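The independent-samples t-test reported above was run in SPSS. As a sketch of the same procedure, the function below computes Welch’s t statistic and degrees of freedom (the correction applied when group variances differ, which is why fractional degrees of freedom such as 7.3 appear). The two samples are hypothetical stand-ins for the high- and low-fluency groups, not the study data.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's independent-samples t statistic and degrees of freedom,
    used when the two groups have unequal variances."""
    va, vb = statistics.variance(a), statistics.variance(b)
    na, nb = len(a), len(b)
    se2_a, se2_b = va / na, vb / nb
    t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(se2_a + se2_b)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (se2_a + se2_b) ** 2 / (se2_a ** 2 / (na - 1) + se2_b ** 2 / (nb - 1))
    return t, df

# Hypothetical change-in-GPA values for the two fluency groups
high_fluency = [0.1, 0.2, 0.0, 0.3, 0.15, 0.25, 0.1, 0.1, 0.2, 0.05]
low_fluency = [-1.5, 0.4, -0.9, 0.2, -1.2, 0.0]

t, df = welch_t(high_fluency, low_fluency)
print(f"t = {t:.2f}, df = {df:.1f}")
```

Note how the much larger variance in the low-fluency sample pulls the degrees of freedom down toward that group’s size, mirroring the fractional df reported above.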

A second t-test compared the mean fluency score and GPA for students grouped by positive or negative change in GPA. No significant differences were discovered, but the results are shown in Table 4. Additional analysis of these groups relative to the categories of learning fluency also failed to produce statistically significant results at p < .05. The greatest difference between the two groups was found in the ranking averages for interpersonal fluency (t = 1.24, p = .24). The least significant difference was found in the intrapersonal average rank and sum (t = -.29, p = .78).

Correlation Studies

A Pearson correlation showed a relationship of r = .18 (p = .52) between the fluency score and change in GPA. The relationship between the fluency score and current GPA was stronger and more significant (r = .40, p = .12). This is consistent with the literature from which the variables were drawn, which demonstrates a relationship between GPA and learning fluency. Figure 2 shows a scatterplot of the samples for this second test of correlation.
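The Pearson correlations reported here were computed in SPSS; the coefficient itself can be sketched from its definition. The fluency scores and GPAs below are hypothetical stand-ins for the study data.

```python
import math
import statistics

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(
        sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)
    )

# Hypothetical fluency scores and current GPAs for illustration
fluency = [94, 105, 110, 118, 120, 127, 135, 155]
gpa = [2.6, 3.0, 2.9, 3.2, 3.5, 3.3, 3.6, 3.9]

print(round(pearson_r(fluency, gpa), 2))
```

A value near 1 indicates that higher fluency scores go with higher GPAs; the study’s observed r = .40 is a weaker version of the same positive relationship.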

A correlational matrix (Table 5) of the factors that make up the total fluency score revealed a strong relationship between technology fluency and interpersonal fluency (r = .505, p = .05) and between academic fluency and intrapersonal fluency (r = .700, p = .01). Since there were very few statistically significant findings in this study, these relationships are notable exceptions.

The most significant correlates to change in GPA were current GPA (r = .627, p = .01), academic fluency average (r = .30, p = .26), and interpersonal fluency average (r = .15, p = .52). Current GPA was the only statistically significant correlate at p < .05. The most significant correlates to current GPA were change in GPA (r = .627, p = .01), academic fluency average (r = .45, p = .08), and intrapersonal fluency average (r = .40, p = .13). Nearly all factors had a statistical significance at p < .10. This indicates that there is a stronger correlation between the fluency factors and current GPA than between the fluency factors and change in GPA. It also suggests that the different categories of learning fluency may have different kinds of effects on student performance.


Discussion

The purpose of this study was to discover whether there was a relationship between learning fluency, as defined in the literature review, and change in GPA. A comparison of the mean GPA for students with higher and lower levels of fluency showed a relationship between student GPA and proficiency in the five learning fluencies that fell just short of statistical significance at the p < .10 level. This supports the validity of the factors selected to measure learning fluency, as it matches the findings of the literature.

Comparison of Means

Comparison of the means did not reveal any other significant differences. However, in every test, students with higher GPAs showed higher levels of learning fluency (see Figure 3), and students with higher levels of learning fluency showed higher GPAs. Similarly, students with higher levels of learning fluency showed greater change in GPA (.15, SD = .14) than students with lower levels of learning fluency (-.50, SD = .88). However, the standard deviation and variance were so large that none of the results were statistically significant. In fact, the confidence intervals provided by the t-tests included the possibility of negative or positive change for students who reported higher levels of learning fluency: CIs [-0.131, 1.125] for current GPA and [-0.535, 0.942] for change in GPA.

Correlational Studies

A Pearson correlation supported these findings by showing a positive relationship between learning fluency and both change in GPA (r = .18, p = .52) and current GPA (r = .40, p = .12), the latter approaching statistical significance at the p < .10 level. Correlation also showed that academic fluency is the most significant correlate to GPA for students in this particular program. This suggests that the program students are enrolled in, and the emphasis of its training, may have a significant effect on which factors of learning fluency students are exposed to.

Significance of the Results

The central limit theorem indicates that, as sample size increases, the standard error of the mean decreases for a normally distributed population. The sample size for this study was so small that the variance and standard deviation kept the statistical tests from producing significant results, even at the p < .10 level. This led to a rejection of the alternative hypothesis that there was a difference between students with high or low GPAs and high or low fluency scores (Ha: µA-µB ≠ 0). As the results showed, there was a difference, but it was not statistically significant. With a larger sample size and a more controlled environment, it is quite possible that researchers would find a statistically significant difference between the groups and accept the alternative hypothesis.
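The sample-size effect is easy to quantify: the standard error of the mean is the standard deviation divided by the square root of n, so quadrupling the sample halves the error. A short sketch with a hypothetical GPA standard deviation:

```python
import math

sd = 0.62  # hypothetical standard deviation of GPA scores
for n in (16, 64, 256):
    # standard error of the mean shrinks with the square root of n
    se = sd / math.sqrt(n)
    print(f"n = {n:3d}: standard error of the mean = {se:.4f}")
```

The same observed mean difference therefore produces a much larger t-statistic, and a smaller p-value, as the sample grows.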

In addition to sample size, there are several other indications that this could be the case. First, students were sampled in their second term. Grades in the first term had declined dramatically since high school, but had begun to improve. This indicates that many students were not prepared for the difficulty of the program. However, by the end of the second term, when the data was collected, students had become familiar with the skills needed for success in that environment and GPA had begun to rise (see Table 1 for details). More importantly, the variance had declined from 0.44 to 0.38 and the standard deviation was smaller. If this trend continued, one could assume that student grades near the end of the third term would be significantly less dispersed than after the first term – indicating a reduction in the achievement gap.

Second, the lowest GPA reported was an outlier, as was the student who reported the greatest negative change in performance (see Figure 2). T-tests are sensitive to such outliers and might have shown more significant differences between the means had these scores been removed. Additional samples would also reduce the impact of these outliers.
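That sensitivity can be illustrated with invented numbers: a single extreme score inflates one group's variance enough to erase an otherwise clear difference of means.

```python
from scipy import stats

# Hypothetical change-in-GPA scores; the last value in group_a is an outlier
group_a = [0.2, 0.1, 0.3, 0.2, 0.15, 0.25, 0.1, -2.4]
group_b = [0.0, -0.1, 0.05, -0.05, 0.1, -0.15, 0.0, 0.05]

t_with, p_with = stats.ttest_ind(group_a, group_b, equal_var=False)
t_without, p_without = stats.ttest_ind(group_a[:-1], group_b, equal_var=False)
print(f"with outlier:    t = {t_with:.2f}, p = {p_with:.3f}")
print(f"without outlier: t = {t_without:.2f}, p = {p_without:.3f}")
```

Dropping the single outlier sharply reduces the p-value; a larger sample would dilute its influence in the same way.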

Third, self-reported fluency levels were influenced by student self-perception. This was indicated by the control questions about survey taking ability and the availability of training in learning fluency. The researcher did not have adequate statistical skills to account for this influence. However, to control for it entirely would require an experimental design and a larger sample size.

Influencing Factors

The self-perception bias limited the scope of the study to discovering whether a relationship existed between student grades and students’ perception of their learning ability. It is impossible to say whether students’ self-perception of ability influenced their GPAs or whether their GPAs gave them more or less confidence in their ability. It is possible that this bias also caused the correlational studies to show a stronger association between learning fluency and current grades than between learning fluency and change in GPA.

Of all the learning fluency factors, academic fluency showed the strongest relationship with both change in GPA and current GPA. Since the curriculum of the school is largely designed around proficiency in these skills, it is possible that the lack of such skills could account for the drop in GPA for students in their first term and its subsequent rise in the second term. Relative to this particular program, training in academic fluency skills may be vital to reducing the achievement gap.

Correlation between the various fluency factors (see Table 5) showed that students with higher levels of academic fluency also showed higher levels of intrapersonal fluency. Intrapersonal fluency questions were not limited to cognitive skills like memory and critical thinking, but included aspects of lifestyle like exercise, food choices, and emotional development. This indicates that in order to develop academic fluency among their students, schools may need to help students develop healthy lifestyle habits.

The other two factors of learning fluency that were strongly correlated were interpersonal fluency and technology fluency. This correlation may be a result of students using technology for the purpose of communication. Those with lower levels of interpersonal skills may have fewer applications for technology. This theory is supported by the lack of correlation between information fluency and technology fluency – which were shown by the literature to have a significant relationship. Because the students are not offered training in technology, they may not be aware of the benefits it can bring to their learning experience.

Most Significant Finding

With the exception of correlations between factors, the strongest relationship in this study was found between learning fluency and current grades (t(14) = 1.69, p = .11; r = .40, p = .12). This makes sense in light of the literature review. The factors analyzed in this study were chosen from research reports that demonstrated their relationship with student performance. Those same factors were used here, but in combination rather than individually, and the environment in which they were tested differed from the environments in which they were originally studied. Thus, retesting these factors could be expected to produce results close to statistical significance, as it did in this study. As explored in the limitations section, this relationship was the only one that this study could effectively measure, and it came close to statistical significance despite the small number of samples.


Correlational studies showed that the availability of training and the type of program students are in may have an impact on their mastery of the various learning fluency factors. For this reason, the selective sampling method limits the findings of this study to students within the particular cohort that completed the survey. The results cannot necessarily be applied to students outside the program studied, or even to students who did not experience the same environment as the selected sample group. At least one additional study of students in a different environment is needed to generalize these findings to the broader population of students in higher education.

Data issues. Issues with the data included two students that did not contribute their current GPA and two that did not contribute their first term GPA. In at least one instance, the GPA was unknown to the student. To account for the missing data, current GPA was assumed to be the same as the last known reported GPA.

Additional problems with the GPA data include one outlier whose academic performance dropped significantly between high school and higher education, and the 37% of students who showed no change in GPA between first term and current term. To account for this missing or unhelpful data, change was calculated both between high school and current GPA and between first term and current GPA. Instead of testing both differences, the greater of the two was selected for each student to create a new variable called “greatest change.” This allowed tests to be run on 16 rather than 12 sets of data; however, it also reduced the depth of analysis that could have been done with two separate measurements of change in GPA.
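The "greatest change" variable described above can be sketched as a small helper; the function name and the GPA values are hypothetical, and `None` stands in for a student who did not report a score:

```python
def greatest_change(high_school_gpa, first_term_gpa, current_gpa):
    """Return the larger of the two available change-in-GPA measures."""
    changes = []
    if high_school_gpa is not None:
        changes.append(current_gpa - high_school_gpa)
    if first_term_gpa is not None:
        changes.append(current_gpa - first_term_gpa)
    # If neither baseline was reported, no change can be computed
    return max(changes) if changes else None

# A student who dropped from high school but rose after first term:
print(round(greatest_change(3.8, 2.9, 3.2), 2))  # prints 0.3
```

Collapsing the two deltas into one value rescues otherwise unusable records, at the cost of the richer two-measure analysis noted above.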

Sampling. Although the students who were sampled experienced a controlled environment, they were only 37% confident that training was available for any randomly selected factor of learning fluency. For this reason, it is impossible to tell whether self-reported mastery of learning fluency was the result of innate capacity or of training received through participation in the program. The rise in GPA between the first and current term may be a result of factors besides the development of learning fluency. To control for this, the sampling and research methods will need to change.

Research methods. The purpose of the study was to measure change in academic performance (GPA) relative to change in student proficiency in the five learning fluencies. Because the survey only took place at one point in time, it provided a snapshot of data that could not measure the change in student perceptions of proficiency in learning fluency. This meant that the study could measure the relationship between GPA and current learning fluency, but not between GPA and changes in learning fluency. Any relationship between changes in learning fluency and academic mobility could only be implied. Additional studies will be necessary to determine if helping students improve their learning fluency will lead to a decrease in the achievement gap.

Research Revisions and Recommendations

Based on the findings and limitations of this pilot study, it is possible to redesign the research methods to produce statistically significant results that apply to a broader population and demonstrate whether training in academic fluency can lead to an increase in academic mobility. The sample for such a study should be randomly selected from a variety of classrooms, ages, subjects, and programs so that the results are more broadly applicable. The selected students and those who are not selected will receive a pre-test to establish a baseline of learning fluency and academic performance. Selected students will then receive training in the five learning fluencies as a supplement to their regular program. The literature review suggested that training can improve student performance; however, the type of training has not been specified. To account for this, a control group will receive supplemental training in study and test-taking skills, while students who are not selected for either training will continue their regular program of study. After the training is complete, a post-test survey will be administered to all three groups to compare changes in learning fluency and changes in GPA.

Because the experience of the students will be controlled, the relationship between learning fluency and change in GPA can be measured on the basis of training received and not on the basis of students’ self-perceived learning fluency. This controls for the self-perception bias and provides a stronger conclusion about the effectiveness of teaching students how to learn. Students will still report their levels of fluency to evaluate the effectiveness of their training and to compare changes in learning fluency across the three groups.
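Under the proposed design, a natural post-test analysis would be a comparison of change-in-GPA across the three groups, for example a one-way ANOVA; the group scores below are invented purely to show the shape of such a test:

```python
from scipy import stats

# Hypothetical change-in-GPA scores for the three proposed groups
fluency_trained = [0.4, 0.3, 0.5, 0.2, 0.35]   # trained in the five fluencies
study_skills = [0.2, 0.1, 0.25, 0.15, 0.1]     # control: study/test-taking skills
no_treatment = [0.0, 0.05, -0.1, 0.1, 0.0]     # regular program only

# One-way ANOVA tests whether any group mean differs from the others
f_stat, p_val = stats.f_oneway(fluency_trained, study_skills, no_treatment)
print(f"F = {f_stat:.2f}, p = {p_val:.4f}")
```

A significant F would justify follow-up pairwise comparisons to identify which training, if either, drove the difference.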

Training details. In order to make this sort of training accessible to students, a pedagogy must be developed that not only defines the learning fluencies but also offers suggestions for helping students develop proficiency with them. As the drop in student performance between high school and college in this study indicated, students will be at different starting points in their development of learning fluency. This pedagogy needs not only to identify the elemental starting points of academic fluency, but also to provide the scaffolding needed for students to develop their skills to the point where they no longer need the teacher (Jenson, 2015).

Some studies have suggested that schools offer this training directly, while others advocate integrating the various learning fluencies into the standard curriculum. Regardless of the method chosen, studies like the report by Mokhtar, Majid, and Foo (2008) have shown that students require ongoing coaching from a learning expert in order to develop the skills they need for effective learning. Long-range studies are also needed to discover whether an integrated training program is more effective than a direct training program for developing learning fluency.

Program analysis. If a relationship between training in learning fluency and an increase in academic mobility is established, it will be possible to measure the effectiveness of educational programs using these factors. If a school has a significantly higher percentage of students who underperformed in high school, but now have high grades and there is a significant correlation between grades and learning fluency, then it will be possible to infer that the school does an effective job of helping students develop learning fluency. Levels of learning fluency can also be compared with the availability of training to see whether direct training in certain factors of learning fluency would be an effective supplement to existing programs.


Although this pilot study did not produce any statistically significant relationships, it did confirm the importance of continued research on the impact of learning fluency on the achievement gap. The insignificance of the findings was largely an issue of sample size and experimental control. Instead of receiving training on all aspects of learning fluency, students reported, on average, that training was available for a randomly selected learning factor only 37% of the time. Furthermore, the one-time survey was unable to measure whether training in learning fluency led to changes in student proficiency. Any relationship between improvements in learning fluency and improvements in GPA could only be implied and not directly measured.

Despite these limitations, there was still enough data to establish a weak relationship between the five learning fluencies and student performance. Comparisons of the means and correlational studies consistently showed a positive relationship between higher levels of learning fluency and both higher current GPA and greater change in GPA. For current GPA, the results barely fell short of statistical significance at the p < .10 level. This makes sense given the indications in the literature that there should be a relationship between the fluency factors and student performance.

Although these results were not statistically significant, they provide a starting point for future research that controls for student self-perception bias and tests whether training in learning fluency can increase student mobility. In this study, student mobility was defined as students moving from a less desirable to a more desirable level of performance – a reduction in the achievement gap. If improvements in learning fluency can be linked to this kind of change, it will be advisable to redesign the educational experience to help all students develop their proficiency in the five learning fluencies.


ACRL. (1989). Presidential Committee on Information Literacy: Final Report. Retrieved from http://www.ala.org/acrl/publications/whitepapers/presidential

ALA. (2000). Information literacy competency standards for higher education. Retrieved from http://www.ala.org/ala/ acrl/acrlstandards/informationliteracycompetency.htm

Andretta, S. (2000). Phenomenography: a conceptual framework for information literacy education. ASLIB Proceedings, 59(2), 152-168.

Bloom, B. (1956). Taxonomy of educational objectives; the classification of educational goals. New York: Longmans, Green.

Brookfield, S. D. (2013). Powerful Techniques for Teaching Adults. San Francisco: Jossey-Bass

Candy, P. C. (2002). Information literacy and lifelong learning. White Paper prepared for UNESCO, the U.S. National Commission on Libraries and Information Science, and the National Forum on Information Literacy, for use at the Information Literacy Meeting of Experts, Prague, The Czech Republic. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=

Codreanu, A., & Vasilescu, C. (2013). E-learning behaviors and their impact on andragogy. Conference paper for the 9th international scientific conference eLearning and software for Education Bucharest. Retrieved from http://www.researchgate.net/publication/256254358_E-LEARNING_BEHAVIORS_AND_THEIR_IMPACT_ON_ANDRAGOGY

Creswell, J. (2015). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (5th ed.). Upper Saddle River, N.J.: Pearson/Merrill Prentice Hall.

Dewey, J. (1938). Experience and education. New York: Macmillan.

Gardner, H. (1995). Reflections on multiple intelligences. Phi Delta Kappan, 77(3), 200.

Holder, G. M., Jones, J., Robinson, R. A., & Krass, I. (2006). Academic Literacy Skills and Progression Rates Amongst Pharmacy Students. Higher Education Research and Development, 18(1), 19-30. doi:10.1080/0729436990180103

Huvila, I. (2011). The complete information literacy? Unforgetting creation and organization of information. Journal of Librarianship and Information Science. doi: 0961000611418812.

Inan, B. (2013). The Relationship between self-regulated learning strategies and academic achievement in a Turkish EFL setting. Educational Research And Reviews, 8(17), 1544-1550.

Jenson, K. (2014). The great books: Hutchins, Adler, and the liberal arts. Retrieved from https://www.academia.edu/8609009/The_Great_Books_Hutchins_Adler_and_Liberal_Arts

Jenson, K. (2015). Behind the screens: developing a digital learning literacy. Retrieved from https://www.academia.edu/12279274/Behind_the_Screens_Developing_a_Digital_Learning_Literacy

Kurbanoglu, S. S. (2003). Self‐efficacy: a concept closely linked to information literacy and lifelong learning. Journal of Documentation, 59(6) 635-646

Merriam, S.B. & Bierema, L.L. (2014). Adult learning: Linking theory and practice. San Francisco, CA: Jossey-Bass.

Mokhtar, I. A., Foo, S., & Majid, S. (2007). Bridging between information literacy and information technology in Singapore schools: an exploratory study. Education, Knowledge and Economy, 1(2). doi:10.1080/17496890701372749

Mokhtar, I. A., Majid, S., & Foo, S. (2008). Information literacy education: Applications of mediated learning and multiple intelligences. Library and Information Science Research, 30(3), 195-206. http://dx.doi.org/10.1016/j.lisr.2007.12.004

Montessori, M. (2004). The Montessori method: The origins of an educational innovation: Including an abridged and annotated edition of Maria Montessori’s The Montessori method (G. Gutek, Ed.). Lanham, Md.: Rowman & Littlefield.

Palmer, P. J. (2007). The courage to teach: Exploring the inner landscape of a teacher’s life. San Francisco, CA: Jossey-Bass.

Richardson, M., Abraham, C., & Bond, R. (2012). Psychological Correlates of University Students’ Academic Performance: A Systematic Review and Meta-Analysis. Psychological Bulletin, 138(2), 353-387. Retrieved from http://dx.doi.org.ezproxy2.library.colostate.edu:2048/10.1037/a0026838

Rienties, B., Giesbers, B., Tempelaar, D., Lygo-Baker, S., Segers, M., & Gijselaers, W. (2012). The role of scaffolding and motivation in CSCL. Computers and Education, 59(3), 893-906. doi:10.1016/j.compedu.2012.04.010

Schocken, S. (2012, June). The self-organizing computer course. [Video file]. Retrieved from http://www.ted.com/talks/shimon_schocken_the_self_organizing_computer_course?language=en

Sousa, D. A. (2011). How the Brain Learns. Fourth Edition. Thousand Oaks, CA: Corwin.

UNESCO. (2006). Understandings of literacy (Education for All Global Monitoring Report). Chapter 6, pp. 147-159. Retrieved from http://www.unesco.org/education/GMR2006/full/chapt6_eng.pdf

Virkus, S. (2003). Information literacy in Europe: a literature review. Information Research, 8(4), paper no. 159. Retrieved from http://informationr.net/ir/8-4/paper159.html

Young, S., Macrae, C., Cairns, G., & Pia, A. (No Date). Adult Literacy and Numeracy in Scotland [Pdf]. Retrieved from http://www.gov.scot/Resource/Doc/158952/0043191.pdf


Appendix: Data Collection Instrument

Survey of Learning Fluency



The Great Books, Hutchins, Adler, & The Liberal Arts

“The greatest university of all is a collection of books.”

Thomas Carlyle

Kevin Jenson (October 2, 2014)

The Great Conversation

The Great Books movement began thousands of years ago with what R.M. Hutchins called “The Great Conversation.” According to Hutchins, who is credited with applying this conversation to liberal arts education through his Great Books seminars, “The tradition of the West is embodied in the Great Conversation that began in the dawn of history and that continues to the present day. Whatever the merits of other civilizations in other respects, no civilization is like that of the West in this respect” (Main Page, 2014). Over 500 elite rulers, historians, playwrights, philosophers, explorers, scientists, religious leaders, and thinkers participated in this written conversation that shaped western history and thought as we know it today.

Among them are many names commonly recognized today, like Jane Austen, William Shakespeare, Charles Darwin, Mark Twain, Sigmund Freud, John Dewey, Albert Einstein, Karl Marx, Immanuel Kant, Adam Smith, Isaac Newton, Galileo, Martin Luther, St. Augustine, Virgil, Aristotle, and Homer. Their writings cover nearly every style, discipline, and time period. As a collection, the Great Books are sometimes referred to as “The Western Canon,” and the selection is continuously changing to reflect the growth and change of the culture they underpin.

While the Great Conversation began long ago, it was not until the early days of the United States that the authors of this conversation began finding themselves together on lists compiled by people like Thomas Jefferson. As self-government became ever more important, people needed to be educated on the principles of the civilization for which they were now responsible. Lists of great books took the materials that had been used in elite circles and recommended their use by the common person.

The most famous of these reading lists, “The Harvard Classics,” was compiled by Harvard President Charles William Eliot as a commercial enterprise with the publisher P.F. Collier. While it was a commercial venture, it was driven by Eliot’s belief that anyone could receive a quality liberal arts education from a “five-foot shelf” of great books. According to Kirsch (2001), “In 50 volumes we have a record of what President Eliot’s America, and his Harvard, thought best in their own heritage…”. The publication of these resources was surrounded by public interest and controversy, as there was no consensus about which books should actually be included in the compilation.

Great Books in Education

Nevertheless, the Great Books grew in popularity, as did the resurgence of classical liberal arts education led by John Erskine at Columbia University. In 1920, Erskine taught a course called General Honors that used masterpieces from the great books for investigation and discussion. In part, Erskine’s method was a reaction to an education system that was abandoning broad liberal arts education for a more practical and technical German-style approach of memorization and rote learning of specific subjects.

Thomas Aquinas College has carried Erskine’s model to its logical conclusion by building an entire instructional method around the Great Books. “In the classroom, no more than 20 students sit around a table with their peers and with a faculty tutor as a guide, and together they grapple with the greatest works of Western civilization. There are no lectures, no didactic discourses, no simple regurgitation of others’ conclusions. Instead, ideas are proposed, rebutted, and defended, until, through discussion and critical argumentation, the class discerns the meaning of a given text…” (The Discussion Method, n.d.).

Erskine’s program ended after a few years, but was revived by Mortimer Adler, Robert Hutchins, and others, who promoted the great books through seminars and their integration into programs at key universities. When Mortimer Adler took the program to the University of Chicago, he worked with Encyclopaedia Britannica to create a compilation that became known as “Great Books of the Western World.” It included 517 works in 60 volumes that could be read, following a plan, over the course of 10 years. Interested buyers could also purchase an accompanying 12 volumes of fiction.

To be included in this publication, the books had to be relevant to at least 25 of the 102 great ideas catalogued in a two-volume “Syntopicon” of essays by Mortimer Adler. They also had to pass three tests: “first, that there is something common to us as human beings…; and second, that there exists what might be called a permanent present in which books of this kind may be found—they are contemporary to every age” (Van Doren, 2001). Thirdly, “Adler thought philosophical and other expository works, at least, should stand the test of truth” (2001).


At the time of this compilation, a majority of books that had achieved this kind of cultural influence had been written by dead, white, western, male authors. This became a primary source of contention and received some thought from Adler himself. According to Adler, The Syntopicon did not feature sexual or racial diversity because the selection was made based on the impact of the books on the great ideas of western society. Hutchins also defended this selection by saying, “The Editors believe that an education based on the full range of the Western search is far more likely to produce a genuine openness about the East, a genuine capacity to understand it, than any other form of education now proposed or practicable” (Dorfman, R., 1997).

Another source of contention centered on the use of a standardized curriculum. John Dewey disliked the Great Books programs, likely for reasons similar to those of the critical theorists who questioned whether a liberal arts education may actually be less than liberating: how are students to confront the current problems of racial and sexual discrimination by learning from the western culture shapers who gave us those problems in the first place?

That is the question educators still grapple with. The Great Books programs have an assumption of “Perennialism” at their core. Perennialism focuses on a common body of core knowledge necessary for personal development. According to E.D. Hirsch, this basic knowledge is essential for developing “cultural literacy” (Searle, J., 1990). Columbia University, among others, recognizes the value of such cultural literacy by using a scaled-down version of Erskine’s original Great Books idea for its Literature Humanities courses. Dean James J. Valentini says it helps students “engage with others in a broad way about big ideas specific to the human condition” (Cross, T., 2013).

Even if the ideas of Perennialism are not embraced per se, the books still offer insight into the human condition in a way that fosters discussion. Students do not simply look to their ideological future; they explore its foundational past to understand the ideas that have shaped and influenced it. In the process, Valentini says, they “…formulate, deliver and defend arguments both in speech and in writing” (2013).

Losing Human Centered Learning

Similarly, many other institutions continue to use the Great Books as part of their literature or liberal arts curriculum. However, as students increasingly value education as preparation for specific job functions rather than as broad intellectual formation, technical degrees are replacing the liberal arts degree. Such a trend would sadden Hutchins, who wrote, “The aim of liberal education is human excellence…It regards man as an end, not as a means; and it regards the ends of life, and not the means to it. For this reason it is the education of free men. Other types of education or training treat men as means to some other end, or are at best concerned with the means of life, with earning a living, and not with its ends” (Hutchins R., n.d.).

As education continues to trend away from that of Hutchins’ “free men,” it may be time to reconsider whether this is a positive movement. The Great Books method of liberal arts instruction may well be worthy of consideration for its value as a “means of understanding our society and ourselves. [For] They contain the great ideas that dominate us without our knowing it” (Hutchins, n.d.). Though many ideas have been valuable within local cultures, few have reached as far or as deep as those that accompanied the development of western civilization. Their end product may not yet look the way we want it to, but a simple rejection of history does not empower individuals to change the future.

According to Mortimer Adler, the Great Conversation exists because, “In the works that come later in the sequence of years, we find authors listening to what their predecessors have had to say about this idea or that, this topic or that. They not only harken to the thought of their predecessors, they also respond to it by commenting on it in a variety of ways” (Adler M., 1990, p. 28). Perhaps educators could continue this great conversation by adding their voices to its ideological foundations in an attempt to direct it toward the ideals to which they aspire.


Adler, M. (1990). The Great Conversation: A Reader’s Guide to Great Books of the Western World. Chicago: Encyclopaedia Britannica, Inc.

Cross, T. (2013). Students and Faculty Embrace Classic Readings, Modern Technology. Retrieved October 2, 2014, from http://www.college.columbia.edu/cct/spring13/features1

Dorfman, R. (1997, April 25). The Culture Wars and The Great Conversation. PBS. Retrieved October 2, 2014, from http://www.pbs.org/shattering/culture.html

Greater Books. (n.d.). Greater Books. Retrieved October 2, 2014, from http://www.greaterbooks.com/

Home. (n.d.). Western Canon. Retrieved October 2, 2014, from http://westerncanon.com/

Hutchins, R. (n.d.). The Great Conversation. Liberal Education. Retrieved September 29, 2014, from http://www.thegreatideas.org/libeducation.html

Kirsch, A. (2001, November). The “Five-foot Shelf” Reconsidered. Harvard Magazine. Retrieved October 2, 2014, from http://harvardmagazine.com/2001/11/the-five-foot-shelf-reco.html

Main Page. (2014, February 10). Wikipedia. Retrieved October 2, 2014, from http://en.wikipedia.org/wiki/Main_Page

Searle, J. (1990, December 6). The Storm Over the University. Retrieved October 2, 2014, from http://www.ditext.com/searle/searle1.html

The Discussion Method. (n.d.). Home. Retrieved October 2, 2014, from http://www.thomasaquinas.edu/a-liberating-education/discussion-method

The Great Books. (n.d.). Home. Retrieved October 2, 2014, from http://www.thomasaquinas.edu/a-liberating-education/great-books

Towards a Definition. (n.d.). The English Literary Canon. Retrieved October 2, 2014, from https://sites.google.com/site/theenglishliterarycanon/home/the-western-canon/towards-a-definition

Van Doren, J. (2001). The Beginnings of the Great Books Movement at Columbia. Living Legacies. Retrieved October 2, 2014, from http://www.columbia.edu/cu/alumni/Magazine/Winter2001/greatBooks.html




A Multidimensional Model of Assessment


Philosophical Assumptions

Assessment in Practice

Purpose/Outcomes of Assessment




Appendix A – Five Levels of Learning

Appendix B – Six Types of Academic Proficiency

Appendix C – Four Factors of Learning

Appendix D – Three Communities of Learning

Appendix E – Four Realms of Learning

Appendix F – Program Transition Assessments

Appendix G – Integrated Chart of Multi-Dimensional Model of Assessment


Download Link – multidimensional-assessment-model

Philosophical Assumptions

Assessment and learning are about the student, not the content. The term ‘education’ derives from the Latin ‘educere,’ meaning to lead out or to develop potential, which was its sense when first used in the English language. According to OxfordDictionaries.com, to educate means to “give intellectual, moral, and social instruction to someone” (Educate, n.d.). Assessment of this kind of personal development cannot be standardized, but must conform to the needs of each individual student. This assessment plan attempts to develop an individualized, multi-dimensional model of assessment that is built around students rather than content.

Content mastery is defined by the Five Levels of Learning (Appendix A) that students may choose to pursue: Awareness/Information (a field of knowledge exists), Encounter/Knowledge (defining the field of knowledge), Interaction/Understanding (personal association with the field of knowledge), Engagement/Wisdom (ability to function within the field), and Creation/Love (the field of knowledge is improved or expanded through student learning). Although most teachers would be thrilled to see their students progress through the five levels of learning, current assessment models provide little incentive for students to go beyond an encounter with knowledge (level 2). To encourage student progress, assessment should take place at level three and higher. The point of learning up to level 2 is to gauge student interest in a subject. Students who have chosen to specialize in a field should be assessed at level 3 or above.

Assessment of interaction, engagement, and creation all deal with the student as an individual scholar rather than a standardized information receptacle. This makes standardized assessment of information irrelevant. This assessment plan assumes that learning at higher levels can be inferred through the measurement of Academic Proficiencies and Outputs.


Assessment in Practice

The Six Types of Academic Proficiency (Appendix B) are the parts of learning in which students are expected to develop fluency. Observation, Communication, Organization, Execution, Specialization, and Innovation are the six categories of academic proficiency used for this assessment plan. The assumption is that students who become fluent in the six academic proficiencies will be able to use them to master the art of learning in whatever field they choose. Assessment is based on measuring proficiencies and using the results to inform learning contracts.

Outputs are the other learning factor that is assessed by this multidimensional model of learning assessment. Each student is required to participate in at least one Output Opportunity as part of his or her Program Transition Assessment Plan. The Four Factors of Learning (Appendix C) include Inputs, Learning Tools, People, and Output Opportunities. Traditional assessments usually measure student recall of inputs. This assessment model is designed to guide students to develop their Learning Proficiencies with the Learning Tools by assessing their Output Opportunities through a variety of methods used by a wide range of People.

Conversations about student learning take place on a regular basis between the student, peers, mentors, field experts, and other educational stakeholders. Similar to a company's performance reports, these assessments take place on a personal basis and are focused on creating a narrative that traces multiple aspects of student development. Those who are involved in the learning process use assessment to identify opportunities for improvement of Learning Tools and Output Opportunities. These people can be categorized into three different Communities of Learning (Appendix D): Individual as Community, Scholastic Community, and Global (Local) Community.

Individual as Community recognizes the diverse factors of humanity and attempts to develop excellence and balance among the various possible expressions of individuality. The Scholastic Community includes all those that take interest in the development of the student and his or her ideas. The Global (Local) Community includes all those whose lives the student may impact through the process of life-long learning that begins with this program.

People who assess the student will do so relative to Four Realms of Learning (Appendix E). Learning outcomes vary for each Realm of Learning; however, these are still assessed through Learning Tools and Output Opportunities. The difference lies in what the assessors are looking for in each realm: Practical (I can do this), Technical (I can explain the process), Contextual (I understand its significance), and Constructive (I am changing how it's done). The simplest form of assessment that measures all of these levels at once is student teaching.

Program Transition Assessments (Appendix F) combine all factors of assessment to provide students with guide-posts on the learning process relative to the content field they desire to master. To see how the multi-dimensional model works together in an integrated way, please view the attached Excel spreadsheet linked in Appendix G.
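As a rough illustration of how the dimensions above fit together, the integrated model can be sketched as a simple data structure. This is only a sketch under assumed names; none of the types or fields below belong to any actual program tooling.

```python
from dataclasses import dataclass, field

# The dimensions of the model, as named in Appendices A-E.
LEVELS = ["Awareness", "Encounter", "Interaction", "Engagement", "Creation"]
PROFICIENCIES = ["Observation", "Communication", "Organization",
                 "Execution", "Specialization", "Innovation"]
REALMS = ["Practical", "Technical", "Contextual", "Constructive"]
COMMUNITIES = ["Individual", "Scholastic", "Global"]

@dataclass
class Assessment:
    """One assessment conversation: who assessed, and in which dimensions."""
    community: str    # one of COMMUNITIES
    proficiency: str  # one of PROFICIENCIES
    realm: str        # one of REALMS
    narrative: str    # qualitative feedback that informs the learning contract

@dataclass
class StudentRecord:
    name: str
    field_of_interest: str
    level: int = 1    # 1-based index into LEVELS
    assessments: list = field(default_factory=list)

    def formally_assessable(self) -> bool:
        # Per the model, assessment should take place at level 3 and higher.
        return self.level >= 3

    def record(self, a: Assessment) -> None:
        self.assessments.append(a)

# Example: a level-3 student assessed by a peer in the Scholastic Community.
student = StudentRecord("Jane", "Ecology", level=3)
student.record(Assessment("Scholastic", "Communication", "Technical",
                          "Explained her sampling process clearly to peers."))
print(student.formally_assessable())  # True
```

The point of the sketch is that an assessment is a narrative tied to a community, proficiency, and realm, not a score tied to content.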


Purpose/Outcomes of Assessment

The purpose of assessment is to empower the learner to discover what he or she knows and to provide a starting point for future education. It has less to do with measuring the qualitative aspects of learning and more with measuring what framework is available for future learning. When performed in conjunction with an expert in the field, assessment can reveal aspects of study that will capitalize on student interests, strengths, and learning objectives.

Assessment and learning objectives should inform each other. Assessments are built around learning objectives; however, learning objectives should be revised based on insights from assessments. This completes the assessment cycle. Since learning objectives are designed around student learning rather than information transfer, they can only be quantifiably or comparatively measured relative to self-critical student objectives.

Objectives should not be limited to measuring the reception and duplication of information. Assessment should examine student methods of learning, student attitudes toward learning, student performance relative to ability, and the application of content to the student fields of interest.

Assessment should be completed in a timely manner. Similar to a supply-chain workflow, certain concepts need to be understood before others can be introduced. Program Transition Assessments should take place at each of these strategic points to ensure that student learning proceeds at a pace that stretches his or her ability to learn.

In summary, the Multi-Dimensional Model of Assessment is designed to measure student development rather than content transfer. It is based on the assumption that content mastery at higher levels of learning can be inferred through the measurement of Learning Tools and Output Opportunities. Many different individuals take part in the assessment process that informs the development of student learning contracts.

This particular program assessment plan needs further development of assessment tools like checklists, question guides, rubrics, and suggested action steps for each individual involved in the various aspects of the assessment process. These tools will outline the scope of assessment, how to administer assessments, how to measure results, what to do with feedback, who to share the feedback with, and how to evaluate the assessment process for improvement.



Educate. (n.d.). Retrieved December 2, 2014, from http://www.oxforddictionaries.com/definition/english/educate

Educe. (n.d.). Retrieved December 2, 2014, from http://www.oxforddictionaries.com/definition/english/educe




Appendix A – Five Levels of Learning

Awareness/Information – A field of knowledge exists.

The process of learning begins when an information gap is revealed. Information makes the student suddenly aware of his or her lack of knowledge. The job of a teacher is to create accessible information gaps. Students at the information level of learning simply know what they don’t know.

Encounter/Knowledge – Defining the field of knowledge

If a student decides to close the information gap, he or she must consciously choose to pursue knowledge. It requires a personal choice to move beyond awareness of one's ignorance and begin collecting information to remedy it. As the student's collection of information grows, it becomes knowledge of the field. There is always more to be known, but now the student is developing a framework for discovery.

Interaction/Understanding – Personal association with the field of knowledge

In this Level of Learning, the student goes beyond merely knowing about the field to interacting with the field.  This would be a systematic exploration of information and hopefully involve some hands-on learning. This level of learning relates closely to the contextual and technical realms of learning. There is give and take in the learning process as previous facts become building blocks for more advanced levels of learning. Critical thinking is a great tool for this level of learning.

Engagement/Wisdom – Ability to function within the field

Students know what to do within the field. This level of learning is only necessary for students who choose to specialize within a particular field. In a scientific experiment, this would be the point beyond planning where the experiment actually takes place. The necessary components of action are available and students can attempt to perform in the field with existing resources and tools of interaction. Students use their proficiency within the field to help them in other aspects of learning. Teaching is a great avenue for learning at this point.

Creation/Love – The field is improved or expanded through student learning

Synergy between the individual student and the field of interest leads to discovery and creation that expands the field and maximizes the value of the individual. This level of learning is only accessible to individuals that have specialized in a particular or related field. Very often this level of learning is seen at the doctorate or masters level of college. If the student has not reached a point of loving the field or loving the way the field can impact others, it is unlikely that he or she will have the inspiration to create something new.

Although most teachers would be thrilled to see their students progress through the five levels of learning, there is little incentive for students to go beyond an encounter with knowledge (level 2). Current models of assessment focus on the first two categories. To encourage student progress, assessment should take place at level three and higher. The point of learning up to level 2 is to gauge student interest in a subject. Students who have chosen to specialize in a field should be assessed at level 3 or above.

Assessment of interaction, engagement and creation deal with the student as an individual scholar rather than a standardized information receptacle, so assessment must focus on inferred learning by the measurement of something else. This can be done through the measurement of Academic Proficiencies and Outputs.


Appendix B – Six Types of Academic Proficiency

Academic proficiency factors are the parts of learning in which students are expected to develop fluency. The assumption is that students who become fluent in the Six Academic Proficiencies will be able to use them to master the art of learning in whatever field they choose. Assessment is based on measuring proficiencies and using the results to inform learning contracts.


Observation

This deals with the input factor of learning, but its application spills over into every part of the learning process. This proficiency focuses on maximizing student reception of data through physical, mental, emotional, spiritual, and relational inputs. Personality tests, reading and comprehension training, and psychological development may all fall under this category.


Communication

The process of communication begins by processing inputs. Putting names to faces, definitions to words, and rules to grammar allows students to share their experience of inputs with one another. Without the communication process, learning takes place in a silo and is severely limited. Common language is essential for success in communication; however, students go beyond written and spoken language skills to practice non-verbal communication and other types of unspoken language skill sets. As learning progresses through the five levels, the practice of communication shifts from defining inputs to sharing inputs to creating outputs. Some level of communication fluency is required for students to develop the organization proficiency.


Organization

Where communication begins by processing inputs into sharable pieces of data, organization continues by arranging the data into logical sequences. Students learn to seek and discover relationships and patterns among diverse bits of information. Skills in logic, reasoning, and critical thinking become important. Organization provides the metadata for observation.


Execution

Once students are able to receive inputs, share their experience of these inputs, and arrange the inputs into logical sequences, they are prepared to begin searching for uses for the information. This is a transition stage from learning what is to asking how it can affect the physical and non-physical worlds of objects and relationships. Execution may be as simple as remembering information. However, it will more likely be demonstrated through skills in rhetoric and dialectic as students begin to learn through constructivism.


Specialization

As students discover their propensities toward certain types of execution, they can begin to personalize their interaction with information. They know how to find, absorb, share, organize, and use information effectively. Now they take control of this process to achieve some particular goal that they desire. Learning becomes a way for students to express their individuality; it is integrated with their lifestyle. However, students are still dependent on received skill sets.


Innovation

Innovation can begin as early in the process of learning as observation; however, it should come fully to life and be the primary function of students near the beginning of the fourth realm of learning. Innovation involves independent and novel engagement with all academic proficiencies and combinations thereof. Students become scholars who create their own tools of learning.

Assessment reveals which areas of proficiency need more focus. Proficiencies are measured relative to student goals as determined by Life and Learning Coaches, Discussion Groups, Subject Groups, and learner Self-Assessment.




Appendix C – Four Factors of Learning

Academic Proficiencies and Outputs are two of the four factors of learning. They are the point at which inputs are converted and applied. Typically, assessment measures the effectiveness of inputs. However, as noted above, assessment must go beyond this to provide incentive for advanced levels of learning.


Inputs

Nearly every aspect of life can be an input for learning. Individuals are surrounded by inputs. Books, lectures, personal experience, conversations, observations, media, online resources, thinking, imagination, and more can become effective tools of learning as students develop academic proficiency. An excellent student is able to learn from every aspect of life. Education exists to equip students to do so.

Learning Tools

Inputs are not valuable to people who lack the right tools. For the sake of this assessment plan, these tools fall under the six categories of Academic Proficiency: Observation, Communication, Organization, Execution, Specialization, and Innovation.


People

Assessment of Learning Tools and Output Opportunities takes place in the context of relationships. Outside of this context, assessment cannot provide meaningful opportunities for change or improvement. Many of the Academic Proficiencies also depend heavily on a student's ability to apply their principles in the context of relationships. For the sake of definition, these contexts of relationships have been categorized into three communities of learning: Individual as Community, Scholastic Community, and the Global (Local) Community.

Output Opportunities

Most students will take advantage of different kinds of output opportunities that reflect their personalities, interests, and fields of specialization. Output opportunities can be as simple as a discussion with another person or as complicated as developing an organization. Output opportunities used to design this assessment model include:


  • Discussions with discussion groups or student union
  • Presentation on specialty field or as a project
  • Teaching: every student functions as a teacher within field of specialty
  • Mendota Portfolio: Collects student research and tracks digital community participation.
  • Project: applying the field to solve a particular problem.
    • All students participate in a project upon entering the second learning realm
    • Project examples: public education, new business, non-profit startup, consulting, or employment.

Conversations about student learning take place on a regular basis, similar to a company's performance reports. Those who are involved in the learning process use assessment to identify opportunities for improvement of learning tools and outputs. These people can be categorized into three different communities of learning.



Appendix D – Three Communities of Learning

Individual as Community

  • Example Competencies: Reading, Comprehension, Memory, Logic, Analysis
  • Individual Students (Self-assessment) – assess attitudes, perceptions, experiences, difficulties, passions, progress etc…
  • Life and Learning Coaches – assess learning methods and context to maximize student experience. Coaches are classified under individual community since they are a resource for maximizing individual community.
  • Academic Coach: assesses learning contract changes to ensure compliance with accreditation and the desired graduation date.
  • Output and Proficiency Specialist: assess student performance in particular output or proficiency category.

Scholastic Community

  • Example Competencies: Rhetoric, Dialectic, Mendota Score
  • Subject Group – Assess value of interaction around content.
  • Discussion Groups – assess communication skills, leadership, and other skills outlined by the “Discussion Group Rubric for Peer Review”
  • Project Group – three way assessment of practical application of information and skills by and for leaders, juniors, and peers
  • Learning Group – Assess learning methods and life performance
    • Begins with Year 1 Peers and does not change through academic career
  • Students – because every student learns through facilitating learning for others, there is excellent opportunity to receive feedback from these learners on academic proficiencies and certain kinds of output. Defined more fully in evaluation plan.
  • Mendota Score: automatically assesses public/academic perception of scholar’s contribution to the field of learning as a whole, may be drilled down to measure engagement, contribution, original content, learning proficiencies, etc… in student’s field.

Global (local) Community

  • Example Competencies: Project, Publication, Teaching, Online
  • Peers in the field (Constructive Realm Only) – assess content/field contributions from creative process
  • Field Experts – assess content mastery looking for learning opportunities
  • Project Stakeholders (beneficiaries, connections, sponsors): assess the effects of learning/project
  • Online Connections: informal assessment of community interaction, collaboration, and content leadership.
  • Presenter/Learner events:
    • Learner assesses value of content, teaching methods, and communication
    • Presenter assesses information transfer and perceived engagement on a group basis.

Assessment takes place by members of each category. Students must interact on all community levels to master a subject.

Appendix E – Four Realms of Learning

  • Practical – I can do this
  • Technical – I can explain the process
  • Contextual – I understand its significance
  • Constructive – I am changing how it’s done.


The realms must proceed in order; otherwise, learning is destructive or useless. Similar types of assessment take place at all levels, depending on the student's degree of field mastery.

Learning skills include all six of the academic proficiencies exercised in the three relational communities. Students are required to participate in training on each proficiency throughout learning realms 1 and 2. These required courses may be governed by separate learning contracts, but most likely they will be standardized as core curriculum. Assessment, however, still takes place along the same multi-dimensional model and affects the students' learning contracts for their field of interest.



Appendix F – Program Transition Assessments

Program Completion

This level measures student readiness to participate in ongoing learning, contribution, and/or performance in his or her selected field of interest. Graduation is determined by successful entry into the field of interest. Students will not graduate until that point, and once graduation occurs, learning continues through sharing knowledge, building knowledge, mentoring new students, and expanding breadth of knowledge through experience.

Module Completion

This level measures student readiness to participate in the next stage of learning. It includes 360-degree student assessment covering the mental, emotional, physical, spiritual, and relational aspects of learning, and is designed to measure the student's relationship to learning, subject material, field of interest, and future ambitions. Failure to complete a module indicates that the student may be pursuing the wrong field or needs to take remedial steps to fix learning issues.

Each student will complete a module built around the "Project" output. The final project is assessed by peers who approve continued application of the material, by field experts who give money or advice to the individual, and by the individual's mentors. It may also include online rating trends for research, discussion contributions, and team leadership.

Learning Contract Completion

This level measures whether the student has completed the requirements of his or her learning contract. Assessments at this level help field experts and academic advisors determine student needs and abilities for ongoing learning.

Overall completion of one large project (Module) assesses learning and application of class objectives. The specific form of application is selected by the student. Smaller pieces are then broken out into the learning contracts necessary for the larger project; these are completed and assessed along the way.

Learning Sub-contract Completion

This level determines student ability to function in the skill or knowledge set defined by the learning sub-contract. Required for revising learning sub-contracts and continuing to subsequent sub-contracts. Sub-contract completion provides the ongoing checkpoint for student progress.

Teachers should try to create contracts whose completion requires the application of knowledge, critical thinking, and relational skills, then measure various levels of these aspects in the completed work to determine the emphasis of the next contract.

Learning Objective Completion

This level measures completion of activities that require interaction with certain skill or knowledge sets. The emphasis of assessment is on research, writing, speaking, thinking, communication, interpersonal, and creative skills. How does the student learn? What is the quality of this learning?

Learning Objectives are all mastery exercises. If a student puts forth the effort to master each one, then the final grade reflects mastery. If not, the final grade for the overall project will suggest room for improvement; however, students will know exactly where this is and how to fix it in the next contract.

Ongoing Assessments

Ongoing assessment includes measurement of student experience of learning, attitudes, self-awareness, relational skills, interests, personality, career aptitude, areas of difficulty, areas of success, life effectiveness, and more. It takes place with students' Life and Learning Coaches, Field Experts, Subject Groups, and Learning Groups. Evaluations of the program, instructors, learning experiences, and subject matter take place regularly but are described more fully in the evaluation plan. Assessment of required core curriculum (Learning Proficiency training) takes place on a predefined basis or at teacher discretion, but follows the same multi-dimensional model.

Ongoing assessment is three-dimensional: Learning Proficiencies, Outputs, and Life Skills are all assessed. Recall and repetition are not assessed (as in typical assessment programs) but are suggested as a learning method to increase the efficiency of application. Students are expected to learn through experience.

All areas of learning are peer- and mentor-reviewed through weekly discussions and interactions around information and individuals. Many of the interactions within the scholastic community take place through Mendota and can be ranked on a randomized basis by various participants.

Outputs are assessed through completion of projects. Life Skills and Learning Proficiencies are both assessed through the quality of the project and through weekly check-ins with the Life and Learning Coach, and student Learning Group.



Appendix G – Integrated Chart of Multi-Dimensional Model of Assessment





ROI Analysis for a Week-Long Program

Kevin Jenson

Colorado State University

December 16, 2014



Program Description

In an attempt to begin applying some of the theories I have developed over the past several months, I am developing a workshop that teaches the basic tools of learning to adults who are occupied with work or education. While the details of the program are still in process, there are a few intended results that can be measured in terms of financial benefit to participants.

In short, the program takes place over a single week or, less intensely, over a four-week period. During this time, participants receive practical knowledge of two or more learning tools; an overview of their personal strengths, weaknesses, and interests; practice with leadership and group learning; and coaching in developing a learning contract with themselves. Participants in this program learn how to learn. When they leave, they should have a plan for continuing their education through relationships in a particular field or problem area of passion or relevance.

Because of its short-term nature, the return on students' investment in this program can be expected to occur within the week of participation or shortly thereafter. Any habits not formed or results not seen during this time are unlikely to appear without some other external input. Nevertheless, habits, mindsets, and skills that are acquired during the workshop will return their investment long after the program is finished.


Purpose of measuring ROI

Measuring return on investment has value for the individuals who are involved in the program. It can help individuals make the decision to participate based on the perceived tangible and intangible benefits they could receive. It is also a means of determining how to produce the greatest amount of value for individuals who choose to participate. Measuring ROI has benefits beyond simple proof of a program's effectiveness if it is designed as a feedback loop, similar to the process for valuations.

This requires that ROI measurements move beyond a simple yes or no answer as to whether the program was effective. The reason that students participate in the workshop in the first place is because they believe its content (or what they learn through the experience) will help them solve a problem or become a starting point for improvement. Some questions that will help turn the ROI measurement into a feedback loop include: What worked? Why? How do we change the program so that it can meet the objectives? What objectives does it meet if not the intended ones?

The program may return a positive ROI to students, but not in the way expected. The one difference between ROI and evaluation is that the value of accomplished objectives is measured in financial terms and compared to other programs on the same basis. ROI could be used as a comparative measure: how does one program return the money invested relative to another? On the other hand, the ROI process could be flipped so that the various outcomes determine which aspects of the program are most valuable. This could help program managers determine where to place greater emphasis in development.

Depending on their attitudes, program directors may choose to focus on those things that produce the greatest ROI. However, there are two drawbacks to this. First, it is possible that the instruments designed to measure ROI are ineffective at pinpointing the real benefits of the program. As with most conversion of social activity to financial measurement, ROI is based on assumptions that can just as easily be proved as disproved. Second, it is possible that the ROI from the most beneficial aspects of the program is measured in a different way. Similarly, the same content and experience may produce a different ROI depending on the individuals that are involved.

Despite the difficulties of measurement, ROI still has value in providing a concrete basis on which to determine a price to charge students and an overall understanding of the value of the service. In this case, I hope to use ROI to determine the price the program can charge by working backwards from the estimated value of the program to its cost. Sometimes in business, an increase in price increases the total value produced by the program because of the psychological value participants attach to it.


ROI Schema

So far the greatest interest in this workshop has come from individuals who don’t necessarily want to commit to a college degree and who do not simply want access to information through a free massive online learning platform. For them, the value of the workshop can be measured in terms of time and improvements in their work and study effectiveness. They want to learn and perform, but without paying the necessary costs of time and money required by a degree-based program. This workshop gives them the tools to design their own learning experience and maximize learning through life.

Two or three primary benefits can be measured for ROI for these individuals. The first is the cost and time required by alternative degree-based courses. Time can be calculated based on the number of hours a student would be required to invest in a program that was not customized to his or her interests. An average of 8-10 hours weekly over 12 weeks gives a total of 96-120 hours that a student would be expected to spend in an information-centric program of which they may or may not be able to take full advantage.

Second, financial benefits can be calculated from the costs saved on other academic programs (averaging $300-$3,000) and on time spent in less relevant training (2 hours at $25/hour = $50); remedial expenses that would otherwise be required to make up for not participating can also be built into the ROI calculation.

A third factor of the ROI calculation looks at the increase in student efficiency of working or learning. Time can be added in from work or other studies the student currently participates in that will be enhanced by participation in the program. If effectiveness in a single aspect of work or study (for example, meetings) increases by 10-25% for a participant who has an average of 2 meetings per day, this leads to an extra 1-2.5 hours of time savings every 5-day week. Over 12 weeks, this single aspect of efficiency could accumulate to 12-30 hours of additional time for the participant. Ongoing efficiencies in learning that develop over the course of participation will continue to give the benefit of time to the student, but at a decreasing rate. To average the amount of time saved over a year, multiply the 12-week calculation by 3 or 4. Total time savings should be multiplied by a factor of the participant's hourly wage (average $25 [Table B-3, 2014]) to determine overall ROI.
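The time-savings arithmetic above can be sketched as a short calculation. The one-hour meeting length is an assumption (the text does not state meeting duration); the 12.5% gain, $25 wage, and 3-4x annualization come from the estimates above.

```python
# Hedged sketch of the annual time-savings portion of the ROI estimate.
# Assumed: each meeting lasts one hour (not stated in the program description).
meetings_per_week = 2 * 5     # 2 meetings/day over a 5-day week
efficiency_gain = 0.125       # conservative 12.5% improvement
weeks = 12                    # base measurement period
annualize = 3.5               # midpoint of the "multiply by 3 or 4" rule
hourly_wage = 25.0            # average wage (Table B-3, 2014)

hours_saved_12wk = meetings_per_week * weeks * efficiency_gain
hours_saved_year = hours_saved_12wk * annualize
dollar_value = hours_saved_year * hourly_wage

print(hours_saved_12wk)  # 15.0 hours over 12 weeks (within the 12-30 range)
print(hours_saved_year)  # 52.5 hours per year
print(dollar_value)      # 1312.5 dollars of participant time
```

Changing the efficiency or wage inputs shows how sensitive the final dollar figure is to those two assumptions.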

Costs of the program include the 25 hours of recommended time (times the wage multiplier) and the price tag of participation. For the purposes of program design, price is the unknown variable that will be calculated based on ROI measurements.


ROI Measurement and Calculation

All of the previous calculations are estimates, but they can be verified through the following methods. The course opens with a group activity in which all participants form teams to solve a problem. This activity forms the basis of learning throughout the week and is compared to a final group activity of a similar nature. Students are asked to compare the two activities on a survey and this becomes the basis of estimating efficiency before the program and efficiency after the program. This difference is then multiplied by the average wage factor (or an exact wage factor) if students choose to anonymously contribute such data.

The following chart shows data for 6 sample trials run with various reported levels of time used and confidence in the solution developed. In this example, trials 5 and 6 were completed by the same group: trial 5 took place before the training and trial 6 after. While the group used the same amount of time to come up with a solution to the problem, their confidence increased by 20 percentage points, leading to an overall calculation that their efficiency as a group had increased by (2700/2100 ≈ 1.286) about 28.6% as a result of the program. If the situation were reversed and trial 5 took place after the training, the efficiency of the group would have decreased by (2100/2700 ≈ 0.778) about 22.2%. This would lead to a negative ROI calculation for the program and a reconsideration of its benefits.


Trial | Time Allowed | Time Used | Difference | % Confidence | Efficiency Factor
  1   |      60      |    60     |     0      |      80      |         0
  2   |      60      |    55     |     5      |      80      |       400
  3   |      60      |    50     |    10      |      90      |       900
  4   |      60      |    40     |    20      |      80      |      1600
  5   |      60      |    30     |    30      |      70      |      2100
  6   |      60      |    30     |    30      |      90      |      2700


It is assumed that most students will see an average of 10 to 20 percent improvement in their efficiency over the course of the program. This change should take effect immediately, but produce much of its benefit beyond the week or month in which the course takes place. In the following formula, an average efficiency gain of 12.5% is used to keep the ROI estimate conservative. As students actually complete the program, ROI estimates will be readjusted to reflect actual results.


Benefits Calculation Worksheet

Blank spaces are for student inputs. The estimates used in this paper have been included in parentheses.

Cost of participating in alternative program _______ (estimated earlier at $300)

Time saved from alternative program _________ (averaged earlier at 108 hours)

Time saved from work/study efficiencies _________ (estimated 12.5% efficiency *12 weeks *40 hours = 60 hours)

Ongoing time savings___________ (Multiply previous result of 60 by 3.33 = 200 hours)


Total Hours saved _______ (108 + 60 + 200 = 368)

Total of ___________(368) hours of value multiplied by __________$25/hour (average US hourly wage).


____________($9200) = Total Measured Hourly Benefits to this Student

Add in the cost savings of an alternative program at _________ ($300).


Total Benefits of the program = _______________$9,500   


Cost Calculation

25 hours of required time multiplied by the wage multiplier ($25) = $625

Estimated Cost of the Program = $1500


$2125 = Total Measured Costs to Student


ROI Calculation

Return on investment is calculated by dividing net benefits by total costs. Net benefits are calculated by subtracting total costs from total benefits.


$9,500-$2,125 = $7,375 = Net Benefits


Net Benefits / Total Costs = ROI

$7,375 / $2,125 = 3.47 = 347%
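The full worksheet arithmetic can be sketched as follows, using the estimates from this paper; a student would substitute their own inputs. Variable names are illustrative:

```python
# Sketch of the full ROI worksheet using the paper's estimated figures.

WAGE = 25  # average US hourly wage (Table B-3, 2014)

hours_saved = 108 + 60 + 200            # alternative program + efficiencies + ongoing
hourly_benefits = hours_saved * WAGE    # 368 hours * $25 = $9,200
total_benefits = hourly_benefits + 300  # plus alternative-program cost savings

total_costs = 25 * WAGE + 1500          # required time + program price = $2,125

net_benefits = total_benefits - total_costs  # $7,375
roi = net_benefits / total_costs             # ~3.47, i.e. 347%
print(f"ROI = {roi:.0%}")  # prints: ROI = 347%
```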


Return on Investment for this program with a price of $1500 is 347%.


Response to ROI

Based on this calculation, the program produces a 347% return on investment at a price of $1500. This suggests it will be worthwhile to continue offering the program, though it will be difficult at first to convince students of its $1500 value. Until its value is proven, the program will likely need to provide closer to a 1000% return on investment through a much lower price tag.

As students fill out the calculation worksheet, they will discover the ROI particular to their own situation (alternative programs, changes in efficiency, hourly wage rates, etc…). If I am marketing to college students rather than young professionals who want to continue their education outside a college context, I will need to use a lower wage conversion rate.

Because of the complexity of the ROI formula, it would probably be easiest if this was converted into a digital format that required students to enter the time it took to participate in the measured group discussions, their confidence factor, and their wage rate. Other factors may be suggested and altered as students prefer to change the ROI. This makes the process simple and allows students to focus on the result rather than the process of calculating ROI.

Other factors not directly included in this formula are the potential for pay raises and promotions as students continue their education and work more effectively within their teams, increased ability to learn from future classes, self-awareness, self-confidence, new learning methods, improved leadership, higher GPA, higher levels of learning and engagement, and perhaps an increased ability to enjoy life. These things are not necessarily taught through the workshop, but are potential intangible benefits that may result from learning how to maximize learning through the experience of life.




Table B-3. Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted. (2014, January 1). Retrieved December 16, 2014, from http://www.bls.gov/news.release/empsit.t19.htm





How to Use Web 2.0 Technologies for Education:

Seven Elements of Learning Activity Design

This report makes an effective case that users of Web 2.0 Technologies in a learning context need some kind of training to make them work for education.

Produced as a visiting scholar at Macquarie University under the supervision of Dr. Matt Bower.


Because users often do not know what to do with Web 2.0 technologies, they have been applied in higher education with limited success. Research indicates that successful engagement with learning in the Web 2.0 environment requires an activity framework to create a bridge between learning environments and technology affordances. The context of Web 2.0 and the affordance frameworks for analysing what technology can do are highlighted as the primary approaches to developing a successful integration of Web 2.0 technology in higher education. However, in the process of synthesizing this research, several elements emerged that were commonly employed by researchers to suggest the design of effective learning activities with Web 2.0 technologies. These seven elements identify what to do with Web 2.0 technologies, while an analysis of seven pedagogical paradigms suggests how these activities should take place. Considered together, the elements of activity design and the pedagogical frameworks create 49 unique pedagogy/activity intersections that inform the selection and integration of technology from the perspective of student needs. A short description of each intersection and relevant technologies are provided, and a proposal is made for their use in developing future technology affordances and pedagogical practices.

Keywords: affordance, pedagogy, technology, Web 2.0, learning activity, behaviorism, cognitivism, social cognitivism, humanism, self-directed, constructivism, connectivism, plan, find, curate, interact, create, publish, assess, teaching and learning, learning environment, technology design, learning design, human-centered learning, activity framework


Download Link – how-to-use-web-2-0-technologies-for-education-kevin-jenson
