Gaming the future of higher education

by Bryan Alexander, Senior Fellow, National Institute for Technology in Liberal Education

What will happen to higher education in the future? Versions of this question have been asked with increasing frequency over the past decade, especially since the 2007-2008 financial crisis and the challenging economic environment for colleges and universities that followed. Demographic, political, technological, and institutional developments have added to an atmosphere of tension and impending crisis. High-profile conferences have summoned campus leaders and media attention to ponder the fate of academia.[1] State and national political campaigns energetically discuss details of college tuition, staffing, curriculum, and policies. The University of Virginia’s board controversially (and briefly) deposed a president over strategy concerning these issues.

Can we use gaming to improve our ability to think through these challenging times?[2] I pose this surprising question because of the parallel rise of another trend from the past decade. The uses of gaming for learning have been much discussed, experimented with, and developed since the 2003 publication of James Paul Gee’s landmark book, What Video Games Have to Teach Us about Learning and Literacy.[3] There has been a flood of discussion about the various ways games and simulations can enhance skills learning, convey curricular content, be used in libraries, and serve as the object of an emerging academic discipline, game studies. Games, “serious games,” and simulations have reached beyond academia into the realms of policymaking, entertainment, and news media.[4] Gamification, the use of game mechanics beyond formal game content, is being explored as a way to influence business, policy, and daily life. To propose using games to think through education’s fate is actually consonant with the tenor of our times.

This article will describe the results of two projects conducted over the past four years. I will explain the operation of a prediction market and of an ongoing scenarios practice. At the same time I would like to explore the intersection between two conceptual fields: futuring and the cognitive affordances of gaming. My contention is that these gaming techniques offer useful insights for members of the higher education community. These approaches can also be readily replicated, built upon, and developed further.

The audience for these games was and remains campus leaders, such as deans, chief information officers, faculty, and librarians. Students could also play a role, but did not in these projects.

1: prediction markets

Out of the full range of computer game genres and types, prediction markets occupy a relatively small corner. Also referred to as futures markets (in a pun on the economic tool’s name), prediction markets are simulated bets on forthcoming events. They resemble commodity futures trading markets, in that players take positions in (i.e., buy) possible outcomes of current processes. Rather than speculating in commodity prices, a market’s players estimate the likelihood of an election, an institutional transition, the relative popularity of a Hollywood star, or which marketing strategy a competitor will select. The games usually rely on pretend money, but the rest of the market’s structures are quite real.[5] One of the best-known prediction market hosting sites was Intrade, where users could find (until April 2013) markets in presidential elections, movies, and technological development.[6]
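
To make the mechanics concrete, here is a minimal sketch of how such a market can turn trades into probability estimates. It uses Hanson’s logarithmic market scoring rule, one common mechanism; the sketch is illustrative only, and the class name, liquidity parameter, and example outcomes are assumptions rather than a description of any particular hosted market.

```python
# Illustrative sketch only: a toy prediction market using Hanson's
# logarithmic market scoring rule (LMSR). The pricing mechanism used by
# the hosted markets described in this article is not specified here.
import math

class ToyPredictionMarket:
    def __init__(self, outcomes, liquidity=100.0):
        self.liquidity = liquidity                  # higher = prices move more slowly
        self.shares = {o: 0.0 for o in outcomes}    # shares sold per outcome

    def _cost(self, shares):
        b = self.liquidity
        return b * math.log(sum(math.exp(q / b) for q in shares.values()))

    def prices(self):
        """Current price of each outcome, readable as a probability estimate."""
        b = self.liquidity
        z = sum(math.exp(q / b) for q in self.shares.values())
        return {o: math.exp(q / b) / z for o, q in self.shares.items()}

    def buy(self, outcome, quantity):
        """Buy `quantity` shares of `outcome`; returns the (play-money) cost."""
        before = self._cost(self.shares)
        self.shares[outcome] += quantity
        return self._cost(self.shares) - before

market = ToyPredictionMarket(["virtual worlds", "augmented reality"])
market.buy("augmented reality", 40)   # a trader bets on augmented reality
print(market.prices())                # e.g. {'virtual worlds': 0.40..., 'augmented reality': 0.59...}
```

In this toy market the prices across outcomes always sum to one, so each price can be read directly as the crowd’s current probability estimate for that outcome.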

Several reasons for supporting or playing a prediction market have emerged since the first game, the Iowa Political Stock Market (now the Iowa Electronic Markets), was launched during the 1980s.[7] First, since markets are continuously open for trading, players can trade in response to events or thoughts as they occur to them during daily life. A prediction market thus serves as an always-on site for personal expression, albeit mediated by the selection of topics. Unlike polling, the surveyed population interacts on its own terms, possibly yielding different data. Feedback on a topic can be continuous or ongoing, without the time limits that surveys require. Second, prediction markets take advantage of some of the affordances inherent in games: competition with other players, the pleasure of winning, a sense of fun. These two reasons combined can make markets appealing to an institution that fears waning or dispirited responses to surveys. A third reason is that markets are crowdsourcing tools. They are not easily dominated by a single will, since any individual trader is heavily out-budgeted by the rest of the players. Questions can be put to a multitude, hopefully eliciting the famous wisdom of crowds.

In order to exploit this gaming genre for higher education, the National Institute for Technology in Liberal Education (NITLE) launched a prediction market site in spring 2008. This was a soft launch, involving several beta-testing groups, advisors, NITLE staff as administrators and players, and demo-style markets. Those markets (or “questions”) focused on the impact of technology on colleges and universities. NITLE selected Inklings as a vendor. That firm hosted prediction market code on its servers, letting administrators and players interact through familiar, easy Web 2.0-style options (tags, comments, drop-down menus, radio buttons, simple workflow, blog-style page configuration, etc.). After playing the markets for several months and reflecting on the experience, NITLE staff organized new markets for a public launch in September 2008. Since then the site has run continuously. To date, 441 users have created accounts, of whom roughly 10% have been active, persistent players.[8]

A snapshot of prediction market activity helps explain the project. Users can interact with the site right away, without having to create an account. On any given day they will see a list of active markets, such as

How many American universities will have online, open, credentialed programs by the end of 2012?

Will Google and Facebook collapse by 2017?

Which academic fields will dominate mobile apps?

How much will average per-student expenditures on course materials drop annually over the next 5 years?

When will the number of journals in the Directory of Open Access Journals reach 8000?

Inactive markets are also accessible, such as “Which will receive more attention by April 1, 2011: virtual worlds or augmented reality?”[9] “Inactive” describes markets whose futures have already occurred.

Clicking on a market reveals a Web page that resembles a minigame, a game within the larger game. For example, the aforementioned online, open credentialing market[10] displays the market’s question, its range of outcomes, two discussion areas (general discussion plus “Reasons for Prediction”), a visualization of trading activity, more information about the topic, and embed code.

[Figure 1: a market page on the NITLE prediction market site]

Without logging in, a site visitor cannot buy or sell shares in this market, but login and registration links are available from each page, and account creation has been free for the lifetime of the project. Logged in, the user can buy or sell shares throughout the site, up to the limits of the (fake) money in their account.

[Figure 2: the trading interface for a logged-in user]

Looking back over four years of game play, I think the prediction markets have been very useful as a futuring tool, beyond being fun to play. Trading has yielded insights into trends and events. For example, the question above about virtual worlds versus augmented reality consistently demonstrated low levels of interest in the former, despite years of educational work in Second Life and other realms. This is useful for campus planning purposes, especially in an era of tight budgets. For another example, a question about alternatives to PowerPoint revealed a broad preference for Google’s Web-based presentation tool over very visible competitors.[11] This may indicate institutional buy-in for Google Apps, or a sense of the company’s titanic presence; comments also show criticism of those competitors. It also suggests ways for campuses to invest money and staff support time.

Do prediction markets help us think through the future of higher education? To a limited extent, yes: market results represent the present’s best guesses about alternative outcomes of a chosen trend or issue. To make a trade, players must weigh current forces and their emergent properties while anticipating the emergence of new drivers. The act of play, therefore, is a futuring one. It teaches players to be futurists: assembling current data, extrapolating trends, analyzing the unfolding present strategically. Prediction markets are, in this sense, pedagogical tools, or heuristics. Their results are also useful in the narrower sense of yielding foresight on specific topics.

However, prediction markets limit that futuring by confining play to testable outcomes stemming from the chosen question topics. Unanticipated events or questions – Rumsfeld’s “unknown unknowns” – are, by their nature, not in play in advance. Thematic limitations apply as well. Our NITLE prediction markets have been devoted to the impact of technology on education rather than to other drivers. This can be addressed by generating new markets on other topics.

2: scenarios and role-playing

A very different type of game is scenario-based role-playing. A scenario is a narrative about a possible future based on the activity of selected forces. For example, we could imagine a future where the United States economy becomes vibrant (due to widespread development of shale oil, say) and open content trumps closed. Conceiving what education would be like in such a world – how would libraries change? Would information literacy become central to the curriculum? Would states return to earlier levels of funding for public education? – is the work of a scenario creator, and then of players.

Role-playing begins once a scenario leaves its creator, and this occurs in several ways. At the most basic level, a reader or viewer can consume a scenario, then think about how their life would change in that future. For the previous example, a sociology faculty member might imagine a transformed or marginalized Social Sciences Citation Index, while wondering what kind of social imagination an 18-year-old would have after growing up in such a world.[12] At a more social level, scenarios ask groups to pursue this kind of imagination collectively. At a more advanced level, groups may be asked for more formal role-playing activity, such as being assigned different positions and having to work through a series of questions and exercises. Such social activity can take place face-to-face or online, either all at once or distributed over time.

What are the futuring benefits of scenarios? As with prediction markets, the element of fun counts for much in engaging a group. Play triggers creative thinking, a necessity for envisioning the famously difficult-to-pin-down future. Playing one’s current role in a future situation lets players bring their local, tacit knowledge to bear; playing a different character frees one from habits and some assumptions. Role-play can help defuse inter-group conflicts by giving participants time to anticipate their interactions, a kind of preemptive cooling-off period.[13] Additionally, scenarios can elicit emergent behavior from players over time. That is, after a round of initial reactions (“SSCI would disappear!”), players rethink their positions, partly in reaction to others. New ideas appear, and submerged thoughts can surface, leading to another round of forecasting. The game organizer can continue this, and facilitate more thinking out loud, by asking leading questions: how would your department’s professional development strategy change? What happens to your next accreditation round? The organizer can provoke still more emergence by adding new scenario content (“Five years later, shale oil collapses, and the American economy slumps again”).

In 2009-2010, in response to the futuring challenges with which this article began, I began scenario work as NITLE’s senior fellow in order to realize these benefits. I built up a scenarios library based on trends identified through other futures methods, including the NITLE prediction market, the New Media Consortium’s Horizon Report, and continuous environmental scanning.[14] I used social media to test scenarios, in part and in whole, and to gather feedback, developing them through an iterative process. Over time I practiced several role-playing approaches, each based on a plurality of scenarios.

First, presentation to and interaction with a mass audience: rather than offering one’s best, educated guess as to the future of academia, this method involves presenting a set of competing scenarios. For example, one set of five draws heavily on technological drivers. In one future, open content, open access, and open source dominate the world; in its opposite, silos and proprietary systems win. In a third, gamification has reshaped society, while augmented reality takes its turn in the fourth.[15] A different set focused on two major drivers (globalization and digitization) that combine to make four possible outcomes: high globalization and low digitization, low globalization and high digitization, and so on.[16]
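
For readers who want to replicate the two-driver construction, the sketch below shows how crossing two drivers at high and low settings yields four scenario skeletons. It is a minimal illustration only: the driver names come from the set just described, while the data structure and titles are hypothetical rather than taken from the NITLE materials.

```python
# Illustrative sketch: crossing two drivers at "high" and "low" settings
# to produce the four scenario skeletons of a classic two-axis futures matrix.
from itertools import product

drivers = ["globalization", "digitization"]
levels = ["high", "low"]

scenarios = []
for combo in product(levels, repeat=len(drivers)):
    settings = dict(zip(drivers, combo))
    title = " / ".join(f"{level} {driver}" for driver, level in settings.items())
    scenarios.append({"title": title, "settings": settings, "narrative": ""})

for s in scenarios:
    print(s["title"])
# high globalization / high digitization
# high globalization / low digitization
# low globalization / high digitization
# low globalization / low digitization
```

Each skeleton then needs a narrative, which is where the real scenario-writing work lies; the code only enumerates the quadrants to be filled in.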

After explicating these new worlds and some ways they change higher education, the audience is invited to answer two questions: which would you prefer to occur, and which do you think is most likely to transpire? The first question is an implicit role-playing exercise, since every respondent can be queried for their reasons, revealing their institutional role and organizationally framed mindset. Discussion surfaces a variety of ways of thinking about technology, students, faculty, economics, and more – i.e., the broad gamut of futures thinking. Asked to support their choices, audience members then reveal sources and experiences, adding to the collective understanding. Usually at least one person will advance up Bloom’s taxonomy to synthesize two or more scenarios, arguing for a different model of higher education in the next decade.

Second, small group work with tailored exercises: multiple scenarios are again prepared for a large audience, but with two differences. Each scenario is very narrowly focused on a specific event, and the larger group is broken down into smaller groups (seminar sized, at most), each one receiving a single scenario.

For example, a scenario based on developments in e-books and e-textbooks:

A coalition of textbook publishers has announced an e-book consortium. This group, GoodETexts, is starting up with a bang, releasing a joint catalogue of offerings, to be followed by more each season. Titles cover all major disciplines, and fuller coverage is promised.

Most of these texts are electronic versions of existing titles, emphasizing black and white text. Illustrations are supported, sometimes in grayscale, otherwise in color. GoodETexts says more interactive content is forthcoming, as are “born digital” or “digital only” e-books.

All members of GoodETexts must release their e-books in formats readable by laptops, netbooks, tablets, and several e-readers. Not cellphones. Amazon has announced that its Kindle can read these formats; so far, Barnes and Noble’s Nook cannot.

Prices are significantly lower than those of print editions, running 60-80% of the cost of paperbacks and hardcovers.

Faced with such specific future events, all groups are instructed in a workflow for responding. I have varied these instructions, but they share common characteristics. Groups must imagine how they would respond to that event in their current roles and explain their decision-making process: do you engage with the new development or not? How did you make this decision? If you engaged, what was the first stage of that action? Social knowledge questions also come into play, such as: how would you be most likely to have learned about this development, and, if you engaged it, how would you share the results? For example, the leader of an academic computing unit might learn about this e-book project from a librarian in their merged organization. They would bring news and ideas to their chief information officer, who usually makes these decisions herself. The CIO might decide to engage with GoodETexts in order to keep up with the rapidly changing e-textbook world, and to run a pilot with a small group of early-adopter faculty members. Results would be shared through social media, a conference presentation, and an eventual article.

When the small groups finish their work, they report back to the reassembled whole. My role as facilitator at this point includes building up a model of what the groups had in common: institutional strategies, favored practices, knowledge sources. It is also important to note forces that groups identified other than the ones showcased in the scenario set – in other words, to recognize how participants were futuring beyond the immediate constraints of the exercise.

These two scenario/role-playing approaches have had certain advantages over other methods. Their social nature, especially in the second approach, supports discussion-based learning rather than the passive experience of one person at a time consuming a scenario presentation (think seminar rather than lecture). Indeed, there is a strand of constructivist pedagogy in my approach, as I try to nudge audiences to the point of constructing their own sense of meaning. The plurality of scenarios opens up thinking to more complexity than a single scenario does. The second approach’s structured exercise helps ground often wild futures discussion, bringing it back down to earth and to the institution.

However, as with every futuring method, the scenario/role-playing approach presents its own limitations. Since the events I have described here were, with one exception, face-to-face, they tend to be evanescent, even when presentation materials are archived. The energy of discussion ends when sessions conclude, trickling out at a reduced level via hallway conversations and social media among whatever proportion of participants actually uses Twitter, Facebook, etc. during an event. One event does not connect to another, which essentially removes the ability of participants to build on one another’s discussions. Moreover, the benefits of constructivist pedagogy are limited, since so much of session time is driven either by presentations or by processing small-group scenario narratives.

We can redress these temporal limitations by extending the framework over time. A group can pursue scenarios and role-playing in longer sessions, such as a full working day or multi-day retreat. An exercise could be distributed over time and space, using networked technologies to link participants in futuring play, so long as records of each event were accessible to players.
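
As a minimal sketch of what such shared records might look like (hypothetical; the article does not prescribe any particular tooling, and the file name and fields below are assumptions), a distributed exercise could append each group’s responses to a log that later sessions can read and extend:

```python
# Hypothetical sketch: a minimal, file-backed record of a distributed
# scenario exercise, so that later sessions can build on earlier ones.
import json
from datetime import date
from pathlib import Path

LOG = Path("futuring_exercise.json")

def load_log():
    """Read all previously recorded responses, or an empty list."""
    return json.loads(LOG.read_text()) if LOG.exists() else []

def record_response(scenario, group, answers):
    """Append one group's answers to the shared exercise log."""
    log = load_log()
    log.append({
        "date": date.today().isoformat(),
        "scenario": scenario,
        "group": group,
        "answers": answers,
    })
    LOG.write_text(json.dumps(log, indent=2))

record_response(
    scenario="GoodETexts e-book consortium",
    group="academic computing",
    answers={"engage?": "yes, via a small faculty pilot",
             "how learned": "from a librarian in our merged organization"},
)
print(len(load_log()), "responses recorded so far")
```

Any shared document or database would serve the same purpose; the point is simply that the record persists between sessions so that futuring play can accumulate.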

3: conclusions

What have these two game-based futuring efforts revealed to us about the future of higher education? Answering this question depends on what sense of “futuring” we’re using. If we mean forecasting events, then the prediction market game has had some limited success. One series of questions accurately anticipated a milestone for open access: the point at which the number of journals indexed by the Directory of Open Access Journals would reach a given threshold. It’s a fairly obscure number, but it keys into a major issue for scholarly publication.[17]

If we view futuring as improving our ability to think through and respond to future events, then the scenario/role-playing approaches seem to successfully spark discussion and thinking out loud, based on audience reactions. Moreover, within the event limitations noted above, role-playing work encourages audiences to share forecasting resources. This is, in a sense, a capacity-building strategy.

These are small-scale efforts so far: proofs of concept and a series of face-to-face events. One virtue of that small scale is that the efforts can be replicated relatively easily using Web services (such as Inklings) or various presentation formats (face-to-face or online). Perhaps academics will be moved by the collective intelligence concept and use social media to share resources and thoughts about the future of higher education. Distributed, collaborative futuring may become the next futures method; if so, it could help us better anticipate what comes next for higher education.


BIBLIOGRAPHY

Alexander, Bryan. “Apprehending the Future: Emerging Technologies, from Science Fiction to Campus Reality.” EDUCAUSE Review, May 28, 2009. http://www.educause.edu/EDUCAUSE+Review/EDUCAUSEReviewMagazineVolume44/ApprehendingtheFutureEmergingT/171774.

Armstrong, J. Scott, ed. Principles of Forecasting. Philadelphia: Springer, 2001.

Bogost, Ian, Simon Ferrari, and Bobby Schweizer. Newsgames: Journalism at Play. Cambridge: MIT Press, 2010.

Gee, James Paul. What Video Games Have to Teach Us about Learning and Literacy. New York: Palgrave Macmillan, 2003.

Thompson, Donald N. Oracles. Cambridge: Harvard Business Review Press, 2012.


Notes

[1] For example, The New School’s “The Future of Higher Education” events, December 8-9, 2011, and December 3, 2012, http://www.newschool.edu/cps/future-higher-ed/. Cf. this author’s liveblogging, indexed at http://blogs.nitle.org/2011/12/12/the-future-of-higher-education-a-conference-blogged/ and http://blogs.nitle.org/2012/12/03/the-future-of-higher-education-a-discussion-blogged/. Or see Swarthmore and Lafayette Colleges’ “The Future of the Liberal Arts College in America and Its Leadership Role in Education Around the World,” April 9-11, 2012, http://sites.lafayette.edu/liberal-arts-conference/.

[2] Many thanks are due commentators on the CommentPress edition of this site, notably Jason Rosenblum, Nancy Hays, and Mike Roy.

[3] Gee, James Paul. What Video Games Have to Teach Us about Learning and Literacy. New York: Palgrave Macmillan, 2003.

[4] The best work on gaming in news media is Ian Bogost, Simon Ferrari, and Bobby Schweizer, Newsgames: Journalism at Play. Cambridge: MIT Press, 2010.

[5] Donald N. Thompson, Oracles. Cambridge: Harvard Business Review Press, 2012.

[6] http://intrade.com/.

[7] http://tippie.uiowa.edu/iem/.

[8] http://markets.nitle.org.

[9] http://markets.nitle.org/markets/34192.

[10] http://markets.nitle.org/markets/42559.

[11] http://markets.nitle.org/markets/27567.

[12] http://thomsonreuters.com/products_services/science/science_products/a-z/social_sciences_citation_index/.

[13] J. Scott Armstrong, “Role Playing: A Method to Forecast Decisions.” In Armstrong, ed., Principles of Forecasting. Philadelphia: Springer, 2001.

[14] See my “Apprehending the Future: Emerging Technologies, from Science Fiction to Campus Reality” for an explanation and examples of those methods. EDUCAUSE Review, May 28, 2009. http://www.educause.edu/EDUCAUSE+Review/EDUCAUSEReviewMagazineVolume44/ApprehendingtheFutureEmergingT/171774.

[15] Materials: http://prezi.com/zrcjuf32ogbu/academia-in-2015-five-futures/.

[16] http://prezi.com/whxzhabagrec/four-futures-for-liberal-education-aacu-2012/.

[17] http://markets.nitle.org/markets/24462, http://markets.nitle.org/markets/37081.

 

 


This article is part of a special issue of Transformations on games in education, published on September 30, 2013. An earlier version was circulated for open peer review via Media Commons Press. The “Games in Education” issue was developed by Mike Roy (Middlebury College), guest editor and editorial board member of the Academic Commons, and Todd Bryant (Dickinson College), who was instrumental in organizing and developing the special issue.

 

Managed Cyber Services as a Cyberinfrastructure Strategy for Smaller Institutions of Higher Education

by Todd Kelley, NITLE

Technology and Relationships
Arden Bement, Director of the National Science Foundation, recently stated that “At the heart of the cyberinfrastructure vision is the development of virtual communities that support peer-to-peer collaboration and networks of research and education.”[1] Just as Bement emphasizes networked relationships as an essential component of cyberinfrastructure, I would like to address how small to mid-sized institutions might meet some of the critical challenges of this vision. I propose that in order to realize the cyberinfrastructure vision, colleges and universities reconsider how they approach technology and technology management, which have become just as important as constructing and maintaining the physical facilities on campus. Providing Internet access, for example, should be seen as a key infrastructure asset that needs to be managed. A robust connection to the Internet is necessary for a successful local cyberinfrastructure; however, it is by no means sufficient. The new cyberinfrastructure should include cyber services that enhance existing organizational relationships and make new ones possible–on a national and global basis. However, for some institutions, deploying and sustaining sophisticated organization-wide tools and infrastructure are complex and risky activities. Smaller institutions often simply cannot implement, sustain, and support these initiatives on their own.

Cyber Services
While college and university libraries were pioneers in using the Internet to provide access to scholarly resources, campuses have rarely used it to access enterprise technology tools. Instead, most campuses have tried to meet these needs by combining their own hardware infrastructure with (mostly) proprietary software systems licensed for the campus, such as Blackboard, ContentDM, and Banner. This approach to learning management, repository, and administrative services may have made sense at a time when the Internet was still in its early stages. It may still make sense for large institutions with sufficient scale and deep human resources, where the organizational benefits of locating all technology services on campus outweigh the costs.

However, smaller, teaching-centered colleges and universities need an attractive alternative to locating all hardware, software, and attendant technical support on campus, one that spares them the onus of locating and selecting application service providers and negotiating licenses and support agreements. They also need to avoid becoming trapped in contractual relationships with new vendors or Faustian bargains with technology giants such as Google or Microsoft. One option for these institutions is to obtain managed services from organizations such as NITLE, which provides a broad array of professional development and managed information technology services for small and mid-sized institutions. Institutions using such managed services report that they lower their technology risk and increase the value proposition for technology innovations.

Lowering Technological Risk Encourages Innovation
Typically, there is a high degree of risk of failure for smaller colleges and universities when they deploy a new technology system, because the technical resources and organizational processes required are simply not part of the primary focus of these organizations. The risk might be mitigated by devoting significant technological resources and organizational focus to altering the infrastructure, in the hope that institutional culture and processes will adjust to it. But this does not appear to be a wise approach.

When smaller colleges and universities need organizational technology they often:
1) work to identify the most appropriate vendor and negotiate to obtain the technology they need;
2) focus on how the technology works and on how the technical support for it will be provided; and
3) create organizational processes and procedures that attempt to connect the technical work to the perceived need and the promised benefits.

The focus in this process is often on the technical or procedural aspects of a project, when the institution would be far better served if the emphasis were on the substantive innovations, relationships, and other benefits that technology can provide. Relationships that are about technical issues per se are off-focus, distracting, and ultimately unproductive relative to organizational mission.

The continuing development of more sophisticated and complex technologies and the increased dependence on them by these institutions will only increase the potential risk of failure for those that do not make a significant commitment to hiring technology specialists. Increased risk thwarts any interest in using technology to innovate, so technology becomes much less interesting and viable as a route to organizational strength and sustainability. The challenge for smaller colleges, then, is to have dependable, secure and innovative cyber services while reducing the risks and resources traditionally associated with creating new technology systems on campus.

Managed Cyber Services
What do managed cyber services look like, and how do they work? In the case of NITLE, the organization aggregates the cyber services needs of smaller colleges and universities and provides managed services via the Internet, so that each individual institution does not have to replicate the hardware, software, and technical support on campus for each enterprise application it needs. NITLE does the legwork, finding reliable and cost-effective hosting solutions and negotiating agreements with application service providers for services and support. Open source applications are used wherever viable. Individual campuses do not have to become involved with these processes, as the goal is to provide an easy on-ramp without legal or contractual agreements for participating campuses. There are also opportunities to test and experiment with services before participants commit to a new service. In addition, NITLE provides professional development opportunities for campus constituents to learn about the functionality and features of the software in the context of campus needs. Moreover, it encourages campus representatives to participate in the communities of practice that it supports.

NITLE currently offers four managed cyber services. The criteria used for selecting cyber services include: participants’ needs; the technology benefits; the development path for the technology (including reliability, scalability, and security); and the expectation and understanding that when adopted by peer institutions, the technology will support the learning communities on campus and peer-to-peer collaboration among campuses.

Advantages of Open Source
Colleges are advised to consider open source software (OSS) whenever possible, because OSS offers distinct advantages. The first is cost savings: there are no annual licensing fees, and many OSS applications require less hardware overhead, helping to contain hardware costs. The second is the support OSS can provide for collaboration: a common infrastructure, readily accessible to all, can enable institutions to work together more effectively and to focus jointly on the substantive activities that technology supports.

As a case in point, NITLE provides a repository service using the open source DSpace repository software. The twenty-five colleges and universities that participate in the repository program share their experience and expertise about how the software helps them meet their individual and common goals. Their stated goals include:
1) creating a centralized information repository for information scattered in various difficult-to-find locations;
2) moving archival material into digital formats and making it accessible from one easy-to-access location;
3) bringing more outside attention to the work of students and scholars and thus to the campus;
4) providing the service as a catalyst to help faculty and students begin to learn about and use new forms of publishing and scholarly communication, including intellectual property, open access and publishing rights;
5) preserving digital information.

According to one representative of a participating organization,

“the open-source approach is definitely helpful in terms of cost. Having [a dependable vendor] administer the hardware and software has been wonderful, since we can concentrate on the applications and not worry about the technical end….Having colleagues from similar schools work on this project has been beneficial, since we can play off of their strengths and weaknesses. They have also given us some good ideas for projects.”*

A representative of another participating organization added,

“The open source nature of the software is important to us because we know that we are not locked into a closed proprietary system whose existence depends upon the livelihood of a software company. Furthermore, we wouldn’t have gotten started with d-Space on our own because of the infrastructure we’d have to provide to get it going. We don’t have the staff with the skills needed to handle the care and feeding of the server or to customize the software to our needs through application development. Having that part out of the way has given us the opportunity to focus on creating the institutional repository rather than being mired in technical detail of running the software.”*

Open technologies are more than a path to cost savings. They are a critical condition for innovation, access, and interoperability. Many colleges already use OSS for critical operations, including email (Sendmail), web serving (Apache), and operating systems (Linux). This use suggests a growing acceptance and adoption of OSS, which can leverage economies of scale, support network effects, and dramatically increase the speed of innovation.

There is, however, still resistance to making consideration of OSS the de facto approach to meeting organizational software needs. There are several reasons for this opposition, including the view of OSS as hacking, the historical lack of accessible technical support, and the paucity of documentation, which has complicated the learning curve. Many have long recognized the potential of OSS but were reluctant to pursue it because of the increased need for specialized technical support on campus: for every OSS system, an institution would need to find and hire a technical specialist to support it. That approach is not scalable, and smaller institutions were right to avoid it.

Multipoint Interactive Videoconferencing (MIV)

Another example of a cyber service that institutions should consider is Multipoint Interactive Videoconferencing (MIV). MIV systems enable participants to communicate visually and aurally in real time through the use of portable, high-resolution (and inexpensive) cameras and microphones attached to their computers. Participants can see and hear each other not only on a one-to-one basis but one-to-many as well. MIV is not a completely new technology; however, its functional maturity, the reduction in the cost of providing it, and the need for such systems have made it a technology on the verge of widespread adoption and use in a variety of settings.

In the winter and spring of 2007, a dozen participating colleges agreed to evaluate the use of MIV on their campuses and provide NITLE with feedback on the application and their perceptions of its utility. During this evaluation period, participating institutions discovered many needs for this technology, for both on-campus and off-campus communications. Uses included guest lectures, meetings with faculty working remotely, and connections with students studying abroad. Since this assessment, NITLE has used MIV for:

1) facilitated conversations led by one or two practitioners among a group of practitioners in an area of common interest, such as incubating academic technology projects or the application of learning theory to the work of academic technologists;
2) presentations by individuals who are using technologies of interest in their classrooms or other campus work to groups of others interested in whether and how they might do something similar, such as historians using GIS;
3) presentations by experts on topics of interest to others in their professional field, such as the academic value of maps;
4) technology training for the participants and users of the cyber services that NITLE offers.

The experience of MIV service participants suggests that the adoption of MIV may be most successful when placed in the context of next steps, developing relationships, individual experience and expertise, and common goals and objectives. This premise suggests learning and collaborative environments that include MIV as part of a range of learning and communications options. The pilot study documented many positive benefits experienced by participants. However, these benefits are a fraction of what can be realized when many more institutions participate, because of network effects and because participants may use the MIV service to collaborate with other organizations beyond the opportunities organized by NITLE.

The “Open” Movement
The promise of information technology cannot be met when only large, powerful, for-profit IT organizations are in control. Open access, open courseware, and open source initiatives point toward a world with a level playing field for individual learning and organizational innovation by not-for-profit institutions. Where just a few years ago it was difficult to name more than a few organizations that provided technical support for open source applications, the number of service providers is growing. Identifying these providers, selecting the best ones, and negotiating agreements–these are the important challenges for managers of cyber services. Providers report that it is often financially unfeasible for them to market to and negotiate with individual institutions to provide cyber services. Creating a reliable and scalable approach to cyber services that works for colleges and providers alike would be an important advance for smaller institutions, both individually and as an important segment of higher education.

The open movement is not about software tools alone, as Arden Bement noted in his comments about the importance of virtual communities. Success depends upon achieving a balance among essential human, organizational, and technological components. The potential benefits of the open movement will accrue to colleges and universities that collaborate through a common set of tools, actively participate in peer information networks, and make mission-focused knowledge and skills a priority. The value of peer-to-peer communities for individual institutions will increase in proportion to their investment in all three of these components. The question may ultimately center on how to support these activities in a systematic and sustainable fashion. This is where small and mid-sized institutions may want to innovate in their approach to technology management.

Collaborative Relationships Foster Organizational Strength and Learning
Technology that supports widespread virtual collaboration among smaller colleges and universities, such as the repository and MIV services described above, demonstrates the potential power of cyber services to enhance organizational innovation, learning, and productivity. These peer communities of practice allow campuses to: 1) exchange information about usage, technical issues, and support; 2) learn from one another; and 3) synchronize their efforts to use technology to promote shared goals and processes. Having campuses work together and share knowledge as they engage with enterprise systems is a crucial part of the equation. The community of smaller colleges and universities needs a robust organization for that collaboration to happen. Organizations such as NITLE can help fill this need, while also providing opportunities for community participation and encouraging institutions to play lead roles in needs identification, service development, and training and education. As one participant has stated, participation in a managed cyber service is “an opportunity for a group of us to make a leap forward and learn from each other along the way. In addition, [our participating college] saw it as an opportunity to overcome our geographic isolation…I think we have the potential to achieve something tremendous that we will all be proud of.”*
Summary
Technology seems to be much more compelling to smaller colleges and universities–and more cost-effective as well–when it provides substantive benefits while the procedural and instrumental aspects of technology innovation are kept under control. This is not to say that technical expertise at smaller institutions is not necessary or that all cyberinfrastructures should be moved off campus. These extreme changes would be neither productive nor prudent. By working collectively, smaller colleges can use managed services to more effectively apply advanced technologies. Bringing institutions with common needs together in a shared organizational network and aggregating many of their common technology needs through cyber services seems to be a powerful idea. Participating campuses can then provide the scope and scale of programs and services that larger institutions provide while retaining their intimacy and sense of community, and also controlling costs. At the same time, a strong foundation is created both technologically and organizationally for the type of cross-institutional endeavors and learning communities that can help smaller institutions promote scholarship that is vital and attractive to students and faculty alike. When common goals are met in cost effective ways, mission is strengthened for all.

[1] “Shaping the Cyberinfrastructure Revolution: Designing Cyberinfrastructure for Collaboration and Innovation,” First Monday, volume 12, number 6 (June 2007), http://firstmonday.org/issues/issue12_6/bement/index.html. Accessed September 26, 2007.

* Responses to a survey administered by the author to a subset of NITLE participating organizations during July of 2007.

What’s So “Liberal” About Higher Ed?

by Jo Ellen Parker, NITLE

 

Are new digital technologies compatible with the aims and traditions of “liberal education”? Or do instructional technologies pose an inexorable threat to higher education understood as anything more than vocational training?

The answers to these much debated questions are yes and yes; it all depends on how the aims and traditions of “liberal education” are understood. My observation, admittedly as a practitioner rather than a researcher, is that there is no consensus in the higher education community about what liberal education actually is; rather, the term invokes a range of sometimes-conflicting academic practices and values. Specific instructional technologies support some of these practices and values and challenge others. Both “liberal education” and “instructional technology” are terms that point to a wide array of different things. In discussing their relationship it is therefore necessary to unpack our assumptions about liberal education and to specify which instructional technologies are at issue.

Some hypothetical, but familiar, cases might offer a useful starting point:

College A is trying to decide whether to create a learning commons in its library integrating the help desk and reference functions. Even though projections from the business office suggest that this move would save money, the librarian, the IT leader, and the faculty are all rather passionately opposed to the idea. Their (much more expensive) priority is to add smart classrooms in other buildings. Meanwhile, a mile down the road, College B has a merged organization with a librarian at its head and combined its help and reference services years ago, largely in response to demands for better research support for both faculty and students, but it has yet to install wireless access in its student union and outdoor gathering spaces.

College C spends more and more every year on subscriptions to electronic journals and databases but has not yet implemented a course management system because the faculty technology committee doesn’t see why so much should be spent to just “put our syllabi on the web.” College D, whose campus abuts College C’s, spent its first discretionary IT dollar on a course management system, immediately requiring its deployment in all courses and creating modules to make it an environment that student organizations can use — even though it has not yet been able to increase its budget for digital subscriptions for several years now.

Colleges E and F, meanwhile, have both decided that an Internet2 connection is a high budgetary priority. E’s reason is that a handful of leading faculty members have research agendas that require the transfer of enormous data sets. At F, the decision was driven not by the faculty but by the administration, which is concerned that if the campus isn’t on I2 it will be less attractive to strong prospective students. Asking anyone at College F what they will do with an I2 connection once they have it gets a blank look in return.

And then there is College G, which has made all its course materials open to the secondary schools and community colleges in its region by putting them all on the open web with what some might see as a rather casual attitude toward intellectual property. College G equips students who are going off campus for their required internships with digital cameras and PDAs for data capture, even though it can’t afford to create the GIS lab several science faculty have requested.

And in each of these cases, when these IT decisions are explained to the community, they are justified as “consistent with our college’s core commitment to liberal education.”

One could conclude that there is little logic to the decisions campuses make when it comes to IT strategy. But the issue may actually be that there are multiple competing logics, all bundled together as “liberal education.”

“Liberal education” is a little like “freedom” or “excellence” – a term invoked to convey a sense of undisputed good while encompassing a wide range of contested meanings. Academic institutions aspiring to offer anything distinct from vocational training justify important curricular and resource decisions with reference to it. (Of course, the value of non-vocational higher education itself is not universally assumed by either families or policy-makers; the high value on liberal education within the academic community is not currently shared by American society at large.) However, the claims and aspirations of colleges and universities reflect various theories of “liberal education,” some incompatible and some complementary.

These competing understandings of liberal education are not discrete schools of thought so much as interwoven threads in institutional discussions: colleges end up looking different from one another in part because they weave the threads together in different proportions and patterns at different moments in their history. Tracing the threads can be a useful way of framing the values and goals that shape specific strategic decisions about the adoption and deployment of digital technologies. Further, understanding how their institutions think and talk about liberal education can help IT leaders frame important issues in terms of educational values and purposes, making them more influential advocates by creating a sense of shared mission with their faculty and administrative colleagues.

The most venerable thread in the tapestry of liberal education is the curriculum-focused definition of “liberal education” as the study of the liberal arts and sciences – that is, as study liberated from the pressure of immediate circumstance and pursued by people free to explore the liberal arts disciplines without regard for immediate application or benefit. It is the commitment to learning for learning’s sake. The idea here is that liberal education emphasizes “pure” rather than applied disciplines and requires familiarity with the major areas of intellectual achievement in the Western tradition. By this standard, business, education, nursing, performance, and other applied studies are not seen as properly part of a liberal education. This is the logic that has some colleges giving credit for music theory and history but not for music performance, for economics but not for business or accounting, for developmental psychology but not for counseling, and so on. Further, in this view liberal education is above all else an academic pursuit. Colleges in which this tradition is strong are often leery about giving credit for non-academic work, so that internships, community service, and experiential learning are not highly valued.

This definition has been on the decline for several years now and relatively few institutions remain “pure” liberal arts colleges from this point of view, but it still echoes loudly through discussions of curriculum, requirements, and mission. Just the other day, for example, I was seated at dinner next to someone from a college that doesn’t give credit for the study of introductory language – on the grounds that language acquisition is not itself a liberal study but simply a tool which enables the liberal studies of literature, history, philosophy, and so on. A college where language is taught specifically to enable literary analysis but just as specifically not to enable tourism or business dealings is, for example, acting on this logic of liberal education.

A second, and increasingly influential, logic defines liberal education as operating from a pedagogical methodology that emphasizes active learning, faculty/student collaboration, independent inquiry, and critical thinking. This view is more pedagogical than curricular and emphasizes the development of intellectual skills and capacities over the study of any specific materials or content areas. To return to the example of language, in this approach the justification for teaching language is to develop the capacity to understand how languages work, to problematize the assumptions inherent in the native language, and to master new syntactic and lexical structures – goals that can be accomplished equally well in the study of any language without regard to the literary or historical inquiries that might follow.

The defining characteristics of liberal education in this logic are not disciplines but practices — practices like group study, undergraduate research, faculty mentoring, student presentations, and other forms of active learning. From this point of view, a discipline like nursing or education, for example, can be taught either liberally or illiberally, whereas in the first view nursing would never be seen as a liberal study. If nursing students are engaged in active learning with peer and faculty colleagues, doing direct research on important current issues in their field, encouraged to question dominant assumptions and procedures, and expected to solve complex problems independently, they are seen as being liberally educated. On the other hand, nursing students who are attending lectures, assigned material to learn by rote, rewarded for mastery of “correct” answers, and drilled in unvarying standard procedures are not. Liberally educated nurses are in this view learning to exercise judgment, understand the reasoning behind protocols and standards, and to be lifelong learners, while nurses who are illiberally educated are seen as being trained to be proficient technicians.

This view of liberal education is strongly influenced by social-constructionist theories of knowledge, research in learning theory, and a high value placed on the questioning of authority. Colleges that emphasize small classes over large ones, seminars over lectures, student research, faculty mentoring, peer study groups, and similar educational practices, while including applied studies in the curriculum, tend to be acting on this logic.

These two views reflect the complementary but tense relationship between scholarship and teaching in the reward structure for faculty. Most colleges and universities are committed to both views of liberal education, just as they are committed to both scholarship and teaching. The ideal on many campuses is to teach a liberal arts and sciences curriculum (as in the first definition) using student-centered pedagogies (as valued in the second). Just as with scholarship and teaching, however, while it is easy to agree that both the curricular and pedagogical understandings of liberal education are valuable, negotiating their competing claims presents real and specific choice points in setting institutional priorities. Colleges C and D took very different paths when investing in IT, for example: C chose the discipline- and content-focused priority of subscriptions and databases, while D chose the student-centered and pedagogical priority of a course management system. These choices suggest that C acted more on the first view of liberal education and D more on the second.

A third notion of liberal education, related to the second but distinct from it, holds that the defining characteristic of liberal education is preparation for democratic citizenship and civic engagement. The AAC&U, for example, has in recent years emerged as a strong advocate for this understanding. In terms of curriculum, this approach tends to value the development of skills specifically believed to be central to effective citizenship — literacy, numeracy, sometimes public speaking, scientific and statistical literacy, familiarity with social and political science, and critical thinking. It tends to value curricular engagement with current social and political issues alongside the extracurricular development of ethical reflection and socially responsible character traits in students, seeing student life as an educational sphere in its own right in which leadership, rhetorical, and community-building skills can be practiced. Where this view is influential, you will find things like community-service requirements or credit-bearing service-learning projects, a high level of intentionality about the paracurriculum offered by student government and residential life, a tendency to focus course modules and assignments on recent or local cases, a sense of shared mission between faculty and student life staff, and a strong concern with extending access to higher education. (For many colleges, the framing of liberal education as preparation for service and citizenship dovetails with values derived from their founding religious traditions.) Campus G, providing open access to its materials on line and equipping students for their mandatory community service projects even when there are unmet needs on campus, is investing in this view.

Finally, a fourth view associates liberal education with a specific institutional type — the small, residential, privately governed, bachelor’s granting college. From this point of view the sum of the experiences such institutions provide is “liberal education.” Identifying liberal education with liberal arts colleges tends to emphasize structural characteristics and institutional settings as essential to liberal education and leads to skepticism that institutions with other characteristics can provide a truly liberal education. Do residential community, small size, and undergraduate focus in fact create conditions in which a distinctive educational experience can be crafted? Certainly there has been acknowledgement of the educational value of these institutional characteristics as an increasing number of large institutions have created units imitating the small, residential, living-learning community typical of the small college, often as honors colleges. And historically it is institutions of this type which have nurtured and attempted to combine all the educational priorities I have mentioned above. But even these small colleges, when attempting to do it all, face strategic choices and have to prioritize what to do when.

To the extent that liberal education is seen as the product of an institutional type, keeping the small colleges alive and vital is essential to its preservation. Technology, from this view, is valued in so far as it supports the survival of this sector of the higher education industry. The president of College F, who feels his institution must have I2 connectivity to remain viable in the marketplace even though he isn’t quite sure what it’s good for, is thinking this way.

There are no doubt other factors interwoven among those I have mentioned. But in general, this broad typology describes the main threads of the current discussion of liberal education: the curricular, the pedagogical, the civic, and the institutional – threads which are woven together on every campus but in different proportions on each. What, though, has all this to do with technology?

Let’s return to the first, curricular, understanding. When a college or its faculty is strongly influenced by this view, it is likely to regard technology as valuable primarily as an extension of the library offering new access points for scholarly resources. These are the people who are most excited about technology’s potential to allow them to view incunabula on line, access massive scientific datasets, or share documents with a remote specialist in their subfield. Institutions influenced by this view are likely to see digital scholarly resources as a priority area for investment, to assume that faculty research priorities should drive many IT decisions on campus, and to see the library as central to planning for information technology and services. These may be institutions that will prioritize digital subscriptions and put a librarian over the information organization – but not really see much point in spending a great deal on a course management system or creating collaborative student work clusters. When College E connects to I2, even though only a handful of its faculty will actually use it regularly, it is acting on these values, as is College C every time it prioritizes subscriptions over course management in the budget process.

What are the resistances to instructional technology that are likely to follow from this view? First, there is often a concern about ascertaining the quality and authority of materials located on line. This view worries that students, exploring cyberspace without the guidance of faculty members or librarians, will be misled about the value of what they find or will not be able to distinguish authoritative sources from irresponsible ones. Calls for “information literacy” programs therefore often come from this angle. There are also faculty concerns that technology offers distractions, erodes students’ ability to “read” and “reflect,” and values the quick and thoughtless over the deliberate and well-informed. In this view technology is valued for expanding the content of study but not for its potential to change the method or nature of study. In this model, the IT organization on campus is often most valued for supporting a powerful network with little or no downtime and easy access points and interfaces for accessing digital materials, but it may not be especially engaged in instructional partnerships with faculty or with maintaining student learning spaces, for example. Typically, in institutions where this view is influential, the IT department is seen as serving the library and faculty.

The second, pedagogical, point of view is much more invested in what technology allows teachers and students to DO than in what it gives them access to. These folks are excited about the way technology can transform study, about new ways of thinking and perceiving that might arise from digital interactions and resources. To the extent that this approach is influential, institutions tend to emphasize a student-centered vision of IT and to prioritize spending and support for communications tools, classroom presentation tools, course and learning management systems, and the like. The hypothetical faculty at Colleges A and D who were advocating for more technology in the classroom and for more robust learning management systems are probably influenced by this view. In this model such tools are valued for their ability to encourage communication outside class, facilitate group study, and allow students to author multimedia assignments. Colleges where this approach is strong might therefore also prioritize upgrading multimedia centers or teaching and learning centers, for example, or might approach the design of networks and spaces by thinking about how collaborative groups as well as individual users will use them. From this perspective, the IT department can be seen as offering important professional development to the faculty, as creating important learning opportunities for students, and as a sought-after partner with the faculty in instructional design.

As for the negative side of this coin, resistance can arise when a commitment to digital pedagogy creates a sense of strain in faculty roles. The need for faculty to master new tools and develop the pedagogical skills to use them effectively leads to the perception of IT as an additional, onerous, and sometimes resented job expectation. Faculty and deans complain that there isn’t enough time for faculty to keep up with technology. Those faculty members who do engage in creative digital teaching may wonder if their efforts will be rewarded by tenure and promotion committees. Facing new demands to develop faculty skills and partner with faculty innovators, the IT staff itself feels its own pressures of time and staffing. And when the faculty/IT relationship is strong and focused on classroom pedagogy, it can be difficult to see what the appropriate role of the library can or should be, leading to tensions between the library and IT departments.

To the extent that the third, civic, approach is present, campuses may be likely to emphasize the ways technology can help them extend beyond their own borders and engage with non-academic materials and activities. These campuses may develop digital projects in partnership with the local secondary schools or public libraries. These are the faculty who are excited about the way technology allows their students to mentor local high school students by offering 24/7 homework assistance or to document their experiences during a community service project. These educational values might lead, for example, to e-portfolio requirements integrating academic and extracurricular learning or to investment in videoconferencing technologies to support the integration of on- and off-campus learning. These institutions might be more interested in making campus collections and course materials available to community partners, like our hypothetical College G, or in using technologies to support extracurricular activities than in purchasing highly specialized database subscriptions or equipping smart classrooms.

With its strong emphasis on community and ethical relationships, this is the position from which concerns about the impact of technology on the campus community and on relationships among and between students and faculty can give rise to resistance. I sit, as it happens, on the editorial board of a journal. At a recent meeting of this group we had a lively discussion about a possible future issue. The discussion ping-ponged back and forth between excitement about expanding access to previously excluded students through technology and concern that these same technologies threaten to erode real, carbon-based interactions. Both the excitement and the resistance were born of a commitment to liberal education as preparation for civic and community life.

For those who understand liberal education as essentially identified with one institutional type, much of the value of IT is in making sure that small colleges remain competitive with larger institutions able to offer a more extensive range of opportunities to students and faculty. Many small colleges and faculty cherish the hope that IT will help them to offer the virtues of small and the benefits of big, leading to optimistic ideas about the ability of IT to help small institutions do more with less and save money to boot. Technology investments, however, typically demand a scale these colleges cannot muster, leading to continuing and especially difficult assessments of technology costs on small campuses. Will an expensive application be bought for the one faculty member who is likely to use it? When an IT staff has only three positions, how many optional applications can it actually support? Collaboration is an obvious strategy for small colleges to achieve some scale and lower some costs, but it is a difficult one for this point of view to accept: since the primary goal is institutional survival, and since collaboration can appear to threaten institutional distinctiveness, it can strike campus leaders as counterproductive. Further pulling against the need to control costs is a strong awareness of the need to keep up with the Joneses, leading to resentment and a sense of coercion on the part of decision makers.

All this is not to suggest anything more complex than that the discussion of technology and liberal education is entwined with debates about broader educational priorities and values. When institutions are facing decisions about where to put their IT dollars, they are often indirectly struggling over what their academic and educational values and priorities are. And this struggle can be particularly difficult for institutions committed to “liberal education” because of the multiplicity of competing goals and agendas subsumed within that term, particularly when resources are limited and difficult choices must be made.

Faculty and administrators who express concern about the impact of technology on liberal education are sometimes dismissed by technologists and CIOs as simply resisting change or failing in imagination. However, campus resistance to new technologies is often a matter of defending important educational and professional commitments against perceived threats. IT leaders, for their part, do well to connect specific IT challenges and issues explicitly to the educational values and practices characteristic of their institutions and campus clients. IT leaders have a tremendous opportunity to demonstrate to their colleagues that technology can indeed serve many of the goals of liberal education. They also serve their institutions best by framing technology choices in terms of the various and competing goals of liberal education and promoting discussion of which should be central to institutional strategy and why.

Blended Learning at Small Liberal Arts Colleges

This report was submitted by NITLE to the Associated Colleges of the South (ACS) on December 5, 2011. The ACS now makes it available as context for its January 2014 call for proposals for case studies in blended/hybrid learning (deadline for submissions: February 21, 2014). This report was developed by Rebecca Frost Davis, then program officer for the humanities at NITLE. Dr. Davis is currently the director for instructional and emerging technology at St. Edward’s University.

Historically, one of the strengths of liberal arts colleges—their small size—has also been one of their weaknesses: they are limited in the number of classes they can offer, and courses with small enrollments may not have the critical mass to justify the expense of offering them. Despite these challenges, however, small colleges can expand their course offerings while retaining their “high-touch,” personal approach to education through shared academics: academic experiences that transcend the borders of a single campus by connecting students, faculty, and staff in pursuit of common academic goals. By partnering with other institutions and leveraging technologies such as high-definition videoconferencing and collaborative software, colleges can connect students to learning experiences beyond their local contexts and faculty to larger educational communities. Furthermore, by strategically pooling resources, small colleges can collectively develop a shared academic program with the depth and breadth that today’s students need.

