by Janet Murray, Georgia Tech
Originally Posted December 16th, 2007
Professor Janet H. Murray is an internationally recognized interactive designer, the director of Georgia Tech’s Master’s Degree Program in Information Design and Technology and its Ph.D. in Digital Media, and a member of Georgia Tech’s interdisciplinary GVU Center. She is the author of Hamlet on the Holodeck: The Future of Narrative in Cyberspace (Free Press, 1997; MIT Press, 1998), which has been translated into five languages and is widely used as a roadmap to the coming broadband art, information, and entertainment environments. She is currently working on a textbook for MIT Press, Inventing the Medium: A Principled Approach to Interactive Design, and on a digital edition of the Warner Brothers classic Casablanca, funded by the NEH and in collaboration with the American Film Institute. In addition, she directs an eTV Prototyping Group, which has worked on interactive television applications for PBS, ABC, and other networks, and she is a member of Georgia Tech’s Experimental Game Lab. Murray has played an active role in the development of two new degree programs at Georgia Tech, both of which were launched in Fall 2004: the Ph.D. in Digital Media and the B.S. in Computational Media. In spring 2000 Janet Murray was named a Trustee of the American Film Institute, where she has also served as a mentor in the Enhanced TV Workshop, a program of the AFI Digital Content Lab. She holds a Ph.D. in English from Harvard University, and before coming to Georgia Tech in 1999 she taught humanities and led advanced interactive design projects at MIT. Murray’s primary fields of interest are digital media curricula, interactive narrative, story/games, interactive television, and large-scale multimedia information spaces. Her projects have been funded by IBM, Apple Computer, the Annenberg-CPB Project, the Andrew W. Mellon Foundation, and the National Endowment for the Humanities.
Information infrastructure is a network of cultural artifacts and practices. A database is not merely a technical construct; it represents a set of values and it also shapes what we see and how we see it. Every time we name something and itemize its attributes, we make some things visible and others invisible. We sometimes think of infrastructure, like computer networks, as outside of culture. But pathways, whether made of stone, optical fiber or radio waves, are built because of cultural connections. How they are built reflects the traditions and values as well as the technical skills of their creators. Infrastructure in turn shapes culture. Making some information hard to obtain creates a need for an expert class. Counting or not counting something changes the way it can be used. Increasingly it is the digital infrastructure that shapes our access to information and we are just beginning to understand how the pathways and containers and practices we build in cyberspace shape knowledge itself.
by Jennifer Curran
Originally Posted December 16th, 2007
Introduction: David Green
Principal, Knowledge Culture
Three art historians were invited to think about how their discipline, and their teaching and research within that discipline, might evolve with access to a rich cyberinfrastructure.
Participants were encouraged to think through what might happen to their practice of art history if:
–they had easy access to high-quality, copyright-cleared material in all media;
–they could share research and teaching with whomever they wanted;
–they had unrestricted access to instructional technologists who could assist with technical problems, inspire with teaching ideas and suggest resources they might not otherwise have known about.
What would they do with this freedom and largesse? What kinds of new levels of research would be possible (either solo or in collaborative teams); what new kinds of questions might they be able to answer; how would they most want to distribute the results of their scholarship; who would the audience be; and would there be a new dynamic relationship with students in and out of the classroom?
Panelist 1: Guy Hedreen, Professor of Art History, Williams College
On The Next Generation of Digital Images Available to Art Historians
Panelist 2: Dana Leibsohn, Associate Professor of Art, Smith College
On the Technologies of Art History
Panelist 3: Amelia Carr, Associate Professor of Art History, Allegheny College
Overcoming the Practice of Visual Scarcity
by David Green, Principal, Knowledge Culture
Originally Published December 16th, 2007
Ken Hamma is a digital pioneer in the global museum community. A classics scholar, Hamma joined the Getty Trust in 1987 as Associate Curator of Antiquities for the Getty Museum. He has since had a number of roles there, including Assistant Director for Collections Information at the Getty Museum, Senior Advisor to the President for Information Policy and his current position, Executive Director for Digital Policy and Initiatives at the Getty Trust.
David Green: Ken, you are in a good position to describe the evolution of digital initiatives at the Getty Trust as you’ve moved through its structure. How have digital initiatives been defined at the Getty and how are they faring at the institutional level as a whole, as the stakes and benefits of full involvement appear to be getting higher?
Ken Hamma: Being or becoming digital, as shorthand for the thousands of changes institutions like this go through as they adopt new information and communication technologies, has long been discussed at the Getty from the point of view of the technology. And it did once seem that applying technology was merely doing the same things with different tools when, in fact, we were starting to embark upon completely new opportunities. It also once seemed that the technology would be the most expensive part. Now we’ve learned it’s not. It’s content, development and maintenance, staff training, and change management that are the expensive bits.
by John Weber, Skidmore College
John Weber is the Dayton Director of the Frances Young Tang Teaching Museum and Art Gallery at Skidmore College, an interdisciplinary museum opened in 2000 to create links between contemporary art and other disciplines as part of the teaching effort at Skidmore. As director of the museum, he supervises the Tang’s staff and oversees exhibitions, programs, collections, and the Tang website, as well as curating and writing for museum publications. Weber is also a member of the Skidmore faculty and teaches in the art history program. Before coming to Skidmore in 2004, he was the curator of education and public programs at the San Francisco Museum of Modern Art from 1993 to 2004, where he spearheaded the design of the Koret Education Center and founded the museum’s interactive educational technologies program. From 1987 to 1993 Weber served as curator of contemporary art at the Portland Art Museum in Oregon.
Originally Published December 16th, 2007
To begin, let’s take it as a given that the “cyberinfrastructure” we are writing about in this edition of Academic Commons is both paradigmatically in place, and yet in some respects technologically immature. The internet and the intertwined web of related technologies that support wired and wireless communication and data storage have already altered our ways of dealing with all manner of textual and audiovisual experience, data, modes of communication, and information searching and retrieval. Higher education is responding, but at a glacial pace, particularly in examining new notions of publishing beyond those which have existed since the printed page. Technologies such as streaming and wireless video remain crude, but digital projectors that handle still image data and video are advancing rapidly, and the gap between still and video cameras continues to close. Soon I suspect there will simply be cameras that shoot in whatever mode one chooses (rather than “camcorders” and “digital cameras”), available in a variety of consumer and professional versions and price points. Already, high definition projectors and HD video are a reality, but they have yet to permeate the market. They will soon, with a jump in image quality that will astonish viewers used to current recording and projection quality.
by David Green, Knowledge Culture
Originally Published December 16th, 2007
James O’Donnell, Provost of Georgetown University, is a distinguished classics scholar (most recently author of Augustine: A New Biography), who has contributed immensely to critical thinking about the application of new technologies to the academic realm. In 1990, while teaching at Bryn Mawr College, he co-founded the Bryn Mawr Classical Review, one of the earliest online scholarly journals, and while serving as Professor of Classical Studies at the University of Pennsylvania, he was appointed Penn’s Vice Provost for Information Systems and Computing. In 2000 he chaired a National Academies committee reviewing information technology strategy at the Library of Congress, resulting in the influential report, LC21: A Digital Strategy for the Library of Congress. One of his most influential books, Avatars of the Word (Harvard, 1998), compares the impact of the digital revolution to other comparable paradigmatic communications shifts throughout history.
David Green: We’re looking here at the kinds of organizational design and local institutional evolution that will need to happen for liberal arts (and other higher-education) institutions to take advantage of a fully-deployed international cyberinfrastructure. How might access to massive distributed databases and to huge computational and human resources shift the culture, practice and structure of these (often ancient) institutions? How will humanities departments be affected–willingly or unwillingly? Will they lead the way or will they need to be coaxed forward?
James O’Donnell: I think the issue you’re asking about here boils down to the question, “What problem are we really trying to solve?” And I think I see the paradox. The NSF Cyberinfrastructure Report, addressed to the scientific community, could assume a relatively stable community of people whose needs are developing in relatively coherent ways. If wise heads get together and track the development of those needs and their solutions, you can imagine it would then just be an ordinary public policy question: what things do you need, how do you make selections, how do you prioritize, what do you do next? NSF has been in this business for several decades. But when you come to the humanities (and full credit to Dan Atkins, chair of the committee that issued the report, for saying “and let’s not leave the other guys behind”) and you ask “what do these people need?” you come around to the question (that I take it to be the question you are asking of us) “Are we sure these people know they need what they do need?”
by Joseph Ugoretz, Director of Technology and Learning, Macaulay Honors College–CUNY
Originally Published June 9th, 2006
The “future history of the media,” EPIC, presents a fictionalized retrospective, from the year 2014, of the history of media, news, and information. “In the year 2014,” the “Museum of Media History” tells us, “people have access to a breadth and depth of information unimaginable in an earlier age. Everyone contributes in some way. Everyone participates to create a living, breathing mediascape.” While we have not reached the point predicted there, and only time will tell if we’re going to be there by 2014, there have, of late, been some significant steps in that direction. One of these steps has been the development of a constellation of online tools that can be (at least loosely) tied together in the broad category of social software.
Social software includes many communication media, but the new tools which are the subject of this essay all fit three broad descriptions. These tools are interactive, with the content created and structured by a wide mass of contributors. These tools are also interconnected, with user-provided searchable links structuring and cross-referencing that content. And finally, these tools are bottom-up and communitarian, with the users of the tools providing and benefitting from associations, reputations, and authority within a many-to-many community. The various tools of social software are an increasing presence in the online world, as well as the offline lives of their users. Four brief vignettes demonstrate this.
by Meg E. Stewart
Originally Published September 25th, 2006
By now most in academia know of GIS, especially those reading an online journal discussing digital technologies in the liberal arts. GIS, or geographic information systems, is mapping on computers. GIS is the visualization of geographic data, whether a single layer showing demographics from the most recent census or many layers that together describe the surface of the earth (such as soil, topography, or infrastructure), what lies below it (the geology), and what lies above it (air quality or temperature, for example). GIS is also used for analyzing geospatial relationships: one can overlay those many layers and make spatial analyses across and between the variables.
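The layer-overlay analysis described above rests on one basic primitive: testing which polygon in one layer (say, census tracts) contains each point in another layer (say, sampling sites). The sketch below illustrates that primitive with a standard ray-casting point-in-polygon test; the tract shapes, site coordinates, and the median-income attribute are invented for illustration, and real GIS work would use dedicated software or a library such as geopandas against actual census data.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: does `point` (x, y) fall inside `polygon`,
    given as a list of (x, y) vertices?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending right from the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# One "layer" of polygons (hypothetical census tracts, each with an
# attribute) and one "layer" of points (hypothetical sampling sites).
tracts = {
    "tract_A": {"shape": [(0, 0), (4, 0), (4, 4), (0, 4)], "median_income": 52000},
    "tract_B": {"shape": [(4, 0), (8, 0), (8, 4), (4, 4)], "median_income": 61000},
}
sites = {"site_1": (1, 1), "site_2": (6, 3)}

# Overlay the point layer on the polygon layer: for each site, find its tract
# and read off that tract's attribute, as a GIS spatial join would.
for site, xy in sites.items():
    for tract, info in tracts.items():
        if point_in_polygon(xy, info["shape"]):
            print(f"{site} falls in {tract} (median_income={info['median_income']})")
```

A full GIS generalizes this join to millions of features, many layers at once, and richer predicates (intersects, within a distance, overlaps), but the underlying question, which features of one layer coincide spatially with features of another, is the same.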
by Michael Roy, Middlebury College
Originally Published September 25th, 2006. A report on the NERCOMP SIG workshop Let No Good Deed Go Unpunished: Setting up Centralized Computational Research Support, 10/25/06
Back to the Future of Research Computing
As Clifford Lynch pointed out at a recent CNI taskforce meeting, the roots of academic computing are in research. The formation of computing centers on our campuses was originally driven by faculty and students who needed access to computer systems in order to tackle research questions. It was only years later that the idea of computers being useful in teaching came into play. And once that idea took hold, it seemed that we forgot about the research origins of academic computing.
Lynch argues that the pendulum is swinging back again, as campuses nationwide report an increased interest in having libraries and computer centers provide meaningful, sustainable and programmatic support for the research enterprise across a wide range of disciplines.
Andrew Fiss, Visiting Assistant Professor of Writing and History, Davidson College. Andrew Fiss is a visiting assistant professor in writing and history at Davidson College, where he teaches classes in the history of American science and also science writing. He received a doctoral degree in history and philosophy of science from Indiana University in 2011 and has also taught at Vassar College. In fall 2014, he will start as an assistant professor at Michigan Technological University.
Matthew Vest, Music Librarian, University of Virginia. Matthew Vest was the music librarian at Davidson College until the spring of 2014, when he joined the University of Virginia. At Davidson, he taught library instruction sessions, coordinated reference services, and managed music collections. He has a master of music degree in composition from Butler University and a master of library science from Indiana University.
Keywords: Science writing; Podcast writing; Information literacy; Liberal arts pedagogy
Our case study discusses an assignment that asks students to translate a specialist scientific article into a short broadcast segment: in our case, a podcast in the style of National Public Radio’s A Moment of Science (http://indianapublicmedia.org/amomentofscience/). The small environment of a liberal arts college facilitates this project through encouraging collaborations between classroom instruction, technology workshops, and information literacy sessions.
The assignment challenges students not only to communicate specialist information at an appropriately broad level but also to do so in an audio-only format. The students also work with the familiar, popular, and public outlet of radio or podcast, but in an unfamiliar way: as an academic endeavor. So, while students translate specialist texts for non-expert audiences, they also begin to consider the possibilities and limitations of digital broadcast content.
The case study provides further context for the assignment, giving learning outcomes and sharing the specific challenges and solutions the authors encountered while planning and implementing the assignment. It builds a theoretical framework around the nature of expertise in science writing. In doing so, it proposes a blended plan for teaching scientific and digital literacies in a liberal arts setting.