

Time, Space, Uncertainty

From ToolCenter

This is not an existing tools project but rather a proposal for a tools project arising from the Digital Tools Summit at the University of Virginia.

The Tool_Summit_Conclusions of the Tool Summit are being written on this wiki.

A key problem of historical research on material culture

Any interpretation of the past is an attempt to piece together fragmentary data, dispersed in space and time. Even so-called “intact” cultural-historical sites, brought to light after centuries or even millennia of deposition, are rarely perfect fossils or freeze-frames of human activity prior to their abandonment. In this sense, all work with material remains constitutes an archaeology of fragments. However, the degree and nature of such fragmentation can vary significantly. Historical landscapes coexisting with modern cityscapes produce far more fragmented records than open-air sites do. It is the spatial and temporal configuration of excavations in the city that causes the archaeological record to become further fragmented at the post-depositional stage: the sequence and duration of excavations are frequently determined by non-archaeological considerations (building activity, public works, etc.), and the sites are dispersed over large areas. It thus becomes problematic to keep track of hundreds of excavations, past and present, and to maintain a relational understanding of individual sites. On the other hand, historical habitation itself can also contribute to the fragmentation of the record. Places with a rich occupation history do not simply consist of a past and a present; rather, they comprise temporal sequences intertwined in three-dimensional space, and the destruction of archaeological remains caused by overlapping habitation levels can be considerable.

Existing Software Tools for Solving the Problem, and their Limits

To help us manage this “archaeology of fragments,” there are two sets of existing software tools: Geographic Information Systems (hereafter GIS) and 3D visualization software (a.k.a. “virtual reality”). GIS provides the tools to deal with the fragmentation of historical landscapes in contemporary urban settings, as it enables the integration and management of highly complex, disparate, and voluminous data sets. Spatial data can be integrated in a GIS platform with non-spatial data: topographical and architectural features represented by co-ordinates can be linked with descriptive or attribute data in textual or numeric format, and vector data (e.g., drawings produced by computer-aided design) can be integrated with raster data (e.g., imagery). GIS helps produce geometrically described thematic maps, which are underpinned by a wealth of diverse data. Through visualization, implicit spatial patterns in the data become explicit, a task that can be quite tedious outside a GIS environment; in this respect, GIS is a means to convert data into information. Visualization can be thought-provoking in itself, but the strength of GIS is not limited to cartographic functionality: it mainly lies in its analytical potential. Unlike conventional spatial analysis, GIS is equipped to analyze multiple features over space and time. The variability of features can be assessed in terms of distance, connectivity, juxtaposition, contiguity, visibility, use, clustering, etc. In addition, the overlay and integration of existing spatial and non-spatial data can generate new spatial entities and create new data. Such capabilities render GIS an ideal toolbox for grappling with the fragmented record of an archaeological site from a variety of perspectives.
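The core GIS operations described above can be illustrated with a minimal sketch: features located by co-ordinates are joined with descriptive attribute records, and a simple distance query then turns the combined data into information. All trench names, attributes, and co-ordinates below are invented for illustration.

```python
# Minimal sketch of GIS-style integration of spatial and non-spatial data.
# All names and values are hypothetical example data.
from math import hypot

# Spatial data: feature id -> (x, y) co-ordinates in an arbitrary local grid
coords = {
    "trench_A": (0.0, 0.0),
    "trench_B": (3.0, 4.0),
    "trench_C": (40.0, 9.0),
}

# Non-spatial attribute data: feature id -> descriptive record
attributes = {
    "trench_A": {"period": "Roman", "excavated": 1998},
    "trench_B": {"period": "Roman", "excavated": 2003},
    "trench_C": {"period": "Medieval", "excavated": 2001},
}

def features_within(center, radius):
    """Return attribute records for all features within `radius` of `center`."""
    cx, cy = center
    return {
        fid: attributes[fid]
        for fid, (x, y) in coords.items()
        if hypot(x - cx, y - cy) <= radius
    }

# A distance query joins geometry and attributes in one step:
# trench_A (distance 0) and trench_B (distance 5) fall within radius 10.
nearby = features_within((0.0, 0.0), 10.0)
```

A full GIS platform adds projections, topology, and overlay analysis on top of this basic join, but the principle, geometry keyed to attribute data and queried spatially, is the same.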

3D visualization software can add the third dimension to our data sets, enabling us not only to see the distribution and spacing of features on a flat time-map, but also to understand how the data cohere to form a picture of a lost world of the past. Once this world has been reconstructed as accurately as we can make it, we can do things of great use for historical research and instruction: re-experience what it was like to see and move about in the world as it was at an earlier stage of human history; and run experiments on how well buildings and urban infrastructure functioned in terms of illumination, ventilation, circulation, statics, etc.

While proprietary GIS and 3D visualization software packages exist, they were designed with the needs and interests of contemporary practitioners and problem-solvers in mind: geologists, sociologists, urban planners, architects, etc. Standard packages do not, for example, have something as simple as a time bar that shows changes over time in 2D (GIS) or 3D. Moreover, GIS and 3D software are typically distinct packages, whereas in historical research we would ideally like to see them integrated into one software suite.
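The "time bar" that standard packages lack could be as simple as the following sketch: each feature carries a date range, and moving the bar to a given year filters the map down to the features that existed then. The feature names and dates are invented for illustration.

```python
# Hypothetical sketch of a "time bar" filter over dated features.
# Negative years stand for BC; all example data is invented.
features = [
    # (name, start_year, end_year)
    ("forum", -50, 400),
    ("basilica", 310, 900),
    ("city_wall", 270, 1500),
]

def visible_at(year, feats=features):
    """Return the names of features whose occupation span includes `year`."""
    return [name for name, start, end in feats if start <= year <= end]

# At AD 350 all three features coexist; by AD 1000 only the city wall remains.
at_350 = visible_at(350)
at_1000 = visible_at(1000)
```

In a real 2D or 3D environment the same predicate would drive which layers or models are rendered as the user drags the bar.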

Finally, there is the matter of uncertainty and related issues, such as the ambiguity, imprecision, indeterminacy, and contingency of historical data. In the contemporary world, if an analyst needs to take a measurement or collect information about a feature, this is generally possible without great exertion. In contrast, analysts of historical data must often “make do” with what happens to survive or be recoverable from the historical record. This means that if historians (broadly defined) use GIS or 3D software to represent the lost world of the past, the software makes that world appear more complete than the underlying data may justify. Moreover, since the quality of the data gathered by a scholar or professional working on a site in the contemporary world is, as noted, generally not at issue, the existing software tools lack functions for the display and secondary analysis of data quality, something that is at the heart of historical research.

What is needed: an integrated suite of software tools (i.e., a software “machine”)

Participants in this group of the Tools Summit have been actively engaged with understanding and attempting to solve specific pieces of the overall problem. The problem of how to do justice to the complexities of time and its representation has been confronted by B. Robertson (HEML: software facilitating the combination of temporal data and underlying evidence); J. Drucker (Temporal Modeling: software for modeling not so much time per se as complex temporal relations); and S. Rab (modeling the “future of the past,” i.e., how cultural heritage sites should be integrated into designs for future urban development). The problem of the quality of the data underlying a 2D or 3D representation has been studied by D. Luebke (Stylized Rendering and “Pointworks” [Hui Xu, et al.]: software based on aesthetic conventions for representing incomplete 3D datasets); G. Guidi (handling uncertainty in laser scanning of existing objects); and S. Hermon (using fuzzy logic to calculate the overall quality of a 3D reconstruction based on the probabilities of the individual components of the structure being recreated).

Discussion of these various approaches to this cluster of related issues led us to think of the solution to the problem as entailing not so much a new software tool as a software “machine,” i.e., an integrated suite of tools. This machine would allow us to collect data, evaluate them as to their reliability/probability, set them into a time frame, define their temporal relationships with other features of interest in our study, and, finally, represent their degree of (un)certainty by visual conventions of stylized rendering and by the mathematical expressions derived from fuzzy logic.
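As a rough illustration of the fuzzy-logic component of such a machine (not S. Hermon's actual method, whose details are not given here), the overall reliability of a reconstruction can be aggregated from per-component certainties. A fuzzy AND (minimum) gives a pessimistic overall score in which the weakest component dominates; a mean gives a softer aggregate. Component names and values are invented.

```python
# Illustrative fuzzy-logic aggregation of per-component certainty values
# for a 3D reconstruction. All components and numbers are hypothetical.
component_certainty = {
    "foundations": 0.95,  # surveyed in situ
    "walls": 0.80,        # partially preserved
    "roof": 0.40,         # inferred by analogy with similar buildings
}

def fuzzy_and(values):
    """Pessimistic overall certainty: the weakest component dominates."""
    return min(values)

def mean_certainty(values):
    """Softer aggregate: the simple average of component certainties."""
    values = list(values)
    return sum(values) / len(values)

overall = fuzzy_and(component_certainty.values())       # 0.40
average = mean_certainty(component_certainty.values())  # about 0.72
```

Either score could then drive a stylized-rendering convention, for example rendering low-certainty components in a schematic or desaturated style.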

THRESHOLD is the proposed name for this software machine. It stands for “Temporal-historical research environment for scholarship.” The purpose of THRESHOLD is to provide humanists with an integrated working and display environment for historical research that allows relationships between different kinds of data to be visualized. There are seven goals that this machine needs to meet, listed below. The first five are based on Colin Ware’s work on visualization; we added items six and seven.

1) facilitating understanding of large amounts of data; 2) making unanticipated emergent properties in the data perceptible; 3) making problems in the quality of the data apparent; 4) promoting understanding of large- and small-scale features of the data; 5) facilitating hypothesis formation; 6) promoting interpretation of the data; 7) supporting different values of, and perspectives on, the data.

THRESHOLD would function in two modes: authoring and exploring. Two pilot projects were identified as suitable testbeds for THRESHOLD: C. Shifflett’s Jamestown and the Atlantic World, and S. Rab’s study of the future of cultural heritage sites in Sharjah (U.A.E.).

Written by Bernard Frischer.


Ware, Colin. Information Visualization. 2nd ed. San Francisco: Morgan Kaufmann Publishers, 2000.


This page was last modified on 10 February 2006, at 21:05.

