Sunday, September 3, 2017

Close-range photogrammetry models of rock art (1/3)

In fall 2016, Christian Driver and I began experimenting with close-range photogrammetry to document rock art. We help Boulder manage rock art, specifically a series of 80+ rock art panels in eastern Boulder County. Many of the panels in the area are too weathered to document or share satisfactorily with the public. Since all of the currently detectable rock art is in relief, popular image transformations such as principal components analysis or the decorrelation stretch are ineffective. Both of these processes work by removing the correlation between an image's color bands so that the remaining (read: important) differences in the signal can be exaggerated and seen. 
  • For an explanation of how Dstretch works, read Dr. Ronald E. Alley's white paper. Before reading Dr. Alley's paper, I didn't realize that the Jet Propulsion Lab originally developed the algorithm behind Dstretch for ASTER, a 14-band multispectral sensor. I'm interested in learning more about how the dramatically fewer bands in the typical RGB camera sensors used for rock art change the dimensionality of the algorithm. (A rough sketch of the decorrelation stretch itself follows below.) 
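
To make that concrete, here is a minimal sketch of a decorrelation stretch in Python/NumPy. This is my own illustration of the general technique, not DStretch's actual code; the function name, the target_sigma default, and the file name in the usage comment are assumptions.

```python
import numpy as np

def decorrelation_stretch(rgb, target_sigma=50.0):
    """Decorrelate the three color bands, equalize their variance,
    and rotate back to RGB so subtle color differences are exaggerated.
    Assumes 0-255 pixel values in an (H, W, 3) array."""
    pixels = rgb.reshape(-1, 3).astype(np.float64)
    mean = pixels.mean(axis=0)
    centered = pixels - mean

    # Covariance between bands; its eigenvectors give the decorrelating rotation.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)

    # Rotate into the decorrelated (principal component) space, scale every
    # component to the same target standard deviation, then rotate back to RGB.
    scale = target_sigma / np.sqrt(np.maximum(eigvals, 1e-12))
    transform = eigvecs @ np.diag(scale) @ eigvecs.T
    stretched = centered @ transform + mean

    return np.clip(stretched, 0, 255).reshape(rgb.shape).astype(np.uint8)

# Usage (hypothetical file name):
# stretched = decorrelation_stretch(imageio.imread("panel_photo.jpg"))
```

The eigendecomposition of the band covariance is the same step a principal components analysis performs; the stretch simply equalizes the component variances before rotating back. When the rock art has essentially the same spectral signature as the surrounding rock, there is nothing for either transformation to exaggerate, which is the problem described above.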

While the spectral information of the rock art is not sufficiently different from that of the surrounding rock to distinguish it, the spatial information is, if you can capture it. Traditional rock art documentation was completed between 1984 and 1997 by the Indian Peaks chapter of the Colorado Archaeological Society (Kindig 1997). These recordings, while invaluable for identifying and relocating panels, have a few limitations. They are not to scale, do not capture the type or method of rock art creation, do not capture condition or imminent threats (cracks, sloughing, vandalism, infiltration), and do not document the surrounding rock surface (texture, bedding, color, shape/volume). What data model maximizes production of these data for the least time and cost? Three-dimensional (3D) models are a great start. 
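
To illustrate the point about spatial information, here is a hedged sketch of one way to visualize relief once a 3D model exists: sample a depth map off the model (the panel_depth.npy file name, millimeter units, and cell size below are hypothetical) and hillshade it with a synthetic light source, which makes shallow relief visible even where the color is uniform.

```python
import numpy as np

def hillshade(depth, azimuth_deg=315.0, altitude_deg=45.0, cellsize=1.0):
    """Shade a depth/elevation grid with a synthetic light source
    (standard Lambertian hillshade; aspect convention simplified)."""
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)

    # Surface slope and aspect from the depth gradients.
    dy, dx = np.gradient(depth, cellsize)
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)

    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# Usage (hypothetical file and grid spacing):
# relief = hillshade(np.load("panel_depth.npy"), cellsize=0.5)  # 0.5 mm grid
```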

There are several ways to capture these 3D data, and many of them are demonstrably "whiz-bang," as many of my bosses say. Terrestrial laser scanning was the fastest and most expensive method we found; it maximizes spatial extent and spatial detail but is less effective at capturing spectral information. We focused on close-range photogrammetry for several reasons: it's fast; it's cheap (the cost of a sufficiently high-quality DSLR camera (thanks Christian!), lenses (usually 50 mm), and software, unless you go open source); the spectral, spatial, and textural data it captures are of sufficient resolution; and it's reproducible. 

Check out some examples of our 3D models so far. The models are color-balanced and to scale. Sometimes the spatial extent of a data capture is limited by our computer processing capacity; I'm hoping to experiment with parallel processing and with RAM and processor upgrades in the next few months. 

Please leave comments on either this page or Sketchfab letting me know whether you enjoy seeing the models, what you did and did not like about them, and any changes you would suggest. 

-------------------------------------------------------------------------------------------------------------

Oil derrick



"Kirkendoll"


"WFM/ 1904"