Archaeological excavations recover huge quantities of material, usually in the form of fragments. Their correct interpretation and classification are laborious and time-consuming and require the measurement, analysis and comparison of many items. Basing these activities on quantitative methods that process 3D digital data from experimental measurements optimizes the entire restoration process, making it faster, more accurate and cheaper. The 3D point clouds captured by the scanning process are raw data that must be properly processed before they can be used in automatic systems for the analysis of archaeological finds. This paper focuses on the integration of a shape feature recognizer, able to support the semantic decomposition of an ancient artifact into archaeological features, with a structured database, able to query the large amount of extracted information. The automatic measurement of the dimensional attributes of the various features facilitates comparative analyses between archaeological artifacts, supports the archaeologist's inferences and reduces routine work. A dedicated database is proposed here, able to store the information extracted from huge quantities of archaeological material by a specific shape feature recognizer. This information is useful for making comparisons but also for improving archaeological knowledge. The database has been implemented and used for the identification of pottery fragments and the reconstruction of archaeological vessels. Reconstruction, in particular, often requires the solution of complex problems, especially when it involves types of potsherds that cannot be treated with traditional methods.
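To make the idea of a structured database of recognized shape features concrete, the following is a minimal sketch of how fragments and their measured dimensional attributes could be stored and queried. All table and column names here are illustrative assumptions, not the schema actually used in the paper:

```python
import sqlite3

# Hypothetical minimal schema: one table for pottery fragments, one for the
# archaeological features (rim, handle, base, ...) extracted from each
# fragment's 3D point cloud, with their measured dimensional attributes.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fragment (
    id INTEGER PRIMARY KEY,
    excavation_site TEXT,
    material TEXT
);
CREATE TABLE feature (
    id INTEGER PRIMARY KEY,
    fragment_id INTEGER REFERENCES fragment(id),
    feature_type TEXT,   -- e.g. 'rim', 'handle', 'base'
    diameter_mm REAL,    -- dimensional attributes measured automatically
    thickness_mm REAL
);
""")

# Example records as a shape feature recognizer might produce them.
conn.execute("INSERT INTO fragment VALUES (1, 'Site A', 'terracotta')")
conn.execute("INSERT INTO feature VALUES (1, 1, 'rim', 182.0, 6.5)")
conn.execute("INSERT INTO feature VALUES (2, 1, 'base', 95.0, 8.1)")

# Comparative query: rim features whose diameter is close to a target value,
# the kind of search that supports matching sherds to a vessel profile.
rows = conn.execute(
    "SELECT fragment_id, diameter_mm FROM feature "
    "WHERE feature_type = 'rim' AND ABS(diameter_mm - 180.0) < 10"
).fetchall()
print(rows)  # [(1, 182.0)]
```

Keeping the features in a relational store like this is what enables the comparative analyses the abstract describes: similarity queries over dimensional attributes replace manual side-by-side measurement of sherds.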
|Title:||A 3D informational database for automatic archiving of archaeological pottery finds|
|Publication date:||2021|
|Appears in type:||1.1 Journal article|
Files in this product:
|A 3D Informational Database for Automatic Archiving of Archaeological Pottery Finds_compressed.pdf||Published version||Open Access|