Machine-based understanding of manually collected visual and auditory datasets for urban perception studies
Highlights
- Urban perception studies are primarily based on the visual medium as a stimulus.
- The study includes multimodal and temporal sensory data collection.
- Street-view photographs and audio are captured in urban streets.
- Deep learning classifiers are used to detect sources of sounds and visible street elements.
Abstract While previous studies have attempted to quantify the environmental characteristics that stimulate different emotions and perceptions, the temporal aspect of transient urbanscapes and multimodality in data collection and analysis have not received enough attention. The characteristics of the urban environment fluctuate with time; time-based sensory data collection is therefore critical for a comprehensive assessment of the surroundings. This research note proposes a methodology for temporal and multimodal data collection and demonstrates the use of a smartphone camera and a handheld recorder to manually collect visual and audio datasets simultaneously from urban streets. It utilizes state-of-the-art computer algorithms to extract attributes from the visual and auditory datasets, which can further help in establishing comparisons and relationships between different urban spaces. Altogether, this study presents an idea and a working proof of concept for data collection and automated analysis that can aid decision makers and urban planners in creating lively, safe and healthy environments.
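The abstract emphasizes that photographs and audio are collected simultaneously, so downstream analysis must align the two streams in time. The paper does not publish its alignment code; the sketch below is a minimal, hypothetical illustration (all function and variable names are the editor's own) of matching each photo's capture timestamp to the fixed-length audio clip whose recording window contains it.

```python
from datetime import datetime, timedelta

def pair_frames_with_audio(frame_times, clip_starts, clip_len_s=10.0):
    """Map each photo capture time to the start time of the audio clip
    whose recording window [start, start + clip_len_s) contains it.
    Photos that fall outside every clip window are left unpaired."""
    pairs = {}
    for ft in frame_times:
        for cs in clip_starts:
            if cs <= ft < cs + timedelta(seconds=clip_len_s):
                pairs[ft] = cs
                break
    return pairs

# Illustrative timestamps: two 10-second audio clips and three photos.
t0 = datetime(2019, 6, 16, 10, 0, 0)
clips = [t0, t0 + timedelta(seconds=10)]
frames = [t0 + timedelta(seconds=3),
          t0 + timedelta(seconds=12),
          t0 + timedelta(seconds=25)]  # outside both clip windows
pairs = pair_frames_with_audio(frames, clips)
```

Once paired, each (photo, clip) tuple can be fed to separate image and audio classifiers, and their outputs compared for the same moment in the street. Real sensor clocks drift, so a production pipeline would also need clock synchronization or a tolerance window; that is omitted here for brevity.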
Verma, Deepank (author) / Jana, Arnab (author) / Ramamritham, Krithi (author)
2019-06-16
Article (Journal)
Electronic Resource
English
Similar items:
Comparison of automatically and manually collected pan evaporation data (Tema Archive, 2000)
Understanding urban perception with visual data: A systematic review (Elsevier, 2024)
Auditory and Visual Contribution to Egocentric Distance and Room Size Perception (SAGE Publications, 2013)