Multimodal Data Fusion and Deep Learning for Occupant-Centric Indoor Environmental Quality Classification
Amidst the growing recognition of the impact of indoor environmental conditions on buildings and on occupant comfort, health, and well-being, there has been an increasing focus on the assessment and modeling of indoor environmental quality (IEQ). Despite considerable advancements, existing IEQ modeling methodologies often prioritize, and are limited to, singular comfort metrics, potentially neglecting the broader set of factors associated with occupant comfort and health. There is a need for more inclusive and occupant-centric IEQ assessment models that cover a wider spectrum of environmental parameters and occupant needs. Such models require integrating diverse environmental and occupant data, which poses challenges in leveraging data across modalities and time scales and in understanding temporal patterns, relationships, and trends. This paper proposes a novel framework for classifying IEQ conditions based on occupant self-reported comfort and health levels to address these challenges. The proposed framework leverages a multimodal data-fusion approach with Transformer-based models, aiming to accurately predict indoor comfort and health levels by integrating diverse data sources, including multidimensional IEQ data and multimodal occupant feedback. The framework was evaluated in classifying IEQ conditions of selected public indoor spaces and achieved 97% and 96% accuracy in comfort- and health-based classifications, respectively, outperforming several baselines.
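To make the described architecture concrete, the following is a minimal sketch of one way a multimodal data-fusion Transformer classifier of this kind could be organized: an IEQ sensor time series and an occupant-feedback vector are projected into a shared token space, encoded jointly by a Transformer encoder, and classified via a learnable classification token. This is an illustrative assumption in PyTorch, not the authors' implementation; all names, dimensions, and the choice of a CLS-token readout are hypothetical, and positional encodings are omitted for brevity.

    # Hypothetical sketch (not the paper's code): fuse IEQ sensor time series
    # with occupant feedback and classify comfort (or health) levels.
    import torch
    import torch.nn as nn

    class MultimodalIEQClassifier(nn.Module):
        def __init__(self, n_ieq_channels=8, n_feedback_features=16,
                     d_model=64, n_heads=4, n_layers=2, n_classes=3):
            super().__init__()
            # Project each IEQ time step (e.g., temperature, humidity, CO2) to d_model.
            self.ieq_proj = nn.Linear(n_ieq_channels, d_model)
            # Project the occupant-feedback vector (e.g., survey responses) to d_model.
            self.feedback_proj = nn.Linear(n_feedback_features, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
            # Learnable classification token; its final state feeds the classifier head.
            self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
            self.head = nn.Linear(d_model, n_classes)

        def forward(self, ieq_series, feedback):
            # ieq_series: (batch, time, n_ieq_channels); feedback: (batch, n_feedback_features)
            b = ieq_series.size(0)
            tokens = torch.cat([
                self.cls_token.expand(b, -1, -1),           # classification token
                self.feedback_proj(feedback).unsqueeze(1),  # one token for occupant feedback
                self.ieq_proj(ieq_series),                  # one token per sensor time step
            ], dim=1)
            encoded = self.encoder(tokens)
            return self.head(encoded[:, 0])  # logits over comfort/health classes

    if __name__ == "__main__":
        model = MultimodalIEQClassifier()
        logits = model(torch.randn(4, 60, 8), torch.randn(4, 16))
        print(logits.shape)  # torch.Size([4, 3])

In this early-fusion arrangement the two modalities share one encoder, so attention can relate occupant feedback directly to individual sensor time steps; a late-fusion variant with separate encoders per modality would be an equally plausible reading of the abstract.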
Multimodal Data Fusion and Deep Learning for Occupant-Centric Indoor Environmental Quality Classification
J. Comput. Civ. Eng.
Lee, Min Jae (author) / Zhang, Ruichuan (author)
2025-03-01
Article (Journal)
Electronic Resource
English
Indoor environmental quality and occupant satisfaction in green-certified buildings
Taylor & Francis Verlag | 2019