Interactive Visual Interfaces

At IPL, we couple human behavior simulation with interactive visual interfaces to support the dynamic exploration of human-environment interactions. Our goal is to support the decision-making of architects, planners, engineers, and building managers during both the design and operations phases of a building. A key aspect of this approach is coupling parametric models of buildings and occupants to integrate human behavior simulation into architectural design workflows, using an intuitive visual scripting approach accessible to designers without coding experience.
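As a minimal sketch of this coupling (all function and parameter names are hypothetical, not part of any IPL tool), a visual-scripting node might wrap a parametric room model and feed its derived properties into a simple occupant model:

```python
# Hypothetical sketch: a parametric room model coupled with a
# simple occupant model, the kind of pairing a visual-scripting
# node could expose to designers without coding experience.

def make_room(width_m, depth_m, desk_count):
    """Parametric room: floor area and furniture derived from parameters."""
    return {"area_m2": width_m * depth_m, "desks": desk_count}

def simulate_occupancy(room, occupants):
    """Seat occupants at free desks; report crowding metrics."""
    seated = min(occupants, room["desks"])
    standing = occupants - seated
    density = occupants / room["area_m2"]
    return {"seated": seated,
            "standing": standing,
            "density_per_m2": round(density, 3)}

room = make_room(width_m=6.0, depth_m=4.0, desk_count=5)
result = simulate_occupancy(room, occupants=7)
# result -> {'seated': 5, 'standing': 2, 'density_per_m2': 0.292}
```

Because both the room and the occupant model are parametric, a designer can sweep `desk_count` or room dimensions and immediately see how occupant metrics respond.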


Interfaces for human-building interaction simulations can include natural language processing to guide the design process and facilitate the simulation of occupants' behavior. A natural language processing interface can rely on multimodal Large Language Models (LLMs), such as GPT-4 developed by OpenAI, or on other knowledge-based systems that capture human-building interactions using ontologies.
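To illustrate the ontology-based route, the toy sketch below (all names and phrases are hypothetical) maps a designer's natural-language request onto simulation actions drawn from a small human-building interaction ontology; a production system would delegate this mapping to an LLM such as GPT-4 rather than keyword matching:

```python
# Toy stand-in for an NLP interface: a small ontology of
# human-building interactions, queried by keyword matching.
# All entries are illustrative examples, not a real schema.

ONTOLOGY = {
    "open window": {"actor": "occupant", "target": "window", "action": "open"},
    "adjust blinds": {"actor": "occupant", "target": "blinds", "action": "adjust"},
    "turn on lights": {"actor": "occupant", "target": "lighting", "action": "switch_on"},
}

def parse_request(text):
    """Return every ontology action whose key phrase appears in the request."""
    text = text.lower()
    return [spec for phrase, spec in ONTOLOGY.items() if phrase in text]

actions = parse_request("Simulate occupants who open windows on warm afternoons")
# actions -> [{'actor': 'occupant', 'target': 'window', 'action': 'open'}]
```

The same interface contract (free text in, structured simulation actions out) holds whether the parser is a keyword matcher, an ontology reasoner, or an LLM.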


A vital requirement of interactive visual interfaces is to efficiently and effectively communicate the setup and outcomes of human behavior simulations to stakeholders with diverse backgrounds, including architects, engineers, planners, end-users, and building managers. Such an interface should also enable comparative evaluation of alternative scenarios, where the simulation inputs differ in their assumptions about building design or building use.
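A comparative evaluation of this kind can be sketched as running the same occupant metric over alternative design inputs and presenting the results side by side. In the example below, the scenario names, desk layouts, and the walk-distance metric are all hypothetical illustrations:

```python
# Illustrative sketch: evaluate alternative design scenarios with a
# shared occupant metric, producing the side-by-side numbers a
# comparative interface would render for stakeholders.

def mean_walk_distance(desk_positions, entrance=(0.0, 0.0)):
    """Average straight-line distance from the entrance to each desk."""
    dists = [((x - entrance[0]) ** 2 + (y - entrance[1]) ** 2) ** 0.5
             for x, y in desk_positions]
    return sum(dists) / len(dists)

# Two alternative desk layouts (x, y in meters) for the same room.
scenarios = {
    "open_plan": [(2.0, 1.0), (4.0, 1.0), (6.0, 1.0)],
    "clustered": [(1.0, 1.0), (1.5, 1.0), (2.0, 1.0)],
}

report = {name: round(mean_walk_distance(desks), 2)
          for name, desks in scenarios.items()}
# report -> {'open_plan': 4.15, 'clustered': 1.82}
```

Keeping the metric fixed while swapping the scenario inputs is what makes the comparison meaningful to stakeholders who never see the underlying simulation code.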

