Research

My research focuses on modeling and simulation of human behavior for emergency response and decision making, with an emphasis on multi-agent systems (MAS), multi-user virtual reality (MUVR), and mobile augmented reality applications (MARA). I am interested in merging Data Science and Virtual Reality for advanced visualization. I specialize in performing virtual evacuation drills for emergency and terror events in Multi-User Virtual Reality (MUVR) environments such as a megacity, subway, airplane, bus, and university campus. Current research projects can be found on the DVXR Lab website here, and a list of students is available here.

My Google Scholar page is linked here, and my ORCID profile is linked here.

Mobile Augmented Reality Applications (MARA)

This research develops the science needed to enhance mobile augmented reality applications with (a) Situational awareness, (b) Geospatial data, (c) Navigation, (d) Evacuation, and (e) Emergency response. The projects advance visualization techniques that let mobile applications enhance the viewing of the physical world while promoting contextualized 3D visualizations, spatial knowledge acquisition, and cognitive mapping, thereby enhancing situational awareness. A range of use cases is tested, including data visualization and immersive data spaces, in-situ visualization of 3D models, and full-scale architectural form visualization.

Data Science and Data Visualization

Human-centric situational awareness and visualization are needed to analyze big data efficiently. One challenge is to create algorithms that analyze the given data without relying on other data-analysis tools. This research effort aims to identify how graphical objects (such as data-shapes) developed in accordance with an analyst's mental model can enhance the analyst's situational awareness. Our approach to improved big data visualization focuses on both visualization and interaction. The projects include: 1) iHARP (NSF HDR Institute): Annotation and Visualization of Heterogeneous Data, 2) Analysis of Crime, 3) Scientific Data Visualization of COVID-19 data and crime data in Baltimore, 4) COVID-19 Data Visualization, 5) Crime Data in Baltimore Visualization, 6) Scientific Data Visualization, and 7) Data Analytics: Improving the Quality of Life in Urban Areas.

Active Shooter Response and Training

The goal of this NSF-funded project is to develop and evaluate a collaborative immersive VR environment for active shooter response on the UNT and BSU campuses. The contribution lies in our approach of combining computer-simulated agents (AI agents) and user-controlled autonomous agents in a collaborative virtual environment to conduct emergency response training for civilians and security personnel. The immersive collaborative VR environment offers a unique method for training in emergencies for campus safety.
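To illustrate how computer-simulated agents and user-controlled avatars can coexist in one collaborative scene, the sketch below defines a shared controller abstraction that the simulation can step uniformly each tick. This is a minimal Java sketch under assumed names; Command, WorldView, CharacterController, and the simple "run to the nearest exit" policy are illustrative, not the project's actual API.

```java
// Minimal sketch: one controller contract so AI-driven agents and
// user-controlled avatars are updated the same way in the shared scene.
// All names here are illustrative assumptions, not the project's API.

// A movement command produced each simulation tick.
record Command(double moveX, double moveY, boolean interact) {}

// Read-only view of the shared virtual environment used for decision making.
interface WorldView {
    double[] nearestExit(double x, double y);   // {x, y} of the closest exit
    boolean threatVisible(double x, double y);  // is a threat in line of sight?
}

// Both AI agents and human avatars implement the same contract,
// so the server can step every character in a single loop.
interface CharacterController {
    Command nextCommand(double x, double y, WorldView world, double dt);
}

// Computer-simulated agent: a simple "run to the nearest exit when a threat is seen" policy.
class AiAgentController implements CharacterController {
    public Command nextCommand(double x, double y, WorldView world, double dt) {
        if (world.threatVisible(x, y)) {
            double[] exit = world.nearestExit(x, y);
            return new Command(exit[0] - x, exit[1] - y, false);
        }
        return new Command(0, 0, false);        // idle until a threat is detected
    }
}

// User-controlled avatar: forwards the latest input received from the VR client.
class UserAvatarController implements CharacterController {
    private volatile Command latestInput = new Command(0, 0, false);
    void onClientInput(Command input) { latestInput = input; }
    public Command nextCommand(double x, double y, WorldView world, double dt) {
        return latestInput;
    }
}
```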

Virtual Reality Instructional (VRI) Modules for Teaching Complex Topics and for Training to Improve Quality of Care and Patient Safety

The goal of this research work is to develop virtual reality instructional (VRI) modules for teaching, health care, training, and manufacturing. The projects include: 1) Virtual AI Tutor using Generative AI, 2) Digital Twin for Exercise Techniques and Rehabilitation Experience, 3) VR instructional course curriculum modules with inquiry-based problem-solving activities and hands-on, game-based experiences for teaching complex topics, 4) training integrated care team members to engage patients from vulnerable populations safely and efficiently, 5) development of training modules geared toward COVID-19 testing, and 6) VR Assembly.

Multi‐User Virtual Reality (MUVR)

MUVR environments for emergency evacuation drills have been developed, including subway evacuation, airplane evacuation, school bus evacuation, VR city, nightclub disaster evacuation, building evacuation, and university campus evacuation. Our applications provide an immersive collaborative virtual reality environment for performing virtual evacuation drills using head-mounted displays, which offers a unique way to train for emergency situations. Participants can enter the collaborative virtual reality environment, hosted in the cloud, and take part in an evacuation drill or a tour, which leads to considerable cost advantages over large-scale real-life exercises.

Multi-Agent System (MAS)

Two multi-agent systems have been developed and evaluated, namely AvatarSim and AvatarSim2. AvatarSim was developed in Java and AvatarSim2 in C#. The AvatarSim model comprises three sub-models: a) a geometrical model, b) a social force model, and c) a fuzzy behavioral model. AvatarSim2 further combines a genetic algorithm (GA) with neural networks (NNs) and fuzzy logic (FL) to explore how intelligent agents can learn and adapt their behavior during an evacuation. Adaptive behavior focuses on individual agents changing their behavior in the environment, while shared behavior emphasizes crowd modeling and emergency behavior in the multi-agent system.
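To make the social force component concrete, below is a minimal Java sketch of a generic Helbing-style social force update of the kind such a model typically uses: a driving force toward the current goal plus exponential repulsion from nearby agents. The class names, fields, and force constants (Vec2, Agent, A, B, tau) are illustrative assumptions, not AvatarSim's actual code.

```java
// A minimal social force step for one agent: drive toward the goal,
// repel from neighbors. Constants follow commonly cited values but are
// placeholders here, not AvatarSim's calibrated parameters.
import java.util.List;

class Vec2 {
    double x, y;
    Vec2(double x, double y) { this.x = x; this.y = y; }
    Vec2 add(Vec2 o) { return new Vec2(x + o.x, y + o.y); }
    Vec2 sub(Vec2 o) { return new Vec2(x - o.x, y - o.y); }
    Vec2 scale(double s) { return new Vec2(x * s, y * s); }
    double len() { return Math.hypot(x, y); }
    Vec2 unit() { double l = len(); return l > 1e-9 ? scale(1.0 / l) : new Vec2(0, 0); }
}

class Agent {
    Vec2 pos, vel, goal;       // position, velocity, current exit or waypoint
    double mass = 80.0;        // kg
    double desiredSpeed = 1.3; // m/s; a fuzzy behavioral model could raise or lower this
    double radius = 0.3;       // m

    // One social-force step: driving force toward the goal plus
    // exponential repulsion from nearby agents.
    void step(List<Agent> neighbors, double dt) {
        double tau = 0.5;                        // relaxation time (s)
        Vec2 desiredVel = goal.sub(pos).unit().scale(desiredSpeed);
        Vec2 force = desiredVel.sub(vel).scale(mass / tau);

        double A = 2000.0, B = 0.08;             // repulsion strength and range
        for (Agent other : neighbors) {
            if (other == this) continue;
            Vec2 diff = pos.sub(other.pos);
            double overlap = (radius + other.radius) - diff.len();
            force = force.add(diff.unit().scale(A * Math.exp(overlap / B)));
        }

        vel = vel.add(force.scale(dt / mass));   // integrate acceleration
        pos = pos.add(vel.scale(dt));            // integrate velocity
    }
}
```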

Augmented Reality and HoloLens Applications

This work presents cutting-edge augmented reality applications that overcome the visual limitations associated with traditional, static 2D methods of communicating evacuation plans for multilevel buildings. Using existing building features, we demonstrate how the AR instructional modules provide contextualized 3D visualizations that promote and support spatial knowledge acquisition and cognitive mapping, thereby enhancing situational awareness. These AR visualizations are developed for first responders and building occupants to help increase emergency preparedness. The projects include (a) Spatial Analysis at UNT, (b) Navigation, (c) Geospatial Analysis, (d) Situational Awareness, (e) Intelligent Signs, (f) Evacuation, and (g) Emergency Response.

Megacity: A CVE for Emergency Response, Training, and Decision Making

The goal of this research is to use game creation as a metaphor for building an experimental setup to study human behavior in a megacity for emergency response, decision-making strategies, and what-if scenarios. It incorporates user-controlled characters as avatars and computer-controlled characters as agents in the megacity CVE (Collaborative Virtual Reality Environment). Virtual crowds in non-combative environments play an important role in modern military operations and often create complications for the combatant forces involved. To address this problem, we are developing a crowd simulation capable of generating crowds of non-combative civilians that exhibit a variety of individual and group behaviors at different levels of fidelity.
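One common way to run large crowds at different levels of fidelity is a distance-based update schedule: agents near the viewpoint receive the full behavioral update every tick, while distant agents receive a cheaper, less frequent update. The Java sketch below illustrates that idea, reusing the Agent and Vec2 types from the social force sketch above; the radius, update rate, and class name are assumptions for illustration, not the project's actual scheme or values.

```java
// Illustrative level-of-fidelity scheduler for a megacity crowd:
// full social-force updates near the camera, coarse waypoint-following
// at one fifth the rate for distant agents. Thresholds are placeholders.
import java.util.List;

class CrowdSimulator {
    static final double HIGH_FIDELITY_RADIUS = 50.0;  // meters from the viewpoint
    private long tick = 0;

    void update(List<Agent> crowd, Vec2 camera, double dt) {
        tick++;
        for (Agent a : crowd) {
            double dist = a.pos.sub(camera).len();
            if (dist < HIGH_FIDELITY_RADIUS) {
                a.step(crowd, dt);                      // full social-force update
            } else if (tick % 5 == 0) {
                // Coarse update: slide toward the goal, compensating for the lower rate.
                a.pos = a.pos.add(a.goal.sub(a.pos).unit().scale(a.desiredSpeed * dt * 5));
            }
        }
    }
}
```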

Human Centric Cyber Situation Awareness and Data Visualization

The goal of this project is to take the cyber situational awareness capability of an enterprise to the next level by developing holistic, human-centric situational awareness approaches and building them into new systems that can achieve self-awareness. This research effort aims to identify how graphical objects (such as data-shapes) developed in accordance with an analyst's mental model can enhance the analyst's situational awareness. Humans are adept at inferring meaning from graphical objects, links, and associations in a data element. The project uses virtual reality techniques to visualize XML data through a force-directed node graph in 3D that renders and updates in real time, and it can be used to visualize computer networks for cyber-attack analysis.
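As a sketch of the kind of 3D force-directed layout loop described above, the Java snippet below performs one relaxation step per frame: all node pairs repel, connected nodes attract along their edges, and positions move a damped step along the net force. The node/edge structures and constants are illustrative assumptions; the actual system parses XML into the graph and renders it in VR.

```java
// Minimal 3D force-directed layout step, called once per frame for a
// live, continuously updating graph. Constants are illustrative only.
import java.util.List;

class Node3D {
    double x, y, z;    // layout position
    double fx, fy, fz; // force accumulated this iteration
}

class Edge { Node3D a, b; Edge(Node3D a, Node3D b) { this.a = a; this.b = b; } }

class ForceDirectedLayout3D {
    static final double REPULSION = 0.5;   // pushes all node pairs apart
    static final double SPRING    = 0.05;  // pulls connected nodes together
    static final double DAMPING   = 0.85;  // step size per iteration

    static void step(List<Node3D> nodes, List<Edge> edges) {
        for (Node3D n : nodes) { n.fx = 0; n.fy = 0; n.fz = 0; }

        // Pairwise repulsion (O(n^2); a spatial grid or octree would scale better).
        for (int i = 0; i < nodes.size(); i++) {
            for (int j = i + 1; j < nodes.size(); j++) {
                Node3D a = nodes.get(i), b = nodes.get(j);
                double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
                double d2 = dx * dx + dy * dy + dz * dz + 1e-6;
                double f = REPULSION / d2;
                a.fx += f * dx; a.fy += f * dy; a.fz += f * dz;
                b.fx -= f * dx; b.fy -= f * dy; b.fz -= f * dz;
            }
        }

        // Spring attraction along edges.
        for (Edge e : edges) {
            double dx = e.b.x - e.a.x, dy = e.b.y - e.a.y, dz = e.b.z - e.a.z;
            e.a.fx += SPRING * dx; e.a.fy += SPRING * dy; e.a.fz += SPRING * dz;
            e.b.fx -= SPRING * dx; e.b.fy -= SPRING * dy; e.b.fz -= SPRING * dz;
        }

        // Move each node a damped step along its net force.
        for (Node3D n : nodes) {
            n.x += DAMPING * n.fx;
            n.y += DAMPING * n.fy;
            n.z += DAMPING * n.fz;
        }
    }
}
```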