PROJECTS

Prism XR - A Curated Exhibition Experience in Virtual Reality with Peer Annotation Features and Virtual Guides for Art and Archaeology Classes

Research Project for Georgia Tech CS 6460 Educational Technology, 2024
Built with Unity3D, OpenXR, and 3D Assets from AK Studio Art

Prism XR is a curated exhibition experience in virtual reality (VR) for art and archaeology education, with features designed to enhance interactivity and collaborative learning. The project integrates peer annotations and a virtual exhibition guide to augment the educational experience. The peer annotation features facilitate visitor critiques and comments, fostering dialogue both between the curator and the audience and among visitors, practices shown to have positive impacts on learning motivation and outcomes in art and archaeology education. The virtual exhibition guide addresses the isolation of the virtual exhibition space and increases interactivity in the virtual curatorial experience.

Paper (arXiv) | Demo Video | Project Repo

Astrodex - An Interactive Visualized Codex of Celestial Bodies

Collaboration with Ruijun Liu, 2020
Built with D3.js, Aladin Lite API, and HYG 1.1 Dataset

This project aims to create a visualized codex of the stars documented in the HYG celestial body dataset. On one hand, it visualizes the distribution of the stars' non-spatial features, offering the audience a way to explore stars by shared qualities. On the other hand, it lets the user explore selected stars in space, showing their spatial layout alongside real telescopic snapshots retrieved via the Aladin Lite API.
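As an illustrative sketch (not the project's D3.js code), grouping HYG records by broad spectral class produces the kind of non-spatial distribution the codex view visualizes; the dict-based records and the "spect" field name follow the HYG dataset's conventions but are assumptions here:

```python
from collections import Counter

def spectral_distribution(stars):
    """Count stars per broad spectral class (O, B, A, F, G, K, M).

    Each star is assumed to be a dict with a 'spect' field, as in the
    HYG dataset; records with missing or unusual classes are skipped.
    """
    counts = Counter()
    for star in stars:
        spect = (star.get("spect") or "").strip().upper()
        if spect and spect[0] in "OBAFGKM":
            counts[spect[0]] += 1
    return dict(counts)

sample = [{"spect": "G2V"}, {"spect": "K0"}, {"spect": "G8III"}, {"spect": ""}]
print(spectral_distribution(sample))  # {'G': 2, 'K': 1}
```

A distribution like this is what a D3.js histogram or bar chart would bind to for the "explore by shared qualities" view.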

Demo Video | Project Link | Project Repo

Swarmalators - Visualization and Barnes-Hut Approximation

Collaboration with Michael Yue and Benton Liang, 2020
Visualization implemented with React and D3.js.
Barnes-Hut approximation with parallelization implemented in C++ using OpenMP and MPI

This project explores the visualization and parallelization of the 'Swarmalator' model for simulating swarming-synchronizing systems, as discussed in K. O'Keeffe and C. Bettstetter (2019), 'A review of swarmalators and their potential in bio-inspired computing'. The study includes a frontend implemented in D3.js and React for storytelling, along with performance studies of parallelized implementations of both a naive pairwise algorithm and a Barnes-Hut approximation of the model. Both algorithms are implemented in C++ and parallelized with a hybrid OpenMP/MPI model. Performance was evaluated on an AWS MPI cluster of t2.2xlarge EC2 instances.
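The naive pairwise update can be sketched as a single Euler step in Python (the actual implementation is in C++; zero intrinsic velocities and frequencies, and the specific J/K coupling form of the standard swarmalator model, are simplifying assumptions of this sketch):

```python
import math

def swarmalator_step(pos, theta, J, K, dt):
    """One Euler step of the naive O(N^2) swarmalator update.

    pos: list of (x, y) positions; theta: list of phases.
    J couples spatial attraction to phase similarity; K couples
    phase synchronization, weighted by inverse distance.
    """
    n = len(pos)
    new_pos, new_theta = [], []
    for i in range(n):
        vx = vy = dtheta = 0.0
        xi, yi = pos[i]
        for j in range(n):
            if i == j:
                continue
            dx, dy = pos[j][0] - xi, pos[j][1] - yi
            dist = math.hypot(dx, dy)
            phase = theta[j] - theta[i]
            # spatial attraction modulated by phase similarity, plus repulsion
            vx += (dx / dist) * (1.0 + J * math.cos(phase)) - dx / dist**2
            vy += (dy / dist) * (1.0 + J * math.cos(phase)) - dy / dist**2
            # phase coupling weakens with distance
            dtheta += math.sin(phase) / dist
        new_pos.append((xi + dt * vx / n, yi + dt * vy / n))
        new_theta.append(theta[i] + dt * K * dtheta / n)
    return new_pos, new_theta
```

The Barnes-Hut variant replaces the inner loop with a quadtree traversal that approximates the aggregate influence of distant clusters, reducing the per-step cost from O(N^2) toward O(N log N).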

Visualization/Storyboard | Project Link | Project Repo

Ukiyo-e AR - An Interactive AR App for Modern Reinterpretation of Traditional Art

Collaboration with Minzi Long, 2018
AR image recognition, 2D/3D media overlay, and interactions built in Unity3D using Vuforia SDK. iOS UI and application built in Xcode.

Ukiyo-e AR is a mobile app for interactive information display in augmented reality that lets you scan and access a collection of Ukiyo-e re-creations by contemporary artists. The displayed information is a collection of Unity game objects projected onto Vuforia image targets. The 2D art pieces used in this application are credited to Japanese artist Segawa Thirty-seven and Singaporean illustrator Sokkuan Tye. The 3D art used in this application was created by Minzi and Huopu.

Demo Video | Project Repo

A Python MMLA library for Kinect Analytics used for AR Collaboration Studies

Mentors: Prof. Bertrand Schneider and Iulian Radu, Harvard Graduate School of Education, 2019

This is part of a Python Multimodal Learning Analytics (MMLA) library developed to capture and process learning-process data in a series of experiments at the Harvard Learning Innovation and Technology Lab. My work focused on developing Microsoft Kinect sensor data processing functionality for an AR-assisted physics learning experiment. I also implemented methods for parallelizing the data cleansing, transformation, and clustering procedures, substantially reducing processing time and contributing to a scalable, robust library for analyzing sensor data from human subjects engaged in AR learning experiments.
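The chunk-and-pool pattern behind the parallelized cleansing can be sketched as follows; the clean_chunk logic here (dropping frames with missing joint coordinates and rounding values) is a hypothetical stand-in for the library's actual Kinect cleansing step:

```python
from multiprocessing import Pool

def clean_chunk(rows):
    """Illustrative cleansing step: drop rows with missing values
    and round coordinates to 3 decimals. Stands in for the library's
    real Kinect frame cleansing logic."""
    return [tuple(round(v, 3) for v in row) for row in rows if None not in row]

def parallel_clean(rows, workers=4, chunk_size=1000):
    """Split sensor frames into chunks and cleanse them across processes."""
    chunks = [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]
    with Pool(workers) as pool:
        cleaned = pool.map(clean_chunk, chunks)
    # flatten the per-chunk results back into one frame list
    return [row for chunk in cleaned for row in chunk]
```

Because frames are independent, the same pattern extends naturally to the transformation and clustering stages.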

Project Link



WORKSHOPS/HACKATHONS

A Visualized Spoofing Detector On a Boosted Tree Model

MIT Fintech Challenge Hackathon 2020, 1st Place
Collaboration with Jerry Shu, Alikhan Nurlanuly, and Yerlan Sharipov, 2020

This project consists of a boosted tree model trained on a dataset of 2 million records to predict three types of spoofing activity in the stock market, and a UI that visualizes the suspicious activity on the bid and ask timestamps. The model was trained with the XGBoost package, and the imbalanced raw dataset was rebalanced using stratified bootstrapping. The model achieved an overall F1 score of 0.82 on unseen data.
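The stratified bootstrapping step can be sketched as resampling each class with replacement up to a common size (an illustrative sketch, not the competition code; the XGBoost training itself is omitted):

```python
import random
from collections import defaultdict

def stratified_bootstrap(rows, labels, n_per_class, seed=0):
    """Rebalance an imbalanced dataset by bootstrap-resampling each
    class (sampling with replacement) to the same size, so rare
    spoofing classes are not drowned out during training."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for row, y in zip(rows, labels):
        by_class[y].append(row)
    out_rows, out_labels = [], []
    for y, members in by_class.items():
        for _ in range(n_per_class):
            out_rows.append(rng.choice(members))
            out_labels.append(y)
    return out_rows, out_labels
```

The rebalanced rows and labels would then be fed to the boosted tree trainer; evaluation should still use the original (imbalanced) held-out data so the reported F1 reflects real class frequencies.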

Project Slides | Demo Notebook