CS 491 - Project 3 // Shake It Up


Introduction

For Project 3, group members Steve Stranczek and Chris Janowski decided to complete the default project on earthquakes. The idea of displaying earthquake data in 3D appealed to both of us: seeing this data from a different perspective is useful and interesting, and neither of us had done anything like it before. Before starting, we researched which regions around the world might make good points of interest, then captured the data, which was provided at this link. We chose the following locations to display in our project:

1. Australia
2. South Korea
3. Mexico
4. Chile/Argentina
5. Japan
6. UK
7. Africa (south of equator)
8. Contiguous USA

We implemented our project using the Vuforia Augmented Reality package and displayed the data via image targets. We focused on eight unique locations and created four image targets for them, similar to the magazine image-target setup from Project 1. We also created a fifth image target that acts as a control for filtering the earthquake data displayed on the other four targets.



Section 1 - Scripts and More Scripts

Before our group started this project, we knew we would have to be comfortable with scripting in Unity. Our first task was to determine how to plot the data points from the data available. After cleaning the data for eight unique locations in R, each file contained three columns (among others) that let us convert latitude, longitude, and depth to XYZ coordinates. Next, in Unity, we created a few C# scripts by following a reference found online; this guide provided us with a lightweight CSV parser. With the data parsed, we needed a way to create objects from the values in the CSV file, so we wrote another script that dynamically instantiates instances of a prefab we created called 'DataBall' every time the project runs.

Databall

Plotter Variables
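
As a rough illustration of how the plotting works, here is a minimal sketch of a Plotter-style script that parses CSV rows and instantiates DataBall prefabs under the plot object. The field names, CSV column order, and parsing details are assumptions made for the example, not our exact code.

using System.Globalization;
using UnityEngine;

// Minimal sketch only: field names and the CSV column order are assumptions.
public class Plotter : MonoBehaviour
{
    public GameObject dataBallPrefab;   // the DataBall prefab described above
    public TextAsset csvFile;           // one cleaned earthquake CSV (one region)
    public float plotScale = 0.1f;      // lower values pull the points closer together

    void Start()
    {
        // Skip the header row, then read longitude, latitude, and depth per row.
        string[] rows = csvFile.text.Split('\n');
        for (int i = 1; i < rows.Length; i++)
        {
            string[] cols = rows[i].Split(',');
            if (cols.Length < 3) continue;

            float lon   = float.Parse(cols[0], CultureInfo.InvariantCulture);
            float lat   = float.Parse(cols[1], CultureInfo.InvariantCulture);
            float depth = float.Parse(cols[2], CultureInfo.InvariantCulture);

            // Longitude/latitude map to X/Y and depth (km below the surface) to -Z;
            // plotScale shrinks the 1:1 coordinates so the plot fits on an image target.
            GameObject ball = Instantiate(dataBallPrefab, transform);
            ball.transform.localPosition = new Vector3(lon, lat, -depth) * plotScale;
            ball.name = "DataBall_" + i;
        }
    }
}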



After running the program with all eight data files, we were left with eight prefabs of 3D plots of eight unique locations around the world. Once we roughly visualized a few known hotspots, outlines of the tectonic plates became visible for the first time!

Japan (left) and Mexico (right) shown from the front



To allow the user to create filters and color the points by magnitude or depth, we had to capture those details in a script we had already written. We added a variable for magnitude (depth was already being recorded for the Z-coordinate), so we were now capturing X, Y, Z, and magnitude data. This let us display the appropriate points based on the filters the user selected. Before doing so, however, we had to check which image targets were currently registered to know which points should even be considered. Both of these pieces had to be completed in order to support filtered, multi-target displays.
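
As a sketch of this filtering idea (the class and member names below are hypothetical, not our exact scripts), each DataBall can carry its magnitude and depth, and a filter pass can walk the balls under a currently tracked image target and show only those inside the selected range:

using UnityEngine;

// Hypothetical names throughout; this only sketches the idea described above.
// Each DataBall stores the values it was plotted from...
public class DataBall : MonoBehaviour
{
    public float magnitude;
    public float depth;
}

// ...and a filter pass toggles the balls under a currently tracked image target.
public class QuakeFilter : MonoBehaviour
{
    public float minMagnitude = 0f;
    public float maxMagnitude = 10f;

    // Called whenever the user changes a filter on the control target.
    public void ApplyFilter(GameObject trackedTarget)
    {
        foreach (DataBall ball in trackedTarget.GetComponentsInChildren<DataBall>(true))
        {
            bool visible = ball.magnitude >= minMagnitude && ball.magnitude <= maxMagnitude;
            ball.gameObject.SetActive(visible);
        }
    }
}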

Section 2 - Visualization

As a group, this was our first time creating an augmented reality visualization that takes 2-D data into 3-D space. After creating the scripts mentioned in the section above, we were ready to make prefabs of our plots so they would not have to be rebuilt at runtime every time. Because the data was on a 1:1 scale with the actual Earth, we had to do a little normalizing and scaling. As you may have noticed in the screenshot above, the Plotter object has an attribute named 'Plot Scale'. Tweaking the plot scale does exactly what it sounds like: it moves the data points in the plot closer together or farther apart, so the lower the value, the more tightly the data is plotted. This is ultimately also how we allowed the user to scale the visualization. Finally, to help the user make sense of what they were looking at after scaling, we added a few popular cities as reference points and painted them black.
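
Assuming the DataBalls for a region are parented under a single plot object (as in the earlier sketch), the user-facing scaling can be done by rescaling that parent rather than re-plotting the points; the following is only an illustrative sketch of that approach:

using UnityEngine;

// Illustrative sketch: instead of re-instantiating the points, the user-facing
// scale control simply rescales the parent object that holds a region's DataBalls.
public class PlotScaler : MonoBehaviour
{
    // Hooked up to UI buttons (or a slider) on the control image target.
    public void SetScale(float userScale)
    {
        transform.localScale = Vector3.one * userScale;
    }
}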

UI buttons

City labels (see lower right corner)

Discussion

The project isn't perfect; we had hoped to spend more time on the UI buttons. To make this application truly useful, we would need seamless buttons that act as filters. One issue we ran into during development was working on computers that were not up to standard in processing power, which made it difficult to display many DataBalls simultaneously without a lot of lag and ultimately made each test take a significant amount of time. If we could go back, we would have planned to work on the computer we use to present our projects in class. Another issue was with our image targets. We wanted a unique and interesting way to distinguish which zone would be visualized, so we tried using human-readable QR codes. Check out the images below and guess which countries they represent by reading the top row.

USA

Japan

This turned out to be a double-edged sword: the idea was cool and mostly effective, but the image targets were difficult to pick up at times. Still, we walked away knowing how to take 2D data and convert it into a 3D visualization, and we gained the skills to do this dynamically and in augmented reality. Ultimately, our visualization clearly shows the traces of subduction zones and tectonic plates from multiple angles. This project was interesting to work on and may well end up as a hobby project once we graduate.

Running the project

YouTube Video

In order to run this project on your computer, you need to download and/or install the following software/files:
1.) Install Unity - Link
2.) Download zip and open Scene1 (see below)


For links to files used in our project, see below.
1.) Cleaned .csv files
2.) Image targets
3.) Zip of entire project


Guide used for CSVParser and dynamic instantiation scripts - Link