- Create a series of scripts that allow the user to drive both autonomously and manually (for manual driving we would need to order a steering wheel)
- Give user tools to reduce simulator sickness
- Allow user to place pedestrians, bicyclists, and other common moving roadside objects on the road
- User will be able to customize the pedestrian model used, the start and end points of its path, and its speed (or the time to reach the destination)
- Allow user to set up auditory distraction tasks that play an audio file either on a trigger or every x seconds
- Allow user to indicate which data they would like to collect from the Oculus (head movement, response time, audio recordings of distraction task responses)
- Scripts will also output data into a CSV file for later analysis (rough analysis sketches follow this list)
- Nice to have: prewritten Python code to visualize common data types, e.g., head-tracking data (see the plotting sketch after this list)
- In-world buttons for the user to press
- A cellphone prop with a changeable screen for distracted-driving studies
- Visual distraction tasks
- More thought needed on what these might look like beyond a playing video or AR cues
- A way to add in traffic / more than one leading vehicle
- Screen interaction for infotainment (i.e., the big Tesla-style screen)
- Variety of weather conditions
- Create a README on how to add to / customize the world (buildings and road design) so it's not just the same Seattle suburb
- Importing buildings from Revit to Unity
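
A minimal sketch of how response times for the auditory distraction task could be computed offline from the exported CSV. The log format is an assumption, not a settled spec: it presumes the Unity scripts write one row per event with placeholder columns `event` (e.g. `prompt_played`, `participant_response`) and `time_s`.

```python
"""Sketch: pair each distraction-task prompt with the first response that follows it.
Column names, event labels, and the file path are assumptions about the eventual log."""
import pandas as pd


def response_times(csv_path="distraction_log.csv"):
    df = pd.read_csv(csv_path)
    prompts = df[df["event"] == "prompt_played"]["time_s"].to_list()
    responses = df[df["event"] == "participant_response"]["time_s"].to_list()

    times = []
    for prompt_t in prompts:
        # First response occurring after this prompt; None if the participant never answered.
        later = [r for r in responses if r > prompt_t]
        times.append(later[0] - prompt_t if later else None)
    return times


if __name__ == "__main__":
    print(response_times())
```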
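Likewise, a minimal sketch of the "prewritten Python code" idea for head-tracking data, assuming the exported CSV has a `time_s` column plus head rotation angles in degrees; the column names and file path are placeholders for whatever format the logging scripts end up using.

```python
"""Sketch: plot head yaw/pitch/roll over the course of a drive.
Assumes columns time_s, head_yaw_deg, head_pitch_deg, head_roll_deg."""
import pandas as pd
import matplotlib.pyplot as plt


def plot_head_rotation(csv_path="head_tracking.csv"):
    df = pd.read_csv(csv_path)
    fig, ax = plt.subplots(figsize=(10, 4))
    # One line per rotation axis, plotted against simulation time.
    for column in ("head_yaw_deg", "head_pitch_deg", "head_roll_deg"):
        ax.plot(df["time_s"], df[column], label=column)
    ax.set_xlabel("Time (s)")
    ax.set_ylabel("Rotation (degrees)")
    ax.set_title("Head rotation over the drive")
    ax.legend()
    plt.tight_layout()
    plt.show()


if __name__ == "__main__":
    plot_head_rotation()
```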