AR workshop for Swift Aveiro 2019
- iOS 13 device
- Xcode 11 Beta
The starting project contains some helper code and some Entities and Models to kickstart the workshop. It also contains some code that places a banana where the user touches; we will run through this code together and build off from it.
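As a reference, the tap-to-place logic in the starter typically looks something like the sketch below. This is an assumption about the starter's shape, not its exact code: `arView` and `bananaEntity` are assumed names, and the gesture recognizer wiring is left out.

```swift
import UIKit
import RealityKit
import ARKit

// Hypothetical tap handler: raycast from the touch point and anchor a banana there.
@objc func handleTap(_ sender: UITapGestureRecognizer) {
    let point = sender.location(in: arView)

    // Raycast from the 2D touch point against estimated horizontal planes.
    guard let result = arView.raycast(from: point,
                                      allowing: .estimatedPlane,
                                      alignment: .horizontal).first else { return }

    // Anchor a clone of the model at the hit location and add it to the scene.
    let anchor = AnchorEntity(world: result.worldTransform)
    anchor.addChild(bananaEntity.clone(recursive: true))
    arView.scene.addAnchor(anchor)
}
```

Cloning the entity lets you place multiple bananas from one loaded model instead of reloading the asset each time.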
- ARSession && Tracking
- ARView
- ARAnchor && Anchor Entities
- Entity
- ARCoachingOverlayView
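A minimal sketch of how the concepts above fit together: a world-tracking session running in an ARView, with a coaching overlay that guides the user until tracking is good. The function name and the `arView` parameter are illustrative assumptions.

```swift
import ARKit
import RealityKit

// Hypothetical setup helper: session + tracking + coaching overlay.
func setupAR(in arView: ARView) {
    // ARSession & tracking: world tracking with horizontal plane detection.
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config)

    // ARCoachingOverlayView nudges the user to scan until a plane is found.
    let coaching = ARCoachingOverlayView()
    coaching.session = arView.session
    coaching.goal = .horizontalPlane
    coaching.frame = arView.bounds
    coaching.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(coaching)
}
```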
If you want to play around with ARKit and try different things, here are some ideas for how to "clean up" the starting project or add more functionality:
- Clean up the StatusView: how and when it shows
- Allow only one banana to be shown in the app at all times (you can remove entities and anchors from a scene before adding new ones)
- Add a segmented control or buttons on screen that let the user switch between the three different bananas
- Create your own model in Reality Composer and add it to the app
- Add animations to the banana (entities have a collision shape and can react to being touched)
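For the "only one banana" idea above, one possible approach is to clear the scene's anchors before adding the new one. This is a sketch under the assumption that each banana lives under its own AnchorEntity; the function name is made up.

```swift
import RealityKit

// Hypothetical helper: keep at most one banana by removing all existing
// anchors (and the entities attached to them) before placing the new one.
func placeSingle(_ anchor: AnchorEntity, in arView: ARView) {
    arView.scene.anchors.removeAll()
    arView.scene.addAnchor(anchor)
}
```

If other anchors in your scene should survive (e.g. a UI anchor), keep a reference to the previous banana anchor and remove only that one instead.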
We will slowly build a better app, learning new AR concepts step by step.
The banana looks OK now, but it does not really seem to sit in place. We can add shadows, estimated lighting, and some actions to make it look more real in the world.
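One way to get estimated lighting for free is to ask ARKit for automatic environment texturing, which RealityKit uses for reflections and ambient light on the model. A sketch, assuming `arView` is the project's ARView:

```swift
import ARKit
import RealityKit

// Re-run the session with environment texturing enabled so the banana
// picks up lighting and reflections from the real scene.
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]
config.environmentTexturing = .automatic
arView.session.run(config)
```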
ARKit can also recognize many different kinds of tracking and anchors. In this part we will look at image recognition and tracking, and play around with 3D text.
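A hedged sketch of both pieces: image tracking via ARImageTrackingConfiguration, and 3D text generated with RealityKit. "AR Resources" is an assumed name for an asset-catalog reference image group; your project may use a different one.

```swift
import ARKit
import RealityKit

// Image tracking: load reference images from the asset catalog.
guard let refImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                       bundle: nil) else {
    fatalError("Missing 'AR Resources' reference image group")
}
let config = ARImageTrackingConfiguration()
config.trackingImages = refImages
config.maximumNumberOfTrackedImages = 1
arView.session.run(config)

// 3D text: generate an extruded text mesh and wrap it in a model entity.
let textMesh = MeshResource.generateText("Banana!", extrusionDepth: 0.01)
let textEntity = ModelEntity(mesh: textMesh,
                             materials: [SimpleMaterial(color: .yellow,
                                                        isMetallic: false)])
```

You would then attach `textEntity` to an anchor created when the tracked image is detected.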
Let's get our banana to not only exist in the world but interact with it, using people occlusion so it does not appear where it shouldn't.
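People occlusion is a frame-semantics option on the world-tracking configuration. A minimal sketch; note it only works on devices with an A12 chip or later, so the support check matters:

```swift
import ARKit

// Enable people occlusion so virtual content is hidden behind real people.
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
arView.session.run(config)
```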