
Week 10-11

Hello Reader,

During the past two weeks, I implemented features that take advantage of force touch on iPhones, as well as features added to the AR APIs in iOS 11.3.

The first feature is force touch (known as 3D Touch on iOS). The user can force touch the app icon on the home screen to quickly access the features that interest them the most (Open from URL, Open Default Model, and Open List). This is known as Home Screen Quick Actions on iOS.
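Below is a minimal sketch of how quick actions can be handled in the app delegate. The shortcut type strings are illustrative placeholders, not the app's actual identifiers:

```swift
import UIKit

// A minimal sketch of handling Home Screen Quick Actions.
// The shortcut type strings are assumed placeholders.
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    // Called when the user launches the app from a home screen quick action.
    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        switch shortcutItem.type {
        case "OpenFromURL":
            // e.g. present the "open from URL" prompt
            completionHandler(true)
        case "OpenDefaultModel":
            // e.g. load the bundled default model
            completionHandler(true)
        case "OpenList":
            // e.g. show the list of saved models
            completionHandler(true)
        default:
            completionHandler(false)
        }
    }
}
```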


Another feature I've implemented is peek & pop (force touch to peek at the content, and press harder to view it in full detail). The user can peek at the 3D models saved on the device, as well as at the items in the grid view (the server page). When peeking at an on-device model, the user can also swipe up and view it in augmented reality directly.
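Here is a minimal sketch of how peek & pop can be wired up with the UIViewControllerPreviewing API. The view controller and the "View in AR" preview action are hypothetical names, not the app's actual classes:

```swift
import UIKit

// A minimal sketch of peek & pop for a list of saved models.
// `ModelDetailViewController` is a placeholder for this sketch.
class ModelListViewController: UITableViewController, UIViewControllerPreviewingDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Only register when the device supports 3D Touch.
        if traitCollection.forceTouchCapability == .available {
            _ = registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    // Peek: return the view controller to preview for the touched row.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location),
              let cell = tableView.cellForRow(at: indexPath) else { return nil }
        previewingContext.sourceRect = cell.frame
        let detail = ModelDetailViewController()
        // Configure `detail` with the model at `indexPath` here.
        return detail
    }

    // Pop: push the previewed controller when the user presses harder.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        navigationController?.pushViewController(viewControllerToCommit, animated: true)
    }
}

// Placeholder detail controller; the swipe-up action appears in the peek.
class ModelDetailViewController: UIViewController {
    override var previewActionItems: [UIPreviewActionItem] {
        let viewInAR = UIPreviewAction(title: "View in AR", style: .default) { _, _ in
            // Launch the AR view here.
        }
        return [viewInAR]
    }
}
```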



In addition, with the release of iOS 11.3, Apple shipped ARKit 1.5, which introduced some new features. One of them is the ability to project a model onto a vertical surface, and this is the feature I've implemented. The user can now pick a surface direction (horizontal or vertical) on the settings page and project models onto that kind of surface in AR. A configuration sketch follows, and below it is an example of vertical surface projection:
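This is a minimal sketch of enabling vertical plane detection; the function and the `useVerticalSurface` setting are placeholder names for the app's own code:

```swift
import ARKit

// A minimal sketch of switching plane detection between horizontal and
// vertical surfaces (vertical detection requires ARKit 1.5 / iOS 11.3).
func runSession(on sceneView: ARSCNView, useVerticalSurface: Bool) {
    let configuration = ARWorldTrackingConfiguration()
    // Pick the plane orientation based on the choice made in the settings page.
    configuration.planeDetection = useVerticalSurface ? [.vertical] : [.horizontal]
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```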


Thanks for viewing!
