
Week 10-11

Hello Reader,

During the past two weeks, I implemented features that take advantage of force touch on iPhones, as well as features added to the AR APIs in iOS 11.3.

The first feature uses force touch (known as 3D Touch on iOS). The user can force-press the app icon on the home screen to quickly access the features that interest them the most: Open from URL, Open Default Model, and Open List. This is known as Home Screen Quick Actions on iOS.
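Quick actions can be registered statically in Info.plist or dynamically in code, and are delivered to the app delegate. A minimal sketch of the dynamic approach — the shortcut type strings and the handler bodies here are illustrative placeholders, not the app's actual identifiers:

```swift
import UIKit

// In the app delegate: register dynamic quick actions.
// The "com.example.*" type strings below are assumptions for illustration.
func registerQuickActions() {
    UIApplication.shared.shortcutItems = [
        UIApplicationShortcutItem(type: "com.example.openURL",
                                  localizedTitle: "Open from URL",
                                  localizedSubtitle: nil,
                                  icon: UIApplicationShortcutIcon(type: .cloud),
                                  userInfo: nil),
        UIApplicationShortcutItem(type: "com.example.openDefault",
                                  localizedTitle: "Open Default Model"),
        UIApplicationShortcutItem(type: "com.example.openList",
                                  localizedTitle: "Open List")
    ]
}

// Called by the system when the user launches the app from a quick action.
func application(_ application: UIApplication,
                 performActionFor shortcutItem: UIApplicationShortcutItem,
                 completionHandler: @escaping (Bool) -> Void) {
    switch shortcutItem.type {
    case "com.example.openURL":     break  // present the URL-entry screen
    case "com.example.openDefault": break  // load the bundled default model
    case "com.example.openList":    break  // show the saved-model list
    default:
        completionHandler(false)
        return
    }
    completionHandler(true)
}
```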


Another feature I've implemented is peek & pop (force touch to peek at the content, and press harder to view it in full detail). The user can peek at the 3D models saved on the device, as well as at the contents of the grid view (the server page). When peeking at an on-device model, the user can swipe up and view it in augmented reality directly.
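Peek & pop is driven by `UIViewControllerPreviewingDelegate`. A sketch of how the on-device-model list might wire it up, assuming hypothetical `ModelListViewController` and `ModelViewerController` names rather than the app's real classes:

```swift
import UIKit

// Hypothetical detail controller; the real app's viewer will differ.
class ModelViewerController: UIViewController {
    let modelName: String
    init(modelName: String) {
        self.modelName = modelName
        super.init(nibName: nil, bundle: nil)
    }
    required init?(coder: NSCoder) { fatalError("not supported") }

    // Swiping up on a peek reveals these actions — this is where the
    // "view in augmented reality directly" shortcut can live.
    override var previewActionItems: [UIPreviewActionItem] {
        return [UIPreviewAction(title: "View in AR", style: .default) { _, _ in
            // start the AR session for this model
        }]
    }
}

class ModelListViewController: UITableViewController, UIViewControllerPreviewingDelegate {
    var modelNames: [String] = []   // placeholder data source

    override func viewDidLoad() {
        super.viewDidLoad()
        // Register only when the device actually supports 3D Touch.
        if traitCollection.forceTouchCapability == .available {
            registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    // Peek: return the controller to preview at the touch location.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location),
              let cell = tableView.cellForRow(at: indexPath) else { return nil }
        previewingContext.sourceRect = cell.frame  // keep the cell sharp, blur the rest
        return ModelViewerController(modelName: modelNames[indexPath.row])
    }

    // Pop: the user pressed deeper, so commit to the full view.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}
```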



In addition, with the release of iOS 11.3, Apple introduced ARKit 1.5, which brought some new features. One of them, which I've implemented, is the ability to project a model onto a vertical surface. The user can now pick a surface orientation (horizontal / vertical) on the settings page and project models with AR accordingly. Below is an example of vertical surface projection:


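Enabling this comes down to session configuration: `ARWorldTrackingConfiguration`'s `planeDetection` option set gained a `.vertical` case in iOS 11.3. A sketch of how the settings choice could drive the session — the function and parameter names are illustrative, not the app's actual ones:

```swift
import ARKit

// Start (or restart) the AR session with the plane orientation the user
// picked on the settings page. `.vertical` requires ARKit 1.5 (iOS 11.3).
func startSession(on sceneView: ARSCNView, detectVerticalPlanes: Bool) {
    let configuration = ARWorldTrackingConfiguration()
    if #available(iOS 11.3, *) {
        configuration.planeDetection = detectVerticalPlanes ? .vertical : .horizontal
    } else {
        // Earlier ARKit releases only detect horizontal planes.
        configuration.planeDetection = .horizontal
    }
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```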
Thanks for viewing!
