
Week 2 Update

Greetings!

During this past week, I engineered the main functionality of my app: fetching models, displaying them, and projecting them into the real world with Augmented Reality. Here's a rundown of the details:
  • Fetch models by URL or use the default model - the app displays an interface for picking models. The user can either enter a URL themselves or use the default wigwam model (a short loading sketch appears after this list).
  • View and manipulate the model: 
    • Zoom, move, and rotate the model with gestures
    • Change the model's color with a segmented control
    • Adjust the light intensity with a slider
    • Change the position of the light with a tap on the scene (see the lighting sketch after this list)
    • Settings Page
      • Change the scale factor used for the AR model
      • Pick from light types: omnidirectional, directional, probe, spot, or ambient
      • Pick from blend modes: add, alpha, multiply, subtract, screen, or replace
      • Animations! Choose from no animation or infinite rotation
  • Project models to the real world with Augmented Reality
    • Wait until the device has found a surface on which to place the model.
    • The app highlights the plane detected by ARKit.
    • Once the plane is highlighted, tapping it places the AR model on that plane (a condensed ARKit sketch appears after this list).
    • Once the model is placed, a pan gesture can be used to adjust the position of the light projected onto the model.
    • The AR model inherits all the settings from the regular SceneKit view.
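
For the curious, here's a minimal sketch of how the model loading might look with SceneKit. The function name, the bundled resource name, and the error handling are my own illustrative assumptions, not the app's exact code:

```swift
import SceneKit

// A minimal loading sketch. For a remote model, the data would typically be
// downloaded to a local file first (e.g. with URLSession) and then loaded
// the same way.
func loadScene(from url: URL) -> SCNScene? {
    do {
        // SCNScene(url:options:) reads SceneKit-supported formats such as .scn and .dae
        return try SCNScene(url: url, options: nil)
    } catch {
        print("Failed to load model: \(error)")
        return nil
    }
}

// Fall back to the bundled default (wigwam) model when no URL is entered.
let scene = Bundle.main.url(forResource: "wigwam", withExtension: "scn")
    .flatMap(loadScene(from:))
```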
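
The light and settings controls map onto a handful of SceneKit properties. Below is a rough sketch under my own assumptions (the lightNode and modelNode names are placeholders), not the app's actual code:

```swift
import SceneKit

// Placeholder nodes; in the app these would come from the loaded scene.
let lightNode = SCNNode()
lightNode.light = SCNLight()
let modelNode = SCNNode()

// Slider: adjust the light intensity (SCNLight.intensity is in lumens).
func intensitySliderChanged(to value: Float) {
    lightNode.light?.intensity = CGFloat(value)
}

// Settings page: light type and blend mode are one-line property changes.
lightNode.light?.type = .omni                        // or .directional, .probe, .spot, .ambient
modelNode.geometry?.firstMaterial?.blendMode = .add  // or .alpha, .multiply, .subtract, .screen, .replace

// Tap on the scene: move the light above the tapped point on the model.
func handleTap(at point: CGPoint, in sceneView: SCNView) {
    guard let hit = sceneView.hitTest(point, options: nil).first else { return }
    lightNode.position = SCNVector3(hit.worldCoordinates.x,
                                    hit.worldCoordinates.y + 2,
                                    hit.worldCoordinates.z)
}
```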
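
Finally, here's a condensed sketch of the ARKit flow described above: detect a plane, highlight it, and place the model on tap. It assumes an ARSCNView outlet named sceneView and a modelNode prepared elsewhere; the app's real code is more involved:

```swift
import ARKit
import SceneKit
import UIKit

class ARModelViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    var modelNode = SCNNode()  // placeholder for the model configured earlier

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track the world and look for horizontal surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // ARKit found a plane: attach a translucent overlay so the user can see it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.3)
        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2  // SCNPlane stands vertically by default
        node.addChildNode(planeNode)
    }

    // Tap: place the model where the ray from the tap hits a detected plane.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first
            else { return }
        let translation = hit.worldTransform.columns.3
        modelNode.position = SCNVector3(translation.x, translation.y, translation.z)
        sceneView.scene.rootNode.addChildNode(modelNode)
    }
}
```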
Here are a few screenshots of the app:

  • AR Plane Detection
  • Model Projected with AR (two screenshots)
  • Regular Model Viewer
  • Model Picker Page
  • Settings Page
