Localization Quickstart
Localization determines where a device is and which direction it is facing in a physical space. This guide describes a complete workflow, from installing the Scaniverse app to localizing at your first scanned location using the NSDK Kotlin or Swift sample apps.
This guide covers the following steps:
- Install - Install and configure the Scaniverse mobile app.
- Capture - Use your mobile device camera to record visual features as you move through a space.
- Generate Assets - Upload and process scans to convert them into 3D spatial representations of a place, such as meshes or splats.
- Process - Choose which scans to use to generate assets for localization and reconstruction and begin processing.
- Retrieve Anchor - Retrieve the anchor payload for use during localization.
- Deploy - Install, configure, and build Niantic's sample application on your desktop, then deploy it to a mobile device.
- Run - Use VPS2 to localize the mobile device and place augmented reality content into physical space.
Prerequisites
Before you begin, ensure that you have the following:
For all developers
- A desktop computer to build and run a mobile app.
- A working Git installation.
- A Scaniverse business or enterprise account. If you don't have an account, follow the steps in Create Account to sign up for one.
Android
- Android Studio installed.
- Android SDK platform 24 or higher installed.
- An Android device running Android 7.0 or later with USB debugging enabled.
iOS
- Xcode installed on macOS.
- An Apple ID added to Xcode so that the app can be signed and run on a physical iOS device. An Apple Developer account is not required to run the sample on your own device.
- An iOS device that supports ARKit.
Install
The Scaniverse mobile app uses your device camera to capture live images as you move through a space. These scans are later used as input into the NSDK to create spatial assets for localization. After installing the app, you must sign in with a Niantic business or enterprise account.
Android support for the enterprise Scaniverse experience is coming soon. For now, use an iOS device to complete the scanning workflow.
- Install the latest Scaniverse app from the Apple App Store on your mobile device.
- Open the Scaniverse app on your mobile device.
- Select the profile icon, shown as a person, in the top right corner of the app. If you're already logged into Scaniverse as a consumer, sign out.
- At the bottom of the login screen, tap Sign in with Business Account and sign in using your Niantic business or enterprise account.
If you belong to multiple organizations, you can switch organizations by going to Profile and selecting the Organization dropdown.
Capture
A scan is a collection of camera images that capture visual features as you move through a space. Niantic's backend services use these scans to build a three dimensional representation of the space. You can use this representation to localize a device by aligning it to the physical environment. You can group multiple scans together in a site, but they must overlap so that Niantic can connect some of the same visual features into a single, consistent representation. In the following steps, you will create a site and record scans.
Create a private site
When you create a site as a business or enterprise customer, it is private by default, which means that only you and your organization can access it. You can use a private site for testing, internal tools, or unreleased locations. You can create a site either from the Scaniverse web or in the Scaniverse app as follows:
- Scaniverse web
- Scaniverse app
To create a private site, do the following:
- Select the Sites icon from the left navigation bar.
- Select the + Add Site button in the top right corner of the main window.
- In the text box under Site name, enter a clear, descriptive, and unique name.
- (Optional) Select a location for the site by typing an address, entering coordinates, or choosing a place on the map. Adding a location gives Niantic geographic context for your scans. This makes sites easier to find in tools and helps match scans to the correct place during processing.
- Refresh the Scaniverse website.
To create a private site, do the following:
- Select the + button on the top right corner of the app. The Add private site window opens.
- Under Site Name, enter a clear, descriptive, and unique name.
- Select Confirm.
Add scans to your site
Use a mobile device camera to capture scans as follows:
- Start a scan
- On your mobile device, select the site you created in the previous step.
- Select + Capture at the bottom of the screen.
- Select the red button at the bottom of the screen to begin scanning the space.
- Move and capture
- Point your device camera at the area you want to scan.
- Move slowly and steadily to reduce motion blur.
Scan Quality
For best localization results, ensure that your scan:
- Captures distinct visual features such as walls, furniture, and decorations.
- Includes multiple angles and viewpoints.
- Has adequate lighting.
- Covers the area that you want to capture and localize to.
- Finish and review
- Select the red button again to stop scanning. A preview of the scan begins to play.
- Select Localize at the bottom of the app to quickly check the quality of your scan. The app will use the camera to compare live images against the visual features in the scan to see if it can recognize the location and determine the device position and orientation. If localization fails, you can immediately rescan the environment instead of uploading the scan and discovering problems later.
- Select the pen tool next to the default name to change the name of the scan to a clear, descriptive name.
- Select the trash icon at the bottom left of the screen to discard, or Save to keep your scan.
Generate Assets
After you capture scans, you upload them so Niantic's backend services can generate spatial assets used for localization and reconstruction. These assets represent the scanned space and include the following:
- Mesh - A three dimensional surface representation built from vertices, edges, and faces. Meshes model physical structure and depth, but require more memory and processing than splats. Use meshes when you need accurate geometry for occlusion or interaction.
- Splat - A three dimensional point-based representation that stores spatial and visual data such as position, size, and color. Splats support efficient rendering and reliable localization with lower memory and processing costs than meshes. Use splats when you need efficient visualization or localization.
- VPS Map - A nonvisual map created from processed scans that stores visual features used to recognize a place and determine a device’s position and orientation. VPS maps are not rendered. Use a VPS map to enable localization at a scanned location.
To generate assets, upload your scans and process them. To use them in an app, configure the assets by setting them to production. After generation, you can test localization to confirm that the scan was processed successfully.
Upload scans
During upload, your scans are transferred to Niantic’s cloud and prepared for processing. You can upload all scans at once or select them individually. Uploading selectively is useful when working with large scans, testing quality, or reducing processing time and cost.
The Scaniverse app lists all scans for your site under the Scans tab.
Select the scan from the previous step to upload it individually, or select Upload All to upload all scans in the site. After upload, your scans are also available to your team in the Scaniverse web.
Process scans
During processing, Niantic's backend services analyze the uploaded scans, extract visual features, and generate the spatial assets used to recognize and align a device in the environment. Depending on the size of the scans, this step can take from several minutes to a few hours.
You can start processing from either Scaniverse web or in the Scaniverse app as follows:
- Scaniverse web
- Scaniverse app
- Navigate to the Scans tab for your Site.
- Select the checkbox next to the uploaded scans that you want to process.
- Select Confirm at the bottom of the screen.
- (Optional) Enter a meaningful name under Version Name to help track changes in the scan.
- Select Generate.
- Navigate to the Scans tab for your Site.
- Select the checkbox next to the uploaded scans that you want to process.
- Select Generate Assets at the top right corner of the main window.
- (Optional) Enter a meaningful name under Version name to help track changes in the scan.
- Select Confirm.
Once processing is complete, the newly generated assets are available in either Scaniverse web or the Scaniverse app under Assets → History.
Configure assets
After your scans finish processing, Niantic's backend services add the generated assets to your site. By default, these assets are in a preview state and are not available to live apps.
To make them accessible to users and applications, you must set the asset to production.
Once the asset is in production, the site ID and anchor payloads retrieved in the next step are discoverable and usable within your app.
You can set the assets to the production state using either Scaniverse web or the Scaniverse app as follows:
- Scaniverse web
- Scaniverse app
- Navigate to the Assets tab for your Site. A preview of your assets appears.
- Select the History tab in the right navigation bar.
- Select Set to production.
- Navigate to the Assets tab for your Site.
- Select the three horizontal dots next to the name of your scan at the bottom of your screen.
- Select Set as Production from the drop-down list.
Test localization
Asset generation only creates the underlying spatial data. It does not guarantee that localization will work reliably in a real space. Before using the assets in an app, test localization to confirm the following:
- Devices can reliably localize in the scanned physical space.
- Tracking remains stable and content stays accurately aligned.
- Performance is acceptable across supported devices.
- Changes in lighting, occlusion, or the environment don't prevent successful localization.
Testing localization is an optional step that verifies your device can match live camera input to your generated asset and validates the user experience. Test localization on a mobile device as follows:
- Go to the physical place you scanned.
- In the Scaniverse app, select the site that you want to test.
- Select the Assets tab.
- Select Localize.
- If prompted, select Allow to give Scaniverse permission to access your device camera.
The app attempts to match your current camera view to the generated assets. If localization is successful, the asset will appear correctly aligned in your environment. If unsuccessful, rescan the space using the following recommendations:
- Capture the entire usable area.
- Capture multiple angles of walls, corners, and large objects.
- Include distinctive visual features such as edges, furniture, and textures.
- Move slowly to reduce motion blur.
- Avoid large gaps between areas that you scan.
Once you've completed rescanning the environment, do the following:
- Go back to the Generate assets step.
- Unselect the scan(s) you are replacing, and select the new scan(s).
- Generate assets again.
The new asset appears in the Assets tab after processing. Previous versions remain available under the History tab; scroll to view them. To test localization for a version, select the three horizontal dots next to its name and select Test localization. When you decide which version to use in your app, select the three horizontal dots next to its name and select Set as Production.
Retrieve Anchor
After your asset is set to Production, you can retrieve the identifier, or anchor payload, that your app uses to localize to that space. An anchor payload is a long base64 encoded string that links your app to the generated assets, and allows it to localize users in the physical space you scanned. Use the Scaniverse web to retrieve the anchor payload as follows:
- Select the Assets tab.
- Select the Production Version tab.
- Select the clipboard icon next to Anchor Payload from the right navigation bar.
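The payload itself is opaque to your app, but because it is base64 encoded you can sanity-check the copied string before embedding it in your build. A minimal Kotlin sketch, assuming nothing about the payload's contents (the helper name is hypothetical, not part of the NSDK):

```kotlin
import java.util.Base64

// Hypothetical helper: verifies that a copied anchor payload is a
// non-empty, well-formed base64 string. It cannot verify that the
// payload actually matches a production asset.
fun isPlausibleAnchorPayload(payload: String): Boolean {
    if (payload.isBlank()) return false
    return try {
        Base64.getDecoder().decode(payload.trim())
        true
    } catch (e: IllegalArgumentException) {
        false
    }
}

fun main() {
    println(isPlausibleAnchorPayload("SGVsbG8="))     // true: valid base64
    println(isPlausibleAnchorPayload("not base64!!")) // false: illegal characters
}
```

A check like this catches gross copy-paste mistakes such as empty strings or stray characters, though a string that decodes is not guaranteed to be a valid payload.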
Deploy
After generating assets, you can configure the Niantic sample app to verify that your device can successfully localize against them in a real-world environment. To do this, set up the sample app, add the anchor payload to its configuration, then build and deploy it from your desktop as follows:
Set up the sample app
Use git clone to download the Niantic sample repository onto your desktop to deploy to an Android or iOS device as follows:
- Android (Kotlin)
- iOS (Swift)
- Clone the Kotlin samples repository:

  ```shell
  git clone https://github.com/niantic-lightship/kotlin-samples.git
  ```

- Open the NsdkSamples project in Android Studio:
- Launch Android Studio.
- Select File → Open.
- Navigate to the cloned kotlin-samples folder.
- Select the NsdkSamples project folder.
- Select Open and allow Gradle sync to complete.
- Clone the Swift samples repository:

  ```shell
  git clone https://github.com/niantic-lightship/swift-samples.git
  ```

- Open the project in Xcode:
- Launch Xcode.
- Select File → Open.
- Navigate to the cloned swift-samples directory.
- Select `Swift NSDK.xcodeproj`.
- Select Open and allow Xcode to resolve Swift packages and finish indexing.
Configure app
In Niantic's sample app, you will run the VPS2 example, which shows how to localize a device and place AR content in a real-world environment. To test your site, add the anchor payload that you copied to your clipboard in the Retrieve Anchor step to the appropriate platform:
- Android (Kotlin)
- iOS (Swift)
- Open the `build.gradle.kts` file in the root of the NsdkSamples directory.
- Locate the following line:

  ```kotlin
  buildConfigField("String", "DEFAULT_VPS_PAYLOAD", "\"YOUR_PAYLOAD\"")
  ```

- Replace `"YOUR_PAYLOAD"` with the anchor payload in your clipboard:

  ```kotlin
  buildConfigField("String", "DEFAULT_VPS_PAYLOAD", "\"YOUR_ANCHOR_PAYLOAD_FROM_RETRIEVE_ANCHOR_STEP\"")
  ```

- Save the file and allow the project to resync if prompted.
- Open `VPS2ViewController.swift` in the project (located at `NsdkSamples/NsdkSamples/VPS2ViewController.swift`).
- Locate the line near the top of the file:

  ```swift
  private let anchorPayload: String = "<PUT_ANCHOR_PAYLOAD_HERE>"
  ```

- Replace `"<PUT_ANCHOR_PAYLOAD_HERE>"` with the anchor payload you copied in the Retrieve Anchor step:

  ```swift
  private let anchorPayload: String = "YOUR_ANCHOR_PAYLOAD_FROM_RETRIEVE_ANCHOR_STEP"
  ```

- Save the file.
Build and deploy
Build the sample app on your desktop and deploy it to your device.
Run
After building the NsdkSamples app on your desktop and deploying it to your mobile device, you can run any of Niantic’s sample modules to explore different features.
This guide focuses on the VPS2 sample, which demonstrates Niantic’s most advanced localization system. VPS2 determines a device’s position and orientation in the real world and enables placement of persistent AR content at a physical location.
The app streams camera and motion sensor data into the NSDK to calculate where the device is and which direction it is facing. VPS2 can function in most environments, but it achieves the highest accuracy in locations that have already been mapped.
Launch the sample app and authenticate as follows:
- Open and log in to the NsdkSamples app on your device.
- Select VPS2.
- If an access token is built into the app: No sign in is required. You can proceed directly to localization.
- If no access token is built into the app: Follow the login instructions on the home page after granting the required permissions (location and camera).
- Log in with an email and password, or with Google SSO.
- Follow the redirects back to the app upon login.
The sample app uses Niantic Spatial Auth to authenticate with the Portal. This allows the app to:
- Access your organization's sites and assets
- Use VPS2 localization features
For more information, see the Auth guide.
Localize to Your Scanned Location
Now you'll use the VPS2 scene to localize to the location you scanned earlier. The process and visual feedback differ between Android and iOS platforms.
- Android (Kotlin)
- iOS (Swift)
Android Localization Process
- In the VPS2 sample scene, ensure you're at or near the physical location you scanned.
- The app needs to be able to see the same features that were captured in your scan
- Look for the Start Tracking button at the bottom center of the screen.
- Tap Start Tracking to begin tracking anchors with the configured payload(s).
- The button text will change to "Stop Tracking" once tracking is active
- The app will automatically track the payload configured in `build.gradle.kts`
- Point your device camera at the scanned area.
- Move slowly to help the system match visual features
- Ensure good lighting matches the scan conditions
- Watch the visual indicators and status information displayed on screen.
Android Localization States and Visual Feedback
The VPS2 localization process on Android provides feedback through multiple visual indicators:
Transformer Tracking State (Top-Right Circle Indicator)
A colored circle in the top-right corner indicates the overall VPS2 transformer tracking state:
- Red: `UNAVAILABLE` - VPS2 is not localized
- Yellow: `COURSE` - Coarse localization achieved (approximate location)
- Green: `REFINED` - Precise localization achieved (high accuracy)
Anchor Tracking States (Visual Markers)
Each tracked anchor displays visual feedback based on its AnchorUpdateType:
- `COARSE` (Coarse tracking):
  - Orange cube (large, 1.0m × 1.0m × 1.0m) appears at the anchor location
  - Indicates approximate location with lower precision
  - Mesh is not rendered during coarse tracking
- `REFINED` (Refined tracking):
  - Green cube (small, 0.2m × 0.2m × 0.2m) appears at the anchor location
  - Mesh geometry (with textures) is automatically downloaded and rendered
  - Indicates precise location with high accuracy
  - Mesh represents the actual scanned environment geometry
Anchor State Display (Bottom-Left Panel)
A panel shows all tracked anchors with their current state:
- Yellow dot + "TRACKED (COARSE)": Anchor is tracked but only with coarse precision
- Green dot + "TRACKED (REFINED)": Anchor is tracked with refined precision
- Red dot + "NOT_TRACKED": Anchor is not currently being tracked
- Yellow dot + "LIMITED": Anchor tracking is limited (degraded state)
Each anchor entry displays: Anchor Name - State (UpdateType)
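As a concrete illustration of that entry format, here is a small Kotlin sketch. The enums are stand-ins for the SDK's actual state and update types (the real definitions live in the NSDK), so treat the names as assumptions:

```kotlin
// Stand-in enums for illustration only; the real types come from the NSDK.
enum class AnchorState { TRACKED, NOT_TRACKED, LIMITED }
enum class AnchorUpdateType { COARSE, REFINED }

// Builds the "Anchor Name - State (UpdateType)" entry shown in the
// bottom-left panel. The update type only applies while tracked.
fun anchorLabel(name: String, state: AnchorState, update: AnchorUpdateType?): String =
    if (state == AnchorState.TRACKED && update != null)
        "$name - ${state.name} (${update.name})"
    else
        "$name - ${state.name}"

fun main() {
    println(anchorLabel("FrontDesk", AnchorState.TRACKED, AnchorUpdateType.REFINED))
    // FrontDesk - TRACKED (REFINED)
}
```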
Geolocation Display (Bottom-Right Panel)
When tracking is active, the bottom-right panel shows:
- VPS2 coordinates: Latitude, longitude, and heading derived from VPS2 localization
- Red compass indicator showing VPS2 heading direction
- Device GPS coordinates: Latitude, longitude, and heading from device sensors
- Blue compass indicator showing device heading direction
This allows you to compare VPS2-localized position with device GPS position.
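The gap between the two fixes can be quantified as the great-circle distance between the two latitude/longitude pairs. A self-contained Kotlin sketch using the haversine formula (this helper is not part of the sample app):

```kotlin
import kotlin.math.*

// Great-circle distance in meters between two lat/long pairs, e.g.
// the VPS2-derived position and the device GPS position.
fun distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val earthRadius = 6_371_000.0 // mean Earth radius in meters
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * earthRadius * asin(sqrt(a))
}

fun main() {
    // A 0.0001 degree latitude offset is roughly 11 meters.
    println("%.1f m".format(distanceMeters(37.7749, -122.4194, 37.7750, -122.4194)))
}
```

A large, persistent gap between the VPS2 and GPS readouts is expected indoors, where GPS degrades but VPS2 localization remains precise.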
iOS Localization Process
- In the VPS2 sample scene, ensure you're at or near the physical location you scanned.
- The app needs to be able to see the same features that were captured in your scan
- Look for the Localize button at the bottom center of the screen.
- Tap Localize to begin tracking the anchor specified by your anchor payload.
- The button will disappear once tracking starts
- The info label will show "Localizing..." during the process
- Point your device camera at the scanned area.
- Move slowly to help the system match visual features
- Ensure good lighting matches the scan conditions
- Watch the info label for status updates and observe the visual markers in the AR scene.
iOS Localization States and Visual Feedback
The VPS2 localization process on iOS provides feedback through both status text and visual markers:
Localization Status (Info Label)
- "Not localized.": Initial state before localization begins
- "Localizing...": Active localization attempt in progress
- "Localized!": Successfully localized (transformer is available)
- "Could not localize.": Localization failed or lost
Anchor Tracking States (Visual Markers)
When tracking an anchor, the app displays different markers based on the AnchorUpdateType:
- `.none` (Not tracking):
  - Info label: "Not tracking"
  - Both markers disabled (hidden)
  - No visual indicator
- `.coarse` (Coarse tracking):
  - Info label: "Tracking (coarse)"
  - Red cube marker (1.0m size) appears at the anchor location
  - Mesh marker disabled
  - Indicates approximate location with lower precision
  - A red arrow may appear in front of the camera pointing toward the POI when in coarse mode
- `.refined` (Refined tracking):
  - Info label: "Tracking (refined)"
  - Mesh marker appears (downloaded mesh with textures)
  - Red cube marker disabled
  - Indicates precise location with high accuracy
  - The mesh represents the actual scanned environment geometry
Going Further
In this guide, you've learned how to hardcode an anchor payload directly into your application. However, for production applications, you may want to dynamically query anchor payloads from your sites rather than hardcoding them.
The sample app includes a Sites scene that demonstrates how to programmatically query anchor payloads through the Sites APIs. This approach allows your application to:
- Dynamically discover available sites and assets
- Retrieve anchor payloads at runtime based on user selection or location
- Support multiple sites without rebuilding your app
- Update anchor payloads as new assets are processed and set to production
Exploring the Sites Scene
To see this in action:
- In the sample app, navigate to the Sites scene from the main menu.
- The scene will guide you through:
- Viewing your user information and organizations
- Browsing sites within your organization
- Listing assets for each site
- Viewing asset details, including the VPS anchor payload
When you select an asset, you'll see its anchor payload displayed in the asset information. This payload can be extracted from the AssetInfo.vpsData.anchorPayload property and used programmatically in your VPS2 localization code, rather than hardcoding it in your build configuration.
Implementation Details
- Android (Kotlin)
- iOS (Swift)
In the Kotlin sample app, see SitesView.kt for a complete implementation. The key steps are:
- Create a `SitesSession` from your `NSDKSession`:

  ```kotlin
  val sitesSession = nsdkSession.sites.acquire()
  ```

- Query sites and assets:

  ```kotlin
  val sitesResult = sitesSession.requestSitesForOrganization(orgId)
  val assetsResult = sitesSession.requestAssetsForSite(siteId)
  ```

- Extract the anchor payload from an asset:

  ```kotlin
  val anchorPayload = asset.vpsData?.anchorPayload
  ```
In the Swift sample app, see SitesViewController.swift for a complete implementation. The key steps are:
- Create a `SitesManager` from your `NSDKSession`:

  ```swift
  let sitesManager = SitesManager(nsdk: nsdkSession)
  ```

- Query sites and assets:

  ```swift
  let sitesResult = try await sitesManager.requestSitesForOrganization(orgId: orgId)
  let assetsResult = try await sitesManager.requestAssetsForSite(siteId: siteId)
  ```

- Extract the anchor payload from an asset:

  ```swift
  let anchorPayload = asset.vpsData?.anchorPayload
  ```
This programmatic approach enables more flexible applications that can adapt to your organization's sites and assets without requiring code changes or app updates.
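The steps above can be combined into a single lookup. This Kotlin sketch reuses only the calls shown for SitesView.kt (`nsdkSession.sites.acquire()`, `requestSitesForOrganization`, `requestAssetsForSite`, `vpsData?.anchorPayload`); the result unwrapping (`firstOrNull`, `site.id`) is assumed rather than confirmed against the NSDK API, so adapt it to the real result types:

```kotlin
// Sketch only - not a drop-in implementation. Error handling and the
// exact result types should follow SitesView.kt in the Kotlin samples.
suspend fun resolveAnchorPayload(nsdkSession: NSDKSession, orgId: String): String? {
    val sitesSession = nsdkSession.sites.acquire()
    // Query the organization's sites, then the assets of the first one.
    val sites = sitesSession.requestSitesForOrganization(orgId)
    val site = sites.firstOrNull() ?: return null        // assumed result shape
    val assets = sitesSession.requestAssetsForSite(site.id)
    // Prefer an asset that actually carries VPS data; a real app would
    // also filter for the version currently set to production.
    return assets.firstOrNull { it.vpsData != null }?.vpsData?.anchorPayload
}
```

The returned payload can then take the place of the hardcoded `DEFAULT_VPS_PAYLOAD` value configured earlier in this guide.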