Vision Pro

Apple supports visionOS development starting with Xcode 15.2.


You will need an Apple developer account to download developer tools and SDKs.

Create a visionOS project

You can develop with a physical Vision Pro plugged in or using the Vision Pro Simulator.

You will need the latest NativeScript CLI:

npm install -g nativescript@latest


The latest CLI is backwards compatible, so you can use it for standard iOS and Android projects as well.

You can now use the --vision (or --visionos) flag when creating your app.

ns create myapp --vision

This will set up a preconfigured, visionOS-ready app using a plain TypeScript base.

If you prefer a flavor, you can use any of the following:

  • Angular: ns create myapp --vision-ng
  • React: ns create myapp --vision-react
  • Solid: ns create myapp --vision-solid
  • Svelte: ns create myapp --vision-svelte
  • Vue (3.x): ns create myapp --vision-vue

All projects are preconfigured with tailwindcss.

Run your visionOS project

Open the Vision Pro Simulator, then run your app with:

ns run vision --no-hmr

The vision platform target is a shorthand alias for visionos, so the following is equivalent:

ns run visionos --no-hmr

Develop with physical Vision Pro

You can use a Developer Strap to connect your Vision Pro to your Mac.

The Developer Strap is an optional accessory that provides a USB-C connection between Apple Vision Pro and Mac and is helpful for accelerating the development of graphics-intensive apps and games. The Developer Strap provides the same audio experience as the in-box Right Audio Strap, so developers can keep the Developer Strap attached for both development and testing.

Once connected, you can run ns device to list all connected physical devices:

% ns device
Searching for devices...

 Connected devices & emulators
 # │ Device Name      │ Platform │ Device Identifier         │ Type   │ Status    │ Connection Type
 1 │ Apple Vision Pro │ visionOS │ 00008112-001A10812278A01E │ Device │ Connected │ USB

You can then run on that device as follows:

ns run visionos --no-hmr --device=00008112-001A10812278A01E

What makes a project work on visionOS?

Two key elements make up a NativeScript-driven visionOS project:

  1. App_Resources/visionOS/src/NativeScriptApp.swift
  2. The following dependencies:

  "dependencies": {
    "@nativescript/core": "~8.7.0"
  },
  "devDependencies": {
    "@nativescript/visionos": "~8.7.0",
    "@nativescript/webpack": "~5.0.21"
  }

Design Guidelines and Notes

We strongly encourage developers to understand and use Apple's system glass materials throughout their apps in addition to closely following their design guidelines.

For a fundamental understanding, we recommend watching Apple's WWDC 2023 session videos covering visionOS.

CSS Adjustments for visionOS

You will likely want to make your Pages transparent to allow the natural glass materials to come through by using this CSS specifier:

.ns-visionos Page {
  background-color: transparent;
}

When running your app on visionOS, you can scope CSS selectors where needed by the root level .ns-visionos class.
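As an illustration, you can scope any of your own rules the same way; the `.card` class below is hypothetical, not part of the template:

```css
/* Only applied when the app runs on visionOS */
.ns-visionos .card {
  background-color: rgba(255, 255, 255, 0.1);
  border-radius: 16;
}
```

On other platforms the rule simply never matches, so no platform checks are needed in code.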

Hover effect for visionOS materials

All standard/system UI Component usages like Button, Switch, Pickers, etc. will automatically get system hover style effects on visionOS.

It's common to add tap bindings in NativeScript to things like StackLayout, GridLayout, etc., which are just UIViews.

You can use new @nativescript/core APIs to easily enable visionOS hover styles on any view type throughout your app or customize per view.

Apple discusses some of the important spatial considerations with these effects in this session.

Each view can specify its own custom hoverStyle as follows:

<GridLayout hoverStyle="{{customHoverStyle}}" tap="{{tapAction}}"/>

The hoverStyle property can be defined as a string or VisionHoverOptions.

import { VisionHoverOptions } from '@nativescript/core'

const hoverStyle: VisionHoverOptions = {
  effect: 'highlight',
  shape: 'rect',
  shapeCornerRadius: 16,

This would apply a visionOS system highlight rectangle with a cornerRadius of 16 to that GridLayout when a hover is detected.

The options are as follows:

export type VisionHoverEffect = 'automatic' | 'highlight' | 'lift'
export type VisionHoverShape = 'circle' | 'rect'
export type VisionHoverOptions = {
  effect: VisionHoverEffect
  shape?: VisionHoverShape
  shapeCornerRadius?: number
}

When a string is provided, it will look for predefined hoverStyles within TouchManager.visionHoverOptions that match the string name. This allows you to predefine and share custom hoverStyles across your entire app.

You can enable these effects globally throughout your app for any view which has a tap binding by enabling:

TouchManager.enableGlobalHoverWhereTap = true

This allows you to predefine any number of custom hoverStyles you'd like to use throughout your app. You could do so in app.ts or main.ts (aka, the bootstrap file), for example:

TouchManager.enableGlobalHoverWhereTap = true
TouchManager.visionHoverOptions = {
  default: {
    effect: 'highlight',
    shape: 'rect',
    shapeCornerRadius: 16,
  },
  slimBox: {
    effect: 'lift',
    shape: 'rect',
    shapeCornerRadius: 8,
  },
  round: {
    effect: 'lift',
    shape: 'circle',
  },
}

You could then apply custom hoverStyles by their name anywhere in your app:

<GridLayout hoverStyle="default" tap="tapAction"/>
<GridLayout hoverStyle="slimBox" tap="tapAction"/>
<GridLayout hoverStyle="round" tap="tapAction"/>

You can also disable the hoverStyle on any view by adding the visionIgnoreHoverStyle property, if desired.
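For example, mirroring the template syntax used above (the exact attribute usage here is a sketch):

```xml
<GridLayout visionIgnoreHoverStyle="true" tap="tapAction"/>
```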


When no hoverStyle is defined and TouchManager.enableGlobalHoverWhereTap is not enabled, visionOS falls back to its default behavior: standard controls get system hover styles as mentioned, and other views have no hoverStyle, as expected.

View template visionOS scoping

You can also scope sections of your view templates specifically for visionOS layouts as needed:

<visionos>
    <Label>I only show on visionOS</Label>
</visionos>
<ios>
    <Label>I only show on iOS</Label>
</ios>
<android>
    <Label>I only show on Android</Label>
</android>


You should not have to do a lot of this in a typical app, but these options are available to you where desired.

NativeScript and the SwiftUI App Lifecycle

Starting with NativeScript 8.6, we support the SwiftUI App Lifecycle for the first time. A good primer on it is Swift's @main attribute, which answers the question: how can we tell the compiler about the entry point to our application?

Historically, NativeScript apps used the Objective-C main entry point, where the NativeScript engine was initialized and your app was booted.

We now also support a SwiftUI @main entry via a single App_Resources/visionOS/src/NativeScriptApp.swift file:

import SwiftUI

@main
struct NativeScriptApp: App {

    var body: some Scene {
        NativeScriptMainWindow()
    }
}

NativeScriptMainWindow is a SwiftUI Scene containing a WindowGroup that hosts your NativeScript app. In visionOS apps, you can expand this struct to support new Scenes and Spaces, with new and exciting window styles like volumetric, as well as Immersive Spaces.

NativeScriptMainWindow is a SwiftUI struct representing a Scene itself which looks like this:

struct NativeScriptMainWindow: Scene {
    var body: some Scene {
        WindowGroup {
            NativeScriptAppView(found: { windowScene in
                // The UIWindowScene is found here
            }).onAppear {
                // Your app is booted here!
                DispatchQueue.main.async {
                    // ...
                }
            }
        }
    }

    init() {
        // NativeScript engine is configured here!
    }
}


This is currently enabled only for visionOS in NativeScript; however, it will be used in iOS and macOS apps in the future.
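As a sketch of the expansion mentioned above, a volumetric window could be added alongside the main window using Apple's standard SwiftUI scene APIs; the id and content here are illustrative, not a NativeScript API:

```swift
import SwiftUI

@main
struct NativeScriptApp: App {
    var body: some Scene {
        // The existing NativeScript-driven main window
        NativeScriptMainWindow()

        // Illustrative extra scene: a volumetric window
        WindowGroup(id: "volume") {
            Text("Hello, volume")
        }
        .windowStyle(.volumetric)
    }
}
```

A scene like this would then be opened via SwiftUI's standard openWindow mechanism, referencing its id.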

Support multiple windows

In order to add volumetric windows and Immersive Spaces, be sure to add the following setting to your App_Resources/visionOS/Info.plist:
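The entry in question is presumably Apple's standard multiple-scenes declaration, sketched here:

```xml
<key>UIApplicationSceneManifest</key>
<dict>
    <key>UIApplicationSupportsMultipleScenes</key>
    <true/>
</dict>
```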


What's Next?

Beyond what is already possible, the possibilities ahead are exciting: this is the beginning of an entirely new world. @nativescript/core, along with third-party plugins, could provide even more SwiftUI providers to enable powerful new development workflows.

We will begin sharing more details over time about expanding your visionOS apps to support volumetric windows and immersive spaces while you explore what's already possible.

You can follow along with the "Vision Pro 🥽 Hello World" tutorial series.