Explore video experiences for visionOS

WWDC25

    Learn about the different ways you can create and present immersive video experiences within your app. We'll explore the diverse media types available in visionOS 26, including profiles for 180°, 360°, and wide FOV video; options for creating and playing Apple Immersive Video; and expanded capabilities for 2D, 3D, and spatial video. Discover which profiles are best for your app and its content.

    Chapters

    • 0:00 - Introduction
    • 1:36 - 2D and 3D video
    • 4:16 - Spatial video
    • 8:10 - 180°, 360°, and Wide FOV video
    • 19:25 - Apple Immersive Video
    • 23:05 - Choosing a profile

    Resources

    • Apple Movie Profiles for Spatial and Immersive Media
    • Authoring Apple Immersive Video
    • Converting projected video to Apple Projected Media Profile
    • Converting side-by-side 3D video to multiview HEVC and spatial video
    • Creating spatial photos and videos with spatial metadata
    • HTTP Live Streaming Examples
    • ISO Base Media File Format and Apple HEVC Stereo Video
    • Playing immersive media with AVKit
    • Playing immersive media with RealityKit
    • Presenting images in RealityKit
    • QuickTime and ISO Base Media File Formats and Spatial and Immersive Media
    • Rectangular Mask Payload Metadata within the QuickTime Movie File Format
    • Writing spatial photos

    Related Videos

    WWDC25

    • Learn about Apple Immersive Video technologies
    • Learn about the Apple Projected Media Profile
    • Support immersive video playback in visionOS apps
    • What’s new for the spatial web
    • What’s new in RealityKit

    WWDC24

    • Build compelling spatial photo and video experiences

    WWDC23

    • Deliver video content for spatial experiences
    Transcript

    Hi, I’m Dave, an engineer on the visionOS Spatial Media team. Today, we’re going to explore all of the different types of video media available on visionOS, including some new ones in visionOS 26. Vision Pro is a spatial computer that shows high quality pass-through video of the surrounding world. Because Vision Pro is with you wherever you look, content creators have an entire 360° around the viewer in which they can present engaging content. They’re not just limited to a flat screen in front of the viewer. This enables a wide range of media playback options, some of which are only possible on a spatial computer. Let's take a look. We’ll start by exploring how 2D and 3D video can be presented on visionOS, including some cool new features coming in visionOS 26. We'll see how spatial video enables people to capture their own stereo videos, and some new ways you can play those videos in your app.

    We’ll learn about new immersive video formats for 180°, 360°, and wide FOV video.

    And for the ultimate immersive experience, there's Apple Immersive Video. We’ll see how you can now create your own Apple Immersive content and tooling. Finally, we’ll explore how all these different video profiles compare to help you choose the right one for your app. Let's get started with 2D and 3D video. Vision Pro is a great device for enjoying 2D movies and TV shows, and as a stereo device, it’s perfect for watching stereoscopic 3D movies too. 2D videos can be played inline anywhere in your app’s UI using an embedded playback experience, where the video appears alongside other UI elements.

    Here’s an example of playing video inline in Freeform as part of a board full of content.

    Note that if you embed a 3D video inline, it gracefully falls back to playing in 2D.

    And both 2D and 3D video can be played in an expanded experience on a floating screen in the shared space. Note that an expanded experience is required to play a 3D movie stereoscopically.

    2D and 3D video can also transition to a docked state in a virtual environment.

    Here’s an example from the Destination Video sample code project, available from developer.apple.com, where the video fades out from its expanded view and fades back in, docked within the app’s own custom studio environment, created with Reality Composer Pro. The video automatically shows dynamic light spill to make it feel like an integral part of that environment.

    Docking in an environment is a great example of how spatial computing enables video playback to move beyond the bounds of a fixed single screen. And there are even more ways that 2D and 3D video playback can take advantage of Vision Pro’s infinite canvas.

    visionOS 2 introduced multi-view video, where viewers can enjoy multiple camera angles of a single event or multiple sources of video simultaneously in Vision Pro.

    New in visionOS 26, 2D and 3D videos can specify a per-frame dynamic mask to change or animate their frame size and aspect ratio, to accentuate a story point, or to combine archival and modern-day footage in a single scene, without needing to show black bars for letterboxing or pillarboxing.

    These kinds of seamless transitions, with a framing that best suits each shot, are only possible on a spatial computer.

    For more information, see the “Rectangular Mask Payload Metadata” document on developer.apple.com.

    3D video is the medium of choice for professional stereo productions designed to be viewed on a big screen. But stereo video doesn’t have to require a movie studio. Vision Pro supports a new kind of stereo media that people can capture themselves, known as Spatial Video. People can shoot comfortable, compelling stereo content on devices like iPhone without needing to be an expert in the rules of 3D filmmaking.

    Spatial video is stereo video with additional metadata that enables windowed and immersive treatments on Vision Pro to mitigate common causes of stereo discomfort.

    By default, spatial video renders through a window, with a faint glow around the edges. It can also expand into an immersive presentation, where the content is scaled to match the real size of objects. The edge of the frame disappears, and the content blends seamlessly into the real world.

    Spatial videos automatically fall back to 2D presentation on other platforms, enabling them to be played on all Apple devices. This enables people to capture spatial content to enjoy on Vision Pro, and still share their memories in 2D with friends and family who don’t yet have a Vision Pro.

    We saw earlier how 2D and 3D video is presented on a flat screen. For spatial videos, we inset that flat screen behind a window and blur the edges of the window to soften them. This helps avoid issues when an object is clipped by the window more in one eye than the other, which can be uncomfortable to view.

    Spatial videos can be captured today on iPhone 15 Pro, iPhone 16, and iPhone 16 Pro, in the Camera app and in your own app via AVCaptureDevice APIs. Spatial videos can also be captured on Apple Vision Pro, and with Canon’s R7 and R50 cameras fitted with a Canon DUAL lens.
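
    As a rough illustration of what that AVCaptureDevice path looks like, here is a minimal sketch of configuring spatial video capture on a supported iPhone. The isSpatialVideoCaptureSupported and isSpatialVideoCaptureEnabled properties are recalled from the AVFoundation spatial-capture additions rather than checked against the shipping SDK, so treat the exact names as assumptions.

        import AVFoundation

        // Sketch only: configure a capture session for spatial video on a supported iPhone.
        // The spatial-capture property names are assumptions; verify them in the current SDK.
        func makeSpatialVideoCaptureSession() throws -> (AVCaptureSession, AVCaptureMovieFileOutput)? {
            // The dual wide camera provides the two viewpoints needed for stereo capture.
            guard let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back) else {
                return nil
            }

            let session = AVCaptureSession()
            let input = try AVCaptureDeviceInput(device: device)
            let output = AVCaptureMovieFileOutput()
            guard session.canAddInput(input), session.canAddOutput(output) else { return nil }
            session.addInput(input)
            session.addOutput(output)

            // Choose an active format that supports spatial capture (assumed property name).
            if let format = device.formats.first(where: { $0.isSpatialVideoCaptureSupported }) {
                try device.lockForConfiguration()
                device.activeFormat = format
                device.unlockForConfiguration()
            }

            // Opt the movie output into spatial recording, i.e. MV-HEVC plus spatial metadata (assumed property name).
            if output.isSpatialVideoCaptureSupported {
                output.isSpatialVideoCaptureEnabled = true
            }
            return (session, output)
        }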

    In visionOS 2, spatial videos can be played with spatial styling in your own app with the QuickLook PreviewApplication API. In visionOS 26, we’re bringing that same spatial styling to all of Apple’s media playback frameworks. We’re adding QLPreviewController support in QuickLook, plus support in AVKit, RealityKit, Safari, and WebKit, enabling you to incorporate spatial videos however you choose in your app, with support for HTTP Live Streaming, or HLS.
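
    For example, here is a minimal SwiftUI sketch that hands a spatial video to the QuickLook PreviewApplication API on visionOS, which presents it with the system’s spatial styling. The file URL is a placeholder, and session handling and error cases are omitted.

        import SwiftUI
        import QuickLook

        // Sketch: present a spatial video with system spatial styling via QuickLook (visionOS 2 and later).
        struct SpatialVideoButton: View {
            let videoURL: URL   // placeholder: a spatial video file available to your app

            var body: some View {
                Button("Play spatial video") {
                    // Opens the item in a separate QuickLook preview scene with the
                    // same windowed and immersive treatments the system provides.
                    _ = PreviewApplication.open(urls: [videoURL])
                }
            }
        }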

    And if you’re looking to edit and combine spatial videos in your app to create a longer narrative, the format is now supported in industry standard editing tools, such as Compressor, DaVinci Resolve Studio, and Final Cut Pro. To see more examples of what’s possible with spatial videos and spatial photos as well, check out the Spatial Gallery app on Vision Pro. To learn more about the spatial photo and video formats, check out “Build compelling spatial photo and video experiences” from WWDC24. And for more about the new ways you can play spatial videos in your app, check out “Support immersive video playback in visionOS apps”. And talking of spatial photos, there’s new RealityKit API in visionOS 26 for displaying spatial photos in your app and for converting 2D photos to a 3D spatial scene.

    To learn how to present spatial photos and generate spatial scenes using the new ImagePresentationComponent and Spatial3DImage APIs, check out “What’s new in RealityKit”.

    We saw earlier how 2D, 3D, and spatial videos are all played on a flat surface in Vision Pro. This is because these videos typically use what's known as a rectilinear projection, as seen in this photo of the Apple Park Visitor Center. Rectilinear just means that straight lines are straight. There’s no lens curvature or warping in the video. And because of this, these kinds of videos feel correct when viewed on a flat surface that also doesn’t have any curvature or warping.

    We saw how the playback experience for these rectilinear videos on Vision Pro expands that flat surface to fill more of the viewer’s field of view, via docking for 2D and 3D video, and immersive presentation for spatial video.

    But on a spatial computing device, we’re not just restricted to a flat surface in front of the viewer. There’s even more space around them we can fill with pixels. And there are other non-rectilinear video types that are great for filling even more of the viewer’s field of view, by presenting video on a curved surface, not a flat one. visionOS 26 adds native support for three of these non-rectilinear media types: 180° video, 360° video, and wide FOV video. Let's take a look. 180° video is presented on a half sphere directly in front of the viewer. Video is projected onto that half sphere to completely fill the viewer’s forward field of view. In this example, it’s a 180° video of the pond at Apple Park.

    From the viewer’s point of view, it’s like being there. This is a great way for content creators to transport their viewers to amazing locations. 180° video is typically captured in stereo. And because it’s stereo, and completely fills your forward field of view, it feels like you’re looking at the real thing.

    360° video takes things a step further, filling the entire world with content. With 360° video, the content literally surrounds the viewer, giving them the freedom to look wherever they like. Here’s an example of a 360° video captured underneath the rainbow at Apple Park. The viewer can look around at any angle and feel like they are right there beneath the rainbow. Everything looks just as it would if they were there in person.

    To achieve this, the 360° video is wrapped onto a sphere completely surrounding the viewer, centered on their eyes and filling their field of view whichever way they look.

    A rectangular video frame, twice as wide as it is high, is used to achieve this, covering 360° of the scene across its width and 180° across its height. This video is mapped onto a sphere around the viewer with an equirectangular projection. This is like how a map of the world is drawn, where the north and south poles are stretched horizontally, to show a flat representation of a sphere.

    180° video’s projection is similar to 360° video, but for half a sphere. This is known as a half-equirectangular projection. Half-equirectangular videos are square and map their square video frame onto the half sphere in the same way 360° videos do for a full sphere.
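
    As a small sketch of the mapping just described, the code below converts a normalized pixel position (u, v) in an equirectangular frame into a direction on the sphere around the viewer. The axis convention is illustrative only and not necessarily the one visionOS uses internally.

        import Foundation
        import simd

        // Sketch: map equirectangular frame coordinates (u, v in 0...1) to a unit direction.
        func equirectangularDirection(u: Double, v: Double) -> SIMD3<Double> {
            let longitude = (u - 0.5) * 2.0 * .pi   // 360° across the frame's width
            let latitude  = (0.5 - v) * .pi         // 180° across the frame's height
            return SIMD3(cos(latitude) * sin(longitude),
                         sin(latitude),
                         -cos(latitude) * cos(longitude))
        }

        // A half-equirectangular (180°) frame uses the same idea over a square frame,
        // with the longitude range halved: longitude = (u - 0.5) * .pi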

    For stereo 180° video we simply have two squares of video, one for each eye.

    Many existing stereo 180° videos encode these two squares side-by-side in a single pixel buffer that’s twice as wide as the resolution per eye. This is known as side-by-side or frame-packed encoding. However, there’s a lot of redundancy here. Because they’re two views of the same scene, the left and right eye images are very similar. Vision Pro takes advantage of this similarity to use a different approach for encoding stereo video, known as multiview encoding.

    Apple platforms already have hardware support for HEVC, or High Efficiency Video Coding, a fast modern codec for video compression.

    And so for stereo videos, we use MV-HEVC, or MultiView HEVC.

    For stereo video, MV-HEVC encodes each eye into its own pixel buffer and writes those two buffers together in a single video track. It takes advantage of the image similarity to compress one eye’s pixels relative to the other, encoding only the differences for the second eye. This gives a smaller encoded size for each frame, making stereo MV-HEVC videos smaller and more efficient. This is particularly important when streaming stereo video.
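
    If you need to know whether an existing movie already carries stereo MV-HEVC, AVFoundation exposes a media characteristic for multiview video tracks. Here is a minimal sketch; the characteristic name is recalled from the spatial video additions, so confirm it against the SDK.

        import AVFoundation

        // Sketch: does this movie contain a stereo multiview (MV-HEVC) video track?
        func containsStereoMultiviewVideo(at url: URL) async throws -> Bool {
            let asset = AVURLAsset(url: url)
            let stereoTracks = try await asset.loadTracks(withMediaCharacteristic: .containsStereoMultiviewVideo)
            return !stereoTracks.isEmpty
        }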

    To learn more about multiview encoding and MV-HEVC, check out “Deliver video content for spatial experiences” from WWDC23.

    Now, there’s a third kind of non-rectilinear video that’s new in visionOS 26: wide FOV video from action cameras such as the GoPro HERO13 and Insta360 Ace Pro 2.

    These action cameras capture highly stabilized footage of whatever adventures you take them on. They capture a wide horizontal field of view, typically between 120° and 180°, and often use fisheye-like lenses that show visible curvature of straight lines in the real world. This enables them to capture as much of the view as possible.

    Traditionally, these kinds of videos have been enjoyed on flat-screen devices like iPhone and iPad, and this is a fun way to relive the adventure. But in visionOS 26, we’re introducing a new form of immersive playback for footage from these kinds of action cameras, recreating the unique wide-angle lens profile of each camera as a curved surface in 3D space, and placing the viewer at the center of the action.

    Because that curved surface matches the camera’s lens profile, it effectively undoes the fisheye effect, and the viewer sees straight lines as straight, even at the edge of the image. This recreates the feeling of the real world as captured by the wide-angle lens, while still displaying a maximum field of view.

    Wide FOV action cameras have lots of different modes, and the right mode to use depends on what you're shooting.

    Some modes prioritize the middle of the lens, for cases where that’s where the action is.

    Other modes prioritize the edges of the frame to capture as wide a field of view as possible. And there are many more permutations. To represent all of these different modes and lens configurations, we need a more expressive way to describe how the real world gets mapped to pixels in an image. For this, we use math.

    Different lenses have different shapes and profiles.

    To model these different lens profiles, we define a bunch of parameters for things like the focal length, skew, and distortion of the lens. Camera and lens manufacturers can tailor these parameters to describe a wide variety of lenses, and how those lenses map the real world onto pixels in an image.

    Because it's defined by parameters, we call this projection a parametric immersive projection.
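
    As a rough picture of what “parametric” means here, a conventional camera model describes a lens with an intrinsics matrix plus distortion coefficients. APMP’s actual parameter set is defined in its specification and may differ, so the form below is illustrative only:

        K = \begin{pmatrix} f_x & s & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix},
        \qquad
        r' = r \left( 1 + k_1 r^2 + k_2 r^4 + k_3 r^6 \right)

    Here f_x and f_y are the focal lengths in pixels, s is the skew, (c_x, c_y) is the principal point, and the k_i terms model the radial distortion that a fisheye-like lens introduces.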

    In visionOS 26, these immersive video types, 180°, 360°, and wide FOV video, are supported natively on visionOS via a new QuickTime movie profile called Apple Projected Media Profile, or APMP. If you’ve already worked with QuickTime movies and the spatial video format, this profile will feel very familiar, with new fields added to describe each of the new projection types.

    We’ve updated the Spatial and Immersive Media Format Edition spec on developer.apple.com, to cover all of the details of this new profile. We’ve also added automatic conversion to APMP for many existing videos, both in files on Vision Pro and in your own app.

    Stereo 180° videos captured with Canon’s EOS VR system and converted with the EOS VR Utility, are automatically converted to 180° APMP when opened on Vision Pro.

    360° videos from devices like the GoPro MAX and Insta360 X5 are automatically converted to 360° APMP.

    180° and 360° videos that conform to the equirectangular version of Google Spherical Video v1 or v2 are also detected and converted.

    And straight-off-the-camera videos from recent action cams like the GoPro HERO13 and Insta360 Ace Pro 2, are automatically converted to wide FOV APMP.

    The same is true for single-lens captures from the 360° cameras mentioned earlier. We’ve also updated the avconvert command-line tool on macOS to convert existing 180° and 360° content to APMP on Mac.

    Just like spatial videos, APMP can be played on visionOS 26 by all of Apple’s media playback frameworks, with support for HTTP Live Streaming.

    Note that APMP content is supported for expanded and immersive playback, but not for embedded inline playback.
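
    As one concrete example, here is a minimal RealityKit sketch that plays a movie with VideoPlayerComponent; the URL is a placeholder. On visionOS 26 the projection should come from the file’s APMP metadata, though whether immersive presentation needs any extra configuration is something to confirm with the “Playing immersive media with RealityKit” sample.

        import RealityKit
        import AVFoundation

        // Sketch: attach an AVPlayer-backed video to a RealityKit entity.
        func makeVideoEntity(for url: URL) -> Entity {
            let player = AVPlayer(url: url)   // placeholder URL to an APMP or spatial movie
            let entity = Entity()
            entity.components.set(VideoPlayerComponent(avPlayer: player))
            player.play()
            return entity
        }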

    Playing video immersively puts the viewer’s head right where the camera was during capture, even if that camera was strapped to the end of a surfboard. This means that immersive playback is especially sensitive to camera motion. To help mitigate this, we’ve added automatic high motion detection in QuickLook, AVKit, and RealityKit. Playback will automatically reduce immersion when high motion is detected, which can be more comfortable for the viewer during high motion scenes. There are options in the Settings app to customize high motion detection to match the viewer’s personal level of motion sensitivity.

    To dive even deeper into the new Apple Projected Media Profile, including how to read and write it for your own content, check out “Learn about the Apple Projected Media Profile”.

    Now, for the ultimate immersive experience, there’s Apple Immersive Video, which we’re making available to developers and content creators for the first time this year. Here’s an example from the Apple TV+ series, “Wild Life”, which transports viewers to meet the elephants at Kenya’s Sheldrick Wildlife Trust. This scene would be almost impossible to experience in reality, but with Apple Immersive Video, it feels like you're truly there.

    If you’re working with the Blackmagic URSA Cine Immersive camera, you can create, edit and distribute Apple Immersive Video yourself.

    The specs for the URSA Cine Immersive camera are astounding. Every lens in every camera is individually calibrated at the factory, using the parametric approach we saw earlier, but tuned to that individual lens.

    The Cine Immersive captures stereo video with 8160 x 7200 pixels per eye. That’s 59 megapixels per eye, at 90 frames per second.

    That’s over 10 billion pixels per second.
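
    Those headline figures follow directly from the per-eye resolution and frame rate:

        8160 \times 7200 = 58{,}752{,}000 \approx 59\ \text{megapixels per eye}

        58{,}752{,}000 \times 2\ \text{eyes} \times 90\ \text{fps} \approx 1.06 \times 10^{10}\ \text{pixels per second}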

    The URSA Cine Immersive captures up to 210° FOV horizontally, and 180° vertically, with a sharpness approaching that of the human eye.

    In visionOS 26, your app can play Apple Immersive Video with all the same media frameworks as APMP, with support for HTTP Live Streaming. Like APMP, Apple Immersive Content is supported for expanded and immersive playback, but not for embedded inline playback. So how do creators bring their Apple Immersive content to Vision Pro? There are four main steps in the content creation pipeline.

    First, capture video on the URSA Cine Immersive camera. Next, edit that video in DaVinci Resolve Studio.

    Then preview and validate content using the new Apple Immersive Video Utility apps for macOS and visionOS. And finally, segment content in Compressor for distribution via HTTP Live Streaming.

    For pro app developers who want to create their own tools to work with Apple Immersive Video, such as a non-linear editor, or a custom pipeline tool, we’re introducing the ImmersiveMediaSupport framework for macOS and visionOS 26. This framework enables you to read and write Apple Immersive content programmatically.

    And there are many more capabilities of Apple Immersive Video that are unique to the format, including per-shot edge blends, custom backdrop environments, the new Apple Spatial Audio Format, and live preview on Apple Vision Pro.

    Let’s dig a little deeper into just one of these, per-shot edge blends.

    Every shot in an Apple Immersive Video can define a custom edge blend curve that best suits its content and framing.

    This isn’t a baked-in mask. It’s a dynamic alpha blend curve that feathers the edges of the shot to transition it into a custom backdrop environment.

    And that’s just a flavor of what’s possible with Apple Immersive Video. Check out “Learn about Apple Immersive Video technologies” for more information. Let’s finish by taking a look at all of these media profiles together, to help decide which ones are right for your app’s experience. To recap: we have 2D and 3D video presented on a flat screen; spatial video, where stereo video is inset behind a flat screen for windowed presentation and shown at true scale when immersive; 180° and 360° video projected onto a half sphere and a full sphere, respectively; wide FOV video with a curved mesh that matches the projection of a wide-angle lens from an action cam; and Apple Immersive Video, with high-resolution stereo video perfectly calibrated to each lens in the camera that captured it.

    Here are all the fundamentals for these different video profiles, including how they can be captured, their visual characteristics, their horizontal field of view, and their projection type. Note that 180°, 360°, and wide FOV videos can be mono or stereo. Typically 180° is stereo, and 360° and wide FOV are mono.

    And here’s a reminder of how all of the different profiles can be presented during playback. Note that high motion detection is automatically enabled for all three APMP profiles. And 2D, 3D, and spatial videos all offer 2D playback when embedded inline, because of their rectilinear projection.

    Vision Pro enables so many exciting video playback experiences, from new ways to experience 2D and 3D video, to new types of media that only make sense on a spatial computing platform. To get started with all these video profiles, check out our new immersive playback sample code projects for AVKit and RealityKit, and new sample code projects for writing and working with APMP and Apple Immersive Video. We’ve also provided example video downloads and HLS streams for spatial, 180°, 360°, wide FOV and Apple Immersive Video, available from developer.apple.com. For more information, check out related video sessions on immersive video playback, Apple Projected Media Profile, and Apple Immersive Video. And how to play all these media types on the spatial web in Safari and WebKit. And now: Go... immerse!

    • 0:00 - Introduction
    • Learn about the different ways you can create and present immersive video experiences within your app. We’ll explore the diverse media types available in visionOS 26, including profiles for 180°, 360°, and wide FOV video; options for creating and playing Apple Immersive Video; and expanded capabilities for 2D, 3D, and spatial video.

    • 1:36 - 2D and 3D video
    • visionOS offers versatile video playback capabilities for both 2D and 3D content. You can display videos inline within your app's interface, expanded in its own window, or docked within a custom environment. New in visionOS 26, per-frame dynamic masks enable videos to change size and aspect ratio seamlessly, enhancing storytelling and eliminating the need for letterboxing or pillarboxing.

    • 4:16 - Spatial video
    • Spatial Video is a stereo media format that you can capture using devices like iPhone 15 Pro, iPhone 16, iPhone 16 Pro, Apple Vision Pro, and specific third-party cameras. It's a great point-and-shoot format, and includes specific metadata to support more immersive presentations. In visionOS 26, you can integrate Spatial Video directly into your apps using new frameworks.

    • 8:10 - 180°, 360°, and Wide FOV video
    • visionOS 26 introduces the new Apple Projected Media Profile (APMP) to support formats like 180°, 360°, and wide-field-of-view video. To enhance comfort during high-motion scenes, automatic high-motion detection is added, reducing immersion when necessary. Developers can find detailed information on how to work with APMP for their own content on Apple's developer website.

    • 19:25 - Apple Immersive Video
    • Apple Immersive Video is a new format, available to developers and creators, that you can play back in your own apps and experiences. The content-creation pipeline involves capturing video on the URSA Cine camera, editing it in DaVinci Resolve Studio, previewing and validating it using new Apple Immersive Video Utility apps, and then segmenting it for distribution. You can also create custom tools to work with Apple Immersive Video using the ImmersiveMediaSupport framework.

    • 23:05 - Choosing a profile
    • Learn the differences between the media profiles available on visionOS 26, including traditional 2D and 3D video, spatial video, APMP, and Apple Immersive Video. Each profile is detailed in terms of capture, visual characteristics, field of view, and projection type.
