{"id":56608,"date":"2026-02-09T01:00:59","date_gmt":"2026-02-09T09:00:59","guid":{"rendered":"https:\/\/www.edge-ai-vision.com\/?p=56608"},"modified":"2026-02-06T17:36:11","modified_gmt":"2026-02-07T01:36:11","slug":"into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems","status":"publish","type":"post","link":"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/","title":{"rendered":"Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems"},"content":{"rendered":"<p><em>This blog post was originally published at <a href=\"https:\/\/blogs.nvidia.com\/blog\/openusd-halos-safety-robotaxi-physical-ai\/\">NVIDIA\u2019s website<\/a>. It is reprinted here with the permission of NVIDIA.<\/em><\/p>\n<p><em>NVIDIA Editor\u2019s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse.<\/em><\/p>\n<h3>New NVIDIA safety frameworks and technologies are advancing how developers build safe physical AI.<\/h3>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/glossary\/generative-physical-ai\/\" target=\"_blank\" rel=\"noopener\">Physical AI<\/a>\u00a0is moving from research labs into the real world, powering intelligent robots and\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/glossary\/autonomous-vehicles\/\" target=\"_blank\" rel=\"noopener\">autonomous vehicles (AVs)<\/a>\u00a0\u2014 such as robotaxis \u2014 that must reliably sense, reason and act amid unpredictable conditions.<\/p>\n<p>To safely scale these systems, developers need workflows that connect real-world data, high-fidelity simulation and robust AI models atop the common foundation provided by the\u00a0<a href=\"https:\/\/docs.nvidia.com\/learn-openusd\/latest\/glossary.html\" target=\"_blank\" 
rel=\"noopener\">OpenUSD<\/a>\u00a0framework.<\/p>\n<p>With the recently published\u00a0<a href=\"https:\/\/aousd.org\/uncategorized\/core-spec-announcement\/\" target=\"_blank\" rel=\"noopener\">OpenUSD Core Specification 1.0<\/a>, OpenUSD \u2014 aka Universal Scene Description \u2014 now defines standard data types, file formats and composition behaviors, giving developers predictable, interoperable USD pipelines as they scale autonomous systems.<\/p>\n<p>Powered by OpenUSD,\u00a0<a href=\"https:\/\/developer.nvidia.com\/omniverse\" target=\"_blank\" rel=\"noopener\">NVIDIA Omniverse libraries<\/a>\u00a0combine\u00a0<a href=\"https:\/\/developer.nvidia.com\/rtx\/ray-tracing\" target=\"_blank\" rel=\"noopener\">NVIDIA RTX<\/a>\u00a0rendering, physics simulation and efficient runtimes to create digital twins and simulation-ready (<a href=\"https:\/\/www.nvidia.com\/en-us\/glossary\/simready\/\" target=\"_blank\" rel=\"noopener\">SimReady<\/a>) assets that accurately reflect real-world environments for synthetic data generation and testing.<\/p>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/ai\/cosmos\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Cosmos<\/a>\u00a0world foundation models can run on top of these simulations to amplify data variation, generating new weather, lighting and terrain conditions from the same scenes so teams can safely cover rare and challenging edge cases.<\/p>\n<div class=\"ast-oembed-container \" style=\"height: 100%;\"><iframe title=\"What\u2019s New in OpenUSD and Ways to Contribute\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/tnhqNIv1sY0?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/div>\n<p>&nbsp;<\/p>\n<p>In addition, advancements in\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/use-cases\/synthetic-data-physical-ai\/\" 
target=\"_blank\" rel=\"noopener\">synthetic data generation<\/a>, multimodal datasets and SimReady workflows are now converging with the\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/ai-trust-center\/halos\/autonomous-vehicles\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Halos<\/a>\u00a0framework for AV safety, creating a standards-based path to safer, faster, more cost-effective deployment of next-generation autonomous machines.<\/p>\n<h2>Building the Foundation for Safe Physical AI<\/h2>\n<p><b>Open Standards and SimReady Assets<\/b><\/p>\n<p>The OpenUSD\u00a0<a href=\"https:\/\/aousd.org\/uncategorized\/core-spec-announcement\/\" target=\"_blank\" rel=\"noopener\">Core Specification 1.0<\/a>\u00a0establishes the standard data models and behaviors that underpin SimReady assets, enabling developers to build interoperable simulation pipelines for AI factories and robotics on\u00a0<a href=\"https:\/\/developer.nvidia.com\/usd\" target=\"_blank\" rel=\"noopener\">OpenUSD<\/a>.<\/p>\n<p>Built on this foundation, SimReady 3D assets can be reused across tools and teams and loaded directly into\u00a0<a href=\"https:\/\/developer.nvidia.com\/isaac\/sim\" target=\"_blank\" rel=\"noopener\">NVIDIA Isaac Sim<\/a>, where USDPhysics colliders, rigid body dynamics and composition-arc\u2013based variants let teams test robots in virtual facilities that closely mirror real operations.<\/p>\n<p><b>Open-Source Learning\u00a0<\/b><\/p>\n<p>The\u00a0<a href=\"https:\/\/docs.nvidia.com\/learn-openusd\/latest\/index.html\" target=\"_blank\" rel=\"noopener\">Learn OpenUSD<\/a>\u00a0curriculum is now open source and available on GitHub, enabling contributors to localize and adapt templates, exercises and content for different audiences, languages and use cases. 
This gives educators a ready-made foundation to onboard new teams into OpenUSD-centric simulation workflows.\u200b<\/p>\n<p><b>Generative Worlds as Safety Multiplier<\/b><\/p>\n<p>Gaussian splatting \u2014 a technique that uses editable 3D elements to render environments quickly and with high fidelity \u2014 and world models are accelerating simulation pipelines for safe robotics testing and validation.<\/p>\n<p>At SIGGRAPH Asia, the\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/research\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Research<\/a>\u00a0team introduced\u00a0<a href=\"https:\/\/research.nvidia.com\/publication\/2025-12_play4d-accelerated-and-interactive-free-viewpoint-video-streaming-virtual\" target=\"_blank\" rel=\"noopener\">Play4D<\/a>, a streaming pipeline that enables 4D Gaussian splatting to accurately render dynamic scenes and improve realism.<\/p>\n<p>Spatial intelligence company\u00a0<a href=\"https:\/\/www.worldlabs.ai\/\" target=\"_blank\" rel=\"noopener\">World Labs<\/a>\u00a0is using its\u00a0<a href=\"https:\/\/developer.nvidia.com\/blog\/simulate-robotic-environments-faster-with-nvidia-isaac-sim-and-world-labs-marble\/\" target=\"_blank\" rel=\"noopener\">Marble generative world model with NVIDIA Isaac Sim<\/a>\u00a0and\u00a0<a href=\"https:\/\/docs.nvidia.com\/nurec\/index.html\" target=\"_blank\" rel=\"noopener\">Omniverse NuRec<\/a>\u00a0so researchers can turn text prompts and sample images into photorealistic, Gaussian-based physics-ready 3D environments in hours instead of weeks.<\/p>\n<p><img decoding=\"async\" class=\"size-full aligncenter\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2025\/12\/WorldLabs_IsaacSim_Clip.gif\" \/><\/p>\n<p>Those worlds can then be used for physical AI training, testing and sim-to-real transfer. 
This high-fidelity simulation workflow expands the range of scenarios robots can practice in while keeping experimentation safely in simulation.<\/p>\n<p><b>Lightwheel Helps Teams Scale Robot Training With SimReady Assets<\/b><\/p>\n<p>Powered by OpenUSD,\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/customer-stories\/lightwheel\/\" target=\"_blank\" rel=\"noopener\">Lightwheel<\/a>\u2019s SimReady asset library includes a common scene description layer, making it easy to assemble high-fidelity digital twins for robots. The SimReady assets are embedded with precise geometry, materials and validated physical properties, which can be loaded directly into NVIDIA Isaac Sim and Isaac Lab for robot training. This allows robots to experience realistic contacts, dynamics and sensor feedback as they learn.<\/p>\n<h2>End-to-End Autonomous Vehicle Safety<\/h2>\n<p>End-to-end autonomous vehicle safety advancements are accelerating with new research, open frameworks and inspection services that make validation more rigorous and scalable.<\/p>\n<p>NVIDIA researchers, with collaborators at Harvard University and Stanford University, recently introduced the\u00a0<a href=\"https:\/\/www.arxiv.org\/pdf\/2506.20553\" target=\"_blank\" rel=\"noopener\">Sim2Val framework<\/a>\u00a0to statistically combine real-world and simulated test results, reducing AV developers\u2019 need for costly physical mileage while demonstrating how robotaxis and AVs can behave safely across rare and safety-critical scenarios.<\/p>\n<p>Learn more by watching NVIDIA\u2019s \u201cSafety in the Loop\u201d livestream:<\/p>\n<div class=\"ast-oembed-container \" style=\"height: 100%;\"><iframe title=\"Safety in the Loop: Advancing Autonomous Vehicle Safety Validation Through Simulation\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/930a2dYO32U?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" 
referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/div>\n<p>&nbsp;<\/p>\n<p>These innovations are complemented by a new, open-source NVIDIA Omniverse NuRec Fixer, a Cosmos-based model trained on AV data that removes artifacts in neural reconstructions to produce higher-quality SimReady assets.<\/p>\n<p>To align these advances with rigorous global standards, the\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/ai-trust-center\/physical-ai\/safety-certification\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Halos AI Systems Inspection Lab<\/a>\u00a0\u2014 accredited by ANAB \u2014 provides impartial inspection and certification of Halos elements across robotaxi fleets, AV stacks, sensors and manufacturer platforms through the\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/ai-trust-center\/physical-ai\/safety-certification\/\" target=\"_blank\" rel=\"noopener\">Halos Certification Program<\/a>.<\/p>\n<p><strong>AV Ecosystem Leaders Putting Physical AI Safety to Work<\/strong><\/p>\n<p><a href=\"https:\/\/us.bosch-press.com\/pressportal\/us\/en\/press-release-28736.html\" target=\"_blank\" rel=\"noopener\">Bosch<\/a>,\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/solutions\/autonomous-vehicles\/partners\/nuro\/\" target=\"_blank\" rel=\"noopener\">Nuro<\/a>\u00a0and\u00a0<a href=\"https:\/\/wayve.ai\/thinking\/wayve-gen-3\/\" target=\"_blank\" rel=\"noopener\">Wayve<\/a>\u00a0are among the first participants in the NVIDIA Halos AI Systems Inspection Lab, which aims to accelerate the safe, large-scale deployment of robotaxi fleets. 
Onsemi, which makes sensor systems for AVs, industrial automation and medical applications, has recently become the first company to pass inspection by the NVIDIA Halos AI Systems Inspection Lab.<\/p>\n<div class=\"ast-oembed-container \" style=\"height: 100%;\"><iframe title=\"How Robotaxis Are Gaining Ground to Make Streets Safer\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/gIkf870-GNc?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/div>\n<p>&nbsp;<\/p>\n<p>The open-source\u00a0<a href=\"https:\/\/carla.org\/\" target=\"_blank\" rel=\"noopener\">CARLA<\/a>\u00a0simulator integrates\u00a0<a href=\"https:\/\/developer.nvidia.com\/blog\/accelerating-av-simulation-with-neural-reconstruction-and-world-foundation-models\/\" target=\"_blank\" rel=\"noopener\">NVIDIA NuRec and Cosmos Transfer<\/a>\u00a0to generate reconstructed drives and diverse scenario variations, while\u00a0<a href=\"https:\/\/voxel51.com\/\" target=\"_blank\" rel=\"noopener\">Voxel51<\/a>\u2019s FiftyOne engine, linked to Cosmos Dataset Search, NuRec and Cosmos Transfer, helps teams curate, annotate and evaluate multimodal datasets across the AV pipeline.<\/p>\n<div class=\"ast-oembed-container \" style=\"height: 100%;\"><iframe title=\"Real World Driving With NVIDIA Omniverse NuRec\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/ydreCGxFDXs?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/div>\n<p>&nbsp;<\/p>\n<p>Mcity at the University of Michigan is enhancing the digital twin of its\u00a0<a 
href=\"https:\/\/mcity.umich.edu\/mcity-enhances-digital-twin-of-av-test-facility-with-nvidia-omniverse\/\" target=\"_blank\" rel=\"noopener\">32-acre AV test facility<\/a>\u00a0using Omniverse libraries and technologies. The team is integrating the NVIDIA Blueprint for AV simulation and Omniverse Sensor RTX application programming interfaces to create physics-based models of camera, lidar, radar and ultrasonic sensors.<\/p>\n<p>By aligning real sensor recordings with high-fidelity simulated data and sharing assets openly, Mcity enables safe, repeatable testing of rare and hazardous driving scenarios before vehicles operate on public roads.<\/p>\n<h2>Get Plugged Into the World of OpenUSD and Physical AI Safety<\/h2>\n<p>Learn more about OpenUSD, NVIDIA Halos and physical AI safety by exploring these resources:<\/p>\n<ul>\n<li><b>Watch\u00a0<\/b>the on-demand NVIDIA GTC session, \u201c<a href=\"https:\/\/www.nvidia.com\/en-us\/on-demand\/session\/gtcdc25-dc51172\/\" target=\"_blank\" rel=\"noopener\">Reconstructing Reality: Simulating Indoor and Outdoor Environments for Physical AI<\/a>.\u201d<\/li>\n<li><b>Visit\u00a0<\/b>the\u00a0<a href=\"https:\/\/www.nvidia.com\/en-us\/ai-trust-center\/physical-ai\/safety-certification\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Halos AI Systems Inspection Lab<\/a>\u00a0webpage.<\/li>\n<li><b>Follow\u00a0<\/b>the NVIDIA DRIVE LinkedIn newsletter: \u201c<a href=\"https:\/\/www.linkedin.com\/newsletters\/nvidia-safety-in-the-loop-7376030785146904576\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Safety in the Loop<\/a>.\u201d<\/li>\n<li><b>Read\u00a0<\/b>the corporate blog explainer:\u00a0<a href=\"https:\/\/blogs.nvidia.com\/blog\/level-4-autonomous-driving-ai\/\">How AI Is Unlocking Level 4 Autonomy<\/a>.<\/li>\n<li><b>Get started\u00a0<\/b>with the\u00a0<a href=\"https:\/\/docs.nvidia.com\/learn-openusd\/latest\/index.html\" target=\"_blank\" rel=\"noopener\">Learn OpenUSD curriculum<\/a>, now open 
source.<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<p><a href=\"https:\/\/blogs.nvidia.com\/blog\/author\/kburke\/\">Katie Washabaugh<\/a>, Product Marketing Manager for Autonomous Vehicle Simulation, NVIDIA<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This blog post was originally published at NVIDIA\u2019s website. It is reprinted here with the permission of NVIDIA. NVIDIA Editor\u2019s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse. New NVIDIA safety [&hellip;]<\/p>\n","protected":false},"author":15833,"featured_media":56609,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"content-type":"","_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"default","adv-header-id-meta":"","stick-header-meta":"default","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"set","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[763,3,800,765,773,774],"tags":[],"class_list":["post-56608","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-automotive","category-blog","category-nvidia","category-robotics","category-software","category-tools"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.8 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems - Edge AI and Vision Alliance<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems - Edge AI and Vision Alliance\" \/>\n<meta property=\"og:description\" content=\"This blog post was originally published at NVIDIA\u2019s website. It is reprinted here with the permission of NVIDIA. NVIDIA Editor\u2019s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse. 
New NVIDIA safety [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" \/>\n<meta property=\"og:site_name\" content=\"Edge AI and Vision Alliance\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/EdgeAIVision\/\" \/>\n<meta property=\"article:published_time\" content=\"2026-02-09T09:00:59+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1280\" \/>\n\t<meta property=\"og:image:height\" content=\"680\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"pigzippa47\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@edgeaivision\" \/>\n<meta name=\"twitter:site\" content=\"@edgeaivision\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"pigzippa47\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\"},\"author\":{\"name\":\"pigzippa47\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#\/schema\/person\/c34c467177decc0866478bad524d50af\"},\"headline\":\"Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems\",\"datePublished\":\"2026-02-09T09:00:59+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\"},\"wordCount\":1077,\"publisher\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png\",\"articleSection\":[\"Automotive\",\"Blog Posts\",\"NVIDIA\",\"Robotics\",\"Software\",\"Tools\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\",\"url\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\",\"name\":\"Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems - 
Edge AI and Vision Alliance\",\"isPartOf\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png\",\"datePublished\":\"2026-02-09T09:00:59+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/#primaryimage\",\"url\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png\",\"contentUrl\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png\",\"width\":1280,\"height\":680},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.edge-ai-vision.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Into the Omniverse: OpenUSD and NVIDIA Halos 
Accelerate Safety for Robotaxis, Physical AI Systems\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#website\",\"url\":\"https:\/\/www.edge-ai-vision.com\/\",\"name\":\"Edge AI and Vision Alliance\",\"description\":\"Designing machines that perceive and understand.\",\"publisher\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.edge-ai-vision.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#organization\",\"name\":\"Edge AI and Vision Alliance\",\"url\":\"https:\/\/www.edge-ai-vision.com\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/1200x675header_edgeai_vision.jpg\",\"contentUrl\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/1200x675header_edgeai_vision.jpg\",\"width\":1200,\"height\":675,\"caption\":\"Edge AI and Vision Alliance\"},\"image\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/EdgeAIVision\/\",\"https:\/\/x.com\/edgeaivision\",\"https:\/\/www.linkedin.com\/company\/edgeaivision\/\",\"http:\/\/www.youtube.com\/embeddedvision\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#\/schema\/person\/c34c467177decc0866478bad524d50af\",\"name\":\"pigzippa47\",\"url\":\"https:\/\/www.edge-ai-vision.com\/author\/pigzippa47\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems - Edge AI and Vision Alliance","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/","og_locale":"en_US","og_type":"article","og_title":"Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems - Edge AI and Vision Alliance","og_description":"This blog post was originally published at NVIDIA\u2019s website. It is reprinted here with the permission of NVIDIA. NVIDIA Editor\u2019s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse. New NVIDIA safety [&hellip;]","og_url":"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/","og_site_name":"Edge AI and Vision Alliance","article_publisher":"https:\/\/www.facebook.com\/EdgeAIVision\/","article_published_time":"2026-02-09T09:00:59+00:00","og_image":[{"width":1280,"height":680,"url":"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png","type":"image\/png"}],"author":"pigzippa47","twitter_card":"summary_large_image","twitter_creator":"@edgeaivision","twitter_site":"@edgeaivision","twitter_misc":{"Written by":"pigzippa47","Est. 
Published February 9, 2026, by pigzippa47 on the Edge AI and Vision Alliance website. Categories: Automotive, Blog Posts, NVIDIA, Robotics, Software, Tools. Reading time: 5 minutes.