
Simultaneous Localization and Mapping (SLAM) Systems Integration in 2025: Unleashing the Next Wave of Autonomous Solutions. Explore How Advanced Integration is Reshaping Robotics, Automotive, and Beyond.
- Executive Summary: Key Trends and Market Drivers in SLAM Integration
- Market Size and Forecast (2025–2030): Growth Projections and Revenue Analysis
- Core Technologies: Advances in Visual, Lidar, and Sensor Fusion SLAM
- Integration Challenges: Interoperability, Scalability, and Real-Time Processing
- Industry Applications: Robotics, Automotive, Drones, and AR/VR
- Competitive Landscape: Leading Companies and Strategic Partnerships
- Regional Insights: North America, Europe, Asia-Pacific, and Emerging Markets
- Regulatory and Standards Landscape: Compliance and Industry Initiatives
- Innovation Pipeline: R&D, Patents, and Next-Gen SLAM Solutions
- Future Outlook: Market Opportunities, Risks, and Strategic Recommendations
- Sources & References
Executive Summary: Key Trends and Market Drivers in SLAM Integration
Simultaneous Localization and Mapping (SLAM) systems integration is rapidly advancing as a foundational technology for robotics, autonomous vehicles, augmented reality (AR), and industrial automation. In 2025, the sector is characterized by a convergence of hardware and software innovation, driven by the need for robust, real-time spatial awareness in increasingly complex environments. Key trends include the adoption of multi-sensor fusion, edge computing, and AI-enhanced algorithms, all of which are accelerating the deployment and scalability of SLAM solutions across industries.
A major driver is the automotive sector’s push toward higher levels of autonomy. Leading automotive suppliers such as Bosch and Continental are integrating SLAM into advanced driver-assistance systems (ADAS) and autonomous vehicle platforms, leveraging sensor fusion from LiDAR, radar, and cameras to enable precise localization and mapping in real time. These integrations are essential for safe navigation in dynamic urban environments and are being validated in pilot deployments and commercial rollouts.
In robotics, companies like Boston Dynamics and ABB are embedding SLAM into mobile robots and industrial automation systems. The focus is on enabling robots to operate autonomously in unstructured and changing environments, such as warehouses and manufacturing floors. The integration of SLAM with AI-driven perception and planning modules is reducing setup times and increasing operational flexibility, supporting the trend toward “lights-out” factories and 24/7 logistics operations.
The AR and mixed reality sector is also a significant adopter, with technology leaders like Microsoft (HoloLens) and Apple (Vision Pro) incorporating SLAM to deliver persistent, spatially anchored digital content. These systems rely on real-time mapping and localization to provide seamless user experiences, and their success is driving further investment in miniaturized, power-efficient SLAM hardware and software stacks.
Looking ahead, the next few years will see increased standardization and interoperability efforts, as industry bodies and consortia work to ensure that SLAM systems can be integrated across diverse platforms and ecosystems. The proliferation of 5G and edge AI is expected to further enhance SLAM performance, enabling distributed processing and real-time data sharing between devices. As a result, SLAM integration is poised to become a ubiquitous enabler for spatial computing, smart infrastructure, and autonomous systems worldwide.
Market Size and Forecast (2025–2030): Growth Projections and Revenue Analysis
The global market for Simultaneous Localization and Mapping (SLAM) systems integration is poised for robust growth between 2025 and 2030, driven by accelerating adoption across robotics, autonomous vehicles, augmented reality (AR), and industrial automation. As of 2025, the integration of SLAM technologies is increasingly recognized as a critical enabler for real-time spatial awareness and navigation in dynamic environments, underpinning advancements in both consumer and enterprise applications.
Key industry players such as Intel Corporation, NVIDIA Corporation, and Qualcomm Incorporated are investing heavily in SLAM-enabled hardware and software platforms. These companies are leveraging their expertise in high-performance computing, AI, and sensor fusion to deliver scalable SLAM solutions for integration into robotics, drones, and AR/VR devices. For instance, Intel Corporation continues to expand its RealSense product line, which incorporates visual and depth sensing for SLAM-based navigation, while NVIDIA Corporation is advancing its Jetson edge AI platform to support real-time SLAM processing for autonomous machines.
The automotive sector is a major growth driver, with SLAM integration becoming essential for next-generation Advanced Driver Assistance Systems (ADAS) and fully autonomous vehicles. Companies such as Robert Bosch GmbH and Continental AG are actively developing SLAM-based perception modules to enhance vehicle localization and mapping capabilities, supporting safer and more reliable navigation in complex urban environments.
In the industrial domain, SLAM systems are being integrated into automated guided vehicles (AGVs) and mobile robots for logistics, warehousing, and manufacturing. ABB Ltd and OMRON Corporation are notable for deploying SLAM-powered robotics solutions that navigate without fixed infrastructure such as magnetic tape or reflectors, enabling flexible deployment and real-time adaptation to changing floor layouts.
From a revenue perspective, the SLAM systems integration market is projected to experience a compound annual growth rate (CAGR) in the double digits through 2030, with total market value expected to reach several billion USD by the end of the forecast period. This growth is underpinned by increasing demand for autonomous systems, the proliferation of IoT devices, and ongoing advancements in sensor technology and edge computing. The outlook for 2025–2030 suggests continued expansion, with new entrants and established technology leaders alike driving innovation and adoption across diverse sectors.
Core Technologies: Advances in Visual, Lidar, and Sensor Fusion SLAM
The integration of Simultaneous Localization and Mapping (SLAM) systems has become a cornerstone for autonomous robotics, automotive, and augmented reality (AR) applications. As of 2025, the convergence of visual, lidar, and sensor fusion SLAM technologies is accelerating, driven by the need for robust, real-time mapping and localization in increasingly complex environments.
Visual SLAM, leveraging camera-based perception, remains a dominant approach due to its cost-effectiveness and high spatial resolution. Companies such as Intel and NVIDIA have advanced visual SLAM through dedicated hardware accelerators and AI-powered algorithms, enabling edge devices to process visual data with low latency. These advancements are particularly evident in AR headsets and mobile robotics, where real-time performance and power efficiency are critical.
Lidar-based SLAM, on the other hand, offers superior depth accuracy and robustness in low-light or feature-poor environments. Industry leaders like Velodyne Lidar and Ouster have introduced compact, high-resolution lidar sensors that are increasingly being integrated into autonomous vehicles and industrial robots. The trend in 2025 is toward hybrid systems that combine lidar with visual and inertial sensors, leveraging the strengths of each modality.
Sensor fusion SLAM represents the forefront of integration, combining data from cameras, lidars, IMUs (Inertial Measurement Units), and sometimes radar to achieve higher accuracy and reliability. Bosch and Honeywell are notable for their work in multi-sensor fusion platforms, targeting automotive and aerospace sectors. These systems use advanced algorithms to reconcile discrepancies between sensor inputs, providing robust localization even in challenging conditions such as dynamic urban environments or GPS-denied areas.
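The core idea behind reconciling discrepant sensor inputs can be illustrated with a minimal sketch. The snippet below is not any vendor's implementation; it shows inverse-variance weighting of two independent Gaussian position estimates, the statistical building block underlying many fusion back ends (a full system would fuse full pose states with covariance matrices rather than a single scalar variance).

```python
import numpy as np

def fuse_estimates(mu_a, var_a, mu_b, var_b):
    """Fuse two independent Gaussian position estimates
    (e.g., one from visual SLAM, one from lidar SLAM) by
    inverse-variance weighting. Scalar variances are a
    simplification for illustration; real systems carry
    full covariance matrices."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    mu = (w_a * np.asarray(mu_a) + w_b * np.asarray(mu_b)) / (w_a + w_b)
    var = 1.0 / (w_a + w_b)
    return mu, var

# Camera-based estimate: noisier in depth, so higher variance
cam_pos, cam_var = np.array([2.10, 0.95]), 0.04
# Lidar estimate: tighter range accuracy, so lower variance
lidar_pos, lidar_var = np.array([2.00, 1.00]), 0.01

fused, fused_var = fuse_estimates(cam_pos, cam_var, lidar_pos, lidar_var)
# The fused estimate sits closer to the lower-variance lidar reading,
# and its variance is smaller than either input's.
```

Note how the fused variance (0.008) is smaller than either input variance: combining modalities does not just average them, it genuinely reduces uncertainty, which is why hybrid systems outperform single-sensor SLAM in challenging conditions.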
A key enabler for these integrated SLAM systems is the development of standardized interfaces and middleware, allowing seamless communication between heterogeneous sensors and processing units. The Robotic Industries Association (now part of the Association for Advancing Automation) and other industry bodies are actively promoting interoperability standards, which are expected to mature further in the next few years.
Looking ahead, the outlook for SLAM systems integration is marked by increasing adoption of AI-driven sensor fusion, miniaturization of hardware, and the proliferation of open-source frameworks. As edge computing capabilities expand, more devices will be able to perform complex SLAM tasks locally, reducing reliance on cloud processing and enabling new applications in consumer electronics, logistics, and smart infrastructure.
Integration Challenges: Interoperability, Scalability, and Real-Time Processing
The integration of Simultaneous Localization and Mapping (SLAM) systems into diverse platforms—ranging from autonomous vehicles to industrial robots and consumer electronics—faces persistent challenges in interoperability, scalability, and real-time processing. As of 2025, these challenges are at the forefront of research and commercial deployment, with industry leaders and standards bodies actively seeking robust solutions.
Interoperability remains a significant hurdle due to the proliferation of proprietary SLAM algorithms and hardware architectures. Many manufacturers, such as Intel and NVIDIA, offer specialized hardware (e.g., RealSense depth cameras, Jetson edge AI modules) optimized for their own SLAM stacks. However, the lack of standardized data formats and APIs complicates integration across multi-vendor systems. Efforts by organizations like the Open Source Robotics Foundation (maintainers of ROS) are driving the adoption of middleware and open interfaces, but full interoperability—especially in safety-critical applications—remains elusive. In 2025, cross-platform compatibility is a key focus, with new initiatives aiming to harmonize sensor data streams and map representations.
Scalability is another pressing concern as SLAM systems are deployed in increasingly complex and expansive environments. For instance, warehouse automation providers such as ABB and Bosch are scaling SLAM solutions to support fleets of robots operating over large facilities. This requires distributed mapping, efficient data fusion, and robust cloud-edge coordination. The challenge intensifies when integrating heterogeneous sensor suites (LiDAR, cameras, IMUs) and managing the computational load across multiple agents. In 2025, scalable SLAM architectures are leveraging edge AI accelerators and cloud-based map management, but seamless scaling across thousands of devices is still an active area of development.
Real-time processing is critical for applications such as autonomous driving and augmented reality, where latency directly impacts safety and user experience. Companies like Qualcomm and Apple are embedding dedicated SLAM accelerators in their chipsets to achieve low-latency, high-throughput performance. However, real-time constraints are exacerbated by the need to process high-bandwidth sensor data and perform complex optimization on resource-constrained platforms. In 2025, advances in hardware acceleration, algorithmic efficiency, and software optimization are narrowing the gap, but achieving robust, real-time SLAM in all conditions—especially in dynamic or unstructured environments—remains a technical frontier.
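The real-time constraint described above can be stated very simply: the summed per-frame cost of every pipeline stage must fit within one sensor period. The sketch below uses illustrative, made-up stage timings (not measurements from any real system) to show the budget arithmetic that SLAM integrators work through when targeting a given sensor rate.

```python
def fits_budget(stage_ms: dict, sensor_rate_hz: float) -> bool:
    """Check whether the per-frame cost of a sequential SLAM
    pipeline fits within one sensor period. Hypothetical model:
    real systems overlap stages across cores and accelerators."""
    budget_ms = 1000.0 / sensor_rate_hz
    return sum(stage_ms.values()) <= budget_ms

# Illustrative timings for a 30 Hz camera pipeline (budget ~33.3 ms)
pipeline = {
    "feature_extraction": 8.0,
    "tracking": 5.0,
    "local_optimization": 15.0,
    "map_update": 4.0,
}
ok_30hz = fits_budget(pipeline, 30.0)   # 32.0 ms total: fits
ok_60hz = fits_budget(pipeline, 60.0)   # budget ~16.7 ms: does not fit
```

The example also shows why hardware acceleration matters: doubling the sensor rate halves the budget, so the same pipeline that comfortably meets 30 Hz fails at 60 Hz unless stages are offloaded or pipelined across compute units.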
Looking ahead, the next few years will likely see increased collaboration between hardware vendors, robotics platforms, and standards organizations to address these integration challenges. The convergence of open-source frameworks, standardized APIs, and heterogeneous computing will be pivotal in enabling scalable, interoperable, and real-time SLAM deployments across industries.
Industry Applications: Robotics, Automotive, Drones, and AR/VR
Simultaneous Localization and Mapping (SLAM) systems have become foundational technologies across several high-growth industries, including robotics, automotive, drones, and augmented/virtual reality (AR/VR). As of 2025, the integration of SLAM is accelerating, driven by advances in sensor technology, edge computing, and artificial intelligence. The convergence of these technologies is enabling more robust, real-time mapping and localization capabilities, which are critical for autonomous operation and immersive user experiences.
In robotics, SLAM is essential for autonomous navigation in dynamic and unstructured environments. Leading robotics manufacturers such as Bosch and ABB are integrating advanced SLAM algorithms into their mobile robots and industrial automation platforms. These systems leverage multi-modal sensor fusion—combining LiDAR, cameras, and inertial measurement units (IMUs)—to achieve centimeter-level accuracy in mapping and localization. The trend is toward edge-based SLAM processing, reducing latency and improving reliability for real-time applications on factory floors and in logistics centers.
The automotive sector is witnessing rapid adoption of SLAM, particularly in the context of advanced driver-assistance systems (ADAS) and autonomous vehicles. Companies like Tesla and Toyota Motor Corporation are investing in SLAM-based perception stacks to enhance vehicle localization, lane detection, and obstacle avoidance. Integration with high-definition maps and vehicle-to-everything (V2X) communication is expected to further improve the robustness of SLAM in complex urban environments. The next few years will likely see SLAM systems becoming standard in new vehicle platforms, supporting both partial and full autonomy.
In the drone industry, SLAM is enabling precise navigation and mapping in GPS-denied environments, such as indoors or under dense foliage. Companies like DJI are equipping their commercial and enterprise drones with visual-inertial SLAM systems, allowing for autonomous inspection, surveying, and delivery missions. The integration of lightweight, power-efficient SLAM modules is a key focus, as it extends flight times and operational range. Regulatory developments and the need for safe, reliable autonomous flight are expected to drive further innovation in SLAM integration for drones through 2025 and beyond.
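The "visual-inertial" pairing mentioned above works because IMU data fills the gaps between camera frames: between keyframes, the system dead-reckons by integrating accelerations, and the camera later corrects the accumulated drift. The sketch below is a deliberately simplified Euler integration (gravity already removed, no orientation tracking, no bias estimation), not a production visual-inertial front end.

```python
import numpy as np

def imu_predict(pos, vel, accels, dt):
    """Dead-reckon position and velocity between camera keyframes
    by integrating accelerations (gravity-compensated). Simplified
    for illustration: real visual-inertial SLAM also tracks
    orientation and estimates IMU biases."""
    pos = np.asarray(pos, dtype=float)
    vel = np.asarray(vel, dtype=float)
    for a in accels:
        a = np.asarray(a, dtype=float)
        pos = pos + vel * dt + 0.5 * a * dt * dt
        vel = vel + a * dt
    return pos, vel

# Hover, then accelerate forward at 1 m/s^2 for 0.1 s
# (10 IMU samples at 100 Hz)
p, v = imu_predict([0.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                   [[1.0, 0.0, 0.0]] * 10, dt=0.01)
# Matches the closed form: x = 0.5 * a * t^2 = 0.005 m, v = 0.1 m/s
```

Because IMU integration drifts quadratically in position, the camera corrections are what keep the estimate bounded, which is why power-efficient SLAM modules matter: the vision pipeline cannot simply be throttled off to save battery.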
AR/VR applications are also benefiting from SLAM, which underpins real-time tracking and environmental understanding for immersive experiences. Technology leaders such as Microsoft (with HoloLens) and Meta Platforms, Inc. (with Quest devices) are advancing inside-out tracking using SLAM, enabling users to interact seamlessly with digital content anchored in the physical world. The outlook for 2025 includes more compact, power-efficient SLAM solutions, supporting untethered AR/VR devices and expanding use cases in enterprise training, remote collaboration, and entertainment.
Overall, the integration of SLAM systems is set to deepen across robotics, automotive, drones, and AR/VR, with industry leaders investing in scalable, real-time, and robust solutions. The next few years will likely see SLAM become a ubiquitous enabler of autonomy and spatial computing, with ongoing improvements in accuracy, efficiency, and interoperability.
Competitive Landscape: Leading Companies and Strategic Partnerships
The competitive landscape for Simultaneous Localization and Mapping (SLAM) systems integration in 2025 is characterized by a dynamic interplay between established technology giants, specialized robotics firms, and emerging startups. As SLAM becomes increasingly central to autonomous vehicles, robotics, augmented reality (AR), and industrial automation, companies are leveraging strategic partnerships and acquisitions to accelerate innovation and market reach.
Among the most prominent players, Intel Corporation continues to advance its RealSense technology, integrating SLAM algorithms for depth sensing and spatial awareness in robotics and AR applications. Intel’s collaborations with robotics manufacturers and software developers have enabled the deployment of robust SLAM solutions in logistics, warehouse automation, and consumer robotics.
Similarly, NVIDIA Corporation has solidified its position through its Jetson platform, which provides high-performance computing for edge devices running SLAM workloads. NVIDIA’s partnerships with robotics companies and its support for open-source SLAM frameworks have fostered a vibrant ecosystem, enabling rapid prototyping and deployment of autonomous systems across industries.
In the mobile and AR sector, Apple Inc. and Google LLC are integrating advanced SLAM capabilities into their respective ARKit and ARCore platforms. These frameworks empower developers to create immersive AR experiences by providing real-time mapping and localization on consumer devices. Both companies are investing in proprietary SLAM technologies and collaborating with hardware partners to enhance spatial computing capabilities.
Specialized robotics firms such as Boston Dynamics and Clearpath Robotics are incorporating SLAM systems into their autonomous mobile robots (AMRs) for industrial and research applications. These companies often partner with sensor manufacturers and software providers to optimize SLAM integration for specific operational environments, such as warehouses, factories, and outdoor terrains.
Emerging startups are also shaping the competitive landscape. Companies like SLAMcore focus exclusively on developing robust, hardware-agnostic SLAM software, targeting robotics, drones, and AR/VR markets. Strategic alliances with sensor manufacturers and robotics OEMs are enabling these startups to scale their solutions and address diverse market needs.
Looking ahead, the next few years are expected to see intensified collaboration between hardware and software providers, with a focus on interoperability, real-time performance, and edge computing. The integration of AI-driven perception and sensor fusion will further differentiate leading SLAM solutions, as companies seek to address the demands of increasingly complex autonomous systems.
Regional Insights: North America, Europe, Asia-Pacific, and Emerging Markets
The integration of Simultaneous Localization and Mapping (SLAM) systems is experiencing significant regional differentiation, shaped by technological maturity, industrial priorities, and government initiatives across North America, Europe, Asia-Pacific, and emerging markets. As of 2025, these regions are advancing SLAM adoption in robotics, autonomous vehicles, augmented reality (AR), and industrial automation, with distinct trajectories and leading players.
North America remains at the forefront of SLAM systems integration, driven by robust investments in autonomous vehicles, robotics, and AR/VR. Major technology companies such as Apple Inc. and Microsoft Corporation are embedding SLAM into consumer devices and enterprise solutions, notably in AR platforms and mixed reality headsets. The automotive sector, led by firms like Tesla, Inc. and Ford Motor Company, is leveraging SLAM for advanced driver-assistance systems (ADAS) and full autonomy. Additionally, North American robotics companies, including Boston Dynamics, are integrating SLAM for navigation in dynamic environments. The region benefits from a strong ecosystem of research institutions and government support for AI and robotics, accelerating SLAM innovation.
Europe is characterized by a focus on industrial automation, smart manufacturing, and collaborative robotics. Companies such as Robert Bosch GmbH and SICK AG are deploying SLAM in industrial robots and automated guided vehicles (AGVs) for logistics and manufacturing. The European automotive sector, with players like Volkswagen AG and BMW AG, is integrating SLAM for in-vehicle navigation and safety systems. The European Union’s emphasis on digital transformation and Industry 4.0 is fostering cross-sector SLAM adoption, with collaborative projects and regulatory frameworks supporting interoperability and safety.
Asia-Pacific is witnessing rapid SLAM integration, propelled by large-scale manufacturing, smart city initiatives, and consumer electronics. Chinese technology giants such as Huawei Technologies Co., Ltd. and DJI are embedding SLAM in smartphones, drones, and robotics. Japanese firms like Panasonic Corporation and Hitachi, Ltd. are advancing SLAM in industrial and service robots. South Korea’s Samsung Electronics is also active in integrating SLAM for mobile devices and smart appliances. The region’s strong manufacturing base and government-backed digital infrastructure projects are expected to drive further SLAM adoption through 2025 and beyond.
Emerging markets in Latin America, the Middle East, and Africa are gradually adopting SLAM, primarily in logistics, agriculture, and infrastructure monitoring. While local integration is at an earlier stage, partnerships with global technology providers and increasing investment in automation are expected to accelerate SLAM deployment in the coming years.
Looking ahead, regional collaboration, open-source initiatives, and standardization efforts are likely to shape the global SLAM landscape, with North America, Europe, and Asia-Pacific leading innovation and emerging markets closing the gap through technology transfer and localized solutions.
Regulatory and Standards Landscape: Compliance and Industry Initiatives
The regulatory and standards landscape for Simultaneous Localization and Mapping (SLAM) systems integration is rapidly evolving as these technologies become increasingly central to robotics, autonomous vehicles, and industrial automation. In 2025, the focus is on harmonizing safety, interoperability, and data integrity requirements across diverse application domains, with both governmental and industry-led initiatives shaping the compliance environment.
A key driver is the growing deployment of SLAM in safety-critical sectors such as autonomous driving, where regulatory bodies are intensifying scrutiny. The United Nations Economic Commission for Europe (UNECE) continues to update its vehicle regulations, including those related to automated lane keeping and environmental perception, which directly impact SLAM-based navigation systems. In parallel, the International Organization for Standardization (ISO) is advancing standards such as ISO 3691-4 for driverless industrial trucks and ISO 22737 for low-speed automated driving systems, both of which reference localization and mapping requirements.
Industry consortia are also playing a pivotal role. The Open Source Robotics Foundation (Open Robotics), steward of the Robot Operating System (ROS), is collaborating with manufacturers and research institutions to define open interfaces and data formats for SLAM modules, promoting interoperability and reducing vendor lock-in. Meanwhile, the Autonomous Vehicle Computing Consortium (AVCC), which includes major automotive and semiconductor companies, is working on reference architectures that specify SLAM integration points and performance benchmarks for automotive-grade systems.
In the realm of indoor and industrial automation, the ODVA and PI (PROFIBUS & PROFINET International) are updating their communication standards to better support real-time SLAM data exchange between mobile robots and factory infrastructure. These updates are expected to be reflected in new device profiles and certification programs by 2026, facilitating smoother integration and compliance for solution providers.
Data privacy and cybersecurity are also gaining prominence, especially as SLAM systems increasingly rely on cloud-based processing and data sharing. The European Telecommunications Standards Institute (ETSI) is developing guidelines for secure data handling in connected robotics, while the National Electrical Manufacturers Association (NEMA) is addressing safety and reliability in sensor and actuator networks that underpin SLAM.
Looking ahead, the convergence of regulatory and industry standards is expected to accelerate, with cross-sectoral working groups and pilot certification schemes emerging by 2027. This will likely result in more robust compliance frameworks, fostering trust and wider adoption of SLAM-integrated systems across transportation, logistics, and smart infrastructure.
Innovation Pipeline: R&D, Patents, and Next-Gen SLAM Solutions
The innovation pipeline for Simultaneous Localization and Mapping (SLAM) systems integration is accelerating rapidly in 2025, driven by advances in sensor fusion, edge computing, and artificial intelligence. SLAM, a foundational technology for autonomous vehicles, robotics, and augmented reality, is seeing a surge in R&D investment from both established industry leaders and emerging startups. The focus is on developing robust, real-time mapping and localization solutions that can operate reliably in complex, dynamic environments.
Major technology companies are at the forefront of this innovation. Intel Corporation continues to invest in RealSense depth cameras and associated SLAM algorithms, targeting robotics and industrial automation. Their open-source SLAM libraries and hardware-accelerated solutions are being integrated into a growing number of commercial platforms. Similarly, NVIDIA Corporation is leveraging its GPU and AI expertise to enable real-time SLAM on edge devices, with its Isaac robotics platform supporting advanced sensor fusion and deep learning-based mapping.
Automotive suppliers such as Robert Bosch GmbH and Continental AG are pushing the boundaries of SLAM for autonomous driving. Bosch, for example, is developing multi-modal SLAM systems that combine LiDAR, radar, and camera data to enhance vehicle localization in urban and highway scenarios. These efforts are reflected in a growing portfolio of patents related to sensor fusion and real-time map updating, as seen in recent filings with the European Patent Office and the United States Patent and Trademark Office.
In the consumer electronics and AR/VR space, Apple Inc. and Microsoft Corporation are integrating next-generation SLAM into their devices. Apple’s ARKit and Microsoft’s HoloLens both rely on proprietary SLAM algorithms for spatial awareness, with ongoing R&D aimed at improving accuracy, power efficiency, and multi-user experiences. These companies are also active in patenting novel SLAM techniques, particularly those involving neural network-based feature extraction and cloud-based map sharing.
Looking ahead, the next few years will see SLAM systems become more collaborative and cloud-connected. Industry consortia such as AUTOSAR are working on interoperability standards for SLAM data exchange, which will be critical for vehicle-to-everything (V2X) applications and smart city infrastructure. Meanwhile, startups are exploring edge-cloud hybrid SLAM architectures, enabling lightweight devices to offload heavy computation while maintaining real-time performance.
Overall, the SLAM innovation pipeline in 2025 is characterized by cross-industry collaboration, a surge in patent activity, and the emergence of scalable, AI-powered solutions poised to transform robotics, mobility, and immersive computing in the coming years.
Future Outlook: Market Opportunities, Risks, and Strategic Recommendations
The integration of Simultaneous Localization and Mapping (SLAM) systems is poised for significant expansion in 2025 and the following years, driven by rapid advancements in robotics, autonomous vehicles, augmented reality (AR), and industrial automation. As SLAM technologies mature, their deployment is broadening from research and niche applications to large-scale commercial and industrial use, creating a dynamic landscape of opportunities and challenges.
Market Opportunities
- Autonomous Vehicles and Robotics: The automotive sector is accelerating the adoption of SLAM for advanced driver-assistance systems (ADAS) and fully autonomous vehicles. Companies such as Tesla and NVIDIA are investing heavily in sensor fusion and real-time mapping, integrating SLAM with AI to enhance navigation and safety. In robotics, Bosch and ABB are embedding SLAM into mobile robots for logistics, manufacturing, and service applications.
- Augmented and Mixed Reality: The AR sector is leveraging SLAM to enable precise spatial awareness for headsets and mobile devices. Microsoft (HoloLens) and Apple (Vision Pro) are integrating SLAM to deliver seamless, interactive experiences, while Qualcomm is providing chipsets optimized for real-time spatial computing.
- Industrial Automation and Drones: In warehouses and factories, SLAM is being used for autonomous navigation of AGVs (Automated Guided Vehicles) and drones. SICK AG and the Open Source Robotics Foundation (stewards of ROS) are key players, providing hardware and open-source software frameworks, respectively, for SLAM integration.
Risks and Challenges
- Data Security and Privacy: As SLAM systems collect and process spatial data, concerns over data security and privacy are intensifying, especially in public and consumer environments. Companies must address regulatory compliance and implement robust encryption and anonymization protocols.
- Interoperability and Standardization: The lack of universal standards for SLAM data formats and interfaces can hinder integration across platforms and devices. Industry bodies and alliances are working toward greater interoperability, but fragmentation remains a risk.
- Computational Demands: Real-time SLAM requires significant processing power, which can limit deployment on edge devices with constrained resources. Companies like NVIDIA and Qualcomm are addressing this with specialized hardware accelerators, but cost and energy efficiency remain ongoing challenges.
Strategic Recommendations
- Invest in Cross-Platform Solutions: Companies should prioritize SLAM systems that are modular and compatible with multiple hardware and software environments to maximize market reach and future-proof deployments.
- Collaborate on Standards: Active participation in industry consortia and open-source initiatives, such as those led by the Open Source Robotics Foundation, will help drive interoperability and accelerate adoption.
- Focus on Security and Compliance: Embedding security features and ensuring compliance with evolving data protection regulations will be critical for market acceptance, particularly in consumer and public sector applications.
Looking ahead, the SLAM systems integration market is set for robust growth, but success will depend on addressing technical, regulatory, and ecosystem challenges through innovation and collaboration.
Sources & References
- Bosch
- Microsoft
- Apple
- NVIDIA Corporation
- Qualcomm Incorporated
- Velodyne Lidar
- Ouster
- Honeywell
- Open Source Robotics Foundation
- Toyota Motor Corporation
- Meta Platforms, Inc.
- Google LLC
- Clearpath Robotics
- SLAMcore
- Boston Dynamics
- SICK AG
- Volkswagen AG
- Huawei Technologies Co., Ltd.
- Hitachi, Ltd.
- International Organization for Standardization
- ODVA
- PI (PROFIBUS & PROFINET International)
- National Electrical Manufacturers Association