Table of Contents


1 Introduction
2 Methodology of the Study
3 Entities Active in the Event-Based Vision Ecosystem
4 Patent Trend Analysis
4.1 Filing Trends
4.2 Assignee Landscape
4.3 Patenting Activities by Startups
4.4 Patent Trends Focused on Key Challenges
4.5 Patent Publications Mapped to Automotive Applications
4.5.1 Collision Avoidance
4.5.2 Monitoring of Parked Vehicles
4.5.3 Always-On Operations
4.5.4 Analysis of a Road Surface
4.5.5 In-car Installation of DVS Cameras
4.5.6 Object Detection and Classification
4.5.7 Multi-object Tracking
4.5.8 Inaccuracies Introduced by Non-event Pixels
4.5.9 LiDAR and 3D Point Cloud
4.5.10 3D Pose Estimation
4.5.11 Hardware Security
4.5.12 Edge Processing
4.5.13 Other Highlights
4.5.14 Key Takeaways
5 Competitive Landscape
5.1 Prophesee
5.2 iniVation
5.3 Insightness
5.4 Qelzal
5.5 MindTrace
5.6 CelePixel
5.7 Sunia
5.8 Austrian Institute of Technology
5.9 Samsung
5.10 Sony
5.11 Benchmarking of Commercialized and In-pipeline Event-based Vision Products
5.12 Key Takeaways
6 Projects
6.1 Project 1 – Ultra-Low Power Event-Based Camera (ULPEC)
6.2 Project 2 – The Internet of Silicon Retinas (IoSiRe): Machine to machine communications for neuromorphic vision sensing data
6.3 Project 3 – Event-Driven Compressive Vision for Multimodal Interaction with Mobile Devices (ECOMODE)
6.4 Project 4 – Convolution Address-Event-Representation (AER) Vision Architecture for Real-Time (CAVIAR)
6.5 Project 5 – Embedded Neuromorphic Sensory Processor – NeuroPsense
6.6 Project 6 – Event-Driven Morphological Computation for Embodied Systems (eMorph)
6.7 Project 7 – EB-SLAM: Event-based simultaneous localization and mapping
6.8 Project 8 – SLAMcore
7 Research Laboratories
7.1 Lab 1: Robotics and Perception Group
7.2 Lab 2: Neuroscientific System Theory (NST)
7.3 Lab 3: Perception and Robotics Lab
7.4 Lab 4: Robot Vision Group
7.5 Key Takeaways
8 Research Institutes Focusing on Event Cameras
9 Insights and Recommendations
10 Concluding Remarks
11 Acronyms
12 References