Event-based computer vision and processing internship
The eyes of mammals, insects, and shrimp are optimized to sense their environment. At Prophesee, we design optimal eyes for machines to sense the world.
We usually respond within a week
About PROPHESEE
Prophesee is the inventor of the world’s most advanced neuromorphic vision systems. The company developed a breakthrough event-based vision approach to machine vision that enables dramatic reductions in power consumption, latency, and data processing requirements. By mimicking how the human eye and brain work, Prophesee’s patented Metavision® sensors and algorithms reveal information that is invisible to traditional frame-based sensors.
Prophesee’s technology is transforming applications across industrial automation, aerospace and defense, autonomous systems, IoT, AR/VR, and mobile.
Headquartered in Paris, Prophesee has offices in Grenoble and Shanghai.
Job background
Prophesee designs and produces a new type of bio-inspired camera that frees itself from the concept of images. Instead of gathering information at a fixed frame rate, each pixel is captured asynchronously, only when needed. This is called event-based image processing. The output is therefore extremely sparse and allows real-time processing of the information at an equivalent frequency of a kHz or more. But since the data coming from the sensor are quite different from the images used in standard vision, Prophesee is also advancing the algorithmic and machine-learning side of this new kind of machine vision. It enables its clients to build new applications mainly in automotive, virtual reality, and industrial automation.
The intern will be part of the Event Signal Processing (ESP) team, whose main objective is to design algorithms close to the pixel array. Noise filtering, flicker detection and mitigation, and bandwidth control are some of the ESP features already improving the data generated by Prophesee's commercialized sensors.
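To give a flavor of what processing "close to the pixel array" means, here is a minimal sketch of a spatio-temporal noise filter on a toy event stream. This is illustrative Python only, not Prophesee's actual ESP implementation; the event format and thresholds are assumptions:

```python
# Toy event-based noise filter: keep an event only if a neighboring pixel
# fired within a short time window before it. Isolated "noise" events have
# no such spatio-temporal support and are dropped.
# (Illustrative sketch; dt_max and radius are placeholder parameters.)

def filter_events(events, dt_max=1000, radius=1):
    """events: list of (x, y, t_us, polarity) tuples, sorted by timestamp."""
    last_t = {}  # (x, y) -> timestamp of the last event at that pixel
    kept = []
    for x, y, t, p in events:
        # An event is "supported" if any neighbor fired within dt_max.
        supported = any(
            t - last_t.get((x + dx, y + dy), -10**18) <= dt_max
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)
            if (dx, dy) != (0, 0)
        )
        if supported:
            kept.append((x, y, t, p))
        last_t[(x, y)] = t
    return kept
```

A moving edge produces clusters of correlated events that pass the filter, while an event on an isolated pixel is rejected.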
Internship details
Topic 1: Event-based focusing in the wild
Focusing an event sensor is well established in conditions where the target is highly contrasted and stable. This introduces restrictions in applications where the object of interest moves fast in cluttered environments. Optimizing the lens position is crucial to maintain high contrast on objects of interest while removing part of the background. The motion-aware data stream of event sensors contains unique information, such as depth, stability, and occlusions, that eases the focusing of the sensor. Some of these statistics can be extracted close to the sensor in the ESP processing pipeline, and it makes sense to run the auto-focusing algorithm near the sensor to exploit the low latency of events. The purpose of this internship is to evaluate and implement a full event-based auto-focusing algorithm running in the sensor ESP. Thanks to the unique hardware setups owned by Prophesee, this project will enable real-time experimentation with the proposed solution. The overall road-map of the internship is:
- implement the latest focusing algorithm from the ESP team
- optimize the algorithm for resource-constrained platforms
- organize the benchmark and evaluation procedure of the solution and the comparison with the state of the art
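One simple way to see how an event stream can drive focusing is a contrast score over accumulated event counts: in-focus edges concentrate events on few pixels, defocus spreads them out. The sketch below is a toy illustration of that idea, not the ESP team's actual algorithm; the scoring metric and event format are assumptions:

```python
# Toy event-based autofocus sketch: for each candidate lens position,
# accumulate events into a per-pixel count image and score its contrast
# (variance of the counts). The position with the highest contrast is
# taken as the best focus. (Illustrative only.)

def contrast_score(events, width, height):
    """events: list of (x, y) pixel coordinates of accumulated events."""
    counts = [0] * (width * height)
    for x, y in events:
        counts[y * width + x] += 1
    n = len(counts)
    mean = sum(counts) / n
    return sum((c - mean) ** 2 for c in counts) / n  # variance

def best_focus(events_per_position, width, height):
    """events_per_position: {lens_position: [(x, y), ...]}"""
    return max(events_per_position,
               key=lambda p: contrast_score(events_per_position[p], width, height))
```

In a real system the lens would be swept and the score evaluated online, close to the sensor, to exploit the low latency of events.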
Topic 2: Event-based biasing in active light conditions
The event-based sensor shines when it is synchronized with an illumination source: this leads to many applications such as depth estimation, eye tracking, visible light communication, SLAM, etc. The light is often pulsed at high frequencies, above a kHz, to generate unique features. This process can also be used to communicate information about the object's state (IMU data, etc.). However, such applications fail in many common use cases, and this prevents the development of products based on such solutions. The goal of this internship is to implement and improve existing algorithms to make event-based sensors robust against adversarial active-lighting conditions. These algorithms can be programmed inside the sensor, or next to it, which is mandatory in applications where latency is heavily constrained. The overall road-map of the internship is:
- implement the latest event-based algorithms to demodulate active lighting in adversarial conditions
- prototype the solution on the sensor ESP and help build sensor filters for that purpose
- improve the existing test bench used to measure and benchmark the different algorithms
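The core of demodulation with a known pulse frequency can be pictured as a phase-folding test: events triggered by the modulated source pile up at a fixed phase of the pulse period, while ambient or adversarial events spread across all phases. A minimal sketch of that test follows; the bin count and threshold are placeholder assumptions, not values from any Prophesee algorithm:

```python
# Toy phase-lock detector for events synchronized to a pulsed light source:
# fold timestamps modulo the known pulse period and check whether they
# concentrate in a single narrow phase bin. (Illustrative sketch;
# n_bins and min_peak_fraction are placeholders.)

def is_phase_locked(timestamps_us, period_us, n_bins=16, min_peak_fraction=0.5):
    bins = [0] * n_bins
    for t in timestamps_us:
        bins[int((t % period_us) * n_bins / period_us)] += 1
    # Phase-locked streams put most events into one bin of the histogram.
    return max(bins) / len(timestamps_us) >= min_peak_fraction
```

A hardware version of this idea can run inside or next to the sensor, keeping only the phase-locked events when latency budgets are tight.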
Topic 3: Event-based 2D features for drone navigation and embedded ML
In some robotics applications, the shape of an object is used to estimate the 3D transformation between the object and the camera (e.g., SLAM). The raw timestamp information of the event pixels can be discarded for such features, as only the spatial structure of the events is used by the algorithm. The timestamp is then mostly used to smoothly track these features and adapt them at the optimal speed. Filters were designed to extract such information inside the sensor ESP, and their output can be combined with conventional ML algorithms to enhance full-system performance. The goal of this internship is to evaluate how ML pipelines can better use the ESP spatial filters, and how including these ESP filters inside the training can lead to more efficient filter transfer functions. The overall road-map of the internship is:
- implement machine-learning pipelines using ESP 2D features and modern neuromorphic algorithms
- benchmark against existing algorithms with a specific emphasis on data/power reduction
- design the next generation of such filters inside the sensor ESP and implement the processing on an embedded platform
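A common way to turn an event stream into a 2D feature map that conventional ML models can consume is a "time surface": each pixel holds an exponential decay of the time since its last event, so per-event timestamps are absorbed into a spatial representation. The sketch below illustrates the idea; the decay constant and event format are assumptions, not ESP specifics:

```python
import math

# Toy "time surface": a 2D feature map where each pixel value decays
# exponentially with the time elapsed since that pixel's last event.
# Recently active pixels are close to 1.0, silent pixels are 0.0.
# (Illustrative sketch; tau_us is a placeholder parameter.)

def time_surface(events, width, height, t_ref, tau_us=50_000.0):
    """events: iterable of (x, y, t_us); returns a flat [0, 1] feature map."""
    last_t = [None] * (width * height)
    for x, y, t in events:
        last_t[y * width + x] = t
    return [0.0 if t is None else math.exp(-(t_ref - t) / tau_us)
            for t in last_t]
```

Maps like this can be fed to a standard classifier or regressor, which is one way ESP spatial filters and ML pipelines can be combined.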
Topic 4: Event-based stream encoding and data storage
Event-based sensors produce sparse, asynchronous streams with highly variable event-rates, which creates specific challenges for efficient data exchange, storage, and decoding. In the context of event-based standardization efforts, defining generic and efficient encoding schemes is becoming a key enabler for interoperability, long-term storage, and high-performance processing pipelines. The goal of this internship is to investigate and benchmark new encoder/decoder techniques for event-based streams, explore different data representations and compression strategies, and identify the best trade-offs depending on bandwidth, latency, complexity, and storage constraints.
A large part of the work will consist in proposing new encoding techniques, implementing prototypes, and benchmarking them on representative datasets and use cases. The internship may also involve the use of machine learning or optimization frameworks to tune encoding parameters or guide the design space exploration. This work is important for helping define future data formats for event-based data exchange and storage.
The overall road-map of the internship is:
- propose and implement generic encoding and decoding techniques for event-based streams
- design benchmarking methodologies and evaluate compression ratio, bandwidth, latency, complexity, and robustness across representative workloads
- optimize encoding techniques and parameters, and recommend the most relevant approaches for future event-based standardization and storage formats
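As a concrete example of the size trade-offs involved, monotonically increasing event timestamps can be delta-encoded as base-128 varints: small inter-event deltas fit in a single byte instead of a full 64-bit word. This is an illustrative toy format, not an actual Prophesee or standardized encoding:

```python
# Toy event-stream timestamp codec: delta-encode increasing timestamps
# as LEB128-style varints (7 payload bits per byte, high bit = "more
# bytes follow"). Dense streams with small deltas shrink dramatically
# versus raw 8-byte timestamps.

def encode_varint(n):
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # continuation bit set
        else:
            out.append(byte)
            return bytes(out)

def encode_timestamps(timestamps):
    prev, out = 0, bytearray()
    for t in timestamps:
        out += encode_varint(t - prev)  # store the delta, not the value
        prev = t
    return bytes(out)

def decode_timestamps(data):
    ts, t, shift, delta = [], 0, 0, 0
    for byte in data:
        delta |= (byte & 0x7F) << shift
        shift += 7
        if not byte & 0x80:          # last byte of this varint
            t += delta
            ts.append(t)
            delta, shift = 0, 0
    return ts
```

A real benchmark would weigh schemes like this against bandwidth, latency, decode complexity, and robustness to data loss, as described above.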
Required qualifications, experience, and skills
- Good programming skills (C/C++)
- Background in Mathematics and/or Computer Science
- Experience in Computer Vision, ML, or Robotics
- Experience with benchmarking and algorithm evaluation
- Knowledge of digital/embedded platforms or microcontrollers is a plus
- English C1 minimum
Education
Master 1 or 2
Soft skills:
Strong problem-solving and analytical skills. Flexible in dynamic environments with fast-changing technologies. Passionate about technology. Team player with a good sense of autonomy. Must be pragmatic and self-motivated to complete a task even when it lies outside the well-known realm. A "can-do" attitude is preferred.
- Department
- Hardware
- Locations
- Paris, Grenoble
- Employment type
- Internship
Visit our main website: www.prophesee.ai