Conference day 2 - 12 March 2020


Thursday 12 March

  1. Delegate sign-in and morning refreshments

  2. Chair's opening remarks


  1. Economic trends in the global vision markets

    Ron Mueller | CEO of Vision Markets and Associate Consultant of Smithers

    This presentation will deliver findings from the continuous monitoring of major enterprises that either have image sensors integrated in their products (mobile phones, automobiles, surveillance cameras, machine vision components) or use imaging to manufacture their products (semiconductors, pharmaceuticals, food, etc.). An analysis of the manufacturing sectors and currency exchange rates of the USA, China, and the Eurozone will explain differences in economic development and provide an outlook for 2020.

  2. High Dynamic Range imaging - a practical state-of-the-art comparison

    Dana Diezemann | Strategic Product Manager of ISRA Vision AG

    The trend towards autonomous driving in the automotive industry is pushing technological advances in High Dynamic Range (HDR) imaging. Autonomous cars need to rely on their visual perception of the road course, other road users, and traffic signs at all times. Image sensors, and thus automotive cameras, often face scenes of mixed brightness, such as tunnel entries and exits, bridge underpasses, or low frontal sunlight. To capture all required image details in both very dark and very bright areas, cameras need to deliver HDR performance.

    Besides autonomous vehicles, multiple other applications can build upon the all-new designs of lenses, pixels, readout, and image signal processing in modern automotive vision, such as outdoor surveillance, the inspection of complex shapes and shiny surfaces in industrial automation, or melt-pool monitoring in 3D printing. This presentation will introduce the state of the art in HDR imaging based on a practical comparison of the latest automotive image sensors from Sony, ON Semi, OmniVision, and STMicroelectronics, showing their individual advantages in theory and real-life performance.
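    The HDR performance discussed here is usually quoted as the ratio between the largest and smallest resolvable signal, expressed in dB. As a back-of-the-envelope illustration (the figures below are generic assumptions, not specifications of any sensor named in this talk), the single-exposure dynamic range can be estimated as follows:

```python
# Back-of-the-envelope dynamic range of an image sensor in dB:
# DR = 20 * log10(full-well capacity / noise floor). The numbers below
# are generic illustrations, not specs of the sensors named in the talk.
import math

def dynamic_range_db(full_well_e: float, noise_floor_e: float) -> float:
    """Linear dynamic range of a single exposure, in dB."""
    return 20.0 * math.log10(full_well_e / noise_floor_e)

# A typical single-exposure pixel: 10,000 e- full well, 2 e- read noise.
print(round(dynamic_range_db(10_000, 2), 1))     # ~74 dB

# Automotive HDR modes combine multiple exposures to push well past
# 120 dB, e.g. an effective 3,000,000:1 signal ratio:
print(round(dynamic_range_db(3_000_000, 1), 1))  # ~129.5 dB
```

    Multi-exposure schemes, dual-gain pixels, and lateral overflow capacitors are the common routes from the first figure to the second.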

  3. An ultra-low noise Global Shutter Sensor family with embedded HDR for demanding applications

    Bernard Godet | Senior Application Engineer of PYXALIS

    In the current evolution towards computer vision and decision-making, flexible, reliable, and high-performance image sensors are key.

    We’ll present application case studies in the visible and NIR ranges, demonstrating unique capabilities in high-dynamic-scene acquisition, active-lighting operation, and exceptional image sharpness in both the colour and NIR spectral ranges.

  4. Large-format SPAD cameras

    Prof. Edoardo Charbon | Chair of VLSI of EPFL

    SPAD technology has brought new perspectives to various applications, owing to its single-photon sensitivity, picosecond photoresponse, and low dark noise. Implementing large-scale SPAD arrays with miniaturized pixels is advantageous in these applications for achieving higher spatial resolution, higher dynamic range, and more sophisticated data processing. We present the design challenges of miniaturized SPAD pixels, as well as current and future applications of large-format SPAD cameras.

    • Miniaturized SPAD pixels: analysis of corresponding scaling laws, manufacturing trends, technology and physics limitations, and experimental measurements on very small pitch SPAD pixels.
    • Large-format SPAD cameras: architectures, trends, performance evaluation and uniformity.
    • Time-of-flight applications of large-format SPAD cameras.

    Authors: Edoardo Charbon, Kazuhiro Morimoto, Claudio Bruschini

  5. Networking break


  1. Time-of-Flight is here to stay, but where is it going?

    Dr. Bernd Buxbaum | CEO/CTO, Founder of pmdtechnologies, Germany

    In his presentation, Dr Bernd Buxbaum, CEO of pmdtechnologies ag, will take a deep dive into the future developments of 3D depth sensing, with special attention to Time-of-Flight technology and the pmd principle, which he co-developed 20 years ago. After a brief look back at the early developments and breakthroughs that paved the way for today's technological advances, he will focus on three different markets: Consumer, Industrial, and Automotive. As an example, he will give a detailed overview of how the integration of 3D cameras has enhanced products and applications in these markets. Further, he will give an outlook on the future of Time-of-Flight depth sensing, technological developments, current challenges, and its potential for major markets and products.

  2. High-speed single-point dTOF sensor for industrial applications

    Matteo Perenzoni | Head of IRIS Research Unit of Fondazione Bruno Kessler (FBK)

    Single-point dTOF sensors are now the main commercial application of CMOS SPADs, having started with consumer modules for proximity sensing and autofocus. However, industrial applications require more robustness, speed, and precision. Here, a CMOS sensor for single-point dTOF is presented, embedding a procedure for on-the-fly identification of the laser spot position and a programmable ROI connected to several TDCs. It can reliably measure up to 3 m with <1.6 mm precision under 7 klux background light, at a 1 kHz rate.
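    The range and precision figures above map directly onto timing requirements for the TDCs: a dTOF sensor converts the round-trip time of a laser pulse into distance via d = c·t/2. A minimal sketch, using only the 3 m and 1.6 mm figures quoted in the abstract (everything else is illustrative):

```python
# Direct time-of-flight turns the round-trip time of a laser pulse into a
# distance: d = c * t / 2. Illustrative sketch; only the 3 m range and
# <1.6 mm precision figures come from the abstract.

C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(round_trip_s: float) -> float:
    """Target distance for a measured round-trip time."""
    return C * round_trip_s / 2.0

def distance_to_tof(distance_m: float) -> float:
    """Round-trip time the TDCs must resolve for a given distance."""
    return 2.0 * distance_m / C

# A 3 m target gives a ~20 ns round trip, and 1.6 mm depth precision
# corresponds to ~10.7 ps timing precision.
print(distance_to_tof(3.0) * 1e9)      # round-trip time in ns
print(distance_to_tof(0.0016) * 1e12)  # required timing precision in ps
```

    Resolving millimetres at kilohertz rates is thus a picosecond-scale timing problem, which is what motivates the multiple-TDC architecture described above.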

  3. Device/system technologies of a long-range, high-resolution direct Time-of-Flight (DTOF) system based on a vertical avalanche photodiode (VAPD) CMOS image sensor

    Yutaka Hirose | Manager of Panasonic Corporation

    We present a long-range (>250 m), high-resolution (10 cm) direct Time-of-Flight (DTOF) system based on a vertical avalanche photodiode (VAPD) CMOS image sensor. Starting with the device design, we describe a unique operation mode of relaxation quenching during Geiger-mode pulse generation, which enables single-photon detection with high stability, and pixel circuits that realize high-precision photon counting. We then introduce the design and performance of the DTOF system, which is based on the Sub-Ranges Synthesis (SRS) method. Its robust long-ranging capability and the sub-range manipulation techniques that directly affect the speed, depth resolution, and versatility of the system are discussed.

  4. Quanta Image Sensor characterization and denoising algorithms

    Dr. Dakota S. Robledo | Senior Image Sensor Scientist of Gigajot Technology

    The quanta image sensor (QIS) is a novel CMOS-based image sensor capable of photon counting without avalanche gain. The QIS is implemented in a state-of-the-art backside-illuminated (BSI) commercial stacked (3D) CMOS image sensor process. The detector wafer consists of a megapixel array of specialized active pixels known as jots, while the ASIC readout wafer utilizes cluster-based readout circuitry. This specialized pixel architecture and the cluster readout circuits allow the QIS to achieve accurate photon-counting capability with an average read noise of 0.23 e- rms, an average conversion gain of 345 µV/e-, and an average dark current of less than 0.1 e-/s/jot at room temperature. We will present the experimental characterization of this photon-counting QIS camera module over a wide temperature range. The experimental data closely match what is expected from theory for photon-counting sensors based on Poisson photon-arrival statistics. By using new denoising algorithms that take advantage of the unique statistical properties of these photon-counting jots, accurate image reconstruction can be performed at light levels as low as 0.3 photoelectrons/jot/frame.
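    The Poisson photon-arrival statistics mentioned above can be illustrated with a small simulation. The sketch below assumes a simple 0.5 e- thresholding scheme for binarizing each jot; only the 0.23 e- rms read noise and the 0.3 photoelectrons/jot/frame exposure are taken from the abstract, the rest is an illustrative assumption:

```python
# Sketch of photon-counting statistics in a QIS jot. Photon arrivals are
# Poisson-distributed and read noise is Gaussian; a 0.5 e- threshold then
# digitizes each jot to "saw a photon" / "saw none". Only the 0.23 e- rms
# read noise and the 0.3 e-/jot/frame exposure come from the abstract.
import math
import random

READ_NOISE_E = 0.23  # rms read noise in electrons (from the abstract)
EXPOSURE_E = 0.3     # mean photoelectrons per jot per frame (low light)

def sample_poisson(lam: float, rng: random.Random) -> int:
    """Knuth's multiplication method, fine for small lambda."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def read_jot(mean_e: float, rng: random.Random) -> float:
    """One jot readout: Poisson photoelectrons plus Gaussian read noise."""
    return sample_poisson(mean_e, rng) + rng.gauss(0.0, READ_NOISE_E)

rng = random.Random(42)
counts = [1 if read_jot(EXPOSURE_E, rng) > 0.5 else 0 for _ in range(100_000)]

# Ideal counting would give P(>= 1 photon) = 1 - exp(-0.3) ~ 0.259; the
# sub-0.5 e- read noise shifts the measured fraction only slightly.
print(sum(counts) / len(counts))
```

    With read noise well below 0.5 e- rms, threshold errors are rare, which is why deep-sub-electron read noise is the enabling condition for photon counting without avalanche gain.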

  5. Networking break


  1. Wide spectrum 3D sensing based on GeSi technology

    Erik S. Chen | CEO of Artilux

    In this presentation, we will first present recent progress on GeSi sensing technology, followed by application cases in the consumer and automotive industries, and will then conclude with partnership and ecosystem updates.

  2. Nanomaterial-based broadband imager

    Dr. Tapani Ryhänen | CEO of Emberion Oy

    We present Emberion’s novel photodetector technology, which converts light into an electronic signal using graphene-enhanced transducers with nanocrystal light absorbers. Our technology development results in an affordable, high-performance detector suitable for visible-to-SWIR (400–2000 nm) imaging applications. Additionally, an imager concept for ultra-broadband solutions, i.e. 400 nm–14 µm (visible to LWIR), is presented.

  3. Title to be announced

    Speaker to be confirmed

  4. Chair's closing remarks and close of 2020 conference