Tuesday 25 April 2017 – Conference Day 1

  1. Registration and Refreshments

  2. Chair's opening remarks

    Ian Riches | Global Automotive Practice of Strategy Analytics

FUTURE SENSOR NEEDS AND WANTS FOR NEXT-GENERATION CARS

  1. OEM autonomous vehicles will use sensors, not crystal balls!

    Dr Frédéric Large | Research and Advanced Engineering Department, Vision Systems and ADAS Applications, and Alain Servel | ADAS and ITS Senior Expert Engineer, both of PSA Group

  2. Challenges, current and future trends of sensor fusion for autonomous vehicles

    Bharanidhar Duraisamy | Development Engineer – Radar, Sensor Fusion Expert of Daimler

  3. Networking break

  4. Innovations in driver monitoring software as part of the autonomous mobility future

    Ophir Herbst | CEO of Jungo Connectivity Ltd.

    • Use cases for driver monitoring 
    • Impact of Deep Learning
    • Implementation insights
  5. Germany’s unique path to autonomous fleets, a Silicon Valley perspective

    Jason Radisson | Vice President of Nauto

    There is a consensus emerging on the future of the automotive industry and urban mobility. As transportation becomes commoditized, consumers will pay a few cents per mile, or a few dollars per week, for it as a service. It will be the transportation equivalent of an intelligent on-demand service such as Spotify. Owning the latent capacity of a car will be a ‘retro’ luxury.

    No transportation market will be exempt from the pull of microeconomics. Price elasticity, consumer incentives and liquidity of supply are all keys to understanding product market fit and planning for the ramp of autonomous fleets in each market. The consumer relationship will move from the OEM to the Fleet Manager.

    Due to its unique starting conditions, Germany is a walled garden and will evolve more harmoniously than the Anglo-Saxon markets. For another 18 to 24 months there are significant moats that will delay disruption. Driven by transportation-network mathematics and the accelerating advancement of autonomous technology, there will be powerful incentives for the German auto industry to build a common platform during this brief window of opportunity.

  6. Networking lunch

Day 1 Stream A

Stream A: FROM SENSING TO PROCESSING: FROM EYES TO BRAIN

Chairman: Dr Hamma Tadjine, Senior Project Leader, IAV

  1. Solving challenges of High Dynamic Range in automotive imaging

    Igor Tryndin | Camera Architect of NVIDIA

    • Automotive HDR imaging
    • Computer vision and cameras in autonomous driving safety
    • Artifacts in automotive HDR imaging

    Computer vision and camera technology play a major role in the self-driving car revolution. Automotive cameras are expected to operate and deliver good image quality in the very harsh, high-dynamic-range lighting conditions of the real world, which directly impacts safety. We will review the multiple challenges that camera technology faces in handling real-world HDR inputs from modern automotive HDR sensors, and briefly outline a solution for each of them.
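
    As a point of reference (these figures are my own illustration, not the speaker's), the severity of an automotive HDR scene is usually quantified as a dynamic range in decibels; a tunnel exit into direct sunlight is commonly cited at roughly 120 dB or more:

    \[
      \mathrm{DR_{dB}} = 20 \log_{10}\frac{I_{\max}}{I_{\min}},
      \qquad
      120\ \mathrm{dB} \;\Rightarrow\; \frac{I_{\max}}{I_{\min}} = 10^{120/20} = 10^{6}
    \]

    A single standard exposure covers only around 60-70 dB, which is why multi-exposure or split-pixel HDR capture, and the merging artifacts it introduces, come into play.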

  2. A new Computer Vision Processor Chip Design for automotive ADAS CNN applications in 22nm FDSOI based on Cadence Vision Processor Technology

    Dr Jens Benndorf | COO, Managing Director of DreamChip Technologies GmbH

    The presentation explores the European collaboration on a 22nm system-on-chip development for Advanced Driver Assistance Systems (ADAS).

    It will highlight partner co-operation and key trade-offs necessary to deliver the target architecture and performance. The scope includes:

    • Project setup and target applications in the automotive market
    • Chip architecture and performance achievements, e.g. for Convolutional Neural Networks (CNN)
    • Physical chip and reference platform for evaluation

    Dream Chip designed a new generation of such chips together with its partner GlobalFoundries; the device is not just a derivative of a consumer-electronics part but was explicitly designed for automotive. As a result, key parameters such as performance per watt can be substantially improved compared with available devices, helping pave the way to cost- and power-efficient implementations for autonomous driving. The chip can handle four high-performance cameras or radar sensors, or a combination of the two.

    The project is part of a European Initiative to strengthen the semiconductor industry in Europe.

  3. Imaging vision in automotive linked cameras/ISP: processing chain, algorithms and camera systems architecture

    Gregory Roffet | Camera System Expert of STMicroelectronics

    • Automotive camera systems: HDR flicker-free sensor
    • Image signal processing
    • Machine vision algorithm chain
    • Display vision algorithm chain
    • Algorithms for image quality

    Imaging systems are becoming ubiquitous in automotive and are used for many applications. STMicroelectronics' Imaging Division is enabling automotive solutions through a large portfolio of products (ToF, image sensors, ISPs, etc.).

    The presentation will focus on ST's new generation of HDR flicker-free image sensor and automotive ISP, enabling optimised processing for both machine vision and display vision within the same device. The key algorithmic processing steps will be presented.

  4. Networking refreshment break

  5. In-vehicle vision through image processing at the edge

    Dr Alexis Lluis Gomez | Director, Image Quality of ARM

    The number of cameras in cars is projected to triple over the next five years, with as many as 12 cameras potentially becoming standard in a normal saloon. These image sensors feed into the image signal processor (ISP), which processes the raw data in real time for use by the on-board computer vision engine for ADAS and for human viewing in the car's guidance and display systems. However, the nature of in-vehicle computing means the requirements are stringent. This talk will explain what is needed for an automotive-grade ISP, going into detail on functional safety requirements, multi-camera inputs, dynamic range requirements and the use of different CFA patterns in automotive. The talk will also highlight the software requirements for an automotive-grade ISP.

  6. Visual processing crucial to automotive: applications, architectures and algorithms

    Marco Jacobs | VP of Marketing of Videantis

    • The automotive opportunity: sensors and visual processing are the key components for future cars
    • Where do the visual processing SoCs that drive ADAS go in the car, and what do they do?
    • What are some typical characteristics of these computer vision algorithms?
    • What do typical visual processing architectures look like?

    Driverless cars and ADAS heavily rely on image sensors and computer vision to understand the vehicle's surroundings. The challenge is to design ICs that are powerful enough to run complex computer vision algorithms at very low power and low cost. We will show different types of visual processing architectures, at both the SoC and system level, and how they are used to implement key computer vision algorithms such as object detection and structure from motion.

  7. ADAS pre-processor technology: a strong bridge between safety and a developer-friendly solution in ADAS systems with a pre-processor

    Young-Jun Yoo | Head of Strategic Planning for R&D of Nextchip

  8. Chair's closing remarks

  9. Leave for networking drinks reception and dinner at Günnewig Restaurant Top 180

    Coaches will take delegates to a networking drinks reception and dinner at Günnewig Restaurant Top 180 at the top of the Rhine Tower. You can see the full itinerary here.

    Places at the dinner are limited; once you have booked your place at the conference, you will need to confirm your participation in the evening reception by downloading and completing the evening dinner form, and then emailing Sharon Garrington at sgarrington@smithers.com to secure your place.

    Coaches will be provided from 22.15 to return delegates to the conference hotel. 

Day 1 Stream B

Stream B: CUTTING-EDGE SENSOR TECHNOLOGIES

Chairman:  Jason Radisson, Vice President, Nauto

  1. The latest innovations in bio-inspired vision sensors and how they benefit autonomous driving applications

    Jean-Luc Jaffard | VP Sensor Engineering and Operations of Chronocam

  2. Functional safety for image sensors in ADAS and highly automated driving – design considerations and benefits

    Giri Venkat | Technical Marketing, Automotive and Vision of ON Semiconductor

    • Increasing levels of active safety and automation in road vehicles place new demands on image sensors
    • Assessing how functional safety capabilities must be designed into image sensor products and development processes to meet these new demands
    • Understanding the key benefits that implementing functional safety features delivers:
      • High coverage to meet ASIL B and ASIL C metrics…far beyond any practical post-processing techniques
      • Low latency (maximize time for processing/decision)
      • Avoiding unnecessary post-processing to attempt fault discovery through image analysis (power, cost, coverage)
  3. Hot to cold – Optical and camera system design for stable image quality over the automotive temperature range

    Corey Zehfus | Optical Designer of Sunex Inc

    • Thermal properties of optical glass and plastic materials, best practices in material selection
    • Thermal concerns in lens opto-mechanical design
    • Best practices in thermally optimizing an optical design and camera housing as a holistic system rather than two separate components. 
    • Unusual thermal failure modes and suggestions to avoid them  
  4. Networking refreshment break

  5. Solid State LiDAR teardown: How do you build one for <$250?

    Dr Carl Jackson | CTO and Founder of SensL

    • Practical design considerations for solid-state LiDAR
    • Sensor technology comparison for 905nm and other higher wavelength devices
    • Beam steering approaches to enable large FoV systems
    • Teardown example of a solid-state long-distance LiDAR to explore costs and performance

    Long-distance LiDAR is now a mandatory requirement for autonomous vehicles and most ADAS applications. Reality is setting in, and the challenges are hard: how can a system achieve 150 m+ ranging not only with highly reflective targets but also when the target is less than 10% reflective and the conditions are harsh? This presentation will take a bottom-up approach to the technologies and design considerations facing the industry as it works toward low-cost solid-state LiDAR.

  6. A System-On-Chip Flash LiDAR

    Claude Florin | CEO of Fastree3D SA

    • SoC ToF processor
    • SPAD array sensor (19K pixels)
    • VCSEL array pulsed illumination
    • On chip DSP and RISC processor

    Fastree3D is introducing a new concept of flash LiDAR with a resolution enabling pedestrian detection at up to 30 m. The system-on-chip (SoC) allows adaptive control of the illumination according to range and background light. The CMOS SoC integrates a SPAD array, a time-to-digital converter and a RISC processor. We have achieved centimetre resolution at up to 300 fps with 2 mW of illumination.
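
    As a rough illustration (my own arithmetic, not a figure from the talk), the centimetre resolution quoted above can be tied to the timing precision required of the SPAD/TDC chain through the basic time-of-flight relation:

    \[
      \Delta d = \frac{c\,\Delta t}{2}
      \;\;\Rightarrow\;\;
      \Delta t = \frac{2\,\Delta d}{c} = \frac{2 \times 0.01\ \mathrm{m}}{3 \times 10^{8}\ \mathrm{m/s}} \approx 67\ \mathrm{ps}
    \]

    In other words, centimetre-level depth precision implies timing the photon return to within roughly 67 ps, which is the job of the on-chip time-to-digital converter.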
     

  7. Dual head Time-of-Flight camera for wide field-of-view applications

    Kristof Lieben | Senior Applications Engineer of Melexis

    Time-of-flight (ToF) sensors are ideally suited to 3D machine vision in any lighting condition. Continuous development of CMOS device and packaging technology drives ToF cameras to smaller sizes, lower costs and higher reliability. However, a common challenge for ToF camera integrators is achieving a wide field of view (FoV) at sufficient resolution, e.g. for vehicle interior monitoring applications. We present results and learnings from a novel dual-head proof of concept, built around two QVGA ToF sensors and one companion chip controller, enabling a wide 45° x 120° FoV at 640 x 240 px, or two independent QVGA detection zones.
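
    For orientation (my own arithmetic, not the speaker's), stitching the two QVGA heads into a 640 x 240 px image over a 120° x 45° field gives a roughly uniform angular sampling of about 0.19° per pixel in both axes:

    \[
      \frac{120^{\circ}}{640\ \mathrm{px}} \approx 0.19^{\circ}\mathrm{/px},
      \qquad
      \frac{45^{\circ}}{240\ \mathrm{px}} \approx 0.19^{\circ}\mathrm{/px}
    \]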

  8. Chair's closing remarks

  9. Leave for networking drinks reception and dinner at Günnewig Restaurant Top 180

    Coaches will take delegates to a networking drinks reception and dinner at Günnewig Restaurant Top 180 at the top of the Rhine Tower. You can see the full itinerary here.

    Places at the dinner are limited; once you have booked your place at the conference, you will need to confirm your participation in the evening reception by downloading and completing the evening dinner form, and then emailing Sharon Garrington at sgarrington@smithers.com to secure your place.

    Coaches will be provided from 22.15 to return delegates to the conference hotel.